Oct 08 18:10:43 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 08 18:10:43 crc restorecon[4734]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 08 18:10:43 crc restorecon[4734]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc 
restorecon[4734]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 18:10:43 crc 
restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 
18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 18:10:43 crc 
restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 08 18:10:43 crc 
restorecon[4734]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to
system_u:object_r:container_file_t:s0:c14,c22 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 
18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 18:10:43 crc 
restorecon[4734]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc 
restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc 
restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 
08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 
crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc 
restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc 
restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc 
restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 18:10:43 crc restorecon[4734]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 08 18:10:43 crc restorecon[4734]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 08 18:10:43 crc restorecon[4734]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0
Oct 08 18:10:44 crc kubenswrapper[4750]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Oct 08 18:10:44 crc kubenswrapper[4750]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Oct 08 18:10:44 crc kubenswrapper[4750]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Oct 08 18:10:44 crc kubenswrapper[4750]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Oct 08 18:10:44 crc kubenswrapper[4750]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Oct 08 18:10:44 crc kubenswrapper[4750]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.502725 4750 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.508902 4750 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.508923 4750 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.508929 4750 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.508935 4750 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.508941 4750 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.508948 4750 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.508953 4750 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.508958 4750 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.508962 4750 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.508967 4750 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.508971 4750 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.508976 4750 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.508980 4750 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.508985 4750 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.508989 4750 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.508993 4750 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.508997 4750 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509002 4750 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509008 4750 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509014 4750 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509019 4750 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509026 4750 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509031 4750 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509036 4750 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509041 4750 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509045 4750 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509051 4750 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509055 4750 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509061 4750 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509073 4750 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509078 4750 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509083 4750 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509088 4750 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509092 4750 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509096 4750 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509100 4750 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509105 4750 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509110 4750 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509115 4750 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509119 4750 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509124 4750 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509129 4750 feature_gate.go:330] unrecognized feature gate: Example
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509133 4750 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509138 4750 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509142 4750 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509147 4750 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509152 4750 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509157 4750 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509162 4750 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509166 4750 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509171 4750 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 08 18:10:44
crc kubenswrapper[4750]: W1008 18:10:44.509175 4750 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509180 4750 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509184 4750 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509189 4750 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509193 4750 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509199 4750 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509204 4750 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509209 4750 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509213 4750 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509217 4750 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509222 4750 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509227 4750 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509231 4750 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509235 4750 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509239 4750 feature_gate.go:330] unrecognized feature gate: 
MachineAPIMigration Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509244 4750 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509248 4750 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509253 4750 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509257 4750 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.509263 4750 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509359 4750 flags.go:64] FLAG: --address="0.0.0.0" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509370 4750 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509388 4750 flags.go:64] FLAG: --anonymous-auth="true" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509395 4750 flags.go:64] FLAG: --application-metrics-count-limit="100" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509402 4750 flags.go:64] FLAG: --authentication-token-webhook="false" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509407 4750 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509414 4750 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509421 4750 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509426 4750 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509431 4750 flags.go:64] FLAG: 
--boot-id-file="/proc/sys/kernel/random/boot_id" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509436 4750 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509441 4750 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509447 4750 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509452 4750 flags.go:64] FLAG: --cgroup-root="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509458 4750 flags.go:64] FLAG: --cgroups-per-qos="true" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509463 4750 flags.go:64] FLAG: --client-ca-file="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509467 4750 flags.go:64] FLAG: --cloud-config="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509473 4750 flags.go:64] FLAG: --cloud-provider="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509478 4750 flags.go:64] FLAG: --cluster-dns="[]" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509490 4750 flags.go:64] FLAG: --cluster-domain="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509497 4750 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509502 4750 flags.go:64] FLAG: --config-dir="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509508 4750 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509514 4750 flags.go:64] FLAG: --container-log-max-files="5" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509521 4750 flags.go:64] FLAG: --container-log-max-size="10Mi" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509526 4750 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509531 4750 
flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509536 4750 flags.go:64] FLAG: --containerd-namespace="k8s.io" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509542 4750 flags.go:64] FLAG: --contention-profiling="false" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509550 4750 flags.go:64] FLAG: --cpu-cfs-quota="true" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509555 4750 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509561 4750 flags.go:64] FLAG: --cpu-manager-policy="none" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509566 4750 flags.go:64] FLAG: --cpu-manager-policy-options="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509588 4750 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509593 4750 flags.go:64] FLAG: --enable-controller-attach-detach="true" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509599 4750 flags.go:64] FLAG: --enable-debugging-handlers="true" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509604 4750 flags.go:64] FLAG: --enable-load-reader="false" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509610 4750 flags.go:64] FLAG: --enable-server="true" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509615 4750 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509642 4750 flags.go:64] FLAG: --event-burst="100" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509650 4750 flags.go:64] FLAG: --event-qps="50" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509655 4750 flags.go:64] FLAG: --event-storage-age-limit="default=0" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509660 4750 flags.go:64] FLAG: --event-storage-event-limit="default=0" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 
18:10:44.509666 4750 flags.go:64] FLAG: --eviction-hard="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509673 4750 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509678 4750 flags.go:64] FLAG: --eviction-minimum-reclaim="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509683 4750 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509688 4750 flags.go:64] FLAG: --eviction-soft="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509693 4750 flags.go:64] FLAG: --eviction-soft-grace-period="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509698 4750 flags.go:64] FLAG: --exit-on-lock-contention="false" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509703 4750 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509708 4750 flags.go:64] FLAG: --experimental-mounter-path="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509713 4750 flags.go:64] FLAG: --fail-cgroupv1="false" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509718 4750 flags.go:64] FLAG: --fail-swap-on="true" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509723 4750 flags.go:64] FLAG: --feature-gates="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509732 4750 flags.go:64] FLAG: --file-check-frequency="20s" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509737 4750 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509742 4750 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509747 4750 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509753 4750 flags.go:64] FLAG: --healthz-port="10248" Oct 08 18:10:44 crc kubenswrapper[4750]: 
I1008 18:10:44.509758 4750 flags.go:64] FLAG: --help="false" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509763 4750 flags.go:64] FLAG: --hostname-override="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509768 4750 flags.go:64] FLAG: --housekeeping-interval="10s" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509773 4750 flags.go:64] FLAG: --http-check-frequency="20s" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509778 4750 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509783 4750 flags.go:64] FLAG: --image-credential-provider-config="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509788 4750 flags.go:64] FLAG: --image-gc-high-threshold="85" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509795 4750 flags.go:64] FLAG: --image-gc-low-threshold="80" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509800 4750 flags.go:64] FLAG: --image-service-endpoint="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509805 4750 flags.go:64] FLAG: --kernel-memcg-notification="false" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509810 4750 flags.go:64] FLAG: --kube-api-burst="100" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509815 4750 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509820 4750 flags.go:64] FLAG: --kube-api-qps="50" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509826 4750 flags.go:64] FLAG: --kube-reserved="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509831 4750 flags.go:64] FLAG: --kube-reserved-cgroup="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509836 4750 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509841 4750 flags.go:64] FLAG: --kubelet-cgroups="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 
18:10:44.509845 4750 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509849 4750 flags.go:64] FLAG: --lock-file="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509853 4750 flags.go:64] FLAG: --log-cadvisor-usage="false" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509857 4750 flags.go:64] FLAG: --log-flush-frequency="5s" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509863 4750 flags.go:64] FLAG: --log-json-info-buffer-size="0" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509869 4750 flags.go:64] FLAG: --log-json-split-stream="false" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509873 4750 flags.go:64] FLAG: --log-text-info-buffer-size="0" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509877 4750 flags.go:64] FLAG: --log-text-split-stream="false" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509881 4750 flags.go:64] FLAG: --logging-format="text" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509885 4750 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509890 4750 flags.go:64] FLAG: --make-iptables-util-chains="true" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509894 4750 flags.go:64] FLAG: --manifest-url="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509897 4750 flags.go:64] FLAG: --manifest-url-header="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509903 4750 flags.go:64] FLAG: --max-housekeeping-interval="15s" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509907 4750 flags.go:64] FLAG: --max-open-files="1000000" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509912 4750 flags.go:64] FLAG: --max-pods="110" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509916 4750 flags.go:64] FLAG: --maximum-dead-containers="-1" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 
18:10:44.509920 4750 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509924 4750 flags.go:64] FLAG: --memory-manager-policy="None" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509929 4750 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509933 4750 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509937 4750 flags.go:64] FLAG: --node-ip="192.168.126.11" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509941 4750 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509950 4750 flags.go:64] FLAG: --node-status-max-images="50" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509955 4750 flags.go:64] FLAG: --node-status-update-frequency="10s" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509959 4750 flags.go:64] FLAG: --oom-score-adj="-999" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509963 4750 flags.go:64] FLAG: --pod-cidr="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509967 4750 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509974 4750 flags.go:64] FLAG: --pod-manifest-path="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509978 4750 flags.go:64] FLAG: --pod-max-pids="-1" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509982 4750 flags.go:64] FLAG: --pods-per-core="0" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509986 4750 flags.go:64] FLAG: --port="10250" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509990 4750 flags.go:64] FLAG: 
--protect-kernel-defaults="false" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509994 4750 flags.go:64] FLAG: --provider-id="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.509998 4750 flags.go:64] FLAG: --qos-reserved="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.510003 4750 flags.go:64] FLAG: --read-only-port="10255" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.510007 4750 flags.go:64] FLAG: --register-node="true" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.510011 4750 flags.go:64] FLAG: --register-schedulable="true" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.510016 4750 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.510023 4750 flags.go:64] FLAG: --registry-burst="10" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.510027 4750 flags.go:64] FLAG: --registry-qps="5" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.510031 4750 flags.go:64] FLAG: --reserved-cpus="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.510035 4750 flags.go:64] FLAG: --reserved-memory="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.510040 4750 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.510044 4750 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.510049 4750 flags.go:64] FLAG: --rotate-certificates="false" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.510053 4750 flags.go:64] FLAG: --rotate-server-certificates="false" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.510058 4750 flags.go:64] FLAG: --runonce="false" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.510062 4750 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.510067 4750 flags.go:64] FLAG: 
--runtime-request-timeout="2m0s" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.510072 4750 flags.go:64] FLAG: --seccomp-default="false" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.510076 4750 flags.go:64] FLAG: --serialize-image-pulls="true" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.510080 4750 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.510085 4750 flags.go:64] FLAG: --storage-driver-db="cadvisor" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.510089 4750 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.510093 4750 flags.go:64] FLAG: --storage-driver-password="root" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.510097 4750 flags.go:64] FLAG: --storage-driver-secure="false" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.510101 4750 flags.go:64] FLAG: --storage-driver-table="stats" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.510105 4750 flags.go:64] FLAG: --storage-driver-user="root" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.510109 4750 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.510113 4750 flags.go:64] FLAG: --sync-frequency="1m0s" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.510117 4750 flags.go:64] FLAG: --system-cgroups="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.510121 4750 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.510127 4750 flags.go:64] FLAG: --system-reserved-cgroup="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.510131 4750 flags.go:64] FLAG: --tls-cert-file="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.510135 4750 flags.go:64] FLAG: --tls-cipher-suites="[]" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 
18:10:44.510144 4750 flags.go:64] FLAG: --tls-min-version="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.510148 4750 flags.go:64] FLAG: --tls-private-key-file="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.510155 4750 flags.go:64] FLAG: --topology-manager-policy="none" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.510159 4750 flags.go:64] FLAG: --topology-manager-policy-options="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.510164 4750 flags.go:64] FLAG: --topology-manager-scope="container" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.510169 4750 flags.go:64] FLAG: --v="2" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.510174 4750 flags.go:64] FLAG: --version="false" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.510179 4750 flags.go:64] FLAG: --vmodule="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.510184 4750 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.510189 4750 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510331 4750 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510338 4750 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510344 4750 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510348 4750 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510353 4750 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510357 4750 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 08 18:10:44 crc kubenswrapper[4750]: 
W1008 18:10:44.510361 4750 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510364 4750 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510368 4750 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510371 4750 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510375 4750 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510379 4750 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510382 4750 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510386 4750 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510390 4750 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510394 4750 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510399 4750 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510403 4750 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510407 4750 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510412 4750 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510416 4750 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510420 4750 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510425 4750 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510429 4750 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510436 4750 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510441 4750 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510446 4750 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510450 4750 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510453 4750 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510458 4750 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510463 4750 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510469 4750 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510475 4750 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510480 4750 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510486 4750 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510491 4750 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510497 4750 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510501 4750 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510506 4750 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510510 4750 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510513 4750 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510517 4750 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510520 4750 feature_gate.go:330] unrecognized feature gate: Example Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510524 4750 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510527 4750 
feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510531 4750 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510534 4750 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510538 4750 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510541 4750 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510545 4750 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510551 4750 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510555 4750 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510559 4750 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510563 4750 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510567 4750 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510584 4750 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510588 4750 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510593 4750 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510596 4750 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510600 4750 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510603 4750 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510607 4750 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510610 4750 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510614 4750 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510618 4750 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510621 4750 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510625 4750 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510629 4750 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510633 4750 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510636 4750 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.510640 4750 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.510652 4750 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.521804 4750 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.521846 4750 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.521951 4750 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.521968 4750 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.521974 4750 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.521980 4750 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.521986 4750 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.521993 4750 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.521999 4750 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522004 4750 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522011 4750 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522020 4750 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522026 4750 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522031 4750 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522037 4750 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522042 4750 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522048 4750 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522053 4750 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522058 4750 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522063 4750 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522068 4750 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522074 4750 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522079 4750 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522085 4750 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522090 4750 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522095 4750 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522100 4750 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522106 4750 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522112 4750 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522118 4750 feature_gate.go:330] unrecognized feature gate: Example
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522124 4750 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522130 4750 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522135 4750 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522140 4750 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522146 4750 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522151 4750 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522157 4750 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522163 4750 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522168 4750 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522174 4750 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522179 4750 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522185 4750 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522190 4750 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522216 4750 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522223 4750 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522230 4750 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522235 4750 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522241 4750 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522246 4750 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522251 4750 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522257 4750 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522262 4750 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522267 4750 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522272 4750 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522277 4750 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522283 4750 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522288 4750 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522293 4750 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522299 4750 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522307 4750 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522313 4750 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522321 4750 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522328 4750 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522334 4750 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522340 4750 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522345 4750 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522350 4750 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522355 4750 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522361 4750 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522368 4750 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522374 4750 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522379 4750 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522385 4750 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.522395 4750 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522590 4750 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522602 4750 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522607 4750 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522614 4750 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522619 4750 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522625 4750 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522630 4750 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522635 4750 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522641 4750 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522650 4750 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522656 4750 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522661 4750 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522666 4750 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522672 4750 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522677 4750 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522683 4750 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522688 4750 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522694 4750 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522699 4750 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522705 4750 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522710 4750 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522715 4750 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522721 4750 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522726 4750 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522731 4750 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522736 4750 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522742 4750 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522747 4750 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522752 4750 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522757 4750 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522762 4750 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522768 4750 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522773 4750 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522778 4750 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522788 4750 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522793 4750 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522798 4750 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522804 4750 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522810 4750 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522844 4750 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522855 4750 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522864 4750 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522874 4750 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522882 4750 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522890 4750 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522897 4750 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522904 4750 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522911 4750 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522917 4750 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522924 4750 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522938 4750 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522972 4750 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522980 4750 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522987 4750 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.522994 4750 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.523000 4750 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.523006 4750 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.523013 4750 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.523019 4750 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.523026 4750 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.523031 4750 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.523037 4750 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.523042 4750 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.523047 4750 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.523053 4750 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.523058 4750 feature_gate.go:330] unrecognized feature gate: Example
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.523063 4750 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.523070 4750 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.523077 4750 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.523086 4750 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.523098 4750 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.523108 4750 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.524250 4750 server.go:940] "Client rotation is on, will bootstrap in background"
Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.528941 4750 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.530063 4750 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.531857 4750 server.go:997] "Starting client certificate rotation"
Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.531898 4750 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.532858 4750 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-22 05:16:11.631315768 +0000 UTC
Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.532957 4750 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1067h5m27.098362477s for next certificate rotation
Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.567336 4750 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.569879 4750 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.593083 4750 log.go:25] "Validated CRI v1 runtime API"
Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.628073 4750 log.go:25] "Validated CRI v1 image API"
Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.630334 4750 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.635309 4750 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-08-17-06-00-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.635343 4750 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.651230 4750 manager.go:217] Machine: {Timestamp:2025-10-08 18:10:44.648525885 +0000 UTC m=+0.561496918 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:4099aa14-e8f7-4bff-9f81-40284b959bbe BootID:bfc00a00-da5b-4621-a04e-e20b47fefa95 Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:8b:e4:b6 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:8b:e4:b6 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:c9:b2:5c Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:50:97:a6 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:f5:cd:85 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:f5:69:8c Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:a0:46:1d Speed:-1 Mtu:1496} {Name:eth10 MacAddress:9a:23:6e:c2:f8:7d Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:aa:5e:17:39:1b:82 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.651413 4750 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.651507 4750 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.654102 4750 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.654269 4750 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.654300 4750 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.654503 4750 topology_manager.go:138] "Creating topology manager with none policy"
Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.654512 4750 container_manager_linux.go:303] "Creating device plugin manager"
Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.655025 4750 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.655050 4750 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.655176 4750 state_mem.go:36] "Initialized new in-memory state store"
Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.655241 4750 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.658265 4750 kubelet.go:418] "Attempting to sync node with API server"
Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.658282 4750 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.658301 4750 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.658311 4750 kubelet.go:324] "Adding apiserver pod source"
Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.658322 4750 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.661811 4750 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.662648 4750 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.664256 4750 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.665139 4750 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused
Oct 08 18:10:44 crc kubenswrapper[4750]: E1008 18:10:44.665231 4750 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError"
Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.665188 4750 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused
Oct 08 18:10:44 crc kubenswrapper[4750]: E1008 18:10:44.665330 4750 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError"
Oct 08 18:10:44 crc
kubenswrapper[4750]: I1008 18:10:44.665557 4750 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.665592 4750 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.665600 4750 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.665605 4750 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.665616 4750 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.665624 4750 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.665632 4750 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.665644 4750 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.665654 4750 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.665664 4750 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.665672 4750 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.665679 4750 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.667737 4750 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.668064 4750 server.go:1280] "Started kubelet" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 
18:10:44.669583 4750 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.669558 4750 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Oct 08 18:10:44 crc systemd[1]: Started Kubernetes Kubelet. Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.674223 4750 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.675470 4750 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.675519 4750 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.677287 4750 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 17:45:10.771268183 +0000 UTC Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.677340 4750 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1631h34m26.093931959s for next certificate rotation Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.677485 4750 volume_manager.go:287] "The desired_state_of_world populator starts" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.679287 4750 volume_manager.go:289] "Starting Kubelet Volume Manager" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.678762 4750 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Oct 08 18:10:44 crc kubenswrapper[4750]: E1008 18:10:44.682165 4750 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 08 18:10:44 crc kubenswrapper[4750]: E1008 
18:10:44.683672 4750 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="200ms" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.684995 4750 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.685013 4750 factory.go:55] Registering systemd factory Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.685021 4750 factory.go:221] Registration of the systemd container factory successfully Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.684929 4750 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Oct 08 18:10:44 crc kubenswrapper[4750]: E1008 18:10:44.685125 4750 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.679714 4750 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Oct 08 18:10:44 crc kubenswrapper[4750]: E1008 18:10:44.683616 4750 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.75:6443: connect: connection refused" 
event="&Event{ObjectMeta:{crc.186c968334efe457 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-08 18:10:44.668040279 +0000 UTC m=+0.581011292,LastTimestamp:2025-10-08 18:10:44.668040279 +0000 UTC m=+0.581011292,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.688602 4750 factory.go:153] Registering CRI-O factory Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.688686 4750 factory.go:221] Registration of the crio container factory successfully Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.688745 4750 factory.go:103] Registering Raw factory Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.688769 4750 manager.go:1196] Started watching for new ooms in manager Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.689044 4750 server.go:460] "Adding debug handlers to kubelet server" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.689951 4750 manager.go:319] Starting recovery of all containers Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700021 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700139 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 08 18:10:44 crc 
kubenswrapper[4750]: I1008 18:10:44.700152 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700162 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700173 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700182 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700198 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700208 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700221 4750 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700232 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700243 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700255 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700265 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700277 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700288 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700300 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700312 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700327 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700340 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700352 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700362 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" 
volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700374 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700387 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700401 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700410 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700420 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700434 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700445 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700455 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700466 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700476 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700487 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700550 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" 
seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700567 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700589 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700601 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700612 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700623 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700632 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700643 4750 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700652 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700663 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700675 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700687 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700698 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700732 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700743 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700753 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700764 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700773 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700781 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700792 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700804 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700815 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700829 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700840 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700851 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700865 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700876 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700885 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700895 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700906 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700916 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700926 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700938 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700948 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700958 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700970 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700979 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700989 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.700998 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701008 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701016 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701026 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701037 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701046 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 
08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701056 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701066 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701076 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701086 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701096 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701105 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701116 4750 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701126 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701137 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701146 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701154 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701164 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701174 4750 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701184 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701193 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701203 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701213 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701222 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701249 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701259 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701268 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701278 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701286 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701296 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701306 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" 
volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701318 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701328 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701337 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701352 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701363 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701373 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" 
seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701385 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701393 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701402 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701412 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701422 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701431 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 
18:10:44.701442 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701451 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701461 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701472 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701480 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701490 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701501 4750 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701512 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701522 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701533 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701541 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701553 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701561 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701583 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701594 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701604 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701613 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701622 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701632 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701641 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701651 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701660 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701668 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701677 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701688 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701698 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701708 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701748 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701759 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701770 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701778 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" 
volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701788 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701798 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701806 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701815 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701826 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701835 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701844 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701854 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701864 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701873 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701884 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701893 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701901 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701910 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701920 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701929 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.701974 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.706399 4750 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" 
deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.709651 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.709672 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.709683 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.709726 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.709742 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.709753 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.709772 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.709783 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.709799 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.709812 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.709824 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.709840 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" 
seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.709852 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.709868 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.709880 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.709895 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.709913 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.709925 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 08 18:10:44 crc 
kubenswrapper[4750]: I1008 18:10:44.709942 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.709954 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.709965 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.709981 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.709995 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.710013 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.710027 4750 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.710038 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.710053 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.710064 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.710079 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.710091 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.710104 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.710123 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.710136 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.710152 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.710163 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.710177 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.710194 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.710206 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.710218 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.710234 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.710247 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.710263 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.710275 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.710288 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.710303 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.710314 4750 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.710324 4750 reconstruct.go:97] "Volume reconstruction finished" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.710332 4750 reconciler.go:26] "Reconciler: start to sync state" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.720064 4750 manager.go:324] Recovery completed Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.730063 4750 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.732428 4750 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.732501 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.732887 4750 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.732929 4750 kubelet.go:2335] "Starting kubelet main sync loop" Oct 08 18:10:44 crc kubenswrapper[4750]: E1008 18:10:44.733053 4750 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 08 18:10:44 crc kubenswrapper[4750]: W1008 18:10:44.734150 4750 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Oct 08 18:10:44 crc kubenswrapper[4750]: E1008 18:10:44.734213 4750 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.734227 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.734282 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.734300 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.735571 4750 cpu_manager.go:225] 
"Starting CPU manager" policy="none" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.735587 4750 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.735604 4750 state_mem.go:36] "Initialized new in-memory state store" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.758218 4750 policy_none.go:49] "None policy: Start" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.758962 4750 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.758984 4750 state_mem.go:35] "Initializing new in-memory state store" Oct 08 18:10:44 crc kubenswrapper[4750]: E1008 18:10:44.782514 4750 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.831321 4750 manager.go:334] "Starting Device Plugin manager" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.831385 4750 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.831399 4750 server.go:79] "Starting device plugin registration server" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.831858 4750 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.831874 4750 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.832342 4750 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.832436 4750 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.832445 4750 plugin_manager.go:118] "Starting Kubelet Plugin Manager" 
Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.833607 4750 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc"] Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.833715 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.834810 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.834858 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.834871 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.835042 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.835260 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.835319 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.835862 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.835900 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.835922 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.835962 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.835979 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.836009 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.836063 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.836287 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.836420 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.836924 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.836952 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.836963 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.837068 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.837088 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.837106 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.837115 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.837304 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.837362 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.837788 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.837809 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.837822 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.837918 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.837968 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.838011 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.838121 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.838161 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.838178 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.838634 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.838666 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.838683 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:10:44 crc kubenswrapper[4750]: E1008 18:10:44.838899 4750 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.838920 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.838962 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.839003 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.839022 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.839033 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.839880 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.839905 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.839918 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:10:44 crc kubenswrapper[4750]: E1008 18:10:44.884466 4750 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="400ms" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.912916 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 18:10:44 crc 
kubenswrapper[4750]: I1008 18:10:44.912951 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.912970 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.912986 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.913004 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.913018 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.913033 4750 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.913048 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.913106 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.913140 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.913159 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.913181 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.913196 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.913216 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.913231 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.933691 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.935187 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.935244 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.935255 4750 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:10:44 crc kubenswrapper[4750]: I1008 18:10:44.935282 4750 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 08 18:10:44 crc kubenswrapper[4750]: E1008 18:10:44.935735 4750 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.75:6443: connect: connection refused" node="crc" Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.014943 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.014999 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.015024 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.015040 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 
18:10:45.015057 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.015074 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.015091 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.015107 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.015128 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.015144 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.015160 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.015155 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.015192 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.015210 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.015161 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.015218 4750 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.015258 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.015203 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.015245 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.015282 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.015297 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.015243 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.015177 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.015449 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.015234 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.015477 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.015507 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.015528 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.015545 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.015510 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.136775 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.139921 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.139976 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:10:45 crc kubenswrapper[4750]: 
I1008 18:10:45.139999 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.140037 4750 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 08 18:10:45 crc kubenswrapper[4750]: E1008 18:10:45.140592 4750 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.75:6443: connect: connection refused" node="crc" Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.174997 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.181579 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.198613 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.215112 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.221169 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 08 18:10:45 crc kubenswrapper[4750]: W1008 18:10:45.246266 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-62342709ecf3932cf3ba9f4d534229fa5efdece1ccd0bf0cad2c3ea1873c32f0 WatchSource:0}: Error finding container 62342709ecf3932cf3ba9f4d534229fa5efdece1ccd0bf0cad2c3ea1873c32f0: Status 404 returned error can't find the container with id 62342709ecf3932cf3ba9f4d534229fa5efdece1ccd0bf0cad2c3ea1873c32f0 Oct 08 18:10:45 crc kubenswrapper[4750]: W1008 18:10:45.250325 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-2778a731230d491e20c9e8d0c1299d980c20d92165e8517282549f4e5a1babeb WatchSource:0}: Error finding container 2778a731230d491e20c9e8d0c1299d980c20d92165e8517282549f4e5a1babeb: Status 404 returned error can't find the container with id 2778a731230d491e20c9e8d0c1299d980c20d92165e8517282549f4e5a1babeb Oct 08 18:10:45 crc kubenswrapper[4750]: W1008 18:10:45.251164 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-bafd85848933c5a73933ee59eeb90ae2b330e24f167b41c2f5d614bc017fe8cc WatchSource:0}: Error finding container bafd85848933c5a73933ee59eeb90ae2b330e24f167b41c2f5d614bc017fe8cc: Status 404 returned error can't find the container with id bafd85848933c5a73933ee59eeb90ae2b330e24f167b41c2f5d614bc017fe8cc Oct 08 18:10:45 crc kubenswrapper[4750]: E1008 18:10:45.285672 4750 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="800ms" Oct 08 
18:10:45 crc kubenswrapper[4750]: W1008 18:10:45.481793 4750 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Oct 08 18:10:45 crc kubenswrapper[4750]: E1008 18:10:45.481873 4750 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.541642 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.542910 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.542943 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.542952 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.542972 4750 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 08 18:10:45 crc kubenswrapper[4750]: E1008 18:10:45.543310 4750 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.75:6443: connect: connection refused" node="crc" Oct 08 18:10:45 crc kubenswrapper[4750]: W1008 18:10:45.600947 4750 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get 
"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Oct 08 18:10:45 crc kubenswrapper[4750]: E1008 18:10:45.601036 4750 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.680793 4750 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.737451 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2778a731230d491e20c9e8d0c1299d980c20d92165e8517282549f4e5a1babeb"} Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.739599 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"62342709ecf3932cf3ba9f4d534229fa5efdece1ccd0bf0cad2c3ea1873c32f0"} Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.740721 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2bfa499db3a38bdfff021e2a71fcf2ccbf93c43fb6cedb84739aa1109cd1b8f8"} Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.742115 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"314ba53b46e02f58e46aab89076a6d256826fea0f3ef8989f96a802b8e5c24b6"} Oct 08 18:10:45 crc kubenswrapper[4750]: I1008 18:10:45.744610 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"bafd85848933c5a73933ee59eeb90ae2b330e24f167b41c2f5d614bc017fe8cc"} Oct 08 18:10:45 crc kubenswrapper[4750]: W1008 18:10:45.892690 4750 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Oct 08 18:10:45 crc kubenswrapper[4750]: E1008 18:10:45.892761 4750 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Oct 08 18:10:46 crc kubenswrapper[4750]: E1008 18:10:46.086939 4750 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="1.6s" Oct 08 18:10:46 crc kubenswrapper[4750]: W1008 18:10:46.231074 4750 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Oct 08 18:10:46 crc kubenswrapper[4750]: E1008 18:10:46.231192 4750 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Oct 08 18:10:46 crc kubenswrapper[4750]: I1008 18:10:46.343693 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 18:10:46 crc kubenswrapper[4750]: I1008 18:10:46.345632 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:10:46 crc kubenswrapper[4750]: I1008 18:10:46.345660 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:10:46 crc kubenswrapper[4750]: I1008 18:10:46.345668 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:10:46 crc kubenswrapper[4750]: I1008 18:10:46.345687 4750 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 08 18:10:46 crc kubenswrapper[4750]: E1008 18:10:46.346031 4750 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.75:6443: connect: connection refused" node="crc" Oct 08 18:10:46 crc kubenswrapper[4750]: I1008 18:10:46.680666 4750 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Oct 08 18:10:46 crc kubenswrapper[4750]: I1008 18:10:46.748691 4750 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="155237e5ad4510cb1c106638bf4e87bb33102aed3005e5831f5f4232d0698ac8" exitCode=0 Oct 08 18:10:46 crc kubenswrapper[4750]: I1008 
18:10:46.748751 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"155237e5ad4510cb1c106638bf4e87bb33102aed3005e5831f5f4232d0698ac8"} Oct 08 18:10:46 crc kubenswrapper[4750]: I1008 18:10:46.748790 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 18:10:46 crc kubenswrapper[4750]: I1008 18:10:46.749617 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:10:46 crc kubenswrapper[4750]: I1008 18:10:46.749719 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:10:46 crc kubenswrapper[4750]: I1008 18:10:46.749809 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:10:46 crc kubenswrapper[4750]: I1008 18:10:46.751694 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ba40cf6292b171161b700726beb7b9546e2a239de4076618f1eacf17b75631fd"} Oct 08 18:10:46 crc kubenswrapper[4750]: I1008 18:10:46.751738 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"63a005895ee7ff0fde6900a7b6f6b8928aad001f1aaa7ce650d77c11ce7acc72"} Oct 08 18:10:46 crc kubenswrapper[4750]: I1008 18:10:46.751753 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"535706571aabece4fcb32be30ca7e8e4732194bfe3511685909161fa0c06b1c0"} Oct 08 18:10:46 crc kubenswrapper[4750]: 
I1008 18:10:46.751763 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"55916d7bc976fac242d0c48e65a606ba0f642461b47262b4b0d46076163810d5"} Oct 08 18:10:46 crc kubenswrapper[4750]: I1008 18:10:46.751764 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 18:10:46 crc kubenswrapper[4750]: I1008 18:10:46.752582 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:10:46 crc kubenswrapper[4750]: I1008 18:10:46.752610 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:10:46 crc kubenswrapper[4750]: I1008 18:10:46.752623 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:10:46 crc kubenswrapper[4750]: I1008 18:10:46.753141 4750 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c" exitCode=0 Oct 08 18:10:46 crc kubenswrapper[4750]: I1008 18:10:46.753175 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c"} Oct 08 18:10:46 crc kubenswrapper[4750]: I1008 18:10:46.753219 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 18:10:46 crc kubenswrapper[4750]: I1008 18:10:46.754034 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:10:46 crc kubenswrapper[4750]: I1008 18:10:46.754061 4750 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:10:46 crc kubenswrapper[4750]: I1008 18:10:46.754072 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:10:46 crc kubenswrapper[4750]: I1008 18:10:46.755035 4750 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="f0bf914d5d70f437898fee03f83bdc0dba4a47ecaa40f7f131d5de15438a824e" exitCode=0 Oct 08 18:10:46 crc kubenswrapper[4750]: I1008 18:10:46.755082 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"f0bf914d5d70f437898fee03f83bdc0dba4a47ecaa40f7f131d5de15438a824e"} Oct 08 18:10:46 crc kubenswrapper[4750]: I1008 18:10:46.755290 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 18:10:46 crc kubenswrapper[4750]: I1008 18:10:46.755462 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 18:10:46 crc kubenswrapper[4750]: I1008 18:10:46.756773 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:10:46 crc kubenswrapper[4750]: I1008 18:10:46.756816 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:10:46 crc kubenswrapper[4750]: I1008 18:10:46.756832 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:10:46 crc kubenswrapper[4750]: I1008 18:10:46.757001 4750 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59" exitCode=0 Oct 08 18:10:46 crc kubenswrapper[4750]: I1008 18:10:46.757043 4750 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59"} Oct 08 18:10:46 crc kubenswrapper[4750]: I1008 18:10:46.757105 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 18:10:46 crc kubenswrapper[4750]: I1008 18:10:46.758655 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:10:46 crc kubenswrapper[4750]: I1008 18:10:46.758736 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:10:46 crc kubenswrapper[4750]: I1008 18:10:46.758789 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:10:46 crc kubenswrapper[4750]: I1008 18:10:46.758989 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:10:46 crc kubenswrapper[4750]: I1008 18:10:46.759007 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:10:46 crc kubenswrapper[4750]: I1008 18:10:46.759015 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:10:47 crc kubenswrapper[4750]: I1008 18:10:47.680605 4750 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Oct 08 18:10:47 crc kubenswrapper[4750]: E1008 18:10:47.688173 4750 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: 
connection refused" interval="3.2s" Oct 08 18:10:47 crc kubenswrapper[4750]: I1008 18:10:47.760634 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"b18ff0bd56d5ebbea77434a1246b6a7cbab6b4b2a5ccd49e1709968e8648df03"} Oct 08 18:10:47 crc kubenswrapper[4750]: I1008 18:10:47.760734 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 18:10:47 crc kubenswrapper[4750]: I1008 18:10:47.761603 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:10:47 crc kubenswrapper[4750]: I1008 18:10:47.761650 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:10:47 crc kubenswrapper[4750]: I1008 18:10:47.761663 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:10:47 crc kubenswrapper[4750]: I1008 18:10:47.762669 4750 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96" exitCode=0 Oct 08 18:10:47 crc kubenswrapper[4750]: I1008 18:10:47.762721 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96"} Oct 08 18:10:47 crc kubenswrapper[4750]: I1008 18:10:47.762765 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 18:10:47 crc kubenswrapper[4750]: I1008 18:10:47.763674 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:10:47 crc kubenswrapper[4750]: I1008 18:10:47.763700 4750 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:10:47 crc kubenswrapper[4750]: I1008 18:10:47.763710 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:10:47 crc kubenswrapper[4750]: I1008 18:10:47.764717 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"31d72f137ccea9900279883c5b727b663c3ab23b09d2eab31abe8c6e4c536b29"} Oct 08 18:10:47 crc kubenswrapper[4750]: I1008 18:10:47.764752 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 18:10:47 crc kubenswrapper[4750]: I1008 18:10:47.764752 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"d43e2deb74fdfedb40fdcc9b7f57f54f447c4f1493ece80186adb8e4624bbe92"} Oct 08 18:10:47 crc kubenswrapper[4750]: I1008 18:10:47.764869 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"27de5d0a4e7203c8a6861925143be197b902b5c8d5031e26d1b732a31173c98b"} Oct 08 18:10:47 crc kubenswrapper[4750]: I1008 18:10:47.768101 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:10:47 crc kubenswrapper[4750]: I1008 18:10:47.768132 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:10:47 crc kubenswrapper[4750]: I1008 18:10:47.768142 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:10:47 crc kubenswrapper[4750]: I1008 18:10:47.770470 
4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 18:10:47 crc kubenswrapper[4750]: I1008 18:10:47.770978 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 18:10:47 crc kubenswrapper[4750]: I1008 18:10:47.771347 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a5b4f554ff77b65e8ff5a3cb8221e1c238434da5d49d308ec6684fcb2d333384"} Oct 08 18:10:47 crc kubenswrapper[4750]: I1008 18:10:47.771377 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6ece7dde88eaa94bbf9401909374a83d5841b24deb3f494aa3c709d055f6997e"} Oct 08 18:10:47 crc kubenswrapper[4750]: I1008 18:10:47.771389 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9eecb6b3a56b80f2145ca258b9fc1473c42b44a4ded70f9ddc793868c3656191"} Oct 08 18:10:47 crc kubenswrapper[4750]: I1008 18:10:47.771401 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8961dba04ac6b2092efe4568b7f5d5afb5df6833c208d9484410ed6f8dcb13a3"} Oct 08 18:10:47 crc kubenswrapper[4750]: I1008 18:10:47.771411 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2f90b8eb6984a4ced369d75be15a4980f41e7ca1bd3c79a64a79c974da9dc2d1"} Oct 08 18:10:47 crc kubenswrapper[4750]: I1008 18:10:47.771729 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 08 18:10:47 crc kubenswrapper[4750]: I1008 18:10:47.771751 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:10:47 crc kubenswrapper[4750]: I1008 18:10:47.771797 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:10:47 crc kubenswrapper[4750]: I1008 18:10:47.772332 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:10:47 crc kubenswrapper[4750]: I1008 18:10:47.772354 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:10:47 crc kubenswrapper[4750]: I1008 18:10:47.772365 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:10:47 crc kubenswrapper[4750]: I1008 18:10:47.946192 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 18:10:47 crc kubenswrapper[4750]: I1008 18:10:47.947664 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:10:47 crc kubenswrapper[4750]: I1008 18:10:47.947689 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:10:47 crc kubenswrapper[4750]: I1008 18:10:47.947697 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:10:47 crc kubenswrapper[4750]: I1008 18:10:47.947719 4750 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 08 18:10:47 crc kubenswrapper[4750]: E1008 18:10:47.947915 4750 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.75:6443: connect: connection refused" node="crc" Oct 08 18:10:48 crc 
kubenswrapper[4750]: I1008 18:10:48.775347 4750 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b" exitCode=0 Oct 08 18:10:48 crc kubenswrapper[4750]: I1008 18:10:48.775460 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 18:10:48 crc kubenswrapper[4750]: I1008 18:10:48.775487 4750 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 08 18:10:48 crc kubenswrapper[4750]: I1008 18:10:48.775537 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 18:10:48 crc kubenswrapper[4750]: I1008 18:10:48.775591 4750 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 08 18:10:48 crc kubenswrapper[4750]: I1008 18:10:48.775651 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 18:10:48 crc kubenswrapper[4750]: I1008 18:10:48.775613 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b"} Oct 08 18:10:48 crc kubenswrapper[4750]: I1008 18:10:48.775660 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 18:10:48 crc kubenswrapper[4750]: I1008 18:10:48.777150 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:10:48 crc kubenswrapper[4750]: I1008 18:10:48.777188 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:10:48 crc kubenswrapper[4750]: I1008 18:10:48.777199 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:10:48 
crc kubenswrapper[4750]: I1008 18:10:48.778057 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:10:48 crc kubenswrapper[4750]: I1008 18:10:48.778094 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:10:48 crc kubenswrapper[4750]: I1008 18:10:48.778116 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:10:48 crc kubenswrapper[4750]: I1008 18:10:48.778193 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:10:48 crc kubenswrapper[4750]: I1008 18:10:48.778271 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:10:48 crc kubenswrapper[4750]: I1008 18:10:48.778299 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:10:48 crc kubenswrapper[4750]: I1008 18:10:48.778658 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:10:48 crc kubenswrapper[4750]: I1008 18:10:48.778748 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:10:48 crc kubenswrapper[4750]: I1008 18:10:48.778761 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:10:49 crc kubenswrapper[4750]: I1008 18:10:49.781867 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"89305f32867a4d321e16ce84e9d4e34ec3bd7594db3700a4252ad0f874dd6d48"} Oct 08 18:10:49 crc kubenswrapper[4750]: I1008 18:10:49.781911 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"77be59fa3ca71194c19f0050856d30f3b6f7122bd68e07535d0a8ca6cc299832"} Oct 08 18:10:49 crc kubenswrapper[4750]: I1008 18:10:49.781927 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a7428f31e2b12470cc8d9fb98bb66e19ce665d028dae0b97adf318006043e2d8"} Oct 08 18:10:49 crc kubenswrapper[4750]: I1008 18:10:49.781938 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"53389f30a75dc05f6b778b862fec384a0aefe7bd4b7884aad3af144bc8e5b87c"} Oct 08 18:10:49 crc kubenswrapper[4750]: I1008 18:10:49.781948 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b5029feed20c32b59a88eba1f7597a6689d99d93df97f2c5c94f4153de00170f"} Oct 08 18:10:49 crc kubenswrapper[4750]: I1008 18:10:49.782026 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 18:10:49 crc kubenswrapper[4750]: I1008 18:10:49.782954 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:10:49 crc kubenswrapper[4750]: I1008 18:10:49.782991 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:10:49 crc kubenswrapper[4750]: I1008 18:10:49.783003 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:10:50 crc kubenswrapper[4750]: I1008 18:10:50.052329 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 18:10:50 crc kubenswrapper[4750]: I1008 18:10:50.052502 4750 prober_manager.go:312] "Failed 
to trigger a manual run" probe="Readiness" Oct 08 18:10:50 crc kubenswrapper[4750]: I1008 18:10:50.052544 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 18:10:50 crc kubenswrapper[4750]: I1008 18:10:50.053693 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:10:50 crc kubenswrapper[4750]: I1008 18:10:50.053740 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:10:50 crc kubenswrapper[4750]: I1008 18:10:50.053754 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:10:50 crc kubenswrapper[4750]: I1008 18:10:50.533375 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 08 18:10:50 crc kubenswrapper[4750]: I1008 18:10:50.784591 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 18:10:50 crc kubenswrapper[4750]: I1008 18:10:50.785525 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:10:50 crc kubenswrapper[4750]: I1008 18:10:50.785594 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:10:50 crc kubenswrapper[4750]: I1008 18:10:50.785607 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:10:51 crc kubenswrapper[4750]: I1008 18:10:51.148061 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 18:10:51 crc kubenswrapper[4750]: I1008 18:10:51.149321 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:10:51 crc kubenswrapper[4750]: I1008 18:10:51.149369 4750 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:10:51 crc kubenswrapper[4750]: I1008 18:10:51.149385 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:10:51 crc kubenswrapper[4750]: I1008 18:10:51.149414 4750 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 08 18:10:51 crc kubenswrapper[4750]: I1008 18:10:51.408202 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 08 18:10:51 crc kubenswrapper[4750]: I1008 18:10:51.408371 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 18:10:51 crc kubenswrapper[4750]: I1008 18:10:51.409350 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:10:51 crc kubenswrapper[4750]: I1008 18:10:51.409382 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:10:51 crc kubenswrapper[4750]: I1008 18:10:51.409392 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:10:51 crc kubenswrapper[4750]: I1008 18:10:51.769738 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 18:10:51 crc kubenswrapper[4750]: I1008 18:10:51.769944 4750 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 08 18:10:51 crc kubenswrapper[4750]: I1008 18:10:51.769997 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 18:10:51 crc kubenswrapper[4750]: I1008 18:10:51.771154 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:10:51 crc kubenswrapper[4750]: I1008 
18:10:51.771184 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:10:51 crc kubenswrapper[4750]: I1008 18:10:51.771194 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:10:51 crc kubenswrapper[4750]: I1008 18:10:51.786593 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 18:10:51 crc kubenswrapper[4750]: I1008 18:10:51.787419 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:10:51 crc kubenswrapper[4750]: I1008 18:10:51.787463 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:10:51 crc kubenswrapper[4750]: I1008 18:10:51.787477 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:10:51 crc kubenswrapper[4750]: I1008 18:10:51.912114 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 18:10:51 crc kubenswrapper[4750]: I1008 18:10:51.912330 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 18:10:51 crc kubenswrapper[4750]: I1008 18:10:51.913426 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:10:51 crc kubenswrapper[4750]: I1008 18:10:51.913464 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:10:51 crc kubenswrapper[4750]: I1008 18:10:51.913474 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:10:52 crc kubenswrapper[4750]: I1008 18:10:52.330437 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 18:10:52 crc kubenswrapper[4750]: I1008 18:10:52.330647 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 18:10:52 crc kubenswrapper[4750]: I1008 18:10:52.331744 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:10:52 crc kubenswrapper[4750]: I1008 18:10:52.331776 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:10:52 crc kubenswrapper[4750]: I1008 18:10:52.331785 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:10:54 crc kubenswrapper[4750]: I1008 18:10:54.740932 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 18:10:54 crc kubenswrapper[4750]: I1008 18:10:54.741106 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 18:10:54 crc kubenswrapper[4750]: I1008 18:10:54.742218 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:10:54 crc kubenswrapper[4750]: I1008 18:10:54.742271 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:10:54 crc kubenswrapper[4750]: I1008 18:10:54.742280 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:10:54 crc kubenswrapper[4750]: E1008 18:10:54.839001 4750 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 08 18:10:55 crc kubenswrapper[4750]: I1008 18:10:55.330731 4750 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller 
namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 08 18:10:55 crc kubenswrapper[4750]: I1008 18:10:55.330815 4750 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 08 18:10:55 crc kubenswrapper[4750]: I1008 18:10:55.449765 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 18:10:55 crc kubenswrapper[4750]: I1008 18:10:55.449932 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 18:10:55 crc kubenswrapper[4750]: I1008 18:10:55.451435 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:10:55 crc kubenswrapper[4750]: I1008 18:10:55.451471 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:10:55 crc kubenswrapper[4750]: I1008 18:10:55.451483 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:10:55 crc kubenswrapper[4750]: I1008 18:10:55.457702 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 18:10:55 crc kubenswrapper[4750]: I1008 18:10:55.795158 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 18:10:55 crc kubenswrapper[4750]: I1008 18:10:55.796405 4750 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:10:55 crc kubenswrapper[4750]: I1008 18:10:55.796440 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:10:55 crc kubenswrapper[4750]: I1008 18:10:55.796452 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:10:55 crc kubenswrapper[4750]: I1008 18:10:55.799019 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 18:10:55 crc kubenswrapper[4750]: I1008 18:10:55.805980 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 18:10:56 crc kubenswrapper[4750]: I1008 18:10:56.797527 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 18:10:56 crc kubenswrapper[4750]: I1008 18:10:56.798664 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:10:56 crc kubenswrapper[4750]: I1008 18:10:56.798695 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:10:56 crc kubenswrapper[4750]: I1008 18:10:56.798706 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:10:57 crc kubenswrapper[4750]: I1008 18:10:57.799036 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 18:10:57 crc kubenswrapper[4750]: I1008 18:10:57.799945 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:10:57 crc kubenswrapper[4750]: I1008 18:10:57.799962 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 08 18:10:57 crc kubenswrapper[4750]: I1008 18:10:57.799970 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:10:57 crc kubenswrapper[4750]: I1008 18:10:57.822666 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 08 18:10:57 crc kubenswrapper[4750]: I1008 18:10:57.822752 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 18:10:57 crc kubenswrapper[4750]: I1008 18:10:57.823678 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:10:57 crc kubenswrapper[4750]: I1008 18:10:57.823764 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:10:57 crc kubenswrapper[4750]: I1008 18:10:57.823818 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:10:58 crc kubenswrapper[4750]: W1008 18:10:58.111107 4750 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 08 18:10:58 crc kubenswrapper[4750]: I1008 18:10:58.111211 4750 trace.go:236] Trace[1555441844]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (08-Oct-2025 18:10:48.108) (total time: 10002ms): Oct 08 18:10:58 crc kubenswrapper[4750]: Trace[1555441844]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10002ms (18:10:58.111) Oct 08 18:10:58 crc kubenswrapper[4750]: Trace[1555441844]: [10.002781934s] [10.002781934s] END Oct 08 18:10:58 crc kubenswrapper[4750]: E1008 18:10:58.111236 4750 reflector.go:158] 
"Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 08 18:10:58 crc kubenswrapper[4750]: W1008 18:10:58.153302 4750 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 08 18:10:58 crc kubenswrapper[4750]: I1008 18:10:58.153376 4750 trace.go:236] Trace[693781086]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (08-Oct-2025 18:10:48.152) (total time: 10001ms): Oct 08 18:10:58 crc kubenswrapper[4750]: Trace[693781086]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (18:10:58.153) Oct 08 18:10:58 crc kubenswrapper[4750]: Trace[693781086]: [10.001126031s] [10.001126031s] END Oct 08 18:10:58 crc kubenswrapper[4750]: E1008 18:10:58.153395 4750 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 08 18:10:58 crc kubenswrapper[4750]: W1008 18:10:58.161880 4750 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 08 18:10:58 crc kubenswrapper[4750]: I1008 18:10:58.161939 4750 trace.go:236] Trace[760085648]: "Reflector ListAndWatch" 
name:k8s.io/client-go/informers/factory.go:160 (08-Oct-2025 18:10:48.160) (total time: 10000ms): Oct 08 18:10:58 crc kubenswrapper[4750]: Trace[760085648]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10000ms (18:10:58.161) Oct 08 18:10:58 crc kubenswrapper[4750]: Trace[760085648]: [10.000959014s] [10.000959014s] END Oct 08 18:10:58 crc kubenswrapper[4750]: E1008 18:10:58.161954 4750 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 08 18:10:58 crc kubenswrapper[4750]: W1008 18:10:58.419689 4750 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 08 18:10:58 crc kubenswrapper[4750]: I1008 18:10:58.419777 4750 trace.go:236] Trace[486432657]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (08-Oct-2025 18:10:48.418) (total time: 10001ms): Oct 08 18:10:58 crc kubenswrapper[4750]: Trace[486432657]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (18:10:58.419) Oct 08 18:10:58 crc kubenswrapper[4750]: Trace[486432657]: [10.001705722s] [10.001705722s] END Oct 08 18:10:58 crc kubenswrapper[4750]: E1008 18:10:58.419799 4750 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get 
\"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 08 18:10:58 crc kubenswrapper[4750]: I1008 18:10:58.681940 4750 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Oct 08 18:10:58 crc kubenswrapper[4750]: I1008 18:10:58.748092 4750 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 08 18:10:58 crc kubenswrapper[4750]: I1008 18:10:58.748410 4750 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 08 18:10:58 crc kubenswrapper[4750]: I1008 18:10:58.766418 4750 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 08 18:10:58 crc kubenswrapper[4750]: I1008 18:10:58.766733 4750 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with 
statuscode: 403"
Oct 08 18:11:01 crc kubenswrapper[4750]: I1008 18:11:01.665217 4750 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Oct 08 18:11:01 crc kubenswrapper[4750]: I1008 18:11:01.774857 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 08 18:11:01 crc kubenswrapper[4750]: I1008 18:11:01.775055 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 08 18:11:01 crc kubenswrapper[4750]: I1008 18:11:01.780136 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 18:11:01 crc kubenswrapper[4750]: I1008 18:11:01.780197 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 18:11:01 crc kubenswrapper[4750]: I1008 18:11:01.780217 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 18:11:01 crc kubenswrapper[4750]: I1008 18:11:01.780786 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 08 18:11:01 crc kubenswrapper[4750]: I1008 18:11:01.808147 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 08 18:11:01 crc kubenswrapper[4750]: I1008 18:11:01.809317 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 18:11:01 crc kubenswrapper[4750]: I1008 18:11:01.809364 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 18:11:01 crc kubenswrapper[4750]: I1008 18:11:01.809377 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 18:11:03 crc kubenswrapper[4750]: E1008 18:11:03.758159 4750 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s"
Oct 08 18:11:03 crc kubenswrapper[4750]: I1008 18:11:03.760735 4750 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Oct 08 18:11:03 crc kubenswrapper[4750]: E1008 18:11:03.761771 4750 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Oct 08 18:11:03 crc kubenswrapper[4750]: I1008 18:11:03.803261 4750 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:54414->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Oct 08 18:11:03 crc kubenswrapper[4750]: I1008 18:11:03.803313 4750 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:54414->192.168.126.11:17697: read: connection reset by peer"
Oct 08 18:11:03 crc kubenswrapper[4750]: I1008 18:11:03.803660 4750 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Oct 08 18:11:03 crc kubenswrapper[4750]: I1008 18:11:03.803722 4750 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Oct 08 18:11:03 crc kubenswrapper[4750]: I1008 18:11:03.804066 4750 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Oct 08 18:11:03 crc kubenswrapper[4750]: I1008 18:11:03.804130 4750 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Oct 08 18:11:03 crc kubenswrapper[4750]: I1008 18:11:03.828518 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 08 18:11:03 crc kubenswrapper[4750]: I1008 18:11:03.828795 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 08 18:11:03 crc kubenswrapper[4750]: I1008 18:11:03.830109 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 18:11:03 crc kubenswrapper[4750]: I1008 18:11:03.830167 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 18:11:03 crc kubenswrapper[4750]: I1008 18:11:03.830178 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 18:11:03 crc kubenswrapper[4750]: I1008 18:11:03.833774 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready"
pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.059565 4750 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.069141 4750 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.454422 4750 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.671210 4750 apiserver.go:52] "Watching apiserver"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.673432 4750 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.673707 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-hdvcg","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"]
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.674071 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.674198 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.674228 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hdvcg"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.674169 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.674223 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.674127 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.674116 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 08 18:11:04 crc kubenswrapper[4750]: E1008 18:11:04.674856 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 08 18:11:04 crc kubenswrapper[4750]: E1008 18:11:04.674914 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 08 18:11:04 crc kubenswrapper[4750]: E1008 18:11:04.675020 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.677322 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.677490 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.677539 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.677664 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.677768 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.678336 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.678385 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.678672 4750 reflector.go:368] Caches populated for *v1.ConfigMap from
object-"openshift-network-node-identity"/"env-overrides"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.678814 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.678864 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.679363 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.679784 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.688696 4750 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.695353 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.704045 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hdvcg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31a78d19-68f8-46b7-9c53-2bbda2930e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84vv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hdvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.704836 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-2x8kt"]
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.707103 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-mzb5c"]
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.709877 4750 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2x8kt"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.709977 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-grddb"]
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.711144 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-rl7f4"]
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.711197 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-mzb5c"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.712097 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-grddb"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.716324 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.716528 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.716703 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.716767 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.716872 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.716973 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.720296 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.720473 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.720576 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.720595 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.720714 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.720766 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.720782 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.720896 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.720870 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.720951 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.720965 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.721191 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.721340 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.721507 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.721619 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008
18:11:04.748606 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.759925 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.766211 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.766261 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod
\"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.766286 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.766309 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.766332 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.766354 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.766375 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.766397 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.766420 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.766443 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.766487 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.766508 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.766572 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.766618 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.766638 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.766672 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.766695 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert".
PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.766689 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.766772 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.766794 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.766818 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.766841 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.766861 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.766864 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.766882 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.766899 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.766915 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.766937 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.766952 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.766970 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767025 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767042 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767059 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Oct 08 18:11:04 crc
kubenswrapper[4750]: I1008 18:11:04.767082 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767101 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767118 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767134 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767150 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767183 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: 
\"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767219 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767235 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767252 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767248 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767298 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767295 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767316 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767334 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767349 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767365 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767380 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767389 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767396 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767456 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767476 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767494 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767518 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767520 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767539 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767582 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767600 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767616 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod 
\"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767633 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767648 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767668 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767685 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767733 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767749 4750 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767764 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767778 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767789 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767799 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767838 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767865 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767888 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767921 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767945 4750 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767966 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767986 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768001 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768026 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768045 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: 
\"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768061 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768078 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768093 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768108 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768125 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768144 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: 
\"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768161 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768183 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768200 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768216 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768248 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 08 
18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768265 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768281 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768296 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768311 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768326 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768346 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod 
\"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768373 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768388 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768403 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768417 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768433 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768479 4750 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768495 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768511 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768528 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768544 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768575 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod 
\"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768592 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768608 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768636 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768651 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768666 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768768 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" 
(UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768786 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768802 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768819 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768835 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768851 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768866 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768880 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768896 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768912 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768928 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768943 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768959 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768980 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.769011 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.769030 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.769130 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 08 18:11:04 crc 
kubenswrapper[4750]: I1008 18:11:04.769147 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.769163 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.769178 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.769212 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.769228 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.769254 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: 
\"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.769269 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.769286 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.769301 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.769317 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.769335 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 08 18:11:04 crc 
kubenswrapper[4750]: I1008 18:11:04.769352 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.769367 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.769382 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.769397 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.769525 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.769546 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.769807 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.769824 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.769842 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.769859 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.769875 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.769921 4750 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.770045 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.770063 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.770079 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.770095 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.770111 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod 
\"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.770126 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.770141 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.770157 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.770307 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.770375 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.770403 4750 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.770419 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.770440 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.770459 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.770475 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.770516 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.770532 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.770567 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.770585 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.770617 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.770645 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.770674 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.770707 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.781353 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.781506 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.781644 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.781685 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.781758 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.781783 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.781808 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.781856 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.781880 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.781905 4750 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.781961 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.781994 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.782016 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.782121 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.797602 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767789 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767230 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767879 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767946 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767969 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.767996 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768111 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768116 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768151 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768191 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768587 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.768772 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.769030 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.802690 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.769302 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.769380 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.769530 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.769796 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.770739 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.773021 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.773180 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.773428 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.773998 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.774154 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.774272 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.774763 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.775127 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.775823 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.802823 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.775991 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.776503 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.776929 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.777336 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.777495 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.777701 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.777772 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.777828 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.777950 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.777961 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.778366 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.778421 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.778429 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.778722 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.778848 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.778945 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.778963 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.779072 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.779215 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.779194 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.779459 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.803018 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.779592 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.803041 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.779790 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.804365 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.779805 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.779882 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.780008 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.780060 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.780138 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.780151 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.780244 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.780269 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.780353 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.780414 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.780657 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.780623 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.780691 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.780924 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.781159 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.782482 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.782512 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.782775 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.783056 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.783080 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.783262 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.783649 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.783721 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.784879 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.785443 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.785849 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.786195 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.786495 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.786512 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.786529 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.786700 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.786748 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.786854 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.787026 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.787111 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.787170 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.789856 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.790031 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.790625 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert".
PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.809117 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.790726 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.790897 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.791158 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.791424 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.791618 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.791618 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.791857 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.791891 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.792109 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 18:11:04 crc kubenswrapper[4750]: E1008 18:11:04.792193 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:11:05.283727948 +0000 UTC m=+21.196698961 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.810037 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.810078 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.810098 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.810116 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.810135 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for
volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.810163 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.810192 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.810211 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.810237 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.810260 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.810297 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.810322 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.810409 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.810456 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.810682 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.810716 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.792538 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.792520 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.810738 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.810767 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.810827 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-host-var-lib-kubelet\") pod \"multus-mzb5c\" (UID: \"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.810849 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-multus-conf-dir\") pod \"multus-mzb5c\" (UID: \"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c"
Oct 08 18:11:04 crc
kubenswrapper[4750]: I1008 18:11:04.810866 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/25d63a44-9fd7-4c19-8715-6ddec94d1806-ovn-node-metrics-cert\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.810885 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/31a78d19-68f8-46b7-9c53-2bbda2930e49-hosts-file\") pod \"node-resolver-hdvcg\" (UID: \"31a78d19-68f8-46b7-9c53-2bbda2930e49\") " pod="openshift-dns/node-resolver-hdvcg"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.810902 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84vv9\" (UniqueName: \"kubernetes.io/projected/31a78d19-68f8-46b7-9c53-2bbda2930e49-kube-api-access-84vv9\") pod \"node-resolver-hdvcg\" (UID: \"31a78d19-68f8-46b7-9c53-2bbda2930e49\") " pod="openshift-dns/node-resolver-hdvcg"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.810926 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.810946 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9745a747-29eb-473f-bdb1-b526e1fe1445-os-release\") pod \"multus-additional-cni-plugins-2x8kt\" (UID: \"9745a747-29eb-473f-bdb1-b526e1fe1445\") " pod="openshift-multus/multus-additional-cni-plugins-2x8kt"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.810966 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.810983 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-host-run-netns\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.811002 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.811019 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-os-release\") pod \"multus-mzb5c\" (UID: \"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.811036 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-cni-binary-copy\") pod \"multus-mzb5c\" (UID: \"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.811053 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.811071 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-multus-socket-dir-parent\") pod \"multus-mzb5c\" (UID: \"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.811088 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4-proxy-tls\") pod \"machine-config-daemon-grddb\" (UID: \"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\") " pod="openshift-machine-config-operator/machine-config-daemon-grddb"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.811103 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4-mcd-auth-proxy-config\") pod \"machine-config-daemon-grddb\" (UID: \"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\") " pod="openshift-machine-config-operator/machine-config-daemon-grddb"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.811140 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-node-log\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.811158 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.811177 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4hmk\" (UniqueName: \"kubernetes.io/projected/9745a747-29eb-473f-bdb1-b526e1fe1445-kube-api-access-q4hmk\") pod \"multus-additional-cni-plugins-2x8kt\" (UID: \"9745a747-29eb-473f-bdb1-b526e1fe1445\") " pod="openshift-multus/multus-additional-cni-plugins-2x8kt"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.811194 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-system-cni-dir\") pod \"multus-mzb5c\" (UID: \"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.811209 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4-rootfs\") pod \"machine-config-daemon-grddb\" (UID: \"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\") " pod="openshift-machine-config-operator/machine-config-daemon-grddb"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.811231 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume
started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-host-cni-bin\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.811245 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-host-run-netns\") pod \"multus-mzb5c\" (UID: \"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.811262 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-multus-daemon-config\") pod \"multus-mzb5c\" (UID: \"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.811276 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-host-slash\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.811290 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/25d63a44-9fd7-4c19-8715-6ddec94d1806-ovnkube-config\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.811309 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-etc-openvswitch\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.811329 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.811347 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.811369 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9745a747-29eb-473f-bdb1-b526e1fe1445-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2x8kt\" (UID: \"9745a747-29eb-473f-bdb1-b526e1fe1445\") " pod="openshift-multus/multus-additional-cni-plugins-2x8kt"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.811388 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-host-run-multus-certs\") pod \"multus-mzb5c\" (UID: \"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.811406 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9745a747-29eb-473f-bdb1-b526e1fe1445-system-cni-dir\") pod \"multus-additional-cni-plugins-2x8kt\" (UID: \"9745a747-29eb-473f-bdb1-b526e1fe1445\") " pod="openshift-multus/multus-additional-cni-plugins-2x8kt"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.811423 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-systemd-units\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.811439 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-host-run-ovn-kubernetes\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.811456 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-hostroot\") pod \"multus-mzb5c\" (UID: \"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.811470 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-log-socket\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.811488 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.811510 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.811531 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9745a747-29eb-473f-bdb1-b526e1fe1445-cnibin\") pod \"multus-additional-cni-plugins-2x8kt\" (UID: \"9745a747-29eb-473f-bdb1-b526e1fe1445\") " pod="openshift-multus/multus-additional-cni-plugins-2x8kt"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.811567 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9745a747-29eb-473f-bdb1-b526e1fe1445-cni-binary-copy\") pod \"multus-additional-cni-plugins-2x8kt\" (UID: \"9745a747-29eb-473f-bdb1-b526e1fe1445\") " pod="openshift-multus/multus-additional-cni-plugins-2x8kt"
Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.811585 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-etc-kubernetes\") pod \"multus-mzb5c\" (UID:
\"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.811697 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-var-lib-openvswitch\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.811721 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-run-ovn\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.811740 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.811766 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.811783 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-host-cni-netd\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.811799 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sxl9\" (UniqueName: \"kubernetes.io/projected/25d63a44-9fd7-4c19-8715-6ddec94d1806-kube-api-access-8sxl9\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.811825 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.792575 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.792595 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.792637 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.794371 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.794661 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.795356 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.795985 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.796059 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.796454 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.796479 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.796827 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.796930 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.796998 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.796924 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.797051 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.797304 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.797371 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.796895 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.797051 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.797571 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.796539 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.797799 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.799301 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.799369 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.799481 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.799504 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.800804 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.800827 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.800837 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.800915 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.801105 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.801355 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.801442 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.801662 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.801789 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.801792 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.801840 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.802069 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.802189 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.802452 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.803244 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.803262 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.804880 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.805112 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.805225 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.805670 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.806025 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.811019 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.811034 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.811283 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.811473 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.811801 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.811818 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.812593 4750 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.812783 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-cnibin\") pod \"multus-mzb5c\" (UID: \"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.812807 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drhq7\" (UniqueName: \"kubernetes.io/projected/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-kube-api-access-drhq7\") pod \"multus-mzb5c\" (UID: \"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.812844 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-host-kubelet\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.812861 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-run-systemd\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.812877 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/25d63a44-9fd7-4c19-8715-6ddec94d1806-ovnkube-script-lib\") pod \"ovnkube-node-rl7f4\" 
(UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.812902 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.812919 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-host-run-k8s-cni-cncf-io\") pod \"multus-mzb5c\" (UID: \"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.812934 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-host-var-lib-cni-multus\") pod \"multus-mzb5c\" (UID: \"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.812951 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/25d63a44-9fd7-4c19-8715-6ddec94d1806-env-overrides\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.812967 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") 
pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.812987 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.813004 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9745a747-29eb-473f-bdb1-b526e1fe1445-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2x8kt\" (UID: \"9745a747-29eb-473f-bdb1-b526e1fe1445\") " pod="openshift-multus/multus-additional-cni-plugins-2x8kt" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.813034 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-multus-cni-dir\") pod \"multus-mzb5c\" (UID: \"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.813064 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-host-var-lib-cni-bin\") pod \"multus-mzb5c\" (UID: \"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.813081 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9c8m\" (UniqueName: 
\"kubernetes.io/projected/f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4-kube-api-access-r9c8m\") pod \"machine-config-daemon-grddb\" (UID: \"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\") " pod="openshift-machine-config-operator/machine-config-daemon-grddb" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.813096 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-run-openvswitch\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.813479 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 08 18:11:04 crc kubenswrapper[4750]: E1008 18:11:04.814189 4750 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 18:11:04 crc kubenswrapper[4750]: E1008 18:11:04.814979 4750 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 18:11:04 crc kubenswrapper[4750]: E1008 18:11:04.815230 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 18:11:05.315206671 +0000 UTC m=+21.228177684 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 18:11:04 crc kubenswrapper[4750]: E1008 18:11:04.815260 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 18:11:05.315243992 +0000 UTC m=+21.228215005 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815497 4750 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815513 4750 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815524 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815536 4750 reconciler_common.go:293] "Volume detached 
for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815546 4750 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815572 4750 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815581 4750 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815590 4750 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815602 4750 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815615 4750 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815625 4750 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815635 4750 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815645 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815654 4750 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815663 4750 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815672 4750 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815681 4750 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815690 4750 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815698 4750 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815708 4750 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815717 4750 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815725 4750 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815737 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815746 4750 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815754 4750 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" 
DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815763 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815772 4750 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815780 4750 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815789 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815798 4750 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815807 4750 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815816 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815825 4750 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815834 4750 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815844 4750 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815854 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815870 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815880 4750 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815888 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815897 4750 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815905 4750 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815914 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815923 4750 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815931 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815940 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815950 4750 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815959 4750 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815968 4750 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815976 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815985 4750 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.815994 4750 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.816002 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.816011 4750 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.816819 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod 
\"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.819156 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.819343 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.819390 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.819844 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.819875 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.822298 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.822372 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.822607 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.822640 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.824752 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.824908 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.825391 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.828324 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.828387 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.828615 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 08 18:11:04 crc kubenswrapper[4750]: E1008 18:11:04.828662 4750 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 18:11:04 crc kubenswrapper[4750]: E1008 18:11:04.828683 4750 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 18:11:04 crc kubenswrapper[4750]: E1008 18:11:04.828696 4750 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 18:11:04 crc kubenswrapper[4750]: E1008 18:11:04.828752 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-08 18:11:05.328733463 +0000 UTC m=+21.241704546 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.829109 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.829584 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.830256 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.830352 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.830357 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.830415 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.831434 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.831718 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.832166 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: E1008 18:11:04.833976 4750 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 18:11:04 crc kubenswrapper[4750]: E1008 18:11:04.834004 4750 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 18:11:04 crc kubenswrapper[4750]: E1008 18:11:04.834021 4750 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 18:11:04 crc kubenswrapper[4750]: E1008 18:11:04.834073 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-08 18:11:05.334052833 +0000 UTC m=+21.247023906 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.834429 4750 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a5b4f554ff77b65e8ff5a3cb8221e1c238434da5d49d308ec6684fcb2d333384" exitCode=255 Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.834460 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a5b4f554ff77b65e8ff5a3cb8221e1c238434da5d49d308ec6684fcb2d333384"} Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.837029 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.838004 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.838134 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.838210 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.842792 4750 scope.go:117] "RemoveContainer" containerID="a5b4f554ff77b65e8ff5a3cb8221e1c238434da5d49d308ec6684fcb2d333384" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.843771 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.843811 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 
18:11:04.846260 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.847765 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.847916 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.870582 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzb5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzb5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.891885 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2x8kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9745a747-29eb-473f-bdb1-b526e1fe1445\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2x8kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.914866 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d63a44-9fd7-4c19-8715-6ddec94d1806\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rl7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.916875 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/25d63a44-9fd7-4c19-8715-6ddec94d1806-ovnkube-script-lib\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.916904 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-cnibin\") pod \"multus-mzb5c\" (UID: \"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.916922 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drhq7\" (UniqueName: \"kubernetes.io/projected/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-kube-api-access-drhq7\") pod \"multus-mzb5c\" (UID: \"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.916937 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-host-kubelet\") pod \"ovnkube-node-rl7f4\" (UID: 
\"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.916951 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-run-systemd\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.917003 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-host-run-k8s-cni-cncf-io\") pod \"multus-mzb5c\" (UID: \"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.917023 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-host-var-lib-cni-multus\") pod \"multus-mzb5c\" (UID: \"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.917038 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/25d63a44-9fd7-4c19-8715-6ddec94d1806-env-overrides\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.917053 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-run-openvswitch\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.917069 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.917092 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9745a747-29eb-473f-bdb1-b526e1fe1445-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2x8kt\" (UID: \"9745a747-29eb-473f-bdb1-b526e1fe1445\") " pod="openshift-multus/multus-additional-cni-plugins-2x8kt" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.917119 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-multus-cni-dir\") pod \"multus-mzb5c\" (UID: \"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.917096 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-cnibin\") pod \"multus-mzb5c\" (UID: \"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.917177 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-host-var-lib-cni-bin\") pod \"multus-mzb5c\" (UID: \"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 
18:11:04.917134 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-host-var-lib-cni-bin\") pod \"multus-mzb5c\" (UID: \"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.917233 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9c8m\" (UniqueName: \"kubernetes.io/projected/f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4-kube-api-access-r9c8m\") pod \"machine-config-daemon-grddb\" (UID: \"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\") " pod="openshift-machine-config-operator/machine-config-daemon-grddb" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.917252 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-host-var-lib-kubelet\") pod \"multus-mzb5c\" (UID: \"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.917291 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-multus-conf-dir\") pod \"multus-mzb5c\" (UID: \"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.917310 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/25d63a44-9fd7-4c19-8715-6ddec94d1806-ovn-node-metrics-cert\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.917322 4750 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-host-kubelet\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.917326 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/31a78d19-68f8-46b7-9c53-2bbda2930e49-hosts-file\") pod \"node-resolver-hdvcg\" (UID: \"31a78d19-68f8-46b7-9c53-2bbda2930e49\") " pod="openshift-dns/node-resolver-hdvcg" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.917372 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/31a78d19-68f8-46b7-9c53-2bbda2930e49-hosts-file\") pod \"node-resolver-hdvcg\" (UID: \"31a78d19-68f8-46b7-9c53-2bbda2930e49\") " pod="openshift-dns/node-resolver-hdvcg" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.917373 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84vv9\" (UniqueName: \"kubernetes.io/projected/31a78d19-68f8-46b7-9c53-2bbda2930e49-kube-api-access-84vv9\") pod \"node-resolver-hdvcg\" (UID: \"31a78d19-68f8-46b7-9c53-2bbda2930e49\") " pod="openshift-dns/node-resolver-hdvcg" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.917414 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9745a747-29eb-473f-bdb1-b526e1fe1445-os-release\") pod \"multus-additional-cni-plugins-2x8kt\" (UID: \"9745a747-29eb-473f-bdb1-b526e1fe1445\") " pod="openshift-multus/multus-additional-cni-plugins-2x8kt" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.917450 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-host-run-netns\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.917467 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-os-release\") pod \"multus-mzb5c\" (UID: \"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.917481 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-cni-binary-copy\") pod \"multus-mzb5c\" (UID: \"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.917496 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.917517 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-host-run-k8s-cni-cncf-io\") pod \"multus-mzb5c\" (UID: \"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.917526 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-multus-socket-dir-parent\") pod \"multus-mzb5c\" (UID: 
\"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.917541 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-host-var-lib-cni-multus\") pod \"multus-mzb5c\" (UID: \"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.917543 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4-proxy-tls\") pod \"machine-config-daemon-grddb\" (UID: \"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\") " pod="openshift-machine-config-operator/machine-config-daemon-grddb" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.917587 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4-mcd-auth-proxy-config\") pod \"machine-config-daemon-grddb\" (UID: \"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\") " pod="openshift-machine-config-operator/machine-config-daemon-grddb" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.917603 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-node-log\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.917619 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4hmk\" (UniqueName: \"kubernetes.io/projected/9745a747-29eb-473f-bdb1-b526e1fe1445-kube-api-access-q4hmk\") pod \"multus-additional-cni-plugins-2x8kt\" (UID: 
\"9745a747-29eb-473f-bdb1-b526e1fe1445\") " pod="openshift-multus/multus-additional-cni-plugins-2x8kt" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.917621 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-host-run-netns\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.917652 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-system-cni-dir\") pod \"multus-mzb5c\" (UID: \"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.917683 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-system-cni-dir\") pod \"multus-mzb5c\" (UID: \"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.917697 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4-rootfs\") pod \"machine-config-daemon-grddb\" (UID: \"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\") " pod="openshift-machine-config-operator/machine-config-daemon-grddb" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.917739 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-host-cni-bin\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc 
kubenswrapper[4750]: I1008 18:11:04.917749 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-os-release\") pod \"multus-mzb5c\" (UID: \"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.917775 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-multus-conf-dir\") pod \"multus-mzb5c\" (UID: \"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.917757 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-host-run-netns\") pod \"multus-mzb5c\" (UID: \"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.917826 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-multus-daemon-config\") pod \"multus-mzb5c\" (UID: \"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.917849 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-host-slash\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.917870 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/25d63a44-9fd7-4c19-8715-6ddec94d1806-ovnkube-config\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.917890 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-host-run-multus-certs\") pod \"multus-mzb5c\" (UID: \"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.917910 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-etc-openvswitch\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.917941 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/25d63a44-9fd7-4c19-8715-6ddec94d1806-env-overrides\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.917937 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9745a747-29eb-473f-bdb1-b526e1fe1445-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2x8kt\" (UID: \"9745a747-29eb-473f-bdb1-b526e1fe1445\") " pod="openshift-multus/multus-additional-cni-plugins-2x8kt" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.918012 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-host-var-lib-kubelet\") pod \"multus-mzb5c\" (UID: \"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.917802 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-host-run-netns\") pod \"multus-mzb5c\" (UID: \"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.918139 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-run-openvswitch\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.918181 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.918525 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-cni-binary-copy\") pod \"multus-mzb5c\" (UID: \"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.918575 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-multus-daemon-config\") pod \"multus-mzb5c\" (UID: 
\"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.917735 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/25d63a44-9fd7-4c19-8715-6ddec94d1806-ovnkube-script-lib\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.917501 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-run-systemd\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.918706 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4-rootfs\") pod \"machine-config-daemon-grddb\" (UID: \"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\") " pod="openshift-machine-config-operator/machine-config-daemon-grddb" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.918752 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9745a747-29eb-473f-bdb1-b526e1fe1445-os-release\") pod \"multus-additional-cni-plugins-2x8kt\" (UID: \"9745a747-29eb-473f-bdb1-b526e1fe1445\") " pod="openshift-multus/multus-additional-cni-plugins-2x8kt" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.918768 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-host-cni-bin\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 
18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.918782 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-host-slash\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.918814 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-host-run-multus-certs\") pod \"multus-mzb5c\" (UID: \"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.918821 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9745a747-29eb-473f-bdb1-b526e1fe1445-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2x8kt\" (UID: \"9745a747-29eb-473f-bdb1-b526e1fe1445\") " pod="openshift-multus/multus-additional-cni-plugins-2x8kt" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.918827 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-etc-openvswitch\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.918844 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9745a747-29eb-473f-bdb1-b526e1fe1445-system-cni-dir\") pod \"multus-additional-cni-plugins-2x8kt\" (UID: \"9745a747-29eb-473f-bdb1-b526e1fe1445\") " pod="openshift-multus/multus-additional-cni-plugins-2x8kt" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.918861 4750 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-systemd-units\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.918894 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-host-run-ovn-kubernetes\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.918913 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-hostroot\") pod \"multus-mzb5c\" (UID: \"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.918992 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-run-ovn\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.919015 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-log-socket\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.919045 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-multus-cni-dir\") pod \"multus-mzb5c\" (UID: \"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.919076 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9745a747-29eb-473f-bdb1-b526e1fe1445-cnibin\") pod \"multus-additional-cni-plugins-2x8kt\" (UID: \"9745a747-29eb-473f-bdb1-b526e1fe1445\") " pod="openshift-multus/multus-additional-cni-plugins-2x8kt" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.919092 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9745a747-29eb-473f-bdb1-b526e1fe1445-cni-binary-copy\") pod \"multus-additional-cni-plugins-2x8kt\" (UID: \"9745a747-29eb-473f-bdb1-b526e1fe1445\") " pod="openshift-multus/multus-additional-cni-plugins-2x8kt" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.919108 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-etc-kubernetes\") pod \"multus-mzb5c\" (UID: \"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.919141 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-var-lib-openvswitch\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.919159 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.919182 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-hostroot\") pod \"multus-mzb5c\" (UID: \"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.919218 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-node-log\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.919233 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.919297 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-log-socket\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.919337 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-multus-socket-dir-parent\") pod \"multus-mzb5c\" (UID: 
\"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.919355 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9745a747-29eb-473f-bdb1-b526e1fe1445-system-cni-dir\") pod \"multus-additional-cni-plugins-2x8kt\" (UID: \"9745a747-29eb-473f-bdb1-b526e1fe1445\") " pod="openshift-multus/multus-additional-cni-plugins-2x8kt" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.919371 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-systemd-units\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.919386 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-host-run-ovn-kubernetes\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.919421 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-run-ovn\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.919451 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-etc-kubernetes\") pod \"multus-mzb5c\" (UID: \"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c" Oct 08 18:11:04 crc 
kubenswrapper[4750]: I1008 18:11:04.919473 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9745a747-29eb-473f-bdb1-b526e1fe1445-cnibin\") pod \"multus-additional-cni-plugins-2x8kt\" (UID: \"9745a747-29eb-473f-bdb1-b526e1fe1445\") " pod="openshift-multus/multus-additional-cni-plugins-2x8kt" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.919602 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/25d63a44-9fd7-4c19-8715-6ddec94d1806-ovnkube-config\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.920013 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9745a747-29eb-473f-bdb1-b526e1fe1445-cni-binary-copy\") pod \"multus-additional-cni-plugins-2x8kt\" (UID: \"9745a747-29eb-473f-bdb1-b526e1fe1445\") " pod="openshift-multus/multus-additional-cni-plugins-2x8kt" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.920071 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.920102 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-var-lib-openvswitch\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc 
kubenswrapper[4750]: I1008 18:11:04.920133 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-host-cni-netd\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.920151 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9745a747-29eb-473f-bdb1-b526e1fe1445-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2x8kt\" (UID: \"9745a747-29eb-473f-bdb1-b526e1fe1445\") " pod="openshift-multus/multus-additional-cni-plugins-2x8kt" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.920170 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sxl9\" (UniqueName: \"kubernetes.io/projected/25d63a44-9fd7-4c19-8715-6ddec94d1806-kube-api-access-8sxl9\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.920188 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-host-cni-netd\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.920370 4750 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.920387 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.920411 4750 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.920423 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.920434 4750 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.920445 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.920457 4750 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.920468 4750 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.920480 4750 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node 
\"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.920491 4750 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.920503 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.920514 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.920526 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.920538 4750 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.920568 4750 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.920582 4750 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc 
kubenswrapper[4750]: I1008 18:11:04.920593 4750 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.920605 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.920617 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.920628 4750 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.920639 4750 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.920650 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.920662 4750 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.920672 4750 reconciler_common.go:293] "Volume detached for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.920683 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.920697 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4-mcd-auth-proxy-config\") pod \"machine-config-daemon-grddb\" (UID: \"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\") " pod="openshift-machine-config-operator/machine-config-daemon-grddb" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.920694 4750 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.920799 4750 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.920810 4750 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.920819 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.920828 4750 reconciler_common.go:293] "Volume detached for volume 
\"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.920843 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.920855 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921076 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921096 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921107 4750 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921118 4750 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921138 4750 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921148 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921166 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921175 4750 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921189 4750 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921199 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921208 4750 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921221 4750 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921242 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921251 4750 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921261 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921274 4750 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921283 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921291 4750 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921300 4750 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 
08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921313 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921322 4750 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921331 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921340 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921354 4750 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921364 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921374 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: 
I1008 18:11:04.921387 4750 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921396 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921404 4750 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921413 4750 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921425 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921435 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921452 4750 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921462 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921473 4750 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921484 4750 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921493 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921492 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/25d63a44-9fd7-4c19-8715-6ddec94d1806-ovn-node-metrics-cert\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921503 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921515 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921525 4750 reconciler_common.go:293] "Volume detached for 
volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921535 4750 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921560 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921571 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921580 4750 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921589 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921601 4750 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921611 4750 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921620 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921629 4750 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921642 4750 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921651 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921661 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921673 4750 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921682 4750 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on 
node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921692 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921701 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921715 4750 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921724 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921765 4750 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921785 4750 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921820 4750 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 
crc kubenswrapper[4750]: I1008 18:11:04.921830 4750 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921851 4750 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921863 4750 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921874 4750 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921892 4750 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921905 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921917 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921927 4750 reconciler_common.go:293] "Volume detached for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921940 4750 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921949 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921959 4750 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921968 4750 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921983 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.921993 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.922003 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on 
node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.922004 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4-proxy-tls\") pod \"machine-config-daemon-grddb\" (UID: \"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\") " pod="openshift-machine-config-operator/machine-config-daemon-grddb" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.922016 4750 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.922025 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.922036 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.922057 4750 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.922073 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.922095 4750 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" 
DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.922109 4750 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.922122 4750 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.922134 4750 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.922153 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.922166 4750 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.922180 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.922197 4750 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.922209 4750 reconciler_common.go:293] "Volume 
detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.922223 4750 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.922236 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.922254 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.922269 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.922282 4750 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.922295 4750 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.922313 4750 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.922327 4750 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.922340 4750 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.922358 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.922372 4750 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.922384 4750 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.922400 4750 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.922423 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.922437 4750 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.922449 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.922462 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.922480 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.922494 4750 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.931937 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.938766 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drhq7\" (UniqueName: \"kubernetes.io/projected/cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444-kube-api-access-drhq7\") pod \"multus-mzb5c\" (UID: \"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\") " pod="openshift-multus/multus-mzb5c" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.939852 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9c8m\" (UniqueName: \"kubernetes.io/projected/f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4-kube-api-access-r9c8m\") pod \"machine-config-daemon-grddb\" (UID: \"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\") " pod="openshift-machine-config-operator/machine-config-daemon-grddb" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.941419 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84vv9\" (UniqueName: \"kubernetes.io/projected/31a78d19-68f8-46b7-9c53-2bbda2930e49-kube-api-access-84vv9\") pod \"node-resolver-hdvcg\" (UID: \"31a78d19-68f8-46b7-9c53-2bbda2930e49\") " pod="openshift-dns/node-resolver-hdvcg" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.942813 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sxl9\" (UniqueName: \"kubernetes.io/projected/25d63a44-9fd7-4c19-8715-6ddec94d1806-kube-api-access-8sxl9\") pod \"ovnkube-node-rl7f4\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.944331 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4hmk\" (UniqueName: \"kubernetes.io/projected/9745a747-29eb-473f-bdb1-b526e1fe1445-kube-api-access-q4hmk\") pod \"multus-additional-cni-plugins-2x8kt\" (UID: \"9745a747-29eb-473f-bdb1-b526e1fe1445\") " 
pod="openshift-multus/multus-additional-cni-plugins-2x8kt" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.947967 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.956468 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.964863 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.973683 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hdvcg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31a78d19-68f8-46b7-9c53-2bbda2930e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84vv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hdvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.984438 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.993182 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with 
unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-grddb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 18:11:04 crc kubenswrapper[4750]: I1008 18:11:04.995623 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.001587 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.004045 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 18:11:05 crc kubenswrapper[4750]: W1008 18:11:05.004511 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-eba0f5c9ba81727f20161857aaf0837be17cf3a170417c81f6f8cb6b8c561c44 WatchSource:0}: Error finding container eba0f5c9ba81727f20161857aaf0837be17cf3a170417c81f6f8cb6b8c561c44: Status 404 returned error can't find the container with id eba0f5c9ba81727f20161857aaf0837be17cf3a170417c81f6f8cb6b8c561c44 Oct 08 18:11:05 crc kubenswrapper[4750]: W1008 18:11:05.013357 4750 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-eb62229166dcf8ac1745f352cde0d4084a456518dd6ae1960e0ca38470907dcf WatchSource:0}: Error finding container eb62229166dcf8ac1745f352cde0d4084a456518dd6ae1960e0ca38470907dcf: Status 404 returned error can't find the container with id eb62229166dcf8ac1745f352cde0d4084a456518dd6ae1960e0ca38470907dcf Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.014510 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzb5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzb5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.020496 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hdvcg" Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.026064 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2x8kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9745a747-29eb-473f-bdb1-b526e1fe1445\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2x8kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.027457 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.036797 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-mzb5c" Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.042993 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d63a44-9fd7-4c19-8715-6ddec94d1806\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rl7f4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.045724 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2x8kt" Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.052069 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-grddb" Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.052302 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 18:11:05 crc kubenswrapper[4750]: W1008 18:11:05.054882 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-c273c9ec10e1f9056b5f0b29e2fe5018dee79e78c56b2a51ed1d04f7f8a9a15d WatchSource:0}: Error finding container c273c9ec10e1f9056b5f0b29e2fe5018dee79e78c56b2a51ed1d04f7f8a9a15d: Status 404 returned error can't find the container with id c273c9ec10e1f9056b5f0b29e2fe5018dee79e78c56b2a51ed1d04f7f8a9a15d Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.058697 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.060540 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hdvcg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31a78d19-68f8-46b7-9c53-2bbda2930e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84vv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hdvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 18:11:05 crc kubenswrapper[4750]: W1008 18:11:05.070605 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9745a747_29eb_473f_bdb1_b526e1fe1445.slice/crio-f4e93f21e3afa645078ca4acb4c9a7d6c52dcd9f56294ff1a71140fe995343f0 WatchSource:0}: Error finding container f4e93f21e3afa645078ca4acb4c9a7d6c52dcd9f56294ff1a71140fe995343f0: Status 404 returned error can't find the container with id f4e93f21e3afa645078ca4acb4c9a7d6c52dcd9f56294ff1a71140fe995343f0 Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.072149 4750 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.085625 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.095331 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.105737 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f40f5c0-574a-4dd1-a4dc-024eba0628e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f90b8eb6984a4ced369d75be15a4980f41e7ca1bd3c79a64a79c974da9dc2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eecb6b3a56b80f2145ca258b9fc1473c42b44a4ded70f9ddc793868c3656191\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8961dba04ac6b2092efe4568b7f5d5afb5df6833c208d9484410ed6f8dcb13a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b4f554ff77b65e8ff5a3cb8221e1c238434da5d49d308ec6684fcb2d333384\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b4f554ff77b65e8ff5a3cb8221e1c238434da5d49d308ec6684fcb2d333384\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 18:10:58.297055 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 18:10:58.298595 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1100232334/tls.crt::/tmp/serving-cert-1100232334/tls.key\\\\\\\"\\\\nI1008 18:11:03.775470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 18:11:03.786183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 18:11:03.786202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 18:11:03.786224 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 18:11:03.786230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 18:11:03.794993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 18:11:03.795026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 18:11:03.795033 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 18:11:03.795023 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is 
complete\\\\nW1008 18:11:03.795038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 18:11:03.795061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 18:11:03.795066 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 18:11:03.795070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 18:11:03.797195 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ece7dde88eaa94bbf9401909374a83d5841b24deb3f494aa3c709d055f6997e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\
\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.119284 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b53212dc-64c8-4b45-b425-b0bb2df4be9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535706571aabece4fcb32be30ca7e8e4732194bfe3511685909161fa0c06b1c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55916d7bc976fac242d0c48e65a606ba0f642461b47262b4b0d46076163810d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a005895ee7ff0fde6900a7b6f6b8928aad001f1aaa7ce650d77c11ce7acc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba40cf6292b171161b700726beb7b9546e2a239de4076618f1eacf17b75631fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.129675 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.138110 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-grddb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.329039 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:11:05 crc kubenswrapper[4750]: E1008 18:11:05.329086 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:11:06.329066368 +0000 UTC m=+22.242037391 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.329738 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.329808 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.329836 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:11:05 crc kubenswrapper[4750]: E1008 18:11:05.329908 4750 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 18:11:05 crc kubenswrapper[4750]: E1008 18:11:05.329960 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 18:11:06.329948761 +0000 UTC m=+22.242919854 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 18:11:05 crc kubenswrapper[4750]: E1008 18:11:05.329993 4750 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 18:11:05 crc kubenswrapper[4750]: E1008 18:11:05.330059 4750 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 18:11:05 crc kubenswrapper[4750]: E1008 18:11:05.330074 4750 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 18:11:05 crc kubenswrapper[4750]: E1008 18:11:05.330084 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 18:11:06.330052563 +0000 UTC m=+22.243023566 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 18:11:05 crc kubenswrapper[4750]: E1008 18:11:05.330104 4750 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 18:11:05 crc kubenswrapper[4750]: E1008 18:11:05.330154 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-08 18:11:06.330141375 +0000 UTC m=+22.243112378 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.430340 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:11:05 crc kubenswrapper[4750]: E1008 18:11:05.430508 4750 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 18:11:05 crc kubenswrapper[4750]: E1008 18:11:05.430527 4750 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 18:11:05 crc kubenswrapper[4750]: E1008 18:11:05.430539 4750 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 18:11:05 crc kubenswrapper[4750]: E1008 18:11:05.430616 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-10-08 18:11:06.430600389 +0000 UTC m=+22.343571412 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.733144 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:11:05 crc kubenswrapper[4750]: E1008 18:11:05.733263 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.838482 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hdvcg" event={"ID":"31a78d19-68f8-46b7-9c53-2bbda2930e49","Type":"ContainerStarted","Data":"810ea3f875bc348e876d76c92a9ecb7e7541afb42bb26556051d021085c107c3"} Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.838537 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hdvcg" event={"ID":"31a78d19-68f8-46b7-9c53-2bbda2930e49","Type":"ContainerStarted","Data":"3a73e4a1ed25474046065931e7d21d0643c484022098515c393f419d1c3a0d90"} Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.840427 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.842071 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9d9df9b0a29bcccaeb1dcbdff5c85068802f2e46f39f39a63d09fe1386996c29"} Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.842299 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.843267 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mzb5c" event={"ID":"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444","Type":"ContainerStarted","Data":"bc5b55dd8e3131ba21f9cb51433c01355475968171e092627964cc5a56fabb59"} Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.843311 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mzb5c" 
event={"ID":"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444","Type":"ContainerStarted","Data":"88c7762d6df1da2bee3357ea765a464a53d2ca3464997b6f0302b3f18832ff22"} Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.844235 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"eb62229166dcf8ac1745f352cde0d4084a456518dd6ae1960e0ca38470907dcf"} Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.845786 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerStarted","Data":"f7f652a41fbc65313e94232c1b72bd7ccc0f53d7c510287ac9619ab713f770fc"} Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.845816 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerStarted","Data":"1b75c94360320f8907419e9d7b9a237989702c6c66dd2e87832e18c2c4570c3f"} Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.845831 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerStarted","Data":"c23737879eeb1e6354b4a5660b1c17f57d892ce84055621276fefdca4f749479"} Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.847214 4750 generic.go:334] "Generic (PLEG): container finished" podID="9745a747-29eb-473f-bdb1-b526e1fe1445" containerID="5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca" exitCode=0 Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.847273 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2x8kt" 
event={"ID":"9745a747-29eb-473f-bdb1-b526e1fe1445","Type":"ContainerDied","Data":"5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca"} Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.847296 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2x8kt" event={"ID":"9745a747-29eb-473f-bdb1-b526e1fe1445","Type":"ContainerStarted","Data":"f4e93f21e3afa645078ca4acb4c9a7d6c52dcd9f56294ff1a71140fe995343f0"} Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.849063 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"39b4e601279d1f37ba24fbb347b7f36255512601fd47c31a3bc686f2385cd88f"} Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.849098 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"443e4db49275949020687bd8768b8a7745b5fdea4f5fb230d872b2a8ed9d6ac0"} Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.849108 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c273c9ec10e1f9056b5f0b29e2fe5018dee79e78c56b2a51ed1d04f7f8a9a15d"} Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.850427 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"0dbbc4357a7bebc6b495e34c1eac5bd8fb1a52a16783ce31c8f3e78614a2dc6f"} Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.850510 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"eba0f5c9ba81727f20161857aaf0837be17cf3a170417c81f6f8cb6b8c561c44"} Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.851868 4750 generic.go:334] "Generic (PLEG): container finished" podID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerID="56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d" exitCode=0 Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.851904 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" event={"ID":"25d63a44-9fd7-4c19-8715-6ddec94d1806","Type":"ContainerDied","Data":"56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d"} Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.851940 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" event={"ID":"25d63a44-9fd7-4c19-8715-6ddec94d1806","Type":"ContainerStarted","Data":"f3fb66acce2cf8a97dd45af21d66399a5265db61cb8523c7af31e17c2fe8a342"} Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.863272 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzb5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzb5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:05Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.877318 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:05Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.889416 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2x8kt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9745a747-29eb-473f-bdb1-b526e1fe1445\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2x8kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:05Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.908097 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d63a44-9fd7-4c19-8715-6ddec94d1806\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rl7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:05Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.921006 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:05Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.934169 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:05Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.945993 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:05Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.958063 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hdvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31a78d19-68f8-46b7-9c53-2bbda2930e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810ea3f875bc348e876d76c92a9ecb7e7541afb42bb26556051d021085c107c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84vv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hdvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:05Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.973821 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:05Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.985426 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:05Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:05 crc kubenswrapper[4750]: I1008 18:11:05.998360 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-grddb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:05Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.014017 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f40f5c0-574a-4dd1-a4dc-024eba0628e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f90b8eb6984a4ced369d75be15a4980f41e7ca1bd3c79a64a79c974da9dc2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eecb6b3a56b80f2145ca258b9fc1473c42b44a4ded70f9ddc793868c3656191\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8961dba04ac6b2092efe4568b7f5d5afb5df6833c208d9484410ed6f8dcb13a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b4f554ff77b65e8ff5a3cb8221e1c238434da5d49d308ec6684fcb2d333384\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b4f554ff77b65e8ff5a3cb8221e1c238434da5d49d308ec6684fcb2d333384\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 18:10:58.297055 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 18:10:58.298595 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1100232334/tls.crt::/tmp/serving-cert-1100232334/tls.key\\\\\\\"\\\\nI1008 18:11:03.775470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 18:11:03.786183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 18:11:03.786202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 18:11:03.786224 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 18:11:03.786230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 18:11:03.794993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 18:11:03.795026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 18:11:03.795033 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 18:11:03.795023 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 18:11:03.795038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 18:11:03.795061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 18:11:03.795066 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 18:11:03.795070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 18:11:03.797195 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ece7dde88eaa94bbf9401909374a83d5841b24deb3f494aa3c709d055f6997e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:06Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.028732 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b53212dc-64c8-4b45-b425-b0bb2df4be9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535706571aabece4fcb32be30ca7e8e4732194bfe3511685909161fa0c06b1c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55916d7bc976fac242d0c48e65a606ba0f642461b47262b4b0d46076163810d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a005895ee7ff0fde6900a7b6f6b8928aad001f1aaa7ce650d77c11ce7acc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir
\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba40cf6292b171161b700726beb7b9546e2a239de4076618f1eacf17b75631fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:06Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.042091 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzb5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5b55dd8e3131ba21f9cb51433c01355475968171e092627964cc5a56fabb59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzb5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:06Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.053727 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:06Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.069657 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2x8kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9745a747-29eb-473f-bdb1-b526e1fe1445\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2x8kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:06Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.091997 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d63a44-9fd7-4c19-8715-6ddec94d1806\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rl7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:06Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.116260 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dbbc4357a7bebc6b495e34c1eac5bd8fb1a52a16783ce31c8f3e78614a2dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:06Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.129597 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:06Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.142116 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:06Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.144963 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-6jln4"] Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.145496 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-6jln4" Oct 08 18:11:06 crc kubenswrapper[4750]: W1008 18:11:06.146899 4750 reflector.go:561] object-"openshift-image-registry"/"node-ca-dockercfg-4777p": failed to list *v1.Secret: secrets "node-ca-dockercfg-4777p" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-image-registry": no relationship found between node 'crc' and this object Oct 08 18:11:06 crc kubenswrapper[4750]: E1008 18:11:06.146947 4750 reflector.go:158] "Unhandled Error" err="object-\"openshift-image-registry\"/\"node-ca-dockercfg-4777p\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"node-ca-dockercfg-4777p\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-image-registry\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.147968 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.148346 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.149287 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.151464 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hdvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31a78d19-68f8-46b7-9c53-2bbda2930e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810ea3f875bc348e876d76c92a9ecb7e7541afb42bb26556051d021085c107c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84vv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hdvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:06Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.186781 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39b4e601279d1f37ba24fbb347b7f36255512601fd47c31a3bc686f2385cd88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://443e4db49275949020687bd8768b8a7745b5fdea4f5fb230d872b2a8ed9d6ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:06Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.208432 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:06Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.238079 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/58294711-df24-43b8-b1b6-6617d55d49b3-host\") pod \"node-ca-6jln4\" (UID: \"58294711-df24-43b8-b1b6-6617d55d49b3\") " pod="openshift-image-registry/node-ca-6jln4" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.238330 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78c7k\" (UniqueName: \"kubernetes.io/projected/58294711-df24-43b8-b1b6-6617d55d49b3-kube-api-access-78c7k\") pod \"node-ca-6jln4\" (UID: \"58294711-df24-43b8-b1b6-6617d55d49b3\") " pod="openshift-image-registry/node-ca-6jln4" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.238473 4750 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/58294711-df24-43b8-b1b6-6617d55d49b3-serviceca\") pod \"node-ca-6jln4\" (UID: \"58294711-df24-43b8-b1b6-6617d55d49b3\") " pod="openshift-image-registry/node-ca-6jln4" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.240922 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f652a41fbc65313e94232c1b72bd7ccc0f53d7c510287ac9619ab713f770fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\
"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b75c94360320f8907419e9d7b9a237989702c6c66dd2e87832e18c2c4570c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-grddb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:06Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:06 crc 
kubenswrapper[4750]: I1008 18:11:06.261283 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f40f5c0-574a-4dd1-a4dc-024eba0628e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f90b8eb6984a4ced369d75be15a4980f41e7ca1bd3c79a64a79c974da9dc2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eecb6b3a56b80f2145ca258b9fc1473c42b44a4ded70f9ddc793868c3656191\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8961dba04ac6b2092efe4568b7f5d5afb5df6833c208d9484410ed6f8dcb13a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9df9b0a29bcccaeb1dcbdff5c85068802f2e46f39f39a63d09fe1386996c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b4f554ff77b65e8ff5a3cb8221e1c238434da5d49d308ec6684fcb2d333384\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 18:10:58.297055 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 18:10:58.298595 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1100232334/tls.crt::/tmp/serving-cert-1100232334/tls.key\\\\\\\"\\\\nI1008 18:11:03.775470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 18:11:03.786183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 18:11:03.786202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 18:11:03.786224 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 18:11:03.786230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 18:11:03.794993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 18:11:03.795026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 18:11:03.795033 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 18:11:03.795023 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 18:11:03.795038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 18:11:03.795061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 18:11:03.795066 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 18:11:03.795070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 18:11:03.797195 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ece7dde88eaa94bbf9401909374a83d5841b24deb3f494aa3c709d055f6997e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:06Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.283267 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b53212dc-64c8-4b45-b425-b0bb2df4be9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535706571aabece4fcb32be30ca7e8e4732194bfe3511685909161fa0c06b1c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55916d7bc976fac242d0c48e65a606ba0f642461b47262b4b0d46076163810d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a005895ee7ff0fde6900a7b6f6b8928aad001f1aaa7ce650d77c11ce7acc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba40cf6292b171161b700726beb7b9546e2a239de4076618f1eacf17b75631fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:06Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.295515 4750 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:06Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.308498 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:06Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.320472 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hdvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31a78d19-68f8-46b7-9c53-2bbda2930e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810ea3f875bc348e876d76c92a9ecb7e7541afb42bb26556051d021085c107c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84vv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hdvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:06Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.334320 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39b4e601279d1f37ba24fbb347b7f36255512601fd47c31a3bc686f2385cd88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://443e4db49275949020687bd8768b8a7745b5fdea4f5fb230d872b2a8ed9d6ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:06Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.338931 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.339051 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.339076 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/58294711-df24-43b8-b1b6-6617d55d49b3-serviceca\") pod \"node-ca-6jln4\" (UID: \"58294711-df24-43b8-b1b6-6617d55d49b3\") " pod="openshift-image-registry/node-ca-6jln4" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.339094 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/58294711-df24-43b8-b1b6-6617d55d49b3-host\") pod \"node-ca-6jln4\" (UID: \"58294711-df24-43b8-b1b6-6617d55d49b3\") " pod="openshift-image-registry/node-ca-6jln4" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.339112 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78c7k\" (UniqueName: \"kubernetes.io/projected/58294711-df24-43b8-b1b6-6617d55d49b3-kube-api-access-78c7k\") pod \"node-ca-6jln4\" (UID: \"58294711-df24-43b8-b1b6-6617d55d49b3\") " pod="openshift-image-registry/node-ca-6jln4" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.339137 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.339178 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:11:06 crc kubenswrapper[4750]: E1008 18:11:06.339220 4750 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 18:11:06 crc kubenswrapper[4750]: E1008 18:11:06.339249 4750 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 18:11:06 crc kubenswrapper[4750]: E1008 18:11:06.339252 4750 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 18:11:06 crc kubenswrapper[4750]: E1008 18:11:06.339263 4750 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 18:11:06 crc kubenswrapper[4750]: E1008 18:11:06.339332 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf 
podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 18:11:08.339289124 +0000 UTC m=+24.252260137 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 18:11:06 crc kubenswrapper[4750]: E1008 18:11:06.339346 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-08 18:11:08.339339895 +0000 UTC m=+24.252310908 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 18:11:06 crc kubenswrapper[4750]: E1008 18:11:06.339388 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:11:08.339369536 +0000 UTC m=+24.252340619 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.339587 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/58294711-df24-43b8-b1b6-6617d55d49b3-host\") pod \"node-ca-6jln4\" (UID: \"58294711-df24-43b8-b1b6-6617d55d49b3\") " pod="openshift-image-registry/node-ca-6jln4" Oct 08 18:11:06 crc kubenswrapper[4750]: E1008 18:11:06.339646 4750 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 18:11:06 crc kubenswrapper[4750]: E1008 18:11:06.339673 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 18:11:08.339665224 +0000 UTC m=+24.252636237 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.340082 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/58294711-df24-43b8-b1b6-6617d55d49b3-serviceca\") pod \"node-ca-6jln4\" (UID: \"58294711-df24-43b8-b1b6-6617d55d49b3\") " pod="openshift-image-registry/node-ca-6jln4" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.347698 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6jln4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58294711-df24-43b8-b1b6-6617d55d49b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78c7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6jln4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:06Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.359909 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78c7k\" (UniqueName: \"kubernetes.io/projected/58294711-df24-43b8-b1b6-6617d55d49b3-kube-api-access-78c7k\") pod \"node-ca-6jln4\" (UID: \"58294711-df24-43b8-b1b6-6617d55d49b3\") " pod="openshift-image-registry/node-ca-6jln4" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.364048 4750 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dbbc4357a7bebc6b495e34c1eac5bd8fb1a52a16783ce31c8f3e78614a2dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:06Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.376207 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f652a41fbc65313e94232c1b72bd7ccc0f53d7c510287ac9619ab713f770fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\
\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b75c94360320f8907419e9d7b9a237989702c6c66dd2e87832e18c2c4570c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-grddb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:06Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.390160 4750 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f40f5c0-574a-4dd1-a4dc-024eba0628e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f90b8eb6984a4ced369d75be15a4980f41e7ca1bd3c79a64a79c974da9dc2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\"
,\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eecb6b3a56b80f2145ca258b9fc1473c42b44a4ded70f9ddc793868c3656191\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8961dba04ac6b2092efe4568b7f5d5afb5df6833c208d9484410ed6f8dcb13a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9df9b0a29bcccaeb1dcbdff5c85068802f2e46f39f39a63d09fe1386996c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b4f554ff77b65e8ff5a3cb8221e1c238434da5d49d308ec6684fcb2d333384\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 18:10:58.297055 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 18:10:58.298595 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1100232334/tls.crt::/tmp/serving-cert-1100232334/tls.key\\\\\\\"\\\\nI1008 18:11:03.775470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 18:11:03.786183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 18:11:03.786202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 18:11:03.786224 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 18:11:03.786230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 18:11:03.794993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 18:11:03.795026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 18:11:03.795033 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 18:11:03.795023 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 18:11:03.795038 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 18:11:03.795061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 18:11:03.795066 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 18:11:03.795070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 18:11:03.797195 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ece7dde88eaa94bbf9401909374a83d5841b24deb3f494aa3c709d055f6997e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"i
mageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:06Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.405362 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b53212dc-64c8-4b45-b425-b0bb2df4be9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535706571aabece4fcb32be30ca7e8e4732194bfe3511685909161fa0c06b1c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55916d7bc976fac242d0c48e65a606ba0f642461b47262b4b0d46076163810d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a005895ee7ff0fde6900a7b6f6b8928aad001f1aaa7ce650d77c11ce7acc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba40cf6292b171161b700726beb7b9546e2a239de4076618f1eacf17b75631fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:06Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.422196 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:06Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.440271 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:11:06 crc kubenswrapper[4750]: E1008 18:11:06.440425 4750 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 18:11:06 crc kubenswrapper[4750]: E1008 18:11:06.440441 4750 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 18:11:06 crc kubenswrapper[4750]: E1008 18:11:06.440451 4750 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 18:11:06 crc kubenswrapper[4750]: E1008 18:11:06.440498 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-08 18:11:08.440485797 +0000 UTC m=+24.353456810 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.460650 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:06Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.504866 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzb5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5b55dd8e3131ba21f9cb51433c01355475968171e092627964cc5a56fabb59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzb5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:06Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.545712 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2x8kt" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9745a747-29eb-473f-bdb1-b526e1fe1445\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2x8kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:06Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.588464 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d63a44-9fd7-4c19-8715-6ddec94d1806\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rl7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:06Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.733975 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.733994 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:11:06 crc kubenswrapper[4750]: E1008 18:11:06.734096 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:11:06 crc kubenswrapper[4750]: E1008 18:11:06.734149 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.738040 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.738815 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.739523 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.740213 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.741034 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.741563 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.742157 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.742757 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.743376 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.745692 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.746506 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.747621 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.748277 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.749250 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.749785 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.750663 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.751305 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.751782 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.752906 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.753592 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.754043 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.754972 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.755387 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.756595 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.757075 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.758047 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.758681 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.759194 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.760707 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.761334 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.762380 4750 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.762494 4750 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.765224 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.766365 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.766907 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.768411 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.769067 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.770000 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.770635 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.771649 4750 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.772134 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.773086 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.773739 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.775140 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.775751 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.776826 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.777457 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.778971 4750 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.779647 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.780752 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.781395 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.782670 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.783320 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.783941 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.857627 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" 
event={"ID":"25d63a44-9fd7-4c19-8715-6ddec94d1806","Type":"ContainerStarted","Data":"a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6"} Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.857674 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" event={"ID":"25d63a44-9fd7-4c19-8715-6ddec94d1806","Type":"ContainerStarted","Data":"7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76"} Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.857688 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" event={"ID":"25d63a44-9fd7-4c19-8715-6ddec94d1806","Type":"ContainerStarted","Data":"fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384"} Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.857699 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" event={"ID":"25d63a44-9fd7-4c19-8715-6ddec94d1806","Type":"ContainerStarted","Data":"ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372"} Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.859145 4750 generic.go:334] "Generic (PLEG): container finished" podID="9745a747-29eb-473f-bdb1-b526e1fe1445" containerID="a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a" exitCode=0 Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.859216 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2x8kt" event={"ID":"9745a747-29eb-473f-bdb1-b526e1fe1445","Type":"ContainerDied","Data":"a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a"} Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.870181 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hdvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31a78d19-68f8-46b7-9c53-2bbda2930e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810ea3f875bc348e876d76c92a9ecb7e7541afb42bb26556051d021085c107c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84vv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hdvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:06Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.883809 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39b4e601279d1f37ba24fbb347b7f36255512601fd47c31a3bc686f2385cd88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://443e4db49275949020687bd8768b8a7745b5fdea4f5fb230d872b2a8ed9d6ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:06Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.894480 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6jln4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58294711-df24-43b8-b1b6-6617d55d49b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78c7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6jln4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:06Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.907634 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dbbc4357a7bebc6b495e34c1eac5bd8fb1a52a16783ce31c8f3e78614a2dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:06Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.921926 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:06Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.934878 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:06Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.950089 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f40f5c0-574a-4dd1-a4dc-024eba0628e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f90b8eb6984a4ced369d75be15a4980f41e7ca1bd3c79a64a79c974da9dc2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eecb6b3a56b80f2145ca258b9fc1473c42b44a4ded70f9ddc793868c3656191\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8961dba04ac6b2092efe4568b7f5d5afb5df6833c208d9484410ed6f8dcb13a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9df9b0a29bcccaeb1dcbdff5c85068802f2e46f39f39a63d09fe1386996c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b4f554ff77b65e8ff5a3cb8221e1c238434da5d49d308ec6684fcb2d333384\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 18:10:58.297055 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 18:10:58.298595 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1100232334/tls.crt::/tmp/serving-cert-1100232334/tls.key\\\\\\\"\\\\nI1008 18:11:03.775470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 18:11:03.786183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 18:11:03.786202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 18:11:03.786224 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 18:11:03.786230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 18:11:03.794993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 18:11:03.795026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 18:11:03.795033 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 18:11:03.795023 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 18:11:03.795038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 
18:11:03.795061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 18:11:03.795066 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 18:11:03.795070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 18:11:03.797195 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ece7dde88eaa94bbf9401909374a83d5841b24deb3f494aa3c709d055f6997e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:06Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.963207 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b53212dc-64c8-4b45-b425-b0bb2df4be9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535706571aabece4fcb32be30ca7e8e4732194bfe3511685909161fa0c06b1c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55916d7bc976fac242d0c48e65a606ba0f642461b47262b4b0d46076163810d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a005895ee7ff0fde6900a7b6f6b8928aad001f1aaa7ce650d77c11ce7acc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba40cf6292b171161b700726beb7b9546e2a239de4076618f1eacf17b75631fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:06Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.974222 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:06Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:06 crc kubenswrapper[4750]: I1008 18:11:06.984648 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f652a41fbc65313e94232c1b72bd7ccc0f53d7c510287ac9619ab713f770fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b75c94360320f8907419e9d7b9a237989702c6c
66dd2e87832e18c2c4570c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-grddb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:06Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:07 crc kubenswrapper[4750]: I1008 18:11:07.023298 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:07Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:07 crc kubenswrapper[4750]: I1008 18:11:07.060996 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzb5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5b55dd8e3131ba21f9cb51433c01355475968171e092627964cc5a56fabb59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzb5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:07Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:07 crc kubenswrapper[4750]: I1008 18:11:07.101734 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2x8kt" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9745a747-29eb-473f-bdb1-b526e1fe1445\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2x8kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:07Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:07 crc kubenswrapper[4750]: I1008 18:11:07.146732 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d63a44-9fd7-4c19-8715-6ddec94d1806\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rl7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:07Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:07 crc kubenswrapper[4750]: I1008 18:11:07.399588 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 08 18:11:07 crc kubenswrapper[4750]: I1008 18:11:07.407569 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-6jln4" Oct 08 18:11:07 crc kubenswrapper[4750]: W1008 18:11:07.418304 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58294711_df24_43b8_b1b6_6617d55d49b3.slice/crio-8cc193260bd71c2f5d99f6f5a1e19ac5d65de68c295610deb9aee20d1a8dc5cc WatchSource:0}: Error finding container 8cc193260bd71c2f5d99f6f5a1e19ac5d65de68c295610deb9aee20d1a8dc5cc: Status 404 returned error can't find the container with id 8cc193260bd71c2f5d99f6f5a1e19ac5d65de68c295610deb9aee20d1a8dc5cc Oct 08 18:11:07 crc kubenswrapper[4750]: I1008 18:11:07.733797 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:11:07 crc kubenswrapper[4750]: E1008 18:11:07.733911 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:11:07 crc kubenswrapper[4750]: I1008 18:11:07.849093 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 08 18:11:07 crc kubenswrapper[4750]: I1008 18:11:07.863248 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 08 18:11:07 crc kubenswrapper[4750]: I1008 18:11:07.863778 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 08 18:11:07 crc kubenswrapper[4750]: I1008 18:11:07.865000 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:07Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:07 crc kubenswrapper[4750]: I1008 18:11:07.873882 4750 generic.go:334] "Generic (PLEG): container finished" podID="9745a747-29eb-473f-bdb1-b526e1fe1445" containerID="ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32" exitCode=0 Oct 08 18:11:07 crc kubenswrapper[4750]: I1008 18:11:07.873964 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2x8kt" 
event={"ID":"9745a747-29eb-473f-bdb1-b526e1fe1445","Type":"ContainerDied","Data":"ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32"} Oct 08 18:11:07 crc kubenswrapper[4750]: I1008 18:11:07.881118 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" event={"ID":"25d63a44-9fd7-4c19-8715-6ddec94d1806","Type":"ContainerStarted","Data":"48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a"} Oct 08 18:11:07 crc kubenswrapper[4750]: I1008 18:11:07.881204 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" event={"ID":"25d63a44-9fd7-4c19-8715-6ddec94d1806","Type":"ContainerStarted","Data":"234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9"} Oct 08 18:11:07 crc kubenswrapper[4750]: I1008 18:11:07.881876 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:07Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:07 crc kubenswrapper[4750]: I1008 18:11:07.883404 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"25fd0b357c32562409dc3531c04e684de9fdd4bf05080368ff3537aef9324094"} Oct 08 18:11:07 crc kubenswrapper[4750]: I1008 18:11:07.886989 4750 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-image-registry/node-ca-6jln4" event={"ID":"58294711-df24-43b8-b1b6-6617d55d49b3","Type":"ContainerStarted","Data":"a9a2cf606874110c681c7bbec359607eda7cabce0a44504ff21374021fa21a62"} Oct 08 18:11:07 crc kubenswrapper[4750]: I1008 18:11:07.887694 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-6jln4" event={"ID":"58294711-df24-43b8-b1b6-6617d55d49b3","Type":"ContainerStarted","Data":"8cc193260bd71c2f5d99f6f5a1e19ac5d65de68c295610deb9aee20d1a8dc5cc"} Oct 08 18:11:07 crc kubenswrapper[4750]: I1008 18:11:07.893629 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hdvcg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31a78d19-68f8-46b7-9c53-2bbda2930e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810ea3f875bc348e876d76c92a9ecb7e7541afb42bb26556051d021085c107c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84vv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hdvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:07Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:07 crc kubenswrapper[4750]: I1008 18:11:07.907424 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39b4e601279d1f37ba24fbb347b7f36255512601fd47c31a3bc686f2385cd88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://443e4db49275949020687bd8768b8a7745b5fdea4f5fb230d872b2a8ed9d6ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:07Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:07 crc kubenswrapper[4750]: I1008 18:11:07.918246 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6jln4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58294711-df24-43b8-b1b6-6617d55d49b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78c7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6jln4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:07Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:07 crc kubenswrapper[4750]: I1008 18:11:07.931254 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dbbc4357a7bebc6b495e34c1eac5bd8fb1a52a16783ce31c8f3e78614a2dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:07Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:07 crc kubenswrapper[4750]: I1008 18:11:07.942489 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f652a41fbc65313e94232c1b72bd7ccc0f53d7c510287ac9619ab713f770fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b75c94360320f8907419e9d7b9a237989702c6c66dd2e87832e18c2c4570c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-grddb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:07Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:07 crc kubenswrapper[4750]: I1008 18:11:07.956117 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f40f5c0-574a-4dd1-a4dc-024eba0628e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f90b8eb6984a4ced369d75be15a4980f41e7ca1bd3c79a64a79c974da9dc2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eecb6b3a56b80f2145ca258b9fc1473c42b44a4ded70f9ddc793868c3656191\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8961dba04ac6b2092efe4568b7f5d5afb5df6833c208d9484410ed6f8dcb13a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9df9b0a29bcccaeb1dcbdff5c85068802f2e46f39f39a63d09fe1386996c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b4f554ff77b65e8ff5a3cb8221e1c238434da5d49d308ec6684fcb2d333384\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 18:10:58.297055 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 18:10:58.298595 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1100232334/tls.crt::/tmp/serving-cert-1100232334/tls.key\\\\\\\"\\\\nI1008 18:11:03.775470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 18:11:03.786183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 18:11:03.786202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 18:11:03.786224 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 18:11:03.786230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 18:11:03.794993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 18:11:03.795026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 18:11:03.795033 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 18:11:03.795023 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 18:11:03.795038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 
18:11:03.795061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 18:11:03.795066 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 18:11:03.795070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 18:11:03.797195 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ece7dde88eaa94bbf9401909374a83d5841b24deb3f494aa3c709d055f6997e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:07Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:07 crc kubenswrapper[4750]: I1008 18:11:07.975968 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b53212dc-64c8-4b45-b425-b0bb2df4be9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535706571aabece4fcb32be30ca7e8e4732194bfe3511685909161fa0c06b1c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55916d7bc976fac242d0c48e65a606ba0f642461b47262b4b0d46076163810d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a005895ee7ff0fde6900a7b6f6b8928aad001f1aaa7ce650d77c11ce7acc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba40cf6292b171161b700726beb7b9546e2a239de4076618f1eacf17b75631fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:07Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:07 crc kubenswrapper[4750]: I1008 18:11:07.990009 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:07Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:08 crc kubenswrapper[4750]: I1008 18:11:08.002400 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:08Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:08 crc kubenswrapper[4750]: I1008 18:11:08.017335 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzb5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5b55dd8e3131ba21f9cb51433c01355475968171e092627964cc5a56fabb59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzb5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:08Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:08 crc kubenswrapper[4750]: I1008 18:11:08.032327 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2x8kt" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9745a747-29eb-473f-bdb1-b526e1fe1445\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2x8kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:08Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:08 crc kubenswrapper[4750]: I1008 18:11:08.051644 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d63a44-9fd7-4c19-8715-6ddec94d1806\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rl7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:08Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:08 crc kubenswrapper[4750]: I1008 18:11:08.068413 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f40f5c0-574a-4dd1-a4dc-024eba0628e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f90b8eb6984a4ced369d75be15a4980f41e7ca1bd3c79a64a79c974da9dc2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eecb6b3a56b80f2145ca258b9fc1473c42b44a4ded70f9ddc793868c3656191\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8961dba04ac6b2092efe4568b7f5d5afb5df6833c208d9484410ed6f8dcb13a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9df9b0a29bcccaeb1dcbdff5c85068802f2e46f39f39a63d09fe1386996c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b4f554ff77b65e8ff5a3cb8221e1c238434da5d49d308ec6684fcb2d333384\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 18:10:58.297055 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 18:10:58.298595 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1100232334/tls.crt::/tmp/serving-cert-1100232334/tls.key\\\\\\\"\\\\nI1008 18:11:03.775470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 18:11:03.786183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 18:11:03.786202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 18:11:03.786224 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 18:11:03.786230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 18:11:03.794993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 18:11:03.795026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 18:11:03.795033 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 18:11:03.795023 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 18:11:03.795038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 18:11:03.795061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 18:11:03.795066 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 18:11:03.795070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 18:11:03.797195 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ece7dde88eaa94bbf9401909374a83d5841b24deb3f494aa3c709d055f6997e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:08Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:08 crc kubenswrapper[4750]: I1008 18:11:08.084722 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b53212dc-64c8-4b45-b425-b0bb2df4be9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535706571aabece4fcb32be30ca7e8e4732194bfe3511685909161fa0c06b1c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55916d7bc976fac242d0c48e65a606ba0f642461b47262b4b0d46076163810d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a005895ee7ff0fde6900a7b6f6b8928aad001f1aaa7ce650d77c11ce7acc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba40cf6292b171161b700726beb7b9546e2a239de4076618f1eacf17b75631fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:08Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:08 crc kubenswrapper[4750]: I1008 18:11:08.101303 4750 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fd0b357c32562409dc3531c04e684de9fdd4bf05080368ff3537aef9324094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:08Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:08 crc kubenswrapper[4750]: I1008 18:11:08.113772 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f652a41fbc65313e94232c1b72bd7ccc0f53d7c510287ac9619ab713f770fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b75c94360320f8907419e9d7b9a237989702c6c66dd2e87832e18c2c4570c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-grddb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:08Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:08 crc kubenswrapper[4750]: I1008 18:11:08.126516 4750 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:08Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:08 crc kubenswrapper[4750]: I1008 18:11:08.139752 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzb5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5b55dd8e3131ba21f9cb51433c01355475968171e092627964cc5a56fabb59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzb5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:08Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:08 crc kubenswrapper[4750]: I1008 18:11:08.153603 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2x8kt" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9745a747-29eb-473f-bdb1-b526e1fe1445\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2x8kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:08Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:08 crc kubenswrapper[4750]: I1008 
18:11:08.179651 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d63a44-9fd7-4c19-8715-6ddec94d1806\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rl7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:08Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:08 crc kubenswrapper[4750]: I1008 18:11:08.194089 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39b4e601279d1f37ba24fbb347b7f36255512601fd47c31a3bc686f2385cd88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://443e4db49275949020687bd8768b8a7745b5fdea4f5fb230d872b2a8ed9d6ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:08Z is after 2025-08-24T17:21:41Z" Oct 08 
18:11:08 crc kubenswrapper[4750]: I1008 18:11:08.205324 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6jln4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58294711-df24-43b8-b1b6-6617d55d49b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9a2cf606874110c681c7bbec359607eda7cabce0a44504ff21374021fa21a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-78c7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6jln4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:08Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:08 crc kubenswrapper[4750]: I1008 18:11:08.226841 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3096c4e9-1e62-46ee-8e9d-9d8f5c3d8f97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53389f30a75dc05f6b778b862fec384a0aefe7bd4b7884aad3af144bc8e5b87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9b
e8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7428f31e2b12470cc8d9fb98bb66e19ce665d028dae0b97adf318006043e2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77be59fa3ca71194c19f0050856d30f3b6f7122bd68e07535d0a8ca6cc299832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a
93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89305f32867a4d321e16ce84e9d4e34ec3bd7594db3700a4252ad0f874dd6d48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5029feed20c32b59a88eba1f7597a6689d99d93df97f2c5c94f4153de00170f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b4
6f0306843c8033b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:08Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:08 crc kubenswrapper[4750]: I1008 18:11:08.241698 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dbbc4357a7bebc6b495e34c1eac5bd8fb1a52a16783ce31c8f3e78614a2dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:08Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:08 crc kubenswrapper[4750]: I1008 18:11:08.269816 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:08Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:08 crc kubenswrapper[4750]: I1008 18:11:08.301632 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:08Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:08 crc kubenswrapper[4750]: I1008 18:11:08.340426 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hdvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31a78d19-68f8-46b7-9c53-2bbda2930e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810ea3f875bc348e876d76c92a9ecb7e7541afb42bb26556051d021085c107c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84vv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hdvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:08Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:08 crc kubenswrapper[4750]: I1008 18:11:08.356074 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:11:08 crc kubenswrapper[4750]: I1008 18:11:08.356233 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:11:08 crc kubenswrapper[4750]: E1008 18:11:08.356270 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:11:12.35624069 +0000 UTC m=+28.269211703 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:11:08 crc kubenswrapper[4750]: I1008 18:11:08.356374 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:11:08 crc kubenswrapper[4750]: E1008 18:11:08.356388 4750 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 18:11:08 crc kubenswrapper[4750]: I1008 18:11:08.356432 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:11:08 crc kubenswrapper[4750]: E1008 18:11:08.356466 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 18:11:12.356445235 +0000 UTC m=+28.269416258 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 18:11:08 crc kubenswrapper[4750]: E1008 18:11:08.356544 4750 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 18:11:08 crc kubenswrapper[4750]: E1008 18:11:08.356626 4750 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 18:11:08 crc kubenswrapper[4750]: E1008 18:11:08.356646 4750 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 18:11:08 crc kubenswrapper[4750]: E1008 18:11:08.356661 4750 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 18:11:08 crc kubenswrapper[4750]: E1008 18:11:08.356665 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 18:11:12.35664459 +0000 UTC m=+28.269615643 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 18:11:08 crc kubenswrapper[4750]: E1008 18:11:08.356709 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-08 18:11:12.356701151 +0000 UTC m=+28.269672164 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 18:11:08 crc kubenswrapper[4750]: I1008 18:11:08.457353 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:11:08 crc kubenswrapper[4750]: E1008 18:11:08.457609 4750 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 18:11:08 crc kubenswrapper[4750]: E1008 18:11:08.457648 4750 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" 
not registered Oct 08 18:11:08 crc kubenswrapper[4750]: E1008 18:11:08.457661 4750 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 18:11:08 crc kubenswrapper[4750]: E1008 18:11:08.457734 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-08 18:11:12.457716949 +0000 UTC m=+28.370687962 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 18:11:08 crc kubenswrapper[4750]: I1008 18:11:08.733228 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:11:08 crc kubenswrapper[4750]: I1008 18:11:08.733275 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:11:08 crc kubenswrapper[4750]: E1008 18:11:08.733401 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:11:08 crc kubenswrapper[4750]: E1008 18:11:08.733545 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:11:08 crc kubenswrapper[4750]: I1008 18:11:08.895159 4750 generic.go:334] "Generic (PLEG): container finished" podID="9745a747-29eb-473f-bdb1-b526e1fe1445" containerID="341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911" exitCode=0 Oct 08 18:11:08 crc kubenswrapper[4750]: I1008 18:11:08.895461 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2x8kt" event={"ID":"9745a747-29eb-473f-bdb1-b526e1fe1445","Type":"ContainerDied","Data":"341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911"} Oct 08 18:11:08 crc kubenswrapper[4750]: I1008 18:11:08.925487 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d63a44-9fd7-4c19-8715-6ddec94d1806\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rl7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:08Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:08 crc kubenswrapper[4750]: I1008 18:11:08.953927 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2x8kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9745a747-29eb-473f-bdb1-b526e1fe1445\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b
03f143911e2f97f3e4954f48e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
0-08T18:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2x8kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:08Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:08 crc kubenswrapper[4750]: I1008 18:11:08.986203 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3096c4e9-1e62-46ee-8e9d-9d8f5c3d8f97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53389f30a75dc05f6b778b862fec384a0aefe7bd4b7884aad3af144bc8e5b87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7428f31e2b12470cc8d9fb98bb66e19ce665d028dae0b97adf318006043e2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77be59fa3ca71194c19f0050856d30f3b6f7122bd68e07535d0a8ca6cc299832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89305f32867a4d321e16ce84e9d4e34ec3bd7594db3700a4252ad0f874dd6d48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5029feed20c32b59a88eba1f7597a6689d99d93df97f2c5c94f4153de00170f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib
/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:08Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:09 crc kubenswrapper[4750]: I1008 18:11:09.003524 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dbbc4357a7bebc6b495e34c1eac5bd8fb1a52a16783ce31c8f3e78614a2dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:09Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:09 crc kubenswrapper[4750]: I1008 18:11:09.022137 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:09Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:09 crc kubenswrapper[4750]: I1008 18:11:09.036028 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:09Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:09 crc kubenswrapper[4750]: I1008 18:11:09.053314 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hdvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31a78d19-68f8-46b7-9c53-2bbda2930e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810ea3f875bc348e876d76c92a9ecb7e7541afb42bb26556051d021085c107c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84vv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hdvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:09Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:09 crc kubenswrapper[4750]: I1008 18:11:09.074756 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39b4e601279d1f37ba24fbb347b7f36255512601fd47c31a3bc686f2385cd88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://443e4db49275949020687bd8768b8a7745b5fdea4f5fb230d872b2a8ed9d6ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:09Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:09 crc kubenswrapper[4750]: I1008 18:11:09.090183 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6jln4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58294711-df24-43b8-b1b6-6617d55d49b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9a2cf606874110c681c7bbec359607eda7cabce0a44504ff21374021fa21a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78c7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6jln4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:09Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:09 crc kubenswrapper[4750]: I1008 18:11:09.104924 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b53212dc-64c8-4b45-b425-b0bb2df4be9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535706571aabece4fcb32be30ca7e8e4732194bfe3511685909161fa0c06b1c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55916d7bc976fac242d0c48e65a606ba0f642461b47262b4b0d46076163810d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a005895ee7ff0fde6900a7b6f6b8928aad001f1aaa7ce650d77c11ce7acc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba40cf6292b171161b700726beb7b9546e2a239de4076618f1eacf17b75631fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:09Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:09 crc kubenswrapper[4750]: I1008 18:11:09.120620 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fd0b357c32562409dc3531c04e684de9fdd4bf05080368ff3537aef9324094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T18:11:09Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:09 crc kubenswrapper[4750]: I1008 18:11:09.136806 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f652a41fbc65313e94232c1b72bd7ccc0f53d7c510287ac9619ab713f770fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b75c94360320f8907419e9d7b9a237989702c6c66dd2e87832e18c2c4570c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-grddb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:09Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:09 crc kubenswrapper[4750]: I1008 18:11:09.153965 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f40f5c0-574a-4dd1-a4dc-024eba0628e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f90b8eb6984a4ced369d75be15a4980f41e7ca1bd3c79a64a79c974da9dc2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eecb6b3a56b80f2145ca258b9fc1473c42b44a4ded70f9ddc793868c3656191\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8961dba04ac6b2092efe4568b7f5d5afb5df6833c208d9484410ed6f8dcb13a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9df9b0a29bcccaeb1dcbdff5c85068802f2e46f39f39a63d09fe1386996c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b4f554ff77b65e8ff5a3cb8221e1c238434da5d49d308ec6684fcb2d333384\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 18:10:58.297055 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 18:10:58.298595 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1100232334/tls.crt::/tmp/serving-cert-1100232334/tls.key\\\\\\\"\\\\nI1008 18:11:03.775470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 18:11:03.786183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 18:11:03.786202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 18:11:03.786224 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 18:11:03.786230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 18:11:03.794993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 18:11:03.795026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 18:11:03.795033 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 18:11:03.795023 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 18:11:03.795038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 
18:11:03.795061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 18:11:03.795066 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 18:11:03.795070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 18:11:03.797195 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ece7dde88eaa94bbf9401909374a83d5841b24deb3f494aa3c709d055f6997e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:09Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:09 crc kubenswrapper[4750]: I1008 18:11:09.169347 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:09Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:09 crc kubenswrapper[4750]: I1008 18:11:09.184827 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzb5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5b55dd8e3131ba21f9cb51433c01355475968171e092627964cc5a56fabb59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzb5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:09Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:09 crc kubenswrapper[4750]: I1008 18:11:09.733688 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:11:09 crc kubenswrapper[4750]: E1008 18:11:09.734144 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:11:09 crc kubenswrapper[4750]: I1008 18:11:09.903000 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" event={"ID":"25d63a44-9fd7-4c19-8715-6ddec94d1806","Type":"ContainerStarted","Data":"621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47"} Oct 08 18:11:09 crc kubenswrapper[4750]: I1008 18:11:09.907094 4750 generic.go:334] "Generic (PLEG): container finished" podID="9745a747-29eb-473f-bdb1-b526e1fe1445" containerID="53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94" exitCode=0 Oct 08 18:11:09 crc kubenswrapper[4750]: I1008 18:11:09.907147 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2x8kt" event={"ID":"9745a747-29eb-473f-bdb1-b526e1fe1445","Type":"ContainerDied","Data":"53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94"} Oct 08 18:11:09 crc kubenswrapper[4750]: I1008 18:11:09.919802 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:09Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:09 crc kubenswrapper[4750]: I1008 18:11:09.935277 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzb5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5b55dd8e3131ba21f9cb51433c01355475968171e092627964cc5a56fabb59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzb5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:09Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:09 crc kubenswrapper[4750]: I1008 18:11:09.950977 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2x8kt" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9745a747-29eb-473f-bdb1-b526e1fe1445\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2x8kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:09Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:09 crc kubenswrapper[4750]: I1008 18:11:09.973590 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d63a44-9fd7-4c19-8715-6ddec94d1806\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rl7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:09Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:09 crc kubenswrapper[4750]: I1008 18:11:09.989989 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:09Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.004062 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:10Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.014562 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hdvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31a78d19-68f8-46b7-9c53-2bbda2930e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810ea3f875bc348e876d76c92a9ecb7e7541afb42bb26556051d021085c107c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84vv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hdvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:10Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.027732 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39b4e601279d1f37ba24fbb347b7f36255512601fd47c31a3bc686f2385cd88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://443e4db49275949020687bd8768b8a7745b5fdea4f5fb230d872b2a8ed9d6ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:10Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.037095 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6jln4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58294711-df24-43b8-b1b6-6617d55d49b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9a2cf606874110c681c7bbec359607eda7cabce0a44504ff21374021fa21a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78c7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6jln4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:10Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.053864 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3096c4e9-1e62-46ee-8e9d-9d8f5c3d8f97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53389f30a75dc05f6b778b862fec384a0aefe7bd4b7884aad3af144bc8e5b87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7428f31e2b12470cc8d9fb98bb66e19ce665d028dae0b97adf318006043e2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77be59fa3ca71194c19f0050856d30f3b6f7122bd68e07535d0a8ca6cc299832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18
:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89305f32867a4d321e16ce84e9d4e34ec3bd7594db3700a4252ad0f874dd6d48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5029feed20c32b59a88eba1f7597a6689d99d93df97f2c5c94f4153de00170f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:10Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.070991 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dbbc4357a7bebc6b495e34c1eac5bd8fb1a52a16783ce31c8f3e78614a2dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:10Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.082849 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f652a41fbc65313e94232c1b72bd7ccc0f53d7c510287ac9619ab713f770fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b75c94360320f8907419e9d7b9a237989702c6c66dd2e87832e18c2c4570c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-grddb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:10Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.096951 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f40f5c0-574a-4dd1-a4dc-024eba0628e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f90b8eb6984a4ced369d75be15a4980f41e7ca1bd3c79a64a79c974da9dc2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eecb6b3a56b80f2145ca258b9fc1473c42b44a4ded70f9ddc793868c3656191\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8961dba04ac6b2092efe4568b7f5d5afb5df6833c208d9484410ed6f8dcb13a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9df9b0a29bcccaeb1dcbdff5c85068802f2e46f39f39a63d09fe1386996c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b4f554ff77b65e8ff5a3cb8221e1c238434da5d49d308ec6684fcb2d333384\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 18:10:58.297055 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 18:10:58.298595 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1100232334/tls.crt::/tmp/serving-cert-1100232334/tls.key\\\\\\\"\\\\nI1008 18:11:03.775470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 18:11:03.786183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 18:11:03.786202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 18:11:03.786224 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 18:11:03.786230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 18:11:03.794993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 18:11:03.795026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 18:11:03.795033 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 18:11:03.795023 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 18:11:03.795038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 
18:11:03.795061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 18:11:03.795066 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 18:11:03.795070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 18:11:03.797195 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ece7dde88eaa94bbf9401909374a83d5841b24deb3f494aa3c709d055f6997e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:10Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.111168 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b53212dc-64c8-4b45-b425-b0bb2df4be9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535706571aabece4fcb32be30ca7e8e4732194bfe3511685909161fa0c06b1c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55916d7bc976fac242d0c48e65a606ba0f642461b47262b4b0d46076163810d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a005895ee7ff0fde6900a7b6f6b8928aad001f1aaa7ce650d77c11ce7acc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba40cf6292b171161b700726beb7b9546e2a239de4076618f1eacf17b75631fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:10Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.123866 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fd0b357c32562409dc3531c04e684de9fdd4bf05080368ff3537aef9324094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T18:11:10Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.162893 4750 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.166292 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.166353 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.166362 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.166465 4750 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.173633 4750 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.174025 4750 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.179062 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.179096 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.179108 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.179130 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.179219 4750 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:10Z","lastTransitionTime":"2025-10-08T18:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:10 crc kubenswrapper[4750]: E1008 18:11:10.197654 4750 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bfc00a00-da5b-4621-a04e-e20b47fefa95\\\",\\\"systemUUID\\\":\\\"4099aa14-e8f7-4bff-9f81-40284b959bbe\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:10Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.202653 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.202699 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.202711 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.202724 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.202733 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:10Z","lastTransitionTime":"2025-10-08T18:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:10 crc kubenswrapper[4750]: E1008 18:11:10.221422 4750 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bfc00a00-da5b-4621-a04e-e20b47fefa95\\\",\\\"systemUUID\\\":\\\"4099aa14-e8f7-4bff-9f81-40284b959bbe\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:10Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.228393 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.228442 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.228455 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.228476 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.228490 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:10Z","lastTransitionTime":"2025-10-08T18:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:10 crc kubenswrapper[4750]: E1008 18:11:10.243192 4750 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bfc00a00-da5b-4621-a04e-e20b47fefa95\\\",\\\"systemUUID\\\":\\\"4099aa14-e8f7-4bff-9f81-40284b959bbe\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:10Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.248998 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.249035 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.249047 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.249065 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.249078 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:10Z","lastTransitionTime":"2025-10-08T18:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:10 crc kubenswrapper[4750]: E1008 18:11:10.268726 4750 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bfc00a00-da5b-4621-a04e-e20b47fefa95\\\",\\\"systemUUID\\\":\\\"4099aa14-e8f7-4bff-9f81-40284b959bbe\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:10Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.273529 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.273582 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.273592 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.273610 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.273621 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:10Z","lastTransitionTime":"2025-10-08T18:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:10 crc kubenswrapper[4750]: E1008 18:11:10.291166 4750 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bfc00a00-da5b-4621-a04e-e20b47fefa95\\\",\\\"systemUUID\\\":\\\"4099aa14-e8f7-4bff-9f81-40284b959bbe\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:10Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:10 crc kubenswrapper[4750]: E1008 18:11:10.291290 4750 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.293343 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.293380 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.293392 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.293409 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.293421 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:10Z","lastTransitionTime":"2025-10-08T18:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.395710 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.395759 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.395772 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.395789 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.395799 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:10Z","lastTransitionTime":"2025-10-08T18:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.497957 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.497995 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.498005 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.498018 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.498028 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:10Z","lastTransitionTime":"2025-10-08T18:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.600742 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.600823 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.600841 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.600876 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.600894 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:10Z","lastTransitionTime":"2025-10-08T18:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.703395 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.703490 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.703510 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.703540 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.703607 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:10Z","lastTransitionTime":"2025-10-08T18:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.733716 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.733838 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:11:10 crc kubenswrapper[4750]: E1008 18:11:10.733881 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:11:10 crc kubenswrapper[4750]: E1008 18:11:10.734070 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.806603 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.806676 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.806695 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.806723 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.806742 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:10Z","lastTransitionTime":"2025-10-08T18:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.909322 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.909373 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.909386 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.909409 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.909422 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:10Z","lastTransitionTime":"2025-10-08T18:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.917163 4750 generic.go:334] "Generic (PLEG): container finished" podID="9745a747-29eb-473f-bdb1-b526e1fe1445" containerID="2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd" exitCode=0 Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.917225 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2x8kt" event={"ID":"9745a747-29eb-473f-bdb1-b526e1fe1445","Type":"ContainerDied","Data":"2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd"} Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.936391 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f40f5c0-574a-4dd1-a4dc-024eba0628e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f90b8eb6984a4ced369d75be15a4980f41e7ca1bd3c79a64a79c974da9dc2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eecb6b3a56b80f2145ca258b9fc1473c42b44a4ded70f9ddc793868c3656191\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8961dba04ac6b2092efe4568b7f5d5afb5df6833c208d9484410ed6f8dcb13a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9df9b0a29bcccaeb1dcbdff5c85068802f2e46f39f39a63d09fe1386996c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b4f554ff77b65e8ff5a3cb8221e1c238434da5d49d308ec6684fcb2d333384\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 18:10:58.297055 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 18:10:58.298595 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1100232334/tls.crt::/tmp/serving-cert-1100232334/tls.key\\\\\\\"\\\\nI1008 18:11:03.775470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 18:11:03.786183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 18:11:03.786202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 18:11:03.786224 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 18:11:03.786230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 18:11:03.794993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 18:11:03.795026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 18:11:03.795033 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 18:11:03.795023 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 18:11:03.795038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 18:11:03.795061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 18:11:03.795066 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 18:11:03.795070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 18:11:03.797195 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ece7dde88eaa94bbf9401909374a83d5841b24deb3f494aa3c709d055f6997e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:10Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.954396 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b53212dc-64c8-4b45-b425-b0bb2df4be9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535706571aabece4fcb32be30ca7e8e4732194bfe3511685909161fa0c06b1c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55916d7bc976fac242d0c48e65a606ba0f642461b47262b4b0d46076163810d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a005895ee7ff0fde6900a7b6f6b8928aad001f1aaa7ce650d77c11ce7acc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba40cf6292b171161b700726beb7b9546e2a239de4076618f1eacf17b75631fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:10Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.971692 4750 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fd0b357c32562409dc3531c04e684de9fdd4bf05080368ff3537aef9324094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:10Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:10 crc kubenswrapper[4750]: I1008 18:11:10.985213 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f652a41fbc65313e94232c1b72bd7ccc0f53d7c510287ac9619ab713f770fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b75c94360320f8907419e9d7b9a237989702c6c66dd2e87832e18c2c4570c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-grddb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:10Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.000127 4750 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:10Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.016334 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.016405 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.016423 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.016448 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.016470 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:11Z","lastTransitionTime":"2025-10-08T18:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.016691 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzb5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5b55dd8e3131ba21f9cb51433c01355475968171e092627964cc5a56fabb59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzb5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:11Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.035400 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2x8kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9745a747-29eb-473f-bdb1-b526e1fe1445\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2x8kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:11Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.053803 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d63a44-9fd7-4c19-8715-6ddec94d1806\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rl7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:11Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.077105 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3096c4e9-1e62-46ee-8e9d-9d8f5c3d8f97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53389f30a75dc05f6b778b862fec384a0aefe7bd4b7884aad3af144bc8e5b87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7428f31e2b12470cc8d9fb98bb66e19ce665d028dae0b97adf318006043e2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77be59fa3ca71194c19f0050856d30f3b6f7122bd68e07535d0a8ca6cc299832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89305f32867a4d321e16ce84e9d4e34ec3bd7594db3700a4252ad0f874dd6d48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5029feed20c32b59a88eba1f7597a6689d99d93df97f2c5c94f4153de00170f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:11Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.090478 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dbbc4357a7bebc6b495e34c1eac5bd8fb1a52a16783ce31c8f3e78614a2dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:11Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.104534 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:11Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.119965 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.120012 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.120026 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.120044 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.120060 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:11Z","lastTransitionTime":"2025-10-08T18:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.127076 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:11Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.138864 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hdvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31a78d19-68f8-46b7-9c53-2bbda2930e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810ea3f875bc348e876d76c92a9ecb7e7541afb42bb26556051d021085c107c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84vv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hdvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:11Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.157518 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39b4e601279d1f37ba24fbb347b7f36255512601fd47c31a3bc686f2385cd88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://443e4db49275949020687bd8768b8a7745b5fdea4f5fb230d872b2a8ed9d6ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:11Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.168394 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6jln4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58294711-df24-43b8-b1b6-6617d55d49b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9a2cf606874110c681c7bbec359607eda7cabce0a44504ff21374021fa21a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78c7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6jln4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:11Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.222205 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.222249 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.222260 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.222275 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.222317 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:11Z","lastTransitionTime":"2025-10-08T18:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.326449 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.326503 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.326516 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.326536 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.326585 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:11Z","lastTransitionTime":"2025-10-08T18:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.428650 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.428685 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.428696 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.428727 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.428736 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:11Z","lastTransitionTime":"2025-10-08T18:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.532271 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.532323 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.532336 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.532353 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.532365 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:11Z","lastTransitionTime":"2025-10-08T18:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.665147 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.665193 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.665205 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.665222 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.665234 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:11Z","lastTransitionTime":"2025-10-08T18:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.733170 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:11:11 crc kubenswrapper[4750]: E1008 18:11:11.733289 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.767173 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.767240 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.767254 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.767278 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.767293 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:11Z","lastTransitionTime":"2025-10-08T18:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.870320 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.870406 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.870432 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.870466 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.870490 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:11Z","lastTransitionTime":"2025-10-08T18:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.933435 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2x8kt" event={"ID":"9745a747-29eb-473f-bdb1-b526e1fe1445","Type":"ContainerStarted","Data":"4026e6054000f0175162939f647586ff94806c727b2b80a9437767275bd819d6"} Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.938681 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" event={"ID":"25d63a44-9fd7-4c19-8715-6ddec94d1806","Type":"ContainerStarted","Data":"9d0b9e5ff6544c9be0712ec8bdc160d75d061f28ab5e80aeea4cc6f74bf77e4c"} Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.939207 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.950813 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hdvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31a78d19-68f8-46b7-9c53-2bbda2930e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810ea3f875bc348e876d76c92a9ecb7e7541afb42bb26556051d021085c107c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84vv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hdvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:11Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.966509 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39b4e601279d1f37ba24fbb347b7f36255512601fd47c31a3bc686f2385cd88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://443e4db49275949020687bd8768b8a7745b5fdea4f5fb230d872b2a8ed9d6ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:11Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.973279 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.973308 4750 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.973317 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.973331 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.973340 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:11Z","lastTransitionTime":"2025-10-08T18:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.975630 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:11 crc kubenswrapper[4750]: I1008 18:11:11.981458 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6jln4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58294711-df24-43b8-b1b6-6617d55d49b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9a2cf606874110c681c7bbec359607eda7cabce0a44504ff21374021fa21a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78c7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6jln4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:11Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.007627 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3096c4e9-1e62-46ee-8e9d-9d8f5c3d8f97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53389f30a75dc05f6b778b862fec384a0aefe7bd4b7884aad3af144bc8e5b87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7428f31e2b12470cc8d9fb98bb66e19ce665d028dae0b97adf318006043e2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77be59fa3ca71194c19f0050856d30f3b6f7122bd68e07535d0a8ca6cc299832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18
:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89305f32867a4d321e16ce84e9d4e34ec3bd7594db3700a4252ad0f874dd6d48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5029feed20c32b59a88eba1f7597a6689d99d93df97f2c5c94f4153de00170f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:12Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.020579 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dbbc4357a7bebc6b495e34c1eac5bd8fb1a52a16783ce31c8f3e78614a2dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:12Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.033032 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:12Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.045200 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:12Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.057733 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f40f5c0-574a-4dd1-a4dc-024eba0628e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f90b8eb6984a4ced369d75be15a4980f41e7ca1bd3c79a64a79c974da9dc2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eecb6b3a56b80f2145ca258b9fc1473c42b44a4ded70f9ddc793868c3656191\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8961dba04ac6b2092efe4568b7f5d5afb5df6833c208d9484410ed6f8dcb13a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9df9b0a29bcccaeb1dcbdff5c85068802f2e46f39f39a63d09fe1386996c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b4f554ff77b65e8ff5a3cb8221e1c238434da5d49d308ec6684fcb2d333384\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 18:10:58.297055 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 18:10:58.298595 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1100232334/tls.crt::/tmp/serving-cert-1100232334/tls.key\\\\\\\"\\\\nI1008 18:11:03.775470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 18:11:03.786183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 18:11:03.786202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 18:11:03.786224 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 18:11:03.786230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 18:11:03.794993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 18:11:03.795026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 18:11:03.795033 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 18:11:03.795023 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 18:11:03.795038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 
18:11:03.795061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 18:11:03.795066 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 18:11:03.795070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 18:11:03.797195 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ece7dde88eaa94bbf9401909374a83d5841b24deb3f494aa3c709d055f6997e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:12Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.070427 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b53212dc-64c8-4b45-b425-b0bb2df4be9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535706571aabece4fcb32be30ca7e8e4732194bfe3511685909161fa0c06b1c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55916d7bc976fac242d0c48e65a606ba0f642461b47262b4b0d46076163810d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a005895ee7ff0fde6900a7b6f6b8928aad001f1aaa7ce650d77c11ce7acc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba40cf6292b171161b700726beb7b9546e2a239de4076618f1eacf17b75631fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:12Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.075156 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.075209 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.075222 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.075240 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.075251 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:12Z","lastTransitionTime":"2025-10-08T18:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.082633 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fd0b357c32562409dc3531c04e684de9fdd4bf05080368ff3537aef9324094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:12Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.094768 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f652a41fbc65313e94232c1b72bd7ccc0f53d7c510287ac9619ab713f770fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b75c94360320f8907419e9d7b9a237989702c6c66dd2e87832e18c2c4570c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-grddb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-10-08T18:11:12Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.109862 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:12Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.124640 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzb5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5b55dd8e3131ba21f9cb51433c01355475968171e092627964cc5a56fabb59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzb5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:12Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.138673 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2x8kt" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9745a747-29eb-473f-bdb1-b526e1fe1445\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4026e6054000f0175162939f647586ff94806c727b2b80a9437767275bd819d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c
8eac89fab5a9eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:
11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:09Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2x8kt\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:12Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.166675 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d63a44-9fd7-4c19-8715-6ddec94d1806\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rl7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:12Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.178225 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.178266 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.178277 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.178294 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.178306 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:12Z","lastTransitionTime":"2025-10-08T18:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.178207 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzb5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5b55dd8e3131ba21f9cb51433c01355475968171e092627964cc5a56fabb59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzb5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:12Z 
is after 2025-08-24T17:21:41Z" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.192101 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:12Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.207586 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2x8kt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9745a747-29eb-473f-bdb1-b526e1fe1445\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4026e6054000f0175162939f647586ff94806c727b2b80a9437767275bd819d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34195
4d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2x8kt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:12Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.225956 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d63a44-9fd7-4c19-8715-6ddec94d1806\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d0b9e5ff6544c9be0712ec8bdc160d75d061f28ab5e80aeea4cc6f74bf77e4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rl7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:12Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.238523 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dbbc4357a7bebc6b495e34c1eac5bd8fb1a52a16783ce31c8f3e78614a2dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-0
8T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:12Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.252026 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:12Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.265775 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:12Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.278608 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hdvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31a78d19-68f8-46b7-9c53-2bbda2930e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810ea3f875bc348e876d76c92a9ecb7e7541afb42bb26556051d021085c107c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84vv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hdvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:12Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.280035 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.280076 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.280107 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.280125 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.280134 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:12Z","lastTransitionTime":"2025-10-08T18:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.294470 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39b4e601279d1f37ba24fbb347b7f36255512601fd47c31a3bc686f2385cd88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://443e4db49275949020687bd8768b8a7745b5fdea4f5fb230d872b2a8ed9d6ac0\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:12Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.308467 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6jln4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58294711-df24-43b8-b1b6-6617d55d49b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9a2cf606874110c681c7bbec359607eda7cabce0a44504ff21374021fa21a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78c7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6jln4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:12Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.333990 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3096c4e9-1e62-46ee-8e9d-9d8f5c3d8f97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53389f30a75dc05f6b778b862fec384a0aefe7bd4b7884aad3af144bc8e5b87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7428f31e2b12470cc8d9fb98bb66e19ce665d028dae0b97adf318006043e2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77be59fa3ca71194c19f0050856d30f3b6f7122bd68e07535d0a8ca6cc299832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18
:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89305f32867a4d321e16ce84e9d4e34ec3bd7594db3700a4252ad0f874dd6d48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5029feed20c32b59a88eba1f7597a6689d99d93df97f2c5c94f4153de00170f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:12Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.349782 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fd0b357c32562409dc3531c04e684de9fdd4bf05080368ff3537aef9324094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T18:11:12Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.366010 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f652a41fbc65313e94232c1b72bd7ccc0f53d7c510287ac9619ab713f770fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b75c94360320f8907419e9d7b9a237989702c6c66dd2e87832e18c2c4570c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-grddb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:12Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.383461 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 
18:11:12.383513 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.383528 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.383570 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.383584 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:12Z","lastTransitionTime":"2025-10-08T18:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.386584 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f40f5c0-574a-4dd1-a4dc-024eba0628e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f90b8eb6984a4ced369d75be15a4980f41e7ca1bd3c79a64a79c974da9dc2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eecb6b3a56b80f2145ca258b9fc1473c42b44a4ded70f9ddc793868c3656191\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8961dba04ac6b2092efe4568b7f5d5afb5df6833c208d9484410ed6f8dcb13a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9df9b0a29bcccaeb1dcbdff5c85068802f2e46f39f39a63d09fe1386996c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b4f554ff77b65e8ff5a3cb8221e1c238434da5d49d308ec6684fcb2d333384\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 18:10:58.297055 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 18:10:58.298595 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1100232334/tls.crt::/tmp/serving-cert-1100232334/tls.key\\\\\\\"\\\\nI1008 18:11:03.775470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 18:11:03.786183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 18:11:03.786202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 18:11:03.786224 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 18:11:03.786230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 18:11:03.794993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 18:11:03.795026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 18:11:03.795033 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 18:11:03.795023 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 18:11:03.795038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 18:11:03.795061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 18:11:03.795066 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 18:11:03.795070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 18:11:03.797195 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ece7dde88eaa94bbf9401909374a83d5841b24deb3f494aa3c709d055f6997e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:12Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.400832 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.400928 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.400991 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:11:12 
crc kubenswrapper[4750]: I1008 18:11:12.401023 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:11:12 crc kubenswrapper[4750]: E1008 18:11:12.401084 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:11:20.401029669 +0000 UTC m=+36.314000682 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:11:12 crc kubenswrapper[4750]: E1008 18:11:12.401114 4750 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 18:11:12 crc kubenswrapper[4750]: E1008 18:11:12.401166 4750 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 18:11:12 crc kubenswrapper[4750]: E1008 18:11:12.401189 4750 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 18:11:12 
crc kubenswrapper[4750]: E1008 18:11:12.401204 4750 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 18:11:12 crc kubenswrapper[4750]: E1008 18:11:12.401210 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 18:11:20.401189913 +0000 UTC m=+36.314160946 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 18:11:12 crc kubenswrapper[4750]: E1008 18:11:12.401123 4750 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 18:11:12 crc kubenswrapper[4750]: E1008 18:11:12.401264 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-08 18:11:20.401243374 +0000 UTC m=+36.314214437 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 18:11:12 crc kubenswrapper[4750]: E1008 18:11:12.401297 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 18:11:20.401284375 +0000 UTC m=+36.314255488 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.402135 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b53212dc-64c8-4b45-b425-b0bb2df4be9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535706571aabece4fcb32be30ca7e8e4732194bfe3511685909161fa0c06b1c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55916d7bc976fac242d0c48e65a606ba0f642461b47262b4b0d46076163810d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a005895ee7ff0fde6900a7b6f6b8928aad001f1aaa7ce650d77c11ce7acc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba40cf6292b171161b700726beb7b9546e2a239de4076618f1eacf17b75631fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:12Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.485959 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.486009 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.486021 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.486041 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.486052 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:12Z","lastTransitionTime":"2025-10-08T18:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.502158 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:11:12 crc kubenswrapper[4750]: E1008 18:11:12.502345 4750 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 18:11:12 crc kubenswrapper[4750]: E1008 18:11:12.502365 4750 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 18:11:12 crc kubenswrapper[4750]: E1008 18:11:12.502375 4750 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 18:11:12 crc kubenswrapper[4750]: E1008 18:11:12.502428 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-08 18:11:20.502413906 +0000 UTC m=+36.415384919 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.589037 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.589082 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.589093 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.589110 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.589121 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:12Z","lastTransitionTime":"2025-10-08T18:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.694148 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.694326 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.694354 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.697113 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.697136 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:12Z","lastTransitionTime":"2025-10-08T18:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.734086 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.734105 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:11:12 crc kubenswrapper[4750]: E1008 18:11:12.734338 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:11:12 crc kubenswrapper[4750]: E1008 18:11:12.735008 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.802327 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.802389 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.802407 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.802435 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.802455 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:12Z","lastTransitionTime":"2025-10-08T18:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.905852 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.905942 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.905968 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.906007 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.906051 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:12Z","lastTransitionTime":"2025-10-08T18:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.942943 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.943027 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.977193 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:12 crc kubenswrapper[4750]: I1008 18:11:12.992859 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f40f5c0-574a-4dd1-a4dc-024eba0628e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f90b8eb6984a4ced369d75be15a4980f41e7ca1bd3c79a64a79c974da9dc2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eecb6b3a56b80f2145ca258b9fc1473c42b44a4ded70f9ddc793868c3656191\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8961dba04ac6b2092efe4568b7f5d5afb5df6833c208d9484410ed6f8dcb13a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9df9b0a29bcccaeb1dcbdff5c85068802f2e46f39f39a63d09fe1386996c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b4f554ff77b65e8ff5a3cb8221e1c238434da5d49d308ec6684fcb2d333384\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 18:10:58.297055 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 18:10:58.298595 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1100232334/tls.crt::/tmp/serving-cert-1100232334/tls.key\\\\\\\"\\\\nI1008 18:11:03.775470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 18:11:03.786183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 18:11:03.786202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 18:11:03.786224 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 18:11:03.786230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 18:11:03.794993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 18:11:03.795026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 18:11:03.795033 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 18:11:03.795023 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 18:11:03.795038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 18:11:03.795061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 18:11:03.795066 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 18:11:03.795070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 18:11:03.797195 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ece7dde88eaa94bbf9401909374a83d5841b24deb3f494aa3c709d055f6997e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:12Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.007294 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b53212dc-64c8-4b45-b425-b0bb2df4be9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535706571aabece4fcb32be30ca7e8e4732194bfe3511685909161fa0c06b1c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55916d7bc976fac242d0c48e65a606ba0f642461b47262b4b0d46076163810d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a005895ee7ff0fde6900a7b6f6b8928aad001f1aaa7ce650d77c11ce7acc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba40cf6292b171161b700726beb7b9546e2a239de4076618f1eacf17b75631fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:13Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.009174 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.009231 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.009247 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.009268 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.009282 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:13Z","lastTransitionTime":"2025-10-08T18:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.024828 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fd0b357c32562409dc3531c04e684de9fdd4bf05080368ff3537aef9324094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:13Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.042317 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f652a41fbc65313e94232c1b72bd7ccc0f53d7c510287ac9619ab713f770fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b75c94360320f8907419e9d7b9a237989702c6c66dd2e87832e18c2c4570c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-grddb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-08T18:11:13Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.061645 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:13Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.079204 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzb5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5b55dd8e3131ba21f9cb51433c01355475968171e092627964cc5a56fabb59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzb5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:13Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.103025 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2x8kt" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9745a747-29eb-473f-bdb1-b526e1fe1445\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4026e6054000f0175162939f647586ff94806c727b2b80a9437767275bd819d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c
8eac89fab5a9eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:
11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:09Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2x8kt\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:13Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.117350 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.117404 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.117417 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.117441 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.117455 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:13Z","lastTransitionTime":"2025-10-08T18:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.145984 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d63a44-9fd7-4c19-8715-6ddec94d1806\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d0b9e5ff6544c9be0712ec8bdc160d75d061f28ab5e80aeea4cc6f74bf77e4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rl7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:13Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.160539 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6jln4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58294711-df24-43b8-b1b6-6617d55d49b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9a2cf606874110c681c7bbec359607eda7cabce0a44504ff21374021fa21a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78c7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6jln4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:13Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.188024 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3096c4e9-1e62-46ee-8e9d-9d8f5c3d8f97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53389f30a75dc05f6b778b862fec384a0aefe7bd4b7884aad3af144bc8e5b87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7428f31e2b12470cc8d9fb98bb66e19ce665d028dae0b97adf318006043e2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77be59fa3ca71194c19f0050856d30f3b6f7122bd68e07535d0a8ca6cc299832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89305f32867a4d321e16ce84e9d4e34ec3bd7594db3700a4252ad0f874dd6d48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5029feed20c32b59a88eba1f7597a6689d99d93df97f2c5c94f4153de00170f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:13Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.208330 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dbbc4357a7bebc6b495e34c1eac5bd8fb1a52a16783ce31c8f3e78614a2dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:13Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.221347 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.221394 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.221407 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.221428 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.221442 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:13Z","lastTransitionTime":"2025-10-08T18:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.230687 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:13Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.246432 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:13Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.259232 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hdvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31a78d19-68f8-46b7-9c53-2bbda2930e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810ea3f875bc348e876d76c92a9ecb7e7541afb42bb26556051d021085c107c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84vv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hdvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:13Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.274053 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39b4e601279d1f37ba24fbb347b7f36255512601fd47c31a3bc686f2385cd88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://443e4db49275949020687bd8768b8a7745b5fdea4f5fb230d872b2a8ed9d6ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:13Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.325056 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.325095 4750 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.325103 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.325118 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.325127 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:13Z","lastTransitionTime":"2025-10-08T18:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.432180 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.432289 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.432304 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.432324 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.432342 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:13Z","lastTransitionTime":"2025-10-08T18:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.534843 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.534881 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.534889 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.534904 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.534930 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:13Z","lastTransitionTime":"2025-10-08T18:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.637345 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.637392 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.637401 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.637415 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.637424 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:13Z","lastTransitionTime":"2025-10-08T18:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.733626 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:11:13 crc kubenswrapper[4750]: E1008 18:11:13.733741 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.739986 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.740022 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.740031 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.740046 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.740056 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:13Z","lastTransitionTime":"2025-10-08T18:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.842289 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.842341 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.842352 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.842370 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.842382 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:13Z","lastTransitionTime":"2025-10-08T18:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.968102 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.968181 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.968193 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.968235 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:13 crc kubenswrapper[4750]: I1008 18:11:13.968247 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:13Z","lastTransitionTime":"2025-10-08T18:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.070198 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.070239 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.070248 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.070289 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.070307 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:14Z","lastTransitionTime":"2025-10-08T18:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.172443 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.172483 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.172494 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.172513 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.172525 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:14Z","lastTransitionTime":"2025-10-08T18:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.274622 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.274657 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.274666 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.274681 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.274690 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:14Z","lastTransitionTime":"2025-10-08T18:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.376868 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.376915 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.376935 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.376954 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.376966 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:14Z","lastTransitionTime":"2025-10-08T18:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.479616 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.479649 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.479657 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.479670 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.479679 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:14Z","lastTransitionTime":"2025-10-08T18:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.582372 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.582413 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.582422 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.582438 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.582454 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:14Z","lastTransitionTime":"2025-10-08T18:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.684919 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.684968 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.684979 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.684993 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.685003 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:14Z","lastTransitionTime":"2025-10-08T18:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.733674 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.733674 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:11:14 crc kubenswrapper[4750]: E1008 18:11:14.733796 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:11:14 crc kubenswrapper[4750]: E1008 18:11:14.733890 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.747471 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dbbc4357a7bebc6b495e34c1eac5bd8fb1a52a16783ce31c8f3e78614a2dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:14Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.758440 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:14Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.770415 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:14Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.778993 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hdvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31a78d19-68f8-46b7-9c53-2bbda2930e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810ea3f875bc348e876d76c92a9ecb7e7541afb42bb26556051d021085c107c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84vv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hdvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:14Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.786513 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.786562 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.786572 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.786586 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.786604 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:14Z","lastTransitionTime":"2025-10-08T18:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.793871 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39b4e601279d1f37ba24fbb347b7f36255512601fd47c31a3bc686f2385cd88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://443e4db49275949020687bd8768b8a7745b5fdea4f5fb230d872b2a8ed9d6ac0\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:14Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.806413 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6jln4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58294711-df24-43b8-b1b6-6617d55d49b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9a2cf606874110c681c7bbec359607eda7cabce0a44504ff21374021fa21a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78c7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6jln4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:14Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.824793 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3096c4e9-1e62-46ee-8e9d-9d8f5c3d8f97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53389f30a75dc05f6b778b862fec384a0aefe7bd4b7884aad3af144bc8e5b87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7428f31e2b12470cc8d9fb98bb66e19ce665d028dae0b97adf318006043e2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77be59fa3ca71194c19f0050856d30f3b6f7122bd68e07535d0a8ca6cc299832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18
:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89305f32867a4d321e16ce84e9d4e34ec3bd7594db3700a4252ad0f874dd6d48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5029feed20c32b59a88eba1f7597a6689d99d93df97f2c5c94f4153de00170f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:14Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.834785 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fd0b357c32562409dc3531c04e684de9fdd4bf05080368ff3537aef9324094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T18:11:14Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.852520 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f652a41fbc65313e94232c1b72bd7ccc0f53d7c510287ac9619ab713f770fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b75c94360320f8907419e9d7b9a237989702c6c66dd2e87832e18c2c4570c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-grddb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:14Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.866048 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f40f5c0-574a-4dd1-a4dc-024eba0628e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f90b8eb6984a4ced369d75be15a4980f41e7ca1bd3c79a64a79c974da9dc2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eecb6b3a56b80f2145ca258b9fc1473c42b44a4ded70f9ddc793868c3656191\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8961dba04ac6b2092efe4568b7f5d5afb5df6833c208d9484410ed6f8dcb13a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9df9b0a29bcccaeb1dcbdff5c85068802f2e46f39f39a63d09fe1386996c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b4f554ff77b65e8ff5a3cb8221e1c238434da5d49d308ec6684fcb2d333384\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 18:10:58.297055 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 18:10:58.298595 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1100232334/tls.crt::/tmp/serving-cert-1100232334/tls.key\\\\\\\"\\\\nI1008 18:11:03.775470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 18:11:03.786183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 18:11:03.786202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 18:11:03.786224 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 18:11:03.786230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 18:11:03.794993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 18:11:03.795026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 18:11:03.795033 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 18:11:03.795023 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 18:11:03.795038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 
18:11:03.795061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 18:11:03.795066 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 18:11:03.795070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 18:11:03.797195 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ece7dde88eaa94bbf9401909374a83d5841b24deb3f494aa3c709d055f6997e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:14Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.876594 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b53212dc-64c8-4b45-b425-b0bb2df4be9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535706571aabece4fcb32be30ca7e8e4732194bfe3511685909161fa0c06b1c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55916d7bc976fac242d0c48e65a606ba0f642461b47262b4b0d46076163810d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a005895ee7ff0fde6900a7b6f6b8928aad001f1aaa7ce650d77c11ce7acc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba40cf6292b171161b700726beb7b9546e2a239de4076618f1eacf17b75631fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:14Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.887809 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzb5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5b55dd8e3131ba21f9cb51433c01355475968171e092627964cc5a56fabb59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzb5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:14Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.888611 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:14 crc 
kubenswrapper[4750]: I1008 18:11:14.888675 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.888689 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.888708 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.888745 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:14Z","lastTransitionTime":"2025-10-08T18:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.898238 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:14Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.909926 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2x8kt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9745a747-29eb-473f-bdb1-b526e1fe1445\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4026e6054000f0175162939f647586ff94806c727b2b80a9437767275bd819d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34195
4d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2x8kt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:14Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.926614 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d63a44-9fd7-4c19-8715-6ddec94d1806\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d0b9e5ff6544c9be0712ec8bdc160d75d061f28ab5e80aeea4cc6f74bf77e4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rl7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:14Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.972523 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rl7f4_25d63a44-9fd7-4c19-8715-6ddec94d1806/ovnkube-controller/0.log" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.974709 4750 generic.go:334] "Generic (PLEG): container finished" podID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerID="9d0b9e5ff6544c9be0712ec8bdc160d75d061f28ab5e80aeea4cc6f74bf77e4c" exitCode=1 Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.974752 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" event={"ID":"25d63a44-9fd7-4c19-8715-6ddec94d1806","Type":"ContainerDied","Data":"9d0b9e5ff6544c9be0712ec8bdc160d75d061f28ab5e80aeea4cc6f74bf77e4c"} Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.975387 4750 scope.go:117] "RemoveContainer" containerID="9d0b9e5ff6544c9be0712ec8bdc160d75d061f28ab5e80aeea4cc6f74bf77e4c" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.986877 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:14Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.991625 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.991672 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.991682 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.991740 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.991752 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:14Z","lastTransitionTime":"2025-10-08T18:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:14 crc kubenswrapper[4750]: I1008 18:11:14.999450 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzb5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5b55dd8e3131ba21f9cb51433c01355475968171e092627964cc5a56fabb59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzb5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:14Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.012513 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2x8kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9745a747-29eb-473f-bdb1-b526e1fe1445\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4026e6054000f0175162939f647586ff94806c727b2b80a9437767275bd819d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2x8kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:15Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.041158 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d63a44-9fd7-4c19-8715-6ddec94d1806\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d0b9e5ff6544c9be0712ec8bdc160d75d061f28ab5e80aeea4cc6f74bf77e4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d0b9e5ff6544c9be0712ec8bdc160d75d061f28ab5e80aeea4cc6f74bf77e4c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T18:11:14Z\\\",\\\"message\\\":\\\"ressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 18:11:14.110102 6030 reflector.go:311] Stopping reflector *v1.EgressService (0s) 
from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 18:11:14.110315 6030 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1008 18:11:14.110604 6030 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1008 18:11:14.110918 6030 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1008 18:11:14.111056 6030 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1008 18:11:14.111077 6030 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1008 18:11:14.111091 6030 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1008 18:11:14.111104 6030 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1008 18:11:14.111113 6030 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1008 18:11:14.111124 6030 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1008 18:11:14.111163 6030 factory.go:656] Stopping watch factory\\\\nI1008 18:11:14.111197 6030 ovnkube.go:599] Stopped ovnkube\\\\nI1008 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rl7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:15Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.054955 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:15Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.070914 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:15Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.085188 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hdvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31a78d19-68f8-46b7-9c53-2bbda2930e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810ea3f875bc348e876d76c92a9ecb7e7541afb42bb26556051d021085c107c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84vv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hdvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:15Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.094041 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.094062 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.094089 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.094102 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.094111 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:15Z","lastTransitionTime":"2025-10-08T18:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.099320 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39b4e601279d1f37ba24fbb347b7f36255512601fd47c31a3bc686f2385cd88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://443e4db49275949020687bd8768b8a7745b5fdea4f5fb230d872b2a8ed9d6ac0\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:15Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.110642 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6jln4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58294711-df24-43b8-b1b6-6617d55d49b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9a2cf606874110c681c7bbec359607eda7cabce0a44504ff21374021fa21a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78c7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6jln4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:15Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.131874 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3096c4e9-1e62-46ee-8e9d-9d8f5c3d8f97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53389f30a75dc05f6b778b862fec384a0aefe7bd4b7884aad3af144bc8e5b87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7428f31e2b12470cc8d9fb98bb66e19ce665d028dae0b97adf318006043e2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77be59fa3ca71194c19f0050856d30f3b6f7122bd68e07535d0a8ca6cc299832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18
:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89305f32867a4d321e16ce84e9d4e34ec3bd7594db3700a4252ad0f874dd6d48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5029feed20c32b59a88eba1f7597a6689d99d93df97f2c5c94f4153de00170f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:15Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.147668 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dbbc4357a7bebc6b495e34c1eac5bd8fb1a52a16783ce31c8f3e78614a2dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:15Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.161896 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f652a41fbc65313e94232c1b72bd7ccc0f53d7c510287ac9619ab713f770fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b75c94360320f8907419e9d7b9a237989702c6c66dd2e87832e18c2c4570c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-grddb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:15Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.174860 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f40f5c0-574a-4dd1-a4dc-024eba0628e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f90b8eb6984a4ced369d75be15a4980f41e7ca1bd3c79a64a79c974da9dc2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eecb6b3a56b80f2145ca258b9fc1473c42b44a4ded70f9ddc793868c3656191\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8961dba04ac6b2092efe4568b7f5d5afb5df6833c208d9484410ed6f8dcb13a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9df9b0a29bcccaeb1dcbdff5c85068802f2e46f39f39a63d09fe1386996c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b4f554ff77b65e8ff5a3cb8221e1c238434da5d49d308ec6684fcb2d333384\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 18:10:58.297055 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 18:10:58.298595 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1100232334/tls.crt::/tmp/serving-cert-1100232334/tls.key\\\\\\\"\\\\nI1008 18:11:03.775470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 18:11:03.786183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 18:11:03.786202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 18:11:03.786224 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 18:11:03.786230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 18:11:03.794993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 18:11:03.795026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 18:11:03.795033 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 18:11:03.795023 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 18:11:03.795038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 
18:11:03.795061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 18:11:03.795066 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 18:11:03.795070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 18:11:03.797195 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ece7dde88eaa94bbf9401909374a83d5841b24deb3f494aa3c709d055f6997e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:15Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.189560 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b53212dc-64c8-4b45-b425-b0bb2df4be9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535706571aabece4fcb32be30ca7e8e4732194bfe3511685909161fa0c06b1c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55916d7bc976fac242d0c48e65a606ba0f642461b47262b4b0d46076163810d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a005895ee7ff0fde6900a7b6f6b8928aad001f1aaa7ce650d77c11ce7acc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba40cf6292b171161b700726beb7b9546e2a239de4076618f1eacf17b75631fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:15Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.196767 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.196815 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.196825 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.196838 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.196847 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:15Z","lastTransitionTime":"2025-10-08T18:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.202064 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fd0b357c32562409dc3531c04e684de9fdd4bf05080368ff3537aef9324094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:15Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.299859 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.299943 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.299972 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.300027 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.300045 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:15Z","lastTransitionTime":"2025-10-08T18:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.402625 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.402670 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.402683 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.402706 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.402721 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:15Z","lastTransitionTime":"2025-10-08T18:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.505342 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.505383 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.505394 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.505412 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.505423 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:15Z","lastTransitionTime":"2025-10-08T18:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.608271 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.608307 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.608318 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.608333 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.608344 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:15Z","lastTransitionTime":"2025-10-08T18:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.710377 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.710427 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.710437 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.710452 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.710464 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:15Z","lastTransitionTime":"2025-10-08T18:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.733736 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:11:15 crc kubenswrapper[4750]: E1008 18:11:15.733877 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.812932 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.812984 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.812997 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.813018 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.813030 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:15Z","lastTransitionTime":"2025-10-08T18:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.915430 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.915748 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.915756 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.915771 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.915782 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:15Z","lastTransitionTime":"2025-10-08T18:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.979489 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rl7f4_25d63a44-9fd7-4c19-8715-6ddec94d1806/ovnkube-controller/1.log" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.980272 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rl7f4_25d63a44-9fd7-4c19-8715-6ddec94d1806/ovnkube-controller/0.log" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.982316 4750 generic.go:334] "Generic (PLEG): container finished" podID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerID="1b36c495aef7af8cf2e56a97c7f408a8bb4c9a5b3b93f541f7872c1f942b25a2" exitCode=1 Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.982353 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" event={"ID":"25d63a44-9fd7-4c19-8715-6ddec94d1806","Type":"ContainerDied","Data":"1b36c495aef7af8cf2e56a97c7f408a8bb4c9a5b3b93f541f7872c1f942b25a2"} Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.982394 4750 scope.go:117] "RemoveContainer" containerID="9d0b9e5ff6544c9be0712ec8bdc160d75d061f28ab5e80aeea4cc6f74bf77e4c" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.982969 4750 scope.go:117] "RemoveContainer" containerID="1b36c495aef7af8cf2e56a97c7f408a8bb4c9a5b3b93f541f7872c1f942b25a2" Oct 08 18:11:15 crc kubenswrapper[4750]: E1008 18:11:15.983195 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-rl7f4_openshift-ovn-kubernetes(25d63a44-9fd7-4c19-8715-6ddec94d1806)\"" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" Oct 08 18:11:15 crc kubenswrapper[4750]: I1008 18:11:15.999776 4750 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:15Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.018046 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.018076 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.018084 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.018097 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.018106 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:16Z","lastTransitionTime":"2025-10-08T18:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.018986 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzb5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5b55dd8e3131ba21f9cb51433c01355475968171e092627964cc5a56fabb59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzb5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:16Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.040350 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2x8kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9745a747-29eb-473f-bdb1-b526e1fe1445\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4026e6054000f0175162939f647586ff94806c727b2b80a9437767275bd819d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2x8kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:16Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.061177 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d63a44-9fd7-4c19-8715-6ddec94d1806\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b36c495aef7af8cf2e56a97c7f408a8bb4c9a5b3b93f541f7872c1f942b25a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d0b9e5ff6544c9be0712ec8bdc160d75d061f28ab5e80aeea4cc6f74bf77e4c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T18:11:14Z\\\",\\\"message\\\":\\\"ressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 18:11:14.110102 6030 reflector.go:311] Stopping reflector *v1.EgressService (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 18:11:14.110315 6030 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1008 18:11:14.110604 6030 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1008 18:11:14.110918 6030 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1008 18:11:14.111056 6030 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1008 18:11:14.111077 6030 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1008 18:11:14.111091 6030 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1008 18:11:14.111104 6030 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1008 18:11:14.111113 6030 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1008 18:11:14.111124 6030 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1008 18:11:14.111163 6030 factory.go:656] Stopping watch factory\\\\nI1008 18:11:14.111197 6030 ovnkube.go:599] Stopped ovnkube\\\\nI1008 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b36c495aef7af8cf2e56a97c7f408a8bb4c9a5b3b93f541f7872c1f942b25a2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T18:11:15Z\\\",\\\"message\\\":\\\"alse, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-daemon_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", 
Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-daemon\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.43\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.43\\\\\\\", Port:8798, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1008 18:11:15.814210 6149 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler 
{0x1f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\
\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d6
4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rl7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:16Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.084003 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3096c4e9-1e62-46ee-8e9d-9d8f5c3d8f97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53389f30a75dc05f6b778b862fec384a0aefe7bd4b7884aad3af144bc8e5b87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7428f31e2b12470cc8d9fb98bb66e19ce665d028dae0b97adf318006043e2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77be59fa3ca71194c19f0050856d30f3b6f7122bd68e07535d0a8ca6cc299832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89305f32867a4d321e16ce84e9d4e34ec3bd7594db3700a4252ad0f874dd6d48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5029feed20c32b59a88eba1f7597a6689d99d93df97f2c5c94f4153de00170f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:16Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.099103 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dbbc4357a7bebc6b495e34c1eac5bd8fb1a52a16783ce31c8f3e78614a2dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:16Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.114098 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:16Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.120531 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.120592 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.120604 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 
18:11:16.120620 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.120630 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:16Z","lastTransitionTime":"2025-10-08T18:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.133593 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:16Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.148786 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hdvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31a78d19-68f8-46b7-9c53-2bbda2930e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810ea3f875bc348e876d76c92a9ecb7e7541afb42bb26556051d021085c107c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84vv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hdvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:16Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.168216 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39b4e601279d1f37ba24fbb347b7f36255512601fd47c31a3bc686f2385cd88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://443e4db49275949020687bd8768b8a7745b5fdea4f5fb230d872b2a8ed9d6ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:16Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.187911 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6jln4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58294711-df24-43b8-b1b6-6617d55d49b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9a2cf606874110c681c7bbec359607eda7cabce0a44504ff21374021fa21a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78c7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6jln4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:16Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.239881 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.239924 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.239933 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.239948 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.239957 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:16Z","lastTransitionTime":"2025-10-08T18:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.244010 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f40f5c0-574a-4dd1-a4dc-024eba0628e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f90b8eb6984a4ced369d75be15a4980f41e7ca1bd3c79a64a79c974da9dc2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eecb6b3a56b80f2145ca258b9fc1473c42b44a4ded70f9ddc793868c3656191\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8961dba04ac6b2092efe4568b7f5d5afb5df6833c208d9484410ed6f8dcb13a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9df9b0a29bcccaeb1dcbdff5c85068802f2e46f39f39a63d09fe1386996c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b4f554ff77b65e8ff5a3cb8221e1c238434da5d49d308ec6684fcb2d333384\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 18:10:58.297055 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 18:10:58.298595 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1100232334/tls.crt::/tmp/serving-cert-1100232334/tls.key\\\\\\\"\\\\nI1008 18:11:03.775470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 18:11:03.786183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 18:11:03.786202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 18:11:03.786224 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 18:11:03.786230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 18:11:03.794993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 18:11:03.795026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 18:11:03.795033 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 18:11:03.795023 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 18:11:03.795038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 18:11:03.795061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 18:11:03.795066 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 18:11:03.795070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 18:11:03.797195 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ece7dde88eaa94bbf9401909374a83d5841b24deb3f494aa3c709d055f6997e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:16Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.258368 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b53212dc-64c8-4b45-b425-b0bb2df4be9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535706571aabece4fcb32be30ca7e8e4732194bfe3511685909161fa0c06b1c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55916d7bc976fac242d0c48e65a606ba0f642461b47262b4b0d46076163810d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a005895ee7ff0fde6900a7b6f6b8928aad001f1aaa7ce650d77c11ce7acc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba40cf6292b171161b700726beb7b9546e2a239de4076618f1eacf17b75631fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:16Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.271091 4750 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fd0b357c32562409dc3531c04e684de9fdd4bf05080368ff3537aef9324094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:16Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.287428 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f652a41fbc65313e94232c1b72bd7ccc0f53d7c510287ac9619ab713f770fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b75c94360320f8907419e9d7b9a237989702c6c66dd2e87832e18c2c4570c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-grddb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:16Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.342672 4750 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.342710 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.342718 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.342732 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.342743 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:16Z","lastTransitionTime":"2025-10-08T18:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.445399 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.445450 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.445460 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.445475 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.445487 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:16Z","lastTransitionTime":"2025-10-08T18:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.548483 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.548570 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.548582 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.548600 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.548610 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:16Z","lastTransitionTime":"2025-10-08T18:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.651813 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.651871 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.651884 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.651902 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.651913 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:16Z","lastTransitionTime":"2025-10-08T18:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.733924 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.733960 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:11:16 crc kubenswrapper[4750]: E1008 18:11:16.734070 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:11:16 crc kubenswrapper[4750]: E1008 18:11:16.734213 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.754653 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.754689 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.754700 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.754714 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.754723 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:16Z","lastTransitionTime":"2025-10-08T18:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.858042 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.858086 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.858096 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.858114 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.858125 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:16Z","lastTransitionTime":"2025-10-08T18:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.960855 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.960948 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.960975 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.961005 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.961025 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:16Z","lastTransitionTime":"2025-10-08T18:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.962522 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mhhrt"] Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.962958 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mhhrt" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.965733 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.966712 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.983855 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f40f5c0-574a-4dd1-a4dc-024eba0628e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f90b8eb6984a4ced369d75be15a4980f41e7ca1bd3c79a64a79c974da9dc2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eecb6b3a56b80f2145ca258b9fc1473c42b44a4ded70f9ddc793868c3656191\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8961dba04ac6b2092efe4568b7f5d5afb5df6833c208d9484410ed6f8dcb13a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9df9b0a29bcccaeb1dcbdff5c85068802f2e46f39f39a63d09fe1386996c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b4f554ff77b65e8ff5a3cb8221e1c238434da5d49d308ec6684fcb2d333384\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 18:10:58.297055 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 18:10:58.298595 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1100232334/tls.crt::/tmp/serving-cert-1100232334/tls.key\\\\\\\"\\\\nI1008 18:11:03.775470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 18:11:03.786183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 18:11:03.786202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 18:11:03.786224 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 18:11:03.786230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 18:11:03.794993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 18:11:03.795026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 18:11:03.795033 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 18:11:03.795023 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 18:11:03.795038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 18:11:03.795061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 18:11:03.795066 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 18:11:03.795070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 18:11:03.797195 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ece7dde88eaa94bbf9401909374a83d5841b24deb3f494aa3c709d055f6997e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:16Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.988036 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rl7f4_25d63a44-9fd7-4c19-8715-6ddec94d1806/ovnkube-controller/1.log" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.993444 4750 scope.go:117] "RemoveContainer" containerID="1b36c495aef7af8cf2e56a97c7f408a8bb4c9a5b3b93f541f7872c1f942b25a2" Oct 08 18:11:16 crc kubenswrapper[4750]: E1008 18:11:16.993856 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-rl7f4_openshift-ovn-kubernetes(25d63a44-9fd7-4c19-8715-6ddec94d1806)\"" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" Oct 08 18:11:16 crc kubenswrapper[4750]: I1008 18:11:16.998914 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b53212dc-64c8-4b45-b425-b0bb2df4be9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535706571aabece4fcb32be30ca7e8e4732194bfe3511685909161fa0c06b1c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55916d7bc976fac242d0c48e65a606ba0f642461b47262b4b0d46076163810d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a005895ee7ff0fde6900a7b6f6b8928aad001f1aaa7ce650d77c11ce7acc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba40cf6292b171161b700726beb7b9546e2a239de4076618f1eacf17b75631fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:16Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.016544 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fd0b357c32562409dc3531c04e684de9fdd4bf05080368ff3537aef9324094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T18:11:17Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.028028 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f652a41fbc65313e94232c1b72bd7ccc0f53d7c510287ac9619ab713f770fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b75c94360320f8907419e9d7b9a237989702c6c66dd2e87832e18c2c4570c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-grddb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:17Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.038958 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mhhrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b11eea9-c866-4055-abed-9955637179b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mhhrt\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:17Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.052629 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:17Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.064452 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.064506 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.064520 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.064539 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.064578 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:17Z","lastTransitionTime":"2025-10-08T18:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.069325 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzb5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5b55dd8e3131ba21f9cb51433c01355475968171e092627964cc5a56fabb59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzb5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:17Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.089327 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2x8kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9745a747-29eb-473f-bdb1-b526e1fe1445\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4026e6054000f0175162939f647586ff94806c727b2b80a9437767275bd819d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2x8kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:17Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.099282 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vtgh\" (UniqueName: \"kubernetes.io/projected/2b11eea9-c866-4055-abed-9955637179b7-kube-api-access-8vtgh\") pod \"ovnkube-control-plane-749d76644c-mhhrt\" (UID: \"2b11eea9-c866-4055-abed-9955637179b7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mhhrt" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.099425 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2b11eea9-c866-4055-abed-9955637179b7-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-mhhrt\" (UID: \"2b11eea9-c866-4055-abed-9955637179b7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mhhrt" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.099766 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2b11eea9-c866-4055-abed-9955637179b7-env-overrides\") pod \"ovnkube-control-plane-749d76644c-mhhrt\" (UID: \"2b11eea9-c866-4055-abed-9955637179b7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mhhrt" Oct 08 18:11:17 crc 
kubenswrapper[4750]: I1008 18:11:17.099845 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2b11eea9-c866-4055-abed-9955637179b7-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-mhhrt\" (UID: \"2b11eea9-c866-4055-abed-9955637179b7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mhhrt" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.113658 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d63a44-9fd7-4c19-8715-6ddec94d1806\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b36c495aef7af8cf2e56a97c7f408a8bb4c9a5b3b93f541f7872c1f942b25a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d0b9e5ff6544c9be0712ec8bdc160d75d061f28ab5e80aeea4cc6f74bf77e4c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T18:11:14Z\\\",\\\"message\\\":\\\"ressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 18:11:14.110102 6030 reflector.go:311] Stopping reflector *v1.EgressService (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 18:11:14.110315 6030 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1008 18:11:14.110604 6030 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1008 18:11:14.110918 6030 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1008 18:11:14.111056 6030 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1008 18:11:14.111077 6030 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1008 18:11:14.111091 6030 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1008 18:11:14.111104 6030 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1008 18:11:14.111113 6030 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1008 18:11:14.111124 6030 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1008 18:11:14.111163 6030 factory.go:656] Stopping watch factory\\\\nI1008 18:11:14.111197 6030 ovnkube.go:599] Stopped ovnkube\\\\nI1008 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b36c495aef7af8cf2e56a97c7f408a8bb4c9a5b3b93f541f7872c1f942b25a2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T18:11:15Z\\\",\\\"message\\\":\\\"alse, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-daemon_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", 
Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-daemon\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.43\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.43\\\\\\\", Port:8798, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1008 18:11:15.814210 6149 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler 
{0x1f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\
\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d6
4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rl7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:17Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.127513 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39b4e601279d1f37ba24fbb347b7f36255512601fd47c31a3bc686f2385cd88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://443e4db49275949020687bd8768b8a7745b5fdea4f5fb230d872b2a8ed9d6ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:17Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.147624 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6jln4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58294711-df24-43b8-b1b6-6617d55d49b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9a2cf606874110c681c7bbec359607eda7cabce0a44504ff21374021fa21a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78c7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6jln4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:17Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.167753 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.167991 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.168100 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.168163 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.168245 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:17Z","lastTransitionTime":"2025-10-08T18:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.171535 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3096c4e9-1e62-46ee-8e9d-9d8f5c3d8f97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53389f30a75dc05f6b778b862fec384a0aefe7bd4b7884aad3af144bc8e5b87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7428f31e2b12470cc8d9fb98bb66e19ce665d028dae0b97adf318006043e2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77be59fa3ca71194c19f0050856d30f3b6f7122bd68e07535d0a8ca6cc299832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89305f32867a4d321e16ce84e9d4e34ec3bd7594db3700a4252ad0f874dd6d48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5029feed20c32b59a88eba1f7597a6689d99d93df97f2c5c94f4153de00170f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:17Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.187460 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dbbc4357a7bebc6b495e34c1eac5bd8fb1a52a16783ce31c8f3e78614a2dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:17Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.201457 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2b11eea9-c866-4055-abed-9955637179b7-env-overrides\") pod \"ovnkube-control-plane-749d76644c-mhhrt\" (UID: \"2b11eea9-c866-4055-abed-9955637179b7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mhhrt" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.201500 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2b11eea9-c866-4055-abed-9955637179b7-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-mhhrt\" (UID: \"2b11eea9-c866-4055-abed-9955637179b7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mhhrt" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.201525 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vtgh\" (UniqueName: \"kubernetes.io/projected/2b11eea9-c866-4055-abed-9955637179b7-kube-api-access-8vtgh\") pod \"ovnkube-control-plane-749d76644c-mhhrt\" (UID: \"2b11eea9-c866-4055-abed-9955637179b7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mhhrt" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.201562 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2b11eea9-c866-4055-abed-9955637179b7-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-mhhrt\" (UID: \"2b11eea9-c866-4055-abed-9955637179b7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mhhrt" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.201894 4750 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:17Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.202375 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2b11eea9-c866-4055-abed-9955637179b7-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-mhhrt\" (UID: \"2b11eea9-c866-4055-abed-9955637179b7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mhhrt" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.202587 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2b11eea9-c866-4055-abed-9955637179b7-env-overrides\") pod \"ovnkube-control-plane-749d76644c-mhhrt\" (UID: \"2b11eea9-c866-4055-abed-9955637179b7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mhhrt" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.208756 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/2b11eea9-c866-4055-abed-9955637179b7-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-mhhrt\" (UID: \"2b11eea9-c866-4055-abed-9955637179b7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mhhrt" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.221907 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:17Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.224244 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vtgh\" (UniqueName: \"kubernetes.io/projected/2b11eea9-c866-4055-abed-9955637179b7-kube-api-access-8vtgh\") pod \"ovnkube-control-plane-749d76644c-mhhrt\" (UID: \"2b11eea9-c866-4055-abed-9955637179b7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mhhrt" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.235006 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hdvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31a78d19-68f8-46b7-9c53-2bbda2930e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810ea3f875bc348e876d76c92a9ecb7e7541afb42bb26556051d021085c107c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84vv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hdvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:17Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.249661 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b53212dc-64c8-4b45-b425-b0bb2df4be9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535706571aabece4fcb32be30ca7e8e4732194bfe3511685909161fa0c06b1c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55916d7bc976fac242d0c48e65a606ba0f642461b47262b4b0d46076163810d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a005895ee7ff0fde6900a7b6f6b8928aad001f1aaa7ce650d77c11ce7acc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba40cf6292b171161b700726beb7b9546e2a239de4076618f1eacf17b75631fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:17Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.267350 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fd0b357c32562409dc3531c04e684de9fdd4bf05080368ff3537aef9324094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T18:11:17Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.271228 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.271326 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.271383 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.271446 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.271582 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:17Z","lastTransitionTime":"2025-10-08T18:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.281410 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mhhrt" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.281412 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f652a41fbc65313e94232c1b72bd7ccc0f53d7c510287ac9619ab713f770fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-
rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b75c94360320f8907419e9d7b9a237989702c6c66dd2e87832e18c2c4570c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-grddb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:17Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.297850 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mhhrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b11eea9-c866-4055-abed-9955637179b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mhhrt\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:17Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.315231 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f40f5c0-574a-4dd1-a4dc-024eba0628e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f90b8eb6984a4ced369d75be15a4980f41e7ca1bd3c79a64a79c974da9dc2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eecb6b3a56b80f2145ca258b9fc1473c42b44a4ded70f9ddc793868c3656191\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8961dba04ac6b2092efe4568b7f5d5afb5df6833c208d9484410ed6f8dcb13a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9df9b0a29bcccaeb1dcbdff5c85068802f2e46f39f39a63d09fe1386996c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b4f554ff77b65e8ff5a3cb8221e1c238434da5d49d308ec6684fcb2d333384\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 18:10:58.297055 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 18:10:58.298595 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1100232334/tls.crt::/tmp/serving-cert-1100232334/tls.key\\\\\\\"\\\\nI1008 18:11:03.775470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 18:11:03.786183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 18:11:03.786202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 18:11:03.786224 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 18:11:03.786230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 18:11:03.794993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 18:11:03.795026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 18:11:03.795033 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 18:11:03.795023 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 18:11:03.795038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 18:11:03.795061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 18:11:03.795066 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 18:11:03.795070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 18:11:03.797195 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ece7dde88eaa94bbf9401909374a83d5841b24deb3f494aa3c709d055f6997e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:17Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.335461 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:17Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.349325 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzb5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5b55dd8e3131ba21f9cb51433c01355475968171e092627964cc5a56fabb59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzb5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:17Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.374405 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:17 crc 
kubenswrapper[4750]: I1008 18:11:17.374453 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.374464 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.374480 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.374490 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:17Z","lastTransitionTime":"2025-10-08T18:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.378740 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d63a44-9fd7-4c19-8715-6ddec94d1806\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b36c495aef7af8cf2e56a97c7f408a8bb4c9a5b3b93f541f7872c1f942b25a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b36c495aef7af8cf2e56a97c7f408a8bb4c9a5b3b93f541f7872c1f942b25a2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T18:11:15Z\\\",\\\"message\\\":\\\"alse, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, 
AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-daemon_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-daemon\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.43\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.43\\\\\\\", Port:8798, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1008 18:11:15.814210 6149 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rl7f4_openshift-ovn-kubernetes(25d63a44-9fd7-4c19-8715-6ddec94d1806)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34
d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rl7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:17Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.394679 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2x8kt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9745a747-29eb-473f-bdb1-b526e1fe1445\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4026e6054000f0175162939f647586ff94806c727b2b80a9437767275bd819d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34195
4d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2x8kt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:17Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.417209 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3096c4e9-1e62-46ee-8e9d-9d8f5c3d8f97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53389f30a75dc05f6b778b862fec384a0aefe7bd4b7884aad3af144bc8e5b87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7428f31e2b12470cc8d9fb98bb66e19ce665d028dae0b97adf318006043e2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77be59fa3ca71194c19f0050856d30f3b6f7122bd68e07535d0a8ca6cc299832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89305f32867a4d321
e16ce84e9d4e34ec3bd7594db3700a4252ad0f874dd6d48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5029feed20c32b59a88eba1f7597a6689d99d93df97f2c5c94f4153de00170f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:17Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.432327 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dbbc4357a7bebc6b495e34c1eac5bd8fb1a52a16783ce31c8f3e78614a2dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:17Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.443769 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:17Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.457890 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:17Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.469139 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hdvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31a78d19-68f8-46b7-9c53-2bbda2930e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810ea3f875bc348e876d76c92a9ecb7e7541afb42bb26556051d021085c107c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84vv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hdvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:17Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.480169 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.480234 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.480245 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.480259 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.480268 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:17Z","lastTransitionTime":"2025-10-08T18:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.489948 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39b4e601279d1f37ba24fbb347b7f36255512601fd47c31a3bc686f2385cd88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://443e4db49275949020687bd8768b8a7745b5fdea4f5fb230d872b2a8ed9d6ac0\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:17Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.502664 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6jln4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58294711-df24-43b8-b1b6-6617d55d49b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9a2cf606874110c681c7bbec359607eda7cabce0a44504ff21374021fa21a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78c7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6jln4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:17Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.582391 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.582440 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.582452 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.582472 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.582484 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:17Z","lastTransitionTime":"2025-10-08T18:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.685012 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.685257 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.685345 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.685429 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.685542 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:17Z","lastTransitionTime":"2025-10-08T18:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.733119 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:11:17 crc kubenswrapper[4750]: E1008 18:11:17.733244 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.788004 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.788078 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.788088 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.788101 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.788112 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:17Z","lastTransitionTime":"2025-10-08T18:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.890765 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.890806 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.890816 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.890831 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.890841 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:17Z","lastTransitionTime":"2025-10-08T18:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.994032 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.994072 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.994081 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.994095 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.994105 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:17Z","lastTransitionTime":"2025-10-08T18:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.997604 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mhhrt" event={"ID":"2b11eea9-c866-4055-abed-9955637179b7","Type":"ContainerStarted","Data":"49e7f95a79c34d04ab7881b283c072184f575623df24dadf0b1bc71c05bbcb92"} Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.997679 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mhhrt" event={"ID":"2b11eea9-c866-4055-abed-9955637179b7","Type":"ContainerStarted","Data":"c7fa876044092370bd4f89fae3df8cd06b7d27515a90ecc0dd4393c870854253"} Oct 08 18:11:17 crc kubenswrapper[4750]: I1008 18:11:17.997691 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mhhrt" event={"ID":"2b11eea9-c866-4055-abed-9955637179b7","Type":"ContainerStarted","Data":"31c01f1eac78711d8b95c9645c2b70e5b544346a8f24808affb877d7c905f812"} Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.016222 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6jln4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58294711-df24-43b8-b1b6-6617d55d49b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9a2cf606874110c681c7bbec359607eda7cabce0a44504ff21374021fa21a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78c7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6jln4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:18Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.035976 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3096c4e9-1e62-46ee-8e9d-9d8f5c3d8f97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53389f30a75dc05f6b778b862fec384a0aefe7bd4b7884aad3af144bc8e5b87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7428f31e2b12470cc8d9fb98bb66e19ce665d028dae0b97adf318006043e2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77be59fa3ca71194c19f0050856d30f3b6f7122bd68e07535d0a8ca6cc299832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18
:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89305f32867a4d321e16ce84e9d4e34ec3bd7594db3700a4252ad0f874dd6d48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5029feed20c32b59a88eba1f7597a6689d99d93df97f2c5c94f4153de00170f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:18Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.048695 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dbbc4357a7bebc6b495e34c1eac5bd8fb1a52a16783ce31c8f3e78614a2dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:18Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.072935 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:18Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.083812 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-7f9jd"] Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.084299 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:11:18 crc kubenswrapper[4750]: E1008 18:11:18.084359 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.094998 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:18Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.097517 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.097583 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.097593 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.097608 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.097620 4750 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:18Z","lastTransitionTime":"2025-10-08T18:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.109048 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hdvcg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31a78d19-68f8-46b7-9c53-2bbda2930e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810ea3f875bc348e876d76c92a9ecb7e7541afb42bb26556051d021085c107c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84vv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hdvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:18Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.126200 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39b4e601279d1f37ba24fbb347b7f36255512601fd47c31a3bc686f2385cd88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://443e4db49275949020687bd8768b8a7745b5fdea4f5fb230d872b2a8ed9d6ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:18Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.140125 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f40f5c0-574a-4dd1-a4dc-024eba0628e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f90b8eb6984a4ced369d75be15a4980f41e7ca1bd3c79a64a79c974da9dc2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eecb6b3a56b80f2145ca258b9fc1473c42b44a4ded70f9ddc793868c3656191\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8961dba04ac6b2092efe4568b7f5d5afb5df6833c208d9484410ed6f8dcb13a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9df9b0a29bcccaeb1dcbdff5c85068802f2e46f39f39a63d09fe1386996c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b4f554ff77b65e8ff5a3cb8221e1c238434da5d49d308ec6684fcb2d333384\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 18:10:58.297055 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 18:10:58.298595 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1100232334/tls.crt::/tmp/serving-cert-1100232334/tls.key\\\\\\\"\\\\nI1008 18:11:03.775470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 18:11:03.786183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 18:11:03.786202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 18:11:03.786224 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 18:11:03.786230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 18:11:03.794993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 18:11:03.795026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 18:11:03.795033 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 18:11:03.795023 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 18:11:03.795038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 18:11:03.795061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 18:11:03.795066 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 18:11:03.795070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 18:11:03.797195 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ece7dde88eaa94bbf9401909374a83d5841b24deb3f494aa3c709d055f6997e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:18Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.151981 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b53212dc-64c8-4b45-b425-b0bb2df4be9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535706571aabece4fcb32be30ca7e8e4732194bfe3511685909161fa0c06b1c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55916d7bc976fac242d0c48e65a606ba0f642461b47262b4b0d46076163810d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a005895ee7ff0fde6900a7b6f6b8928aad001f1aaa7ce650d77c11ce7acc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba40cf6292b171161b700726beb7b9546e2a239de4076618f1eacf17b75631fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:18Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.165210 4750 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fd0b357c32562409dc3531c04e684de9fdd4bf05080368ff3537aef9324094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:18Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.177829 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f652a41fbc65313e94232c1b72bd7ccc0f53d7c510287ac9619ab713f770fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b75c94360320f8907419e9d7b9a237989702c6c66dd2e87832e18c2c4570c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-grddb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:18Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.187954 4750 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mhhrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b11eea9-c866-4055-abed-9955637179b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7fa876044092370bd4f89fae3df8cd06b7d27515a90ecc0dd4393c870854253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7f95a79c34d04ab7881b283c072184f575623df24dadf0b1bc71c05bbcb92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mhhrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:18Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.199652 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.199698 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 
18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.199709 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.199727 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.199739 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:18Z","lastTransitionTime":"2025-10-08T18:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.202726 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:18Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.211142 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b67ae9d5-e575-45e7-913a-01f379b86416-metrics-certs\") pod \"network-metrics-daemon-7f9jd\" (UID: \"b67ae9d5-e575-45e7-913a-01f379b86416\") " pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.211210 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-vhxt9\" (UniqueName: \"kubernetes.io/projected/b67ae9d5-e575-45e7-913a-01f379b86416-kube-api-access-vhxt9\") pod \"network-metrics-daemon-7f9jd\" (UID: \"b67ae9d5-e575-45e7-913a-01f379b86416\") " pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.216349 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzb5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5b55dd8e3131ba21f9cb51433c01355475968171e092627964cc5a56fabb59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025
-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzb5c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:18Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.230672 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2x8kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9745a747-29eb-473f-bdb1-b526e1fe1445\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4026e6054000f0175162939f647586ff94806c727b2b80a9437767275bd819d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\"
:{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"
/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\
\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"
name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2x8kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:18Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.260873 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d63a44-9fd7-4c19-8715-6ddec94d1806\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers 
with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ov
n-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b36c495aef7af8cf2e56a97c7f408a8bb4c9a5b3b93f541f7872c1f942b25a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b36c495aef7af8cf2e56a97c7f408a8bb4c9a5b3b93f541f7872c1f942b25a2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T18:11:15Z\\\",\\\"message\\\":\\\"alse, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-daemon_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-daemon\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.43\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.43\\\\\\\", Port:8798, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1008 18:11:15.814210 6149 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rl7f4_openshift-ovn-kubernetes(25d63a44-9fd7-4c19-8715-6ddec94d1806)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34
d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rl7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:18Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.273104 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:18Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.288249 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzb5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5b55dd8e3131ba21f9cb51433c01355475968171e092627964cc5a56fabb59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzb5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:18Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.301218 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2x8kt" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9745a747-29eb-473f-bdb1-b526e1fe1445\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4026e6054000f0175162939f647586ff94806c727b2b80a9437767275bd819d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c
8eac89fab5a9eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:
11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:09Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2x8kt\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:18Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.301515 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.301675 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.301688 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.301710 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.301723 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:18Z","lastTransitionTime":"2025-10-08T18:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.312414 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhxt9\" (UniqueName: \"kubernetes.io/projected/b67ae9d5-e575-45e7-913a-01f379b86416-kube-api-access-vhxt9\") pod \"network-metrics-daemon-7f9jd\" (UID: \"b67ae9d5-e575-45e7-913a-01f379b86416\") " pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.312492 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b67ae9d5-e575-45e7-913a-01f379b86416-metrics-certs\") pod \"network-metrics-daemon-7f9jd\" (UID: \"b67ae9d5-e575-45e7-913a-01f379b86416\") " pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:11:18 crc kubenswrapper[4750]: E1008 18:11:18.312659 4750 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 18:11:18 crc kubenswrapper[4750]: E1008 18:11:18.312721 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b67ae9d5-e575-45e7-913a-01f379b86416-metrics-certs podName:b67ae9d5-e575-45e7-913a-01f379b86416 nodeName:}" failed. No retries permitted until 2025-10-08 18:11:18.812704632 +0000 UTC m=+34.725675655 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b67ae9d5-e575-45e7-913a-01f379b86416-metrics-certs") pod "network-metrics-daemon-7f9jd" (UID: "b67ae9d5-e575-45e7-913a-01f379b86416") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.318452 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d63a44-9fd7-4c19-8715-6ddec94d1806\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b36c495aef7af8cf2e56a97c7f408a8bb4c9a5b3b93f541f7872c1f942b25a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b36c495aef7af8cf2e56a97c7f408a8bb4c9a5b3b93f541f7872c1f942b25a2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T18:11:15Z\\\",\\\"message\\\":\\\"alse, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-daemon_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-daemon\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.43\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.43\\\\\\\", Port:8798, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1008 18:11:15.814210 6149 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rl7f4_openshift-ovn-kubernetes(25d63a44-9fd7-4c19-8715-6ddec94d1806)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34
d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rl7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:18Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.330156 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhxt9\" (UniqueName: \"kubernetes.io/projected/b67ae9d5-e575-45e7-913a-01f379b86416-kube-api-access-vhxt9\") pod \"network-metrics-daemon-7f9jd\" (UID: \"b67ae9d5-e575-45e7-913a-01f379b86416\") " pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.353728 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3096c4e9-1e62-46ee-8e9d-9d8f5c3d8f97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53389f30a75dc05f6b778b862fec384a0aefe7bd4b7884aad3af144bc8e5b87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7428f31e2b12470cc8d9fb98bb66e19ce665d028dae0b97adf318006043e2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77be59fa3ca71194c19f0050856d30f3b6f7122bd68e07535d0a8ca6cc299832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89305f32867a4d321e16ce84e9d4e34ec3bd7594db3700a4252ad0f874dd6d48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5029feed20c32b59a88eba1f7597a6689d99d93df97f2c5c94f4153de00170f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:18Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.368094 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dbbc4357a7bebc6b495e34c1eac5bd8fb1a52a16783ce31c8f3e78614a2dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:18Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.381799 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:18Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.395010 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:18Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.404173 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.404249 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.404261 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.404289 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.404307 4750 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:18Z","lastTransitionTime":"2025-10-08T18:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.407489 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hdvcg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31a78d19-68f8-46b7-9c53-2bbda2930e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810ea3f875bc348e876d76c92a9ecb7e7541afb42bb26556051d021085c107c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84vv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hdvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:18Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.421393 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39b4e601279d1f37ba24fbb347b7f36255512601fd47c31a3bc686f2385cd88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://443e4db49275949020687bd8768b8a7745b5fdea4f5fb230d872b2a8ed9d6ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:18Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.435152 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6jln4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58294711-df24-43b8-b1b6-6617d55d49b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9a2cf606874110c681c7bbec359607eda7cabce0a44504ff21374021fa21a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78c7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6jln4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:18Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.451486 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f40f5c0-574a-4dd1-a4dc-024eba0628e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f90b8eb6984a4ced369d75be15a4980f41e7ca1bd3c79a64a79c974da9dc2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eecb6b3a56b80f2145ca258b9fc1473c42b44a4ded70f9ddc793868c3656191\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8961dba04ac6b2092efe4568b7f5d5afb5df6833c208d9484410ed6f8dcb13a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9df9b0a29bcccaeb1dcbdff5c85068802f2e46f39f39a63d09fe1386996c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b4f554ff77b65e8ff5a3cb8221e1c238434da5d49d308ec6684fcb2d333384\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 18:10:58.297055 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 18:10:58.298595 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1100232334/tls.crt::/tmp/serving-cert-1100232334/tls.key\\\\\\\"\\\\nI1008 18:11:03.775470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 18:11:03.786183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 18:11:03.786202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 18:11:03.786224 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 18:11:03.786230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 18:11:03.794993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 18:11:03.795026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 18:11:03.795033 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 18:11:03.795023 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 18:11:03.795038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 18:11:03.795061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 18:11:03.795066 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 18:11:03.795070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 18:11:03.797195 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ece7dde88eaa94bbf9401909374a83d5841b24deb3f494aa3c709d055f6997e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:18Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.470522 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b53212dc-64c8-4b45-b425-b0bb2df4be9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535706571aabece4fcb32be30ca7e8e4732194bfe3511685909161fa0c06b1c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55916d7bc976fac242d0c48e65a606ba0f642461b47262b4b0d46076163810d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a005895ee7ff0fde6900a7b6f6b8928aad001f1aaa7ce650d77c11ce7acc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba40cf6292b171161b700726beb7b9546e2a239de4076618f1eacf17b75631fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:18Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.487154 4750 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fd0b357c32562409dc3531c04e684de9fdd4bf05080368ff3537aef9324094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:18Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.502212 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f652a41fbc65313e94232c1b72bd7ccc0f53d7c510287ac9619ab713f770fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b75c94360320f8907419e9d7b9a237989702c6c66dd2e87832e18c2c4570c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-grddb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:18Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.507194 4750 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.507245 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.507262 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.507283 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.507297 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:18Z","lastTransitionTime":"2025-10-08T18:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.521130 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mhhrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b11eea9-c866-4055-abed-9955637179b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7fa876044092370bd4f89fae3df8cd06b7d27515a90ecc0dd4393c870854253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7f95a79c34d04ab7881b283c072184f575623df24dadf0b1bc71c05bbcb92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mhhrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:18Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.536151 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7f9jd" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b67ae9d5-e575-45e7-913a-01f379b86416\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhxt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhxt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7f9jd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:18Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:18 crc 
kubenswrapper[4750]: I1008 18:11:18.610244 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.610330 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.610352 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.610383 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.610404 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:18Z","lastTransitionTime":"2025-10-08T18:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.714240 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.714316 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.714335 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.714364 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.714385 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:18Z","lastTransitionTime":"2025-10-08T18:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.733985 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:11:18 crc kubenswrapper[4750]: E1008 18:11:18.734260 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.734364 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:11:18 crc kubenswrapper[4750]: E1008 18:11:18.734733 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.817625 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b67ae9d5-e575-45e7-913a-01f379b86416-metrics-certs\") pod \"network-metrics-daemon-7f9jd\" (UID: \"b67ae9d5-e575-45e7-913a-01f379b86416\") " pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:11:18 crc kubenswrapper[4750]: E1008 18:11:18.817909 4750 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 18:11:18 crc kubenswrapper[4750]: E1008 18:11:18.817993 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b67ae9d5-e575-45e7-913a-01f379b86416-metrics-certs podName:b67ae9d5-e575-45e7-913a-01f379b86416 nodeName:}" failed. No retries permitted until 2025-10-08 18:11:19.817962829 +0000 UTC m=+35.730933882 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b67ae9d5-e575-45e7-913a-01f379b86416-metrics-certs") pod "network-metrics-daemon-7f9jd" (UID: "b67ae9d5-e575-45e7-913a-01f379b86416") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.818413 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.818448 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.818460 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.818480 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.818492 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:18Z","lastTransitionTime":"2025-10-08T18:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.923856 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.923931 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.923951 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.923976 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:18 crc kubenswrapper[4750]: I1008 18:11:18.923993 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:18Z","lastTransitionTime":"2025-10-08T18:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.027891 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.027966 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.027987 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.028021 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.028046 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:19Z","lastTransitionTime":"2025-10-08T18:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.131972 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.132661 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.132703 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.132747 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.132774 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:19Z","lastTransitionTime":"2025-10-08T18:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.236875 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.236958 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.236979 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.237013 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.237034 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:19Z","lastTransitionTime":"2025-10-08T18:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.340108 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.340308 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.340332 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.340364 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.340394 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:19Z","lastTransitionTime":"2025-10-08T18:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.442676 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.442765 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.442786 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.442825 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.442853 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:19Z","lastTransitionTime":"2025-10-08T18:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.545540 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.545618 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.545630 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.545673 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.545686 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:19Z","lastTransitionTime":"2025-10-08T18:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.649580 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.649638 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.649650 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.649672 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.649684 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:19Z","lastTransitionTime":"2025-10-08T18:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.733456 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.733595 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:11:19 crc kubenswrapper[4750]: E1008 18:11:19.733706 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:11:19 crc kubenswrapper[4750]: E1008 18:11:19.733839 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.752980 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.753032 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.753046 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.753065 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.753079 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:19Z","lastTransitionTime":"2025-10-08T18:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.832475 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b67ae9d5-e575-45e7-913a-01f379b86416-metrics-certs\") pod \"network-metrics-daemon-7f9jd\" (UID: \"b67ae9d5-e575-45e7-913a-01f379b86416\") " pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:11:19 crc kubenswrapper[4750]: E1008 18:11:19.832771 4750 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 18:11:19 crc kubenswrapper[4750]: E1008 18:11:19.832913 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b67ae9d5-e575-45e7-913a-01f379b86416-metrics-certs podName:b67ae9d5-e575-45e7-913a-01f379b86416 nodeName:}" failed. No retries permitted until 2025-10-08 18:11:21.83288289 +0000 UTC m=+37.745853933 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b67ae9d5-e575-45e7-913a-01f379b86416-metrics-certs") pod "network-metrics-daemon-7f9jd" (UID: "b67ae9d5-e575-45e7-913a-01f379b86416") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.855524 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.855818 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.855836 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.855864 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.855881 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:19Z","lastTransitionTime":"2025-10-08T18:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.958822 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.958863 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.958872 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.958888 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:19 crc kubenswrapper[4750]: I1008 18:11:19.958898 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:19Z","lastTransitionTime":"2025-10-08T18:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.061790 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.061854 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.061873 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.061906 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.061932 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:20Z","lastTransitionTime":"2025-10-08T18:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.165390 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.165436 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.165448 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.165467 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.165480 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:20Z","lastTransitionTime":"2025-10-08T18:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.268853 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.268886 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.268896 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.268911 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.268921 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:20Z","lastTransitionTime":"2025-10-08T18:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.371163 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.371216 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.371227 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.371241 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.371250 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:20Z","lastTransitionTime":"2025-10-08T18:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.437741 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.437842 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.437883 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.437909 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:11:20 crc kubenswrapper[4750]: E1008 18:11:20.438012 4750 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not 
registered Oct 08 18:11:20 crc kubenswrapper[4750]: E1008 18:11:20.438055 4750 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 18:11:20 crc kubenswrapper[4750]: E1008 18:11:20.438076 4750 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 18:11:20 crc kubenswrapper[4750]: E1008 18:11:20.438090 4750 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 18:11:20 crc kubenswrapper[4750]: E1008 18:11:20.438137 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-08 18:11:36.438122009 +0000 UTC m=+52.351093022 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 18:11:20 crc kubenswrapper[4750]: E1008 18:11:20.438054 4750 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 18:11:20 crc kubenswrapper[4750]: E1008 18:11:20.438160 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:11:36.43814906 +0000 UTC m=+52.351120073 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:11:20 crc kubenswrapper[4750]: E1008 18:11:20.438193 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 18:11:36.438181891 +0000 UTC m=+52.351152904 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 18:11:20 crc kubenswrapper[4750]: E1008 18:11:20.438288 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 18:11:36.438215651 +0000 UTC m=+52.351186874 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.474617 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.474684 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.474705 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.474732 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.474751 4750 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:20Z","lastTransitionTime":"2025-10-08T18:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.540427 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:11:20 crc kubenswrapper[4750]: E1008 18:11:20.540762 4750 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 18:11:20 crc kubenswrapper[4750]: E1008 18:11:20.540809 4750 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 18:11:20 crc kubenswrapper[4750]: E1008 18:11:20.540835 4750 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 18:11:20 crc kubenswrapper[4750]: E1008 18:11:20.540990 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-10-08 18:11:36.540924181 +0000 UTC m=+52.453895364 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.582798 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.582886 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.582910 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.582941 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.582966 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:20Z","lastTransitionTime":"2025-10-08T18:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.658673 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.658731 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.658744 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.658765 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.658779 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:20Z","lastTransitionTime":"2025-10-08T18:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:20 crc kubenswrapper[4750]: E1008 18:11:20.677157 4750 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bfc00a00-da5b-4621-a04e-e20b47fefa95\\\",\\\"systemUUID\\\":\\\"4099aa14-e8f7-4bff-9f81-40284b959bbe\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:20Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.682004 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.682047 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.682056 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.682071 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.682081 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:20Z","lastTransitionTime":"2025-10-08T18:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:20 crc kubenswrapper[4750]: E1008 18:11:20.694476 4750 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bfc00a00-da5b-4621-a04e-e20b47fefa95\\\",\\\"systemUUID\\\":\\\"4099aa14-e8f7-4bff-9f81-40284b959bbe\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:20Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.698967 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.699002 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.699014 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.699031 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.699043 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:20Z","lastTransitionTime":"2025-10-08T18:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:20 crc kubenswrapper[4750]: E1008 18:11:20.715513 4750 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bfc00a00-da5b-4621-a04e-e20b47fefa95\\\",\\\"systemUUID\\\":\\\"4099aa14-e8f7-4bff-9f81-40284b959bbe\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:20Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.719211 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.719247 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.719256 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.719271 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.719282 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:20Z","lastTransitionTime":"2025-10-08T18:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:20 crc kubenswrapper[4750]: E1008 18:11:20.731219 4750 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bfc00a00-da5b-4621-a04e-e20b47fefa95\\\",\\\"systemUUID\\\":\\\"4099aa14-e8f7-4bff-9f81-40284b959bbe\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:20Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.733204 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:11:20 crc kubenswrapper[4750]: E1008 18:11:20.733405 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.733808 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:11:20 crc kubenswrapper[4750]: E1008 18:11:20.733943 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.737847 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.737896 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.737908 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.737926 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.737969 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:20Z","lastTransitionTime":"2025-10-08T18:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:20 crc kubenswrapper[4750]: E1008 18:11:20.754367 4750 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bfc00a00-da5b-4621-a04e-e20b47fefa95\\\",\\\"systemUUID\\\":\\\"4099aa14-e8f7-4bff-9f81-40284b959bbe\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:20Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:20 crc kubenswrapper[4750]: E1008 18:11:20.754508 4750 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.756029 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.756067 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.756082 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.756101 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.756114 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:20Z","lastTransitionTime":"2025-10-08T18:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.858943 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.858982 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.858991 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.859006 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.859015 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:20Z","lastTransitionTime":"2025-10-08T18:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.962049 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.962102 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.962117 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.962135 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:20 crc kubenswrapper[4750]: I1008 18:11:20.962147 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:20Z","lastTransitionTime":"2025-10-08T18:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.064202 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.064233 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.064241 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.064255 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.064263 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:21Z","lastTransitionTime":"2025-10-08T18:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.166821 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.166928 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.166947 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.166984 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.167008 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:21Z","lastTransitionTime":"2025-10-08T18:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.269294 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.269384 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.269403 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.269445 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.269467 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:21Z","lastTransitionTime":"2025-10-08T18:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.372057 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.372099 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.372110 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.372127 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.372139 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:21Z","lastTransitionTime":"2025-10-08T18:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.473969 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.474016 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.474028 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.474044 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.474056 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:21Z","lastTransitionTime":"2025-10-08T18:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.576703 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.576755 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.576768 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.576785 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.576796 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:21Z","lastTransitionTime":"2025-10-08T18:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.679301 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.679344 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.679355 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.679372 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.679384 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:21Z","lastTransitionTime":"2025-10-08T18:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.733716 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.733800 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:11:21 crc kubenswrapper[4750]: E1008 18:11:21.733864 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:11:21 crc kubenswrapper[4750]: E1008 18:11:21.733944 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.781604 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.781819 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.781888 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.781957 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.782017 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:21Z","lastTransitionTime":"2025-10-08T18:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.862731 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b67ae9d5-e575-45e7-913a-01f379b86416-metrics-certs\") pod \"network-metrics-daemon-7f9jd\" (UID: \"b67ae9d5-e575-45e7-913a-01f379b86416\") " pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:11:21 crc kubenswrapper[4750]: E1008 18:11:21.862959 4750 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 18:11:21 crc kubenswrapper[4750]: E1008 18:11:21.863145 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b67ae9d5-e575-45e7-913a-01f379b86416-metrics-certs podName:b67ae9d5-e575-45e7-913a-01f379b86416 nodeName:}" failed. No retries permitted until 2025-10-08 18:11:25.863130722 +0000 UTC m=+41.776101735 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b67ae9d5-e575-45e7-913a-01f379b86416-metrics-certs") pod "network-metrics-daemon-7f9jd" (UID: "b67ae9d5-e575-45e7-913a-01f379b86416") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.884013 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.884079 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.884105 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.884133 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.884150 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:21Z","lastTransitionTime":"2025-10-08T18:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.915857 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.936398 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dbbc4357a7bebc6b495e34c1eac5bd8fb1a52a16783ce31c8f3e78614a2dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:21Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.950685 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:21Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.960833 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:21Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.970011 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hdvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31a78d19-68f8-46b7-9c53-2bbda2930e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810ea3f875bc348e876d76c92a9ecb7e7541afb42bb26556051d021085c107c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84vv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hdvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:21Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.979827 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39b4e601279d1f37ba24fbb347b7f36255512601fd47c31a3bc686f2385cd88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://443e4db49275949020687bd8768b8a7745b5fdea4f5fb230d872b2a8ed9d6ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:21Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.986725 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.986964 4750 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.987474 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.987509 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.987534 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:21Z","lastTransitionTime":"2025-10-08T18:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:21 crc kubenswrapper[4750]: I1008 18:11:21.993765 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6jln4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58294711-df24-43b8-b1b6-6617d55d49b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9a2cf606874110c681c7bbec359607eda7cabce0a44504ff21374021fa21a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78c7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6jln4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:21Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.011820 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3096c4e9-1e62-46ee-8e9d-9d8f5c3d8f97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53389f30a75dc05f6b778b862fec384a0aefe7bd4b7884aad3af144bc8e5b87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7428f31e2b12470cc8d9fb98bb66e19ce665d028dae0b97adf318006043e2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77be59fa3ca71194c19f0050856d30f3b6f7122bd68e07535d0a8ca6cc299832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18
:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89305f32867a4d321e16ce84e9d4e34ec3bd7594db3700a4252ad0f874dd6d48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5029feed20c32b59a88eba1f7597a6689d99d93df97f2c5c94f4153de00170f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:22Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.020895 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fd0b357c32562409dc3531c04e684de9fdd4bf05080368ff3537aef9324094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T18:11:22Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.028970 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f652a41fbc65313e94232c1b72bd7ccc0f53d7c510287ac9619ab713f770fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b75c94360320f8907419e9d7b9a237989702c6c66dd2e87832e18c2c4570c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-grddb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:22Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.037180 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mhhrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b11eea9-c866-4055-abed-9955637179b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7fa876044092370bd4f89fae3df8cd06b7d27515a90ecc0dd4393c870854253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7f95a79c34d04ab7881b283c072184f575
623df24dadf0b1bc71c05bbcb92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mhhrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:22Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.046311 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7f9jd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b67ae9d5-e575-45e7-913a-01f379b86416\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhxt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhxt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7f9jd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:22Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:22 crc 
kubenswrapper[4750]: I1008 18:11:22.057218 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f40f5c0-574a-4dd1-a4dc-024eba0628e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f90b8eb6984a4ced369d75be15a4980f41e7ca1bd3c79a64a79c974da9dc2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eecb6b3a56b80
f2145ca258b9fc1473c42b44a4ded70f9ddc793868c3656191\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8961dba04ac6b2092efe4568b7f5d5afb5df6833c208d9484410ed6f8dcb13a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9df9b0a29bcccaeb1dcbdff5c85068802f2e46f39f39a63d09fe1386996c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b4f554ff77b65e8ff5a3cb8221e1c238434da5d49d308ec6684fcb2d333384\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 18:10:58.297055 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 18:10:58.298595 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1100232334/tls.crt::/tmp/serving-cert-1100232334/tls.key\\\\\\\"\\\\nI1008 18:11:03.775470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 18:11:03.786183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 18:11:03.786202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 18:11:03.786224 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 18:11:03.786230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 18:11:03.794993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 18:11:03.795026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 18:11:03.795033 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 18:11:03.795023 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 18:11:03.795038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 18:11:03.795061 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 18:11:03.795066 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 18:11:03.795070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 18:11:03.797195 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ece7dde88eaa94bbf9401909374a83d5841b24deb3f494aa3c709d055f6997e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:22Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.067700 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b53212dc-64c8-4b45-b425-b0bb2df4be9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535706571aabece4fcb32be30ca7e8e4732194bfe3511685909161fa0c06b1c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55916d7bc976fac242d0c48e65a606ba0f642461b47262b4b0d46076163810d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a005895ee7ff0fde6900a7b6f6b8928aad001f1aaa7ce650d77c11ce7acc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba40cf6292b171161b700726beb7b9546e2a239de4076618f1eacf17b75631fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:22Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.080202 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzb5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5b55dd8e3131ba21f9cb51433c01355475968171e092627964cc5a56fabb59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzb5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:22Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.090225 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:22 crc 
kubenswrapper[4750]: I1008 18:11:22.090267 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.090284 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.090300 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.090311 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:22Z","lastTransitionTime":"2025-10-08T18:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.091827 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:22Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.105475 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2x8kt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9745a747-29eb-473f-bdb1-b526e1fe1445\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4026e6054000f0175162939f647586ff94806c727b2b80a9437767275bd819d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34195
4d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2x8kt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:22Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.132713 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d63a44-9fd7-4c19-8715-6ddec94d1806\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b36c495aef7af8cf2e56a97c7f408a8bb4c9a5b3b93f541f7872c1f942b25a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b36c495aef7af8cf2e56a97c7f408a8bb4c9a5b3b93f541f7872c1f942b25a2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T18:11:15Z\\\",\\\"message\\\":\\\"alse, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-daemon_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-daemon\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.43\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.43\\\\\\\", Port:8798, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1008 18:11:15.814210 6149 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rl7f4_openshift-ovn-kubernetes(25d63a44-9fd7-4c19-8715-6ddec94d1806)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34
d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rl7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:22Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.192909 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.193122 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.193186 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.193260 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.193327 4750 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:22Z","lastTransitionTime":"2025-10-08T18:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.295360 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.295599 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.295689 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.295757 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.295822 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:22Z","lastTransitionTime":"2025-10-08T18:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.398170 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.398199 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.398209 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.398223 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.398232 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:22Z","lastTransitionTime":"2025-10-08T18:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.500702 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.500752 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.500764 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.500783 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.500793 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:22Z","lastTransitionTime":"2025-10-08T18:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.603535 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.603848 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.603955 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.604054 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.604152 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:22Z","lastTransitionTime":"2025-10-08T18:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.707253 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.707297 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.707309 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.707325 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.707336 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:22Z","lastTransitionTime":"2025-10-08T18:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.733836 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.734001 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:11:22 crc kubenswrapper[4750]: E1008 18:11:22.734631 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:11:22 crc kubenswrapper[4750]: E1008 18:11:22.734455 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.809831 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.809877 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.809889 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.809906 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.809919 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:22Z","lastTransitionTime":"2025-10-08T18:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.911474 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.911517 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.911529 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.911564 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:22 crc kubenswrapper[4750]: I1008 18:11:22.911577 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:22Z","lastTransitionTime":"2025-10-08T18:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.013345 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.013397 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.013411 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.013431 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.013443 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:23Z","lastTransitionTime":"2025-10-08T18:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.116079 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.116175 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.116193 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.116214 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.116232 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:23Z","lastTransitionTime":"2025-10-08T18:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.218978 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.219051 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.219067 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.219101 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.219116 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:23Z","lastTransitionTime":"2025-10-08T18:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.322275 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.322355 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.322377 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.322409 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.322432 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:23Z","lastTransitionTime":"2025-10-08T18:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.425206 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.425263 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.425272 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.425284 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.425295 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:23Z","lastTransitionTime":"2025-10-08T18:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.527797 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.527845 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.527857 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.527876 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.527889 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:23Z","lastTransitionTime":"2025-10-08T18:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.630741 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.630804 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.630824 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.630846 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.630861 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:23Z","lastTransitionTime":"2025-10-08T18:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.733436 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.733474 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:11:23 crc kubenswrapper[4750]: E1008 18:11:23.733674 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.733713 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.733758 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.733772 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.733797 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.733811 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:23Z","lastTransitionTime":"2025-10-08T18:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:23 crc kubenswrapper[4750]: E1008 18:11:23.733874 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.844055 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.844099 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.844134 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.844158 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.844170 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:23Z","lastTransitionTime":"2025-10-08T18:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.946990 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.947038 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.947050 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.947066 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:23 crc kubenswrapper[4750]: I1008 18:11:23.947077 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:23Z","lastTransitionTime":"2025-10-08T18:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.050355 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.050412 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.050425 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.050448 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.050460 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:24Z","lastTransitionTime":"2025-10-08T18:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.154073 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.154130 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.154142 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.154162 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.154176 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:24Z","lastTransitionTime":"2025-10-08T18:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.256900 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.257001 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.257021 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.257051 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.257077 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:24Z","lastTransitionTime":"2025-10-08T18:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.359042 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.359074 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.359082 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.359094 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.359103 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:24Z","lastTransitionTime":"2025-10-08T18:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.461690 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.461754 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.461767 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.461792 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.461806 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:24Z","lastTransitionTime":"2025-10-08T18:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.564071 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.564170 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.564188 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.564219 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.564245 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:24Z","lastTransitionTime":"2025-10-08T18:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.667156 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.667194 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.667204 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.667219 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.667230 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:24Z","lastTransitionTime":"2025-10-08T18:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.734034 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:11:24 crc kubenswrapper[4750]: E1008 18:11:24.734154 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.734038 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:11:24 crc kubenswrapper[4750]: E1008 18:11:24.734322 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.746930 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f652a41fbc65313e94232c1b72bd7ccc0f53d7c510287ac9619ab713f770fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b75c94360320f8907419e9d7b9a237989702c6c
66dd2e87832e18c2c4570c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-grddb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:24Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.758273 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mhhrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b11eea9-c866-4055-abed-9955637179b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7fa876044092370bd4f89fae3df8cd06b7d27515a90ecc0dd4393c870854253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7f95a79c34d04ab7881b283c072184f575
623df24dadf0b1bc71c05bbcb92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mhhrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:24Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.768991 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7f9jd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b67ae9d5-e575-45e7-913a-01f379b86416\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhxt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhxt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7f9jd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:24Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:24 crc 
kubenswrapper[4750]: I1008 18:11:24.769090 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.769395 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.769407 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.769424 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.769435 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:24Z","lastTransitionTime":"2025-10-08T18:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.786170 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f40f5c0-574a-4dd1-a4dc-024eba0628e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f90b8eb6984a4ced369d75be15a4980f41e7ca1bd3c79a64a79c974da9dc2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eecb6b3a56b80f2145ca258b9fc1473c42b44a4ded70f9ddc793868c3656191\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8961dba04ac6b2092efe4568b7f5d5afb5df6833c208d9484410ed6f8dcb13a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9df9b0a29bcccaeb1dcbdff5c85068802f2e46f39f39a63d09fe1386996c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b4f554ff77b65e8ff5a3cb8221e1c238434da5d49d308ec6684fcb2d333384\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 18:10:58.297055 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 18:10:58.298595 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1100232334/tls.crt::/tmp/serving-cert-1100232334/tls.key\\\\\\\"\\\\nI1008 18:11:03.775470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 18:11:03.786183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 18:11:03.786202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 18:11:03.786224 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 18:11:03.786230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 18:11:03.794993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 18:11:03.795026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 18:11:03.795033 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 18:11:03.795023 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 18:11:03.795038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 18:11:03.795061 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 18:11:03.795066 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 18:11:03.795070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 18:11:03.797195 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ece7dde88eaa94bbf9401909374a83d5841b24deb3f494aa3c709d055f6997e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:24Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.799377 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b53212dc-64c8-4b45-b425-b0bb2df4be9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535706571aabece4fcb32be30ca7e8e4732194bfe3511685909161fa0c06b1c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55916d7bc976fac242d0c48e65a606ba0f642461b47262b4b0d46076163810d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a005895ee7ff0fde6900a7b6f6b8928aad001f1aaa7ce650d77c11ce7acc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba40cf6292b171161b700726beb7b9546e2a239de4076618f1eacf17b75631fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:24Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.811524 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fd0b357c32562409dc3531c04e684de9fdd4bf05080368ff3537aef9324094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T18:11:24Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.822000 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:24Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.835132 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzb5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5b55dd8e3131ba21f9cb51433c01355475968171e092627964cc5a56fabb59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzb5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:24Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.852774 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2x8kt" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9745a747-29eb-473f-bdb1-b526e1fe1445\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4026e6054000f0175162939f647586ff94806c727b2b80a9437767275bd819d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c
8eac89fab5a9eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:
11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:09Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2x8kt\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:24Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.871736 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.871826 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.871843 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.871870 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.871887 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:24Z","lastTransitionTime":"2025-10-08T18:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.874081 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d63a44-9fd7-4c19-8715-6ddec94d1806\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b36c495aef7af8cf2e56a97c7f408a8bb4c9a5b3b93f541f7872c1f942b25a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b36c495aef7af8cf2e56a97c7f408a8bb4c9a5b3b93f541f7872c1f942b25a2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T18:11:15Z\\\",\\\"message\\\":\\\"alse, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-daemon_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-daemon\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.43\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.43\\\\\\\", Port:8798, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1008 18:11:15.814210 6149 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rl7f4_openshift-ovn-kubernetes(25d63a44-9fd7-4c19-8715-6ddec94d1806)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34
d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rl7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:24Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.888116 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:24Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.905726 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:24Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.916980 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hdvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31a78d19-68f8-46b7-9c53-2bbda2930e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810ea3f875bc348e876d76c92a9ecb7e7541afb42bb26556051d021085c107c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84vv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hdvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:24Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.933001 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39b4e601279d1f37ba24fbb347b7f36255512601fd47c31a3bc686f2385cd88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://443e4db49275949020687bd8768b8a7745b5fdea4f5fb230d872b2a8ed9d6ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:24Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.943460 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6jln4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58294711-df24-43b8-b1b6-6617d55d49b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9a2cf606874110c681c7bbec359607eda7cabce0a44504ff21374021fa21a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78c7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6jln4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:24Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.970076 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3096c4e9-1e62-46ee-8e9d-9d8f5c3d8f97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53389f30a75dc05f6b778b862fec384a0aefe7bd4b7884aad3af144bc8e5b87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7428f31e2b12470cc8d9fb98bb66e19ce665d028dae0b97adf318006043e2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77be59fa3ca71194c19f0050856d30f3b6f7122bd68e07535d0a8ca6cc299832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18
:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89305f32867a4d321e16ce84e9d4e34ec3bd7594db3700a4252ad0f874dd6d48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5029feed20c32b59a88eba1f7597a6689d99d93df97f2c5c94f4153de00170f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:24Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.975484 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.975583 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.975600 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.975619 4750 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.975631 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:24Z","lastTransitionTime":"2025-10-08T18:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:24 crc kubenswrapper[4750]: I1008 18:11:24.988699 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dbbc4357a7bebc6b495e34c1eac5bd8fb1a52a16783ce31c8f3e78614a2dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:24Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:25 crc kubenswrapper[4750]: I1008 18:11:25.078579 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:25 crc kubenswrapper[4750]: I1008 18:11:25.078885 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:25 crc kubenswrapper[4750]: I1008 18:11:25.079097 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:25 crc kubenswrapper[4750]: I1008 18:11:25.079210 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:25 crc kubenswrapper[4750]: I1008 18:11:25.079297 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:25Z","lastTransitionTime":"2025-10-08T18:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:25 crc kubenswrapper[4750]: I1008 18:11:25.182191 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:25 crc kubenswrapper[4750]: I1008 18:11:25.182220 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:25 crc kubenswrapper[4750]: I1008 18:11:25.182229 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:25 crc kubenswrapper[4750]: I1008 18:11:25.182241 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:25 crc kubenswrapper[4750]: I1008 18:11:25.182250 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:25Z","lastTransitionTime":"2025-10-08T18:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:25 crc kubenswrapper[4750]: I1008 18:11:25.285066 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:25 crc kubenswrapper[4750]: I1008 18:11:25.285090 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:25 crc kubenswrapper[4750]: I1008 18:11:25.285100 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:25 crc kubenswrapper[4750]: I1008 18:11:25.285112 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:25 crc kubenswrapper[4750]: I1008 18:11:25.285119 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:25Z","lastTransitionTime":"2025-10-08T18:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:25 crc kubenswrapper[4750]: I1008 18:11:25.388457 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:25 crc kubenswrapper[4750]: I1008 18:11:25.388509 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:25 crc kubenswrapper[4750]: I1008 18:11:25.388522 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:25 crc kubenswrapper[4750]: I1008 18:11:25.388542 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:25 crc kubenswrapper[4750]: I1008 18:11:25.388580 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:25Z","lastTransitionTime":"2025-10-08T18:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:25 crc kubenswrapper[4750]: I1008 18:11:25.491345 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:25 crc kubenswrapper[4750]: I1008 18:11:25.491614 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:25 crc kubenswrapper[4750]: I1008 18:11:25.491678 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:25 crc kubenswrapper[4750]: I1008 18:11:25.491774 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:25 crc kubenswrapper[4750]: I1008 18:11:25.491850 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:25Z","lastTransitionTime":"2025-10-08T18:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:25 crc kubenswrapper[4750]: I1008 18:11:25.594825 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:25 crc kubenswrapper[4750]: I1008 18:11:25.594863 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:25 crc kubenswrapper[4750]: I1008 18:11:25.594876 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:25 crc kubenswrapper[4750]: I1008 18:11:25.594890 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:25 crc kubenswrapper[4750]: I1008 18:11:25.594900 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:25Z","lastTransitionTime":"2025-10-08T18:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:25 crc kubenswrapper[4750]: I1008 18:11:25.696755 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:25 crc kubenswrapper[4750]: I1008 18:11:25.696797 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:25 crc kubenswrapper[4750]: I1008 18:11:25.696809 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:25 crc kubenswrapper[4750]: I1008 18:11:25.696824 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:25 crc kubenswrapper[4750]: I1008 18:11:25.696836 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:25Z","lastTransitionTime":"2025-10-08T18:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:25 crc kubenswrapper[4750]: I1008 18:11:25.734057 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:11:25 crc kubenswrapper[4750]: I1008 18:11:25.734358 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:11:25 crc kubenswrapper[4750]: E1008 18:11:25.734538 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:11:25 crc kubenswrapper[4750]: E1008 18:11:25.734657 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:11:25 crc kubenswrapper[4750]: I1008 18:11:25.799085 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:25 crc kubenswrapper[4750]: I1008 18:11:25.799374 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:25 crc kubenswrapper[4750]: I1008 18:11:25.799596 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:25 crc kubenswrapper[4750]: I1008 18:11:25.799884 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:25 crc kubenswrapper[4750]: I1008 18:11:25.800036 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:25Z","lastTransitionTime":"2025-10-08T18:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:25 crc kubenswrapper[4750]: I1008 18:11:25.902267 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:25 crc kubenswrapper[4750]: I1008 18:11:25.902328 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:25 crc kubenswrapper[4750]: I1008 18:11:25.902339 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:25 crc kubenswrapper[4750]: I1008 18:11:25.902356 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:25 crc kubenswrapper[4750]: I1008 18:11:25.902369 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:25Z","lastTransitionTime":"2025-10-08T18:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:25 crc kubenswrapper[4750]: I1008 18:11:25.902954 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b67ae9d5-e575-45e7-913a-01f379b86416-metrics-certs\") pod \"network-metrics-daemon-7f9jd\" (UID: \"b67ae9d5-e575-45e7-913a-01f379b86416\") " pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:11:25 crc kubenswrapper[4750]: E1008 18:11:25.903144 4750 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 18:11:25 crc kubenswrapper[4750]: E1008 18:11:25.903224 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b67ae9d5-e575-45e7-913a-01f379b86416-metrics-certs podName:b67ae9d5-e575-45e7-913a-01f379b86416 nodeName:}" failed. No retries permitted until 2025-10-08 18:11:33.903206565 +0000 UTC m=+49.816177578 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b67ae9d5-e575-45e7-913a-01f379b86416-metrics-certs") pod "network-metrics-daemon-7f9jd" (UID: "b67ae9d5-e575-45e7-913a-01f379b86416") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.004496 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.004536 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.004573 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.004596 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.004611 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:26Z","lastTransitionTime":"2025-10-08T18:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.106252 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.106316 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.106333 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.106357 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.106374 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:26Z","lastTransitionTime":"2025-10-08T18:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.208389 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.208525 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.208538 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.208573 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.208585 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:26Z","lastTransitionTime":"2025-10-08T18:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.311620 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.311647 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.311655 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.311671 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.311684 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:26Z","lastTransitionTime":"2025-10-08T18:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.413923 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.413992 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.414015 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.414045 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.414072 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:26Z","lastTransitionTime":"2025-10-08T18:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.516427 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.516474 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.516486 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.516503 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.516513 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:26Z","lastTransitionTime":"2025-10-08T18:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.618412 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.618450 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.618459 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.618473 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.618482 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:26Z","lastTransitionTime":"2025-10-08T18:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.723844 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.723880 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.723891 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.723907 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.723918 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:26Z","lastTransitionTime":"2025-10-08T18:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.733262 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.733317 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:11:26 crc kubenswrapper[4750]: E1008 18:11:26.733383 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:11:26 crc kubenswrapper[4750]: E1008 18:11:26.733454 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.826607 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.826691 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.826715 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.826746 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.826768 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:26Z","lastTransitionTime":"2025-10-08T18:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.929185 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.929224 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.929234 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.929249 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:26 crc kubenswrapper[4750]: I1008 18:11:26.929265 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:26Z","lastTransitionTime":"2025-10-08T18:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.032221 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.032310 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.032340 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.032370 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.032391 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:27Z","lastTransitionTime":"2025-10-08T18:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.135316 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.135389 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.135412 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.135459 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.135478 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:27Z","lastTransitionTime":"2025-10-08T18:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.237826 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.237866 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.237877 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.237894 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.237907 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:27Z","lastTransitionTime":"2025-10-08T18:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.340442 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.340479 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.340487 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.340500 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.340510 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:27Z","lastTransitionTime":"2025-10-08T18:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.443209 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.443307 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.443321 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.443339 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.443352 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:27Z","lastTransitionTime":"2025-10-08T18:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.545486 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.545536 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.545569 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.545588 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.545601 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:27Z","lastTransitionTime":"2025-10-08T18:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.647605 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.647653 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.647668 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.647684 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.647697 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:27Z","lastTransitionTime":"2025-10-08T18:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.733593 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.733659 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:11:27 crc kubenswrapper[4750]: E1008 18:11:27.733783 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:11:27 crc kubenswrapper[4750]: E1008 18:11:27.733910 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.751131 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.751179 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.751188 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.751203 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.751215 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:27Z","lastTransitionTime":"2025-10-08T18:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.853783 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.853840 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.853856 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.853877 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.853891 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:27Z","lastTransitionTime":"2025-10-08T18:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.957872 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.957932 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.957948 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.957973 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:27 crc kubenswrapper[4750]: I1008 18:11:27.957992 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:27Z","lastTransitionTime":"2025-10-08T18:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.060501 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.060604 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.060617 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.060635 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.060647 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:28Z","lastTransitionTime":"2025-10-08T18:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.163401 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.163442 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.163455 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.163470 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.163482 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:28Z","lastTransitionTime":"2025-10-08T18:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.265667 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.265744 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.265754 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.265773 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.265786 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:28Z","lastTransitionTime":"2025-10-08T18:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.368227 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.368260 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.368269 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.368283 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.368292 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:28Z","lastTransitionTime":"2025-10-08T18:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.470150 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.470199 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.470211 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.470227 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.470239 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:28Z","lastTransitionTime":"2025-10-08T18:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.572428 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.572502 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.572521 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.572587 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.572608 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:28Z","lastTransitionTime":"2025-10-08T18:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.674827 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.674861 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.674870 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.674884 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.674895 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:28Z","lastTransitionTime":"2025-10-08T18:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.734063 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:11:28 crc kubenswrapper[4750]: E1008 18:11:28.734241 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.734770 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:11:28 crc kubenswrapper[4750]: E1008 18:11:28.734896 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.782536 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.782640 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.782657 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.782677 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.782692 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:28Z","lastTransitionTime":"2025-10-08T18:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.886573 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.886631 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.886643 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.886660 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.886671 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:28Z","lastTransitionTime":"2025-10-08T18:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.989635 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.989695 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.989712 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.989734 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:28 crc kubenswrapper[4750]: I1008 18:11:28.989751 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:28Z","lastTransitionTime":"2025-10-08T18:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:29 crc kubenswrapper[4750]: I1008 18:11:29.092160 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:29 crc kubenswrapper[4750]: I1008 18:11:29.092214 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:29 crc kubenswrapper[4750]: I1008 18:11:29.092231 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:29 crc kubenswrapper[4750]: I1008 18:11:29.092259 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:29 crc kubenswrapper[4750]: I1008 18:11:29.092277 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:29Z","lastTransitionTime":"2025-10-08T18:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:29 crc kubenswrapper[4750]: I1008 18:11:29.196701 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:29 crc kubenswrapper[4750]: I1008 18:11:29.197188 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:29 crc kubenswrapper[4750]: I1008 18:11:29.197218 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:29 crc kubenswrapper[4750]: I1008 18:11:29.197244 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:29 crc kubenswrapper[4750]: I1008 18:11:29.197261 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:29Z","lastTransitionTime":"2025-10-08T18:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:29 crc kubenswrapper[4750]: I1008 18:11:29.300673 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:29 crc kubenswrapper[4750]: I1008 18:11:29.300742 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:29 crc kubenswrapper[4750]: I1008 18:11:29.300760 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:29 crc kubenswrapper[4750]: I1008 18:11:29.300783 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:29 crc kubenswrapper[4750]: I1008 18:11:29.300799 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:29Z","lastTransitionTime":"2025-10-08T18:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:29 crc kubenswrapper[4750]: I1008 18:11:29.403383 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:29 crc kubenswrapper[4750]: I1008 18:11:29.403458 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:29 crc kubenswrapper[4750]: I1008 18:11:29.403480 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:29 crc kubenswrapper[4750]: I1008 18:11:29.403511 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:29 crc kubenswrapper[4750]: I1008 18:11:29.403533 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:29Z","lastTransitionTime":"2025-10-08T18:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:29 crc kubenswrapper[4750]: I1008 18:11:29.506538 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:29 crc kubenswrapper[4750]: I1008 18:11:29.506665 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:29 crc kubenswrapper[4750]: I1008 18:11:29.506727 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:29 crc kubenswrapper[4750]: I1008 18:11:29.506760 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:29 crc kubenswrapper[4750]: I1008 18:11:29.506782 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:29Z","lastTransitionTime":"2025-10-08T18:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:29 crc kubenswrapper[4750]: I1008 18:11:29.610666 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:29 crc kubenswrapper[4750]: I1008 18:11:29.610744 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:29 crc kubenswrapper[4750]: I1008 18:11:29.610772 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:29 crc kubenswrapper[4750]: I1008 18:11:29.610801 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:29 crc kubenswrapper[4750]: I1008 18:11:29.610822 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:29Z","lastTransitionTime":"2025-10-08T18:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:29 crc kubenswrapper[4750]: I1008 18:11:29.714015 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:29 crc kubenswrapper[4750]: I1008 18:11:29.714072 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:29 crc kubenswrapper[4750]: I1008 18:11:29.714091 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:29 crc kubenswrapper[4750]: I1008 18:11:29.714114 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:29 crc kubenswrapper[4750]: I1008 18:11:29.714130 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:29Z","lastTransitionTime":"2025-10-08T18:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:29 crc kubenswrapper[4750]: I1008 18:11:29.733679 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:11:29 crc kubenswrapper[4750]: I1008 18:11:29.733682 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:11:29 crc kubenswrapper[4750]: E1008 18:11:29.734002 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:11:29 crc kubenswrapper[4750]: E1008 18:11:29.733854 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:11:29 crc kubenswrapper[4750]: I1008 18:11:29.818149 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:29 crc kubenswrapper[4750]: I1008 18:11:29.818489 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:29 crc kubenswrapper[4750]: I1008 18:11:29.818797 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:29 crc kubenswrapper[4750]: I1008 18:11:29.818977 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:29 crc kubenswrapper[4750]: I1008 18:11:29.819177 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:29Z","lastTransitionTime":"2025-10-08T18:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:29 crc kubenswrapper[4750]: I1008 18:11:29.923400 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:29 crc kubenswrapper[4750]: I1008 18:11:29.923488 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:29 crc kubenswrapper[4750]: I1008 18:11:29.923513 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:29 crc kubenswrapper[4750]: I1008 18:11:29.923541 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:29 crc kubenswrapper[4750]: I1008 18:11:29.923603 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:29Z","lastTransitionTime":"2025-10-08T18:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.027040 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.027113 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.027139 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.027169 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.027192 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:30Z","lastTransitionTime":"2025-10-08T18:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.131029 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.131387 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.131620 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.131808 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.131957 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:30Z","lastTransitionTime":"2025-10-08T18:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.234921 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.234964 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.234974 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.234990 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.235000 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:30Z","lastTransitionTime":"2025-10-08T18:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.338963 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.339050 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.339074 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.339099 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.339126 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:30Z","lastTransitionTime":"2025-10-08T18:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.442426 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.442473 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.442492 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.442515 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.442531 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:30Z","lastTransitionTime":"2025-10-08T18:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.546069 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.546495 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.546847 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.547120 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.547598 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:30Z","lastTransitionTime":"2025-10-08T18:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.651466 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.651896 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.652108 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.652329 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.652522 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:30Z","lastTransitionTime":"2025-10-08T18:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.733423 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.733524 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:11:30 crc kubenswrapper[4750]: E1008 18:11:30.734262 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:11:30 crc kubenswrapper[4750]: E1008 18:11:30.734434 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.755325 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.755381 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.755399 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.755425 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.755445 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:30Z","lastTransitionTime":"2025-10-08T18:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.858331 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.858371 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.858380 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.858394 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.858406 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:30Z","lastTransitionTime":"2025-10-08T18:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.961270 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.961308 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.961317 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.961332 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.961340 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:30Z","lastTransitionTime":"2025-10-08T18:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.981924 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.981961 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.981970 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.981983 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:30 crc kubenswrapper[4750]: I1008 18:11:30.981993 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:30Z","lastTransitionTime":"2025-10-08T18:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:30 crc kubenswrapper[4750]: E1008 18:11:30.998794 4750 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bfc00a00-da5b-4621-a04e-e20b47fefa95\\\",\\\"systemUUID\\\":\\\"4099aa14-e8f7-4bff-9f81-40284b959bbe\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:30Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.003261 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.003294 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.003304 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.003372 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.003383 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:31Z","lastTransitionTime":"2025-10-08T18:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:31 crc kubenswrapper[4750]: E1008 18:11:31.019226 4750 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bfc00a00-da5b-4621-a04e-e20b47fefa95\\\",\\\"systemUUID\\\":\\\"4099aa14-e8f7-4bff-9f81-40284b959bbe\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:31Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.022793 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.022824 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.022832 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.022846 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.022855 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:31Z","lastTransitionTime":"2025-10-08T18:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:31 crc kubenswrapper[4750]: E1008 18:11:31.034646 4750 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bfc00a00-da5b-4621-a04e-e20b47fefa95\\\",\\\"systemUUID\\\":\\\"4099aa14-e8f7-4bff-9f81-40284b959bbe\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:31Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.038966 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.038997 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.039007 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.039021 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.039030 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:31Z","lastTransitionTime":"2025-10-08T18:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:31 crc kubenswrapper[4750]: E1008 18:11:31.049889 4750 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bfc00a00-da5b-4621-a04e-e20b47fefa95\\\",\\\"systemUUID\\\":\\\"4099aa14-e8f7-4bff-9f81-40284b959bbe\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:31Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.054084 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.054110 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.054121 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.054137 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.054149 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:31Z","lastTransitionTime":"2025-10-08T18:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:31 crc kubenswrapper[4750]: E1008 18:11:31.065832 4750 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bfc00a00-da5b-4621-a04e-e20b47fefa95\\\",\\\"systemUUID\\\":\\\"4099aa14-e8f7-4bff-9f81-40284b959bbe\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:31Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:31 crc kubenswrapper[4750]: E1008 18:11:31.065945 4750 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.067184 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.067211 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.067221 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.067234 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.067243 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:31Z","lastTransitionTime":"2025-10-08T18:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.169484 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.169520 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.169528 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.169541 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.169574 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:31Z","lastTransitionTime":"2025-10-08T18:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.271837 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.271860 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.271868 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.271882 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.271889 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:31Z","lastTransitionTime":"2025-10-08T18:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.374655 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.374716 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.374734 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.374758 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.374779 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:31Z","lastTransitionTime":"2025-10-08T18:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.413103 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.421434 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.425541 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7f9jd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b67ae9d5-e575-45e7-913a-01f379b86416\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhxt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhxt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7f9jd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:31Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:31 crc 
kubenswrapper[4750]: I1008 18:11:31.447602 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f40f5c0-574a-4dd1-a4dc-024eba0628e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f90b8eb6984a4ced369d75be15a4980f41e7ca1bd3c79a64a79c974da9dc2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eecb6b3a56b80
f2145ca258b9fc1473c42b44a4ded70f9ddc793868c3656191\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8961dba04ac6b2092efe4568b7f5d5afb5df6833c208d9484410ed6f8dcb13a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9df9b0a29bcccaeb1dcbdff5c85068802f2e46f39f39a63d09fe1386996c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b4f554ff77b65e8ff5a3cb8221e1c238434da5d49d308ec6684fcb2d333384\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 18:10:58.297055 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 18:10:58.298595 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1100232334/tls.crt::/tmp/serving-cert-1100232334/tls.key\\\\\\\"\\\\nI1008 18:11:03.775470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 18:11:03.786183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 18:11:03.786202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 18:11:03.786224 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 18:11:03.786230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 18:11:03.794993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 18:11:03.795026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 18:11:03.795033 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 18:11:03.795023 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 18:11:03.795038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 18:11:03.795061 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 18:11:03.795066 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 18:11:03.795070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 18:11:03.797195 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ece7dde88eaa94bbf9401909374a83d5841b24deb3f494aa3c709d055f6997e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:31Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.466531 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b53212dc-64c8-4b45-b425-b0bb2df4be9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535706571aabece4fcb32be30ca7e8e4732194bfe3511685909161fa0c06b1c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55916d7bc976fac242d0c48e65a606ba0f642461b47262b4b0d46076163810d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a005895ee7ff0fde6900a7b6f6b8928aad001f1aaa7ce650d77c11ce7acc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba40cf6292b171161b700726beb7b9546e2a239de4076618f1eacf17b75631fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:31Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.477616 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.477643 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.477652 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.477664 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.477674 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:31Z","lastTransitionTime":"2025-10-08T18:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.481741 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fd0b357c32562409dc3531c04e684de9fdd4bf05080368ff3537aef9324094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:31Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.491378 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f652a41fbc65313e94232c1b72bd7ccc0f53d7c510287ac9619ab713f770fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b75c94360320f8907419e9d7b9a237989702c6c66dd2e87832e18c2c4570c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-grddb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-10-08T18:11:31Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.500671 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mhhrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b11eea9-c866-4055-abed-9955637179b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7fa876044092370bd4f89fae3df8cd06b7d27515a90ecc0dd4393c870854253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7f95a79c34d04ab7881b283c072184f575623df24dadf0b1bc71c05bbcb92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mhhrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:31Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.512843 4750 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:31Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.529792 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzb5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5b55dd8e3131ba21f9cb51433c01355475968171e092627964cc5a56fabb59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzb5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:31Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.550143 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2x8kt" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9745a747-29eb-473f-bdb1-b526e1fe1445\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4026e6054000f0175162939f647586ff94806c727b2b80a9437767275bd819d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c
8eac89fab5a9eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:
11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:09Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2x8kt\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:31Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.572509 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d63a44-9fd7-4c19-8715-6ddec94d1806\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b36c495aef7af8cf2e56a97c7f408a8bb4c9a5b3b93f541f7872c1f942b25a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b36c495aef7af8cf2e56a97c7f408a8bb4c9a5b3b93f541f7872c1f942b25a2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T18:11:15Z\\\",\\\"message\\\":\\\"alse, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-daemon_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-daemon\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.43\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.43\\\\\\\", Port:8798, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1008 18:11:15.814210 6149 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rl7f4_openshift-ovn-kubernetes(25d63a44-9fd7-4c19-8715-6ddec94d1806)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34
d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rl7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:31Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.580382 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.580427 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.580440 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.580456 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.580473 4750 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:31Z","lastTransitionTime":"2025-10-08T18:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.585941 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hdvcg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31a78d19-68f8-46b7-9c53-2bbda2930e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810ea3f875bc348e876d76c92a9ecb7e7541afb42bb26556051d021085c107c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84vv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hdvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:31Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.597954 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39b4e601279d1f37ba24fbb347b7f36255512601fd47c31a3bc686f2385cd88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://443e4db49275949020687bd8768b8a7745b5fdea4f5fb230d872b2a8ed9d6ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:31Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.607476 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6jln4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58294711-df24-43b8-b1b6-6617d55d49b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9a2cf606874110c681c7bbec359607eda7cabce0a44504ff21374021fa21a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78c7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6jln4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:31Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.625612 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3096c4e9-1e62-46ee-8e9d-9d8f5c3d8f97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53389f30a75dc05f6b778b862fec384a0aefe7bd4b7884aad3af144bc8e5b87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7428f31e2b12470cc8d9fb98bb66e19ce665d028dae0b97adf318006043e2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77be59fa3ca71194c19f0050856d30f3b6f7122bd68e07535d0a8ca6cc299832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18
:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89305f32867a4d321e16ce84e9d4e34ec3bd7594db3700a4252ad0f874dd6d48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5029feed20c32b59a88eba1f7597a6689d99d93df97f2c5c94f4153de00170f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:31Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.638038 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dbbc4357a7bebc6b495e34c1eac5bd8fb1a52a16783ce31c8f3e78614a2dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:31Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.649586 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:31Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.661396 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:31Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.684181 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.684254 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.684268 4750 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.684300 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.684319 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:31Z","lastTransitionTime":"2025-10-08T18:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.734004 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.734009 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:11:31 crc kubenswrapper[4750]: E1008 18:11:31.734136 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:11:31 crc kubenswrapper[4750]: E1008 18:11:31.734395 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.735760 4750 scope.go:117] "RemoveContainer" containerID="1b36c495aef7af8cf2e56a97c7f408a8bb4c9a5b3b93f541f7872c1f942b25a2" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.787236 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.787301 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.787325 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.787356 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.787378 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:31Z","lastTransitionTime":"2025-10-08T18:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.890156 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.890484 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.890495 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.890511 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.890522 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:31Z","lastTransitionTime":"2025-10-08T18:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.992960 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.992999 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.993011 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.993026 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:31 crc kubenswrapper[4750]: I1008 18:11:31.993037 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:31Z","lastTransitionTime":"2025-10-08T18:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.043039 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rl7f4_25d63a44-9fd7-4c19-8715-6ddec94d1806/ovnkube-controller/1.log" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.046606 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" event={"ID":"25d63a44-9fd7-4c19-8715-6ddec94d1806","Type":"ContainerStarted","Data":"c6524a192625117a12120213d645f06d61fc0f54ac6e381cd5ca0b4096b6c5a8"} Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.047247 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.061872 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2x8kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9745a747-29eb-473f-bdb1-b526e1fe1445\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4026e6054000f0175
162939f647586ff94806c727b2b80a9437767275bd819d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disab
led\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64
b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"r
eadOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2x8kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:32Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.082447 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d63a44-9fd7-4c19-8715-6ddec94d1806\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6524a192625117a12120213d645f06d61fc0f54ac6e381cd5ca0b4096b6c5a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b36c495aef7af8cf2e56a97c7f408a8bb4c9a5b3b93f541f7872c1f942b25a2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T18:11:15Z\\\",\\\"message\\\":\\\"alse, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-daemon_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-daemon\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.43\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.43\\\\\\\", Port:8798, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1008 18:11:15.814210 6149 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler 
{0x1f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name
\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rl7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:32Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.099658 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83871b97-e7ad-4ee9-ab16-4e471c8dc3ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27de5d0a4e7203c8a6861925143be197b902b5c8d5031e26d1b732a31173c98b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d43e2deb74fdfedb40fdcc9b7f57f54f447c4f1493ece80186adb8e4624bbe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d72f137ccea9900279883c5b727b663c3ab23b09d2eab31abe8c6e4c536b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://155237e5ad4510cb1c106638bf4e87bb33102aed3005e5831f5f4232d0698ac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://155237e5ad4510cb1c106638bf4e87bb33102aed3005e5831f5f4232d0698ac8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:32Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.100502 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.100531 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.100542 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.100582 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.100594 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:32Z","lastTransitionTime":"2025-10-08T18:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.138868 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3096c4e9-1e62-46ee-8e9d-9d8f5c3d8f97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53389f30a75dc05f6b778b862fec384a0aefe7bd4b7884aad3af144bc8e5b87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7428f31e2b12470cc8d9fb98bb66e19ce665d028dae0b97adf318006043e2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77be59fa3ca71194c19f0050856d30f3b6f7122bd68e07535d0a8ca6cc299832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89305f32867a4d321e16ce84e9d4e34ec3bd7594db3700a4252ad0f874dd6d48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5029feed20c32b59a88eba1f7597a6689d99d93df97f2c5c94f4153de00170f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:32Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.165826 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dbbc4357a7bebc6b495e34c1eac5bd8fb1a52a16783ce31c8f3e78614a2dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:32Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.181142 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:32Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.196972 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:32Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.203380 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.203452 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.203468 4750 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.203491 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.203507 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:32Z","lastTransitionTime":"2025-10-08T18:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.210038 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hdvcg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31a78d19-68f8-46b7-9c53-2bbda2930e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810ea3f875bc348e876d76c92a9ecb7e7541afb
42bb26556051d021085c107c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84vv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hdvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:32Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.230889 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39b4e601279d1f37ba24fbb347b7f36255512601fd47c31a3bc686f2385cd88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://443e4db49275949020687bd8768b8a7745b5fdea4f5fb230d872b2a8ed9d6ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:32Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.245970 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6jln4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58294711-df24-43b8-b1b6-6617d55d49b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9a2cf606874110c681c7bbec359607eda7cabce0a44504ff21374021fa21a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78c7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6jln4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:32Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.265410 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f40f5c0-574a-4dd1-a4dc-024eba0628e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f90b8eb6984a4ced369d75be15a4980f41e7ca1bd3c79a64a79c974da9dc2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eecb6b3a56b80f2145ca258b9fc1473c42b44a4ded70f9ddc793868c3656191\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8961dba04ac6b2092efe4568b7f5d5afb5df6833c208d9484410ed6f8dcb13a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9df9b0a29bcccaeb1dcbdff5c85068802f2e46f39f39a63d09fe1386996c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b4f554ff77b65e8ff5a3cb8221e1c238434da5d49d308ec6684fcb2d333384\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 18:10:58.297055 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 18:10:58.298595 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1100232334/tls.crt::/tmp/serving-cert-1100232334/tls.key\\\\\\\"\\\\nI1008 18:11:03.775470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 18:11:03.786183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 18:11:03.786202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 18:11:03.786224 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 18:11:03.786230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 18:11:03.794993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 18:11:03.795026 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 18:11:03.795033 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 18:11:03.795023 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 18:11:03.795038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 18:11:03.795061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 18:11:03.795066 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 18:11:03.795070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 18:11:03.797195 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ece7dde88eaa94bbf9401909374a83d5841b24deb3f494aa3c709d055f6997e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T1
8:10:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:32Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.278868 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b53212dc-64c8-4b45-b425-b0bb2df4be9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535706571aabece4fcb32be30ca7e8e4732194bfe3511685909161fa0c06b1c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55916d7bc976fac242d0c48e65a606ba0f642461b47262b4b0d46076163810d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a005895ee7ff0fde6900a7b6f6b8928aad001f1aaa7ce650d77c11ce7acc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba40cf6292b171161b700726beb7b9546e2a239de4076618f1eacf17b75631fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:32Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.290510 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fd0b357c32562409dc3531c04e684de9fdd4bf05080368ff3537aef9324094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T18:11:32Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.300038 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f652a41fbc65313e94232c1b72bd7ccc0f53d7c510287ac9619ab713f770fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b75c94360320f8907419e9d7b9a237989702c6c66dd2e87832e18c2c4570c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-grddb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:32Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.305879 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 
18:11:32.305913 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.305926 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.305942 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.305954 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:32Z","lastTransitionTime":"2025-10-08T18:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.311872 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mhhrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b11eea9-c866-4055-abed-9955637179b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7fa876044092370bd4f89fae3df8cd06b7d27515a90ecc0dd4393c870854253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7f95a79c34d04ab7881b283c072184f575
623df24dadf0b1bc71c05bbcb92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mhhrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:32Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.322455 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7f9jd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b67ae9d5-e575-45e7-913a-01f379b86416\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhxt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhxt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7f9jd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:32Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:32 crc 
kubenswrapper[4750]: I1008 18:11:32.336026 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:32Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.348710 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzb5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5b55dd8e3131ba21f9cb51433c01355475968171e092627964cc5a56fabb59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzb5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:32Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.408741 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:32 crc 
kubenswrapper[4750]: I1008 18:11:32.408769 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.408777 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.408791 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.408799 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:32Z","lastTransitionTime":"2025-10-08T18:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.510851 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.510901 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.510913 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.510958 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.510971 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:32Z","lastTransitionTime":"2025-10-08T18:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.613829 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.614517 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.614575 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.614598 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.614610 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:32Z","lastTransitionTime":"2025-10-08T18:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.717535 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.717602 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.717610 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.717625 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.717636 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:32Z","lastTransitionTime":"2025-10-08T18:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.733934 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.733946 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:11:32 crc kubenswrapper[4750]: E1008 18:11:32.734075 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:11:32 crc kubenswrapper[4750]: E1008 18:11:32.734253 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.820537 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.820587 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.820597 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.820612 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.820624 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:32Z","lastTransitionTime":"2025-10-08T18:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.922909 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.922955 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.922964 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.922982 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:32 crc kubenswrapper[4750]: I1008 18:11:32.922996 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:32Z","lastTransitionTime":"2025-10-08T18:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.025435 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.025514 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.025533 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.025603 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.025631 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:33Z","lastTransitionTime":"2025-10-08T18:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.050851 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rl7f4_25d63a44-9fd7-4c19-8715-6ddec94d1806/ovnkube-controller/2.log" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.051634 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rl7f4_25d63a44-9fd7-4c19-8715-6ddec94d1806/ovnkube-controller/1.log" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.055304 4750 generic.go:334] "Generic (PLEG): container finished" podID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerID="c6524a192625117a12120213d645f06d61fc0f54ac6e381cd5ca0b4096b6c5a8" exitCode=1 Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.055363 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" event={"ID":"25d63a44-9fd7-4c19-8715-6ddec94d1806","Type":"ContainerDied","Data":"c6524a192625117a12120213d645f06d61fc0f54ac6e381cd5ca0b4096b6c5a8"} Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.055420 4750 scope.go:117] "RemoveContainer" containerID="1b36c495aef7af8cf2e56a97c7f408a8bb4c9a5b3b93f541f7872c1f942b25a2" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.056278 4750 scope.go:117] "RemoveContainer" containerID="c6524a192625117a12120213d645f06d61fc0f54ac6e381cd5ca0b4096b6c5a8" Oct 08 18:11:33 crc kubenswrapper[4750]: E1008 18:11:33.056624 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-rl7f4_openshift-ovn-kubernetes(25d63a44-9fd7-4c19-8715-6ddec94d1806)\"" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.078746 4750 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-additional-cni-plugins-2x8kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9745a747-29eb-473f-bdb1-b526e1fe1445\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4026e6054000f0175162939f647586ff94806c727b2b80a9437767275bd819d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\
\\":[{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10
-08T18:11:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26
698baa1e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for 
pod \"openshift-multus\"/\"multus-additional-cni-plugins-2x8kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:33Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.096970 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d63a44-9fd7-4c19-8715-6ddec94d1806\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6524a192625117a12120213d645f06d61fc0f54ac6e381cd5ca0b4096b6c5a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b36c495aef7af8cf2e56a97c7f408a8bb4c9a5b3b93f541f7872c1f942b25a2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T18:11:15Z\\\",\\\"message\\\":\\\"alse, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-daemon_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-daemon\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.43\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.43\\\\\\\", Port:8798, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1008 18:11:15.814210 6149 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6524a192625117a12120213d645f06d61fc0f54ac6e381cd5ca0b4096b6c5a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T18:11:32Z\\\",\\\"message\\\":\\\" 6372 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}\\\\nI1008 18:11:32.630778 6372 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-machine-config-operator/machine-config-daemon-grddb\\\\nI1008 18:11:32.630781 6372 services_controller.go:360] Finished syncing service metrics on namespace openshift-etcd-operator for network=default : 1.21394ms\\\\nI1008 18:11:32.630785 6372 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-7f9jd] creating logical port openshift-multus_network-metrics-daemon-7f9jd for pod on switch crc\\\\nI1008 18:11:32.630495 6372 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1008 18:11:32.630790 6372 services_controller.go:356] Processing sync for service openshift-controller-manager-operator/metrics for network=default\\\\nF1008 18:11:32.630791 6372 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to 
star\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\
"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64
d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rl7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:33Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.111740 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39b4e601279d1f37ba24fbb347b7f36255512601fd47c31a3bc686f2385cd88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://443e4db49275949020687bd8768b8a7745b5fdea4f5fb230d872b2a8ed9d6ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:33Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.122425 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6jln4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58294711-df24-43b8-b1b6-6617d55d49b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9a2cf606874110c681c7bbec359607eda7cabce0a44504ff21374021fa21a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78c7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6jln4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:33Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.128334 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.128372 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.128383 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.128399 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.128411 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:33Z","lastTransitionTime":"2025-10-08T18:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.137429 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83871b97-e7ad-4ee9-ab16-4e471c8dc3ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27de5d0a4e7203c8a6861925143be197b902b5c8d5031e26d1b732a31173c98b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d43e2deb74fdfedb40fdcc9b7f57f5
4f447c4f1493ece80186adb8e4624bbe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d72f137ccea9900279883c5b727b663c3ab23b09d2eab31abe8c6e4c536b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://155237e5ad4510cb1c106638bf4e87bb33102aed3005e5831f5f4232d0698ac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://155237e5ad4510cb1c106638bf4e87bb33102aed3005e5831f5f4232d0698ac8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:33Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.157795 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3096c4e9-1e62-46ee-8e9d-9d8f5c3d8f97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53389f30a75dc05f6b778b862fec384a0aefe7bd4b7884aad3af144bc8e5b87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7428f31e2b12470cc8d9fb98bb66e19ce665d028dae0b97adf318006043e2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77be59fa3ca71194c19f0050856d30f3b6f7122bd68e07535d0a8ca6cc299832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89305f32867a4d321e16ce84e9d4e34ec3bd7594db3700a4252ad0f874dd6d48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5029feed20c32b59a88eba1f7597a6689d99d93df97f2c5c94f4153de00170f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:33Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.173281 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dbbc4357a7bebc6b495e34c1eac5bd8fb1a52a16783ce31c8f3e78614a2dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:33Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.187277 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:33Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.205760 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:33Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.218539 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hdvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31a78d19-68f8-46b7-9c53-2bbda2930e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810ea3f875bc348e876d76c92a9ecb7e7541afb42bb26556051d021085c107c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84vv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hdvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:33Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.231841 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.231885 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.231902 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.231924 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.231938 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:33Z","lastTransitionTime":"2025-10-08T18:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.238065 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f40f5c0-574a-4dd1-a4dc-024eba0628e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f90b8eb6984a4ced369d75be15a4980f41e7ca1bd3c79a64a79c974da9dc2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eecb6b3a56b80f2145ca258b9fc1473c42b44a4ded70f9ddc793868c3656191\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8961dba04ac6b2092efe4568b7f5d5afb5df6833c208d9484410ed6f8dcb13a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9df9b0a29bcccaeb1dcbdff5c85068802f2e46f39f39a63d09fe1386996c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b4f554ff77b65e8ff5a3cb8221e1c238434da5d49d308ec6684fcb2d333384\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 18:10:58.297055 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 18:10:58.298595 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1100232334/tls.crt::/tmp/serving-cert-1100232334/tls.key\\\\\\\"\\\\nI1008 18:11:03.775470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 18:11:03.786183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 18:11:03.786202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 18:11:03.786224 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 18:11:03.786230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 18:11:03.794993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 18:11:03.795026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 18:11:03.795033 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 18:11:03.795023 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 18:11:03.795038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 18:11:03.795061 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 18:11:03.795066 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 18:11:03.795070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 18:11:03.797195 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ece7dde88eaa94bbf9401909374a83d5841b24deb3f494aa3c709d055f6997e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:33Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.253938 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b53212dc-64c8-4b45-b425-b0bb2df4be9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535706571aabece4fcb32be30ca7e8e4732194bfe3511685909161fa0c06b1c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55916d7bc976fac242d0c48e65a606ba0f642461b47262b4b0d46076163810d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a005895ee7ff0fde6900a7b6f6b8928aad001f1aaa7ce650d77c11ce7acc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba40cf6292b171161b700726beb7b9546e2a239de4076618f1eacf17b75631fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:33Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.266305 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fd0b357c32562409dc3531c04e684de9fdd4bf05080368ff3537aef9324094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T18:11:33Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.280299 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f652a41fbc65313e94232c1b72bd7ccc0f53d7c510287ac9619ab713f770fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b75c94360320f8907419e9d7b9a237989702c6c66dd2e87832e18c2c4570c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-grddb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:33Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.289814 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mhhrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b11eea9-c866-4055-abed-9955637179b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7fa876044092370bd4f89fae3df8cd06b7d27515a90ecc0dd4393c870854253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7f95a79c34d04ab7881b283c072184f575
623df24dadf0b1bc71c05bbcb92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mhhrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:33Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.303756 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7f9jd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b67ae9d5-e575-45e7-913a-01f379b86416\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhxt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhxt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7f9jd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:33Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:33 crc 
kubenswrapper[4750]: I1008 18:11:33.315469 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:33Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.327367 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzb5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5b55dd8e3131ba21f9cb51433c01355475968171e092627964cc5a56fabb59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzb5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:33Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.334882 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:33 crc 
kubenswrapper[4750]: I1008 18:11:33.334925 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.334939 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.334956 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.334966 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:33Z","lastTransitionTime":"2025-10-08T18:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.440309 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.440357 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.440367 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.440385 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.440397 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:33Z","lastTransitionTime":"2025-10-08T18:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.543442 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.543506 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.543523 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.543578 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.543608 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:33Z","lastTransitionTime":"2025-10-08T18:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.649692 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.649761 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.649781 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.649818 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.649838 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:33Z","lastTransitionTime":"2025-10-08T18:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.733827 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:11:33 crc kubenswrapper[4750]: E1008 18:11:33.733981 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.733841 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:11:33 crc kubenswrapper[4750]: E1008 18:11:33.734171 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.753182 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.753229 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.753242 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.753264 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.753280 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:33Z","lastTransitionTime":"2025-10-08T18:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.857315 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.857357 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.857366 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.857384 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.857395 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:33Z","lastTransitionTime":"2025-10-08T18:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.959804 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.960103 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.960177 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.960248 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.960330 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:33Z","lastTransitionTime":"2025-10-08T18:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:33 crc kubenswrapper[4750]: I1008 18:11:33.994670 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b67ae9d5-e575-45e7-913a-01f379b86416-metrics-certs\") pod \"network-metrics-daemon-7f9jd\" (UID: \"b67ae9d5-e575-45e7-913a-01f379b86416\") " pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:11:33 crc kubenswrapper[4750]: E1008 18:11:33.994805 4750 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 18:11:33 crc kubenswrapper[4750]: E1008 18:11:33.994935 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b67ae9d5-e575-45e7-913a-01f379b86416-metrics-certs podName:b67ae9d5-e575-45e7-913a-01f379b86416 nodeName:}" failed. No retries permitted until 2025-10-08 18:11:49.994922024 +0000 UTC m=+65.907893037 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b67ae9d5-e575-45e7-913a-01f379b86416-metrics-certs") pod "network-metrics-daemon-7f9jd" (UID: "b67ae9d5-e575-45e7-913a-01f379b86416") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.062578 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rl7f4_25d63a44-9fd7-4c19-8715-6ddec94d1806/ovnkube-controller/2.log" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.062682 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.062764 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.062784 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.062816 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.062838 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:34Z","lastTransitionTime":"2025-10-08T18:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.067603 4750 scope.go:117] "RemoveContainer" containerID="c6524a192625117a12120213d645f06d61fc0f54ac6e381cd5ca0b4096b6c5a8" Oct 08 18:11:34 crc kubenswrapper[4750]: E1008 18:11:34.067928 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-rl7f4_openshift-ovn-kubernetes(25d63a44-9fd7-4c19-8715-6ddec94d1806)\"" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.080950 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:34Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.099594 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzb5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5b55dd8e3131ba21f9cb51433c01355475968171e092627964cc5a56fabb59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzb5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:34Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.121419 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2x8kt" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9745a747-29eb-473f-bdb1-b526e1fe1445\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4026e6054000f0175162939f647586ff94806c727b2b80a9437767275bd819d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c
8eac89fab5a9eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:
11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:09Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2x8kt\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:34Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.142163 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d63a44-9fd7-4c19-8715-6ddec94d1806\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6524a192625117a12120213d645f06d61fc0f54ac6e381cd5ca0b4096b6c5a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6524a192625117a12120213d645f06d61fc0f54ac6e381cd5ca0b4096b6c5a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T18:11:32Z\\\",\\\"message\\\":\\\" 6372 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}\\\\nI1008 18:11:32.630778 6372 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-machine-config-operator/machine-config-daemon-grddb\\\\nI1008 18:11:32.630781 6372 services_controller.go:360] Finished syncing service metrics on namespace openshift-etcd-operator for network=default : 1.21394ms\\\\nI1008 18:11:32.630785 6372 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-7f9jd] creating logical port openshift-multus_network-metrics-daemon-7f9jd for pod on switch crc\\\\nI1008 18:11:32.630495 6372 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1008 18:11:32.630790 6372 services_controller.go:356] Processing sync for service openshift-controller-manager-operator/metrics for network=default\\\\nF1008 18:11:32.630791 6372 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to star\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rl7f4_openshift-ovn-kubernetes(25d63a44-9fd7-4c19-8715-6ddec94d1806)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34
d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rl7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:34Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.153843 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hdvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31a78d19-68f8-46b7-9c53-2bbda2930e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810ea3f875bc348e876d76c92a9ecb7e7541afb42bb26556051d021085c107c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84vv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hdvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:34Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.165515 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.165544 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.165569 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.165584 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.165595 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:34Z","lastTransitionTime":"2025-10-08T18:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.168650 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39b4e601279d1f37ba24fbb347b7f36255512601fd47c31a3bc686f2385cd88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://443e4db49275949020687bd8768b8a7745b5fdea4f5fb230d872b2a8ed9d6ac0\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:34Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.182745 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6jln4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58294711-df24-43b8-b1b6-6617d55d49b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9a2cf606874110c681c7bbec359607eda7cabce0a44504ff21374021fa21a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78c7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6jln4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:34Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.198164 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83871b97-e7ad-4ee9-ab16-4e471c8dc3ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27de5d0a4e7203c8a6861925143be197b902b5c8d5031e26d1b732a31173c98b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d43e2deb74fdfedb40fdcc9b7f57f54f447c4f1493ece80186adb8e4624bbe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d72f137ccea9900279883c5b727b663c3ab23b09d2eab31abe8c6e4c536b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://155237e5ad4510cb1c106638bf4e87bb33102aed3005e5831f5f4232d0698ac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://155237e5ad4510cb1c106638bf4e87bb33102aed3005e5831f5f4232d0698ac8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:34Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.222430 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3096c4e9-1e62-46ee-8e9d-9d8f5c3d8f97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53389f30a75dc05f6b778b862fec384a0aefe7bd4b7884aad3af144bc8e5b87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7428f31e2b12470cc8d9fb98bb66e19ce665d028dae0b97adf318006043e2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77be59fa3ca71194c19f0050856d30f3b6f7122bd68e07535d0a8ca6cc299832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89305f32867a4d321e16ce84e9d4e34ec3bd7594db3700a4252ad0f874dd6d48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5029feed20c32b59a88eba1f7597a6689d99d93df97f2c5c94f4153de00170f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:34Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.241927 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dbbc4357a7bebc6b495e34c1eac5bd8fb1a52a16783ce31c8f3e78614a2dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:34Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.265999 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:34Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.267497 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.267537 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.267582 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 
18:11:34.267606 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.267624 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:34Z","lastTransitionTime":"2025-10-08T18:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.281012 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:34Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.291933 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7f9jd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b67ae9d5-e575-45e7-913a-01f379b86416\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhxt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhxt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7f9jd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:34Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:34 crc 
kubenswrapper[4750]: I1008 18:11:34.306279 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f40f5c0-574a-4dd1-a4dc-024eba0628e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f90b8eb6984a4ced369d75be15a4980f41e7ca1bd3c79a64a79c974da9dc2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eecb6b3a56b80
f2145ca258b9fc1473c42b44a4ded70f9ddc793868c3656191\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8961dba04ac6b2092efe4568b7f5d5afb5df6833c208d9484410ed6f8dcb13a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9df9b0a29bcccaeb1dcbdff5c85068802f2e46f39f39a63d09fe1386996c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b4f554ff77b65e8ff5a3cb8221e1c238434da5d49d308ec6684fcb2d333384\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 18:10:58.297055 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 18:10:58.298595 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1100232334/tls.crt::/tmp/serving-cert-1100232334/tls.key\\\\\\\"\\\\nI1008 18:11:03.775470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 18:11:03.786183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 18:11:03.786202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 18:11:03.786224 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 18:11:03.786230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 18:11:03.794993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 18:11:03.795026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 18:11:03.795033 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 18:11:03.795023 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 18:11:03.795038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 18:11:03.795061 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 18:11:03.795066 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 18:11:03.795070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 18:11:03.797195 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ece7dde88eaa94bbf9401909374a83d5841b24deb3f494aa3c709d055f6997e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:34Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.321638 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b53212dc-64c8-4b45-b425-b0bb2df4be9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535706571aabece4fcb32be30ca7e8e4732194bfe3511685909161fa0c06b1c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55916d7bc976fac242d0c48e65a606ba0f642461b47262b4b0d46076163810d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a005895ee7ff0fde6900a7b6f6b8928aad001f1aaa7ce650d77c11ce7acc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba40cf6292b171161b700726beb7b9546e2a239de4076618f1eacf17b75631fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:34Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.338672 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fd0b357c32562409dc3531c04e684de9fdd4bf05080368ff3537aef9324094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T18:11:34Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.354145 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f652a41fbc65313e94232c1b72bd7ccc0f53d7c510287ac9619ab713f770fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b75c94360320f8907419e9d7b9a237989702c6c66dd2e87832e18c2c4570c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-grddb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:34Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.371040 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mhhrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b11eea9-c866-4055-abed-9955637179b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7fa876044092370bd4f89fae3df8cd06b7d27515a90ecc0dd4393c870854253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7f95a79c34d04ab7881b283c072184f575
623df24dadf0b1bc71c05bbcb92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mhhrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:34Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.371205 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.371255 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.371271 4750 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.371295 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.371312 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:34Z","lastTransitionTime":"2025-10-08T18:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.474667 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.474731 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.474749 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.474775 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.474793 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:34Z","lastTransitionTime":"2025-10-08T18:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.578483 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.578533 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.578559 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.578578 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.578591 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:34Z","lastTransitionTime":"2025-10-08T18:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.682090 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.682215 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.682237 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.682274 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.682300 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:34Z","lastTransitionTime":"2025-10-08T18:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.734103 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.734145 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:11:34 crc kubenswrapper[4750]: E1008 18:11:34.734249 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:11:34 crc kubenswrapper[4750]: E1008 18:11:34.735022 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.755095 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hdvcg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31a78d19-68f8-46b7-9c53-2bbda2930e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810ea3f875bc348e876d76c92a9ecb7e7541afb42bb26556051d021085c107c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416
f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84vv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hdvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:34Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.778884 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39b4e601279d1f37ba24fbb347b7f36255512601fd47c31a3bc686f2385cd88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://443e4db49275949020687bd8768b8a7745b5fdea4f5fb230d872b2a8ed9d6ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:34Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.789446 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.789520 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.789543 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.789611 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.789629 4750 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:34Z","lastTransitionTime":"2025-10-08T18:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.795088 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6jln4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58294711-df24-43b8-b1b6-6617d55d49b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9a2cf606874110c681c7bbec359607eda7cabce0a44504ff21374021fa21a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78c7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6jln4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:34Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.814082 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83871b97-e7ad-4ee9-ab16-4e471c8dc3ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27de5d0a4e7203c8a6861925143be197b902b5c8d5031e26d1b732a31173c98b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d43e2deb74fdfedb40fdcc9b7f57f54f447c4f1493ece80186adb8e4624bbe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d72f137ccea9900279883c5b727b663c3ab23b09d2eab31abe8c6e4c536b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://155237e5ad4510cb1c106638bf4e87bb33102aed3005e5831f5f4232d0698ac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://155237e5ad4510cb1c106638bf4e87bb33102aed3005e5831f5f4232d0698ac8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:34Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.844853 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3096c4e9-1e62-46ee-8e9d-9d8f5c3d8f97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53389f30a75dc05f6b778b862fec384a0aefe7bd4b7884aad3af144bc8e5b87c\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7428f31e2b12470cc8d9fb98bb66e19ce665d028dae0b97adf318006043e2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77be59fa3ca71194c19f0050856d30f3b6f7122bd68e07535d0a8ca6cc299832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89305f32867a4d321e16ce84e9d4e34ec3bd7594db3700a4252ad0f874dd6d48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5029feed20c32b59a88eba1f7597a6689d99d93df97f2c5c94f4153de00170f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:34Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.871462 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dbbc4357a7bebc6b495e34c1eac5bd8fb1a52a16783ce31c8f3e78614a2dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:34Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.892137 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:34Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.894122 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.894189 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.894210 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.894242 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.894261 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:34Z","lastTransitionTime":"2025-10-08T18:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.911428 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:34Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.927137 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7f9jd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b67ae9d5-e575-45e7-913a-01f379b86416\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhxt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhxt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7f9jd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:34Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:34 crc 
kubenswrapper[4750]: I1008 18:11:34.947051 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f40f5c0-574a-4dd1-a4dc-024eba0628e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f90b8eb6984a4ced369d75be15a4980f41e7ca1bd3c79a64a79c974da9dc2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eecb6b3a56b80
f2145ca258b9fc1473c42b44a4ded70f9ddc793868c3656191\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8961dba04ac6b2092efe4568b7f5d5afb5df6833c208d9484410ed6f8dcb13a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9df9b0a29bcccaeb1dcbdff5c85068802f2e46f39f39a63d09fe1386996c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b4f554ff77b65e8ff5a3cb8221e1c238434da5d49d308ec6684fcb2d333384\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 18:10:58.297055 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 18:10:58.298595 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1100232334/tls.crt::/tmp/serving-cert-1100232334/tls.key\\\\\\\"\\\\nI1008 18:11:03.775470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 18:11:03.786183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 18:11:03.786202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 18:11:03.786224 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 18:11:03.786230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 18:11:03.794993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 18:11:03.795026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 18:11:03.795033 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 18:11:03.795023 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 18:11:03.795038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 18:11:03.795061 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 18:11:03.795066 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 18:11:03.795070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 18:11:03.797195 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ece7dde88eaa94bbf9401909374a83d5841b24deb3f494aa3c709d055f6997e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:34Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.963036 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b53212dc-64c8-4b45-b425-b0bb2df4be9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535706571aabece4fcb32be30ca7e8e4732194bfe3511685909161fa0c06b1c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55916d7bc976fac242d0c48e65a606ba0f642461b47262b4b0d46076163810d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a005895ee7ff0fde6900a7b6f6b8928aad001f1aaa7ce650d77c11ce7acc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba40cf6292b171161b700726beb7b9546e2a239de4076618f1eacf17b75631fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:34Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.974896 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fd0b357c32562409dc3531c04e684de9fdd4bf05080368ff3537aef9324094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T18:11:34Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.985976 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f652a41fbc65313e94232c1b72bd7ccc0f53d7c510287ac9619ab713f770fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b75c94360320f8907419e9d7b9a237989702c6c66dd2e87832e18c2c4570c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-grddb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:34Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.997838 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mhhrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b11eea9-c866-4055-abed-9955637179b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7fa876044092370bd4f89fae3df8cd06b7d27515a90ecc0dd4393c870854253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7f95a79c34d04ab7881b283c072184f575
623df24dadf0b1bc71c05bbcb92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mhhrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:34Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.999816 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.999868 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:34 crc kubenswrapper[4750]: I1008 18:11:34.999879 4750 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:34.999897 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:34.999911 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:34Z","lastTransitionTime":"2025-10-08T18:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.014023 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:35Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.026935 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzb5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5b55dd8e3131ba21f9cb51433c01355475968171e092627964cc5a56fabb59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzb5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:35Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.046451 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2x8kt" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9745a747-29eb-473f-bdb1-b526e1fe1445\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4026e6054000f0175162939f647586ff94806c727b2b80a9437767275bd819d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c
8eac89fab5a9eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:
11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:09Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2x8kt\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:35Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.067690 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d63a44-9fd7-4c19-8715-6ddec94d1806\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6524a192625117a12120213d645f06d61fc0f54ac6e381cd5ca0b4096b6c5a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6524a192625117a12120213d645f06d61fc0f54ac6e381cd5ca0b4096b6c5a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T18:11:32Z\\\",\\\"message\\\":\\\" 6372 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}\\\\nI1008 18:11:32.630778 6372 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-machine-config-operator/machine-config-daemon-grddb\\\\nI1008 18:11:32.630781 6372 services_controller.go:360] Finished syncing service metrics on namespace openshift-etcd-operator for network=default : 1.21394ms\\\\nI1008 18:11:32.630785 6372 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-7f9jd] creating logical port openshift-multus_network-metrics-daemon-7f9jd for pod on switch crc\\\\nI1008 18:11:32.630495 6372 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1008 18:11:32.630790 6372 services_controller.go:356] Processing sync for service openshift-controller-manager-operator/metrics for network=default\\\\nF1008 18:11:32.630791 6372 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to star\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rl7f4_openshift-ovn-kubernetes(25d63a44-9fd7-4c19-8715-6ddec94d1806)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34
d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rl7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:35Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.102540 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.102657 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.102684 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.102709 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.102730 4750 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:35Z","lastTransitionTime":"2025-10-08T18:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.205959 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.206395 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.206641 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.206836 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.207182 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:35Z","lastTransitionTime":"2025-10-08T18:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.311514 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.311609 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.311629 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.311661 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.311687 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:35Z","lastTransitionTime":"2025-10-08T18:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.416460 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.416531 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.416584 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.416616 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.416636 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:35Z","lastTransitionTime":"2025-10-08T18:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.519664 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.519727 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.519742 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.519762 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.519775 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:35Z","lastTransitionTime":"2025-10-08T18:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.624830 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.625059 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.625084 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.625109 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.625129 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:35Z","lastTransitionTime":"2025-10-08T18:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.728465 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.728529 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.728541 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.728575 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.728585 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:35Z","lastTransitionTime":"2025-10-08T18:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.734996 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.735214 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:11:35 crc kubenswrapper[4750]: E1008 18:11:35.735297 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:11:35 crc kubenswrapper[4750]: E1008 18:11:35.736302 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.834298 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.834349 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.834358 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.834376 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.834391 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:35Z","lastTransitionTime":"2025-10-08T18:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.937642 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.937723 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.937744 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.937774 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:35 crc kubenswrapper[4750]: I1008 18:11:35.937798 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:35Z","lastTransitionTime":"2025-10-08T18:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.042083 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.042196 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.042218 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.042247 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.042265 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:36Z","lastTransitionTime":"2025-10-08T18:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.145821 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.145877 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.145891 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.145911 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.145924 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:36Z","lastTransitionTime":"2025-10-08T18:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.249544 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.249670 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.249683 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.249716 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.249734 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:36Z","lastTransitionTime":"2025-10-08T18:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.353178 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.353226 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.353237 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.353254 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.353269 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:36Z","lastTransitionTime":"2025-10-08T18:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.457275 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.457323 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.457335 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.457356 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.457368 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:36Z","lastTransitionTime":"2025-10-08T18:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.524208 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.524408 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.524442 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.524489 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:11:36 crc kubenswrapper[4750]: E1008 18:11:36.524680 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:12:08.524639969 +0000 UTC m=+84.437610992 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:11:36 crc kubenswrapper[4750]: E1008 18:11:36.524697 4750 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 18:11:36 crc kubenswrapper[4750]: E1008 18:11:36.524735 4750 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 18:11:36 crc kubenswrapper[4750]: E1008 18:11:36.524754 4750 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 18:11:36 crc kubenswrapper[4750]: E1008 18:11:36.524696 4750 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 18:11:36 crc kubenswrapper[4750]: E1008 18:11:36.524798 4750 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 18:11:36 crc 
kubenswrapper[4750]: E1008 18:11:36.524834 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-08 18:12:08.524809753 +0000 UTC m=+84.437780926 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 18:11:36 crc kubenswrapper[4750]: E1008 18:11:36.524856 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 18:12:08.524846694 +0000 UTC m=+84.437817717 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 18:11:36 crc kubenswrapper[4750]: E1008 18:11:36.524991 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 18:12:08.524920656 +0000 UTC m=+84.437891709 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.560845 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.560903 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.560914 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.560936 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.560949 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:36Z","lastTransitionTime":"2025-10-08T18:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.625649 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:11:36 crc kubenswrapper[4750]: E1008 18:11:36.625969 4750 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 18:11:36 crc kubenswrapper[4750]: E1008 18:11:36.626026 4750 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 18:11:36 crc kubenswrapper[4750]: E1008 18:11:36.626052 4750 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 18:11:36 crc kubenswrapper[4750]: E1008 18:11:36.626155 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-08 18:12:08.626127049 +0000 UTC m=+84.539098092 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.664457 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.664516 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.664536 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.664590 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.664612 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:36Z","lastTransitionTime":"2025-10-08T18:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.734052 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.734195 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:11:36 crc kubenswrapper[4750]: E1008 18:11:36.734259 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:11:36 crc kubenswrapper[4750]: E1008 18:11:36.734417 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.768047 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.768100 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.768115 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.768136 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.768150 4750 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:36Z","lastTransitionTime":"2025-10-08T18:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.870948 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.871048 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.871066 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.871428 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.871741 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:36Z","lastTransitionTime":"2025-10-08T18:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.975028 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.975107 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.975129 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.975155 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:36 crc kubenswrapper[4750]: I1008 18:11:36.975175 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:36Z","lastTransitionTime":"2025-10-08T18:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:37 crc kubenswrapper[4750]: I1008 18:11:37.077330 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:37 crc kubenswrapper[4750]: I1008 18:11:37.077582 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:37 crc kubenswrapper[4750]: I1008 18:11:37.077645 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:37 crc kubenswrapper[4750]: I1008 18:11:37.077889 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:37 crc kubenswrapper[4750]: I1008 18:11:37.078079 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:37Z","lastTransitionTime":"2025-10-08T18:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:37 crc kubenswrapper[4750]: I1008 18:11:37.181653 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:37 crc kubenswrapper[4750]: I1008 18:11:37.181740 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:37 crc kubenswrapper[4750]: I1008 18:11:37.181793 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:37 crc kubenswrapper[4750]: I1008 18:11:37.181820 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:37 crc kubenswrapper[4750]: I1008 18:11:37.181839 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:37Z","lastTransitionTime":"2025-10-08T18:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:37 crc kubenswrapper[4750]: I1008 18:11:37.285104 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:37 crc kubenswrapper[4750]: I1008 18:11:37.285445 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:37 crc kubenswrapper[4750]: I1008 18:11:37.285695 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:37 crc kubenswrapper[4750]: I1008 18:11:37.285864 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:37 crc kubenswrapper[4750]: I1008 18:11:37.286058 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:37Z","lastTransitionTime":"2025-10-08T18:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:37 crc kubenswrapper[4750]: I1008 18:11:37.391194 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:37 crc kubenswrapper[4750]: I1008 18:11:37.391303 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:37 crc kubenswrapper[4750]: I1008 18:11:37.391322 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:37 crc kubenswrapper[4750]: I1008 18:11:37.391349 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:37 crc kubenswrapper[4750]: I1008 18:11:37.391369 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:37Z","lastTransitionTime":"2025-10-08T18:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:37 crc kubenswrapper[4750]: I1008 18:11:37.495042 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:37 crc kubenswrapper[4750]: I1008 18:11:37.495206 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:37 crc kubenswrapper[4750]: I1008 18:11:37.495233 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:37 crc kubenswrapper[4750]: I1008 18:11:37.495265 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:37 crc kubenswrapper[4750]: I1008 18:11:37.495289 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:37Z","lastTransitionTime":"2025-10-08T18:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:37 crc kubenswrapper[4750]: I1008 18:11:37.599163 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:37 crc kubenswrapper[4750]: I1008 18:11:37.599251 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:37 crc kubenswrapper[4750]: I1008 18:11:37.599270 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:37 crc kubenswrapper[4750]: I1008 18:11:37.599298 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:37 crc kubenswrapper[4750]: I1008 18:11:37.599318 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:37Z","lastTransitionTime":"2025-10-08T18:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:37 crc kubenswrapper[4750]: I1008 18:11:37.702439 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:37 crc kubenswrapper[4750]: I1008 18:11:37.702513 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:37 crc kubenswrapper[4750]: I1008 18:11:37.702531 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:37 crc kubenswrapper[4750]: I1008 18:11:37.702589 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:37 crc kubenswrapper[4750]: I1008 18:11:37.702608 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:37Z","lastTransitionTime":"2025-10-08T18:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:37 crc kubenswrapper[4750]: I1008 18:11:37.733966 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:11:37 crc kubenswrapper[4750]: E1008 18:11:37.734113 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:11:37 crc kubenswrapper[4750]: I1008 18:11:37.734399 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:11:37 crc kubenswrapper[4750]: E1008 18:11:37.734603 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:11:37 crc kubenswrapper[4750]: I1008 18:11:37.810708 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:37 crc kubenswrapper[4750]: I1008 18:11:37.811029 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:37 crc kubenswrapper[4750]: I1008 18:11:37.811121 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:37 crc kubenswrapper[4750]: I1008 18:11:37.811399 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:37 crc kubenswrapper[4750]: I1008 18:11:37.811486 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:37Z","lastTransitionTime":"2025-10-08T18:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:37 crc kubenswrapper[4750]: I1008 18:11:37.914645 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:37 crc kubenswrapper[4750]: I1008 18:11:37.914727 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:37 crc kubenswrapper[4750]: I1008 18:11:37.914749 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:37 crc kubenswrapper[4750]: I1008 18:11:37.914781 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:37 crc kubenswrapper[4750]: I1008 18:11:37.914801 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:37Z","lastTransitionTime":"2025-10-08T18:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.017832 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.018090 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.018221 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.018308 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.018370 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:38Z","lastTransitionTime":"2025-10-08T18:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.120910 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.121178 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.121283 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.121360 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.121418 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:38Z","lastTransitionTime":"2025-10-08T18:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.224907 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.225117 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.225181 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.225281 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.225357 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:38Z","lastTransitionTime":"2025-10-08T18:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.329130 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.329216 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.329239 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.329270 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.329292 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:38Z","lastTransitionTime":"2025-10-08T18:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.432785 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.432855 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.432873 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.432901 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.432920 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:38Z","lastTransitionTime":"2025-10-08T18:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.536003 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.536045 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.536055 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.536070 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.536080 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:38Z","lastTransitionTime":"2025-10-08T18:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.639174 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.639234 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.639244 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.639259 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.639268 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:38Z","lastTransitionTime":"2025-10-08T18:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.733804 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.733817 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:11:38 crc kubenswrapper[4750]: E1008 18:11:38.733975 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:11:38 crc kubenswrapper[4750]: E1008 18:11:38.734014 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.740946 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.741017 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.741037 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.741060 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.741076 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:38Z","lastTransitionTime":"2025-10-08T18:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.843914 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.843993 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.844013 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.844038 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.844058 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:38Z","lastTransitionTime":"2025-10-08T18:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.946887 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.946944 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.946955 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.946970 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:38 crc kubenswrapper[4750]: I1008 18:11:38.946999 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:38Z","lastTransitionTime":"2025-10-08T18:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.049150 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.049188 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.049197 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.049211 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.049221 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:39Z","lastTransitionTime":"2025-10-08T18:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.151323 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.151371 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.151385 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.151401 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.151412 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:39Z","lastTransitionTime":"2025-10-08T18:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.253679 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.253725 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.253765 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.253788 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.253799 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:39Z","lastTransitionTime":"2025-10-08T18:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.356001 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.356048 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.356060 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.356086 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.356097 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:39Z","lastTransitionTime":"2025-10-08T18:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.458288 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.458317 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.458324 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.458336 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.458344 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:39Z","lastTransitionTime":"2025-10-08T18:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.560848 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.560881 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.560891 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.560907 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.560919 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:39Z","lastTransitionTime":"2025-10-08T18:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.662917 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.662958 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.662969 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.662985 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.662996 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:39Z","lastTransitionTime":"2025-10-08T18:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.733938 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:11:39 crc kubenswrapper[4750]: E1008 18:11:39.734336 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.734040 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:11:39 crc kubenswrapper[4750]: E1008 18:11:39.734673 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.765599 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.765639 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.765649 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.765666 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.765676 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:39Z","lastTransitionTime":"2025-10-08T18:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.867759 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.867971 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.868095 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.868188 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.868381 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:39Z","lastTransitionTime":"2025-10-08T18:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.970330 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.970626 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.970707 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.970804 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:39 crc kubenswrapper[4750]: I1008 18:11:39.970877 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:39Z","lastTransitionTime":"2025-10-08T18:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.073132 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.073168 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.073179 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.073194 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.073205 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:40Z","lastTransitionTime":"2025-10-08T18:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.175008 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.175066 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.175083 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.175102 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.175116 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:40Z","lastTransitionTime":"2025-10-08T18:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.278015 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.278045 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.278053 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.278064 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.278073 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:40Z","lastTransitionTime":"2025-10-08T18:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.380749 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.380785 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.380794 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.380807 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.380816 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:40Z","lastTransitionTime":"2025-10-08T18:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.484120 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.484159 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.484168 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.484182 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.484194 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:40Z","lastTransitionTime":"2025-10-08T18:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.587062 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.587110 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.587123 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.587140 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.587151 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:40Z","lastTransitionTime":"2025-10-08T18:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.689717 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.689756 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.689766 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.689780 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.689790 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:40Z","lastTransitionTime":"2025-10-08T18:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.733388 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.733689 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:11:40 crc kubenswrapper[4750]: E1008 18:11:40.733770 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:11:40 crc kubenswrapper[4750]: E1008 18:11:40.734006 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.792573 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.792610 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.792620 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.792653 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.792666 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:40Z","lastTransitionTime":"2025-10-08T18:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.896301 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.896350 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.896363 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.896383 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.896394 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:40Z","lastTransitionTime":"2025-10-08T18:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.999501 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.999779 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.999841 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:40 crc kubenswrapper[4750]: I1008 18:11:40.999903 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:40.999975 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:40Z","lastTransitionTime":"2025-10-08T18:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.102200 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.102507 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.102626 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.102694 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.102747 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:41Z","lastTransitionTime":"2025-10-08T18:11:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.204648 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.204904 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.204964 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.205053 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.205114 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:41Z","lastTransitionTime":"2025-10-08T18:11:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.307364 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.307644 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.307747 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.307870 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.307969 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:41Z","lastTransitionTime":"2025-10-08T18:11:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.318080 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.318122 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.318131 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.318145 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.318153 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:41Z","lastTransitionTime":"2025-10-08T18:11:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:41 crc kubenswrapper[4750]: E1008 18:11:41.330238 4750 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bfc00a00-da5b-4621-a04e-e20b47fefa95\\\",\\\"systemUUID\\\":\\\"4099aa14-e8f7-4bff-9f81-40284b959bbe\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:41Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.333461 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.333601 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.333689 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.333813 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.333883 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:41Z","lastTransitionTime":"2025-10-08T18:11:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:41 crc kubenswrapper[4750]: E1008 18:11:41.345201 4750 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bfc00a00-da5b-4621-a04e-e20b47fefa95\\\",\\\"systemUUID\\\":\\\"4099aa14-e8f7-4bff-9f81-40284b959bbe\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:41Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.349219 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.349326 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.349397 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.349472 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.349596 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:41Z","lastTransitionTime":"2025-10-08T18:11:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:41 crc kubenswrapper[4750]: E1008 18:11:41.361006 4750 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bfc00a00-da5b-4621-a04e-e20b47fefa95\\\",\\\"systemUUID\\\":\\\"4099aa14-e8f7-4bff-9f81-40284b959bbe\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:41Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.364891 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.364959 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.364976 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.365001 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.365018 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:41Z","lastTransitionTime":"2025-10-08T18:11:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:41 crc kubenswrapper[4750]: E1008 18:11:41.378835 4750 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bfc00a00-da5b-4621-a04e-e20b47fefa95\\\",\\\"systemUUID\\\":\\\"4099aa14-e8f7-4bff-9f81-40284b959bbe\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:41Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.383656 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.383821 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.383900 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.383985 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.384068 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:41Z","lastTransitionTime":"2025-10-08T18:11:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:41 crc kubenswrapper[4750]: E1008 18:11:41.395811 4750 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bfc00a00-da5b-4621-a04e-e20b47fefa95\\\",\\\"systemUUID\\\":\\\"4099aa14-e8f7-4bff-9f81-40284b959bbe\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:41Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:41 crc kubenswrapper[4750]: E1008 18:11:41.396128 4750 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.410346 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.410387 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.410398 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.410414 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.410425 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:41Z","lastTransitionTime":"2025-10-08T18:11:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.513151 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.513191 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.513203 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.513215 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.513224 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:41Z","lastTransitionTime":"2025-10-08T18:11:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.615275 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.615428 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.615478 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.615496 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.615511 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:41Z","lastTransitionTime":"2025-10-08T18:11:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.718169 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.718215 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.718227 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.718245 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.718260 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:41Z","lastTransitionTime":"2025-10-08T18:11:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.733767 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:11:41 crc kubenswrapper[4750]: E1008 18:11:41.734000 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.733810 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:11:41 crc kubenswrapper[4750]: E1008 18:11:41.734204 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.820897 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.820936 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.820944 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.820959 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.820968 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:41Z","lastTransitionTime":"2025-10-08T18:11:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.922700 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.922736 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.922755 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.922773 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:41 crc kubenswrapper[4750]: I1008 18:11:41.922785 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:41Z","lastTransitionTime":"2025-10-08T18:11:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.024837 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.025051 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.025110 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.025177 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.025234 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:42Z","lastTransitionTime":"2025-10-08T18:11:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.128021 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.128298 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.128365 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.128436 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.128496 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:42Z","lastTransitionTime":"2025-10-08T18:11:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.230946 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.230992 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.231003 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.231018 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.231029 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:42Z","lastTransitionTime":"2025-10-08T18:11:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.333984 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.334050 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.334075 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.334104 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.334124 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:42Z","lastTransitionTime":"2025-10-08T18:11:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.437014 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.437063 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.437076 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.437095 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.437107 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:42Z","lastTransitionTime":"2025-10-08T18:11:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.540425 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.540510 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.540522 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.540541 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.540628 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:42Z","lastTransitionTime":"2025-10-08T18:11:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.643863 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.644352 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.644507 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.644743 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.644915 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:42Z","lastTransitionTime":"2025-10-08T18:11:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.734242 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.734287 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:11:42 crc kubenswrapper[4750]: E1008 18:11:42.735321 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:11:42 crc kubenswrapper[4750]: E1008 18:11:42.735579 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.748285 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.748323 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.748334 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.748352 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.748365 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:42Z","lastTransitionTime":"2025-10-08T18:11:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.851325 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.851380 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.851395 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.851416 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.851433 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:42Z","lastTransitionTime":"2025-10-08T18:11:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.954138 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.954189 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.954200 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.954218 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:42 crc kubenswrapper[4750]: I1008 18:11:42.954230 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:42Z","lastTransitionTime":"2025-10-08T18:11:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.056953 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.057002 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.057019 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.057040 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.057053 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:43Z","lastTransitionTime":"2025-10-08T18:11:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.159870 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.159928 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.159944 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.159966 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.159981 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:43Z","lastTransitionTime":"2025-10-08T18:11:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.262764 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.262802 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.262811 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.262825 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.262837 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:43Z","lastTransitionTime":"2025-10-08T18:11:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.365181 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.365235 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.365244 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.365256 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.365266 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:43Z","lastTransitionTime":"2025-10-08T18:11:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.468142 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.468203 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.468220 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.468250 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.468268 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:43Z","lastTransitionTime":"2025-10-08T18:11:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.570864 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.570913 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.570929 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.570954 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.570972 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:43Z","lastTransitionTime":"2025-10-08T18:11:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.673095 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.673390 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.673456 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.673530 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.673627 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:43Z","lastTransitionTime":"2025-10-08T18:11:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.734059 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.734059 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:11:43 crc kubenswrapper[4750]: E1008 18:11:43.734475 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:11:43 crc kubenswrapper[4750]: E1008 18:11:43.734396 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.775369 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.775402 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.775410 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.775423 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.775432 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:43Z","lastTransitionTime":"2025-10-08T18:11:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.877797 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.877839 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.877850 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.877865 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.877878 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:43Z","lastTransitionTime":"2025-10-08T18:11:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.979569 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.979597 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.979605 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.979618 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:43 crc kubenswrapper[4750]: I1008 18:11:43.979626 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:43Z","lastTransitionTime":"2025-10-08T18:11:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.085645 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.085697 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.085709 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.085735 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.085751 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:44Z","lastTransitionTime":"2025-10-08T18:11:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.188329 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.188630 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.188757 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.188854 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.188940 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:44Z","lastTransitionTime":"2025-10-08T18:11:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.291189 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.291216 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.291224 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.291236 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.291244 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:44Z","lastTransitionTime":"2025-10-08T18:11:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.394695 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.394773 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.394800 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.394830 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.394852 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:44Z","lastTransitionTime":"2025-10-08T18:11:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.498301 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.498662 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.498875 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.499018 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.499145 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:44Z","lastTransitionTime":"2025-10-08T18:11:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.602130 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.602171 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.602183 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.602199 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.602210 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:44Z","lastTransitionTime":"2025-10-08T18:11:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.704767 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.704797 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.704806 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.704818 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.704828 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:44Z","lastTransitionTime":"2025-10-08T18:11:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.733906 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.734052 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:11:44 crc kubenswrapper[4750]: E1008 18:11:44.734249 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:11:44 crc kubenswrapper[4750]: E1008 18:11:44.734637 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.755977 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2x8kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9745a747-29eb-473f-bdb1-b526e1fe1445\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4026e6054000f0175162939f647586ff94806c727b2b80a9437767275bd819d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2
808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbeecda4a
86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2x8kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:44Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.785634 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d63a44-9fd7-4c19-8715-6ddec94d1806\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6524a192625117a12120213d645f06d61fc0f54ac6e381cd5ca0b4096b6c5a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6524a192625117a12120213d645f06d61fc0f54ac6e381cd5ca0b4096b6c5a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T18:11:32Z\\\",\\\"message\\\":\\\" 6372 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}\\\\nI1008 18:11:32.630778 6372 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-machine-config-operator/machine-config-daemon-grddb\\\\nI1008 18:11:32.630781 6372 services_controller.go:360] Finished syncing service metrics on namespace openshift-etcd-operator for network=default : 1.21394ms\\\\nI1008 18:11:32.630785 6372 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-7f9jd] creating logical port openshift-multus_network-metrics-daemon-7f9jd for pod on switch crc\\\\nI1008 18:11:32.630495 6372 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1008 18:11:32.630790 6372 services_controller.go:356] Processing sync for service openshift-controller-manager-operator/metrics for network=default\\\\nF1008 18:11:32.630791 6372 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to star\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rl7f4_openshift-ovn-kubernetes(25d63a44-9fd7-4c19-8715-6ddec94d1806)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34
d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rl7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:44Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.800970 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:44Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.808369 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.808422 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.808441 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 
18:11:44.808464 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.808482 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:44Z","lastTransitionTime":"2025-10-08T18:11:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.816992 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:44Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.829172 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hdvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31a78d19-68f8-46b7-9c53-2bbda2930e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810ea3f875bc348e876d76c92a9ecb7e7541afb42bb26556051d021085c107c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84vv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hdvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:44Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.842946 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39b4e601279d1f37ba24fbb347b7f36255512601fd47c31a3bc686f2385cd88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://443e4db49275949020687bd8768b8a7745b5fdea4f5fb230d872b2a8ed9d6ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:44Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.855834 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6jln4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58294711-df24-43b8-b1b6-6617d55d49b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9a2cf606874110c681c7bbec359607eda7cabce0a44504ff21374021fa21a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78c7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6jln4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:44Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.872987 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83871b97-e7ad-4ee9-ab16-4e471c8dc3ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27de5d0a4e7203c8a6861925143be197b902b5c8d5031e26d1b732a31173c98b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d43e2deb74fdfedb40fdcc9b7f57f54f447c4f1493ece80186adb8e4624bbe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d72f137ccea9900279883c5b727b663c3ab23b09d2eab31abe8c6e4c536b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://155237e5ad4510cb1c106638bf4e87bb33102aed3005e5831f5f4232d0698ac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://155237e5ad4510cb1c106638bf4e87bb33102aed3005e5831f5f4232d0698ac8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:44Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.900757 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3096c4e9-1e62-46ee-8e9d-9d8f5c3d8f97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53389f30a75dc05f6b778b862fec384a0aefe7bd4b7884aad3af144bc8e5b87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7428f31e2b12470cc8d9fb98bb66e19ce665d028dae0b97adf318006043e2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77be59fa3ca71194c19f0050856d30f3b6f7122bd68e07535d0a8ca6cc299832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89305f32867a4d321e16ce84e9d4e34ec3bd7594db3700a4252ad0f874dd6d48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5029feed20c32b59a88eba1f7597a6689d99d93df97f2c5c94f4153de00170f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:44Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.910583 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.910617 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.910628 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.910644 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.910655 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:44Z","lastTransitionTime":"2025-10-08T18:11:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.918493 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dbbc4357a7bebc6b495e34c1eac5bd8fb1a52a16783ce31c8f3e78614a2dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:44Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.933422 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f652a41fbc65313e94232c1b72bd7ccc0f53d7c510287ac9619ab713f770fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube
-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b75c94360320f8907419e9d7b9a237989702c6c66dd2e87832e18c2c4570c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-grddb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-08T18:11:44Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.946963 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mhhrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b11eea9-c866-4055-abed-9955637179b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7fa876044092370bd4f89fae3df8cd06b7d27515a90ecc0dd4393c870854253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7f95a79c34d04ab7881b283c072184f575623df24dadf0b1bc71c05bbcb92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mhhrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:44Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.959439 4750 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-7f9jd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b67ae9d5-e575-45e7-913a-01f379b86416\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhxt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhxt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7f9jd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:44Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:44 crc 
kubenswrapper[4750]: I1008 18:11:44.977640 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f40f5c0-574a-4dd1-a4dc-024eba0628e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f90b8eb6984a4ced369d75be15a4980f41e7ca1bd3c79a64a79c974da9dc2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eecb6b3a56b80
f2145ca258b9fc1473c42b44a4ded70f9ddc793868c3656191\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8961dba04ac6b2092efe4568b7f5d5afb5df6833c208d9484410ed6f8dcb13a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9df9b0a29bcccaeb1dcbdff5c85068802f2e46f39f39a63d09fe1386996c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b4f554ff77b65e8ff5a3cb8221e1c238434da5d49d308ec6684fcb2d333384\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 18:10:58.297055 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 18:10:58.298595 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1100232334/tls.crt::/tmp/serving-cert-1100232334/tls.key\\\\\\\"\\\\nI1008 18:11:03.775470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 18:11:03.786183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 18:11:03.786202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 18:11:03.786224 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 18:11:03.786230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 18:11:03.794993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 18:11:03.795026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 18:11:03.795033 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 18:11:03.795023 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 18:11:03.795038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 18:11:03.795061 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 18:11:03.795066 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 18:11:03.795070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 18:11:03.797195 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ece7dde88eaa94bbf9401909374a83d5841b24deb3f494aa3c709d055f6997e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:44Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:44 crc kubenswrapper[4750]: I1008 18:11:44.988671 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b53212dc-64c8-4b45-b425-b0bb2df4be9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535706571aabece4fcb32be30ca7e8e4732194bfe3511685909161fa0c06b1c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55916d7bc976fac242d0c48e65a606ba0f642461b47262b4b0d46076163810d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a005895ee7ff0fde6900a7b6f6b8928aad001f1aaa7ce650d77c11ce7acc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba40cf6292b171161b700726beb7b9546e2a239de4076618f1eacf17b75631fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:44Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.000920 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fd0b357c32562409dc3531c04e684de9fdd4bf05080368ff3537aef9324094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T18:11:44Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.012768 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.012799 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.012810 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.012825 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.012837 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:45Z","lastTransitionTime":"2025-10-08T18:11:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.016259 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:45Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.027007 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzb5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5b55dd8e3131ba21f9cb51433c01355475968171e092627964cc5a56fabb59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzb5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:45Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.115192 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:45 crc 
kubenswrapper[4750]: I1008 18:11:45.115250 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.115268 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.115291 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.115308 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:45Z","lastTransitionTime":"2025-10-08T18:11:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.217717 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.217768 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.217780 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.217796 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.217807 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:45Z","lastTransitionTime":"2025-10-08T18:11:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.321051 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.321358 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.321578 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.321838 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.321972 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:45Z","lastTransitionTime":"2025-10-08T18:11:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.424706 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.424753 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.424767 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.424783 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.424795 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:45Z","lastTransitionTime":"2025-10-08T18:11:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.526427 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.526462 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.526473 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.526529 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.526542 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:45Z","lastTransitionTime":"2025-10-08T18:11:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.629646 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.629911 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.629975 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.630043 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.630106 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:45Z","lastTransitionTime":"2025-10-08T18:11:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.732794 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.733281 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.733386 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.733478 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.733332 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.733346 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:11:45 crc kubenswrapper[4750]: E1008 18:11:45.733791 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.733564 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:45Z","lastTransitionTime":"2025-10-08T18:11:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:45 crc kubenswrapper[4750]: E1008 18:11:45.733924 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.836686 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.836739 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.836751 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.836777 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.836791 4750 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:45Z","lastTransitionTime":"2025-10-08T18:11:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.939396 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.939480 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.939503 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.939526 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:45 crc kubenswrapper[4750]: I1008 18:11:45.939543 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:45Z","lastTransitionTime":"2025-10-08T18:11:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.042251 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.042293 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.042306 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.042321 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.042332 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:46Z","lastTransitionTime":"2025-10-08T18:11:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.144803 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.144900 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.144920 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.144953 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.144979 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:46Z","lastTransitionTime":"2025-10-08T18:11:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.248923 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.249000 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.249044 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.249087 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.249113 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:46Z","lastTransitionTime":"2025-10-08T18:11:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.352390 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.352856 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.352906 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.352927 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.352942 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:46Z","lastTransitionTime":"2025-10-08T18:11:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.455958 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.456048 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.456064 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.456095 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.456112 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:46Z","lastTransitionTime":"2025-10-08T18:11:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.559163 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.559272 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.559293 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.559326 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.559352 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:46Z","lastTransitionTime":"2025-10-08T18:11:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.661609 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.661657 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.661668 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.661683 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.661693 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:46Z","lastTransitionTime":"2025-10-08T18:11:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.733542 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.733658 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:11:46 crc kubenswrapper[4750]: E1008 18:11:46.733783 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:11:46 crc kubenswrapper[4750]: E1008 18:11:46.733865 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.764192 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.764263 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.764286 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.764316 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.764336 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:46Z","lastTransitionTime":"2025-10-08T18:11:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.867345 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.867382 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.867393 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.867410 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.867422 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:46Z","lastTransitionTime":"2025-10-08T18:11:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.969438 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.969489 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.969500 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.969516 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:46 crc kubenswrapper[4750]: I1008 18:11:46.969527 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:46Z","lastTransitionTime":"2025-10-08T18:11:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.072730 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.072768 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.072777 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.072791 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.072803 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:47Z","lastTransitionTime":"2025-10-08T18:11:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.175731 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.175791 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.175808 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.175831 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.175847 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:47Z","lastTransitionTime":"2025-10-08T18:11:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.277908 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.277952 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.277969 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.277988 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.278002 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:47Z","lastTransitionTime":"2025-10-08T18:11:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.381057 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.381113 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.381127 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.381149 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.381165 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:47Z","lastTransitionTime":"2025-10-08T18:11:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.483612 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.483656 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.483669 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.483685 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.483698 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:47Z","lastTransitionTime":"2025-10-08T18:11:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.587314 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.587350 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.587361 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.587383 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.587443 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:47Z","lastTransitionTime":"2025-10-08T18:11:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.689177 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.689228 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.689243 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.689264 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.689282 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:47Z","lastTransitionTime":"2025-10-08T18:11:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.734078 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.734148 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:11:47 crc kubenswrapper[4750]: E1008 18:11:47.734261 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:11:47 crc kubenswrapper[4750]: E1008 18:11:47.734503 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.735318 4750 scope.go:117] "RemoveContainer" containerID="c6524a192625117a12120213d645f06d61fc0f54ac6e381cd5ca0b4096b6c5a8" Oct 08 18:11:47 crc kubenswrapper[4750]: E1008 18:11:47.735508 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-rl7f4_openshift-ovn-kubernetes(25d63a44-9fd7-4c19-8715-6ddec94d1806)\"" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.791628 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.791663 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.791674 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.791689 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.791704 4750 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:47Z","lastTransitionTime":"2025-10-08T18:11:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.894586 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.894628 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.894640 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.894657 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.894670 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:47Z","lastTransitionTime":"2025-10-08T18:11:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.997464 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.997509 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.997520 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.997538 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:47 crc kubenswrapper[4750]: I1008 18:11:47.997570 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:47Z","lastTransitionTime":"2025-10-08T18:11:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:48 crc kubenswrapper[4750]: I1008 18:11:48.100195 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:48 crc kubenswrapper[4750]: I1008 18:11:48.100281 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:48 crc kubenswrapper[4750]: I1008 18:11:48.100302 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:48 crc kubenswrapper[4750]: I1008 18:11:48.100326 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:48 crc kubenswrapper[4750]: I1008 18:11:48.100342 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:48Z","lastTransitionTime":"2025-10-08T18:11:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:48 crc kubenswrapper[4750]: I1008 18:11:48.202716 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:48 crc kubenswrapper[4750]: I1008 18:11:48.202745 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:48 crc kubenswrapper[4750]: I1008 18:11:48.202753 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:48 crc kubenswrapper[4750]: I1008 18:11:48.202766 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:48 crc kubenswrapper[4750]: I1008 18:11:48.202774 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:48Z","lastTransitionTime":"2025-10-08T18:11:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:48 crc kubenswrapper[4750]: I1008 18:11:48.305507 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:48 crc kubenswrapper[4750]: I1008 18:11:48.305607 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:48 crc kubenswrapper[4750]: I1008 18:11:48.305627 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:48 crc kubenswrapper[4750]: I1008 18:11:48.305705 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:48 crc kubenswrapper[4750]: I1008 18:11:48.305725 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:48Z","lastTransitionTime":"2025-10-08T18:11:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:48 crc kubenswrapper[4750]: I1008 18:11:48.408588 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:48 crc kubenswrapper[4750]: I1008 18:11:48.408642 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:48 crc kubenswrapper[4750]: I1008 18:11:48.408654 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:48 crc kubenswrapper[4750]: I1008 18:11:48.408669 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:48 crc kubenswrapper[4750]: I1008 18:11:48.408680 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:48Z","lastTransitionTime":"2025-10-08T18:11:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:48 crc kubenswrapper[4750]: I1008 18:11:48.510904 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:48 crc kubenswrapper[4750]: I1008 18:11:48.510937 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:48 crc kubenswrapper[4750]: I1008 18:11:48.510947 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:48 crc kubenswrapper[4750]: I1008 18:11:48.510961 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:48 crc kubenswrapper[4750]: I1008 18:11:48.510970 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:48Z","lastTransitionTime":"2025-10-08T18:11:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:48 crc kubenswrapper[4750]: I1008 18:11:48.613007 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:48 crc kubenswrapper[4750]: I1008 18:11:48.613054 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:48 crc kubenswrapper[4750]: I1008 18:11:48.613065 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:48 crc kubenswrapper[4750]: I1008 18:11:48.613081 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:48 crc kubenswrapper[4750]: I1008 18:11:48.613092 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:48Z","lastTransitionTime":"2025-10-08T18:11:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:48 crc kubenswrapper[4750]: I1008 18:11:48.715320 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:48 crc kubenswrapper[4750]: I1008 18:11:48.715356 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:48 crc kubenswrapper[4750]: I1008 18:11:48.715366 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:48 crc kubenswrapper[4750]: I1008 18:11:48.715398 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:48 crc kubenswrapper[4750]: I1008 18:11:48.715408 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:48Z","lastTransitionTime":"2025-10-08T18:11:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:48 crc kubenswrapper[4750]: I1008 18:11:48.733834 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:11:48 crc kubenswrapper[4750]: I1008 18:11:48.733902 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:11:48 crc kubenswrapper[4750]: E1008 18:11:48.733965 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:11:48 crc kubenswrapper[4750]: E1008 18:11:48.734037 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:11:48 crc kubenswrapper[4750]: I1008 18:11:48.817663 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:48 crc kubenswrapper[4750]: I1008 18:11:48.817706 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:48 crc kubenswrapper[4750]: I1008 18:11:48.817715 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:48 crc kubenswrapper[4750]: I1008 18:11:48.817730 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:48 crc kubenswrapper[4750]: I1008 18:11:48.817742 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:48Z","lastTransitionTime":"2025-10-08T18:11:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:48 crc kubenswrapper[4750]: I1008 18:11:48.919779 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:48 crc kubenswrapper[4750]: I1008 18:11:48.919839 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:48 crc kubenswrapper[4750]: I1008 18:11:48.919849 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:48 crc kubenswrapper[4750]: I1008 18:11:48.919887 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:48 crc kubenswrapper[4750]: I1008 18:11:48.919922 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:48Z","lastTransitionTime":"2025-10-08T18:11:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.022130 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.022164 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.022172 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.022186 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.022196 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:49Z","lastTransitionTime":"2025-10-08T18:11:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.124739 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.124789 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.124797 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.124810 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.124819 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:49Z","lastTransitionTime":"2025-10-08T18:11:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.226812 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.226847 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.226855 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.226869 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.226877 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:49Z","lastTransitionTime":"2025-10-08T18:11:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.329309 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.329356 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.329392 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.329408 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.329417 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:49Z","lastTransitionTime":"2025-10-08T18:11:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.432658 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.432705 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.432716 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.432732 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.432743 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:49Z","lastTransitionTime":"2025-10-08T18:11:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.536736 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.536816 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.536827 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.536846 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.536863 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:49Z","lastTransitionTime":"2025-10-08T18:11:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.639994 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.640045 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.640056 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.640074 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.640087 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:49Z","lastTransitionTime":"2025-10-08T18:11:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.734148 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.734243 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:11:49 crc kubenswrapper[4750]: E1008 18:11:49.734276 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:11:49 crc kubenswrapper[4750]: E1008 18:11:49.734373 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.742162 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.742194 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.742203 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.742218 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.742229 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:49Z","lastTransitionTime":"2025-10-08T18:11:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.844722 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.844759 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.844769 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.844782 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.844790 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:49Z","lastTransitionTime":"2025-10-08T18:11:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.946900 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.946943 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.946953 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.946967 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:49 crc kubenswrapper[4750]: I1008 18:11:49.946977 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:49Z","lastTransitionTime":"2025-10-08T18:11:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.049251 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.049289 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.049300 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.049315 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.049324 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:50Z","lastTransitionTime":"2025-10-08T18:11:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.062514 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b67ae9d5-e575-45e7-913a-01f379b86416-metrics-certs\") pod \"network-metrics-daemon-7f9jd\" (UID: \"b67ae9d5-e575-45e7-913a-01f379b86416\") " pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:11:50 crc kubenswrapper[4750]: E1008 18:11:50.062666 4750 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 18:11:50 crc kubenswrapper[4750]: E1008 18:11:50.062723 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b67ae9d5-e575-45e7-913a-01f379b86416-metrics-certs podName:b67ae9d5-e575-45e7-913a-01f379b86416 nodeName:}" failed. No retries permitted until 2025-10-08 18:12:22.062707402 +0000 UTC m=+97.975678415 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b67ae9d5-e575-45e7-913a-01f379b86416-metrics-certs") pod "network-metrics-daemon-7f9jd" (UID: "b67ae9d5-e575-45e7-913a-01f379b86416") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.151867 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.151903 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.151914 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.151931 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.151941 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:50Z","lastTransitionTime":"2025-10-08T18:11:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.254103 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.254140 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.254151 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.254167 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.254179 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:50Z","lastTransitionTime":"2025-10-08T18:11:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.356665 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.356703 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.356713 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.356729 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.356742 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:50Z","lastTransitionTime":"2025-10-08T18:11:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.459186 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.459238 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.459247 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.459261 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.459271 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:50Z","lastTransitionTime":"2025-10-08T18:11:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.561358 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.561414 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.561426 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.561446 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.561460 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:50Z","lastTransitionTime":"2025-10-08T18:11:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.663958 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.664007 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.664024 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.664041 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.664051 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:50Z","lastTransitionTime":"2025-10-08T18:11:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.734133 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:11:50 crc kubenswrapper[4750]: E1008 18:11:50.734281 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.734373 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:11:50 crc kubenswrapper[4750]: E1008 18:11:50.734968 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.765934 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.765977 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.765989 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.766006 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.766018 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:50Z","lastTransitionTime":"2025-10-08T18:11:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.868376 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.868417 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.868428 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.868445 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.868456 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:50Z","lastTransitionTime":"2025-10-08T18:11:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.971139 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.971171 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.971180 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.971195 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:50 crc kubenswrapper[4750]: I1008 18:11:50.971204 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:50Z","lastTransitionTime":"2025-10-08T18:11:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.072810 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.072839 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.072849 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.072861 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.072871 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:51Z","lastTransitionTime":"2025-10-08T18:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.174869 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.174901 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.174909 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.174921 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.174929 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:51Z","lastTransitionTime":"2025-10-08T18:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.277166 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.277199 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.277211 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.277227 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.277238 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:51Z","lastTransitionTime":"2025-10-08T18:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.379798 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.379842 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.379853 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.379869 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.379881 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:51Z","lastTransitionTime":"2025-10-08T18:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.481894 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.481950 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.481960 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.481981 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.481993 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:51Z","lastTransitionTime":"2025-10-08T18:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.584837 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.584880 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.584891 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.584907 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.584919 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:51Z","lastTransitionTime":"2025-10-08T18:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.590432 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.590512 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.590531 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.590572 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.590585 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:51Z","lastTransitionTime":"2025-10-08T18:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:51 crc kubenswrapper[4750]: E1008 18:11:51.605225 4750 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bfc00a00-da5b-4621-a04e-e20b47fefa95\\\",\\\"systemUUID\\\":\\\"4099aa14-e8f7-4bff-9f81-40284b959bbe\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:51Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.609373 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.609412 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.609420 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.609433 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.609446 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:51Z","lastTransitionTime":"2025-10-08T18:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:51 crc kubenswrapper[4750]: E1008 18:11:51.620932 4750 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bfc00a00-da5b-4621-a04e-e20b47fefa95\\\",\\\"systemUUID\\\":\\\"4099aa14-e8f7-4bff-9f81-40284b959bbe\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:51Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.624648 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.624689 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.624698 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.624715 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.624725 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:51Z","lastTransitionTime":"2025-10-08T18:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:51 crc kubenswrapper[4750]: E1008 18:11:51.636269 4750 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bfc00a00-da5b-4621-a04e-e20b47fefa95\\\",\\\"systemUUID\\\":\\\"4099aa14-e8f7-4bff-9f81-40284b959bbe\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:51Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.639370 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.639408 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.639420 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.639436 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.639445 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:51Z","lastTransitionTime":"2025-10-08T18:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:51 crc kubenswrapper[4750]: E1008 18:11:51.650142 4750 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bfc00a00-da5b-4621-a04e-e20b47fefa95\\\",\\\"systemUUID\\\":\\\"4099aa14-e8f7-4bff-9f81-40284b959bbe\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:51Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.653616 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.653648 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.653685 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.653700 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.653709 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:51Z","lastTransitionTime":"2025-10-08T18:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:51 crc kubenswrapper[4750]: E1008 18:11:51.664308 4750 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bfc00a00-da5b-4621-a04e-e20b47fefa95\\\",\\\"systemUUID\\\":\\\"4099aa14-e8f7-4bff-9f81-40284b959bbe\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:51Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:51 crc kubenswrapper[4750]: E1008 18:11:51.664455 4750 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.687003 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.687052 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.687063 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.687080 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.687093 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:51Z","lastTransitionTime":"2025-10-08T18:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.733667 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.733747 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:11:51 crc kubenswrapper[4750]: E1008 18:11:51.733805 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:11:51 crc kubenswrapper[4750]: E1008 18:11:51.733883 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.789981 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.790020 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.790029 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.790043 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.790052 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:51Z","lastTransitionTime":"2025-10-08T18:11:51Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.892346 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.892384 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.892396 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.892410 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.892419 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:51Z","lastTransitionTime":"2025-10-08T18:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.995277 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.995325 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.995334 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.995348 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:51 crc kubenswrapper[4750]: I1008 18:11:51.995359 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:51Z","lastTransitionTime":"2025-10-08T18:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.098305 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.098343 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.098352 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.098365 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.098375 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:52Z","lastTransitionTime":"2025-10-08T18:11:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.118260 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mzb5c_cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444/kube-multus/0.log" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.118335 4750 generic.go:334] "Generic (PLEG): container finished" podID="cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444" containerID="bc5b55dd8e3131ba21f9cb51433c01355475968171e092627964cc5a56fabb59" exitCode=1 Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.118363 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mzb5c" event={"ID":"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444","Type":"ContainerDied","Data":"bc5b55dd8e3131ba21f9cb51433c01355475968171e092627964cc5a56fabb59"} Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.118810 4750 scope.go:117] "RemoveContainer" containerID="bc5b55dd8e3131ba21f9cb51433c01355475968171e092627964cc5a56fabb59" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.136909 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2x8kt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9745a747-29eb-473f-bdb1-b526e1fe1445\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4026e6054000f0175162939f647586ff94806c727b2b80a9437767275bd819d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34195
4d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2x8kt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:52Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.159409 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d63a44-9fd7-4c19-8715-6ddec94d1806\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6524a192625117a12120213d645f06d61fc0f54ac6e381cd5ca0b4096b6c5a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6524a192625117a12120213d645f06d61fc0f54ac6e381cd5ca0b4096b6c5a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T18:11:32Z\\\",\\\"message\\\":\\\" 6372 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}\\\\nI1008 18:11:32.630778 6372 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-machine-config-operator/machine-config-daemon-grddb\\\\nI1008 18:11:32.630781 6372 services_controller.go:360] Finished syncing service metrics on namespace openshift-etcd-operator for network=default : 1.21394ms\\\\nI1008 18:11:32.630785 6372 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-7f9jd] creating logical port openshift-multus_network-metrics-daemon-7f9jd for pod on switch crc\\\\nI1008 18:11:32.630495 6372 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1008 18:11:32.630790 6372 services_controller.go:356] Processing sync for service openshift-controller-manager-operator/metrics for network=default\\\\nF1008 18:11:32.630791 6372 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to star\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rl7f4_openshift-ovn-kubernetes(25d63a44-9fd7-4c19-8715-6ddec94d1806)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34
d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rl7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:52Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.171339 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hdvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31a78d19-68f8-46b7-9c53-2bbda2930e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810ea3f875bc348e876d76c92a9ecb7e7541afb42bb26556051d021085c107c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84vv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hdvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:52Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.184790 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39b4e601279d1f37ba24fbb347b7f36255512601fd47c31a3bc686f2385cd88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://443e4db49275949020687bd8768b8a7745b5fdea4f5fb230d872b2a8ed9d6ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:52Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.195665 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6jln4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58294711-df24-43b8-b1b6-6617d55d49b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9a2cf606874110c681c7bbec359607eda7cabce0a44504ff21374021fa21a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78c7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6jln4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:52Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.200873 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.200911 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.200924 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.200941 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.200951 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:52Z","lastTransitionTime":"2025-10-08T18:11:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.209381 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83871b97-e7ad-4ee9-ab16-4e471c8dc3ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27de5d0a4e7203c8a6861925143be197b902b5c8d5031e26d1b732a31173c98b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d43e2deb74fdfedb40fdcc9b7f57f5
4f447c4f1493ece80186adb8e4624bbe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d72f137ccea9900279883c5b727b663c3ab23b09d2eab31abe8c6e4c536b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://155237e5ad4510cb1c106638bf4e87bb33102aed3005e5831f5f4232d0698ac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://155237e5ad4510cb1c106638bf4e87bb33102aed3005e5831f5f4232d0698ac8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:52Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.227801 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3096c4e9-1e62-46ee-8e9d-9d8f5c3d8f97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53389f30a75dc05f6b778b862fec384a0aefe7bd4b7884aad3af144bc8e5b87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7428f31e2b12470cc8d9fb98bb66e19ce665d028dae0b97adf318006043e2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77be59fa3ca71194c19f0050856d30f3b6f7122bd68e07535d0a8ca6cc299832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89305f32867a4d321e16ce84e9d4e34ec3bd7594db3700a4252ad0f874dd6d48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5029feed20c32b59a88eba1f7597a6689d99d93df97f2c5c94f4153de00170f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:52Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.241248 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dbbc4357a7bebc6b495e34c1eac5bd8fb1a52a16783ce31c8f3e78614a2dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:52Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.268918 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:52Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.286781 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:52Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.303388 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.303428 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.303438 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.303451 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.303461 4750 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:52Z","lastTransitionTime":"2025-10-08T18:11:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.305754 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7f9jd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b67ae9d5-e575-45e7-913a-01f379b86416\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhxt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhxt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7f9jd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:52Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:52 crc 
kubenswrapper[4750]: I1008 18:11:52.322618 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f40f5c0-574a-4dd1-a4dc-024eba0628e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f90b8eb6984a4ced369d75be15a4980f41e7ca1bd3c79a64a79c974da9dc2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eecb6b3a56b80
f2145ca258b9fc1473c42b44a4ded70f9ddc793868c3656191\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8961dba04ac6b2092efe4568b7f5d5afb5df6833c208d9484410ed6f8dcb13a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9df9b0a29bcccaeb1dcbdff5c85068802f2e46f39f39a63d09fe1386996c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b4f554ff77b65e8ff5a3cb8221e1c238434da5d49d308ec6684fcb2d333384\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 18:10:58.297055 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 18:10:58.298595 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1100232334/tls.crt::/tmp/serving-cert-1100232334/tls.key\\\\\\\"\\\\nI1008 18:11:03.775470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 18:11:03.786183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 18:11:03.786202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 18:11:03.786224 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 18:11:03.786230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 18:11:03.794993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 18:11:03.795026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 18:11:03.795033 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 18:11:03.795023 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 18:11:03.795038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 18:11:03.795061 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 18:11:03.795066 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 18:11:03.795070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 18:11:03.797195 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ece7dde88eaa94bbf9401909374a83d5841b24deb3f494aa3c709d055f6997e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:52Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.335915 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b53212dc-64c8-4b45-b425-b0bb2df4be9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535706571aabece4fcb32be30ca7e8e4732194bfe3511685909161fa0c06b1c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55916d7bc976fac242d0c48e65a606ba0f642461b47262b4b0d46076163810d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a005895ee7ff0fde6900a7b6f6b8928aad001f1aaa7ce650d77c11ce7acc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba40cf6292b171161b700726beb7b9546e2a239de4076618f1eacf17b75631fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:52Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.345372 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fd0b357c32562409dc3531c04e684de9fdd4bf05080368ff3537aef9324094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T18:11:52Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.355980 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f652a41fbc65313e94232c1b72bd7ccc0f53d7c510287ac9619ab713f770fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b75c94360320f8907419e9d7b9a237989702c6c66dd2e87832e18c2c4570c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-grddb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:52Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.365452 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mhhrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b11eea9-c866-4055-abed-9955637179b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7fa876044092370bd4f89fae3df8cd06b7d27515a90ecc0dd4393c870854253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7f95a79c34d04ab7881b283c072184f575
623df24dadf0b1bc71c05bbcb92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mhhrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:52Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.377253 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:52Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.387784 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzb5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc5b55dd8e3131ba21f9cb51433c01355475968171e092627964cc5a56fabb59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc5b55dd8e3131ba21f9cb51433c01355475968171e092627964cc5a56fabb59\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T18:11:51Z\\\",\\\"message\\\":\\\"2025-10-08T18:11:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_61ae17c4-6c37-4b05-b48a-241134d993dd\\\\n2025-10-08T18:11:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_61ae17c4-6c37-4b05-b48a-241134d993dd to /host/opt/cni/bin/\\\\n2025-10-08T18:11:06Z [verbose] multus-daemon started\\\\n2025-10-08T18:11:06Z [verbose] Readiness Indicator file check\\\\n2025-10-08T18:11:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzb5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:52Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.405562 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.405590 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.405599 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.405615 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.405624 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:52Z","lastTransitionTime":"2025-10-08T18:11:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.508307 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.508343 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.508351 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.508366 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.508375 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:52Z","lastTransitionTime":"2025-10-08T18:11:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.610663 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.610690 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.610697 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.610709 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.610717 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:52Z","lastTransitionTime":"2025-10-08T18:11:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.712409 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.712482 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.712491 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.712504 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.712513 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:52Z","lastTransitionTime":"2025-10-08T18:11:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.734081 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.734170 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:11:52 crc kubenswrapper[4750]: E1008 18:11:52.734209 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:11:52 crc kubenswrapper[4750]: E1008 18:11:52.734336 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.814905 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.814951 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.814963 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.814979 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.814990 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:52Z","lastTransitionTime":"2025-10-08T18:11:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.917576 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.917634 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.917647 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.917663 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:52 crc kubenswrapper[4750]: I1008 18:11:52.917675 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:52Z","lastTransitionTime":"2025-10-08T18:11:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.019743 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.019771 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.019781 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.019795 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.019803 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:53Z","lastTransitionTime":"2025-10-08T18:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.121218 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.121249 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.121257 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.121269 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.121278 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:53Z","lastTransitionTime":"2025-10-08T18:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.123197 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mzb5c_cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444/kube-multus/0.log" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.123242 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mzb5c" event={"ID":"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444","Type":"ContainerStarted","Data":"da637e4e5c4e6a32490a03baffc399a92670a4d4e08bf3f7f23f23ab4aed5c1e"} Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.134014 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f652a41fbc65313e94232c1b72bd7ccc0f53d7c510287ac9619ab713f770fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b75c94360320f8907419e9d7b9a237989702c6c66dd2e87832e18c2c4570c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-grddb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:53Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.144190 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mhhrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b11eea9-c866-4055-abed-9955637179b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7fa876044092370bd4f89fae3df8cd06b7d27515a90ecc0dd4393c870854253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2025-10-08T18:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7f95a79c34d04ab7881b283c072184f575623df24dadf0b1bc71c05bbcb92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mhhrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-08T18:11:53Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.152906 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7f9jd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b67ae9d5-e575-45e7-913a-01f379b86416\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhxt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhxt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7f9jd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:53Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:53 crc 
kubenswrapper[4750]: I1008 18:11:53.165448 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f40f5c0-574a-4dd1-a4dc-024eba0628e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f90b8eb6984a4ced369d75be15a4980f41e7ca1bd3c79a64a79c974da9dc2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eecb6b3a56b80
f2145ca258b9fc1473c42b44a4ded70f9ddc793868c3656191\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8961dba04ac6b2092efe4568b7f5d5afb5df6833c208d9484410ed6f8dcb13a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9df9b0a29bcccaeb1dcbdff5c85068802f2e46f39f39a63d09fe1386996c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b4f554ff77b65e8ff5a3cb8221e1c238434da5d49d308ec6684fcb2d333384\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 18:10:58.297055 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 18:10:58.298595 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1100232334/tls.crt::/tmp/serving-cert-1100232334/tls.key\\\\\\\"\\\\nI1008 18:11:03.775470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 18:11:03.786183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 18:11:03.786202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 18:11:03.786224 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 18:11:03.786230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 18:11:03.794993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 18:11:03.795026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 18:11:03.795033 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 18:11:03.795023 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 18:11:03.795038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 18:11:03.795061 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 18:11:03.795066 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 18:11:03.795070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 18:11:03.797195 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ece7dde88eaa94bbf9401909374a83d5841b24deb3f494aa3c709d055f6997e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:53Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.177934 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b53212dc-64c8-4b45-b425-b0bb2df4be9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535706571aabece4fcb32be30ca7e8e4732194bfe3511685909161fa0c06b1c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55916d7bc976fac242d0c48e65a606ba0f642461b47262b4b0d46076163810d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a005895ee7ff0fde6900a7b6f6b8928aad001f1aaa7ce650d77c11ce7acc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba40cf6292b171161b700726beb7b9546e2a239de4076618f1eacf17b75631fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:53Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.187930 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fd0b357c32562409dc3531c04e684de9fdd4bf05080368ff3537aef9324094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T18:11:53Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.202443 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:53Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.215000 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzb5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da637e4e5c4e6a32490a03baffc399a92670a4d4e08bf3f7f23f23ab4aed5c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc5b55dd8e3131ba21f9cb51433c01355475968171e092627964cc5a56fabb59\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T18:11:51Z\\\",\\\"message\\\":\\\"2025-10-08T18:11:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_61ae17c4-6c37-4b05-b48a-241134d993dd\\\\n2025-10-08T18:11:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_61ae17c4-6c37-4b05-b48a-241134d993dd to /host/opt/cni/bin/\\\\n2025-10-08T18:11:06Z [verbose] multus-daemon started\\\\n2025-10-08T18:11:06Z [verbose] 
Readiness Indicator file check\\\\n2025-10-08T18:11:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzb5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:53Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.223308 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.223446 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.223458 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.223472 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.223481 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:53Z","lastTransitionTime":"2025-10-08T18:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.227201 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2x8kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9745a747-29eb-473f-bdb1-b526e1fe1445\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4026e6054000f0175162939f647586ff94806c727b2b80a9437767275bd819d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2x8kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:53Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.244460 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d63a44-9fd7-4c19-8715-6ddec94d1806\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6524a192625117a12120213d645f06d61fc0f54ac6e381cd5ca0b4096b6c5a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6524a192625117a12120213d645f06d61fc0f54ac6e381cd5ca0b4096b6c5a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T18:11:32Z\\\",\\\"message\\\":\\\" 6372 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}\\\\nI1008 18:11:32.630778 6372 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-machine-config-operator/machine-config-daemon-grddb\\\\nI1008 18:11:32.630781 6372 services_controller.go:360] Finished syncing service metrics on namespace openshift-etcd-operator for network=default : 1.21394ms\\\\nI1008 18:11:32.630785 6372 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-7f9jd] creating logical port openshift-multus_network-metrics-daemon-7f9jd for pod on switch crc\\\\nI1008 18:11:32.630495 6372 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1008 18:11:32.630790 6372 services_controller.go:356] Processing sync for service openshift-controller-manager-operator/metrics for network=default\\\\nF1008 18:11:32.630791 6372 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to star\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rl7f4_openshift-ovn-kubernetes(25d63a44-9fd7-4c19-8715-6ddec94d1806)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34
d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rl7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:53Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.260730 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:53Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.273786 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:53Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.285177 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hdvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31a78d19-68f8-46b7-9c53-2bbda2930e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810ea3f875bc348e876d76c92a9ecb7e7541afb42bb26556051d021085c107c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84vv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hdvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:53Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.302616 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39b4e601279d1f37ba24fbb347b7f36255512601fd47c31a3bc686f2385cd88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://443e4db49275949020687bd8768b8a7745b5fdea4f5fb230d872b2a8ed9d6ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:53Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.313284 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6jln4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58294711-df24-43b8-b1b6-6617d55d49b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9a2cf606874110c681c7bbec359607eda7cabce0a44504ff21374021fa21a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78c7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6jln4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:53Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.325354 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83871b97-e7ad-4ee9-ab16-4e471c8dc3ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27de5d0a4e7203c8a6861925143be197b902b5c8d5031e26d1b732a31173c98b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d43e2deb74fdfedb40fdcc9b7f57f54f447c4f1493ece80186adb8e4624bbe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d72f137ccea9900279883c5b727b663c3ab23b09d2eab31abe8c6e4c536b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://155237e5ad4510cb1c106638bf4e87bb33102aed3005e5831f5f4232d0698ac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://155237e5ad4510cb1c106638bf4e87bb33102aed3005e5831f5f4232d0698ac8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:53Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.326730 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.326811 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.326827 4750 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.326849 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.326866 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:53Z","lastTransitionTime":"2025-10-08T18:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.342700 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3096c4e9-1e62-46ee-8e9d-9d8f5c3d8f97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53389f30a75dc05f6b778b862fec384a0aefe7bd4b7884aad3af144bc8e5b87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7428f31e2b12470cc8d9fb98bb66e19ce665d028dae0b97adf318006043e2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77be59fa3ca71194c19f0050856d30f3b6f7122bd68e07535d0a8ca6cc299832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58
408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89305f32867a4d321e16ce84e9d4e34ec3bd7594db3700a4252ad0f874dd6d48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5029feed20c32b59a88eba1f7597a6689d99d93df97f2c5c94f4153de00170f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62
ea8c91dfda9549b46f0306843c8033b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:53Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.359498 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dbbc4357a7bebc6b495e34c1eac5bd8fb1a52a16783ce31c8f3e78614a2dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:53Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.429117 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.429166 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.429175 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.429191 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.429202 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:53Z","lastTransitionTime":"2025-10-08T18:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.531835 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.531870 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.531879 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.531892 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.531901 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:53Z","lastTransitionTime":"2025-10-08T18:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.634203 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.634241 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.634252 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.634267 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.634278 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:53Z","lastTransitionTime":"2025-10-08T18:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.733539 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.733691 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:11:53 crc kubenswrapper[4750]: E1008 18:11:53.733796 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:11:53 crc kubenswrapper[4750]: E1008 18:11:53.733963 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.736623 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.736671 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.736688 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.736708 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.736724 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:53Z","lastTransitionTime":"2025-10-08T18:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.839946 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.839998 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.840012 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.840033 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.840046 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:53Z","lastTransitionTime":"2025-10-08T18:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.942269 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.942308 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.942318 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.942332 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:53 crc kubenswrapper[4750]: I1008 18:11:53.942342 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:53Z","lastTransitionTime":"2025-10-08T18:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.044282 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.044340 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.044349 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.044364 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.044374 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:54Z","lastTransitionTime":"2025-10-08T18:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.146814 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.146855 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.146865 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.146883 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.146893 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:54Z","lastTransitionTime":"2025-10-08T18:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.250217 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.250245 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.250257 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.250274 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.250282 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:54Z","lastTransitionTime":"2025-10-08T18:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.353343 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.353379 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.353392 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.353409 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.353419 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:54Z","lastTransitionTime":"2025-10-08T18:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.455475 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.455516 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.455525 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.455540 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.455571 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:54Z","lastTransitionTime":"2025-10-08T18:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.557292 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.557353 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.557370 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.557393 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.557409 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:54Z","lastTransitionTime":"2025-10-08T18:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.659436 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.659474 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.659483 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.659501 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.659513 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:54Z","lastTransitionTime":"2025-10-08T18:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.733720 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.733743 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:11:54 crc kubenswrapper[4750]: E1008 18:11:54.733848 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:11:54 crc kubenswrapper[4750]: E1008 18:11:54.733978 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.746679 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b53212dc-64c8-4b45-b425-b0bb2df4be9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535706571aabece4fcb32be30ca7e8e4732194bfe3511685909161fa0c06b1c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55916d7bc976fac242d0c48e65a606ba0f642461b47262b4b0d46076163810d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a005895ee7ff0fde6900a7b6f6b8928aad001f1aaa7ce650d77c11ce7acc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba40cf6292b171161b700726beb7b9546e2a239de4076618f1eacf17b75631fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:54Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.756570 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fd0b357c32562409dc3531c04e684de9fdd4bf05080368ff3537aef9324094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T18:11:54Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.761860 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.761904 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.761914 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.761927 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.761936 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:54Z","lastTransitionTime":"2025-10-08T18:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.765435 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f652a41fbc65313e94232c1b72bd7ccc0f53d7c510287ac9619ab713f770fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b75c94360320f8907419e9d7b9a237989702c6c66dd2e87832e18c2c4570c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-grddb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:54Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.774168 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mhhrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b11eea9-c866-4055-abed-9955637179b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7fa876044092370bd4f89fae3df8cd06b7d27515a90ecc0dd4393c870854253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7f95a79c34d04ab7881b283c072184f575
623df24dadf0b1bc71c05bbcb92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mhhrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:54Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.782272 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7f9jd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b67ae9d5-e575-45e7-913a-01f379b86416\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhxt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhxt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7f9jd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:54Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:54 crc 
kubenswrapper[4750]: I1008 18:11:54.793230 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f40f5c0-574a-4dd1-a4dc-024eba0628e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f90b8eb6984a4ced369d75be15a4980f41e7ca1bd3c79a64a79c974da9dc2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eecb6b3a56b80
f2145ca258b9fc1473c42b44a4ded70f9ddc793868c3656191\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8961dba04ac6b2092efe4568b7f5d5afb5df6833c208d9484410ed6f8dcb13a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9df9b0a29bcccaeb1dcbdff5c85068802f2e46f39f39a63d09fe1386996c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b4f554ff77b65e8ff5a3cb8221e1c238434da5d49d308ec6684fcb2d333384\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 18:10:58.297055 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 18:10:58.298595 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1100232334/tls.crt::/tmp/serving-cert-1100232334/tls.key\\\\\\\"\\\\nI1008 18:11:03.775470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 18:11:03.786183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 18:11:03.786202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 18:11:03.786224 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 18:11:03.786230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 18:11:03.794993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 18:11:03.795026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 18:11:03.795033 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 18:11:03.795023 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 18:11:03.795038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 18:11:03.795061 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 18:11:03.795066 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 18:11:03.795070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 18:11:03.797195 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ece7dde88eaa94bbf9401909374a83d5841b24deb3f494aa3c709d055f6997e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:54Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.804037 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:54Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.815250 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzb5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da637e4e5c4e6a32490a03baffc399a92670a4d4e08bf3f7f23f23ab4aed5c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc5b55dd8e3131ba21f9cb51433c01355475968171e092627964cc5a56fabb59\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T18:11:51Z\\\",\\\"message\\\":\\\"2025-10-08T18:11:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_61ae17c4-6c37-4b05-b48a-241134d993dd\\\\n2025-10-08T18:11:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_61ae17c4-6c37-4b05-b48a-241134d993dd to /host/opt/cni/bin/\\\\n2025-10-08T18:11:06Z [verbose] multus-daemon started\\\\n2025-10-08T18:11:06Z [verbose] 
Readiness Indicator file check\\\\n2025-10-08T18:11:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzb5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:54Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.835933 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d63a44-9fd7-4c19-8715-6ddec94d1806\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6524a192625117a12120213d645f06d61fc0f54ac6e381cd5ca0b4096b6c5a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6524a192625117a12120213d645f06d61fc0f54ac6e381cd5ca0b4096b6c5a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T18:11:32Z\\\",\\\"message\\\":\\\" 6372 loadbalancer.go:304] Deleted 0 stale LBs for 
map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}\\\\nI1008 18:11:32.630778 6372 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/machine-config-daemon-grddb\\\\nI1008 18:11:32.630781 6372 services_controller.go:360] Finished syncing service metrics on namespace openshift-etcd-operator for network=default : 1.21394ms\\\\nI1008 18:11:32.630785 6372 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-7f9jd] creating logical port openshift-multus_network-metrics-daemon-7f9jd for pod on switch crc\\\\nI1008 18:11:32.630495 6372 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1008 18:11:32.630790 6372 services_controller.go:356] Processing sync for service openshift-controller-manager-operator/metrics for network=default\\\\nF1008 18:11:32.630791 6372 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to star\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rl7f4_openshift-ovn-kubernetes(25d63a44-9fd7-4c19-8715-6ddec94d1806)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34
d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rl7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:54Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.850157 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2x8kt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9745a747-29eb-473f-bdb1-b526e1fe1445\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4026e6054000f0175162939f647586ff94806c727b2b80a9437767275bd819d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34195
4d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2x8kt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:54Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.864049 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.864126 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.864137 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.864150 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.864159 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:54Z","lastTransitionTime":"2025-10-08T18:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.873399 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3096c4e9-1e62-46ee-8e9d-9d8f5c3d8f97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53389f30a75dc05f6b778b862fec384a0aefe7bd4b7884aad3af144bc8e5b87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7428f31e2b12470cc8d9fb98bb66e19ce665d028dae0b97adf318006043e2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77be59fa3ca71194c19f0050856d30f3b6f7122bd68e07535d0a8ca6cc299832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89305f32867a4d321e16ce84e9d4e34ec3bd7594db3700a4252ad0f874dd6d48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5029feed20c32b59a88eba1f7597a6689d99d93df97f2c5c94f4153de00170f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:54Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.884285 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dbbc4357a7bebc6b495e34c1eac5bd8fb1a52a16783ce31c8f3e78614a2dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:54Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.893736 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:54Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.904360 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:54Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.913601 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hdvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31a78d19-68f8-46b7-9c53-2bbda2930e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810ea3f875bc348e876d76c92a9ecb7e7541afb42bb26556051d021085c107c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84vv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hdvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:54Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.924498 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39b4e601279d1f37ba24fbb347b7f36255512601fd47c31a3bc686f2385cd88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://443e4db49275949020687bd8768b8a7745b5fdea4f5fb230d872b2a8ed9d6ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:54Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.933054 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6jln4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58294711-df24-43b8-b1b6-6617d55d49b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9a2cf606874110c681c7bbec359607eda7cabce0a44504ff21374021fa21a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78c7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6jln4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:54Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.943165 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83871b97-e7ad-4ee9-ab16-4e471c8dc3ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27de5d0a4e7203c8a6861925143be197b902b5c8d5031e26d1b732a31173c98b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d43e2deb74fdfedb40fdcc9b7f57f54f447c4f1493ece80186adb8e4624bbe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d72f137ccea9900279883c5b727b663c3ab23b09d2eab31abe8c6e4c536b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://155237e5ad4510cb1c106638bf4e87bb33102aed3005e5831f5f4232d0698ac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://155237e5ad4510cb1c106638bf4e87bb33102aed3005e5831f5f4232d0698ac8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:54Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.966829 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.966872 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.966882 4750 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.966897 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:54 crc kubenswrapper[4750]: I1008 18:11:54.966909 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:54Z","lastTransitionTime":"2025-10-08T18:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.069331 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.069366 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.069374 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.069387 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.069395 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:55Z","lastTransitionTime":"2025-10-08T18:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.170767 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.170796 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.170804 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.170816 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.170825 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:55Z","lastTransitionTime":"2025-10-08T18:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.273224 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.273278 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.273296 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.273318 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.273336 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:55Z","lastTransitionTime":"2025-10-08T18:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.375184 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.375219 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.375227 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.375240 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.375248 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:55Z","lastTransitionTime":"2025-10-08T18:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.478090 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.478141 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.478157 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.478180 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.478196 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:55Z","lastTransitionTime":"2025-10-08T18:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.580579 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.580609 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.580624 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.580645 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.580655 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:55Z","lastTransitionTime":"2025-10-08T18:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.683185 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.683257 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.683274 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.683300 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.683320 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:55Z","lastTransitionTime":"2025-10-08T18:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.733431 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.733493 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:11:55 crc kubenswrapper[4750]: E1008 18:11:55.733571 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:11:55 crc kubenswrapper[4750]: E1008 18:11:55.733678 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.785493 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.785537 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.785572 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.785588 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.785598 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:55Z","lastTransitionTime":"2025-10-08T18:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.888007 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.888037 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.888045 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.888057 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.888066 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:55Z","lastTransitionTime":"2025-10-08T18:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.990267 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.990302 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.990310 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.990323 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:55 crc kubenswrapper[4750]: I1008 18:11:55.990333 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:55Z","lastTransitionTime":"2025-10-08T18:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:56 crc kubenswrapper[4750]: I1008 18:11:56.092320 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:56 crc kubenswrapper[4750]: I1008 18:11:56.092355 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:56 crc kubenswrapper[4750]: I1008 18:11:56.092364 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:56 crc kubenswrapper[4750]: I1008 18:11:56.092376 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:56 crc kubenswrapper[4750]: I1008 18:11:56.092385 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:56Z","lastTransitionTime":"2025-10-08T18:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:56 crc kubenswrapper[4750]: I1008 18:11:56.194353 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:56 crc kubenswrapper[4750]: I1008 18:11:56.194391 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:56 crc kubenswrapper[4750]: I1008 18:11:56.194400 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:56 crc kubenswrapper[4750]: I1008 18:11:56.194414 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:56 crc kubenswrapper[4750]: I1008 18:11:56.194423 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:56Z","lastTransitionTime":"2025-10-08T18:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:56 crc kubenswrapper[4750]: I1008 18:11:56.297207 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:56 crc kubenswrapper[4750]: I1008 18:11:56.297248 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:56 crc kubenswrapper[4750]: I1008 18:11:56.297258 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:56 crc kubenswrapper[4750]: I1008 18:11:56.297273 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:56 crc kubenswrapper[4750]: I1008 18:11:56.297284 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:56Z","lastTransitionTime":"2025-10-08T18:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:56 crc kubenswrapper[4750]: I1008 18:11:56.400017 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:56 crc kubenswrapper[4750]: I1008 18:11:56.400060 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:56 crc kubenswrapper[4750]: I1008 18:11:56.400072 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:56 crc kubenswrapper[4750]: I1008 18:11:56.400087 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:56 crc kubenswrapper[4750]: I1008 18:11:56.400099 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:56Z","lastTransitionTime":"2025-10-08T18:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:56 crc kubenswrapper[4750]: I1008 18:11:56.504075 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:56 crc kubenswrapper[4750]: I1008 18:11:56.504128 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:56 crc kubenswrapper[4750]: I1008 18:11:56.504140 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:56 crc kubenswrapper[4750]: I1008 18:11:56.504157 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:56 crc kubenswrapper[4750]: I1008 18:11:56.504169 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:56Z","lastTransitionTime":"2025-10-08T18:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:56 crc kubenswrapper[4750]: I1008 18:11:56.606612 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:56 crc kubenswrapper[4750]: I1008 18:11:56.606658 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:56 crc kubenswrapper[4750]: I1008 18:11:56.606670 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:56 crc kubenswrapper[4750]: I1008 18:11:56.606687 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:56 crc kubenswrapper[4750]: I1008 18:11:56.606699 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:56Z","lastTransitionTime":"2025-10-08T18:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:56 crc kubenswrapper[4750]: I1008 18:11:56.709246 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:56 crc kubenswrapper[4750]: I1008 18:11:56.709334 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:56 crc kubenswrapper[4750]: I1008 18:11:56.709359 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:56 crc kubenswrapper[4750]: I1008 18:11:56.709393 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:56 crc kubenswrapper[4750]: I1008 18:11:56.709418 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:56Z","lastTransitionTime":"2025-10-08T18:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:56 crc kubenswrapper[4750]: I1008 18:11:56.734070 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:11:56 crc kubenswrapper[4750]: E1008 18:11:56.734242 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:11:56 crc kubenswrapper[4750]: I1008 18:11:56.734344 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:11:56 crc kubenswrapper[4750]: E1008 18:11:56.734600 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:11:56 crc kubenswrapper[4750]: I1008 18:11:56.813034 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:56 crc kubenswrapper[4750]: I1008 18:11:56.813084 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:56 crc kubenswrapper[4750]: I1008 18:11:56.813096 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:56 crc kubenswrapper[4750]: I1008 18:11:56.813115 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:56 crc kubenswrapper[4750]: I1008 18:11:56.813129 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:56Z","lastTransitionTime":"2025-10-08T18:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:56 crc kubenswrapper[4750]: I1008 18:11:56.915375 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:56 crc kubenswrapper[4750]: I1008 18:11:56.915435 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:56 crc kubenswrapper[4750]: I1008 18:11:56.915451 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:56 crc kubenswrapper[4750]: I1008 18:11:56.915474 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:56 crc kubenswrapper[4750]: I1008 18:11:56.915491 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:56Z","lastTransitionTime":"2025-10-08T18:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.017530 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.017581 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.017590 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.017602 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.017611 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:57Z","lastTransitionTime":"2025-10-08T18:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.120014 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.120053 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.120061 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.120075 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.120085 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:57Z","lastTransitionTime":"2025-10-08T18:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.222822 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.222856 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.222865 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.222878 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.222888 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:57Z","lastTransitionTime":"2025-10-08T18:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.325163 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.325207 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.325216 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.325286 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.325298 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:57Z","lastTransitionTime":"2025-10-08T18:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.428047 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.428091 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.428103 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.428119 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.428133 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:57Z","lastTransitionTime":"2025-10-08T18:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.530440 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.530481 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.530491 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.530507 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.530518 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:57Z","lastTransitionTime":"2025-10-08T18:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.633164 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.633222 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.633240 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.633261 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.633276 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:57Z","lastTransitionTime":"2025-10-08T18:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.733667 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.733676 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:11:57 crc kubenswrapper[4750]: E1008 18:11:57.733805 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:11:57 crc kubenswrapper[4750]: E1008 18:11:57.734030 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.735617 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.735672 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.735690 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.735708 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.735723 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:57Z","lastTransitionTime":"2025-10-08T18:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.838069 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.838107 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.838118 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.838135 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.838148 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:57Z","lastTransitionTime":"2025-10-08T18:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.939959 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.940000 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.940013 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.940031 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:57 crc kubenswrapper[4750]: I1008 18:11:57.940044 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:57Z","lastTransitionTime":"2025-10-08T18:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.041893 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.041936 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.041949 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.041964 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.041975 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:58Z","lastTransitionTime":"2025-10-08T18:11:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.144685 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.144726 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.144733 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.144747 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.144755 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:58Z","lastTransitionTime":"2025-10-08T18:11:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.247922 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.247954 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.247965 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.247978 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.247987 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:58Z","lastTransitionTime":"2025-10-08T18:11:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.350196 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.350236 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.350245 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.350261 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.350270 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:58Z","lastTransitionTime":"2025-10-08T18:11:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.452010 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.452057 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.452071 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.452086 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.452096 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:58Z","lastTransitionTime":"2025-10-08T18:11:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.554356 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.554409 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.554418 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.554430 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.554439 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:58Z","lastTransitionTime":"2025-10-08T18:11:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.657309 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.657363 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.657375 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.657392 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.657405 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:58Z","lastTransitionTime":"2025-10-08T18:11:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.734204 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.734235 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:11:58 crc kubenswrapper[4750]: E1008 18:11:58.734334 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:11:58 crc kubenswrapper[4750]: E1008 18:11:58.734431 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.734982 4750 scope.go:117] "RemoveContainer" containerID="c6524a192625117a12120213d645f06d61fc0f54ac6e381cd5ca0b4096b6c5a8" Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.759890 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.759920 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.759927 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.759939 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.759947 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:58Z","lastTransitionTime":"2025-10-08T18:11:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.862321 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.862645 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.862659 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.862675 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.862686 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:58Z","lastTransitionTime":"2025-10-08T18:11:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.965481 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.965526 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.965537 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.965586 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:58 crc kubenswrapper[4750]: I1008 18:11:58.965601 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:58Z","lastTransitionTime":"2025-10-08T18:11:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.067954 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.067987 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.067998 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.068014 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.068024 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:59Z","lastTransitionTime":"2025-10-08T18:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.140631 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rl7f4_25d63a44-9fd7-4c19-8715-6ddec94d1806/ovnkube-controller/2.log" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.142840 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" event={"ID":"25d63a44-9fd7-4c19-8715-6ddec94d1806","Type":"ContainerStarted","Data":"ff4be8fc1ad41cb1ad19143f14269fbe77825e622005ecab4d4731e0f3ed6cdd"} Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.143218 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.153513 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f652a4
1fbc65313e94232c1b72bd7ccc0f53d7c510287ac9619ab713f770fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b75c94360320f8907419e9d7b9a237989702c6c66dd2e87832e18c2c4570c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-grddb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:59Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.165018 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mhhrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b11eea9-c866-4055-abed-9955637179b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7fa876044092370bd4f89fae3df8cd06b7d27515a90ecc0dd4393c870854253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e
18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7f95a79c34d04ab7881b283c072184f575623df24dadf0b1bc71c05bbcb92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mhhrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:59Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.170668 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.170697 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.170707 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.170723 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.170733 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:59Z","lastTransitionTime":"2025-10-08T18:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.174686 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7f9jd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b67ae9d5-e575-45e7-913a-01f379b86416\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhxt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhxt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7f9jd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:59Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:59 crc 
kubenswrapper[4750]: I1008 18:11:59.185848 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f40f5c0-574a-4dd1-a4dc-024eba0628e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f90b8eb6984a4ced369d75be15a4980f41e7ca1bd3c79a64a79c974da9dc2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eecb6b3a56b80
f2145ca258b9fc1473c42b44a4ded70f9ddc793868c3656191\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8961dba04ac6b2092efe4568b7f5d5afb5df6833c208d9484410ed6f8dcb13a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9df9b0a29bcccaeb1dcbdff5c85068802f2e46f39f39a63d09fe1386996c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b4f554ff77b65e8ff5a3cb8221e1c238434da5d49d308ec6684fcb2d333384\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 18:10:58.297055 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 18:10:58.298595 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1100232334/tls.crt::/tmp/serving-cert-1100232334/tls.key\\\\\\\"\\\\nI1008 18:11:03.775470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 18:11:03.786183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 18:11:03.786202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 18:11:03.786224 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 18:11:03.786230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 18:11:03.794993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 18:11:03.795026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 18:11:03.795033 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 18:11:03.795023 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 18:11:03.795038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 18:11:03.795061 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 18:11:03.795066 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 18:11:03.795070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 18:11:03.797195 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ece7dde88eaa94bbf9401909374a83d5841b24deb3f494aa3c709d055f6997e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:59Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.197277 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b53212dc-64c8-4b45-b425-b0bb2df4be9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535706571aabece4fcb32be30ca7e8e4732194bfe3511685909161fa0c06b1c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55916d7bc976fac242d0c48e65a606ba0f642461b47262b4b0d46076163810d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a005895ee7ff0fde6900a7b6f6b8928aad001f1aaa7ce650d77c11ce7acc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba40cf6292b171161b700726beb7b9546e2a239de4076618f1eacf17b75631fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:59Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.207886 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fd0b357c32562409dc3531c04e684de9fdd4bf05080368ff3537aef9324094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T18:11:59Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.220409 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:59Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.232447 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzb5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da637e4e5c4e6a32490a03baffc399a92670a4d4e08bf3f7f23f23ab4aed5c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc5b55dd8e3131ba21f9cb51433c01355475968171e092627964cc5a56fabb59\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T18:11:51Z\\\",\\\"message\\\":\\\"2025-10-08T18:11:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_61ae17c4-6c37-4b05-b48a-241134d993dd\\\\n2025-10-08T18:11:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_61ae17c4-6c37-4b05-b48a-241134d993dd to /host/opt/cni/bin/\\\\n2025-10-08T18:11:06Z [verbose] multus-daemon started\\\\n2025-10-08T18:11:06Z [verbose] 
Readiness Indicator file check\\\\n2025-10-08T18:11:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzb5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:59Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.247011 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2x8kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9745a747-29eb-473f-bdb1-b526e1fe1445\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://402
6e6054000f0175162939f647586ff94806c727b2b80a9437767275bd819d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2x8kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:59Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.264789 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d63a44-9fd7-4c19-8715-6ddec94d1806\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4be8fc1ad41cb1ad19143f14269fbe77825e622005ecab4d4731e0f3ed6cdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6524a192625117a12120213d645f06d61fc0f54ac6e381cd5ca0b4096b6c5a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T18:11:32Z\\\",\\\"message\\\":\\\" 6372 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}\\\\nI1008 18:11:32.630778 6372 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-machine-config-operator/machine-config-daemon-grddb\\\\nI1008 18:11:32.630781 6372 services_controller.go:360] Finished syncing service metrics on namespace openshift-etcd-operator for network=default : 1.21394ms\\\\nI1008 18:11:32.630785 6372 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-7f9jd] creating logical port openshift-multus_network-metrics-daemon-7f9jd for pod on switch crc\\\\nI1008 18:11:32.630495 6372 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1008 18:11:32.630790 6372 services_controller.go:356] Processing sync for service openshift-controller-manager-operator/metrics for network=default\\\\nF1008 18:11:32.630791 6372 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to 
star\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\
\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rl7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:59Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.272780 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.272817 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.272826 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.272842 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.272853 4750 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:59Z","lastTransitionTime":"2025-10-08T18:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.317238 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:59Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.329011 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:59Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.339601 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hdvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31a78d19-68f8-46b7-9c53-2bbda2930e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810ea3f875bc348e876d76c92a9ecb7e7541afb42bb26556051d021085c107c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84vv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hdvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:59Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.354317 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39b4e601279d1f37ba24fbb347b7f36255512601fd47c31a3bc686f2385cd88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://443e4db49275949020687bd8768b8a7745b5fdea4f5fb230d872b2a8ed9d6ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:59Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.366837 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6jln4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58294711-df24-43b8-b1b6-6617d55d49b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9a2cf606874110c681c7bbec359607eda7cabce0a44504ff21374021fa21a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78c7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6jln4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:59Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.375239 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.375281 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.375289 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.375305 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.375314 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:59Z","lastTransitionTime":"2025-10-08T18:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.379521 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83871b97-e7ad-4ee9-ab16-4e471c8dc3ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27de5d0a4e7203c8a6861925143be197b902b5c8d5031e26d1b732a31173c98b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d43e2deb74fdfedb40fdcc9b7f57f5
4f447c4f1493ece80186adb8e4624bbe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d72f137ccea9900279883c5b727b663c3ab23b09d2eab31abe8c6e4c536b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://155237e5ad4510cb1c106638bf4e87bb33102aed3005e5831f5f4232d0698ac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://155237e5ad4510cb1c106638bf4e87bb33102aed3005e5831f5f4232d0698ac8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:59Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.396253 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3096c4e9-1e62-46ee-8e9d-9d8f5c3d8f97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53389f30a75dc05f6b778b862fec384a0aefe7bd4b7884aad3af144bc8e5b87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7428f31e2b12470cc8d9fb98bb66e19ce665d028dae0b97adf318006043e2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77be59fa3ca71194c19f0050856d30f3b6f7122bd68e07535d0a8ca6cc299832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89305f32867a4d321e16ce84e9d4e34ec3bd7594db3700a4252ad0f874dd6d48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5029feed20c32b59a88eba1f7597a6689d99d93df97f2c5c94f4153de00170f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:59Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.408483 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dbbc4357a7bebc6b495e34c1eac5bd8fb1a52a16783ce31c8f3e78614a2dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:11:59Z is after 2025-08-24T17:21:41Z" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.477211 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.477249 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.477257 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.477270 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.477279 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:59Z","lastTransitionTime":"2025-10-08T18:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.579913 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.579950 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.579962 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.579979 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.579988 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:59Z","lastTransitionTime":"2025-10-08T18:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.682239 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.682266 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.682275 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.682289 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.682298 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:59Z","lastTransitionTime":"2025-10-08T18:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.734137 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.734184 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:11:59 crc kubenswrapper[4750]: E1008 18:11:59.734251 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:11:59 crc kubenswrapper[4750]: E1008 18:11:59.734384 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.784080 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.784148 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.784169 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.784201 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.784222 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:59Z","lastTransitionTime":"2025-10-08T18:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.891311 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.891372 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.891387 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.891409 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.891423 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:59Z","lastTransitionTime":"2025-10-08T18:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.993908 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.993971 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.993983 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.993997 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:11:59 crc kubenswrapper[4750]: I1008 18:11:59.994008 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:11:59Z","lastTransitionTime":"2025-10-08T18:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.095834 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.095947 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.095964 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.095985 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.096000 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:00Z","lastTransitionTime":"2025-10-08T18:12:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.146766 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rl7f4_25d63a44-9fd7-4c19-8715-6ddec94d1806/ovnkube-controller/3.log" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.147375 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rl7f4_25d63a44-9fd7-4c19-8715-6ddec94d1806/ovnkube-controller/2.log" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.149845 4750 generic.go:334] "Generic (PLEG): container finished" podID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerID="ff4be8fc1ad41cb1ad19143f14269fbe77825e622005ecab4d4731e0f3ed6cdd" exitCode=1 Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.149887 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" event={"ID":"25d63a44-9fd7-4c19-8715-6ddec94d1806","Type":"ContainerDied","Data":"ff4be8fc1ad41cb1ad19143f14269fbe77825e622005ecab4d4731e0f3ed6cdd"} Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.149923 4750 scope.go:117] "RemoveContainer" containerID="c6524a192625117a12120213d645f06d61fc0f54ac6e381cd5ca0b4096b6c5a8" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.150513 4750 scope.go:117] "RemoveContainer" containerID="ff4be8fc1ad41cb1ad19143f14269fbe77825e622005ecab4d4731e0f3ed6cdd" Oct 08 18:12:00 crc kubenswrapper[4750]: E1008 18:12:00.150705 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-rl7f4_openshift-ovn-kubernetes(25d63a44-9fd7-4c19-8715-6ddec94d1806)\"" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.166740 4750 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-mzb5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da637e4e5c4e6a32490a03baffc399a92670a4d4e08bf3f7f23f23ab4aed5c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc5b55dd8e3131ba21f9cb51433c01355475968171e092627964cc5a56fabb59\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T18:11:51Z\\\",\\\"message\\\":\\\"2025-10-08T18:11:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_61ae17c4-6c37-4b05-b48a-241134d993dd\\\\n2025-10-08T18:11:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_61ae17c4-6c37-4b05-b48a-241134d993dd to 
/host/opt/cni/bin/\\\\n2025-10-08T18:11:06Z [verbose] multus-daemon started\\\\n2025-10-08T18:11:06Z [verbose] Readiness Indicator file check\\\\n2025-10-08T18:11:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-cert
s\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzb5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:00Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.183298 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:00Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.197948 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.197976 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.197984 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:00 crc 
kubenswrapper[4750]: I1008 18:12:00.197997 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.198005 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:00Z","lastTransitionTime":"2025-10-08T18:12:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.203195 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2x8kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9745a747-29eb-473f-bdb1-b526e1fe1445\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4026e6054000f0175162939f647586ff94806c727b2b80a9437767275bd819d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d
beecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2x8kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:00Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.225013 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d63a44-9fd7-4c19-8715-6ddec94d1806\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4be8fc1ad41cb1ad19143f14269fbe77825e622005ecab4d4731e0f3ed6cdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6524a192625117a12120213d645f06d61fc0f54ac6e381cd5ca0b4096b6c5a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T18:11:32Z\\\",\\\"message\\\":\\\" 6372 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}\\\\nI1008 18:11:32.630778 6372 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-machine-config-operator/machine-config-daemon-grddb\\\\nI1008 18:11:32.630781 6372 services_controller.go:360] Finished syncing service metrics on namespace openshift-etcd-operator for network=default : 1.21394ms\\\\nI1008 18:11:32.630785 6372 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-7f9jd] creating logical port openshift-multus_network-metrics-daemon-7f9jd for pod on switch crc\\\\nI1008 18:11:32.630495 6372 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1008 18:11:32.630790 6372 services_controller.go:356] Processing sync for service openshift-controller-manager-operator/metrics for network=default\\\\nF1008 18:11:32.630791 6372 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to star\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff4be8fc1ad41cb1ad19143f14269fbe77825e622005ecab4d4731e0f3ed6cdd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T18:11:59Z\\\",\\\"message\\\":\\\".go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1008 18:11:59.508256 6728 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1008 18:11:59.508322 6728 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1008 18:11:59.508375 6728 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1008 18:11:59.508384 6728 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1008 18:11:59.508450 6728 
handler.go:208] Removed *v1.Pod event handler 3\\\\nI1008 18:11:59.508505 6728 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1008 18:11:59.508536 6728 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1008 18:11:59.508540 6728 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1008 18:11:59.509226 6728 handler.go:208] Removed *v1.Node event handler 2\\\\nI1008 18:11:59.509268 6728 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1008 18:11:59.509317 6728 factory.go:656] Stopping watch factory\\\\nI1008 18:11:59.509334 6728 ovnkube.go:599] Stopped ovnkube\\\\nI1008 18:11:59.509373 6728 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1008 18:11:59.509380 6728 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1008 18:11:59.509395 6728 handler.go:208] Removed *v1.Node event handler 7\\\\nF1008 18:11:59.509448 6728 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"hos
t-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"
},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rl7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:00Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.239281 4750 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dbbc4357a7bebc6b495e34c1eac5bd8fb1a52a16783ce31c8f3e78614a2dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:00Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.252970 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:00Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.264653 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:00Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.273421 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hdvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31a78d19-68f8-46b7-9c53-2bbda2930e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810ea3f875bc348e876d76c92a9ecb7e7541afb42bb26556051d021085c107c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84vv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hdvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:00Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.284731 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39b4e601279d1f37ba24fbb347b7f36255512601fd47c31a3bc686f2385cd88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://443e4db49275949020687bd8768b8a7745b5fdea4f5fb230d872b2a8ed9d6ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:00Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.295512 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6jln4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58294711-df24-43b8-b1b6-6617d55d49b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9a2cf606874110c681c7bbec359607eda7cabce0a44504ff21374021fa21a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78c7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6jln4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:00Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.299805 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.299843 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.299855 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.299873 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.299885 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:00Z","lastTransitionTime":"2025-10-08T18:12:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.308324 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83871b97-e7ad-4ee9-ab16-4e471c8dc3ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27de5d0a4e7203c8a6861925143be197b902b5c8d5031e26d1b732a31173c98b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d43e2deb74fdfedb40fdcc9b7f57f5
4f447c4f1493ece80186adb8e4624bbe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d72f137ccea9900279883c5b727b663c3ab23b09d2eab31abe8c6e4c536b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://155237e5ad4510cb1c106638bf4e87bb33102aed3005e5831f5f4232d0698ac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://155237e5ad4510cb1c106638bf4e87bb33102aed3005e5831f5f4232d0698ac8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:00Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.328234 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3096c4e9-1e62-46ee-8e9d-9d8f5c3d8f97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53389f30a75dc05f6b778b862fec384a0aefe7bd4b7884aad3af144bc8e5b87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7428f31e2b12470cc8d9fb98bb66e19ce665d028dae0b97adf318006043e2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77be59fa3ca71194c19f0050856d30f3b6f7122bd68e07535d0a8ca6cc299832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89305f32867a4d321e16ce84e9d4e34ec3bd7594db3700a4252ad0f874dd6d48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5029feed20c32b59a88eba1f7597a6689d99d93df97f2c5c94f4153de00170f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:00Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.342640 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fd0b357c32562409dc3531c04e684de9fdd4bf05080368ff3537aef9324094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:00Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.353314 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f652a41fbc65313e94232c1b72bd7ccc0f53d7c510287ac9619ab713f770fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b75c94360320f8907419e9d7b9a237989702c6c
66dd2e87832e18c2c4570c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-grddb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:00Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.364205 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mhhrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b11eea9-c866-4055-abed-9955637179b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7fa876044092370bd4f89fae3df8cd06b7d27515a90ecc0dd4393c870854253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7f95a79c34d04ab7881b283c072184f575
623df24dadf0b1bc71c05bbcb92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mhhrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:00Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.376585 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7f9jd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b67ae9d5-e575-45e7-913a-01f379b86416\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhxt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhxt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7f9jd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:00Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:00 crc 
kubenswrapper[4750]: I1008 18:12:00.389473 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f40f5c0-574a-4dd1-a4dc-024eba0628e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f90b8eb6984a4ced369d75be15a4980f41e7ca1bd3c79a64a79c974da9dc2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eecb6b3a56b80
f2145ca258b9fc1473c42b44a4ded70f9ddc793868c3656191\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8961dba04ac6b2092efe4568b7f5d5afb5df6833c208d9484410ed6f8dcb13a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9df9b0a29bcccaeb1dcbdff5c85068802f2e46f39f39a63d09fe1386996c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b4f554ff77b65e8ff5a3cb8221e1c238434da5d49d308ec6684fcb2d333384\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 18:10:58.297055 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 18:10:58.298595 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1100232334/tls.crt::/tmp/serving-cert-1100232334/tls.key\\\\\\\"\\\\nI1008 18:11:03.775470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 18:11:03.786183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 18:11:03.786202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 18:11:03.786224 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 18:11:03.786230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 18:11:03.794993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 18:11:03.795026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 18:11:03.795033 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 18:11:03.795023 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 18:11:03.795038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 18:11:03.795061 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 18:11:03.795066 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 18:11:03.795070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 18:11:03.797195 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ece7dde88eaa94bbf9401909374a83d5841b24deb3f494aa3c709d055f6997e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:00Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.402527 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.402633 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.402652 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.402677 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.402694 4750 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:00Z","lastTransitionTime":"2025-10-08T18:12:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.403317 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b53212dc-64c8-4b45-b425-b0bb2df4be9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535706571aabece4fcb32be30ca7e8e4732194bfe3511685909161fa0c06b1c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55916d7bc976fac242d0c48e65a606ba0f642461b47262b4b0d46076163810d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a005895ee7ff0fde6900a7b6f6b8928aad001f1aaa7ce650d77c11ce7acc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://ba40cf6292b171161b700726beb7b9546e2a239de4076618f1eacf17b75631fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:00Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.505500 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.505537 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.505546 4750 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.505579 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.505588 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:00Z","lastTransitionTime":"2025-10-08T18:12:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.608252 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.608305 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.608316 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.608332 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.608341 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:00Z","lastTransitionTime":"2025-10-08T18:12:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.710619 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.710667 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.710677 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.710699 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.710713 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:00Z","lastTransitionTime":"2025-10-08T18:12:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.733127 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.733206 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:12:00 crc kubenswrapper[4750]: E1008 18:12:00.733286 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:12:00 crc kubenswrapper[4750]: E1008 18:12:00.733332 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.812721 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.812755 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.812764 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.812778 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.812802 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:00Z","lastTransitionTime":"2025-10-08T18:12:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.915986 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.916055 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.916077 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.916103 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:00 crc kubenswrapper[4750]: I1008 18:12:00.916126 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:00Z","lastTransitionTime":"2025-10-08T18:12:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.019613 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.019649 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.019663 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.019677 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.019688 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:01Z","lastTransitionTime":"2025-10-08T18:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.122238 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.122280 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.122291 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.122306 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.122317 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:01Z","lastTransitionTime":"2025-10-08T18:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.156154 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rl7f4_25d63a44-9fd7-4c19-8715-6ddec94d1806/ovnkube-controller/3.log" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.161645 4750 scope.go:117] "RemoveContainer" containerID="ff4be8fc1ad41cb1ad19143f14269fbe77825e622005ecab4d4731e0f3ed6cdd" Oct 08 18:12:01 crc kubenswrapper[4750]: E1008 18:12:01.162052 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-rl7f4_openshift-ovn-kubernetes(25d63a44-9fd7-4c19-8715-6ddec94d1806)\"" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.176015 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f652a41fbc65313e94232c1b72bd7ccc0f53d7c510287ac9619ab713f770fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b75c94360320f8907419e9d7b9a237989702c6c
66dd2e87832e18c2c4570c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-grddb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:01Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.189415 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mhhrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b11eea9-c866-4055-abed-9955637179b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7fa876044092370bd4f89fae3df8cd06b7d27515a90ecc0dd4393c870854253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7f95a79c34d04ab7881b283c072184f575
623df24dadf0b1bc71c05bbcb92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mhhrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:01Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.199898 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7f9jd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b67ae9d5-e575-45e7-913a-01f379b86416\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhxt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhxt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7f9jd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:01Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:01 crc 
kubenswrapper[4750]: I1008 18:12:01.222490 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f40f5c0-574a-4dd1-a4dc-024eba0628e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f90b8eb6984a4ced369d75be15a4980f41e7ca1bd3c79a64a79c974da9dc2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eecb6b3a56b80
f2145ca258b9fc1473c42b44a4ded70f9ddc793868c3656191\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8961dba04ac6b2092efe4568b7f5d5afb5df6833c208d9484410ed6f8dcb13a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9df9b0a29bcccaeb1dcbdff5c85068802f2e46f39f39a63d09fe1386996c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b4f554ff77b65e8ff5a3cb8221e1c238434da5d49d308ec6684fcb2d333384\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 18:10:58.297055 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 18:10:58.298595 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1100232334/tls.crt::/tmp/serving-cert-1100232334/tls.key\\\\\\\"\\\\nI1008 18:11:03.775470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 18:11:03.786183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 18:11:03.786202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 18:11:03.786224 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 18:11:03.786230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 18:11:03.794993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 18:11:03.795026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 18:11:03.795033 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 18:11:03.795023 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 18:11:03.795038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 18:11:03.795061 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 18:11:03.795066 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 18:11:03.795070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 18:11:03.797195 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ece7dde88eaa94bbf9401909374a83d5841b24deb3f494aa3c709d055f6997e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:01Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.225210 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.225260 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.225276 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.225299 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.225317 4750 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:01Z","lastTransitionTime":"2025-10-08T18:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.245238 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b53212dc-64c8-4b45-b425-b0bb2df4be9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535706571aabece4fcb32be30ca7e8e4732194bfe3511685909161fa0c06b1c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55916d7bc976fac242d0c48e65a606ba0f642461b47262b4b0d46076163810d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a005895ee7ff0fde6900a7b6f6b8928aad001f1aaa7ce650d77c11ce7acc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://ba40cf6292b171161b700726beb7b9546e2a239de4076618f1eacf17b75631fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:01Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.263075 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fd0b357c32562409dc3531c04e684de9fdd4bf05080368ff3537aef9324094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T18:12:01Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.277682 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:01Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.295072 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzb5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da637e4e5c4e6a32490a03baffc399a92670a4d4e08bf3f7f23f23ab4aed5c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc5b55dd8e3131ba21f9cb51433c01355475968171e092627964cc5a56fabb59\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T18:11:51Z\\\",\\\"message\\\":\\\"2025-10-08T18:11:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_61ae17c4-6c37-4b05-b48a-241134d993dd\\\\n2025-10-08T18:11:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_61ae17c4-6c37-4b05-b48a-241134d993dd to /host/opt/cni/bin/\\\\n2025-10-08T18:11:06Z [verbose] multus-daemon started\\\\n2025-10-08T18:11:06Z [verbose] 
Readiness Indicator file check\\\\n2025-10-08T18:11:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzb5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:01Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.313228 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2x8kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9745a747-29eb-473f-bdb1-b526e1fe1445\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://402
6e6054000f0175162939f647586ff94806c727b2b80a9437767275bd819d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2x8kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:01Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.327341 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.327377 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.327387 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.327402 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:01 crc kubenswrapper[4750]: 
I1008 18:12:01.327412 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:01Z","lastTransitionTime":"2025-10-08T18:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.345349 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d63a44-9fd7-4c19-8715-6ddec94d1806\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4be8fc1ad41cb1ad19143f14269fbe77825e622005ecab4d4731e0f3ed6cdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff4be8fc1ad41cb1ad19143f14269fbe77825e622005ecab4d4731e0f3ed6cdd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T18:11:59Z\\\",\\\"message\\\":\\\".go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1008 18:11:59.508256 6728 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1008 18:11:59.508322 6728 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1008 18:11:59.508375 6728 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1008 18:11:59.508384 6728 
handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1008 18:11:59.508450 6728 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1008 18:11:59.508505 6728 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1008 18:11:59.508536 6728 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1008 18:11:59.508540 6728 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1008 18:11:59.509226 6728 handler.go:208] Removed *v1.Node event handler 2\\\\nI1008 18:11:59.509268 6728 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1008 18:11:59.509317 6728 factory.go:656] Stopping watch factory\\\\nI1008 18:11:59.509334 6728 ovnkube.go:599] Stopped ovnkube\\\\nI1008 18:11:59.509373 6728 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1008 18:11:59.509380 6728 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1008 18:11:59.509395 6728 handler.go:208] Removed *v1.Node event handler 7\\\\nF1008 18:11:59.509448 6728 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rl7f4_openshift-ovn-kubernetes(25d63a44-9fd7-4c19-8715-6ddec94d1806)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34
d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rl7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:01Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.358722 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:01Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.370452 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:01Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.382386 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hdvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31a78d19-68f8-46b7-9c53-2bbda2930e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810ea3f875bc348e876d76c92a9ecb7e7541afb42bb26556051d021085c107c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84vv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hdvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:01Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.392933 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39b4e601279d1f37ba24fbb347b7f36255512601fd47c31a3bc686f2385cd88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://443e4db49275949020687bd8768b8a7745b5fdea4f5fb230d872b2a8ed9d6ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:01Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.408247 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6jln4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58294711-df24-43b8-b1b6-6617d55d49b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9a2cf606874110c681c7bbec359607eda7cabce0a44504ff21374021fa21a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78c7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6jln4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:01Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.429003 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.429036 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.429046 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.429062 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.429073 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:01Z","lastTransitionTime":"2025-10-08T18:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.440643 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83871b97-e7ad-4ee9-ab16-4e471c8dc3ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27de5d0a4e7203c8a6861925143be197b902b5c8d5031e26d1b732a31173c98b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d43e2deb74fdfedb40fdcc9b7f57f5
4f447c4f1493ece80186adb8e4624bbe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d72f137ccea9900279883c5b727b663c3ab23b09d2eab31abe8c6e4c536b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://155237e5ad4510cb1c106638bf4e87bb33102aed3005e5831f5f4232d0698ac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://155237e5ad4510cb1c106638bf4e87bb33102aed3005e5831f5f4232d0698ac8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:01Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.461037 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3096c4e9-1e62-46ee-8e9d-9d8f5c3d8f97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53389f30a75dc05f6b778b862fec384a0aefe7bd4b7884aad3af144bc8e5b87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7428f31e2b12470cc8d9fb98bb66e19ce665d028dae0b97adf318006043e2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77be59fa3ca71194c19f0050856d30f3b6f7122bd68e07535d0a8ca6cc299832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89305f32867a4d321e16ce84e9d4e34ec3bd7594db3700a4252ad0f874dd6d48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5029feed20c32b59a88eba1f7597a6689d99d93df97f2c5c94f4153de00170f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:01Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.476167 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dbbc4357a7bebc6b495e34c1eac5bd8fb1a52a16783ce31c8f3e78614a2dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:01Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.531958 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.531997 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.532007 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.532025 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.532036 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:01Z","lastTransitionTime":"2025-10-08T18:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.634783 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.634824 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.634835 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.634854 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.634865 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:01Z","lastTransitionTime":"2025-10-08T18:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.733979 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:12:01 crc kubenswrapper[4750]: E1008 18:12:01.734147 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.734341 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:12:01 crc kubenswrapper[4750]: E1008 18:12:01.734402 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.737621 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.737713 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.737727 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.737754 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.737769 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:01Z","lastTransitionTime":"2025-10-08T18:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.840915 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.840959 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.840967 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.840981 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.840989 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:01Z","lastTransitionTime":"2025-10-08T18:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.943380 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.943443 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.943459 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.943482 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.943497 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:01Z","lastTransitionTime":"2025-10-08T18:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.975166 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.975281 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.975300 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.975338 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.975359 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:01Z","lastTransitionTime":"2025-10-08T18:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:01 crc kubenswrapper[4750]: E1008 18:12:01.991185 4750 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:12:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:12:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:12:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:12:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:12:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:12:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:12:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:12:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bfc00a00-da5b-4621-a04e-e20b47fefa95\\\",\\\"systemUUID\\\":\\\"4099aa14-e8f7-4bff-9f81-40284b959bbe\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:01Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.995023 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.995076 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.995088 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.995108 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:01 crc kubenswrapper[4750]: I1008 18:12:01.995123 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:01Z","lastTransitionTime":"2025-10-08T18:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:02 crc kubenswrapper[4750]: E1008 18:12:02.011248 4750 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:12:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:12:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:12:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:12:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:12:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:12:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:12:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:12:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bfc00a00-da5b-4621-a04e-e20b47fefa95\\\",\\\"systemUUID\\\":\\\"4099aa14-e8f7-4bff-9f81-40284b959bbe\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:02Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.015774 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.015825 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.015838 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.015859 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.015870 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:02Z","lastTransitionTime":"2025-10-08T18:12:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:02 crc kubenswrapper[4750]: E1008 18:12:02.030523 4750 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:12:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:12:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:12:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:12:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:12:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:12:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:12:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:12:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bfc00a00-da5b-4621-a04e-e20b47fefa95\\\",\\\"systemUUID\\\":\\\"4099aa14-e8f7-4bff-9f81-40284b959bbe\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:02Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.034082 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.034154 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.034174 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.034200 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.034219 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:02Z","lastTransitionTime":"2025-10-08T18:12:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:02 crc kubenswrapper[4750]: E1008 18:12:02.046041 4750 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:12:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:12:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:12:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:12:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:12:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:12:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:12:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:12:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bfc00a00-da5b-4621-a04e-e20b47fefa95\\\",\\\"systemUUID\\\":\\\"4099aa14-e8f7-4bff-9f81-40284b959bbe\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:02Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.049870 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.049911 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.049919 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.049933 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.049943 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:02Z","lastTransitionTime":"2025-10-08T18:12:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:02 crc kubenswrapper[4750]: E1008 18:12:02.062623 4750 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:12:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:12:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:12:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:12:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:12:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:12:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:12:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:12:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bfc00a00-da5b-4621-a04e-e20b47fefa95\\\",\\\"systemUUID\\\":\\\"4099aa14-e8f7-4bff-9f81-40284b959bbe\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:02Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:02 crc kubenswrapper[4750]: E1008 18:12:02.062761 4750 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.064360 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.064386 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.064396 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.064413 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.064425 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:02Z","lastTransitionTime":"2025-10-08T18:12:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.167518 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.167590 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.167603 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.167623 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.167635 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:02Z","lastTransitionTime":"2025-10-08T18:12:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.270509 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.270598 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.270609 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.270623 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.270635 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:02Z","lastTransitionTime":"2025-10-08T18:12:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.373071 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.373105 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.373115 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.373132 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.373143 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:02Z","lastTransitionTime":"2025-10-08T18:12:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.475197 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.475229 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.475239 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.475269 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.475283 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:02Z","lastTransitionTime":"2025-10-08T18:12:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.578196 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.578229 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.578237 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.578250 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.578259 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:02Z","lastTransitionTime":"2025-10-08T18:12:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.680902 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.680942 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.680951 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.680965 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.680974 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:02Z","lastTransitionTime":"2025-10-08T18:12:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.733309 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.733380 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:12:02 crc kubenswrapper[4750]: E1008 18:12:02.733476 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:12:02 crc kubenswrapper[4750]: E1008 18:12:02.733673 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.783068 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.783110 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.783119 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.783136 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.783145 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:02Z","lastTransitionTime":"2025-10-08T18:12:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.885160 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.885199 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.885210 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.885229 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.885240 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:02Z","lastTransitionTime":"2025-10-08T18:12:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.987628 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.987751 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.987759 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.987786 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:02 crc kubenswrapper[4750]: I1008 18:12:02.987796 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:02Z","lastTransitionTime":"2025-10-08T18:12:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:03 crc kubenswrapper[4750]: I1008 18:12:03.090099 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:03 crc kubenswrapper[4750]: I1008 18:12:03.090145 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:03 crc kubenswrapper[4750]: I1008 18:12:03.090156 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:03 crc kubenswrapper[4750]: I1008 18:12:03.090170 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:03 crc kubenswrapper[4750]: I1008 18:12:03.090179 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:03Z","lastTransitionTime":"2025-10-08T18:12:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:03 crc kubenswrapper[4750]: I1008 18:12:03.192295 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:03 crc kubenswrapper[4750]: I1008 18:12:03.192349 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:03 crc kubenswrapper[4750]: I1008 18:12:03.192361 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:03 crc kubenswrapper[4750]: I1008 18:12:03.192376 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:03 crc kubenswrapper[4750]: I1008 18:12:03.192413 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:03Z","lastTransitionTime":"2025-10-08T18:12:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:03 crc kubenswrapper[4750]: I1008 18:12:03.294864 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:03 crc kubenswrapper[4750]: I1008 18:12:03.294901 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:03 crc kubenswrapper[4750]: I1008 18:12:03.294910 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:03 crc kubenswrapper[4750]: I1008 18:12:03.294924 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:03 crc kubenswrapper[4750]: I1008 18:12:03.294934 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:03Z","lastTransitionTime":"2025-10-08T18:12:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:03 crc kubenswrapper[4750]: I1008 18:12:03.396919 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:03 crc kubenswrapper[4750]: I1008 18:12:03.396968 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:03 crc kubenswrapper[4750]: I1008 18:12:03.396979 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:03 crc kubenswrapper[4750]: I1008 18:12:03.396998 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:03 crc kubenswrapper[4750]: I1008 18:12:03.397009 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:03Z","lastTransitionTime":"2025-10-08T18:12:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:03 crc kubenswrapper[4750]: I1008 18:12:03.499326 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:03 crc kubenswrapper[4750]: I1008 18:12:03.499388 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:03 crc kubenswrapper[4750]: I1008 18:12:03.499403 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:03 crc kubenswrapper[4750]: I1008 18:12:03.499426 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:03 crc kubenswrapper[4750]: I1008 18:12:03.499442 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:03Z","lastTransitionTime":"2025-10-08T18:12:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:03 crc kubenswrapper[4750]: I1008 18:12:03.601931 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:03 crc kubenswrapper[4750]: I1008 18:12:03.601974 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:03 crc kubenswrapper[4750]: I1008 18:12:03.602007 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:03 crc kubenswrapper[4750]: I1008 18:12:03.602028 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:03 crc kubenswrapper[4750]: I1008 18:12:03.602039 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:03Z","lastTransitionTime":"2025-10-08T18:12:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:03 crc kubenswrapper[4750]: I1008 18:12:03.704847 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:03 crc kubenswrapper[4750]: I1008 18:12:03.705160 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:03 crc kubenswrapper[4750]: I1008 18:12:03.705190 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:03 crc kubenswrapper[4750]: I1008 18:12:03.705220 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:03 crc kubenswrapper[4750]: I1008 18:12:03.705241 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:03Z","lastTransitionTime":"2025-10-08T18:12:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:12:03 crc kubenswrapper[4750]: I1008 18:12:03.733459 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:12:03 crc kubenswrapper[4750]: I1008 18:12:03.733539 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:12:03 crc kubenswrapper[4750]: E1008 18:12:03.733764 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:12:03 crc kubenswrapper[4750]: E1008 18:12:03.733909 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:12:03 crc kubenswrapper[4750]: I1008 18:12:03.807205 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:03 crc kubenswrapper[4750]: I1008 18:12:03.807258 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:03 crc kubenswrapper[4750]: I1008 18:12:03.807283 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:03 crc kubenswrapper[4750]: I1008 18:12:03.807303 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:03 crc kubenswrapper[4750]: I1008 18:12:03.807318 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:03Z","lastTransitionTime":"2025-10-08T18:12:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:03 crc kubenswrapper[4750]: I1008 18:12:03.910843 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:03 crc kubenswrapper[4750]: I1008 18:12:03.910901 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:03 crc kubenswrapper[4750]: I1008 18:12:03.910917 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:03 crc kubenswrapper[4750]: I1008 18:12:03.910939 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:03 crc kubenswrapper[4750]: I1008 18:12:03.910955 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:03Z","lastTransitionTime":"2025-10-08T18:12:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.013277 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.013311 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.013321 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.013336 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.013346 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:04Z","lastTransitionTime":"2025-10-08T18:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.115402 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.115433 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.115444 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.115460 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.115471 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:04Z","lastTransitionTime":"2025-10-08T18:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.217944 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.217988 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.218000 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.218016 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.218030 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:04Z","lastTransitionTime":"2025-10-08T18:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.319647 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.319683 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.319693 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.319707 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.319718 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:04Z","lastTransitionTime":"2025-10-08T18:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.421442 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.421496 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.421504 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.421516 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.421525 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:04Z","lastTransitionTime":"2025-10-08T18:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.523993 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.524047 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.524056 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.524071 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.524081 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:04Z","lastTransitionTime":"2025-10-08T18:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.626126 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.626161 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.626168 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.626181 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.626199 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:04Z","lastTransitionTime":"2025-10-08T18:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.728865 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.728987 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.729015 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.729044 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.729067 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:04Z","lastTransitionTime":"2025-10-08T18:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.734109 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:12:04 crc kubenswrapper[4750]: E1008 18:12:04.734413 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.734440 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:12:04 crc kubenswrapper[4750]: E1008 18:12:04.734532 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.753587 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d63a44-9fd7-4c19-8715-6ddec94d1806\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4be8fc1ad41cb1ad19143f14269fbe77825e622005ecab4d4731e0f3ed6cdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff4be8fc1ad41cb1ad19143f14269fbe77825e622005ecab4d4731e0f3ed6cdd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T18:11:59Z\\\",\\\"message\\\":\\\".go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1008 18:11:59.508256 6728 
handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1008 18:11:59.508322 6728 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1008 18:11:59.508375 6728 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1008 18:11:59.508384 6728 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1008 18:11:59.508450 6728 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1008 18:11:59.508505 6728 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1008 18:11:59.508536 6728 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1008 18:11:59.508540 6728 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1008 18:11:59.509226 6728 handler.go:208] Removed *v1.Node event handler 2\\\\nI1008 18:11:59.509268 6728 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1008 18:11:59.509317 6728 factory.go:656] Stopping watch factory\\\\nI1008 18:11:59.509334 6728 ovnkube.go:599] Stopped ovnkube\\\\nI1008 18:11:59.509373 6728 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1008 18:11:59.509380 6728 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1008 18:11:59.509395 6728 handler.go:208] Removed *v1.Node event handler 7\\\\nF1008 18:11:59.509448 6728 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rl7f4_openshift-ovn-kubernetes(25d63a44-9fd7-4c19-8715-6ddec94d1806)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34
d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rl7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:04Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.774050 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2x8kt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9745a747-29eb-473f-bdb1-b526e1fe1445\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4026e6054000f0175162939f647586ff94806c727b2b80a9437767275bd819d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34195
4d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2x8kt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:04Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.794118 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3096c4e9-1e62-46ee-8e9d-9d8f5c3d8f97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53389f30a75dc05f6b778b862fec384a0aefe7bd4b7884aad3af144bc8e5b87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7428f31e2b12470cc8d9fb98bb66e19ce665d028dae0b97adf318006043e2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77be59fa3ca71194c19f0050856d30f3b6f7122bd68e07535d0a8ca6cc299832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89305f32867a4d321
e16ce84e9d4e34ec3bd7594db3700a4252ad0f874dd6d48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5029feed20c32b59a88eba1f7597a6689d99d93df97f2c5c94f4153de00170f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:04Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.807161 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dbbc4357a7bebc6b495e34c1eac5bd8fb1a52a16783ce31c8f3e78614a2dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:04Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.819527 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:04Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.832610 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.832675 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.832689 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.832706 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.832718 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:04Z","lastTransitionTime":"2025-10-08T18:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.832891 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:04Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.842952 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hdvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31a78d19-68f8-46b7-9c53-2bbda2930e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810ea3f875bc348e876d76c92a9ecb7e7541afb42bb26556051d021085c107c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84vv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hdvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:04Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.858231 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39b4e601279d1f37ba24fbb347b7f36255512601fd47c31a3bc686f2385cd88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://443e4db49275949020687bd8768b8a7745b5fdea4f5fb230d872b2a8ed9d6ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:04Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.868412 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6jln4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58294711-df24-43b8-b1b6-6617d55d49b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9a2cf606874110c681c7bbec359607eda7cabce0a44504ff21374021fa21a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78c7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6jln4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:04Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.879913 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83871b97-e7ad-4ee9-ab16-4e471c8dc3ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27de5d0a4e7203c8a6861925143be197b902b5c8d5031e26d1b732a31173c98b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d43e2deb74fdfedb40fdcc9b7f57f54f447c4f1493ece80186adb8e4624bbe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d72f137ccea9900279883c5b727b663c3ab23b09d2eab31abe8c6e4c536b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://155237e5ad4510cb1c106638bf4e87bb33102aed3005e5831f5f4232d0698ac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://155237e5ad4510cb1c106638bf4e87bb33102aed3005e5831f5f4232d0698ac8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:04Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.892575 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b53212dc-64c8-4b45-b425-b0bb2df4be9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535706571aabece4fcb32be30ca7e8e4732194bfe3511685909161fa0c06b1c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55916d7bc976fac242d0c48e65a606ba0f642461b47262b4b0d46076163810d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a005895ee7ff0fde6900a7b6f6b8928aad001f1aaa7ce650d77c11ce7acc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba40cf6292b171161b700726beb7b9546e2a239de4076618f1eacf17b75631fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:04Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.903172 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fd0b357c32562409dc3531c04e684de9fdd4bf05080368ff3537aef9324094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T18:12:04Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.913767 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f652a41fbc65313e94232c1b72bd7ccc0f53d7c510287ac9619ab713f770fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b75c94360320f8907419e9d7b9a237989702c6c66dd2e87832e18c2c4570c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-grddb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:04Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.923450 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mhhrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b11eea9-c866-4055-abed-9955637179b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7fa876044092370bd4f89fae3df8cd06b7d27515a90ecc0dd4393c870854253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7f95a79c34d04ab7881b283c072184f575
623df24dadf0b1bc71c05bbcb92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mhhrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:04Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.933227 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7f9jd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b67ae9d5-e575-45e7-913a-01f379b86416\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhxt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhxt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7f9jd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:04Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:04 crc 
kubenswrapper[4750]: I1008 18:12:04.934785 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.934814 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.934822 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.934838 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.934847 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:04Z","lastTransitionTime":"2025-10-08T18:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.946618 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f40f5c0-574a-4dd1-a4dc-024eba0628e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f90b8eb6984a4ced369d75be15a4980f41e7ca1bd3c79a64a79c974da9dc2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eecb6b3a56b80f2145ca258b9fc1473c42b44a4ded70f9ddc793868c3656191\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8961dba04ac6b2092efe4568b7f5d5afb5df6833c208d9484410ed6f8dcb13a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9df9b0a29bcccaeb1dcbdff5c85068802f2e46f39f39a63d09fe1386996c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b4f554ff77b65e8ff5a3cb8221e1c238434da5d49d308ec6684fcb2d333384\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 18:10:58.297055 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 18:10:58.298595 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1100232334/tls.crt::/tmp/serving-cert-1100232334/tls.key\\\\\\\"\\\\nI1008 18:11:03.775470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 18:11:03.786183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 18:11:03.786202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 18:11:03.786224 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 18:11:03.786230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 18:11:03.794993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 18:11:03.795026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 18:11:03.795033 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 18:11:03.795023 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 18:11:03.795038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 18:11:03.795061 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 18:11:03.795066 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 18:11:03.795070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 18:11:03.797195 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ece7dde88eaa94bbf9401909374a83d5841b24deb3f494aa3c709d055f6997e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:04Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.958369 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:04Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:04 crc kubenswrapper[4750]: I1008 18:12:04.969376 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzb5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da637e4e5c4e6a32490a03baffc399a92670a4d4e08bf3f7f23f23ab4aed5c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc5b55dd8e3131ba21f9cb51433c01355475968171e092627964cc5a56fabb59\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T18:11:51Z\\\",\\\"message\\\":\\\"2025-10-08T18:11:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_61ae17c4-6c37-4b05-b48a-241134d993dd\\\\n2025-10-08T18:11:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_61ae17c4-6c37-4b05-b48a-241134d993dd to /host/opt/cni/bin/\\\\n2025-10-08T18:11:06Z [verbose] multus-daemon started\\\\n2025-10-08T18:11:06Z [verbose] 
Readiness Indicator file check\\\\n2025-10-08T18:11:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzb5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:04Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.039429 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.039465 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.039476 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.039496 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.039505 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:05Z","lastTransitionTime":"2025-10-08T18:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.141501 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.141531 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.141573 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.141587 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.141594 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:05Z","lastTransitionTime":"2025-10-08T18:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.243332 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.243372 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.243384 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.243400 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.243412 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:05Z","lastTransitionTime":"2025-10-08T18:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.346247 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.346290 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.346301 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.346317 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.346328 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:05Z","lastTransitionTime":"2025-10-08T18:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.449002 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.449047 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.449056 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.449070 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.449085 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:05Z","lastTransitionTime":"2025-10-08T18:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.551055 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.551286 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.551363 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.551629 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.551701 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:05Z","lastTransitionTime":"2025-10-08T18:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.653868 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.654334 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.654523 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.654747 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.654906 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:05Z","lastTransitionTime":"2025-10-08T18:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.733978 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.734097 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:12:05 crc kubenswrapper[4750]: E1008 18:12:05.734231 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:12:05 crc kubenswrapper[4750]: E1008 18:12:05.734366 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.756852 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.757054 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.757118 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.757184 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.757264 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:05Z","lastTransitionTime":"2025-10-08T18:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.859539 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.859597 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.859605 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.859618 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.859626 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:05Z","lastTransitionTime":"2025-10-08T18:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.961680 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.961726 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.961740 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.961757 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:05 crc kubenswrapper[4750]: I1008 18:12:05.961770 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:05Z","lastTransitionTime":"2025-10-08T18:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.064349 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.064637 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.064741 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.064833 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.064920 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:06Z","lastTransitionTime":"2025-10-08T18:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.167280 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.167539 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.167668 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.167785 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.167939 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:06Z","lastTransitionTime":"2025-10-08T18:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.270316 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.270351 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.270360 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.270372 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.270380 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:06Z","lastTransitionTime":"2025-10-08T18:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.372341 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.372375 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.372382 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.372394 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.372402 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:06Z","lastTransitionTime":"2025-10-08T18:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.474055 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.474088 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.474096 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.474108 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.474132 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:06Z","lastTransitionTime":"2025-10-08T18:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.576782 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.576821 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.576833 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.576850 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.576863 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:06Z","lastTransitionTime":"2025-10-08T18:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.680028 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.680102 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.680129 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.680161 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.680181 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:06Z","lastTransitionTime":"2025-10-08T18:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.733837 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 08 18:12:06 crc kubenswrapper[4750]: E1008 18:12:06.734003 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.734120 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 08 18:12:06 crc kubenswrapper[4750]: E1008 18:12:06.734365 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.783858 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.783940 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.783962 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.783994 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.784013 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:06Z","lastTransitionTime":"2025-10-08T18:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.886011 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.886072 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.886089 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.886116 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.886134 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:06Z","lastTransitionTime":"2025-10-08T18:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.989216 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.989262 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.989270 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.989288 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 18:12:06 crc kubenswrapper[4750]: I1008 18:12:06.989299 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:06Z","lastTransitionTime":"2025-10-08T18:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 08 18:12:07 crc kubenswrapper[4750]: I1008 18:12:07.092062 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 18:12:07 crc kubenswrapper[4750]: I1008 18:12:07.092093 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 18:12:07 crc kubenswrapper[4750]: I1008 18:12:07.092101 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 18:12:07 crc kubenswrapper[4750]: I1008 18:12:07.092114 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 18:12:07 crc kubenswrapper[4750]: I1008 18:12:07.092123 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:07Z","lastTransitionTime":"2025-10-08T18:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 08 18:12:07 crc kubenswrapper[4750]: I1008 18:12:07.194298 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 18:12:07 crc kubenswrapper[4750]: I1008 18:12:07.194338 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 18:12:07 crc kubenswrapper[4750]: I1008 18:12:07.194347 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 18:12:07 crc kubenswrapper[4750]: I1008 18:12:07.194359 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 18:12:07 crc kubenswrapper[4750]: I1008 18:12:07.194368 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:07Z","lastTransitionTime":"2025-10-08T18:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 08 18:12:07 crc kubenswrapper[4750]: I1008 18:12:07.297512 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 18:12:07 crc kubenswrapper[4750]: I1008 18:12:07.298074 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 18:12:07 crc kubenswrapper[4750]: I1008 18:12:07.298308 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 18:12:07 crc kubenswrapper[4750]: I1008 18:12:07.298523 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 18:12:07 crc kubenswrapper[4750]: I1008 18:12:07.298802 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:07Z","lastTransitionTime":"2025-10-08T18:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 08 18:12:07 crc kubenswrapper[4750]: I1008 18:12:07.402958 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 18:12:07 crc kubenswrapper[4750]: I1008 18:12:07.403034 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 18:12:07 crc kubenswrapper[4750]: I1008 18:12:07.403060 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 18:12:07 crc kubenswrapper[4750]: I1008 18:12:07.403100 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 18:12:07 crc kubenswrapper[4750]: I1008 18:12:07.403125 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:07Z","lastTransitionTime":"2025-10-08T18:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 08 18:12:07 crc kubenswrapper[4750]: I1008 18:12:07.506274 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 18:12:07 crc kubenswrapper[4750]: I1008 18:12:07.506343 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 18:12:07 crc kubenswrapper[4750]: I1008 18:12:07.506361 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 18:12:07 crc kubenswrapper[4750]: I1008 18:12:07.506387 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 18:12:07 crc kubenswrapper[4750]: I1008 18:12:07.506405 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:07Z","lastTransitionTime":"2025-10-08T18:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 08 18:12:07 crc kubenswrapper[4750]: I1008 18:12:07.610146 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 18:12:07 crc kubenswrapper[4750]: I1008 18:12:07.610238 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 18:12:07 crc kubenswrapper[4750]: I1008 18:12:07.610256 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 18:12:07 crc kubenswrapper[4750]: I1008 18:12:07.610281 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 18:12:07 crc kubenswrapper[4750]: I1008 18:12:07.610298 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:07Z","lastTransitionTime":"2025-10-08T18:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 08 18:12:07 crc kubenswrapper[4750]: I1008 18:12:07.713660 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 18:12:07 crc kubenswrapper[4750]: I1008 18:12:07.713732 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 18:12:07 crc kubenswrapper[4750]: I1008 18:12:07.713751 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 18:12:07 crc kubenswrapper[4750]: I1008 18:12:07.713775 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 18:12:07 crc kubenswrapper[4750]: I1008 18:12:07.713787 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:07Z","lastTransitionTime":"2025-10-08T18:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 18:12:07 crc kubenswrapper[4750]: I1008 18:12:07.733770 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 08 18:12:07 crc kubenswrapper[4750]: I1008 18:12:07.733794 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd"
Oct 08 18:12:07 crc kubenswrapper[4750]: E1008 18:12:07.734052 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 08 18:12:07 crc kubenswrapper[4750]: E1008 18:12:07.734370 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416"
Oct 08 18:12:07 crc kubenswrapper[4750]: I1008 18:12:07.817143 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 18:12:07 crc kubenswrapper[4750]: I1008 18:12:07.817183 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 18:12:07 crc kubenswrapper[4750]: I1008 18:12:07.817192 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 18:12:07 crc kubenswrapper[4750]: I1008 18:12:07.817206 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 18:12:07 crc kubenswrapper[4750]: I1008 18:12:07.817216 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:07Z","lastTransitionTime":"2025-10-08T18:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 08 18:12:07 crc kubenswrapper[4750]: I1008 18:12:07.920426 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 18:12:07 crc kubenswrapper[4750]: I1008 18:12:07.920484 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 18:12:07 crc kubenswrapper[4750]: I1008 18:12:07.920499 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 18:12:07 crc kubenswrapper[4750]: I1008 18:12:07.920516 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 18:12:07 crc kubenswrapper[4750]: I1008 18:12:07.920528 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:07Z","lastTransitionTime":"2025-10-08T18:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.023304 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.023376 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.023395 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.023427 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.023450 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:08Z","lastTransitionTime":"2025-10-08T18:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.128436 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.128958 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.128981 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.129012 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.129033 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:08Z","lastTransitionTime":"2025-10-08T18:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.233152 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.233193 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.233202 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.233219 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.233230 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:08Z","lastTransitionTime":"2025-10-08T18:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.336302 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.336383 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.336403 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.336434 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.336455 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:08Z","lastTransitionTime":"2025-10-08T18:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.443536 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.443608 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.443621 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.443638 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.443649 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:08Z","lastTransitionTime":"2025-10-08T18:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.544119 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.544419 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 08 18:12:08 crc kubenswrapper[4750]: E1008 18:12:08.544447 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:13:12.544412577 +0000 UTC m=+148.457383610 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.544582 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.544631 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 08 18:12:08 crc kubenswrapper[4750]: E1008 18:12:08.544659 4750 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Oct 08 18:12:08 crc kubenswrapper[4750]: E1008 18:12:08.544759 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 18:13:12.544730525 +0000 UTC m=+148.457701568 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Oct 08 18:12:08 crc kubenswrapper[4750]: E1008 18:12:08.544841 4750 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Oct 08 18:12:08 crc kubenswrapper[4750]: E1008 18:12:08.545017 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 18:13:12.544968032 +0000 UTC m=+148.457939085 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Oct 08 18:12:08 crc kubenswrapper[4750]: E1008 18:12:08.544873 4750 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 08 18:12:08 crc kubenswrapper[4750]: E1008 18:12:08.545268 4750 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 08 18:12:08 crc kubenswrapper[4750]: E1008 18:12:08.545286 4750 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: 
[object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 08 18:12:08 crc kubenswrapper[4750]: E1008 18:12:08.545340 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-08 18:13:12.545327251 +0000 UTC m=+148.458298274 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.546878 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.546943 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.546967 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.547002 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.547026 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:08Z","lastTransitionTime":"2025-10-08T18:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.646128 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 08 18:12:08 crc kubenswrapper[4750]: E1008 18:12:08.646272 4750 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 08 18:12:08 crc kubenswrapper[4750]: E1008 18:12:08.646287 4750 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 08 18:12:08 crc kubenswrapper[4750]: E1008 18:12:08.646299 4750 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 08 18:12:08 crc kubenswrapper[4750]: E1008 18:12:08.646357 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-08 18:13:12.646340937 +0000 UTC m=+148.559311950 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.650310 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.650376 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.650392 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.650412 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.650425 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:08Z","lastTransitionTime":"2025-10-08T18:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.733657 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.733711 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:12:08 crc kubenswrapper[4750]: E1008 18:12:08.733795 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:12:08 crc kubenswrapper[4750]: E1008 18:12:08.733902 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.752170 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.752250 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.752268 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.752296 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.752317 4750 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:08Z","lastTransitionTime":"2025-10-08T18:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.855799 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.855867 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.855891 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.855922 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.855946 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:08Z","lastTransitionTime":"2025-10-08T18:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.959360 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.959419 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.959430 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.959449 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:08 crc kubenswrapper[4750]: I1008 18:12:08.959461 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:08Z","lastTransitionTime":"2025-10-08T18:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.062036 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.062102 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.062121 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.062149 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.062177 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:09Z","lastTransitionTime":"2025-10-08T18:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.164655 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.164727 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.164740 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.164764 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.164777 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:09Z","lastTransitionTime":"2025-10-08T18:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.267966 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.268049 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.268073 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.268106 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.268127 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:09Z","lastTransitionTime":"2025-10-08T18:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.371433 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.371478 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.371486 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.371505 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.371517 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:09Z","lastTransitionTime":"2025-10-08T18:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.474891 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.474933 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.474951 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.474969 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.474981 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:09Z","lastTransitionTime":"2025-10-08T18:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.577714 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.577766 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.577776 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.577791 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.577802 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:09Z","lastTransitionTime":"2025-10-08T18:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.679906 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.679953 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.679964 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.679978 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.679989 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:09Z","lastTransitionTime":"2025-10-08T18:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.733870 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.733911 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:12:09 crc kubenswrapper[4750]: E1008 18:12:09.734053 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:12:09 crc kubenswrapper[4750]: E1008 18:12:09.734162 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.782650 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.782717 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.782732 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.782753 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.782767 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:09Z","lastTransitionTime":"2025-10-08T18:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.885626 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.885667 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.885681 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.885696 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.885706 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:09Z","lastTransitionTime":"2025-10-08T18:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.989326 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.989387 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.989398 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.989413 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:09 crc kubenswrapper[4750]: I1008 18:12:09.989424 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:09Z","lastTransitionTime":"2025-10-08T18:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:10 crc kubenswrapper[4750]: I1008 18:12:10.092385 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:10 crc kubenswrapper[4750]: I1008 18:12:10.092443 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:10 crc kubenswrapper[4750]: I1008 18:12:10.092455 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:10 crc kubenswrapper[4750]: I1008 18:12:10.092486 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:10 crc kubenswrapper[4750]: I1008 18:12:10.092499 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:10Z","lastTransitionTime":"2025-10-08T18:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:10 crc kubenswrapper[4750]: I1008 18:12:10.195358 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:10 crc kubenswrapper[4750]: I1008 18:12:10.195417 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:10 crc kubenswrapper[4750]: I1008 18:12:10.195432 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:10 crc kubenswrapper[4750]: I1008 18:12:10.195454 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:10 crc kubenswrapper[4750]: I1008 18:12:10.195468 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:10Z","lastTransitionTime":"2025-10-08T18:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:10 crc kubenswrapper[4750]: I1008 18:12:10.298498 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:10 crc kubenswrapper[4750]: I1008 18:12:10.298605 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:10 crc kubenswrapper[4750]: I1008 18:12:10.298625 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:10 crc kubenswrapper[4750]: I1008 18:12:10.298649 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:10 crc kubenswrapper[4750]: I1008 18:12:10.298668 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:10Z","lastTransitionTime":"2025-10-08T18:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:10 crc kubenswrapper[4750]: I1008 18:12:10.401810 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:10 crc kubenswrapper[4750]: I1008 18:12:10.401863 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:10 crc kubenswrapper[4750]: I1008 18:12:10.401876 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:10 crc kubenswrapper[4750]: I1008 18:12:10.401897 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:10 crc kubenswrapper[4750]: I1008 18:12:10.401910 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:10Z","lastTransitionTime":"2025-10-08T18:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:10 crc kubenswrapper[4750]: I1008 18:12:10.505336 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:10 crc kubenswrapper[4750]: I1008 18:12:10.505413 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:10 crc kubenswrapper[4750]: I1008 18:12:10.505439 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:10 crc kubenswrapper[4750]: I1008 18:12:10.505512 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:10 crc kubenswrapper[4750]: I1008 18:12:10.505537 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:10Z","lastTransitionTime":"2025-10-08T18:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:10 crc kubenswrapper[4750]: I1008 18:12:10.609095 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:10 crc kubenswrapper[4750]: I1008 18:12:10.609152 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:10 crc kubenswrapper[4750]: I1008 18:12:10.609168 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:10 crc kubenswrapper[4750]: I1008 18:12:10.609191 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:10 crc kubenswrapper[4750]: I1008 18:12:10.609208 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:10Z","lastTransitionTime":"2025-10-08T18:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:10 crc kubenswrapper[4750]: I1008 18:12:10.712902 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:10 crc kubenswrapper[4750]: I1008 18:12:10.713703 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:10 crc kubenswrapper[4750]: I1008 18:12:10.713862 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:10 crc kubenswrapper[4750]: I1008 18:12:10.713935 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:10 crc kubenswrapper[4750]: I1008 18:12:10.713959 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:10Z","lastTransitionTime":"2025-10-08T18:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:12:10 crc kubenswrapper[4750]: I1008 18:12:10.733507 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:12:10 crc kubenswrapper[4750]: I1008 18:12:10.733527 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:12:10 crc kubenswrapper[4750]: E1008 18:12:10.733745 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:12:10 crc kubenswrapper[4750]: E1008 18:12:10.733858 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:12:10 crc kubenswrapper[4750]: I1008 18:12:10.818104 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:10 crc kubenswrapper[4750]: I1008 18:12:10.818159 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:10 crc kubenswrapper[4750]: I1008 18:12:10.818172 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:10 crc kubenswrapper[4750]: I1008 18:12:10.818194 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:10 crc kubenswrapper[4750]: I1008 18:12:10.818210 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:10Z","lastTransitionTime":"2025-10-08T18:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:10 crc kubenswrapper[4750]: I1008 18:12:10.921006 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:10 crc kubenswrapper[4750]: I1008 18:12:10.921075 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:10 crc kubenswrapper[4750]: I1008 18:12:10.921100 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:10 crc kubenswrapper[4750]: I1008 18:12:10.921133 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:10 crc kubenswrapper[4750]: I1008 18:12:10.921193 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:10Z","lastTransitionTime":"2025-10-08T18:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.024167 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.024214 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.024226 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.024244 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.024257 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:11Z","lastTransitionTime":"2025-10-08T18:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.127940 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.127995 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.128009 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.128028 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.128044 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:11Z","lastTransitionTime":"2025-10-08T18:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.230851 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.230884 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.230894 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.230910 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.230931 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:11Z","lastTransitionTime":"2025-10-08T18:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.334722 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.335083 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.335261 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.335416 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.335582 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:11Z","lastTransitionTime":"2025-10-08T18:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.438601 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.438654 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.438672 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.438697 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.438715 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:11Z","lastTransitionTime":"2025-10-08T18:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.540902 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.540935 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.540943 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.540955 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.540965 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:11Z","lastTransitionTime":"2025-10-08T18:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.643178 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.643214 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.643224 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.643240 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.643249 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:11Z","lastTransitionTime":"2025-10-08T18:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.733780 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:12:11 crc kubenswrapper[4750]: E1008 18:12:11.733934 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.733817 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:12:11 crc kubenswrapper[4750]: E1008 18:12:11.734121 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.746876 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.746903 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.746910 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.746924 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.746934 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:11Z","lastTransitionTime":"2025-10-08T18:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.849231 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.849286 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.849298 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.849316 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.849328 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:11Z","lastTransitionTime":"2025-10-08T18:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.952342 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.952391 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.952408 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.952428 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:11 crc kubenswrapper[4750]: I1008 18:12:11.952440 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:11Z","lastTransitionTime":"2025-10-08T18:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.056584 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.056660 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.056684 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.056715 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.056739 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:12Z","lastTransitionTime":"2025-10-08T18:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.164931 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.164987 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.164999 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.165017 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.165031 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:12Z","lastTransitionTime":"2025-10-08T18:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.267947 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.268026 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.268044 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.268074 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.268097 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:12Z","lastTransitionTime":"2025-10-08T18:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.372264 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.372356 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.372377 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.372430 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.372447 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:12Z","lastTransitionTime":"2025-10-08T18:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.404187 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.404245 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.404256 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.404275 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.404286 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:12Z","lastTransitionTime":"2025-10-08T18:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:12 crc kubenswrapper[4750]: E1008 18:12:12.420122 4750 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:12:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:12:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:12:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:12:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:12:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:12:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:12:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:12:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bfc00a00-da5b-4621-a04e-e20b47fefa95\\\",\\\"systemUUID\\\":\\\"4099aa14-e8f7-4bff-9f81-40284b959bbe\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:12Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.425269 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.425338 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.425351 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.425373 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.425407 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:12Z","lastTransitionTime":"2025-10-08T18:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:12 crc kubenswrapper[4750]: E1008 18:12:12.446767 4750 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:12:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:12:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:12:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:12:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:12:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:12:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:12:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:12:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bfc00a00-da5b-4621-a04e-e20b47fefa95\\\",\\\"systemUUID\\\":\\\"4099aa14-e8f7-4bff-9f81-40284b959bbe\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:12Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.451403 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.451447 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.451457 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.451472 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.451481 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:12Z","lastTransitionTime":"2025-10-08T18:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:12 crc kubenswrapper[4750]: E1008 18:12:12.472065 4750 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:12:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:12:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:12:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:12:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:12:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:12:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:12:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:12:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bfc00a00-da5b-4621-a04e-e20b47fefa95\\\",\\\"systemUUID\\\":\\\"4099aa14-e8f7-4bff-9f81-40284b959bbe\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:12Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.477234 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.477305 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.477321 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.477349 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.477370 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:12Z","lastTransitionTime":"2025-10-08T18:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:12 crc kubenswrapper[4750]: E1008 18:12:12.497792 4750 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:12:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:12:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:12:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:12:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:12:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:12:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:12:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:12:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bfc00a00-da5b-4621-a04e-e20b47fefa95\\\",\\\"systemUUID\\\":\\\"4099aa14-e8f7-4bff-9f81-40284b959bbe\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:12Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.502324 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.502384 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.502404 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.502433 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.502454 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:12Z","lastTransitionTime":"2025-10-08T18:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:12 crc kubenswrapper[4750]: E1008 18:12:12.523464 4750 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:12:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:12:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:12:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:12:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:12:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:12:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T18:12:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T18:12:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bfc00a00-da5b-4621-a04e-e20b47fefa95\\\",\\\"systemUUID\\\":\\\"4099aa14-e8f7-4bff-9f81-40284b959bbe\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:12Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:12 crc kubenswrapper[4750]: E1008 18:12:12.523623 4750 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.525448 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.525530 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.525562 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.525584 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.525599 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:12Z","lastTransitionTime":"2025-10-08T18:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.628851 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.628894 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.628905 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.628926 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.628938 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:12Z","lastTransitionTime":"2025-10-08T18:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.732106 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.732184 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.732206 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.732235 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.732257 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:12Z","lastTransitionTime":"2025-10-08T18:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.733322 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:12:12 crc kubenswrapper[4750]: E1008 18:12:12.733460 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.734518 4750 scope.go:117] "RemoveContainer" containerID="ff4be8fc1ad41cb1ad19143f14269fbe77825e622005ecab4d4731e0f3ed6cdd" Oct 08 18:12:12 crc kubenswrapper[4750]: E1008 18:12:12.734899 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-rl7f4_openshift-ovn-kubernetes(25d63a44-9fd7-4c19-8715-6ddec94d1806)\"" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.735046 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:12:12 crc kubenswrapper[4750]: E1008 18:12:12.735325 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.836130 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.836186 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.836199 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.836217 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.836230 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:12Z","lastTransitionTime":"2025-10-08T18:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.940500 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.940581 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.940596 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.940620 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:12 crc kubenswrapper[4750]: I1008 18:12:12.940639 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:12Z","lastTransitionTime":"2025-10-08T18:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.043335 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.043398 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.043414 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.043439 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.043452 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:13Z","lastTransitionTime":"2025-10-08T18:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.145767 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.145822 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.145839 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.145858 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.145867 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:13Z","lastTransitionTime":"2025-10-08T18:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.248388 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.248441 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.248455 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.248476 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.248494 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:13Z","lastTransitionTime":"2025-10-08T18:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.351618 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.351702 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.351719 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.351743 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.351762 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:13Z","lastTransitionTime":"2025-10-08T18:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.454995 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.455053 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.455067 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.455090 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.455106 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:13Z","lastTransitionTime":"2025-10-08T18:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.558737 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.558816 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.558842 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.558874 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.558892 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:13Z","lastTransitionTime":"2025-10-08T18:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.662077 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.662130 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.662150 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.662172 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.662189 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:13Z","lastTransitionTime":"2025-10-08T18:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.733938 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:12:13 crc kubenswrapper[4750]: E1008 18:12:13.734360 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.734838 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:12:13 crc kubenswrapper[4750]: E1008 18:12:13.734933 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.765521 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.765592 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.765603 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.765621 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.765632 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:13Z","lastTransitionTime":"2025-10-08T18:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.868308 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.868353 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.868366 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.868384 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.868398 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:13Z","lastTransitionTime":"2025-10-08T18:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.971093 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.971139 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.971149 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.971164 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:13 crc kubenswrapper[4750]: I1008 18:12:13.971173 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:13Z","lastTransitionTime":"2025-10-08T18:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.073841 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.073884 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.073893 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.073908 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.073916 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:14Z","lastTransitionTime":"2025-10-08T18:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.176596 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.176657 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.176671 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.176691 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.176704 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:14Z","lastTransitionTime":"2025-10-08T18:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.279649 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.279697 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.279709 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.279725 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.279737 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:14Z","lastTransitionTime":"2025-10-08T18:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.382621 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.382668 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.382679 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.382697 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.382710 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:14Z","lastTransitionTime":"2025-10-08T18:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.509057 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.509100 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.509116 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.509133 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.509144 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:14Z","lastTransitionTime":"2025-10-08T18:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.611017 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.611049 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.611057 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.611072 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.611082 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:14Z","lastTransitionTime":"2025-10-08T18:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.713857 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.713892 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.713901 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.713914 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.713923 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:14Z","lastTransitionTime":"2025-10-08T18:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.733514 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:12:14 crc kubenswrapper[4750]: E1008 18:12:14.736055 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.735632 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:12:14 crc kubenswrapper[4750]: E1008 18:12:14.736384 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.745112 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.748812 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39b4e601279d1f37ba24fbb347b7f36255512601fd47c31a3bc686f2385cd88f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://443e4db49275949020687bd8768b8a7745b5fdea4f5fb230d872b2a8ed9d6ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:14Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.757880 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6jln4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58294711-df24-43b8-b1b6-6617d55d49b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9a2cf606874110c681c7bbec359607eda7cabce0a44504ff21374021fa21a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78c7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6jln4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:14Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.767448 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83871b97-e7ad-4ee9-ab16-4e471c8dc3ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27de5d0a4e7203c8a6861925143be197b902b5c8d5031e26d1b732a31173c98b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d43e2deb74fdfedb40fdcc9b7f57f54f447c4f1493ece80186adb8e4624bbe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d72f137ccea9900279883c5b727b663c3ab23b09d2eab31abe8c6e4c536b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://155237e5ad4510cb1c106638bf4e87bb33102aed3005e5831f5f4232d0698ac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://155237e5ad4510cb1c106638bf4e87bb33102aed3005e5831f5f4232d0698ac8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:14Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.784866 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3096c4e9-1e62-46ee-8e9d-9d8f5c3d8f97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53389f30a75dc05f6b778b862fec384a0aefe7bd4b7884aad3af144bc8e5b87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7428f31e2b12470cc8d9fb98bb66e19ce665d028dae0b97adf318006043e2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77be59fa3ca71194c19f0050856d30f3b6f7122bd68e07535d0a8ca6cc299832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89305f32867a4d321e16ce84e9d4e34ec3bd7594db3700a4252ad0f874dd6d48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5029feed20c32b59a88eba1f7597a6689d99d93df97f2c5c94f4153de00170f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cac5e645240ce37b18a02c37268987b85de9a41a25f411686702fac1b6cef59\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ebfa8466a13d106411a51b77f8c40c43a6cb916bfb400cb412988c71a28bb96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a89116eeadff52f2aefbc2a5dc102fa62ea8c91dfda9549b46f0306843c8033b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:14Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.796847 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dbbc4357a7bebc6b495e34c1eac5bd8fb1a52a16783ce31c8f3e78614a2dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:14Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.807712 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:14Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.816543 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.816609 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.816624 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 
18:12:14.816645 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.816660 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:14Z","lastTransitionTime":"2025-10-08T18:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.818348 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:14Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.827017 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hdvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31a78d19-68f8-46b7-9c53-2bbda2930e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810ea3f875bc348e876d76c92a9ecb7e7541afb42bb26556051d021085c107c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84vv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hdvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:14Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.838189 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f40f5c0-574a-4dd1-a4dc-024eba0628e5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f90b8eb6984a4ced369d75be15a4980f41e7ca1bd3c79a64a79c974da9dc2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eecb6b3a56b80f2145ca258b9fc1473c42b44a4ded70f9ddc793868c3656191\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8961dba04ac6b2092efe4568b7f5d5afb5df6833c208d9484410ed6f8dcb13a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d9df9b0a29bcccaeb1dcbdff5c85068802f2e46f39f39a63d09fe1386996c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b4f554ff77b65e8ff5a3cb8221e1c238434da5d49d308ec6684fcb2d333384\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 18:10:58.297055 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 18:10:58.298595 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1100232334/tls.crt::/tmp/serving-cert-1100232334/tls.key\\\\\\\"\\\\nI1008 18:11:03.775470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 18:11:03.786183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 18:11:03.786202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 18:11:03.786224 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 18:11:03.786230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 18:11:03.794993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1008 18:11:03.795026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW1008 18:11:03.795033 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1008 18:11:03.795023 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 18:11:03.795038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 18:11:03.795061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 18:11:03.795066 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 18:11:03.795070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 18:11:03.797195 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ece7dde88eaa94bbf9401909374a83d5841b24deb3f494aa3c709d055f6997e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\
"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1195913aaeeafabe32e6a4fc6b19a5ab801181598dacbe14a0ff138c71330c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:10:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:14Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.851591 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b53212dc-64c8-4b45-b425-b0bb2df4be9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:10:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535706571aabece4fcb32be30ca7e8e4732194bfe3511685909161fa0c06b1c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55916d7bc976fac242d0c48e65a606ba0f642461b47262b4b0d46076163810d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a005895ee7ff0fde6900a7b6f6b8928aad001f1aaa7ce650d77c11ce7acc72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba40cf6292b171161b700726beb7b9546e2a239de4076618f1eacf17b75631fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T18:10:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:10:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:14Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.864225 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fd0b357c32562409dc3531c04e684de9fdd4bf05080368ff3537aef9324094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T18:12:14Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.875482 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f652a41fbc65313e94232c1b72bd7ccc0f53d7c510287ac9619ab713f770fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b75c94360320f8907419e9d7b9a237989702c6c66dd2e87832e18c2c4570c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9c8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-grddb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:14Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.884454 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mhhrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b11eea9-c866-4055-abed-9955637179b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7fa876044092370bd4f89fae3df8cd06b7d27515a90ecc0dd4393c870854253\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e7f95a79c34d04ab7881b283c072184f575
623df24dadf0b1bc71c05bbcb92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mhhrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:14Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.893405 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7f9jd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b67ae9d5-e575-45e7-913a-01f379b86416\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhxt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhxt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7f9jd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:14Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:14 crc 
kubenswrapper[4750]: I1008 18:12:14.903258 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:14Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.913205 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzb5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da637e4e5c4e6a32490a03baffc399a92670a4d4e08bf3f7f23f23ab4aed5c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc5b55dd8e3131ba21f9cb51433c01355475968171e092627964cc5a56fabb59\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T18:11:51Z\\\",\\\"message\\\":\\\"2025-10-08T18:11:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_61ae17c4-6c37-4b05-b48a-241134d993dd\\\\n2025-10-08T18:11:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_61ae17c4-6c37-4b05-b48a-241134d993dd to /host/opt/cni/bin/\\\\n2025-10-08T18:11:06Z [verbose] multus-daemon started\\\\n2025-10-08T18:11:06Z [verbose] 
Readiness Indicator file check\\\\n2025-10-08T18:11:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzb5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:14Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.919001 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.919047 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.919059 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.919075 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.919086 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:14Z","lastTransitionTime":"2025-10-08T18:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.925063 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2x8kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9745a747-29eb-473f-bdb1-b526e1fe1445\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4026e6054000f0175162939f647586ff94806c727b2b80a9437767275bd819d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d259615ac5009ca4f94267c606fbcca0fead3822b772887c8eac89fab5a9eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://a5a7f0efc5fab10ad33f6b4ed453fd11edfeff71f3f707ccce0afa406665941a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad7d57aa58697625be184919b6be2748d056b03f143911e2f97f3e4954f48e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://341954d6a4434479c303da195eec73cbf7bdbfc95a455310eef47deaf8300911\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c6fd6c4cd2ebe1bf15474f03711666ba641fd7d86baaf8c37f26698baa1e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbeecda4a86ddd0ca1d7a26c7a3914a3bec2251d56138737a1bec981ce67ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q4hmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2x8kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:14Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:14 crc kubenswrapper[4750]: I1008 18:12:14.942502 4750 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25d63a44-9fd7-4c19-8715-6ddec94d1806\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T18:11:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff4be8fc1ad41cb1ad19143f14269fbe77825e622005ecab4d4731e0f3ed6cdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff4be8fc1ad41cb1ad19143f14269fbe77825e622005ecab4d4731e0f3ed6cdd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T18:11:59Z\\\",\\\"message\\\":\\\".go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1008 18:11:59.508256 6728 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1008 18:11:59.508322 6728 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1008 18:11:59.508375 6728 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1008 18:11:59.508384 6728 
handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1008 18:11:59.508450 6728 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1008 18:11:59.508505 6728 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1008 18:11:59.508536 6728 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1008 18:11:59.508540 6728 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1008 18:11:59.509226 6728 handler.go:208] Removed *v1.Node event handler 2\\\\nI1008 18:11:59.509268 6728 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1008 18:11:59.509317 6728 factory.go:656] Stopping watch factory\\\\nI1008 18:11:59.509334 6728 ovnkube.go:599] Stopped ovnkube\\\\nI1008 18:11:59.509373 6728 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1008 18:11:59.509380 6728 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1008 18:11:59.509395 6728 handler.go:208] Removed *v1.Node event handler 7\\\\nF1008 18:11:59.509448 6728 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rl7f4_openshift-ovn-kubernetes(25d63a44-9fd7-4c19-8715-6ddec94d1806)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T18:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56139024e410dd6e34
d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T18:11:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T18:11:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sxl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T18:11:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rl7f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T18:12:14Z is after 2025-08-24T17:21:41Z" Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.021306 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.021402 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.021413 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.021427 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.021436 4750 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:15Z","lastTransitionTime":"2025-10-08T18:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.123270 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.123299 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.123308 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.123321 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.123330 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:15Z","lastTransitionTime":"2025-10-08T18:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.225302 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.225329 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.225338 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.225352 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.225360 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:15Z","lastTransitionTime":"2025-10-08T18:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.327575 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.327634 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.327644 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.327657 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.327667 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:15Z","lastTransitionTime":"2025-10-08T18:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.429658 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.429695 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.429703 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.429716 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.429726 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:15Z","lastTransitionTime":"2025-10-08T18:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.532191 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.532237 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.532249 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.532264 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.532274 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:15Z","lastTransitionTime":"2025-10-08T18:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.635095 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.635135 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.635143 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.635155 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.635163 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:15Z","lastTransitionTime":"2025-10-08T18:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.734049 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.734067 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:12:15 crc kubenswrapper[4750]: E1008 18:12:15.734155 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:12:15 crc kubenswrapper[4750]: E1008 18:12:15.734244 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.737700 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.737756 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.737767 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.737781 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.737790 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:15Z","lastTransitionTime":"2025-10-08T18:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.839763 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.839808 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.839822 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.839839 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.839849 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:15Z","lastTransitionTime":"2025-10-08T18:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.942729 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.942843 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.942863 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.942887 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:15 crc kubenswrapper[4750]: I1008 18:12:15.942905 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:15Z","lastTransitionTime":"2025-10-08T18:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.045125 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.045164 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.045178 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.045198 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.045214 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:16Z","lastTransitionTime":"2025-10-08T18:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.147534 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.147621 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.147637 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.147658 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.147673 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:16Z","lastTransitionTime":"2025-10-08T18:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.249922 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.249994 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.250002 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.250015 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.250024 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:16Z","lastTransitionTime":"2025-10-08T18:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.351949 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.351984 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.351993 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.352006 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.352016 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:16Z","lastTransitionTime":"2025-10-08T18:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.454421 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.454472 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.454482 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.454498 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.454507 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:16Z","lastTransitionTime":"2025-10-08T18:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.557830 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.557870 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.557887 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.557902 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.557912 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:16Z","lastTransitionTime":"2025-10-08T18:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.659864 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.659904 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.659917 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.659933 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.659944 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:16Z","lastTransitionTime":"2025-10-08T18:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.734094 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.734121 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:12:16 crc kubenswrapper[4750]: E1008 18:12:16.734210 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:12:16 crc kubenswrapper[4750]: E1008 18:12:16.734330 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.762611 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.762651 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.762660 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.762674 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.762697 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:16Z","lastTransitionTime":"2025-10-08T18:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.865287 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.865319 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.865327 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.865340 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.865348 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:16Z","lastTransitionTime":"2025-10-08T18:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.968189 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.968224 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.968234 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.968248 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:16 crc kubenswrapper[4750]: I1008 18:12:16.968259 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:16Z","lastTransitionTime":"2025-10-08T18:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.070468 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.070508 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.070523 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.070542 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.070575 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:17Z","lastTransitionTime":"2025-10-08T18:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.173287 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.173331 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.173343 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.173359 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.173372 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:17Z","lastTransitionTime":"2025-10-08T18:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.275383 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.275430 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.275472 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.275485 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.275493 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:17Z","lastTransitionTime":"2025-10-08T18:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.378136 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.378185 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.378194 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.378205 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.378213 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:17Z","lastTransitionTime":"2025-10-08T18:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.480857 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.480898 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.480914 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.480982 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.481001 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:17Z","lastTransitionTime":"2025-10-08T18:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.583152 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.583184 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.583192 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.583205 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.583213 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:17Z","lastTransitionTime":"2025-10-08T18:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.685540 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.685590 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.685600 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.685640 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.685654 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:17Z","lastTransitionTime":"2025-10-08T18:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.733807 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.733880 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:12:17 crc kubenswrapper[4750]: E1008 18:12:17.733944 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:12:17 crc kubenswrapper[4750]: E1008 18:12:17.734111 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.787780 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.787808 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.787816 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.787829 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.787836 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:17Z","lastTransitionTime":"2025-10-08T18:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.890371 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.890408 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.890417 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.890432 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.890443 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:17Z","lastTransitionTime":"2025-10-08T18:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.992804 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.992842 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.992849 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.992864 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:17 crc kubenswrapper[4750]: I1008 18:12:17.992875 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:17Z","lastTransitionTime":"2025-10-08T18:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:18 crc kubenswrapper[4750]: I1008 18:12:18.095236 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:18 crc kubenswrapper[4750]: I1008 18:12:18.095285 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:18 crc kubenswrapper[4750]: I1008 18:12:18.095296 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:18 crc kubenswrapper[4750]: I1008 18:12:18.095312 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:18 crc kubenswrapper[4750]: I1008 18:12:18.095322 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:18Z","lastTransitionTime":"2025-10-08T18:12:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:18 crc kubenswrapper[4750]: I1008 18:12:18.197584 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:18 crc kubenswrapper[4750]: I1008 18:12:18.197639 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:18 crc kubenswrapper[4750]: I1008 18:12:18.197653 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:18 crc kubenswrapper[4750]: I1008 18:12:18.197671 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:18 crc kubenswrapper[4750]: I1008 18:12:18.197685 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:18Z","lastTransitionTime":"2025-10-08T18:12:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:18 crc kubenswrapper[4750]: I1008 18:12:18.300499 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:18 crc kubenswrapper[4750]: I1008 18:12:18.300542 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:18 crc kubenswrapper[4750]: I1008 18:12:18.300569 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:18 crc kubenswrapper[4750]: I1008 18:12:18.300584 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:18 crc kubenswrapper[4750]: I1008 18:12:18.300593 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:18Z","lastTransitionTime":"2025-10-08T18:12:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:18 crc kubenswrapper[4750]: I1008 18:12:18.402392 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:18 crc kubenswrapper[4750]: I1008 18:12:18.402436 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:18 crc kubenswrapper[4750]: I1008 18:12:18.402448 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:18 crc kubenswrapper[4750]: I1008 18:12:18.402465 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:18 crc kubenswrapper[4750]: I1008 18:12:18.402477 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:18Z","lastTransitionTime":"2025-10-08T18:12:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:18 crc kubenswrapper[4750]: I1008 18:12:18.504705 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:18 crc kubenswrapper[4750]: I1008 18:12:18.504762 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:18 crc kubenswrapper[4750]: I1008 18:12:18.504775 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:18 crc kubenswrapper[4750]: I1008 18:12:18.504794 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:18 crc kubenswrapper[4750]: I1008 18:12:18.504806 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:18Z","lastTransitionTime":"2025-10-08T18:12:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:18 crc kubenswrapper[4750]: I1008 18:12:18.606825 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:18 crc kubenswrapper[4750]: I1008 18:12:18.606859 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:18 crc kubenswrapper[4750]: I1008 18:12:18.606871 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:18 crc kubenswrapper[4750]: I1008 18:12:18.606887 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:18 crc kubenswrapper[4750]: I1008 18:12:18.606898 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:18Z","lastTransitionTime":"2025-10-08T18:12:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:18 crc kubenswrapper[4750]: I1008 18:12:18.709504 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:18 crc kubenswrapper[4750]: I1008 18:12:18.709601 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:18 crc kubenswrapper[4750]: I1008 18:12:18.709617 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:18 crc kubenswrapper[4750]: I1008 18:12:18.709642 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:18 crc kubenswrapper[4750]: I1008 18:12:18.709657 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:18Z","lastTransitionTime":"2025-10-08T18:12:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:12:18 crc kubenswrapper[4750]: I1008 18:12:18.733710 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:12:18 crc kubenswrapper[4750]: I1008 18:12:18.733809 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:12:18 crc kubenswrapper[4750]: E1008 18:12:18.733899 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:12:18 crc kubenswrapper[4750]: E1008 18:12:18.733939 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:12:18 crc kubenswrapper[4750]: I1008 18:12:18.812393 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:18 crc kubenswrapper[4750]: I1008 18:12:18.812432 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:18 crc kubenswrapper[4750]: I1008 18:12:18.812440 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:18 crc kubenswrapper[4750]: I1008 18:12:18.812454 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:18 crc kubenswrapper[4750]: I1008 18:12:18.812469 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:18Z","lastTransitionTime":"2025-10-08T18:12:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:18 crc kubenswrapper[4750]: I1008 18:12:18.914821 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:18 crc kubenswrapper[4750]: I1008 18:12:18.914906 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:18 crc kubenswrapper[4750]: I1008 18:12:18.914919 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:18 crc kubenswrapper[4750]: I1008 18:12:18.914935 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:18 crc kubenswrapper[4750]: I1008 18:12:18.914946 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:18Z","lastTransitionTime":"2025-10-08T18:12:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.017500 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.017577 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.017589 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.017632 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.017645 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:19Z","lastTransitionTime":"2025-10-08T18:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.120431 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.120472 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.120480 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.120495 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.120504 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:19Z","lastTransitionTime":"2025-10-08T18:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.222519 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.222574 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.222585 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.222600 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.222612 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:19Z","lastTransitionTime":"2025-10-08T18:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.324586 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.324624 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.324636 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.324651 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.324661 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:19Z","lastTransitionTime":"2025-10-08T18:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.427051 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.427091 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.427105 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.427119 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.427130 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:19Z","lastTransitionTime":"2025-10-08T18:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.529526 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.529599 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.529610 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.529625 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.529635 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:19Z","lastTransitionTime":"2025-10-08T18:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.631872 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.631902 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.631909 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.631921 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.631930 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:19Z","lastTransitionTime":"2025-10-08T18:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.733275 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.733330 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:12:19 crc kubenswrapper[4750]: E1008 18:12:19.733396 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:12:19 crc kubenswrapper[4750]: E1008 18:12:19.733595 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.734260 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.734278 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.734286 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.734298 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.734306 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:19Z","lastTransitionTime":"2025-10-08T18:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.836145 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.836186 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.836205 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.836230 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.836243 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:19Z","lastTransitionTime":"2025-10-08T18:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.938421 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.938474 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.938509 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.938530 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:19 crc kubenswrapper[4750]: I1008 18:12:19.938542 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:19Z","lastTransitionTime":"2025-10-08T18:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.041084 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.041117 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.041126 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.041138 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.041191 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:20Z","lastTransitionTime":"2025-10-08T18:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.143229 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.143293 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.143307 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.143324 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.143335 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:20Z","lastTransitionTime":"2025-10-08T18:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.245390 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.245432 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.245444 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.245462 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.245474 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:20Z","lastTransitionTime":"2025-10-08T18:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.348015 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.348053 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.348060 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.348074 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.348083 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:20Z","lastTransitionTime":"2025-10-08T18:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.450245 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.450287 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.450298 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.450312 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.450322 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:20Z","lastTransitionTime":"2025-10-08T18:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.552588 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.552615 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.552623 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.552636 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.552644 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:20Z","lastTransitionTime":"2025-10-08T18:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.654440 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.654505 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.654513 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.654526 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.654534 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:20Z","lastTransitionTime":"2025-10-08T18:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.733282 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.733282 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:12:20 crc kubenswrapper[4750]: E1008 18:12:20.733489 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:12:20 crc kubenswrapper[4750]: E1008 18:12:20.733414 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.756838 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.756879 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.756893 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.756908 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.756917 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:20Z","lastTransitionTime":"2025-10-08T18:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.859106 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.859742 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.859755 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.859991 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.860003 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:20Z","lastTransitionTime":"2025-10-08T18:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.962915 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.962966 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.962978 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.962994 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:20 crc kubenswrapper[4750]: I1008 18:12:20.963008 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:20Z","lastTransitionTime":"2025-10-08T18:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.064754 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.064796 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.064806 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.064823 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.064834 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:21Z","lastTransitionTime":"2025-10-08T18:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.167569 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.167597 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.167605 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.167635 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.167645 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:21Z","lastTransitionTime":"2025-10-08T18:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.269859 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.269894 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.269905 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.269938 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.269949 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:21Z","lastTransitionTime":"2025-10-08T18:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.372678 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.372762 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.372776 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.372793 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.372829 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:21Z","lastTransitionTime":"2025-10-08T18:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.474956 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.475308 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.475475 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.476037 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.476249 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:21Z","lastTransitionTime":"2025-10-08T18:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.579235 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.579317 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.579338 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.579367 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.579397 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:21Z","lastTransitionTime":"2025-10-08T18:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.681632 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.681984 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.682159 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.682343 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.682473 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:21Z","lastTransitionTime":"2025-10-08T18:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.733987 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.733987 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:12:21 crc kubenswrapper[4750]: E1008 18:12:21.734400 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:12:21 crc kubenswrapper[4750]: E1008 18:12:21.734321 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.786453 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.786789 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.786973 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.787160 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.787283 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:21Z","lastTransitionTime":"2025-10-08T18:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.890879 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.891680 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.891774 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.891800 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.891816 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:21Z","lastTransitionTime":"2025-10-08T18:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.993963 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.994081 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.994101 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.994143 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:21 crc kubenswrapper[4750]: I1008 18:12:21.994159 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:21Z","lastTransitionTime":"2025-10-08T18:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.088598 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b67ae9d5-e575-45e7-913a-01f379b86416-metrics-certs\") pod \"network-metrics-daemon-7f9jd\" (UID: \"b67ae9d5-e575-45e7-913a-01f379b86416\") " pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:12:22 crc kubenswrapper[4750]: E1008 18:12:22.088789 4750 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 18:12:22 crc kubenswrapper[4750]: E1008 18:12:22.088867 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b67ae9d5-e575-45e7-913a-01f379b86416-metrics-certs podName:b67ae9d5-e575-45e7-913a-01f379b86416 nodeName:}" failed. No retries permitted until 2025-10-08 18:13:26.088843396 +0000 UTC m=+162.001814449 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b67ae9d5-e575-45e7-913a-01f379b86416-metrics-certs") pod "network-metrics-daemon-7f9jd" (UID: "b67ae9d5-e575-45e7-913a-01f379b86416") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.096502 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.096540 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.096564 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.096579 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.096591 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:22Z","lastTransitionTime":"2025-10-08T18:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.199068 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.199126 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.199143 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.199160 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.199196 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:22Z","lastTransitionTime":"2025-10-08T18:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.301538 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.301599 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.301607 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.301621 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.301631 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:22Z","lastTransitionTime":"2025-10-08T18:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.404421 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.404463 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.404474 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.404518 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.404534 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:22Z","lastTransitionTime":"2025-10-08T18:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.507261 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.507305 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.507313 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.507328 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.507338 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:22Z","lastTransitionTime":"2025-10-08T18:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.609600 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.609630 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.609640 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.609652 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.609661 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:22Z","lastTransitionTime":"2025-10-08T18:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.712709 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.712759 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.712771 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.712787 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.712799 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:22Z","lastTransitionTime":"2025-10-08T18:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.734208 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.734217 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:12:22 crc kubenswrapper[4750]: E1008 18:12:22.734493 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:12:22 crc kubenswrapper[4750]: E1008 18:12:22.734790 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.815747 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.815820 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.815843 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.815873 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.815894 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:22Z","lastTransitionTime":"2025-10-08T18:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.817418 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.817476 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.817496 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.817520 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.817539 4750 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T18:12:22Z","lastTransitionTime":"2025-10-08T18:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.861869 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-gdh4g"] Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.862306 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gdh4g" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.865247 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.865268 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.865355 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.865408 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.883684 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-2x8kt" podStartSLOduration=78.883659942 podStartE2EDuration="1m18.883659942s" podCreationTimestamp="2025-10-08 18:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:12:22.883474637 +0000 UTC m=+98.796445640" watchObservedRunningTime="2025-10-08 18:12:22.883659942 +0000 UTC m=+98.796631025" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.918849 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-hdvcg" podStartSLOduration=79.918829161 podStartE2EDuration="1m19.918829161s" podCreationTimestamp="2025-10-08 18:11:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:12:22.918648895 +0000 UTC m=+98.831619918" watchObservedRunningTime="2025-10-08 18:12:22.918829161 +0000 UTC m=+98.831800174" 
Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.949581 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=51.949533161 podStartE2EDuration="51.949533161s" podCreationTimestamp="2025-10-08 18:11:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:12:22.949270704 +0000 UTC m=+98.862241727" watchObservedRunningTime="2025-10-08 18:12:22.949533161 +0000 UTC m=+98.862504174" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.949719 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-6jln4" podStartSLOduration=79.949713736 podStartE2EDuration="1m19.949713736s" podCreationTimestamp="2025-10-08 18:11:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:12:22.939484009 +0000 UTC m=+98.852455032" watchObservedRunningTime="2025-10-08 18:12:22.949713736 +0000 UTC m=+98.862684749" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.989751 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=75.9897259 podStartE2EDuration="1m15.9897259s" podCreationTimestamp="2025-10-08 18:11:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:12:22.974730348 +0000 UTC m=+98.887701451" watchObservedRunningTime="2025-10-08 18:12:22.9897259 +0000 UTC m=+98.902696953" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.998096 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c0507763-f3b8-4fd6-b2e4-3d89fa91f811-service-ca\") pod 
\"cluster-version-operator-5c965bbfc6-gdh4g\" (UID: \"c0507763-f3b8-4fd6-b2e4-3d89fa91f811\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gdh4g" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.998173 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/c0507763-f3b8-4fd6-b2e4-3d89fa91f811-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-gdh4g\" (UID: \"c0507763-f3b8-4fd6-b2e4-3d89fa91f811\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gdh4g" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.998204 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0507763-f3b8-4fd6-b2e4-3d89fa91f811-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-gdh4g\" (UID: \"c0507763-f3b8-4fd6-b2e4-3d89fa91f811\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gdh4g" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.998337 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c0507763-f3b8-4fd6-b2e4-3d89fa91f811-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-gdh4g\" (UID: \"c0507763-f3b8-4fd6-b2e4-3d89fa91f811\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gdh4g" Oct 08 18:12:22 crc kubenswrapper[4750]: I1008 18:12:22.998441 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c0507763-f3b8-4fd6-b2e4-3d89fa91f811-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-gdh4g\" (UID: \"c0507763-f3b8-4fd6-b2e4-3d89fa91f811\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gdh4g" Oct 08 18:12:23 crc 
kubenswrapper[4750]: I1008 18:12:23.055392 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=79.055372923 podStartE2EDuration="1m19.055372923s" podCreationTimestamp="2025-10-08 18:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:12:23.054963812 +0000 UTC m=+98.967934825" watchObservedRunningTime="2025-10-08 18:12:23.055372923 +0000 UTC m=+98.968343956" Oct 08 18:12:23 crc kubenswrapper[4750]: I1008 18:12:23.090673 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=79.090658603 podStartE2EDuration="1m19.090658603s" podCreationTimestamp="2025-10-08 18:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:12:23.0778982 +0000 UTC m=+98.990869213" watchObservedRunningTime="2025-10-08 18:12:23.090658603 +0000 UTC m=+99.003629616" Oct 08 18:12:23 crc kubenswrapper[4750]: I1008 18:12:23.099718 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c0507763-f3b8-4fd6-b2e4-3d89fa91f811-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-gdh4g\" (UID: \"c0507763-f3b8-4fd6-b2e4-3d89fa91f811\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gdh4g" Oct 08 18:12:23 crc kubenswrapper[4750]: I1008 18:12:23.099868 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c0507763-f3b8-4fd6-b2e4-3d89fa91f811-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-gdh4g\" (UID: \"c0507763-f3b8-4fd6-b2e4-3d89fa91f811\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gdh4g" Oct 08 18:12:23 crc kubenswrapper[4750]: I1008 18:12:23.099894 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c0507763-f3b8-4fd6-b2e4-3d89fa91f811-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-gdh4g\" (UID: \"c0507763-f3b8-4fd6-b2e4-3d89fa91f811\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gdh4g" Oct 08 18:12:23 crc kubenswrapper[4750]: I1008 18:12:23.099974 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c0507763-f3b8-4fd6-b2e4-3d89fa91f811-service-ca\") pod \"cluster-version-operator-5c965bbfc6-gdh4g\" (UID: \"c0507763-f3b8-4fd6-b2e4-3d89fa91f811\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gdh4g" Oct 08 18:12:23 crc kubenswrapper[4750]: I1008 18:12:23.100033 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/c0507763-f3b8-4fd6-b2e4-3d89fa91f811-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-gdh4g\" (UID: \"c0507763-f3b8-4fd6-b2e4-3d89fa91f811\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gdh4g" Oct 08 18:12:23 crc kubenswrapper[4750]: I1008 18:12:23.100054 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0507763-f3b8-4fd6-b2e4-3d89fa91f811-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-gdh4g\" (UID: \"c0507763-f3b8-4fd6-b2e4-3d89fa91f811\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gdh4g" Oct 08 18:12:23 crc kubenswrapper[4750]: I1008 18:12:23.100389 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/c0507763-f3b8-4fd6-b2e4-3d89fa91f811-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-gdh4g\" (UID: \"c0507763-f3b8-4fd6-b2e4-3d89fa91f811\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gdh4g" Oct 08 18:12:23 crc kubenswrapper[4750]: I1008 18:12:23.101149 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c0507763-f3b8-4fd6-b2e4-3d89fa91f811-service-ca\") pod \"cluster-version-operator-5c965bbfc6-gdh4g\" (UID: \"c0507763-f3b8-4fd6-b2e4-3d89fa91f811\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gdh4g" Oct 08 18:12:23 crc kubenswrapper[4750]: I1008 18:12:23.112925 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0507763-f3b8-4fd6-b2e4-3d89fa91f811-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-gdh4g\" (UID: \"c0507763-f3b8-4fd6-b2e4-3d89fa91f811\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gdh4g" Oct 08 18:12:23 crc kubenswrapper[4750]: I1008 18:12:23.116898 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podStartSLOduration=80.116879867 podStartE2EDuration="1m20.116879867s" podCreationTimestamp="2025-10-08 18:11:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:12:23.103788346 +0000 UTC m=+99.016759379" watchObservedRunningTime="2025-10-08 18:12:23.116879867 +0000 UTC m=+99.029850880" Oct 08 18:12:23 crc kubenswrapper[4750]: I1008 18:12:23.118098 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c0507763-f3b8-4fd6-b2e4-3d89fa91f811-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-gdh4g\" (UID: 
\"c0507763-f3b8-4fd6-b2e4-3d89fa91f811\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gdh4g" Oct 08 18:12:23 crc kubenswrapper[4750]: I1008 18:12:23.130972 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mhhrt" podStartSLOduration=79.130957385 podStartE2EDuration="1m19.130957385s" podCreationTimestamp="2025-10-08 18:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:12:23.117459962 +0000 UTC m=+99.030430975" watchObservedRunningTime="2025-10-08 18:12:23.130957385 +0000 UTC m=+99.043928398" Oct 08 18:12:23 crc kubenswrapper[4750]: I1008 18:12:23.131093 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=9.131090078 podStartE2EDuration="9.131090078s" podCreationTimestamp="2025-10-08 18:12:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:12:23.129417185 +0000 UTC m=+99.042388198" watchObservedRunningTime="2025-10-08 18:12:23.131090078 +0000 UTC m=+99.044061091" Oct 08 18:12:23 crc kubenswrapper[4750]: I1008 18:12:23.167044 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-mzb5c" podStartSLOduration=79.167024866 podStartE2EDuration="1m19.167024866s" podCreationTimestamp="2025-10-08 18:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:12:23.155840384 +0000 UTC m=+99.068811417" watchObservedRunningTime="2025-10-08 18:12:23.167024866 +0000 UTC m=+99.079995879" Oct 08 18:12:23 crc kubenswrapper[4750]: I1008 18:12:23.177689 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gdh4g" Oct 08 18:12:23 crc kubenswrapper[4750]: W1008 18:12:23.189314 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0507763_f3b8_4fd6_b2e4_3d89fa91f811.slice/crio-355cdb5b2c68e0896139962ac54a10bd4f4e35c5e0cd71debcdca2073a2ba162 WatchSource:0}: Error finding container 355cdb5b2c68e0896139962ac54a10bd4f4e35c5e0cd71debcdca2073a2ba162: Status 404 returned error can't find the container with id 355cdb5b2c68e0896139962ac54a10bd4f4e35c5e0cd71debcdca2073a2ba162 Oct 08 18:12:23 crc kubenswrapper[4750]: I1008 18:12:23.230111 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gdh4g" event={"ID":"c0507763-f3b8-4fd6-b2e4-3d89fa91f811","Type":"ContainerStarted","Data":"355cdb5b2c68e0896139962ac54a10bd4f4e35c5e0cd71debcdca2073a2ba162"} Oct 08 18:12:23 crc kubenswrapper[4750]: I1008 18:12:23.734091 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:12:23 crc kubenswrapper[4750]: I1008 18:12:23.734108 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:12:23 crc kubenswrapper[4750]: E1008 18:12:23.734192 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:12:23 crc kubenswrapper[4750]: E1008 18:12:23.734252 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:12:24 crc kubenswrapper[4750]: I1008 18:12:24.233746 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gdh4g" event={"ID":"c0507763-f3b8-4fd6-b2e4-3d89fa91f811","Type":"ContainerStarted","Data":"197c9eca8536985f53ca03c40b9077f58a5d9cd9b2727b11a49e407f9716aa0b"} Oct 08 18:12:24 crc kubenswrapper[4750]: I1008 18:12:24.246526 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gdh4g" podStartSLOduration=80.24650995 podStartE2EDuration="1m20.24650995s" podCreationTimestamp="2025-10-08 18:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:12:24.245620057 +0000 UTC m=+100.158591080" watchObservedRunningTime="2025-10-08 18:12:24.24650995 +0000 UTC m=+100.159480963" Oct 08 18:12:24 crc kubenswrapper[4750]: I1008 18:12:24.734211 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:12:24 crc kubenswrapper[4750]: I1008 18:12:24.734247 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:12:24 crc kubenswrapper[4750]: E1008 18:12:24.735102 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:12:24 crc kubenswrapper[4750]: E1008 18:12:24.735414 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:12:25 crc kubenswrapper[4750]: I1008 18:12:25.733853 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:12:25 crc kubenswrapper[4750]: I1008 18:12:25.733853 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:12:25 crc kubenswrapper[4750]: E1008 18:12:25.734152 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:12:25 crc kubenswrapper[4750]: E1008 18:12:25.734257 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:12:26 crc kubenswrapper[4750]: I1008 18:12:26.733924 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:12:26 crc kubenswrapper[4750]: I1008 18:12:26.734055 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:12:26 crc kubenswrapper[4750]: E1008 18:12:26.734501 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:12:26 crc kubenswrapper[4750]: E1008 18:12:26.734684 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:12:26 crc kubenswrapper[4750]: I1008 18:12:26.734951 4750 scope.go:117] "RemoveContainer" containerID="ff4be8fc1ad41cb1ad19143f14269fbe77825e622005ecab4d4731e0f3ed6cdd" Oct 08 18:12:26 crc kubenswrapper[4750]: E1008 18:12:26.735233 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-rl7f4_openshift-ovn-kubernetes(25d63a44-9fd7-4c19-8715-6ddec94d1806)\"" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" Oct 08 18:12:27 crc kubenswrapper[4750]: I1008 18:12:27.733150 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:12:27 crc kubenswrapper[4750]: I1008 18:12:27.733159 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:12:27 crc kubenswrapper[4750]: E1008 18:12:27.733304 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:12:27 crc kubenswrapper[4750]: E1008 18:12:27.733346 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:12:28 crc kubenswrapper[4750]: I1008 18:12:28.733929 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:12:28 crc kubenswrapper[4750]: E1008 18:12:28.734065 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:12:28 crc kubenswrapper[4750]: I1008 18:12:28.734338 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:12:28 crc kubenswrapper[4750]: E1008 18:12:28.734483 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:12:29 crc kubenswrapper[4750]: I1008 18:12:29.734050 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:12:29 crc kubenswrapper[4750]: I1008 18:12:29.734162 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:12:29 crc kubenswrapper[4750]: E1008 18:12:29.734290 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:12:29 crc kubenswrapper[4750]: E1008 18:12:29.734483 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:12:30 crc kubenswrapper[4750]: I1008 18:12:30.733809 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:12:30 crc kubenswrapper[4750]: I1008 18:12:30.733849 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:12:30 crc kubenswrapper[4750]: E1008 18:12:30.733943 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:12:30 crc kubenswrapper[4750]: E1008 18:12:30.734201 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:12:31 crc kubenswrapper[4750]: I1008 18:12:31.733283 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:12:31 crc kubenswrapper[4750]: I1008 18:12:31.733283 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:12:31 crc kubenswrapper[4750]: E1008 18:12:31.733424 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:12:31 crc kubenswrapper[4750]: E1008 18:12:31.733546 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:12:32 crc kubenswrapper[4750]: I1008 18:12:32.734144 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:12:32 crc kubenswrapper[4750]: I1008 18:12:32.734152 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:12:32 crc kubenswrapper[4750]: E1008 18:12:32.734330 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:12:32 crc kubenswrapper[4750]: E1008 18:12:32.734393 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:12:33 crc kubenswrapper[4750]: I1008 18:12:33.733950 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:12:33 crc kubenswrapper[4750]: I1008 18:12:33.733998 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:12:33 crc kubenswrapper[4750]: E1008 18:12:33.734038 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:12:33 crc kubenswrapper[4750]: E1008 18:12:33.734103 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:12:34 crc kubenswrapper[4750]: I1008 18:12:34.733595 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:12:34 crc kubenswrapper[4750]: E1008 18:12:34.734535 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:12:34 crc kubenswrapper[4750]: I1008 18:12:34.734697 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:12:34 crc kubenswrapper[4750]: E1008 18:12:34.734927 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:12:35 crc kubenswrapper[4750]: I1008 18:12:35.734009 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:12:35 crc kubenswrapper[4750]: I1008 18:12:35.734039 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:12:35 crc kubenswrapper[4750]: E1008 18:12:35.734258 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:12:35 crc kubenswrapper[4750]: E1008 18:12:35.734577 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:12:36 crc kubenswrapper[4750]: I1008 18:12:36.733858 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:12:36 crc kubenswrapper[4750]: E1008 18:12:36.734443 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:12:36 crc kubenswrapper[4750]: I1008 18:12:36.734189 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:12:36 crc kubenswrapper[4750]: E1008 18:12:36.734640 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:12:37 crc kubenswrapper[4750]: I1008 18:12:37.733804 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:12:37 crc kubenswrapper[4750]: E1008 18:12:37.734167 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:12:37 crc kubenswrapper[4750]: I1008 18:12:37.733886 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:12:37 crc kubenswrapper[4750]: E1008 18:12:37.734372 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:12:38 crc kubenswrapper[4750]: I1008 18:12:38.278124 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mzb5c_cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444/kube-multus/1.log" Oct 08 18:12:38 crc kubenswrapper[4750]: I1008 18:12:38.278592 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mzb5c_cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444/kube-multus/0.log" Oct 08 18:12:38 crc kubenswrapper[4750]: I1008 18:12:38.278643 4750 generic.go:334] "Generic (PLEG): container finished" podID="cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444" containerID="da637e4e5c4e6a32490a03baffc399a92670a4d4e08bf3f7f23f23ab4aed5c1e" exitCode=1 Oct 08 18:12:38 crc kubenswrapper[4750]: I1008 18:12:38.278676 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mzb5c" event={"ID":"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444","Type":"ContainerDied","Data":"da637e4e5c4e6a32490a03baffc399a92670a4d4e08bf3f7f23f23ab4aed5c1e"} Oct 08 18:12:38 crc kubenswrapper[4750]: I1008 18:12:38.278712 4750 scope.go:117] "RemoveContainer" containerID="bc5b55dd8e3131ba21f9cb51433c01355475968171e092627964cc5a56fabb59" Oct 08 18:12:38 crc kubenswrapper[4750]: I1008 18:12:38.280432 4750 scope.go:117] "RemoveContainer" containerID="da637e4e5c4e6a32490a03baffc399a92670a4d4e08bf3f7f23f23ab4aed5c1e" Oct 08 18:12:38 crc kubenswrapper[4750]: E1008 18:12:38.280654 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-mzb5c_openshift-multus(cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444)\"" pod="openshift-multus/multus-mzb5c" podUID="cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444" Oct 08 18:12:38 crc kubenswrapper[4750]: I1008 18:12:38.733656 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:12:38 crc kubenswrapper[4750]: E1008 18:12:38.734102 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:12:38 crc kubenswrapper[4750]: I1008 18:12:38.733679 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:12:38 crc kubenswrapper[4750]: E1008 18:12:38.734467 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:12:39 crc kubenswrapper[4750]: I1008 18:12:39.283755 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mzb5c_cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444/kube-multus/1.log" Oct 08 18:12:39 crc kubenswrapper[4750]: I1008 18:12:39.733178 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:12:39 crc kubenswrapper[4750]: E1008 18:12:39.733318 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:12:39 crc kubenswrapper[4750]: I1008 18:12:39.733437 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:12:39 crc kubenswrapper[4750]: E1008 18:12:39.733716 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:12:40 crc kubenswrapper[4750]: I1008 18:12:40.733995 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:12:40 crc kubenswrapper[4750]: I1008 18:12:40.734038 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:12:40 crc kubenswrapper[4750]: E1008 18:12:40.734234 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:12:40 crc kubenswrapper[4750]: E1008 18:12:40.734477 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:12:41 crc kubenswrapper[4750]: I1008 18:12:41.733184 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:12:41 crc kubenswrapper[4750]: I1008 18:12:41.733213 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:12:41 crc kubenswrapper[4750]: E1008 18:12:41.733297 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:12:41 crc kubenswrapper[4750]: E1008 18:12:41.733733 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:12:41 crc kubenswrapper[4750]: I1008 18:12:41.734138 4750 scope.go:117] "RemoveContainer" containerID="ff4be8fc1ad41cb1ad19143f14269fbe77825e622005ecab4d4731e0f3ed6cdd" Oct 08 18:12:42 crc kubenswrapper[4750]: I1008 18:12:42.294590 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rl7f4_25d63a44-9fd7-4c19-8715-6ddec94d1806/ovnkube-controller/3.log" Oct 08 18:12:42 crc kubenswrapper[4750]: I1008 18:12:42.296903 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" event={"ID":"25d63a44-9fd7-4c19-8715-6ddec94d1806","Type":"ContainerStarted","Data":"8f19195c19ae150418ac7dee9a0af05a97794e1f8e3f7361478405c08cafad53"} Oct 08 18:12:42 crc kubenswrapper[4750]: I1008 18:12:42.297283 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:12:42 crc kubenswrapper[4750]: I1008 18:12:42.323449 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" podStartSLOduration=98.323429862 podStartE2EDuration="1m38.323429862s" podCreationTimestamp="2025-10-08 18:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:12:42.322442975 +0000 UTC m=+118.235413998" watchObservedRunningTime="2025-10-08 18:12:42.323429862 +0000 UTC m=+118.236400895" Oct 08 18:12:42 crc kubenswrapper[4750]: I1008 18:12:42.615962 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7f9jd"] Oct 08 18:12:42 crc kubenswrapper[4750]: I1008 18:12:42.616078 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:12:42 crc kubenswrapper[4750]: E1008 18:12:42.616175 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:12:42 crc kubenswrapper[4750]: I1008 18:12:42.762413 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:12:42 crc kubenswrapper[4750]: E1008 18:12:42.762752 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:12:42 crc kubenswrapper[4750]: I1008 18:12:42.762578 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:12:42 crc kubenswrapper[4750]: E1008 18:12:42.762827 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:12:43 crc kubenswrapper[4750]: I1008 18:12:43.733812 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:12:43 crc kubenswrapper[4750]: E1008 18:12:43.733952 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:12:44 crc kubenswrapper[4750]: E1008 18:12:44.707411 4750 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 08 18:12:44 crc kubenswrapper[4750]: I1008 18:12:44.734231 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:12:44 crc kubenswrapper[4750]: I1008 18:12:44.734341 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:12:44 crc kubenswrapper[4750]: I1008 18:12:44.734475 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:12:44 crc kubenswrapper[4750]: E1008 18:12:44.735255 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:12:44 crc kubenswrapper[4750]: E1008 18:12:44.735603 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:12:44 crc kubenswrapper[4750]: E1008 18:12:44.735679 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:12:44 crc kubenswrapper[4750]: E1008 18:12:44.860454 4750 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 08 18:12:45 crc kubenswrapper[4750]: I1008 18:12:45.733698 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:12:45 crc kubenswrapper[4750]: E1008 18:12:45.733832 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:12:46 crc kubenswrapper[4750]: I1008 18:12:46.733822 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:12:46 crc kubenswrapper[4750]: I1008 18:12:46.733904 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:12:46 crc kubenswrapper[4750]: I1008 18:12:46.734023 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:12:46 crc kubenswrapper[4750]: E1008 18:12:46.734015 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:12:46 crc kubenswrapper[4750]: E1008 18:12:46.734172 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:12:46 crc kubenswrapper[4750]: E1008 18:12:46.734339 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:12:47 crc kubenswrapper[4750]: I1008 18:12:47.733166 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:12:47 crc kubenswrapper[4750]: E1008 18:12:47.733350 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:12:48 crc kubenswrapper[4750]: I1008 18:12:48.733724 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:12:48 crc kubenswrapper[4750]: I1008 18:12:48.733798 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:12:48 crc kubenswrapper[4750]: I1008 18:12:48.733744 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:12:48 crc kubenswrapper[4750]: E1008 18:12:48.733891 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:12:48 crc kubenswrapper[4750]: E1008 18:12:48.733996 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:12:48 crc kubenswrapper[4750]: E1008 18:12:48.734115 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:12:49 crc kubenswrapper[4750]: I1008 18:12:49.733584 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:12:49 crc kubenswrapper[4750]: E1008 18:12:49.733709 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:12:49 crc kubenswrapper[4750]: E1008 18:12:49.861452 4750 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 08 18:12:50 crc kubenswrapper[4750]: I1008 18:12:50.733569 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:12:50 crc kubenswrapper[4750]: I1008 18:12:50.733606 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:12:50 crc kubenswrapper[4750]: I1008 18:12:50.733581 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:12:50 crc kubenswrapper[4750]: E1008 18:12:50.733696 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:12:50 crc kubenswrapper[4750]: E1008 18:12:50.733848 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:12:50 crc kubenswrapper[4750]: E1008 18:12:50.733873 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:12:51 crc kubenswrapper[4750]: I1008 18:12:51.734030 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:12:51 crc kubenswrapper[4750]: E1008 18:12:51.734323 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:12:52 crc kubenswrapper[4750]: I1008 18:12:52.733627 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:12:52 crc kubenswrapper[4750]: I1008 18:12:52.733736 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:12:52 crc kubenswrapper[4750]: I1008 18:12:52.733796 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:12:52 crc kubenswrapper[4750]: E1008 18:12:52.734296 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:12:52 crc kubenswrapper[4750]: E1008 18:12:52.734741 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:12:52 crc kubenswrapper[4750]: E1008 18:12:52.734833 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:12:53 crc kubenswrapper[4750]: I1008 18:12:53.733422 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:12:53 crc kubenswrapper[4750]: E1008 18:12:53.733723 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:12:53 crc kubenswrapper[4750]: I1008 18:12:53.733924 4750 scope.go:117] "RemoveContainer" containerID="da637e4e5c4e6a32490a03baffc399a92670a4d4e08bf3f7f23f23ab4aed5c1e" Oct 08 18:12:54 crc kubenswrapper[4750]: I1008 18:12:54.336864 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mzb5c_cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444/kube-multus/1.log" Oct 08 18:12:54 crc kubenswrapper[4750]: I1008 18:12:54.337137 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mzb5c" event={"ID":"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444","Type":"ContainerStarted","Data":"848a1d5ba28488cc14f38af4ab07698a61b006fdb10eea6b5fd3da909bf89bdc"} Oct 08 18:12:54 crc kubenswrapper[4750]: I1008 18:12:54.733506 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:12:54 crc kubenswrapper[4750]: I1008 18:12:54.733670 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:12:54 crc kubenswrapper[4750]: E1008 18:12:54.735667 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:12:54 crc kubenswrapper[4750]: I1008 18:12:54.735707 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:12:54 crc kubenswrapper[4750]: E1008 18:12:54.735829 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:12:54 crc kubenswrapper[4750]: E1008 18:12:54.735878 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:12:54 crc kubenswrapper[4750]: E1008 18:12:54.862267 4750 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 08 18:12:55 crc kubenswrapper[4750]: I1008 18:12:55.733699 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:12:55 crc kubenswrapper[4750]: E1008 18:12:55.733948 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:12:56 crc kubenswrapper[4750]: I1008 18:12:56.733614 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:12:56 crc kubenswrapper[4750]: I1008 18:12:56.733723 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:12:56 crc kubenswrapper[4750]: I1008 18:12:56.733919 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:12:56 crc kubenswrapper[4750]: E1008 18:12:56.733898 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:12:56 crc kubenswrapper[4750]: E1008 18:12:56.734092 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:12:56 crc kubenswrapper[4750]: E1008 18:12:56.734152 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:12:57 crc kubenswrapper[4750]: I1008 18:12:57.733413 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:12:57 crc kubenswrapper[4750]: E1008 18:12:57.733542 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:12:58 crc kubenswrapper[4750]: I1008 18:12:58.733561 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:12:58 crc kubenswrapper[4750]: I1008 18:12:58.733642 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:12:58 crc kubenswrapper[4750]: I1008 18:12:58.733578 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:12:58 crc kubenswrapper[4750]: E1008 18:12:58.733729 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7f9jd" podUID="b67ae9d5-e575-45e7-913a-01f379b86416" Oct 08 18:12:58 crc kubenswrapper[4750]: E1008 18:12:58.733684 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 18:12:58 crc kubenswrapper[4750]: E1008 18:12:58.733913 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 18:12:59 crc kubenswrapper[4750]: I1008 18:12:59.733201 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:12:59 crc kubenswrapper[4750]: E1008 18:12:59.733321 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 18:13:00 crc kubenswrapper[4750]: I1008 18:13:00.733690 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:13:00 crc kubenswrapper[4750]: I1008 18:13:00.733744 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:13:00 crc kubenswrapper[4750]: I1008 18:13:00.733709 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:13:00 crc kubenswrapper[4750]: I1008 18:13:00.736027 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 08 18:13:00 crc kubenswrapper[4750]: I1008 18:13:00.736347 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 08 18:13:00 crc kubenswrapper[4750]: I1008 18:13:00.738585 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 08 18:13:00 crc kubenswrapper[4750]: I1008 18:13:00.739066 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 08 18:13:00 crc kubenswrapper[4750]: I1008 18:13:00.739185 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 08 18:13:00 crc kubenswrapper[4750]: I1008 18:13:00.739901 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 08 18:13:01 crc kubenswrapper[4750]: I1008 18:13:01.734103 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.165283 4750 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.194258 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4zxlq"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.194632 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-4zxlq" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.198398 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-4r4tz"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.198698 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-vwt7h"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.198994 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-5g5nc"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.199182 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-vwt7h" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.199604 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-4r4tz" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.199859 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5g5nc" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.200430 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.200755 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.200969 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.201139 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.201256 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.202457 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.208120 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s7qzd"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.211757 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.212053 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.212260 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.212517 4750 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.213159 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.213363 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.220178 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-62m4q"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.220365 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s7qzd" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.220505 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.220891 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-bk84c"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.220919 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.221054 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.221565 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.221931 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-69h7b"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.222220 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-69h7b" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.222420 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bk84c" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.222447 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.222529 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.222911 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.222991 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.223129 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-fzfgf"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.223220 4750 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.223222 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.223386 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-fzfgf" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.223425 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.223466 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.224222 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-qt2tv"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.224520 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-qt2tv" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.224983 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.225292 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w8527"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.225691 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w8527" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.226343 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-hzx59"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.226976 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hzx59" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.228640 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.232364 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.232628 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.232841 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.232958 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.233441 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.233575 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.235440 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 
18:13:03.235766 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.236493 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-wprnh"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.236977 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-jw49l"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.237278 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-bvwfn"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.237466 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.239688 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-wprnh" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.239926 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-bvwfn" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.240067 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.240066 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-jw49l" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.241364 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.241539 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.241403 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.242035 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.242244 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.242455 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.242565 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.242673 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.242907 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.242949 4750 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.243042 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.243103 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.243213 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.243354 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.243405 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.243466 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.243633 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.243803 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.243983 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.243997 4750 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.244108 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.244229 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.241911 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.244649 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.242912 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.244759 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v6gc8"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.244362 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.244932 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.245212 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.245286 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-59hsn"] Oct 08 18:13:03 
crc kubenswrapper[4750]: I1008 18:13:03.245348 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.245755 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4lzzt"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.246179 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4lzzt" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.246435 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v6gc8" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.246652 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-59hsn" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.247250 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2cxc5"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.247768 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-rdtt7"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.248053 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.248072 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.248142 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 08 18:13:03 crc 
kubenswrapper[4750]: I1008 18:13:03.248214 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.248226 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rdtt7" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.248294 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.248362 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.248390 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.248432 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.248498 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.248784 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.248813 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.248886 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.248947 
4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.248956 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-44fvq"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.249014 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.249026 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.249097 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.249140 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.249182 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.249249 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.262638 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.267310 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-n57nw"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.268199 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-44fvq" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.270228 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.275712 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.275942 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.277537 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4h2qz"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.275538 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.277952 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.278046 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.278080 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4h2qz" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.278155 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.278274 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gkppd"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.278567 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.278724 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-n57nw" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.278966 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gkppd" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.280261 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.280465 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zt7f6"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.280987 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.281782 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.284157 4750 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.285718 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.287942 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.290243 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-hsr6t"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.291162 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hsr6t" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.291439 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zt7f6" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.295131 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.295657 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-l6dqx"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.296752 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.297337 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5ff9w"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.297665 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-l6dqx" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.298656 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5ff9w" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.298879 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.299162 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.299626 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-62m4q\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.299687 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a37070c-45cb-4989-9f51-b229609be506-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-v6gc8\" (UID: \"1a37070c-45cb-4989-9f51-b229609be506\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v6gc8" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.299736 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwfr2\" (UniqueName: \"kubernetes.io/projected/8b2784aa-3090-4fd9-a1cb-28b99b181a64-kube-api-access-xwfr2\") pod \"dns-operator-744455d44c-59hsn\" (UID: \"8b2784aa-3090-4fd9-a1cb-28b99b181a64\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-59hsn" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.299765 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/bbde5089-5fb0-4faf-9788-21ca7048e6f2-etcd-ca\") pod \"etcd-operator-b45778765-bvwfn\" (UID: \"bbde5089-5fb0-4faf-9788-21ca7048e6f2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bvwfn" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.299783 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.299788 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6530eb5-a257-4381-a176-b0e0972181ac-config\") pod \"route-controller-manager-6576b87f9c-69h7b\" (UID: \"a6530eb5-a257-4381-a176-b0e0972181ac\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-69h7b" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.299839 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d140a287-5d7c-4379-bc27-a6156ff3379d-config\") pod \"machine-approver-56656f9798-bk84c\" (UID: \"d140a287-5d7c-4379-bc27-a6156ff3379d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bk84c" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.299865 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-544xp\" (UniqueName: \"kubernetes.io/projected/a20bcaf3-28dc-41f5-8c3e-25b3dfad8202-kube-api-access-544xp\") pod \"machine-api-operator-5694c8668f-4r4tz\" (UID: \"a20bcaf3-28dc-41f5-8c3e-25b3dfad8202\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4r4tz" Oct 08 
18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.299914 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/88aafa5f-15c0-43af-80be-1c01d844c9c9-audit-dir\") pod \"oauth-openshift-558db77b4-62m4q\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.299959 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d0696be9-7ac8-46ce-9839-f74ae867f317-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4h2qz\" (UID: \"d0696be9-7ac8-46ce-9839-f74ae867f317\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4h2qz" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.300009 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhhjl\" (UniqueName: \"kubernetes.io/projected/1a37070c-45cb-4989-9f51-b229609be506-kube-api-access-nhhjl\") pod \"openshift-controller-manager-operator-756b6f6bc6-v6gc8\" (UID: \"1a37070c-45cb-4989-9f51-b229609be506\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v6gc8" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.300034 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/bbde5089-5fb0-4faf-9788-21ca7048e6f2-etcd-service-ca\") pod \"etcd-operator-b45778765-bvwfn\" (UID: \"bbde5089-5fb0-4faf-9788-21ca7048e6f2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bvwfn" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.300082 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-62m4q\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.300109 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a20bcaf3-28dc-41f5-8c3e-25b3dfad8202-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-4r4tz\" (UID: \"a20bcaf3-28dc-41f5-8c3e-25b3dfad8202\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4r4tz" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.300166 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d140a287-5d7c-4379-bc27-a6156ff3379d-machine-approver-tls\") pod \"machine-approver-56656f9798-bk84c\" (UID: \"d140a287-5d7c-4379-bc27-a6156ff3379d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bk84c" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.300191 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmbqc\" (UniqueName: \"kubernetes.io/projected/d140a287-5d7c-4379-bc27-a6156ff3379d-kube-api-access-wmbqc\") pod \"machine-approver-56656f9798-bk84c\" (UID: \"d140a287-5d7c-4379-bc27-a6156ff3379d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bk84c" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.300251 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8b2784aa-3090-4fd9-a1cb-28b99b181a64-metrics-tls\") pod \"dns-operator-744455d44c-59hsn\" (UID: \"8b2784aa-3090-4fd9-a1cb-28b99b181a64\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-59hsn" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.300278 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll7hm\" (UniqueName: \"kubernetes.io/projected/bbde5089-5fb0-4faf-9788-21ca7048e6f2-kube-api-access-ll7hm\") pod \"etcd-operator-b45778765-bvwfn\" (UID: \"bbde5089-5fb0-4faf-9788-21ca7048e6f2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bvwfn" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.300327 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d0696be9-7ac8-46ce-9839-f74ae867f317-trusted-ca\") pod \"ingress-operator-5b745b69d9-4h2qz\" (UID: \"d0696be9-7ac8-46ce-9839-f74ae867f317\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4h2qz" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.300350 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6530eb5-a257-4381-a176-b0e0972181ac-serving-cert\") pod \"route-controller-manager-6576b87f9c-69h7b\" (UID: \"a6530eb5-a257-4381-a176-b0e0972181ac\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-69h7b" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.300399 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6mqt\" (UniqueName: \"kubernetes.io/projected/84dd6602-6fa4-4615-a79c-d571fec1a58c-kube-api-access-q6mqt\") pod \"console-operator-58897d9998-fzfgf\" (UID: \"84dd6602-6fa4-4615-a79c-d571fec1a58c\") " pod="openshift-console-operator/console-operator-58897d9998-fzfgf" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.300422 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/88aafa5f-15c0-43af-80be-1c01d844c9c9-audit-policies\") pod \"oauth-openshift-558db77b4-62m4q\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.300443 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-62m4q\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.300485 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5fcm\" (UniqueName: \"kubernetes.io/projected/a6530eb5-a257-4381-a176-b0e0972181ac-kube-api-access-t5fcm\") pod \"route-controller-manager-6576b87f9c-69h7b\" (UID: \"a6530eb5-a257-4381-a176-b0e0972181ac\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-69h7b" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.300524 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-62m4q\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.300590 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ca846a3-8fac-4636-8a95-5c3b877d3477-serving-cert\") pod \"openshift-config-operator-7777fb866f-hzx59\" (UID: 
\"1ca846a3-8fac-4636-8a95-5c3b877d3477\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hzx59" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.300614 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0696be9-7ac8-46ce-9839-f74ae867f317-metrics-tls\") pod \"ingress-operator-5b745b69d9-4h2qz\" (UID: \"d0696be9-7ac8-46ce-9839-f74ae867f317\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4h2qz" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.300634 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a37070c-45cb-4989-9f51-b229609be506-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-v6gc8\" (UID: \"1a37070c-45cb-4989-9f51-b229609be506\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v6gc8" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.300677 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/84dd6602-6fa4-4615-a79c-d571fec1a58c-trusted-ca\") pod \"console-operator-58897d9998-fzfgf\" (UID: \"84dd6602-6fa4-4615-a79c-d571fec1a58c\") " pod="openshift-console-operator/console-operator-58897d9998-fzfgf" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.300699 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-62m4q\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.300721 4750 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/1ca846a3-8fac-4636-8a95-5c3b877d3477-available-featuregates\") pod \"openshift-config-operator-7777fb866f-hzx59\" (UID: \"1ca846a3-8fac-4636-8a95-5c3b877d3477\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hzx59" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.300774 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbde5089-5fb0-4faf-9788-21ca7048e6f2-config\") pod \"etcd-operator-b45778765-bvwfn\" (UID: \"bbde5089-5fb0-4faf-9788-21ca7048e6f2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bvwfn" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.300798 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjxt8\" (UniqueName: \"kubernetes.io/projected/88aafa5f-15c0-43af-80be-1c01d844c9c9-kube-api-access-tjxt8\") pod \"oauth-openshift-558db77b4-62m4q\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.300864 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-62m4q\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.300885 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a20bcaf3-28dc-41f5-8c3e-25b3dfad8202-images\") pod 
\"machine-api-operator-5694c8668f-4r4tz\" (UID: \"a20bcaf3-28dc-41f5-8c3e-25b3dfad8202\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4r4tz" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.300929 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m262\" (UniqueName: \"kubernetes.io/projected/8d1619a1-f1c5-455f-814e-0cff00e053c0-kube-api-access-8m262\") pod \"downloads-7954f5f757-jw49l\" (UID: \"8d1619a1-f1c5-455f-814e-0cff00e053c0\") " pod="openshift-console/downloads-7954f5f757-jw49l" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.300951 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84dd6602-6fa4-4615-a79c-d571fec1a58c-config\") pod \"console-operator-58897d9998-fzfgf\" (UID: \"84dd6602-6fa4-4615-a79c-d571fec1a58c\") " pod="openshift-console-operator/console-operator-58897d9998-fzfgf" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.300992 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-62m4q\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.301012 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-62m4q\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.301032 4750 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-62m4q\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.301066 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bbde5089-5fb0-4faf-9788-21ca7048e6f2-etcd-client\") pod \"etcd-operator-b45778765-bvwfn\" (UID: \"bbde5089-5fb0-4faf-9788-21ca7048e6f2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bvwfn" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.301084 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a20bcaf3-28dc-41f5-8c3e-25b3dfad8202-config\") pod \"machine-api-operator-5694c8668f-4r4tz\" (UID: \"a20bcaf3-28dc-41f5-8c3e-25b3dfad8202\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4r4tz" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.301102 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbde5089-5fb0-4faf-9788-21ca7048e6f2-serving-cert\") pod \"etcd-operator-b45778765-bvwfn\" (UID: \"bbde5089-5fb0-4faf-9788-21ca7048e6f2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bvwfn" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.301119 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvw42\" (UniqueName: \"kubernetes.io/projected/d0696be9-7ac8-46ce-9839-f74ae867f317-kube-api-access-qvw42\") pod 
\"ingress-operator-5b745b69d9-4h2qz\" (UID: \"d0696be9-7ac8-46ce-9839-f74ae867f317\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4h2qz" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.301149 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vs72w\" (UniqueName: \"kubernetes.io/projected/1ca846a3-8fac-4636-8a95-5c3b877d3477-kube-api-access-vs72w\") pod \"openshift-config-operator-7777fb866f-hzx59\" (UID: \"1ca846a3-8fac-4636-8a95-5c3b877d3477\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hzx59" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.301166 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a6530eb5-a257-4381-a176-b0e0972181ac-client-ca\") pod \"route-controller-manager-6576b87f9c-69h7b\" (UID: \"a6530eb5-a257-4381-a176-b0e0972181ac\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-69h7b" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.301181 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d140a287-5d7c-4379-bc27-a6156ff3379d-auth-proxy-config\") pod \"machine-approver-56656f9798-bk84c\" (UID: \"d140a287-5d7c-4379-bc27-a6156ff3379d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bk84c" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.301196 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84dd6602-6fa4-4615-a79c-d571fec1a58c-serving-cert\") pod \"console-operator-58897d9998-fzfgf\" (UID: \"84dd6602-6fa4-4615-a79c-d571fec1a58c\") " pod="openshift-console-operator/console-operator-58897d9998-fzfgf" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 
18:13:03.301225 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-62m4q\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.301252 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-62m4q\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.302034 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fsv6k"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.302617 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fsv6k" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.303744 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-xj8d6"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.304457 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-xj8d6" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.304983 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.313791 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wrzwf"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.314538 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-xz2wz"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.315119 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-2dhb9"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.315962 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-2dhb9" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.316258 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wrzwf" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.316435 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xz2wz" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.322481 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.323461 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-kjc7k"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.324090 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-kjc7k" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.324355 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332440-n7qpc"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.324703 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332440-n7qpc" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.328750 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-glbv9"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.338113 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-tdftm"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.340172 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hg6dd"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.341834 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dkkbr"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.343357 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-bvwfn"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.343383 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4zxlq"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.344994 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-tdftm" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.345436 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hg6dd" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.345642 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-glbv9" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.345799 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dkkbr" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.351269 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-qt2tv"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.362363 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w8527"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.362951 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4lzzt"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.363631 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.363895 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.363912 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-4r4tz"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.364853 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s7qzd"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.366155 4750 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-apiserver/apiserver-76f77b778f-vwt7h"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.367406 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-62m4q"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.370048 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2cxc5"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.370313 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-fzfgf"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.372331 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4h2qz"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.375245 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-rdtt7"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.376402 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-hsr6t"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.376495 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.377340 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-jw49l"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.380716 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gkppd"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.380826 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-69h7b"] Oct 08 18:13:03 crc 
kubenswrapper[4750]: I1008 18:13:03.380910 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-l6dqx"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.383422 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zt7f6"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.384516 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-wprnh"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.385947 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5ff9w"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.387082 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-hzx59"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.388624 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hg6dd"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.391655 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-59hsn"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.392279 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dkkbr"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.394092 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-n57nw"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.395650 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v6gc8"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.396360 4750 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.397587 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-44fvq"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.398641 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-n87fc"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.399940 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-mfdcf"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.400246 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-n87fc" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.401263 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fsv6k"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.401528 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-mfdcf" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.401662 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbde5089-5fb0-4faf-9788-21ca7048e6f2-config\") pod \"etcd-operator-b45778765-bvwfn\" (UID: \"bbde5089-5fb0-4faf-9788-21ca7048e6f2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bvwfn" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.401695 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjxt8\" (UniqueName: \"kubernetes.io/projected/88aafa5f-15c0-43af-80be-1c01d844c9c9-kube-api-access-tjxt8\") pod \"oauth-openshift-558db77b4-62m4q\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.401723 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/1ca846a3-8fac-4636-8a95-5c3b877d3477-available-featuregates\") pod \"openshift-config-operator-7777fb866f-hzx59\" (UID: \"1ca846a3-8fac-4636-8a95-5c3b877d3477\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hzx59" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.401750 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-62m4q\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.401772 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/a20bcaf3-28dc-41f5-8c3e-25b3dfad8202-images\") pod \"machine-api-operator-5694c8668f-4r4tz\" (UID: \"a20bcaf3-28dc-41f5-8c3e-25b3dfad8202\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4r4tz" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.401798 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84dd6602-6fa4-4615-a79c-d571fec1a58c-config\") pod \"console-operator-58897d9998-fzfgf\" (UID: \"84dd6602-6fa4-4615-a79c-d571fec1a58c\") " pod="openshift-console-operator/console-operator-58897d9998-fzfgf" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.401820 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-62m4q\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.401837 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8m262\" (UniqueName: \"kubernetes.io/projected/8d1619a1-f1c5-455f-814e-0cff00e053c0-kube-api-access-8m262\") pod \"downloads-7954f5f757-jw49l\" (UID: \"8d1619a1-f1c5-455f-814e-0cff00e053c0\") " pod="openshift-console/downloads-7954f5f757-jw49l" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.401853 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-62m4q\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.401871 4750 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-62m4q\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.401889 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bbde5089-5fb0-4faf-9788-21ca7048e6f2-etcd-client\") pod \"etcd-operator-b45778765-bvwfn\" (UID: \"bbde5089-5fb0-4faf-9788-21ca7048e6f2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bvwfn" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.401906 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a20bcaf3-28dc-41f5-8c3e-25b3dfad8202-config\") pod \"machine-api-operator-5694c8668f-4r4tz\" (UID: \"a20bcaf3-28dc-41f5-8c3e-25b3dfad8202\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4r4tz" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.401920 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbde5089-5fb0-4faf-9788-21ca7048e6f2-serving-cert\") pod \"etcd-operator-b45778765-bvwfn\" (UID: \"bbde5089-5fb0-4faf-9788-21ca7048e6f2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bvwfn" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.401937 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvw42\" (UniqueName: \"kubernetes.io/projected/d0696be9-7ac8-46ce-9839-f74ae867f317-kube-api-access-qvw42\") pod \"ingress-operator-5b745b69d9-4h2qz\" (UID: \"d0696be9-7ac8-46ce-9839-f74ae867f317\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4h2qz" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.401955 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vs72w\" (UniqueName: \"kubernetes.io/projected/1ca846a3-8fac-4636-8a95-5c3b877d3477-kube-api-access-vs72w\") pod \"openshift-config-operator-7777fb866f-hzx59\" (UID: \"1ca846a3-8fac-4636-8a95-5c3b877d3477\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hzx59" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.401979 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a6530eb5-a257-4381-a176-b0e0972181ac-client-ca\") pod \"route-controller-manager-6576b87f9c-69h7b\" (UID: \"a6530eb5-a257-4381-a176-b0e0972181ac\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-69h7b" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.401998 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d140a287-5d7c-4379-bc27-a6156ff3379d-auth-proxy-config\") pod \"machine-approver-56656f9798-bk84c\" (UID: \"d140a287-5d7c-4379-bc27-a6156ff3379d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bk84c" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.402013 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84dd6602-6fa4-4615-a79c-d571fec1a58c-serving-cert\") pod \"console-operator-58897d9998-fzfgf\" (UID: \"84dd6602-6fa4-4615-a79c-d571fec1a58c\") " pod="openshift-console-operator/console-operator-58897d9998-fzfgf" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.402035 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-62m4q\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.402069 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-62m4q\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.402090 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-62m4q\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.402107 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/bbde5089-5fb0-4faf-9788-21ca7048e6f2-etcd-ca\") pod \"etcd-operator-b45778765-bvwfn\" (UID: \"bbde5089-5fb0-4faf-9788-21ca7048e6f2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bvwfn" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.402122 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6530eb5-a257-4381-a176-b0e0972181ac-config\") pod \"route-controller-manager-6576b87f9c-69h7b\" (UID: \"a6530eb5-a257-4381-a176-b0e0972181ac\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-69h7b" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 
18:13:03.402141 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a37070c-45cb-4989-9f51-b229609be506-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-v6gc8\" (UID: \"1a37070c-45cb-4989-9f51-b229609be506\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v6gc8" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.402166 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwfr2\" (UniqueName: \"kubernetes.io/projected/8b2784aa-3090-4fd9-a1cb-28b99b181a64-kube-api-access-xwfr2\") pod \"dns-operator-744455d44c-59hsn\" (UID: \"8b2784aa-3090-4fd9-a1cb-28b99b181a64\") " pod="openshift-dns-operator/dns-operator-744455d44c-59hsn" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.402188 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d140a287-5d7c-4379-bc27-a6156ff3379d-config\") pod \"machine-approver-56656f9798-bk84c\" (UID: \"d140a287-5d7c-4379-bc27-a6156ff3379d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bk84c" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.402205 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-544xp\" (UniqueName: \"kubernetes.io/projected/a20bcaf3-28dc-41f5-8c3e-25b3dfad8202-kube-api-access-544xp\") pod \"machine-api-operator-5694c8668f-4r4tz\" (UID: \"a20bcaf3-28dc-41f5-8c3e-25b3dfad8202\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4r4tz" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.402221 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/88aafa5f-15c0-43af-80be-1c01d844c9c9-audit-dir\") pod \"oauth-openshift-558db77b4-62m4q\" (UID: 
\"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.402252 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d0696be9-7ac8-46ce-9839-f74ae867f317-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4h2qz\" (UID: \"d0696be9-7ac8-46ce-9839-f74ae867f317\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4h2qz" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.402266 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/bbde5089-5fb0-4faf-9788-21ca7048e6f2-etcd-service-ca\") pod \"etcd-operator-b45778765-bvwfn\" (UID: \"bbde5089-5fb0-4faf-9788-21ca7048e6f2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bvwfn" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.402282 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-62m4q\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.402300 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a20bcaf3-28dc-41f5-8c3e-25b3dfad8202-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-4r4tz\" (UID: \"a20bcaf3-28dc-41f5-8c3e-25b3dfad8202\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4r4tz" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.402316 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhhjl\" (UniqueName: 
\"kubernetes.io/projected/1a37070c-45cb-4989-9f51-b229609be506-kube-api-access-nhhjl\") pod \"openshift-controller-manager-operator-756b6f6bc6-v6gc8\" (UID: \"1a37070c-45cb-4989-9f51-b229609be506\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v6gc8" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.402333 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d140a287-5d7c-4379-bc27-a6156ff3379d-machine-approver-tls\") pod \"machine-approver-56656f9798-bk84c\" (UID: \"d140a287-5d7c-4379-bc27-a6156ff3379d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bk84c" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.402348 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmbqc\" (UniqueName: \"kubernetes.io/projected/d140a287-5d7c-4379-bc27-a6156ff3379d-kube-api-access-wmbqc\") pod \"machine-approver-56656f9798-bk84c\" (UID: \"d140a287-5d7c-4379-bc27-a6156ff3379d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bk84c" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.402369 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbde5089-5fb0-4faf-9788-21ca7048e6f2-config\") pod \"etcd-operator-b45778765-bvwfn\" (UID: \"bbde5089-5fb0-4faf-9788-21ca7048e6f2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bvwfn" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.402373 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll7hm\" (UniqueName: \"kubernetes.io/projected/bbde5089-5fb0-4faf-9788-21ca7048e6f2-kube-api-access-ll7hm\") pod \"etcd-operator-b45778765-bvwfn\" (UID: \"bbde5089-5fb0-4faf-9788-21ca7048e6f2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bvwfn" Oct 08 18:13:03 crc 
kubenswrapper[4750]: I1008 18:13:03.402413 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/1ca846a3-8fac-4636-8a95-5c3b877d3477-available-featuregates\") pod \"openshift-config-operator-7777fb866f-hzx59\" (UID: \"1ca846a3-8fac-4636-8a95-5c3b877d3477\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hzx59" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.402499 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-glbv9"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.402600 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d0696be9-7ac8-46ce-9839-f74ae867f317-trusted-ca\") pod \"ingress-operator-5b745b69d9-4h2qz\" (UID: \"d0696be9-7ac8-46ce-9839-f74ae867f317\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4h2qz" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.402631 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8b2784aa-3090-4fd9-a1cb-28b99b181a64-metrics-tls\") pod \"dns-operator-744455d44c-59hsn\" (UID: \"8b2784aa-3090-4fd9-a1cb-28b99b181a64\") " pod="openshift-dns-operator/dns-operator-744455d44c-59hsn" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.402657 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6mqt\" (UniqueName: \"kubernetes.io/projected/84dd6602-6fa4-4615-a79c-d571fec1a58c-kube-api-access-q6mqt\") pod \"console-operator-58897d9998-fzfgf\" (UID: \"84dd6602-6fa4-4615-a79c-d571fec1a58c\") " pod="openshift-console-operator/console-operator-58897d9998-fzfgf" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.402760 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/88aafa5f-15c0-43af-80be-1c01d844c9c9-audit-policies\") pod \"oauth-openshift-558db77b4-62m4q\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.402847 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-62m4q\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.402870 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6530eb5-a257-4381-a176-b0e0972181ac-serving-cert\") pod \"route-controller-manager-6576b87f9c-69h7b\" (UID: \"a6530eb5-a257-4381-a176-b0e0972181ac\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-69h7b" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.402903 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-62m4q\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.403026 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ca846a3-8fac-4636-8a95-5c3b877d3477-serving-cert\") pod \"openshift-config-operator-7777fb866f-hzx59\" (UID: \"1ca846a3-8fac-4636-8a95-5c3b877d3477\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-hzx59" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.403049 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5fcm\" (UniqueName: \"kubernetes.io/projected/a6530eb5-a257-4381-a176-b0e0972181ac-kube-api-access-t5fcm\") pod \"route-controller-manager-6576b87f9c-69h7b\" (UID: \"a6530eb5-a257-4381-a176-b0e0972181ac\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-69h7b" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.403075 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0696be9-7ac8-46ce-9839-f74ae867f317-metrics-tls\") pod \"ingress-operator-5b745b69d9-4h2qz\" (UID: \"d0696be9-7ac8-46ce-9839-f74ae867f317\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4h2qz" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.403080 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d140a287-5d7c-4379-bc27-a6156ff3379d-config\") pod \"machine-approver-56656f9798-bk84c\" (UID: \"d140a287-5d7c-4379-bc27-a6156ff3379d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bk84c" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.403102 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a37070c-45cb-4989-9f51-b229609be506-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-v6gc8\" (UID: \"1a37070c-45cb-4989-9f51-b229609be506\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v6gc8" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.403131 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/88aafa5f-15c0-43af-80be-1c01d844c9c9-audit-dir\") pod \"oauth-openshift-558db77b4-62m4q\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.403162 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/84dd6602-6fa4-4615-a79c-d571fec1a58c-trusted-ca\") pod \"console-operator-58897d9998-fzfgf\" (UID: \"84dd6602-6fa4-4615-a79c-d571fec1a58c\") " pod="openshift-console-operator/console-operator-58897d9998-fzfgf" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.403190 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-62m4q\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.403521 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332440-n7qpc"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.403909 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-62m4q\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.404028 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/bbde5089-5fb0-4faf-9788-21ca7048e6f2-etcd-ca\") pod \"etcd-operator-b45778765-bvwfn\" (UID: 
\"bbde5089-5fb0-4faf-9788-21ca7048e6f2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bvwfn" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.404564 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/bbde5089-5fb0-4faf-9788-21ca7048e6f2-etcd-service-ca\") pod \"etcd-operator-b45778765-bvwfn\" (UID: \"bbde5089-5fb0-4faf-9788-21ca7048e6f2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bvwfn" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.404759 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a20bcaf3-28dc-41f5-8c3e-25b3dfad8202-images\") pod \"machine-api-operator-5694c8668f-4r4tz\" (UID: \"a20bcaf3-28dc-41f5-8c3e-25b3dfad8202\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4r4tz" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.404845 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a6530eb5-a257-4381-a176-b0e0972181ac-client-ca\") pod \"route-controller-manager-6576b87f9c-69h7b\" (UID: \"a6530eb5-a257-4381-a176-b0e0972181ac\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-69h7b" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.404965 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wrzwf"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.405307 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6530eb5-a257-4381-a176-b0e0972181ac-config\") pod \"route-controller-manager-6576b87f9c-69h7b\" (UID: \"a6530eb5-a257-4381-a176-b0e0972181ac\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-69h7b" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.405362 4750 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d140a287-5d7c-4379-bc27-a6156ff3379d-auth-proxy-config\") pod \"machine-approver-56656f9798-bk84c\" (UID: \"d140a287-5d7c-4379-bc27-a6156ff3379d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bk84c" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.406243 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-62m4q\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.406285 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/88aafa5f-15c0-43af-80be-1c01d844c9c9-audit-policies\") pod \"oauth-openshift-558db77b4-62m4q\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.406912 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-62m4q\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.408276 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-62m4q\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.409467 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/84dd6602-6fa4-4615-a79c-d571fec1a58c-trusted-ca\") pod \"console-operator-58897d9998-fzfgf\" (UID: \"84dd6602-6fa4-4615-a79c-d571fec1a58c\") " pod="openshift-console-operator/console-operator-58897d9998-fzfgf" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.409652 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84dd6602-6fa4-4615-a79c-d571fec1a58c-config\") pod \"console-operator-58897d9998-fzfgf\" (UID: \"84dd6602-6fa4-4615-a79c-d571fec1a58c\") " pod="openshift-console-operator/console-operator-58897d9998-fzfgf" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.410413 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-xj8d6"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.411019 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84dd6602-6fa4-4615-a79c-d571fec1a58c-serving-cert\") pod \"console-operator-58897d9998-fzfgf\" (UID: \"84dd6602-6fa4-4615-a79c-d571fec1a58c\") " pod="openshift-console-operator/console-operator-58897d9998-fzfgf" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.412022 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6530eb5-a257-4381-a176-b0e0972181ac-serving-cert\") pod \"route-controller-manager-6576b87f9c-69h7b\" (UID: \"a6530eb5-a257-4381-a176-b0e0972181ac\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-69h7b" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.413254 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-62m4q\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.413287 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-kjc7k"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.413406 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bbde5089-5fb0-4faf-9788-21ca7048e6f2-etcd-client\") pod \"etcd-operator-b45778765-bvwfn\" (UID: \"bbde5089-5fb0-4faf-9788-21ca7048e6f2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bvwfn" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.413689 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a20bcaf3-28dc-41f5-8c3e-25b3dfad8202-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-4r4tz\" (UID: \"a20bcaf3-28dc-41f5-8c3e-25b3dfad8202\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4r4tz" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.415033 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a20bcaf3-28dc-41f5-8c3e-25b3dfad8202-config\") pod \"machine-api-operator-5694c8668f-4r4tz\" (UID: \"a20bcaf3-28dc-41f5-8c3e-25b3dfad8202\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4r4tz" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.415060 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-system-ocp-branding-template\") pod 
\"oauth-openshift-558db77b4-62m4q\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.415188 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ca846a3-8fac-4636-8a95-5c3b877d3477-serving-cert\") pod \"openshift-config-operator-7777fb866f-hzx59\" (UID: \"1ca846a3-8fac-4636-8a95-5c3b877d3477\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hzx59" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.415272 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-n87fc"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.416354 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.416360 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-62m4q\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.416984 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-xz2wz"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.418057 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-62m4q\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" Oct 08 
18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.418360 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-62m4q\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.418836 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-62m4q\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.418909 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-2dhb9"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.420019 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-5g5nc"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.421557 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-wtkzp"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.423187 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wtkzp" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.425941 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-62m4q\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.426906 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d140a287-5d7c-4379-bc27-a6156ff3379d-machine-approver-tls\") pod \"machine-approver-56656f9798-bk84c\" (UID: \"d140a287-5d7c-4379-bc27-a6156ff3379d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bk84c" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.427287 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbde5089-5fb0-4faf-9788-21ca7048e6f2-serving-cert\") pod \"etcd-operator-b45778765-bvwfn\" (UID: \"bbde5089-5fb0-4faf-9788-21ca7048e6f2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bvwfn" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.427414 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wtkzp"] Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.438589 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.456727 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 
18:13:03.466688 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a37070c-45cb-4989-9f51-b229609be506-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-v6gc8\" (UID: \"1a37070c-45cb-4989-9f51-b229609be506\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v6gc8" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.476780 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.478999 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a37070c-45cb-4989-9f51-b229609be506-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-v6gc8\" (UID: \"1a37070c-45cb-4989-9f51-b229609be506\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v6gc8" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.496934 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.499734 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8b2784aa-3090-4fd9-a1cb-28b99b181a64-metrics-tls\") pod \"dns-operator-744455d44c-59hsn\" (UID: \"8b2784aa-3090-4fd9-a1cb-28b99b181a64\") " pod="openshift-dns-operator/dns-operator-744455d44c-59hsn" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.517230 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.536342 4750 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.556565 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.578057 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.596109 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.617097 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.636809 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.656878 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.677022 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.696970 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.716714 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.736244 4750 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.756667 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.776295 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.797580 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.818088 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.836986 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.847147 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0696be9-7ac8-46ce-9839-f74ae867f317-metrics-tls\") pod \"ingress-operator-5b745b69d9-4h2qz\" (UID: \"d0696be9-7ac8-46ce-9839-f74ae867f317\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4h2qz" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.862855 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.867258 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d0696be9-7ac8-46ce-9839-f74ae867f317-trusted-ca\") pod \"ingress-operator-5b745b69d9-4h2qz\" (UID: \"d0696be9-7ac8-46ce-9839-f74ae867f317\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4h2qz" Oct 
08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.876952 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.898959 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.916905 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.937885 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.956950 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.977234 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 08 18:13:03 crc kubenswrapper[4750]: I1008 18:13:03.997302 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 08 18:13:04 crc kubenswrapper[4750]: I1008 18:13:04.036083 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 08 18:13:04 crc kubenswrapper[4750]: I1008 18:13:04.077157 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 08 18:13:04 crc kubenswrapper[4750]: I1008 18:13:04.096906 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 08 18:13:04 crc kubenswrapper[4750]: I1008 18:13:04.116494 4750 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 08 18:13:04 crc kubenswrapper[4750]: I1008 18:13:04.137502 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 08 18:13:04 crc kubenswrapper[4750]: I1008 18:13:04.157045 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 08 18:13:04 crc kubenswrapper[4750]: I1008 18:13:04.177687 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 08 18:13:04 crc kubenswrapper[4750]: I1008 18:13:04.195981 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 08 18:13:04 crc kubenswrapper[4750]: I1008 18:13:04.218748 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 08 18:13:04 crc kubenswrapper[4750]: I1008 18:13:04.236452 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 08 18:13:04 crc kubenswrapper[4750]: I1008 18:13:04.256506 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 08 18:13:04 crc kubenswrapper[4750]: I1008 18:13:04.276392 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 08 18:13:04 crc kubenswrapper[4750]: I1008 18:13:04.296997 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 08 18:13:04 
crc kubenswrapper[4750]: I1008 18:13:04.315267 4750 request.go:700] Waited for 1.016289228s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dpackage-server-manager-serving-cert&limit=500&resourceVersion=0 Oct 08 18:13:04 crc kubenswrapper[4750]: I1008 18:13:04.317067 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 08 18:13:04 crc kubenswrapper[4750]: I1008 18:13:04.336693 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 08 18:13:04 crc kubenswrapper[4750]: I1008 18:13:04.357820 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 08 18:13:04 crc kubenswrapper[4750]: I1008 18:13:04.377131 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 08 18:13:04 crc kubenswrapper[4750]: I1008 18:13:04.397517 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 08 18:13:04 crc kubenswrapper[4750]: I1008 18:13:04.417345 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 08 18:13:04 crc kubenswrapper[4750]: I1008 18:13:04.437219 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 08 18:13:04 crc kubenswrapper[4750]: I1008 18:13:04.457331 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 08 18:13:04 crc kubenswrapper[4750]: I1008 18:13:04.477050 4750 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 08 18:13:04 crc kubenswrapper[4750]: I1008 18:13:04.497377 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 08 18:13:04 crc kubenswrapper[4750]: I1008 18:13:04.517043 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 08 18:13:04 crc kubenswrapper[4750]: I1008 18:13:04.538336 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 08 18:13:04 crc kubenswrapper[4750]: I1008 18:13:04.556452 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 08 18:13:04 crc kubenswrapper[4750]: I1008 18:13:04.576027 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 08 18:13:04 crc kubenswrapper[4750]: I1008 18:13:04.597127 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 08 18:13:04 crc kubenswrapper[4750]: I1008 18:13:04.616696 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 08 18:13:04 crc kubenswrapper[4750]: I1008 18:13:04.636529 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 08 18:13:04 crc kubenswrapper[4750]: I1008 18:13:04.656100 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 08 18:13:04 crc kubenswrapper[4750]: I1008 18:13:04.676810 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 08 18:13:04 crc kubenswrapper[4750]: 
I1008 18:13:04.697004 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 08 18:13:04 crc kubenswrapper[4750]: I1008 18:13:04.716651 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 08 18:13:04 crc kubenswrapper[4750]: I1008 18:13:04.737301 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 08 18:13:04 crc kubenswrapper[4750]: I1008 18:13:04.757416 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 08 18:13:04 crc kubenswrapper[4750]: I1008 18:13:04.777726 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 08 18:13:04 crc kubenswrapper[4750]: I1008 18:13:04.796315 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 08 18:13:04 crc kubenswrapper[4750]: I1008 18:13:04.816975 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 08 18:13:04 crc kubenswrapper[4750]: I1008 18:13:04.836650 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 08 18:13:04 crc kubenswrapper[4750]: I1008 18:13:04.856603 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 08 18:13:04 crc kubenswrapper[4750]: I1008 18:13:04.877902 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 08 18:13:04 crc kubenswrapper[4750]: I1008 18:13:04.896967 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 08 
18:13:04 crc kubenswrapper[4750]: I1008 18:13:04.915896 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 08 18:13:04 crc kubenswrapper[4750]: I1008 18:13:04.937158 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 08 18:13:04 crc kubenswrapper[4750]: I1008 18:13:04.956724 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 08 18:13:04 crc kubenswrapper[4750]: I1008 18:13:04.982881 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 08 18:13:04 crc kubenswrapper[4750]: I1008 18:13:04.996863 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.016929 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.037609 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.056916 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.077249 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.120304 4750 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.120664 4750 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.151883 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvw42\" (UniqueName: \"kubernetes.io/projected/d0696be9-7ac8-46ce-9839-f74ae867f317-kube-api-access-qvw42\") pod \"ingress-operator-5b745b69d9-4h2qz\" (UID: \"d0696be9-7ac8-46ce-9839-f74ae867f317\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4h2qz" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.170379 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-544xp\" (UniqueName: \"kubernetes.io/projected/a20bcaf3-28dc-41f5-8c3e-25b3dfad8202-kube-api-access-544xp\") pod \"machine-api-operator-5694c8668f-4r4tz\" (UID: \"a20bcaf3-28dc-41f5-8c3e-25b3dfad8202\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4r4tz" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.189150 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmbqc\" (UniqueName: \"kubernetes.io/projected/d140a287-5d7c-4379-bc27-a6156ff3379d-kube-api-access-wmbqc\") pod \"machine-approver-56656f9798-bk84c\" (UID: \"d140a287-5d7c-4379-bc27-a6156ff3379d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bk84c" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.190983 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bk84c" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.212043 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhhjl\" (UniqueName: \"kubernetes.io/projected/1a37070c-45cb-4989-9f51-b229609be506-kube-api-access-nhhjl\") pod \"openshift-controller-manager-operator-756b6f6bc6-v6gc8\" (UID: \"1a37070c-45cb-4989-9f51-b229609be506\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v6gc8" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.217609 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.257576 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.259071 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5fcm\" (UniqueName: \"kubernetes.io/projected/a6530eb5-a257-4381-a176-b0e0972181ac-kube-api-access-t5fcm\") pod \"route-controller-manager-6576b87f9c-69h7b\" (UID: \"a6530eb5-a257-4381-a176-b0e0972181ac\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-69h7b" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.291480 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6mqt\" (UniqueName: \"kubernetes.io/projected/84dd6602-6fa4-4615-a79c-d571fec1a58c-kube-api-access-q6mqt\") pod \"console-operator-58897d9998-fzfgf\" (UID: \"84dd6602-6fa4-4615-a79c-d571fec1a58c\") " pod="openshift-console-operator/console-operator-58897d9998-fzfgf" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.295908 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v6gc8" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.315496 4750 request.go:700] Waited for 1.911258596s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-config-operator/serviceaccounts/openshift-config-operator/token Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.317199 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll7hm\" (UniqueName: \"kubernetes.io/projected/bbde5089-5fb0-4faf-9788-21ca7048e6f2-kube-api-access-ll7hm\") pod \"etcd-operator-b45778765-bvwfn\" (UID: \"bbde5089-5fb0-4faf-9788-21ca7048e6f2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bvwfn" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.339235 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vs72w\" (UniqueName: \"kubernetes.io/projected/1ca846a3-8fac-4636-8a95-5c3b877d3477-kube-api-access-vs72w\") pod \"openshift-config-operator-7777fb866f-hzx59\" (UID: \"1ca846a3-8fac-4636-8a95-5c3b877d3477\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hzx59" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.362702 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjxt8\" (UniqueName: \"kubernetes.io/projected/88aafa5f-15c0-43af-80be-1c01d844c9c9-kube-api-access-tjxt8\") pod \"oauth-openshift-558db77b4-62m4q\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.382947 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-4r4tz" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.387521 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bk84c" event={"ID":"d140a287-5d7c-4379-bc27-a6156ff3379d","Type":"ContainerStarted","Data":"26d7d415b85cbf9dc91da66a272bf7ad1abe6d3b5674ea63300e5501976d6919"} Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.403664 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m262\" (UniqueName: \"kubernetes.io/projected/8d1619a1-f1c5-455f-814e-0cff00e053c0-kube-api-access-8m262\") pod \"downloads-7954f5f757-jw49l\" (UID: \"8d1619a1-f1c5-455f-814e-0cff00e053c0\") " pod="openshift-console/downloads-7954f5f757-jw49l" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.411602 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d0696be9-7ac8-46ce-9839-f74ae867f317-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4h2qz\" (UID: \"d0696be9-7ac8-46ce-9839-f74ae867f317\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4h2qz" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.416800 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.418954 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwfr2\" (UniqueName: \"kubernetes.io/projected/8b2784aa-3090-4fd9-a1cb-28b99b181a64-kube-api-access-xwfr2\") pod \"dns-operator-744455d44c-59hsn\" (UID: \"8b2784aa-3090-4fd9-a1cb-28b99b181a64\") " pod="openshift-dns-operator/dns-operator-744455d44c-59hsn" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.438404 4750 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.452856 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.458391 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.478798 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.488023 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-69h7b" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.503827 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-fzfgf" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.528003 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db0361d2-e90e-43d1-8ea7-29c7167d4d05-config\") pod \"authentication-operator-69f744f599-qt2tv\" (UID: \"db0361d2-e90e-43d1-8ea7-29c7167d4d05\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qt2tv" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.528398 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8f878d47-ed9f-4f96-a7c0-153c58ff33cf-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-5g5nc\" (UID: \"8f878d47-ed9f-4f96-a7c0-153c58ff33cf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5g5nc" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.528487 4750 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9qts\" (UniqueName: \"kubernetes.io/projected/80400620-75a2-4d87-ad1a-2ef29346babc-kube-api-access-j9qts\") pod \"migrator-59844c95c7-rdtt7\" (UID: \"80400620-75a2-4d87-ad1a-2ef29346babc\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rdtt7" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.528577 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5408b201-966b-4e3e-b681-faaa2f32d633-config\") pod \"kube-controller-manager-operator-78b949d7b-zt7f6\" (UID: \"5408b201-966b-4e3e-b681-faaa2f32d633\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zt7f6" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.528641 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2508573-4890-48b6-9119-93560ee4c5d9-ca-trust-extracted\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.528673 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2254728a-2ece-43b6-8b5f-a16970277d33-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-44fvq\" (UID: \"2254728a-2ece-43b6-8b5f-a16970277d33\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-44fvq" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.528720 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5408b201-966b-4e3e-b681-faaa2f32d633-serving-cert\") 
pod \"kube-controller-manager-operator-78b949d7b-zt7f6\" (UID: \"5408b201-966b-4e3e-b681-faaa2f32d633\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zt7f6" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.528749 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5408b201-966b-4e3e-b681-faaa2f32d633-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-zt7f6\" (UID: \"5408b201-966b-4e3e-b681-faaa2f32d633\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zt7f6" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.528773 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcm5r\" (UniqueName: \"kubernetes.io/projected/7c3552dc-a0cf-4072-91e1-030803f6014d-kube-api-access-tcm5r\") pod \"console-f9d7485db-wprnh\" (UID: \"7c3552dc-a0cf-4072-91e1-030803f6014d\") " pod="openshift-console/console-f9d7485db-wprnh" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.528827 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8f878d47-ed9f-4f96-a7c0-153c58ff33cf-encryption-config\") pod \"apiserver-7bbb656c7d-5g5nc\" (UID: \"8f878d47-ed9f-4f96-a7c0-153c58ff33cf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5g5nc" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.528950 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsm5j\" (UniqueName: \"kubernetes.io/projected/e2508573-4890-48b6-9119-93560ee4c5d9-kube-api-access-jsm5j\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:05 crc 
kubenswrapper[4750]: I1008 18:13:05.529017 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f16bfec-96c4-4e7b-83e1-aed47eecbefb-config\") pod \"openshift-apiserver-operator-796bbdcf4f-s7qzd\" (UID: \"4f16bfec-96c4-4e7b-83e1-aed47eecbefb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s7qzd" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.529493 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f878d47-ed9f-4f96-a7c0-153c58ff33cf-serving-cert\") pod \"apiserver-7bbb656c7d-5g5nc\" (UID: \"8f878d47-ed9f-4f96-a7c0-153c58ff33cf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5g5nc" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.529595 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hp5m\" (UniqueName: \"kubernetes.io/projected/4f16bfec-96c4-4e7b-83e1-aed47eecbefb-kube-api-access-5hp5m\") pod \"openshift-apiserver-operator-796bbdcf4f-s7qzd\" (UID: \"4f16bfec-96c4-4e7b-83e1-aed47eecbefb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s7qzd" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.529640 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8f878d47-ed9f-4f96-a7c0-153c58ff33cf-audit-policies\") pod \"apiserver-7bbb656c7d-5g5nc\" (UID: \"8f878d47-ed9f-4f96-a7c0-153c58ff33cf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5g5nc" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.529669 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bzw6\" (UniqueName: 
\"kubernetes.io/projected/90f5c875-3507-453d-9901-fd60a1476f71-kube-api-access-5bzw6\") pod \"controller-manager-879f6c89f-4zxlq\" (UID: \"90f5c875-3507-453d-9901-fd60a1476f71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4zxlq" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.529699 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77eda81d-119c-4876-a951-e7262ed136b9-serving-cert\") pod \"apiserver-76f77b778f-vwt7h\" (UID: \"77eda81d-119c-4876-a951-e7262ed136b9\") " pod="openshift-apiserver/apiserver-76f77b778f-vwt7h" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.529729 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c3552dc-a0cf-4072-91e1-030803f6014d-trusted-ca-bundle\") pod \"console-f9d7485db-wprnh\" (UID: \"7c3552dc-a0cf-4072-91e1-030803f6014d\") " pod="openshift-console/console-f9d7485db-wprnh" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.529765 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db0361d2-e90e-43d1-8ea7-29c7167d4d05-serving-cert\") pod \"authentication-operator-69f744f599-qt2tv\" (UID: \"db0361d2-e90e-43d1-8ea7-29c7167d4d05\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qt2tv" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.529797 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95f54243-6144-4ea9-acc6-b3f3c580e037-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-gkppd\" (UID: \"95f54243-6144-4ea9-acc6-b3f3c580e037\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gkppd" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.529841 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krxft\" (UniqueName: \"kubernetes.io/projected/8f878d47-ed9f-4f96-a7c0-153c58ff33cf-kube-api-access-krxft\") pod \"apiserver-7bbb656c7d-5g5nc\" (UID: \"8f878d47-ed9f-4f96-a7c0-153c58ff33cf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5g5nc" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.529895 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/346b036e-ea6a-4cc3-ab4a-7dbbdf59405c-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4lzzt\" (UID: \"346b036e-ea6a-4cc3-ab4a-7dbbdf59405c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4lzzt" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.529930 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77eda81d-119c-4876-a951-e7262ed136b9-trusted-ca-bundle\") pod \"apiserver-76f77b778f-vwt7h\" (UID: \"77eda81d-119c-4876-a951-e7262ed136b9\") " pod="openshift-apiserver/apiserver-76f77b778f-vwt7h" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.529965 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7c3552dc-a0cf-4072-91e1-030803f6014d-console-oauth-config\") pod \"console-f9d7485db-wprnh\" (UID: \"7c3552dc-a0cf-4072-91e1-030803f6014d\") " pod="openshift-console/console-f9d7485db-wprnh" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.530003 4750 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2508573-4890-48b6-9119-93560ee4c5d9-installation-pull-secrets\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.530075 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90f5c875-3507-453d-9901-fd60a1476f71-config\") pod \"controller-manager-879f6c89f-4zxlq\" (UID: \"90f5c875-3507-453d-9901-fd60a1476f71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4zxlq" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.530110 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db0361d2-e90e-43d1-8ea7-29c7167d4d05-service-ca-bundle\") pod \"authentication-operator-69f744f599-qt2tv\" (UID: \"db0361d2-e90e-43d1-8ea7-29c7167d4d05\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qt2tv" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.530142 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4e73ade2-f6bf-4f9b-98da-1b6a2c9a7d60-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-w8527\" (UID: \"4e73ade2-f6bf-4f9b-98da-1b6a2c9a7d60\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w8527" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.530179 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/77eda81d-119c-4876-a951-e7262ed136b9-etcd-serving-ca\") pod 
\"apiserver-76f77b778f-vwt7h\" (UID: \"77eda81d-119c-4876-a951-e7262ed136b9\") " pod="openshift-apiserver/apiserver-76f77b778f-vwt7h" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.530210 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2508573-4890-48b6-9119-93560ee4c5d9-registry-tls\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.530263 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjrrc\" (UniqueName: \"kubernetes.io/projected/4e73ade2-f6bf-4f9b-98da-1b6a2c9a7d60-kube-api-access-bjrrc\") pod \"cluster-samples-operator-665b6dd947-w8527\" (UID: \"4e73ade2-f6bf-4f9b-98da-1b6a2c9a7d60\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w8527" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.530294 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/77eda81d-119c-4876-a951-e7262ed136b9-node-pullsecrets\") pod \"apiserver-76f77b778f-vwt7h\" (UID: \"77eda81d-119c-4876-a951-e7262ed136b9\") " pod="openshift-apiserver/apiserver-76f77b778f-vwt7h" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.530323 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/77eda81d-119c-4876-a951-e7262ed136b9-audit\") pod \"apiserver-76f77b778f-vwt7h\" (UID: \"77eda81d-119c-4876-a951-e7262ed136b9\") " pod="openshift-apiserver/apiserver-76f77b778f-vwt7h" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.530352 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/77eda81d-119c-4876-a951-e7262ed136b9-etcd-client\") pod \"apiserver-76f77b778f-vwt7h\" (UID: \"77eda81d-119c-4876-a951-e7262ed136b9\") " pod="openshift-apiserver/apiserver-76f77b778f-vwt7h" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.530390 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7c3552dc-a0cf-4072-91e1-030803f6014d-oauth-serving-cert\") pod \"console-f9d7485db-wprnh\" (UID: \"7c3552dc-a0cf-4072-91e1-030803f6014d\") " pod="openshift-console/console-f9d7485db-wprnh" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.530483 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f53f5c28-7650-4388-981f-4ae86cbf8068-config-volume\") pod \"dns-default-n57nw\" (UID: \"f53f5c28-7650-4388-981f-4ae86cbf8068\") " pod="openshift-dns/dns-default-n57nw" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.530541 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znbhg\" (UniqueName: \"kubernetes.io/projected/f53f5c28-7650-4388-981f-4ae86cbf8068-kube-api-access-znbhg\") pod \"dns-default-n57nw\" (UID: \"f53f5c28-7650-4388-981f-4ae86cbf8068\") " pod="openshift-dns/dns-default-n57nw" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.530595 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2508573-4890-48b6-9119-93560ee4c5d9-registry-certificates\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.530621 4750 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f16bfec-96c4-4e7b-83e1-aed47eecbefb-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-s7qzd\" (UID: \"4f16bfec-96c4-4e7b-83e1-aed47eecbefb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s7qzd" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.530646 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqgpr\" (UniqueName: \"kubernetes.io/projected/95f54243-6144-4ea9-acc6-b3f3c580e037-kube-api-access-dqgpr\") pod \"kube-storage-version-migrator-operator-b67b599dd-gkppd\" (UID: \"95f54243-6144-4ea9-acc6-b3f3c580e037\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gkppd" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.530684 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2254728a-2ece-43b6-8b5f-a16970277d33-config\") pod \"kube-apiserver-operator-766d6c64bb-44fvq\" (UID: \"2254728a-2ece-43b6-8b5f-a16970277d33\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-44fvq" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.530722 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8f878d47-ed9f-4f96-a7c0-153c58ff33cf-etcd-client\") pod \"apiserver-7bbb656c7d-5g5nc\" (UID: \"8f878d47-ed9f-4f96-a7c0-153c58ff33cf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5g5nc" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.530740 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/90f5c875-3507-453d-9901-fd60a1476f71-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-4zxlq\" (UID: \"90f5c875-3507-453d-9901-fd60a1476f71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4zxlq" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.530756 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7c3552dc-a0cf-4072-91e1-030803f6014d-console-config\") pod \"console-f9d7485db-wprnh\" (UID: \"7c3552dc-a0cf-4072-91e1-030803f6014d\") " pod="openshift-console/console-f9d7485db-wprnh" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.530778 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-666l9\" (UniqueName: \"kubernetes.io/projected/db0361d2-e90e-43d1-8ea7-29c7167d4d05-kube-api-access-666l9\") pod \"authentication-operator-69f744f599-qt2tv\" (UID: \"db0361d2-e90e-43d1-8ea7-29c7167d4d05\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qt2tv" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.530796 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90f5c875-3507-453d-9901-fd60a1476f71-serving-cert\") pod \"controller-manager-879f6c89f-4zxlq\" (UID: \"90f5c875-3507-453d-9901-fd60a1476f71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4zxlq" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.530815 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/77eda81d-119c-4876-a951-e7262ed136b9-audit-dir\") pod \"apiserver-76f77b778f-vwt7h\" (UID: \"77eda81d-119c-4876-a951-e7262ed136b9\") " pod="openshift-apiserver/apiserver-76f77b778f-vwt7h" Oct 08 18:13:05 crc kubenswrapper[4750]: 
I1008 18:13:05.530845 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.530867 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/77eda81d-119c-4876-a951-e7262ed136b9-encryption-config\") pod \"apiserver-76f77b778f-vwt7h\" (UID: \"77eda81d-119c-4876-a951-e7262ed136b9\") " pod="openshift-apiserver/apiserver-76f77b778f-vwt7h" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.530893 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2508573-4890-48b6-9119-93560ee4c5d9-trusted-ca\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.530913 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f878d47-ed9f-4f96-a7c0-153c58ff33cf-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-5g5nc\" (UID: \"8f878d47-ed9f-4f96-a7c0-153c58ff33cf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5g5nc" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.530932 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8f878d47-ed9f-4f96-a7c0-153c58ff33cf-audit-dir\") pod \"apiserver-7bbb656c7d-5g5nc\" (UID: 
\"8f878d47-ed9f-4f96-a7c0-153c58ff33cf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5g5nc" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.530956 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqhs4\" (UniqueName: \"kubernetes.io/projected/346b036e-ea6a-4cc3-ab4a-7dbbdf59405c-kube-api-access-lqhs4\") pod \"cluster-image-registry-operator-dc59b4c8b-4lzzt\" (UID: \"346b036e-ea6a-4cc3-ab4a-7dbbdf59405c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4lzzt" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.530978 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/77eda81d-119c-4876-a951-e7262ed136b9-image-import-ca\") pod \"apiserver-76f77b778f-vwt7h\" (UID: \"77eda81d-119c-4876-a951-e7262ed136b9\") " pod="openshift-apiserver/apiserver-76f77b778f-vwt7h" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.530999 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95f54243-6144-4ea9-acc6-b3f3c580e037-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-gkppd\" (UID: \"95f54243-6144-4ea9-acc6-b3f3c580e037\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gkppd" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.531017 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/90f5c875-3507-453d-9901-fd60a1476f71-client-ca\") pod \"controller-manager-879f6c89f-4zxlq\" (UID: \"90f5c875-3507-453d-9901-fd60a1476f71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4zxlq" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.531035 4750 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77eda81d-119c-4876-a951-e7262ed136b9-config\") pod \"apiserver-76f77b778f-vwt7h\" (UID: \"77eda81d-119c-4876-a951-e7262ed136b9\") " pod="openshift-apiserver/apiserver-76f77b778f-vwt7h" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.531059 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2508573-4890-48b6-9119-93560ee4c5d9-bound-sa-token\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.531078 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2254728a-2ece-43b6-8b5f-a16970277d33-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-44fvq\" (UID: \"2254728a-2ece-43b6-8b5f-a16970277d33\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-44fvq" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.531098 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7c3552dc-a0cf-4072-91e1-030803f6014d-service-ca\") pod \"console-f9d7485db-wprnh\" (UID: \"7c3552dc-a0cf-4072-91e1-030803f6014d\") " pod="openshift-console/console-f9d7485db-wprnh" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.531122 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/346b036e-ea6a-4cc3-ab4a-7dbbdf59405c-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4lzzt\" (UID: \"346b036e-ea6a-4cc3-ab4a-7dbbdf59405c\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4lzzt" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.531149 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db0361d2-e90e-43d1-8ea7-29c7167d4d05-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-qt2tv\" (UID: \"db0361d2-e90e-43d1-8ea7-29c7167d4d05\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qt2tv" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.531168 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/346b036e-ea6a-4cc3-ab4a-7dbbdf59405c-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4lzzt\" (UID: \"346b036e-ea6a-4cc3-ab4a-7dbbdf59405c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4lzzt" Oct 08 18:13:05 crc kubenswrapper[4750]: E1008 18:13:05.531188 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 18:13:06.031171807 +0000 UTC m=+141.944142820 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2cxc5" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.531216 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f53f5c28-7650-4388-981f-4ae86cbf8068-metrics-tls\") pod \"dns-default-n57nw\" (UID: \"f53f5c28-7650-4388-981f-4ae86cbf8068\") " pod="openshift-dns/dns-default-n57nw" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.531241 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fplk6\" (UniqueName: \"kubernetes.io/projected/77eda81d-119c-4876-a951-e7262ed136b9-kube-api-access-fplk6\") pod \"apiserver-76f77b778f-vwt7h\" (UID: \"77eda81d-119c-4876-a951-e7262ed136b9\") " pod="openshift-apiserver/apiserver-76f77b778f-vwt7h" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.531265 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c3552dc-a0cf-4072-91e1-030803f6014d-console-serving-cert\") pod \"console-f9d7485db-wprnh\" (UID: \"7c3552dc-a0cf-4072-91e1-030803f6014d\") " pod="openshift-console/console-f9d7485db-wprnh" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.556930 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hzx59" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.575864 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-bvwfn" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.584813 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-jw49l" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.609425 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-59hsn" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.613821 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-4r4tz"] Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.632059 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.632273 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79a05c13-3cd6-48ce-ba2f-6cfdf1a51093-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dkkbr\" (UID: \"79a05c13-3cd6-48ce-ba2f-6cfdf1a51093\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dkkbr" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.632300 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c5bf2447-4038-4736-999a-f5b87905a99a-images\") 
pod \"machine-config-operator-74547568cd-hsr6t\" (UID: \"c5bf2447-4038-4736-999a-f5b87905a99a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hsr6t" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.632316 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77a8f6e2-2a14-4876-b61c-4f9eb7581f90-config\") pod \"service-ca-operator-777779d784-2dhb9\" (UID: \"77a8f6e2-2a14-4876-b61c-4f9eb7581f90\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2dhb9" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.632345 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znbhg\" (UniqueName: \"kubernetes.io/projected/f53f5c28-7650-4388-981f-4ae86cbf8068-kube-api-access-znbhg\") pod \"dns-default-n57nw\" (UID: \"f53f5c28-7650-4388-981f-4ae86cbf8068\") " pod="openshift-dns/dns-default-n57nw" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.632367 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2508573-4890-48b6-9119-93560ee4c5d9-registry-certificates\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.632384 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqgpr\" (UniqueName: \"kubernetes.io/projected/95f54243-6144-4ea9-acc6-b3f3c580e037-kube-api-access-dqgpr\") pod \"kube-storage-version-migrator-operator-b67b599dd-gkppd\" (UID: \"95f54243-6144-4ea9-acc6-b3f3c580e037\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gkppd" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.632399 4750 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/08bc2767-2a02-4bc1-b3cd-47670db11792-metrics-certs\") pod \"router-default-5444994796-tdftm\" (UID: \"08bc2767-2a02-4bc1-b3cd-47670db11792\") " pod="openshift-ingress/router-default-5444994796-tdftm" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.632439 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9fa6043a-28e1-436e-9d43-f4f37c431457-srv-cert\") pod \"catalog-operator-68c6474976-wrzwf\" (UID: \"9fa6043a-28e1-436e-9d43-f4f37c431457\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wrzwf" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.632455 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kts5r\" (UniqueName: \"kubernetes.io/projected/988fa55b-c750-4c47-aaf6-fb602e6477d7-kube-api-access-kts5r\") pod \"multus-admission-controller-857f4d67dd-xj8d6\" (UID: \"988fa55b-c750-4c47-aaf6-fb602e6477d7\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-xj8d6" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.632484 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8f878d47-ed9f-4f96-a7c0-153c58ff33cf-etcd-client\") pod \"apiserver-7bbb656c7d-5g5nc\" (UID: \"8f878d47-ed9f-4f96-a7c0-153c58ff33cf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5g5nc" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.632501 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7c3552dc-a0cf-4072-91e1-030803f6014d-console-config\") pod \"console-f9d7485db-wprnh\" (UID: \"7c3552dc-a0cf-4072-91e1-030803f6014d\") " 
pod="openshift-console/console-f9d7485db-wprnh" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.632526 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-666l9\" (UniqueName: \"kubernetes.io/projected/db0361d2-e90e-43d1-8ea7-29c7167d4d05-kube-api-access-666l9\") pod \"authentication-operator-69f744f599-qt2tv\" (UID: \"db0361d2-e90e-43d1-8ea7-29c7167d4d05\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qt2tv" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.632541 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wj2h\" (UniqueName: \"kubernetes.io/projected/77a8f6e2-2a14-4876-b61c-4f9eb7581f90-kube-api-access-8wj2h\") pod \"service-ca-operator-777779d784-2dhb9\" (UID: \"77a8f6e2-2a14-4876-b61c-4f9eb7581f90\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2dhb9" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.632580 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/55eee756-293d-4ca8-91d0-6f95f22e72dc-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-fsv6k\" (UID: \"55eee756-293d-4ca8-91d0-6f95f22e72dc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fsv6k" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.632606 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/77eda81d-119c-4876-a951-e7262ed136b9-encryption-config\") pod \"apiserver-76f77b778f-vwt7h\" (UID: \"77eda81d-119c-4876-a951-e7262ed136b9\") " pod="openshift-apiserver/apiserver-76f77b778f-vwt7h" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.632625 4750 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e522b21c-5710-4f2e-bff5-b76226c88d2f-tmpfs\") pod \"packageserver-d55dfcdfc-glbv9\" (UID: \"e522b21c-5710-4f2e-bff5-b76226c88d2f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-glbv9" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.632659 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2508573-4890-48b6-9119-93560ee4c5d9-trusted-ca\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.632677 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqhs4\" (UniqueName: \"kubernetes.io/projected/346b036e-ea6a-4cc3-ab4a-7dbbdf59405c-kube-api-access-lqhs4\") pod \"cluster-image-registry-operator-dc59b4c8b-4lzzt\" (UID: \"346b036e-ea6a-4cc3-ab4a-7dbbdf59405c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4lzzt" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.632698 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/77eda81d-119c-4876-a951-e7262ed136b9-image-import-ca\") pod \"apiserver-76f77b778f-vwt7h\" (UID: \"77eda81d-119c-4876-a951-e7262ed136b9\") " pod="openshift-apiserver/apiserver-76f77b778f-vwt7h" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.632719 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wprks\" (UniqueName: \"kubernetes.io/projected/59a547a6-ce7d-41ea-a985-0e477deec34a-kube-api-access-wprks\") pod \"ingress-canary-wtkzp\" (UID: \"59a547a6-ce7d-41ea-a985-0e477deec34a\") " 
pod="openshift-ingress-canary/ingress-canary-wtkzp" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.632735 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2508573-4890-48b6-9119-93560ee4c5d9-bound-sa-token\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.632749 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7c3552dc-a0cf-4072-91e1-030803f6014d-service-ca\") pod \"console-f9d7485db-wprnh\" (UID: \"7c3552dc-a0cf-4072-91e1-030803f6014d\") " pod="openshift-console/console-f9d7485db-wprnh" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.632766 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/19b57874-b782-45bf-a257-3a162023a242-signing-key\") pod \"service-ca-9c57cc56f-kjc7k\" (UID: \"19b57874-b782-45bf-a257-3a162023a242\") " pod="openshift-service-ca/service-ca-9c57cc56f-kjc7k" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.632794 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/346b036e-ea6a-4cc3-ab4a-7dbbdf59405c-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4lzzt\" (UID: \"346b036e-ea6a-4cc3-ab4a-7dbbdf59405c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4lzzt" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.632809 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwhkq\" (UniqueName: \"kubernetes.io/projected/884c29cd-b721-400c-b319-510f191f02dd-kube-api-access-xwhkq\") pod 
\"marketplace-operator-79b997595-hg6dd\" (UID: \"884c29cd-b721-400c-b319-510f191f02dd\") " pod="openshift-marketplace/marketplace-operator-79b997595-hg6dd" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.632833 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/08bc2767-2a02-4bc1-b3cd-47670db11792-stats-auth\") pod \"router-default-5444994796-tdftm\" (UID: \"08bc2767-2a02-4bc1-b3cd-47670db11792\") " pod="openshift-ingress/router-default-5444994796-tdftm" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.632858 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db0361d2-e90e-43d1-8ea7-29c7167d4d05-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-qt2tv\" (UID: \"db0361d2-e90e-43d1-8ea7-29c7167d4d05\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qt2tv" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.632874 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9fa6043a-28e1-436e-9d43-f4f37c431457-profile-collector-cert\") pod \"catalog-operator-68c6474976-wrzwf\" (UID: \"9fa6043a-28e1-436e-9d43-f4f37c431457\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wrzwf" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.632890 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59a547a6-ce7d-41ea-a985-0e477deec34a-cert\") pod \"ingress-canary-wtkzp\" (UID: \"59a547a6-ce7d-41ea-a985-0e477deec34a\") " pod="openshift-ingress-canary/ingress-canary-wtkzp" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.632906 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-fplk6\" (UniqueName: \"kubernetes.io/projected/77eda81d-119c-4876-a951-e7262ed136b9-kube-api-access-fplk6\") pod \"apiserver-76f77b778f-vwt7h\" (UID: \"77eda81d-119c-4876-a951-e7262ed136b9\") " pod="openshift-apiserver/apiserver-76f77b778f-vwt7h" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.632921 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c3552dc-a0cf-4072-91e1-030803f6014d-console-serving-cert\") pod \"console-f9d7485db-wprnh\" (UID: \"7c3552dc-a0cf-4072-91e1-030803f6014d\") " pod="openshift-console/console-f9d7485db-wprnh" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.632935 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8f878d47-ed9f-4f96-a7c0-153c58ff33cf-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-5g5nc\" (UID: \"8f878d47-ed9f-4f96-a7c0-153c58ff33cf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5g5nc" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.632949 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9qts\" (UniqueName: \"kubernetes.io/projected/80400620-75a2-4d87-ad1a-2ef29346babc-kube-api-access-j9qts\") pod \"migrator-59844c95c7-rdtt7\" (UID: \"80400620-75a2-4d87-ad1a-2ef29346babc\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rdtt7" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.632968 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b594b560-3db7-4be0-8799-f6eeac7d3aeb-profile-collector-cert\") pod \"olm-operator-6b444d44fb-l6dqx\" (UID: \"b594b560-3db7-4be0-8799-f6eeac7d3aeb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-l6dqx" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 
18:13:05.632986 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5408b201-966b-4e3e-b681-faaa2f32d633-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-zt7f6\" (UID: \"5408b201-966b-4e3e-b681-faaa2f32d633\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zt7f6" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.633074 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1c3d5788-4162-4ca6-b3d4-80d5d3062ea3-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-xz2wz\" (UID: \"1c3d5788-4162-4ca6-b3d4-80d5d3062ea3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xz2wz" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.633119 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/08bc2767-2a02-4bc1-b3cd-47670db11792-default-certificate\") pod \"router-default-5444994796-tdftm\" (UID: \"08bc2767-2a02-4bc1-b3cd-47670db11792\") " pod="openshift-ingress/router-default-5444994796-tdftm" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.633137 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsm5j\" (UniqueName: \"kubernetes.io/projected/e2508573-4890-48b6-9119-93560ee4c5d9-kube-api-access-jsm5j\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.633160 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f878d47-ed9f-4f96-a7c0-153c58ff33cf-serving-cert\") pod 
\"apiserver-7bbb656c7d-5g5nc\" (UID: \"8f878d47-ed9f-4f96-a7c0-153c58ff33cf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5g5nc" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.633175 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b594b560-3db7-4be0-8799-f6eeac7d3aeb-srv-cert\") pod \"olm-operator-6b444d44fb-l6dqx\" (UID: \"b594b560-3db7-4be0-8799-f6eeac7d3aeb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-l6dqx" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.633192 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hp5m\" (UniqueName: \"kubernetes.io/projected/4f16bfec-96c4-4e7b-83e1-aed47eecbefb-kube-api-access-5hp5m\") pod \"openshift-apiserver-operator-796bbdcf4f-s7qzd\" (UID: \"4f16bfec-96c4-4e7b-83e1-aed47eecbefb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s7qzd" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.633219 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db0361d2-e90e-43d1-8ea7-29c7167d4d05-serving-cert\") pod \"authentication-operator-69f744f599-qt2tv\" (UID: \"db0361d2-e90e-43d1-8ea7-29c7167d4d05\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qt2tv" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.633235 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/2b48fe6f-87c9-43fc-aa46-4c2fa7e62fe8-mountpoint-dir\") pod \"csi-hostpathplugin-n87fc\" (UID: \"2b48fe6f-87c9-43fc-aa46-4c2fa7e62fe8\") " pod="hostpath-provisioner/csi-hostpathplugin-n87fc" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.633254 4750 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/2b48fe6f-87c9-43fc-aa46-4c2fa7e62fe8-csi-data-dir\") pod \"csi-hostpathplugin-n87fc\" (UID: \"2b48fe6f-87c9-43fc-aa46-4c2fa7e62fe8\") " pod="hostpath-provisioner/csi-hostpathplugin-n87fc" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.633271 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/346b036e-ea6a-4cc3-ab4a-7dbbdf59405c-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4lzzt\" (UID: \"346b036e-ea6a-4cc3-ab4a-7dbbdf59405c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4lzzt" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.633288 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-829vf\" (UniqueName: \"kubernetes.io/projected/4227b4db-94b2-4e5f-a231-aa0b6bbe685c-kube-api-access-829vf\") pod \"collect-profiles-29332440-n7qpc\" (UID: \"4227b4db-94b2-4e5f-a231-aa0b6bbe685c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332440-n7qpc" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.633304 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db0361d2-e90e-43d1-8ea7-29c7167d4d05-service-ca-bundle\") pod \"authentication-operator-69f744f599-qt2tv\" (UID: \"db0361d2-e90e-43d1-8ea7-29c7167d4d05\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qt2tv" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.633320 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4e73ade2-f6bf-4f9b-98da-1b6a2c9a7d60-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-w8527\" (UID: 
\"4e73ade2-f6bf-4f9b-98da-1b6a2c9a7d60\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w8527" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.633337 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90f5c875-3507-453d-9901-fd60a1476f71-config\") pod \"controller-manager-879f6c89f-4zxlq\" (UID: \"90f5c875-3507-453d-9901-fd60a1476f71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4zxlq" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.633371 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2508573-4890-48b6-9119-93560ee4c5d9-registry-tls\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.633394 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/77eda81d-119c-4876-a951-e7262ed136b9-node-pullsecrets\") pod \"apiserver-76f77b778f-vwt7h\" (UID: \"77eda81d-119c-4876-a951-e7262ed136b9\") " pod="openshift-apiserver/apiserver-76f77b778f-vwt7h" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.633409 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/77eda81d-119c-4876-a951-e7262ed136b9-audit\") pod \"apiserver-76f77b778f-vwt7h\" (UID: \"77eda81d-119c-4876-a951-e7262ed136b9\") " pod="openshift-apiserver/apiserver-76f77b778f-vwt7h" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.633424 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/77eda81d-119c-4876-a951-e7262ed136b9-etcd-client\") pod \"apiserver-76f77b778f-vwt7h\" 
(UID: \"77eda81d-119c-4876-a951-e7262ed136b9\") " pod="openshift-apiserver/apiserver-76f77b778f-vwt7h" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.633449 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/79a05c13-3cd6-48ce-ba2f-6cfdf1a51093-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dkkbr\" (UID: \"79a05c13-3cd6-48ce-ba2f-6cfdf1a51093\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dkkbr" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.633463 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1c3d5788-4162-4ca6-b3d4-80d5d3062ea3-proxy-tls\") pod \"machine-config-controller-84d6567774-xz2wz\" (UID: \"1c3d5788-4162-4ca6-b3d4-80d5d3062ea3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xz2wz" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.633479 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7c3552dc-a0cf-4072-91e1-030803f6014d-oauth-serving-cert\") pod \"console-f9d7485db-wprnh\" (UID: \"7c3552dc-a0cf-4072-91e1-030803f6014d\") " pod="openshift-console/console-f9d7485db-wprnh" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.633499 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z27j6\" (UniqueName: \"kubernetes.io/projected/ce206052-32b9-4632-a7d7-4bd6630a6fd5-kube-api-access-z27j6\") pod \"package-server-manager-789f6589d5-5ff9w\" (UID: \"ce206052-32b9-4632-a7d7-4bd6630a6fd5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5ff9w" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.633514 4750 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/19b57874-b782-45bf-a257-3a162023a242-signing-cabundle\") pod \"service-ca-9c57cc56f-kjc7k\" (UID: \"19b57874-b782-45bf-a257-3a162023a242\") " pod="openshift-service-ca/service-ca-9c57cc56f-kjc7k" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.633563 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f53f5c28-7650-4388-981f-4ae86cbf8068-config-volume\") pod \"dns-default-n57nw\" (UID: \"f53f5c28-7650-4388-981f-4ae86cbf8068\") " pod="openshift-dns/dns-default-n57nw" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.633611 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4227b4db-94b2-4e5f-a231-aa0b6bbe685c-secret-volume\") pod \"collect-profiles-29332440-n7qpc\" (UID: \"4227b4db-94b2-4e5f-a231-aa0b6bbe685c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332440-n7qpc" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.633629 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f16bfec-96c4-4e7b-83e1-aed47eecbefb-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-s7qzd\" (UID: \"4f16bfec-96c4-4e7b-83e1-aed47eecbefb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s7qzd" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.633649 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2254728a-2ece-43b6-8b5f-a16970277d33-config\") pod \"kube-apiserver-operator-766d6c64bb-44fvq\" (UID: \"2254728a-2ece-43b6-8b5f-a16970277d33\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-44fvq" Oct 08 
18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.633667 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e522b21c-5710-4f2e-bff5-b76226c88d2f-webhook-cert\") pod \"packageserver-d55dfcdfc-glbv9\" (UID: \"e522b21c-5710-4f2e-bff5-b76226c88d2f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-glbv9" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.633699 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08bc2767-2a02-4bc1-b3cd-47670db11792-service-ca-bundle\") pod \"router-default-5444994796-tdftm\" (UID: \"08bc2767-2a02-4bc1-b3cd-47670db11792\") " pod="openshift-ingress/router-default-5444994796-tdftm" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.633716 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/90f5c875-3507-453d-9901-fd60a1476f71-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-4zxlq\" (UID: \"90f5c875-3507-453d-9901-fd60a1476f71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4zxlq" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.633740 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90f5c875-3507-453d-9901-fd60a1476f71-serving-cert\") pod \"controller-manager-879f6c89f-4zxlq\" (UID: \"90f5c875-3507-453d-9901-fd60a1476f71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4zxlq" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.633756 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/77eda81d-119c-4876-a951-e7262ed136b9-audit-dir\") pod \"apiserver-76f77b778f-vwt7h\" (UID: 
\"77eda81d-119c-4876-a951-e7262ed136b9\") " pod="openshift-apiserver/apiserver-76f77b778f-vwt7h" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.633772 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/884c29cd-b721-400c-b319-510f191f02dd-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hg6dd\" (UID: \"884c29cd-b721-400c-b319-510f191f02dd\") " pod="openshift-marketplace/marketplace-operator-79b997595-hg6dd" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.633787 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2b48fe6f-87c9-43fc-aa46-4c2fa7e62fe8-registration-dir\") pod \"csi-hostpathplugin-n87fc\" (UID: \"2b48fe6f-87c9-43fc-aa46-4c2fa7e62fe8\") " pod="hostpath-provisioner/csi-hostpathplugin-n87fc" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.633811 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c5bf2447-4038-4736-999a-f5b87905a99a-proxy-tls\") pod \"machine-config-operator-74547568cd-hsr6t\" (UID: \"c5bf2447-4038-4736-999a-f5b87905a99a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hsr6t" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.633833 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4tw2\" (UniqueName: \"kubernetes.io/projected/55eee756-293d-4ca8-91d0-6f95f22e72dc-kube-api-access-s4tw2\") pod \"control-plane-machine-set-operator-78cbb6b69f-fsv6k\" (UID: \"55eee756-293d-4ca8-91d0-6f95f22e72dc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fsv6k" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.633850 4750 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/988fa55b-c750-4c47-aaf6-fb602e6477d7-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-xj8d6\" (UID: \"988fa55b-c750-4c47-aaf6-fb602e6477d7\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-xj8d6" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.633885 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f878d47-ed9f-4f96-a7c0-153c58ff33cf-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-5g5nc\" (UID: \"8f878d47-ed9f-4f96-a7c0-153c58ff33cf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5g5nc" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.633901 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8f878d47-ed9f-4f96-a7c0-153c58ff33cf-audit-dir\") pod \"apiserver-7bbb656c7d-5g5nc\" (UID: \"8f878d47-ed9f-4f96-a7c0-153c58ff33cf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5g5nc" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.633915 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/884c29cd-b721-400c-b319-510f191f02dd-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hg6dd\" (UID: \"884c29cd-b721-400c-b319-510f191f02dd\") " pod="openshift-marketplace/marketplace-operator-79b997595-hg6dd" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.633933 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/90f5c875-3507-453d-9901-fd60a1476f71-client-ca\") pod \"controller-manager-879f6c89f-4zxlq\" (UID: \"90f5c875-3507-453d-9901-fd60a1476f71\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-4zxlq" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.633948 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77eda81d-119c-4876-a951-e7262ed136b9-config\") pod \"apiserver-76f77b778f-vwt7h\" (UID: \"77eda81d-119c-4876-a951-e7262ed136b9\") " pod="openshift-apiserver/apiserver-76f77b778f-vwt7h" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.633964 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/354ca34c-c524-46db-91e6-7e99e8e1eebe-node-bootstrap-token\") pod \"machine-config-server-mfdcf\" (UID: \"354ca34c-c524-46db-91e6-7e99e8e1eebe\") " pod="openshift-machine-config-operator/machine-config-server-mfdcf" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.633978 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/2b48fe6f-87c9-43fc-aa46-4c2fa7e62fe8-plugins-dir\") pod \"csi-hostpathplugin-n87fc\" (UID: \"2b48fe6f-87c9-43fc-aa46-4c2fa7e62fe8\") " pod="hostpath-provisioner/csi-hostpathplugin-n87fc" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.633994 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95f54243-6144-4ea9-acc6-b3f3c580e037-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-gkppd\" (UID: \"95f54243-6144-4ea9-acc6-b3f3c580e037\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gkppd" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.634020 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/2254728a-2ece-43b6-8b5f-a16970277d33-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-44fvq\" (UID: \"2254728a-2ece-43b6-8b5f-a16970277d33\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-44fvq" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.634038 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c5bf2447-4038-4736-999a-f5b87905a99a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-hsr6t\" (UID: \"c5bf2447-4038-4736-999a-f5b87905a99a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hsr6t" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.634053 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77mrd\" (UniqueName: \"kubernetes.io/projected/1c3d5788-4162-4ca6-b3d4-80d5d3062ea3-kube-api-access-77mrd\") pod \"machine-config-controller-84d6567774-xz2wz\" (UID: \"1c3d5788-4162-4ca6-b3d4-80d5d3062ea3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xz2wz" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.634101 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/354ca34c-c524-46db-91e6-7e99e8e1eebe-certs\") pod \"machine-config-server-mfdcf\" (UID: \"354ca34c-c524-46db-91e6-7e99e8e1eebe\") " pod="openshift-machine-config-operator/machine-config-server-mfdcf" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.634138 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76gxm\" (UniqueName: \"kubernetes.io/projected/9fa6043a-28e1-436e-9d43-f4f37c431457-kube-api-access-76gxm\") pod \"catalog-operator-68c6474976-wrzwf\" (UID: \"9fa6043a-28e1-436e-9d43-f4f37c431457\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wrzwf" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.634166 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56rj9\" (UniqueName: \"kubernetes.io/projected/2b48fe6f-87c9-43fc-aa46-4c2fa7e62fe8-kube-api-access-56rj9\") pod \"csi-hostpathplugin-n87fc\" (UID: \"2b48fe6f-87c9-43fc-aa46-4c2fa7e62fe8\") " pod="hostpath-provisioner/csi-hostpathplugin-n87fc" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.634183 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/346b036e-ea6a-4cc3-ab4a-7dbbdf59405c-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4lzzt\" (UID: \"346b036e-ea6a-4cc3-ab4a-7dbbdf59405c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4lzzt" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.634208 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e522b21c-5710-4f2e-bff5-b76226c88d2f-apiservice-cert\") pod \"packageserver-d55dfcdfc-glbv9\" (UID: \"e522b21c-5710-4f2e-bff5-b76226c88d2f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-glbv9" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.634234 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f53f5c28-7650-4388-981f-4ae86cbf8068-metrics-tls\") pod \"dns-default-n57nw\" (UID: \"f53f5c28-7650-4388-981f-4ae86cbf8068\") " pod="openshift-dns/dns-default-n57nw" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.634250 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p5bv\" (UniqueName: 
\"kubernetes.io/projected/e522b21c-5710-4f2e-bff5-b76226c88d2f-kube-api-access-5p5bv\") pod \"packageserver-d55dfcdfc-glbv9\" (UID: \"e522b21c-5710-4f2e-bff5-b76226c88d2f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-glbv9" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.634275 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db0361d2-e90e-43d1-8ea7-29c7167d4d05-config\") pod \"authentication-operator-69f744f599-qt2tv\" (UID: \"db0361d2-e90e-43d1-8ea7-29c7167d4d05\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qt2tv" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.634297 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5408b201-966b-4e3e-b681-faaa2f32d633-config\") pod \"kube-controller-manager-operator-78b949d7b-zt7f6\" (UID: \"5408b201-966b-4e3e-b681-faaa2f32d633\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zt7f6" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.634320 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79a05c13-3cd6-48ce-ba2f-6cfdf1a51093-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dkkbr\" (UID: \"79a05c13-3cd6-48ce-ba2f-6cfdf1a51093\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dkkbr" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.634349 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2254728a-2ece-43b6-8b5f-a16970277d33-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-44fvq\" (UID: \"2254728a-2ece-43b6-8b5f-a16970277d33\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-44fvq" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.634371 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5408b201-966b-4e3e-b681-faaa2f32d633-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-zt7f6\" (UID: \"5408b201-966b-4e3e-b681-faaa2f32d633\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zt7f6" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.634393 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcm5r\" (UniqueName: \"kubernetes.io/projected/7c3552dc-a0cf-4072-91e1-030803f6014d-kube-api-access-tcm5r\") pod \"console-f9d7485db-wprnh\" (UID: \"7c3552dc-a0cf-4072-91e1-030803f6014d\") " pod="openshift-console/console-f9d7485db-wprnh" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.634417 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2508573-4890-48b6-9119-93560ee4c5d9-ca-trust-extracted\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.634476 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8f878d47-ed9f-4f96-a7c0-153c58ff33cf-encryption-config\") pod \"apiserver-7bbb656c7d-5g5nc\" (UID: \"8f878d47-ed9f-4f96-a7c0-153c58ff33cf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5g5nc" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.634500 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6rrv\" (UniqueName: 
\"kubernetes.io/projected/08bc2767-2a02-4bc1-b3cd-47670db11792-kube-api-access-v6rrv\") pod \"router-default-5444994796-tdftm\" (UID: \"08bc2767-2a02-4bc1-b3cd-47670db11792\") " pod="openshift-ingress/router-default-5444994796-tdftm" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.634524 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f16bfec-96c4-4e7b-83e1-aed47eecbefb-config\") pod \"openshift-apiserver-operator-796bbdcf4f-s7qzd\" (UID: \"4f16bfec-96c4-4e7b-83e1-aed47eecbefb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s7qzd" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.634546 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq6cd\" (UniqueName: \"kubernetes.io/projected/b594b560-3db7-4be0-8799-f6eeac7d3aeb-kube-api-access-nq6cd\") pod \"olm-operator-6b444d44fb-l6dqx\" (UID: \"b594b560-3db7-4be0-8799-f6eeac7d3aeb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-l6dqx" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.634585 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4227b4db-94b2-4e5f-a231-aa0b6bbe685c-config-volume\") pod \"collect-profiles-29332440-n7qpc\" (UID: \"4227b4db-94b2-4e5f-a231-aa0b6bbe685c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332440-n7qpc" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.634601 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrlmq\" (UniqueName: \"kubernetes.io/projected/c5bf2447-4038-4736-999a-f5b87905a99a-kube-api-access-vrlmq\") pod \"machine-config-operator-74547568cd-hsr6t\" (UID: \"c5bf2447-4038-4736-999a-f5b87905a99a\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hsr6t" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.634618 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bzw6\" (UniqueName: \"kubernetes.io/projected/90f5c875-3507-453d-9901-fd60a1476f71-kube-api-access-5bzw6\") pod \"controller-manager-879f6c89f-4zxlq\" (UID: \"90f5c875-3507-453d-9901-fd60a1476f71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4zxlq" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.634635 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77eda81d-119c-4876-a951-e7262ed136b9-serving-cert\") pod \"apiserver-76f77b778f-vwt7h\" (UID: \"77eda81d-119c-4876-a951-e7262ed136b9\") " pod="openshift-apiserver/apiserver-76f77b778f-vwt7h" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.634650 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c3552dc-a0cf-4072-91e1-030803f6014d-trusted-ca-bundle\") pod \"console-f9d7485db-wprnh\" (UID: \"7c3552dc-a0cf-4072-91e1-030803f6014d\") " pod="openshift-console/console-f9d7485db-wprnh" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.634666 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77mmb\" (UniqueName: \"kubernetes.io/projected/19b57874-b782-45bf-a257-3a162023a242-kube-api-access-77mmb\") pod \"service-ca-9c57cc56f-kjc7k\" (UID: \"19b57874-b782-45bf-a257-3a162023a242\") " pod="openshift-service-ca/service-ca-9c57cc56f-kjc7k" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.634690 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8f878d47-ed9f-4f96-a7c0-153c58ff33cf-audit-policies\") pod 
\"apiserver-7bbb656c7d-5g5nc\" (UID: \"8f878d47-ed9f-4f96-a7c0-153c58ff33cf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5g5nc" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.634706 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95f54243-6144-4ea9-acc6-b3f3c580e037-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-gkppd\" (UID: \"95f54243-6144-4ea9-acc6-b3f3c580e037\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gkppd" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.634722 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2b48fe6f-87c9-43fc-aa46-4c2fa7e62fe8-socket-dir\") pod \"csi-hostpathplugin-n87fc\" (UID: \"2b48fe6f-87c9-43fc-aa46-4c2fa7e62fe8\") " pod="hostpath-provisioner/csi-hostpathplugin-n87fc" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.634739 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krxft\" (UniqueName: \"kubernetes.io/projected/8f878d47-ed9f-4f96-a7c0-153c58ff33cf-kube-api-access-krxft\") pod \"apiserver-7bbb656c7d-5g5nc\" (UID: \"8f878d47-ed9f-4f96-a7c0-153c58ff33cf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5g5nc" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.634757 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77eda81d-119c-4876-a951-e7262ed136b9-trusted-ca-bundle\") pod \"apiserver-76f77b778f-vwt7h\" (UID: \"77eda81d-119c-4876-a951-e7262ed136b9\") " pod="openshift-apiserver/apiserver-76f77b778f-vwt7h" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.634773 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-22482\" (UniqueName: \"kubernetes.io/projected/354ca34c-c524-46db-91e6-7e99e8e1eebe-kube-api-access-22482\") pod \"machine-config-server-mfdcf\" (UID: \"354ca34c-c524-46db-91e6-7e99e8e1eebe\") " pod="openshift-machine-config-operator/machine-config-server-mfdcf" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.634790 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7c3552dc-a0cf-4072-91e1-030803f6014d-console-oauth-config\") pod \"console-f9d7485db-wprnh\" (UID: \"7c3552dc-a0cf-4072-91e1-030803f6014d\") " pod="openshift-console/console-f9d7485db-wprnh" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.634817 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2508573-4890-48b6-9119-93560ee4c5d9-installation-pull-secrets\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.634855 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/77eda81d-119c-4876-a951-e7262ed136b9-etcd-serving-ca\") pod \"apiserver-76f77b778f-vwt7h\" (UID: \"77eda81d-119c-4876-a951-e7262ed136b9\") " pod="openshift-apiserver/apiserver-76f77b778f-vwt7h" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.634870 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77a8f6e2-2a14-4876-b61c-4f9eb7581f90-serving-cert\") pod \"service-ca-operator-777779d784-2dhb9\" (UID: \"77a8f6e2-2a14-4876-b61c-4f9eb7581f90\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2dhb9" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 
18:13:05.634896 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce206052-32b9-4632-a7d7-4bd6630a6fd5-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5ff9w\" (UID: \"ce206052-32b9-4632-a7d7-4bd6630a6fd5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5ff9w" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.634917 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjrrc\" (UniqueName: \"kubernetes.io/projected/4e73ade2-f6bf-4f9b-98da-1b6a2c9a7d60-kube-api-access-bjrrc\") pod \"cluster-samples-operator-665b6dd947-w8527\" (UID: \"4e73ade2-f6bf-4f9b-98da-1b6a2c9a7d60\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w8527" Oct 08 18:13:05 crc kubenswrapper[4750]: E1008 18:13:05.635219 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:13:06.13520568 +0000 UTC m=+142.048176693 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.638467 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2508573-4890-48b6-9119-93560ee4c5d9-registry-certificates\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.640859 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4h2qz" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.668635 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db0361d2-e90e-43d1-8ea7-29c7167d4d05-config\") pod \"authentication-operator-69f744f599-qt2tv\" (UID: \"db0361d2-e90e-43d1-8ea7-29c7167d4d05\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qt2tv" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.668906 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2508573-4890-48b6-9119-93560ee4c5d9-ca-trust-extracted\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.669456 4750 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5408b201-966b-4e3e-b681-faaa2f32d633-config\") pod \"kube-controller-manager-operator-78b949d7b-zt7f6\" (UID: \"5408b201-966b-4e3e-b681-faaa2f32d633\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zt7f6" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.671186 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95f54243-6144-4ea9-acc6-b3f3c580e037-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-gkppd\" (UID: \"95f54243-6144-4ea9-acc6-b3f3c580e037\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gkppd" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.677007 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90f5c875-3507-453d-9901-fd60a1476f71-serving-cert\") pod \"controller-manager-879f6c89f-4zxlq\" (UID: \"90f5c875-3507-453d-9901-fd60a1476f71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4zxlq" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.677056 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/77eda81d-119c-4876-a951-e7262ed136b9-audit-dir\") pod \"apiserver-76f77b778f-vwt7h\" (UID: \"77eda81d-119c-4876-a951-e7262ed136b9\") " pod="openshift-apiserver/apiserver-76f77b778f-vwt7h" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.677821 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f878d47-ed9f-4f96-a7c0-153c58ff33cf-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-5g5nc\" (UID: \"8f878d47-ed9f-4f96-a7c0-153c58ff33cf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5g5nc" Oct 08 
18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.677904 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8f878d47-ed9f-4f96-a7c0-153c58ff33cf-audit-dir\") pod \"apiserver-7bbb656c7d-5g5nc\" (UID: \"8f878d47-ed9f-4f96-a7c0-153c58ff33cf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5g5nc" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.678610 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/90f5c875-3507-453d-9901-fd60a1476f71-client-ca\") pod \"controller-manager-879f6c89f-4zxlq\" (UID: \"90f5c875-3507-453d-9901-fd60a1476f71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4zxlq" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.679530 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7c3552dc-a0cf-4072-91e1-030803f6014d-console-config\") pod \"console-f9d7485db-wprnh\" (UID: \"7c3552dc-a0cf-4072-91e1-030803f6014d\") " pod="openshift-console/console-f9d7485db-wprnh" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.681217 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8f878d47-ed9f-4f96-a7c0-153c58ff33cf-encryption-config\") pod \"apiserver-7bbb656c7d-5g5nc\" (UID: \"8f878d47-ed9f-4f96-a7c0-153c58ff33cf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5g5nc" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.683127 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/77eda81d-119c-4876-a951-e7262ed136b9-encryption-config\") pod \"apiserver-76f77b778f-vwt7h\" (UID: \"77eda81d-119c-4876-a951-e7262ed136b9\") " pod="openshift-apiserver/apiserver-76f77b778f-vwt7h" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 
18:13:05.684514 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8f878d47-ed9f-4f96-a7c0-153c58ff33cf-etcd-client\") pod \"apiserver-7bbb656c7d-5g5nc\" (UID: \"8f878d47-ed9f-4f96-a7c0-153c58ff33cf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5g5nc" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.684540 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2508573-4890-48b6-9119-93560ee4c5d9-trusted-ca\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.685416 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/90f5c875-3507-453d-9901-fd60a1476f71-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-4zxlq\" (UID: \"90f5c875-3507-453d-9901-fd60a1476f71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4zxlq" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.685790 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f878d47-ed9f-4f96-a7c0-153c58ff33cf-serving-cert\") pod \"apiserver-7bbb656c7d-5g5nc\" (UID: \"8f878d47-ed9f-4f96-a7c0-153c58ff33cf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5g5nc" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.685846 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/77eda81d-119c-4876-a951-e7262ed136b9-node-pullsecrets\") pod \"apiserver-76f77b778f-vwt7h\" (UID: \"77eda81d-119c-4876-a951-e7262ed136b9\") " pod="openshift-apiserver/apiserver-76f77b778f-vwt7h" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.686147 4750 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7c3552dc-a0cf-4072-91e1-030803f6014d-oauth-serving-cert\") pod \"console-f9d7485db-wprnh\" (UID: \"7c3552dc-a0cf-4072-91e1-030803f6014d\") " pod="openshift-console/console-f9d7485db-wprnh" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.686400 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f16bfec-96c4-4e7b-83e1-aed47eecbefb-config\") pod \"openshift-apiserver-operator-796bbdcf4f-s7qzd\" (UID: \"4f16bfec-96c4-4e7b-83e1-aed47eecbefb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s7qzd" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.686442 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/77eda81d-119c-4876-a951-e7262ed136b9-audit\") pod \"apiserver-76f77b778f-vwt7h\" (UID: \"77eda81d-119c-4876-a951-e7262ed136b9\") " pod="openshift-apiserver/apiserver-76f77b778f-vwt7h" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.686839 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7c3552dc-a0cf-4072-91e1-030803f6014d-console-oauth-config\") pod \"console-f9d7485db-wprnh\" (UID: \"7c3552dc-a0cf-4072-91e1-030803f6014d\") " pod="openshift-console/console-f9d7485db-wprnh" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.687065 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/77eda81d-119c-4876-a951-e7262ed136b9-image-import-ca\") pod \"apiserver-76f77b778f-vwt7h\" (UID: \"77eda81d-119c-4876-a951-e7262ed136b9\") " pod="openshift-apiserver/apiserver-76f77b778f-vwt7h" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.687901 4750 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7c3552dc-a0cf-4072-91e1-030803f6014d-service-ca\") pod \"console-f9d7485db-wprnh\" (UID: \"7c3552dc-a0cf-4072-91e1-030803f6014d\") " pod="openshift-console/console-f9d7485db-wprnh" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.688829 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/346b036e-ea6a-4cc3-ab4a-7dbbdf59405c-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4lzzt\" (UID: \"346b036e-ea6a-4cc3-ab4a-7dbbdf59405c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4lzzt" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.688966 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/77eda81d-119c-4876-a951-e7262ed136b9-etcd-serving-ca\") pod \"apiserver-76f77b778f-vwt7h\" (UID: \"77eda81d-119c-4876-a951-e7262ed136b9\") " pod="openshift-apiserver/apiserver-76f77b778f-vwt7h" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.689772 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db0361d2-e90e-43d1-8ea7-29c7167d4d05-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-qt2tv\" (UID: \"db0361d2-e90e-43d1-8ea7-29c7167d4d05\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qt2tv" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.689957 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f16bfec-96c4-4e7b-83e1-aed47eecbefb-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-s7qzd\" (UID: \"4f16bfec-96c4-4e7b-83e1-aed47eecbefb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s7qzd" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.690466 
4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db0361d2-e90e-43d1-8ea7-29c7167d4d05-serving-cert\") pod \"authentication-operator-69f744f599-qt2tv\" (UID: \"db0361d2-e90e-43d1-8ea7-29c7167d4d05\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qt2tv" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.690783 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2508573-4890-48b6-9119-93560ee4c5d9-registry-tls\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.690864 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db0361d2-e90e-43d1-8ea7-29c7167d4d05-service-ca-bundle\") pod \"authentication-operator-69f744f599-qt2tv\" (UID: \"db0361d2-e90e-43d1-8ea7-29c7167d4d05\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qt2tv" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.691630 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77eda81d-119c-4876-a951-e7262ed136b9-trusted-ca-bundle\") pod \"apiserver-76f77b778f-vwt7h\" (UID: \"77eda81d-119c-4876-a951-e7262ed136b9\") " pod="openshift-apiserver/apiserver-76f77b778f-vwt7h" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.692050 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c3552dc-a0cf-4072-91e1-030803f6014d-trusted-ca-bundle\") pod \"console-f9d7485db-wprnh\" (UID: \"7c3552dc-a0cf-4072-91e1-030803f6014d\") " pod="openshift-console/console-f9d7485db-wprnh" Oct 08 18:13:05 crc 
kubenswrapper[4750]: I1008 18:13:05.692969 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2254728a-2ece-43b6-8b5f-a16970277d33-config\") pod \"kube-apiserver-operator-766d6c64bb-44fvq\" (UID: \"2254728a-2ece-43b6-8b5f-a16970277d33\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-44fvq" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.693088 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77eda81d-119c-4876-a951-e7262ed136b9-config\") pod \"apiserver-76f77b778f-vwt7h\" (UID: \"77eda81d-119c-4876-a951-e7262ed136b9\") " pod="openshift-apiserver/apiserver-76f77b778f-vwt7h" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.693344 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f53f5c28-7650-4388-981f-4ae86cbf8068-config-volume\") pod \"dns-default-n57nw\" (UID: \"f53f5c28-7650-4388-981f-4ae86cbf8068\") " pod="openshift-dns/dns-default-n57nw" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.693625 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8f878d47-ed9f-4f96-a7c0-153c58ff33cf-audit-policies\") pod \"apiserver-7bbb656c7d-5g5nc\" (UID: \"8f878d47-ed9f-4f96-a7c0-153c58ff33cf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5g5nc" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.694087 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c3552dc-a0cf-4072-91e1-030803f6014d-console-serving-cert\") pod \"console-f9d7485db-wprnh\" (UID: \"7c3552dc-a0cf-4072-91e1-030803f6014d\") " pod="openshift-console/console-f9d7485db-wprnh" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.694394 4750 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90f5c875-3507-453d-9901-fd60a1476f71-config\") pod \"controller-manager-879f6c89f-4zxlq\" (UID: \"90f5c875-3507-453d-9901-fd60a1476f71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4zxlq" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.694574 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8f878d47-ed9f-4f96-a7c0-153c58ff33cf-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-5g5nc\" (UID: \"8f878d47-ed9f-4f96-a7c0-153c58ff33cf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5g5nc" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.695941 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4e73ade2-f6bf-4f9b-98da-1b6a2c9a7d60-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-w8527\" (UID: \"4e73ade2-f6bf-4f9b-98da-1b6a2c9a7d60\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w8527" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.699977 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2254728a-2ece-43b6-8b5f-a16970277d33-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-44fvq\" (UID: \"2254728a-2ece-43b6-8b5f-a16970277d33\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-44fvq" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.700960 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f53f5c28-7650-4388-981f-4ae86cbf8068-metrics-tls\") pod \"dns-default-n57nw\" (UID: \"f53f5c28-7650-4388-981f-4ae86cbf8068\") " pod="openshift-dns/dns-default-n57nw" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 
18:13:05.701073 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77eda81d-119c-4876-a951-e7262ed136b9-serving-cert\") pod \"apiserver-76f77b778f-vwt7h\" (UID: \"77eda81d-119c-4876-a951-e7262ed136b9\") " pod="openshift-apiserver/apiserver-76f77b778f-vwt7h" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.710668 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znbhg\" (UniqueName: \"kubernetes.io/projected/f53f5c28-7650-4388-981f-4ae86cbf8068-kube-api-access-znbhg\") pod \"dns-default-n57nw\" (UID: \"f53f5c28-7650-4388-981f-4ae86cbf8068\") " pod="openshift-dns/dns-default-n57nw" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.724651 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/77eda81d-119c-4876-a951-e7262ed136b9-etcd-client\") pod \"apiserver-76f77b778f-vwt7h\" (UID: \"77eda81d-119c-4876-a951-e7262ed136b9\") " pod="openshift-apiserver/apiserver-76f77b778f-vwt7h" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.724814 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95f54243-6144-4ea9-acc6-b3f3c580e037-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-gkppd\" (UID: \"95f54243-6144-4ea9-acc6-b3f3c580e037\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gkppd" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.701057 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/346b036e-ea6a-4cc3-ab4a-7dbbdf59405c-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4lzzt\" (UID: \"346b036e-ea6a-4cc3-ab4a-7dbbdf59405c\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4lzzt" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.725254 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5408b201-966b-4e3e-b681-faaa2f32d633-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-zt7f6\" (UID: \"5408b201-966b-4e3e-b681-faaa2f32d633\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zt7f6" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.726060 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2508573-4890-48b6-9119-93560ee4c5d9-installation-pull-secrets\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.737277 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2b48fe6f-87c9-43fc-aa46-4c2fa7e62fe8-socket-dir\") pod \"csi-hostpathplugin-n87fc\" (UID: \"2b48fe6f-87c9-43fc-aa46-4c2fa7e62fe8\") " pod="hostpath-provisioner/csi-hostpathplugin-n87fc" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.737337 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22482\" (UniqueName: \"kubernetes.io/projected/354ca34c-c524-46db-91e6-7e99e8e1eebe-kube-api-access-22482\") pod \"machine-config-server-mfdcf\" (UID: \"354ca34c-c524-46db-91e6-7e99e8e1eebe\") " pod="openshift-machine-config-operator/machine-config-server-mfdcf" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.737371 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/77a8f6e2-2a14-4876-b61c-4f9eb7581f90-serving-cert\") pod \"service-ca-operator-777779d784-2dhb9\" (UID: \"77a8f6e2-2a14-4876-b61c-4f9eb7581f90\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2dhb9" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.737397 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce206052-32b9-4632-a7d7-4bd6630a6fd5-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5ff9w\" (UID: \"ce206052-32b9-4632-a7d7-4bd6630a6fd5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5ff9w" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.737436 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79a05c13-3cd6-48ce-ba2f-6cfdf1a51093-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dkkbr\" (UID: \"79a05c13-3cd6-48ce-ba2f-6cfdf1a51093\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dkkbr" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.737473 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c5bf2447-4038-4736-999a-f5b87905a99a-images\") pod \"machine-config-operator-74547568cd-hsr6t\" (UID: \"c5bf2447-4038-4736-999a-f5b87905a99a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hsr6t" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.737498 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77a8f6e2-2a14-4876-b61c-4f9eb7581f90-config\") pod \"service-ca-operator-777779d784-2dhb9\" (UID: \"77a8f6e2-2a14-4876-b61c-4f9eb7581f90\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-2dhb9" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.737599 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/08bc2767-2a02-4bc1-b3cd-47670db11792-metrics-certs\") pod \"router-default-5444994796-tdftm\" (UID: \"08bc2767-2a02-4bc1-b3cd-47670db11792\") " pod="openshift-ingress/router-default-5444994796-tdftm" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.737639 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9fa6043a-28e1-436e-9d43-f4f37c431457-srv-cert\") pod \"catalog-operator-68c6474976-wrzwf\" (UID: \"9fa6043a-28e1-436e-9d43-f4f37c431457\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wrzwf" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.737667 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kts5r\" (UniqueName: \"kubernetes.io/projected/988fa55b-c750-4c47-aaf6-fb602e6477d7-kube-api-access-kts5r\") pod \"multus-admission-controller-857f4d67dd-xj8d6\" (UID: \"988fa55b-c750-4c47-aaf6-fb602e6477d7\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-xj8d6" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.737709 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wj2h\" (UniqueName: \"kubernetes.io/projected/77a8f6e2-2a14-4876-b61c-4f9eb7581f90-kube-api-access-8wj2h\") pod \"service-ca-operator-777779d784-2dhb9\" (UID: \"77a8f6e2-2a14-4876-b61c-4f9eb7581f90\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2dhb9" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.737738 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/55eee756-293d-4ca8-91d0-6f95f22e72dc-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-fsv6k\" (UID: \"55eee756-293d-4ca8-91d0-6f95f22e72dc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fsv6k" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.737765 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e522b21c-5710-4f2e-bff5-b76226c88d2f-tmpfs\") pod \"packageserver-d55dfcdfc-glbv9\" (UID: \"e522b21c-5710-4f2e-bff5-b76226c88d2f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-glbv9" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.737801 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wprks\" (UniqueName: \"kubernetes.io/projected/59a547a6-ce7d-41ea-a985-0e477deec34a-kube-api-access-wprks\") pod \"ingress-canary-wtkzp\" (UID: \"59a547a6-ce7d-41ea-a985-0e477deec34a\") " pod="openshift-ingress-canary/ingress-canary-wtkzp" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.737834 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/19b57874-b782-45bf-a257-3a162023a242-signing-key\") pod \"service-ca-9c57cc56f-kjc7k\" (UID: \"19b57874-b782-45bf-a257-3a162023a242\") " pod="openshift-service-ca/service-ca-9c57cc56f-kjc7k" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.737859 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwhkq\" (UniqueName: \"kubernetes.io/projected/884c29cd-b721-400c-b319-510f191f02dd-kube-api-access-xwhkq\") pod \"marketplace-operator-79b997595-hg6dd\" (UID: \"884c29cd-b721-400c-b319-510f191f02dd\") " pod="openshift-marketplace/marketplace-operator-79b997595-hg6dd" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.737883 4750 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/08bc2767-2a02-4bc1-b3cd-47670db11792-stats-auth\") pod \"router-default-5444994796-tdftm\" (UID: \"08bc2767-2a02-4bc1-b3cd-47670db11792\") " pod="openshift-ingress/router-default-5444994796-tdftm" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.737908 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9fa6043a-28e1-436e-9d43-f4f37c431457-profile-collector-cert\") pod \"catalog-operator-68c6474976-wrzwf\" (UID: \"9fa6043a-28e1-436e-9d43-f4f37c431457\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wrzwf" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.737935 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59a547a6-ce7d-41ea-a985-0e477deec34a-cert\") pod \"ingress-canary-wtkzp\" (UID: \"59a547a6-ce7d-41ea-a985-0e477deec34a\") " pod="openshift-ingress-canary/ingress-canary-wtkzp" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.737976 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b594b560-3db7-4be0-8799-f6eeac7d3aeb-profile-collector-cert\") pod \"olm-operator-6b444d44fb-l6dqx\" (UID: \"b594b560-3db7-4be0-8799-f6eeac7d3aeb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-l6dqx" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.738006 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1c3d5788-4162-4ca6-b3d4-80d5d3062ea3-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-xz2wz\" (UID: \"1c3d5788-4162-4ca6-b3d4-80d5d3062ea3\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xz2wz" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.738034 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/08bc2767-2a02-4bc1-b3cd-47670db11792-default-certificate\") pod \"router-default-5444994796-tdftm\" (UID: \"08bc2767-2a02-4bc1-b3cd-47670db11792\") " pod="openshift-ingress/router-default-5444994796-tdftm" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.738066 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b594b560-3db7-4be0-8799-f6eeac7d3aeb-srv-cert\") pod \"olm-operator-6b444d44fb-l6dqx\" (UID: \"b594b560-3db7-4be0-8799-f6eeac7d3aeb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-l6dqx" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.738097 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/2b48fe6f-87c9-43fc-aa46-4c2fa7e62fe8-mountpoint-dir\") pod \"csi-hostpathplugin-n87fc\" (UID: \"2b48fe6f-87c9-43fc-aa46-4c2fa7e62fe8\") " pod="hostpath-provisioner/csi-hostpathplugin-n87fc" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.738160 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/2b48fe6f-87c9-43fc-aa46-4c2fa7e62fe8-csi-data-dir\") pod \"csi-hostpathplugin-n87fc\" (UID: \"2b48fe6f-87c9-43fc-aa46-4c2fa7e62fe8\") " pod="hostpath-provisioner/csi-hostpathplugin-n87fc" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.738192 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-829vf\" (UniqueName: \"kubernetes.io/projected/4227b4db-94b2-4e5f-a231-aa0b6bbe685c-kube-api-access-829vf\") pod \"collect-profiles-29332440-n7qpc\" (UID: 
\"4227b4db-94b2-4e5f-a231-aa0b6bbe685c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332440-n7qpc" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.738236 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/79a05c13-3cd6-48ce-ba2f-6cfdf1a51093-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dkkbr\" (UID: \"79a05c13-3cd6-48ce-ba2f-6cfdf1a51093\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dkkbr" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.738269 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1c3d5788-4162-4ca6-b3d4-80d5d3062ea3-proxy-tls\") pod \"machine-config-controller-84d6567774-xz2wz\" (UID: \"1c3d5788-4162-4ca6-b3d4-80d5d3062ea3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xz2wz" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.738300 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z27j6\" (UniqueName: \"kubernetes.io/projected/ce206052-32b9-4632-a7d7-4bd6630a6fd5-kube-api-access-z27j6\") pod \"package-server-manager-789f6589d5-5ff9w\" (UID: \"ce206052-32b9-4632-a7d7-4bd6630a6fd5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5ff9w" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.738325 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/19b57874-b782-45bf-a257-3a162023a242-signing-cabundle\") pod \"service-ca-9c57cc56f-kjc7k\" (UID: \"19b57874-b782-45bf-a257-3a162023a242\") " pod="openshift-service-ca/service-ca-9c57cc56f-kjc7k" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.738372 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4227b4db-94b2-4e5f-a231-aa0b6bbe685c-secret-volume\") pod \"collect-profiles-29332440-n7qpc\" (UID: \"4227b4db-94b2-4e5f-a231-aa0b6bbe685c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332440-n7qpc" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.738404 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e522b21c-5710-4f2e-bff5-b76226c88d2f-webhook-cert\") pod \"packageserver-d55dfcdfc-glbv9\" (UID: \"e522b21c-5710-4f2e-bff5-b76226c88d2f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-glbv9" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.738432 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2b48fe6f-87c9-43fc-aa46-4c2fa7e62fe8-socket-dir\") pod \"csi-hostpathplugin-n87fc\" (UID: \"2b48fe6f-87c9-43fc-aa46-4c2fa7e62fe8\") " pod="hostpath-provisioner/csi-hostpathplugin-n87fc" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.738440 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08bc2767-2a02-4bc1-b3cd-47670db11792-service-ca-bundle\") pod \"router-default-5444994796-tdftm\" (UID: \"08bc2767-2a02-4bc1-b3cd-47670db11792\") " pod="openshift-ingress/router-default-5444994796-tdftm" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.738506 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/884c29cd-b721-400c-b319-510f191f02dd-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hg6dd\" (UID: \"884c29cd-b721-400c-b319-510f191f02dd\") " pod="openshift-marketplace/marketplace-operator-79b997595-hg6dd" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.738529 4750 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2b48fe6f-87c9-43fc-aa46-4c2fa7e62fe8-registration-dir\") pod \"csi-hostpathplugin-n87fc\" (UID: \"2b48fe6f-87c9-43fc-aa46-4c2fa7e62fe8\") " pod="hostpath-provisioner/csi-hostpathplugin-n87fc" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.738629 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.738657 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c5bf2447-4038-4736-999a-f5b87905a99a-proxy-tls\") pod \"machine-config-operator-74547568cd-hsr6t\" (UID: \"c5bf2447-4038-4736-999a-f5b87905a99a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hsr6t" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.738679 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4tw2\" (UniqueName: \"kubernetes.io/projected/55eee756-293d-4ca8-91d0-6f95f22e72dc-kube-api-access-s4tw2\") pod \"control-plane-machine-set-operator-78cbb6b69f-fsv6k\" (UID: \"55eee756-293d-4ca8-91d0-6f95f22e72dc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fsv6k" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.739301 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08bc2767-2a02-4bc1-b3cd-47670db11792-service-ca-bundle\") pod \"router-default-5444994796-tdftm\" (UID: 
\"08bc2767-2a02-4bc1-b3cd-47670db11792\") " pod="openshift-ingress/router-default-5444994796-tdftm" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.738697 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/988fa55b-c750-4c47-aaf6-fb602e6477d7-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-xj8d6\" (UID: \"988fa55b-c750-4c47-aaf6-fb602e6477d7\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-xj8d6" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.739786 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/884c29cd-b721-400c-b319-510f191f02dd-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hg6dd\" (UID: \"884c29cd-b721-400c-b319-510f191f02dd\") " pod="openshift-marketplace/marketplace-operator-79b997595-hg6dd" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.739811 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/354ca34c-c524-46db-91e6-7e99e8e1eebe-node-bootstrap-token\") pod \"machine-config-server-mfdcf\" (UID: \"354ca34c-c524-46db-91e6-7e99e8e1eebe\") " pod="openshift-machine-config-operator/machine-config-server-mfdcf" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.739828 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/2b48fe6f-87c9-43fc-aa46-4c2fa7e62fe8-plugins-dir\") pod \"csi-hostpathplugin-n87fc\" (UID: \"2b48fe6f-87c9-43fc-aa46-4c2fa7e62fe8\") " pod="hostpath-provisioner/csi-hostpathplugin-n87fc" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.739852 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/c5bf2447-4038-4736-999a-f5b87905a99a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-hsr6t\" (UID: \"c5bf2447-4038-4736-999a-f5b87905a99a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hsr6t" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.739868 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77mrd\" (UniqueName: \"kubernetes.io/projected/1c3d5788-4162-4ca6-b3d4-80d5d3062ea3-kube-api-access-77mrd\") pod \"machine-config-controller-84d6567774-xz2wz\" (UID: \"1c3d5788-4162-4ca6-b3d4-80d5d3062ea3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xz2wz" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.739882 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/354ca34c-c524-46db-91e6-7e99e8e1eebe-certs\") pod \"machine-config-server-mfdcf\" (UID: \"354ca34c-c524-46db-91e6-7e99e8e1eebe\") " pod="openshift-machine-config-operator/machine-config-server-mfdcf" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.739909 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76gxm\" (UniqueName: \"kubernetes.io/projected/9fa6043a-28e1-436e-9d43-f4f37c431457-kube-api-access-76gxm\") pod \"catalog-operator-68c6474976-wrzwf\" (UID: \"9fa6043a-28e1-436e-9d43-f4f37c431457\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wrzwf" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.739924 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56rj9\" (UniqueName: \"kubernetes.io/projected/2b48fe6f-87c9-43fc-aa46-4c2fa7e62fe8-kube-api-access-56rj9\") pod \"csi-hostpathplugin-n87fc\" (UID: \"2b48fe6f-87c9-43fc-aa46-4c2fa7e62fe8\") " pod="hostpath-provisioner/csi-hostpathplugin-n87fc" Oct 08 18:13:05 crc kubenswrapper[4750]: 
I1008 18:13:05.739948 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e522b21c-5710-4f2e-bff5-b76226c88d2f-apiservice-cert\") pod \"packageserver-d55dfcdfc-glbv9\" (UID: \"e522b21c-5710-4f2e-bff5-b76226c88d2f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-glbv9" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.739967 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5p5bv\" (UniqueName: \"kubernetes.io/projected/e522b21c-5710-4f2e-bff5-b76226c88d2f-kube-api-access-5p5bv\") pod \"packageserver-d55dfcdfc-glbv9\" (UID: \"e522b21c-5710-4f2e-bff5-b76226c88d2f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-glbv9" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.739995 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79a05c13-3cd6-48ce-ba2f-6cfdf1a51093-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dkkbr\" (UID: \"79a05c13-3cd6-48ce-ba2f-6cfdf1a51093\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dkkbr" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.740027 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6rrv\" (UniqueName: \"kubernetes.io/projected/08bc2767-2a02-4bc1-b3cd-47670db11792-kube-api-access-v6rrv\") pod \"router-default-5444994796-tdftm\" (UID: \"08bc2767-2a02-4bc1-b3cd-47670db11792\") " pod="openshift-ingress/router-default-5444994796-tdftm" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.738207 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-62m4q"] Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.740044 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-nq6cd\" (UniqueName: \"kubernetes.io/projected/b594b560-3db7-4be0-8799-f6eeac7d3aeb-kube-api-access-nq6cd\") pod \"olm-operator-6b444d44fb-l6dqx\" (UID: \"b594b560-3db7-4be0-8799-f6eeac7d3aeb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-l6dqx" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.742061 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4227b4db-94b2-4e5f-a231-aa0b6bbe685c-config-volume\") pod \"collect-profiles-29332440-n7qpc\" (UID: \"4227b4db-94b2-4e5f-a231-aa0b6bbe685c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332440-n7qpc" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.742098 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrlmq\" (UniqueName: \"kubernetes.io/projected/c5bf2447-4038-4736-999a-f5b87905a99a-kube-api-access-vrlmq\") pod \"machine-config-operator-74547568cd-hsr6t\" (UID: \"c5bf2447-4038-4736-999a-f5b87905a99a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hsr6t" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.742162 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77mmb\" (UniqueName: \"kubernetes.io/projected/19b57874-b782-45bf-a257-3a162023a242-kube-api-access-77mmb\") pod \"service-ca-9c57cc56f-kjc7k\" (UID: \"19b57874-b782-45bf-a257-3a162023a242\") " pod="openshift-service-ca/service-ca-9c57cc56f-kjc7k" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.742699 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-69h7b"] Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.743269 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/4227b4db-94b2-4e5f-a231-aa0b6bbe685c-config-volume\") pod \"collect-profiles-29332440-n7qpc\" (UID: \"4227b4db-94b2-4e5f-a231-aa0b6bbe685c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332440-n7qpc" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.743440 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c5bf2447-4038-4736-999a-f5b87905a99a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-hsr6t\" (UID: \"c5bf2447-4038-4736-999a-f5b87905a99a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hsr6t" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.743461 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/2b48fe6f-87c9-43fc-aa46-4c2fa7e62fe8-csi-data-dir\") pod \"csi-hostpathplugin-n87fc\" (UID: \"2b48fe6f-87c9-43fc-aa46-4c2fa7e62fe8\") " pod="hostpath-provisioner/csi-hostpathplugin-n87fc" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.743568 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2b48fe6f-87c9-43fc-aa46-4c2fa7e62fe8-registration-dir\") pod \"csi-hostpathplugin-n87fc\" (UID: \"2b48fe6f-87c9-43fc-aa46-4c2fa7e62fe8\") " pod="hostpath-provisioner/csi-hostpathplugin-n87fc" Oct 08 18:13:05 crc kubenswrapper[4750]: E1008 18:13:05.743844 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 18:13:06.243828792 +0000 UTC m=+142.156799805 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2cxc5" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.745020 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5408b201-966b-4e3e-b681-faaa2f32d633-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-zt7f6\" (UID: \"5408b201-966b-4e3e-b681-faaa2f32d633\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zt7f6" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.749763 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/2b48fe6f-87c9-43fc-aa46-4c2fa7e62fe8-plugins-dir\") pod \"csi-hostpathplugin-n87fc\" (UID: \"2b48fe6f-87c9-43fc-aa46-4c2fa7e62fe8\") " pod="hostpath-provisioner/csi-hostpathplugin-n87fc" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.751200 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/884c29cd-b721-400c-b319-510f191f02dd-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hg6dd\" (UID: \"884c29cd-b721-400c-b319-510f191f02dd\") " pod="openshift-marketplace/marketplace-operator-79b997595-hg6dd" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.752135 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v6gc8"] Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.752246 
4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c5bf2447-4038-4736-999a-f5b87905a99a-images\") pod \"machine-config-operator-74547568cd-hsr6t\" (UID: \"c5bf2447-4038-4736-999a-f5b87905a99a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hsr6t" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.752539 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e522b21c-5710-4f2e-bff5-b76226c88d2f-tmpfs\") pod \"packageserver-d55dfcdfc-glbv9\" (UID: \"e522b21c-5710-4f2e-bff5-b76226c88d2f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-glbv9" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.752814 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c5bf2447-4038-4736-999a-f5b87905a99a-proxy-tls\") pod \"machine-config-operator-74547568cd-hsr6t\" (UID: \"c5bf2447-4038-4736-999a-f5b87905a99a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hsr6t" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.753331 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77a8f6e2-2a14-4876-b61c-4f9eb7581f90-config\") pod \"service-ca-operator-777779d784-2dhb9\" (UID: \"77a8f6e2-2a14-4876-b61c-4f9eb7581f90\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2dhb9" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.756080 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/354ca34c-c524-46db-91e6-7e99e8e1eebe-node-bootstrap-token\") pod \"machine-config-server-mfdcf\" (UID: \"354ca34c-c524-46db-91e6-7e99e8e1eebe\") " pod="openshift-machine-config-operator/machine-config-server-mfdcf" Oct 08 18:13:05 crc 
kubenswrapper[4750]: I1008 18:13:05.756155 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59a547a6-ce7d-41ea-a985-0e477deec34a-cert\") pod \"ingress-canary-wtkzp\" (UID: \"59a547a6-ce7d-41ea-a985-0e477deec34a\") " pod="openshift-ingress-canary/ingress-canary-wtkzp" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.756986 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79a05c13-3cd6-48ce-ba2f-6cfdf1a51093-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dkkbr\" (UID: \"79a05c13-3cd6-48ce-ba2f-6cfdf1a51093\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dkkbr" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.758885 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/2b48fe6f-87c9-43fc-aa46-4c2fa7e62fe8-mountpoint-dir\") pod \"csi-hostpathplugin-n87fc\" (UID: \"2b48fe6f-87c9-43fc-aa46-4c2fa7e62fe8\") " pod="hostpath-provisioner/csi-hostpathplugin-n87fc" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.759165 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/988fa55b-c750-4c47-aaf6-fb602e6477d7-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-xj8d6\" (UID: \"988fa55b-c750-4c47-aaf6-fb602e6477d7\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-xj8d6" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.759653 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/19b57874-b782-45bf-a257-3a162023a242-signing-cabundle\") pod \"service-ca-9c57cc56f-kjc7k\" (UID: \"19b57874-b782-45bf-a257-3a162023a242\") " pod="openshift-service-ca/service-ca-9c57cc56f-kjc7k" Oct 08 18:13:05 crc 
kubenswrapper[4750]: I1008 18:13:05.759691 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce206052-32b9-4632-a7d7-4bd6630a6fd5-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5ff9w\" (UID: \"ce206052-32b9-4632-a7d7-4bd6630a6fd5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5ff9w" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.760059 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/08bc2767-2a02-4bc1-b3cd-47670db11792-metrics-certs\") pod \"router-default-5444994796-tdftm\" (UID: \"08bc2767-2a02-4bc1-b3cd-47670db11792\") " pod="openshift-ingress/router-default-5444994796-tdftm" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.761804 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1c3d5788-4162-4ca6-b3d4-80d5d3062ea3-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-xz2wz\" (UID: \"1c3d5788-4162-4ca6-b3d4-80d5d3062ea3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xz2wz" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.762856 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqgpr\" (UniqueName: \"kubernetes.io/projected/95f54243-6144-4ea9-acc6-b3f3c580e037-kube-api-access-dqgpr\") pod \"kube-storage-version-migrator-operator-b67b599dd-gkppd\" (UID: \"95f54243-6144-4ea9-acc6-b3f3c580e037\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gkppd" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.764061 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/08bc2767-2a02-4bc1-b3cd-47670db11792-default-certificate\") pod \"router-default-5444994796-tdftm\" (UID: \"08bc2767-2a02-4bc1-b3cd-47670db11792\") " pod="openshift-ingress/router-default-5444994796-tdftm" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.764387 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsm5j\" (UniqueName: \"kubernetes.io/projected/e2508573-4890-48b6-9119-93560ee4c5d9-kube-api-access-jsm5j\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.764536 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/08bc2767-2a02-4bc1-b3cd-47670db11792-stats-auth\") pod \"router-default-5444994796-tdftm\" (UID: \"08bc2767-2a02-4bc1-b3cd-47670db11792\") " pod="openshift-ingress/router-default-5444994796-tdftm" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.766314 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79a05c13-3cd6-48ce-ba2f-6cfdf1a51093-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dkkbr\" (UID: \"79a05c13-3cd6-48ce-ba2f-6cfdf1a51093\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dkkbr" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.766368 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9fa6043a-28e1-436e-9d43-f4f37c431457-profile-collector-cert\") pod \"catalog-operator-68c6474976-wrzwf\" (UID: \"9fa6043a-28e1-436e-9d43-f4f37c431457\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wrzwf" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.766383 4750 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/19b57874-b782-45bf-a257-3a162023a242-signing-key\") pod \"service-ca-9c57cc56f-kjc7k\" (UID: \"19b57874-b782-45bf-a257-3a162023a242\") " pod="openshift-service-ca/service-ca-9c57cc56f-kjc7k" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.766395 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/55eee756-293d-4ca8-91d0-6f95f22e72dc-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-fsv6k\" (UID: \"55eee756-293d-4ca8-91d0-6f95f22e72dc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fsv6k" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.766660 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b594b560-3db7-4be0-8799-f6eeac7d3aeb-srv-cert\") pod \"olm-operator-6b444d44fb-l6dqx\" (UID: \"b594b560-3db7-4be0-8799-f6eeac7d3aeb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-l6dqx" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.766686 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/884c29cd-b721-400c-b319-510f191f02dd-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hg6dd\" (UID: \"884c29cd-b721-400c-b319-510f191f02dd\") " pod="openshift-marketplace/marketplace-operator-79b997595-hg6dd" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.767715 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77a8f6e2-2a14-4876-b61c-4f9eb7581f90-serving-cert\") pod \"service-ca-operator-777779d784-2dhb9\" (UID: \"77a8f6e2-2a14-4876-b61c-4f9eb7581f90\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-2dhb9" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.768816 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcm5r\" (UniqueName: \"kubernetes.io/projected/7c3552dc-a0cf-4072-91e1-030803f6014d-kube-api-access-tcm5r\") pod \"console-f9d7485db-wprnh\" (UID: \"7c3552dc-a0cf-4072-91e1-030803f6014d\") " pod="openshift-console/console-f9d7485db-wprnh" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.770594 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9fa6043a-28e1-436e-9d43-f4f37c431457-srv-cert\") pod \"catalog-operator-68c6474976-wrzwf\" (UID: \"9fa6043a-28e1-436e-9d43-f4f37c431457\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wrzwf" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.771286 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e522b21c-5710-4f2e-bff5-b76226c88d2f-webhook-cert\") pod \"packageserver-d55dfcdfc-glbv9\" (UID: \"e522b21c-5710-4f2e-bff5-b76226c88d2f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-glbv9" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.772974 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b594b560-3db7-4be0-8799-f6eeac7d3aeb-profile-collector-cert\") pod \"olm-operator-6b444d44fb-l6dqx\" (UID: \"b594b560-3db7-4be0-8799-f6eeac7d3aeb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-l6dqx" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.773536 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/354ca34c-c524-46db-91e6-7e99e8e1eebe-certs\") pod \"machine-config-server-mfdcf\" (UID: 
\"354ca34c-c524-46db-91e6-7e99e8e1eebe\") " pod="openshift-machine-config-operator/machine-config-server-mfdcf" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.774250 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4227b4db-94b2-4e5f-a231-aa0b6bbe685c-secret-volume\") pod \"collect-profiles-29332440-n7qpc\" (UID: \"4227b4db-94b2-4e5f-a231-aa0b6bbe685c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332440-n7qpc" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.775848 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e522b21c-5710-4f2e-bff5-b76226c88d2f-apiservice-cert\") pod \"packageserver-d55dfcdfc-glbv9\" (UID: \"e522b21c-5710-4f2e-bff5-b76226c88d2f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-glbv9" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.777011 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1c3d5788-4162-4ca6-b3d4-80d5d3062ea3-proxy-tls\") pod \"machine-config-controller-84d6567774-xz2wz\" (UID: \"1c3d5788-4162-4ca6-b3d4-80d5d3062ea3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xz2wz" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.797114 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-666l9\" (UniqueName: \"kubernetes.io/projected/db0361d2-e90e-43d1-8ea7-29c7167d4d05-kube-api-access-666l9\") pod \"authentication-operator-69f744f599-qt2tv\" (UID: \"db0361d2-e90e-43d1-8ea7-29c7167d4d05\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qt2tv" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.814243 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqhs4\" (UniqueName: 
\"kubernetes.io/projected/346b036e-ea6a-4cc3-ab4a-7dbbdf59405c-kube-api-access-lqhs4\") pod \"cluster-image-registry-operator-dc59b4c8b-4lzzt\" (UID: \"346b036e-ea6a-4cc3-ab4a-7dbbdf59405c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4lzzt" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.833027 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hp5m\" (UniqueName: \"kubernetes.io/projected/4f16bfec-96c4-4e7b-83e1-aed47eecbefb-kube-api-access-5hp5m\") pod \"openshift-apiserver-operator-796bbdcf4f-s7qzd\" (UID: \"4f16bfec-96c4-4e7b-83e1-aed47eecbefb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s7qzd" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.841744 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-qt2tv" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.842480 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:05 crc kubenswrapper[4750]: E1008 18:13:05.843336 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:13:06.343313117 +0000 UTC m=+142.256284130 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.847183 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:05 crc kubenswrapper[4750]: E1008 18:13:05.847658 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 18:13:06.347641769 +0000 UTC m=+142.260612782 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2cxc5" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.863897 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-wprnh" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.864305 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2508573-4890-48b6-9119-93560ee4c5d9-bound-sa-token\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.872478 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-bvwfn"] Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.876460 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjrrc\" (UniqueName: \"kubernetes.io/projected/4e73ade2-f6bf-4f9b-98da-1b6a2c9a7d60-kube-api-access-bjrrc\") pod \"cluster-samples-operator-665b6dd947-w8527\" (UID: \"4e73ade2-f6bf-4f9b-98da-1b6a2c9a7d60\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w8527" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.895471 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bzw6\" (UniqueName: \"kubernetes.io/projected/90f5c875-3507-453d-9901-fd60a1476f71-kube-api-access-5bzw6\") pod \"controller-manager-879f6c89f-4zxlq\" (UID: \"90f5c875-3507-453d-9901-fd60a1476f71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4zxlq" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.911221 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fplk6\" (UniqueName: \"kubernetes.io/projected/77eda81d-119c-4876-a951-e7262ed136b9-kube-api-access-fplk6\") pod \"apiserver-76f77b778f-vwt7h\" (UID: \"77eda81d-119c-4876-a951-e7262ed136b9\") " pod="openshift-apiserver/apiserver-76f77b778f-vwt7h" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.935181 4750 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/346b036e-ea6a-4cc3-ab4a-7dbbdf59405c-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4lzzt\" (UID: \"346b036e-ea6a-4cc3-ab4a-7dbbdf59405c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4lzzt" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.936595 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-4zxlq" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.946529 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gkppd" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.953758 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2254728a-2ece-43b6-8b5f-a16970277d33-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-44fvq\" (UID: \"2254728a-2ece-43b6-8b5f-a16970277d33\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-44fvq" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.953897 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-n57nw" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.954420 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:05 crc kubenswrapper[4750]: E1008 18:13:05.955260 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:13:06.455191065 +0000 UTC m=+142.368162078 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.955406 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:05 crc kubenswrapper[4750]: E1008 18:13:05.955886 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" 
failed. No retries permitted until 2025-10-08 18:13:06.455877801 +0000 UTC m=+142.368848814 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2cxc5" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.960186 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zt7f6" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.974870 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krxft\" (UniqueName: \"kubernetes.io/projected/8f878d47-ed9f-4f96-a7c0-153c58ff33cf-kube-api-access-krxft\") pod \"apiserver-7bbb656c7d-5g5nc\" (UID: \"8f878d47-ed9f-4f96-a7c0-153c58ff33cf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5g5nc" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.980935 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-vwt7h" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.991139 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5g5nc" Oct 08 18:13:05 crc kubenswrapper[4750]: I1008 18:13:05.996633 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9qts\" (UniqueName: \"kubernetes.io/projected/80400620-75a2-4d87-ad1a-2ef29346babc-kube-api-access-j9qts\") pod \"migrator-59844c95c7-rdtt7\" (UID: \"80400620-75a2-4d87-ad1a-2ef29346babc\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rdtt7" Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.017260 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq6cd\" (UniqueName: \"kubernetes.io/projected/b594b560-3db7-4be0-8799-f6eeac7d3aeb-kube-api-access-nq6cd\") pod \"olm-operator-6b444d44fb-l6dqx\" (UID: \"b594b560-3db7-4be0-8799-f6eeac7d3aeb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-l6dqx" Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.025189 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-fzfgf"] Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.025277 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s7qzd" Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.031180 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4h2qz"] Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.045427 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77mmb\" (UniqueName: \"kubernetes.io/projected/19b57874-b782-45bf-a257-3a162023a242-kube-api-access-77mmb\") pod \"service-ca-9c57cc56f-kjc7k\" (UID: \"19b57874-b782-45bf-a257-3a162023a242\") " pod="openshift-service-ca/service-ca-9c57cc56f-kjc7k" Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.052816 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrlmq\" (UniqueName: \"kubernetes.io/projected/c5bf2447-4038-4736-999a-f5b87905a99a-kube-api-access-vrlmq\") pod \"machine-config-operator-74547568cd-hsr6t\" (UID: \"c5bf2447-4038-4736-999a-f5b87905a99a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hsr6t" Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.056439 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:06 crc kubenswrapper[4750]: E1008 18:13:06.057059 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:13:06.557033535 +0000 UTC m=+142.470004548 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.077768 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4tw2\" (UniqueName: \"kubernetes.io/projected/55eee756-293d-4ca8-91d0-6f95f22e72dc-kube-api-access-s4tw2\") pod \"control-plane-machine-set-operator-78cbb6b69f-fsv6k\" (UID: \"55eee756-293d-4ca8-91d0-6f95f22e72dc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fsv6k" Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.098875 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22482\" (UniqueName: \"kubernetes.io/projected/354ca34c-c524-46db-91e6-7e99e8e1eebe-kube-api-access-22482\") pod \"machine-config-server-mfdcf\" (UID: \"354ca34c-c524-46db-91e6-7e99e8e1eebe\") " pod="openshift-machine-config-operator/machine-config-server-mfdcf" Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.099235 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-mfdcf" Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.106723 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-jw49l"] Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.111777 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77mrd\" (UniqueName: \"kubernetes.io/projected/1c3d5788-4162-4ca6-b3d4-80d5d3062ea3-kube-api-access-77mrd\") pod \"machine-config-controller-84d6567774-xz2wz\" (UID: \"1c3d5788-4162-4ca6-b3d4-80d5d3062ea3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xz2wz" Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.145039 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-59hsn"] Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.149708 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w8527" Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.150068 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wprks\" (UniqueName: \"kubernetes.io/projected/59a547a6-ce7d-41ea-a985-0e477deec34a-kube-api-access-wprks\") pod \"ingress-canary-wtkzp\" (UID: \"59a547a6-ce7d-41ea-a985-0e477deec34a\") " pod="openshift-ingress-canary/ingress-canary-wtkzp" Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.156162 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-hzx59"] Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.159181 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:06 crc kubenswrapper[4750]: E1008 18:13:06.159907 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 18:13:06.659884441 +0000 UTC m=+142.572855454 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2cxc5" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.171136 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56rj9\" (UniqueName: \"kubernetes.io/projected/2b48fe6f-87c9-43fc-aa46-4c2fa7e62fe8-kube-api-access-56rj9\") pod \"csi-hostpathplugin-n87fc\" (UID: \"2b48fe6f-87c9-43fc-aa46-4c2fa7e62fe8\") " pod="hostpath-provisioner/csi-hostpathplugin-n87fc" Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.174079 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76gxm\" (UniqueName: \"kubernetes.io/projected/9fa6043a-28e1-436e-9d43-f4f37c431457-kube-api-access-76gxm\") pod \"catalog-operator-68c6474976-wrzwf\" (UID: \"9fa6043a-28e1-436e-9d43-f4f37c431457\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wrzwf" Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.187903 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4lzzt" Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.198919 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kts5r\" (UniqueName: \"kubernetes.io/projected/988fa55b-c750-4c47-aaf6-fb602e6477d7-kube-api-access-kts5r\") pod \"multus-admission-controller-857f4d67dd-xj8d6\" (UID: \"988fa55b-c750-4c47-aaf6-fb602e6477d7\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-xj8d6" Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.213279 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6rrv\" (UniqueName: \"kubernetes.io/projected/08bc2767-2a02-4bc1-b3cd-47670db11792-kube-api-access-v6rrv\") pod \"router-default-5444994796-tdftm\" (UID: \"08bc2767-2a02-4bc1-b3cd-47670db11792\") " pod="openshift-ingress/router-default-5444994796-tdftm" Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.216304 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rdtt7" Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.229949 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-44fvq" Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.235191 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wj2h\" (UniqueName: \"kubernetes.io/projected/77a8f6e2-2a14-4876-b61c-4f9eb7581f90-kube-api-access-8wj2h\") pod \"service-ca-operator-777779d784-2dhb9\" (UID: \"77a8f6e2-2a14-4876-b61c-4f9eb7581f90\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2dhb9" Oct 08 18:13:06 crc kubenswrapper[4750]: W1008 18:13:06.257293 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d1619a1_f1c5_455f_814e_0cff00e053c0.slice/crio-948c7a0fddcba9758a9ae32d04426d9f99f6857dce9b8755731709e1e58af523 WatchSource:0}: Error finding container 948c7a0fddcba9758a9ae32d04426d9f99f6857dce9b8755731709e1e58af523: Status 404 returned error can't find the container with id 948c7a0fddcba9758a9ae32d04426d9f99f6857dce9b8755731709e1e58af523 Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.265171 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:06 crc kubenswrapper[4750]: E1008 18:13:06.265460 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:13:06.765432109 +0000 UTC m=+142.678403122 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.265732 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:06 crc kubenswrapper[4750]: E1008 18:13:06.266131 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 18:13:06.766121736 +0000 UTC m=+142.679092749 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2cxc5" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.266477 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-l6dqx" Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.267307 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z27j6\" (UniqueName: \"kubernetes.io/projected/ce206052-32b9-4632-a7d7-4bd6630a6fd5-kube-api-access-z27j6\") pod \"package-server-manager-789f6589d5-5ff9w\" (UID: \"ce206052-32b9-4632-a7d7-4bd6630a6fd5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5ff9w" Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.274752 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5ff9w" Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.280365 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-829vf\" (UniqueName: \"kubernetes.io/projected/4227b4db-94b2-4e5f-a231-aa0b6bbe685c-kube-api-access-829vf\") pod \"collect-profiles-29332440-n7qpc\" (UID: \"4227b4db-94b2-4e5f-a231-aa0b6bbe685c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332440-n7qpc" Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.282226 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hsr6t" Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.291630 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fsv6k" Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.297165 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-xj8d6" Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.302571 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/79a05c13-3cd6-48ce-ba2f-6cfdf1a51093-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dkkbr\" (UID: \"79a05c13-3cd6-48ce-ba2f-6cfdf1a51093\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dkkbr" Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.311841 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-2dhb9" Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.316585 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wrzwf" Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.321425 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p5bv\" (UniqueName: \"kubernetes.io/projected/e522b21c-5710-4f2e-bff5-b76226c88d2f-kube-api-access-5p5bv\") pod \"packageserver-d55dfcdfc-glbv9\" (UID: \"e522b21c-5710-4f2e-bff5-b76226c88d2f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-glbv9" Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.321985 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xz2wz" Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.332290 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-kjc7k" Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.335749 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwhkq\" (UniqueName: \"kubernetes.io/projected/884c29cd-b721-400c-b319-510f191f02dd-kube-api-access-xwhkq\") pod \"marketplace-operator-79b997595-hg6dd\" (UID: \"884c29cd-b721-400c-b319-510f191f02dd\") " pod="openshift-marketplace/marketplace-operator-79b997595-hg6dd" Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.340539 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-tdftm" Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.355903 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hg6dd" Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.361575 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dkkbr" Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.367174 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-glbv9" Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.369597 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:06 crc kubenswrapper[4750]: E1008 18:13:06.369725 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:13:06.869699098 +0000 UTC m=+142.782670111 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.370480 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:06 crc kubenswrapper[4750]: E1008 18:13:06.371183 4750 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 18:13:06.871128792 +0000 UTC m=+142.784099805 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2cxc5" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.371479 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332440-n7qpc" Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.390377 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-n87fc" Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.398832 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-59hsn" event={"ID":"8b2784aa-3090-4fd9-a1cb-28b99b181a64","Type":"ContainerStarted","Data":"a482f76f72a8d84b2c5b618f059274a708ec845b416b7c237c77f7346b1ef980"} Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.405500 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wtkzp" Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.411635 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v6gc8" event={"ID":"1a37070c-45cb-4989-9f51-b229609be506","Type":"ContainerStarted","Data":"548dde05ef1fc12cf3111c0a9958172a3379d12e18570111c4bccf948e5b4346"} Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.411676 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v6gc8" event={"ID":"1a37070c-45cb-4989-9f51-b229609be506","Type":"ContainerStarted","Data":"1ea58e82475be500749f792197a7cbdac124c06c721f14606317d95c805c96d3"} Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.416401 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-4r4tz" event={"ID":"a20bcaf3-28dc-41f5-8c3e-25b3dfad8202","Type":"ContainerStarted","Data":"f30fd49d59ddc9e20196d31ee16c4fe268c6fcb86a37b8a951d1113c130ba22e"} Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.416441 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-4r4tz" event={"ID":"a20bcaf3-28dc-41f5-8c3e-25b3dfad8202","Type":"ContainerStarted","Data":"d793ddd71db2d29ea806e4697db1afb6d2ae9716fafc89ab6309a78480e0b9a3"} Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.454624 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-mfdcf" event={"ID":"354ca34c-c524-46db-91e6-7e99e8e1eebe","Type":"ContainerStarted","Data":"1d6be48beda83bb250cc7ed0dbf70c986e446db0204fe4530b3b8ef2eda1962f"} Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.463965 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd-operator/etcd-operator-b45778765-bvwfn" event={"ID":"bbde5089-5fb0-4faf-9788-21ca7048e6f2","Type":"ContainerStarted","Data":"6cda24b567e15afa128f6f2a2958bd9fff4fd63013b85172f78c649f56906ec9"} Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.472802 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:06 crc kubenswrapper[4750]: E1008 18:13:06.473201 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:13:06.973159247 +0000 UTC m=+142.886130260 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.474474 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" event={"ID":"88aafa5f-15c0-43af-80be-1c01d844c9c9","Type":"ContainerStarted","Data":"0d605adf3e0bb6b71ae476ac906730201231e451b172b21791cc5c251f40d9c8"} Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.474539 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" event={"ID":"88aafa5f-15c0-43af-80be-1c01d844c9c9","Type":"ContainerStarted","Data":"47cb1a78c9333f94b16886ddf51a50dfa2c78a55d94319d9a120ca9a2b5206bc"} Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.474912 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.477185 4750 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-62m4q container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" start-of-body= Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.477229 4750 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" podUID="88aafa5f-15c0-43af-80be-1c01d844c9c9" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 
10.217.0.9:6443: connect: connection refused" Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.477580 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hzx59" event={"ID":"1ca846a3-8fac-4636-8a95-5c3b877d3477","Type":"ContainerStarted","Data":"063f16d6984bf4a4316855029675508270d8c5497e233e7296e24f1941949888"} Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.480359 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-fzfgf" event={"ID":"84dd6602-6fa4-4615-a79c-d571fec1a58c","Type":"ContainerStarted","Data":"0dd51750360ecca4f3048f042dde6d5a42237d19ed51b456251b6dac699b0f02"} Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.485355 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-69h7b" event={"ID":"a6530eb5-a257-4381-a176-b0e0972181ac","Type":"ContainerStarted","Data":"22f7ad3ba4a1396ba8fd55914dcd52bcdfda04d24e6ac10c58e2e09902536da4"} Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.485383 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-69h7b" event={"ID":"a6530eb5-a257-4381-a176-b0e0972181ac","Type":"ContainerStarted","Data":"188d58463632b1247c6c47ced7a84c2e795d9ae56a4a2dbd2c26460b1c38cb05"} Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.485734 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-69h7b" Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.488917 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-jw49l" event={"ID":"8d1619a1-f1c5-455f-814e-0cff00e053c0","Type":"ContainerStarted","Data":"948c7a0fddcba9758a9ae32d04426d9f99f6857dce9b8755731709e1e58af523"} Oct 08 18:13:06 crc 
kubenswrapper[4750]: I1008 18:13:06.489248 4750 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-69h7b container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.489289 4750 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-69h7b" podUID="a6530eb5-a257-4381-a176-b0e0972181ac" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.491862 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4h2qz" event={"ID":"d0696be9-7ac8-46ce-9839-f74ae867f317","Type":"ContainerStarted","Data":"b9bd4d913975e9fd755b8029aa4a693c8c94d4867ebe5ecd35bbb1b074bc57da"} Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.494384 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bk84c" event={"ID":"d140a287-5d7c-4379-bc27-a6156ff3379d","Type":"ContainerStarted","Data":"5e9577ec13da561ac0e47f5b1242880fabf0529a555b71534ee6d29277240e68"} Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.494415 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bk84c" event={"ID":"d140a287-5d7c-4379-bc27-a6156ff3379d","Type":"ContainerStarted","Data":"62338abb7c87899a28d3c0d8d0244c1d2d219a11469fd21d570a8f3148fc1441"} Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.538618 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-authentication-operator/authentication-operator-69f744f599-qt2tv"] Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.547196 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4zxlq"] Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.557369 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gkppd"] Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.570162 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-wprnh"] Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.576604 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:06 crc kubenswrapper[4750]: E1008 18:13:06.577011 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 18:13:07.076992925 +0000 UTC m=+142.989963938 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2cxc5" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.683820 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:06 crc kubenswrapper[4750]: E1008 18:13:06.684657 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:13:07.184633743 +0000 UTC m=+143.097604756 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.684808 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:06 crc kubenswrapper[4750]: E1008 18:13:06.685107 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 18:13:07.185097534 +0000 UTC m=+143.098068537 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2cxc5" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.688209 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zt7f6"] Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.785401 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:06 crc kubenswrapper[4750]: E1008 18:13:06.785592 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:13:07.285564843 +0000 UTC m=+143.198535856 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.787870 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:06 crc kubenswrapper[4750]: E1008 18:13:06.788371 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 18:13:07.288360669 +0000 UTC m=+143.201331682 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2cxc5" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.889340 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:06 crc kubenswrapper[4750]: E1008 18:13:06.890102 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:13:07.390082486 +0000 UTC m=+143.303053499 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:06 crc kubenswrapper[4750]: W1008 18:13:06.901258 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5408b201_966b_4e3e_b681_faaa2f32d633.slice/crio-4e9ed9cf69e55ba52fdfca72e733e45b4b73106112a2739ff3af1ed60ba0f7dd WatchSource:0}: Error finding container 4e9ed9cf69e55ba52fdfca72e733e45b4b73106112a2739ff3af1ed60ba0f7dd: Status 404 returned error can't find the container with id 4e9ed9cf69e55ba52fdfca72e733e45b4b73106112a2739ff3af1ed60ba0f7dd Oct 08 18:13:06 crc kubenswrapper[4750]: I1008 18:13:06.992746 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:06 crc kubenswrapper[4750]: E1008 18:13:06.993090 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 18:13:07.493076945 +0000 UTC m=+143.406047958 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2cxc5" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.094094 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:07 crc kubenswrapper[4750]: E1008 18:13:07.094785 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:13:07.594725322 +0000 UTC m=+143.507724895 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.123833 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.175387 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4lzzt"] Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.191334 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-n57nw"] Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.196378 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:07 crc kubenswrapper[4750]: E1008 18:13:07.199237 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 18:13:07.699222975 +0000 UTC m=+143.612193988 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2cxc5" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.298210 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:07 crc kubenswrapper[4750]: E1008 18:13:07.299145 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:13:07.79912433 +0000 UTC m=+143.712095353 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.363341 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" podStartSLOduration=124.363324269 podStartE2EDuration="2m4.363324269s" podCreationTimestamp="2025-10-08 18:11:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:13:07.362619693 +0000 UTC m=+143.275590706" watchObservedRunningTime="2025-10-08 18:13:07.363324269 +0000 UTC m=+143.276295272" Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.400763 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:07 crc kubenswrapper[4750]: E1008 18:13:07.401270 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 18:13:07.901252177 +0000 UTC m=+143.814223190 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2cxc5" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.505057 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:07 crc kubenswrapper[4750]: E1008 18:13:07.505504 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:13:08.005478555 +0000 UTC m=+143.918449578 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.505826 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:07 crc kubenswrapper[4750]: E1008 18:13:07.506633 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 18:13:08.00652803 +0000 UTC m=+143.919499043 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2cxc5" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.506768 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zt7f6" event={"ID":"5408b201-966b-4e3e-b681-faaa2f32d633","Type":"ContainerStarted","Data":"4e9ed9cf69e55ba52fdfca72e733e45b4b73106112a2739ff3af1ed60ba0f7dd"} Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.534824 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-tdftm" event={"ID":"08bc2767-2a02-4bc1-b3cd-47670db11792","Type":"ContainerStarted","Data":"b83b6b30ca02c0c69ca01b2983b697f8041b70871c18cca58b75ac9ef6fc3f05"} Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.535228 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-tdftm" event={"ID":"08bc2767-2a02-4bc1-b3cd-47670db11792","Type":"ContainerStarted","Data":"f0393ceb8c4080a9744e3782eef442e392bf8bc47871da7975f7907763b8dd08"} Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.544669 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-4r4tz" event={"ID":"a20bcaf3-28dc-41f5-8c3e-25b3dfad8202","Type":"ContainerStarted","Data":"da6cc2b968ad96a5bac9e7db63f4b2ba8503c87e27e0b25d48976c0e5588b295"} Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.556640 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-n57nw" 
event={"ID":"f53f5c28-7650-4388-981f-4ae86cbf8068","Type":"ContainerStarted","Data":"9af976900a2536e2f9e6b611eac9a4bccdcecfac65edb747332eb8f10d32c2ef"} Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.600162 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-qt2tv" event={"ID":"db0361d2-e90e-43d1-8ea7-29c7167d4d05","Type":"ContainerStarted","Data":"586e7326a50a5b829b150d99c8d4aa8865ad52efc8574d7047b044f0265d7f19"} Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.600460 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-qt2tv" event={"ID":"db0361d2-e90e-43d1-8ea7-29c7167d4d05","Type":"ContainerStarted","Data":"334cffb71ab126f1314a54f3b79cee60bd1303642befeb9be1c20448f2d083c4"} Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.610509 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:07 crc kubenswrapper[4750]: E1008 18:13:07.610730 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:13:08.110688486 +0000 UTC m=+144.023659499 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.614822 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:07 crc kubenswrapper[4750]: E1008 18:13:07.617800 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 18:13:08.117780103 +0000 UTC m=+144.030751116 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2cxc5" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.618505 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-jw49l" event={"ID":"8d1619a1-f1c5-455f-814e-0cff00e053c0","Type":"ContainerStarted","Data":"595133e965de9ea9276e48b82fcd240fea7425d520d3d90564a224e4c21c1b4f"} Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.632976 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gkppd" event={"ID":"95f54243-6144-4ea9-acc6-b3f3c580e037","Type":"ContainerStarted","Data":"dd3cd426554c47610f7bb40e427b3734e4a39f1dd2daba91941838ac8a2a2c72"} Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.633032 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gkppd" event={"ID":"95f54243-6144-4ea9-acc6-b3f3c580e037","Type":"ContainerStarted","Data":"a4ccbc9ffe11f35741ce0c0836b70adf165a3ea8c414715f3b6da681fea1cb5d"} Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.651002 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v6gc8" podStartSLOduration=123.650973229 podStartE2EDuration="2m3.650973229s" podCreationTimestamp="2025-10-08 18:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-08 18:13:07.650280993 +0000 UTC m=+143.563252016" watchObservedRunningTime="2025-10-08 18:13:07.650973229 +0000 UTC m=+143.563944242" Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.663028 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4lzzt" event={"ID":"346b036e-ea6a-4cc3-ab4a-7dbbdf59405c","Type":"ContainerStarted","Data":"509dce022a97fa539018ad67f196eb443963b5d203af67b8aa2eb332bffa344e"} Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.696923 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-mfdcf" event={"ID":"354ca34c-c524-46db-91e6-7e99e8e1eebe","Type":"ContainerStarted","Data":"24e78478e2763a5b82a8a70f8ba91831d9962a1f3584416ed55d6f8c604117d2"} Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.716209 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.723051 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-wprnh" event={"ID":"7c3552dc-a0cf-4072-91e1-030803f6014d","Type":"ContainerStarted","Data":"1e7135e57dec73c13124fe8beeb48ff648f3b79424fb4b45bbb1a54dc2efe3ef"} Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.723103 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-wprnh" event={"ID":"7c3552dc-a0cf-4072-91e1-030803f6014d","Type":"ContainerStarted","Data":"bd034864bd4459958435f81400ed7f11faeb47aae5ce740cd76749145fac5d1c"} Oct 08 18:13:07 crc kubenswrapper[4750]: E1008 18:13:07.732742 4750 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:13:08.232707314 +0000 UTC m=+144.145678327 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.735935 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4h2qz" event={"ID":"d0696be9-7ac8-46ce-9839-f74ae867f317","Type":"ContainerStarted","Data":"9bdad5868d74f4c7c8430933733a399b573784830a533077b1730a933733a88f"} Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.735984 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4h2qz" event={"ID":"d0696be9-7ac8-46ce-9839-f74ae867f317","Type":"ContainerStarted","Data":"42b44402870c12bfeb29b37bcd95573356a91b7a99788a30c01ee5585acab7bf"} Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.736917 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-69h7b" podStartSLOduration=123.736878693 podStartE2EDuration="2m3.736878693s" podCreationTimestamp="2025-10-08 18:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:13:07.682176888 +0000 UTC m=+143.595147911" watchObservedRunningTime="2025-10-08 18:13:07.736878693 +0000 UTC 
m=+143.649849706" Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.803989 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-59hsn" event={"ID":"8b2784aa-3090-4fd9-a1cb-28b99b181a64","Type":"ContainerStarted","Data":"6cc94cfa76f3ed9d607b4d456c7462cee7b1942b77baa7a032b2aaed017d4395"} Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.836742 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-fzfgf" event={"ID":"84dd6602-6fa4-4615-a79c-d571fec1a58c","Type":"ContainerStarted","Data":"413e44477c33cca6c803290c2bd15bd3462f6e6819b3928914d8bcac217a21a2"} Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.839416 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:07 crc kubenswrapper[4750]: E1008 18:13:07.851154 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 18:13:08.351127068 +0000 UTC m=+144.264098081 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2cxc5" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.852607 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-fzfgf" Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.868371 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w8527"] Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.872127 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-4zxlq" event={"ID":"90f5c875-3507-453d-9901-fd60a1476f71","Type":"ContainerStarted","Data":"00781f350fb9e6c881c670fa305232fcc6f4f516dd647ac20adadf17d821a8a6"} Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.872164 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-4zxlq" event={"ID":"90f5c875-3507-453d-9901-fd60a1476f71","Type":"ContainerStarted","Data":"37d3972a4ef95ee2f606e183a67b5b6aece3fad52bef027f385080c76eb38284"} Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.872480 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bk84c" podStartSLOduration=124.870598339 podStartE2EDuration="2m4.870598339s" podCreationTimestamp="2025-10-08 18:11:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-08 18:13:07.866388389 +0000 UTC m=+143.779359402" watchObservedRunningTime="2025-10-08 18:13:07.870598339 +0000 UTC m=+143.783569352" Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.873276 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-4zxlq" Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.881153 4750 patch_prober.go:28] interesting pod/console-operator-58897d9998-fzfgf container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.881233 4750 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-fzfgf" podUID="84dd6602-6fa4-4615-a79c-d571fec1a58c" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.887179 4750 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-4zxlq container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.887279 4750 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-4zxlq" podUID="90f5c875-3507-453d-9901-fd60a1476f71" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.890412 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd-operator/etcd-operator-b45778765-bvwfn" event={"ID":"bbde5089-5fb0-4faf-9788-21ca7048e6f2","Type":"ContainerStarted","Data":"8bfcc3897b8747e8e019ad02fe7f68d3f893bc61e4d7a517a62b518432714272"} Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.904702 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-hsr6t"] Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.917804 4750 generic.go:334] "Generic (PLEG): container finished" podID="1ca846a3-8fac-4636-8a95-5c3b877d3477" containerID="c1c1513857c63caf104f8b356c1c43cd40153ec6fc2a6153b0b9c7f10ae21c25" exitCode=0 Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.918846 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hzx59" event={"ID":"1ca846a3-8fac-4636-8a95-5c3b877d3477","Type":"ContainerDied","Data":"c1c1513857c63caf104f8b356c1c43cd40153ec6fc2a6153b0b9c7f10ae21c25"} Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.956505 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-rdtt7"] Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.957503 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-69h7b" Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.960415 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.967937 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:07 crc kubenswrapper[4750]: E1008 18:13:07.970338 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:13:08.470303069 +0000 UTC m=+144.383274082 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.985890 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-44fvq"] Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.986103 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-xj8d6"] Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.988761 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-5g5nc"] Oct 08 18:13:07 crc kubenswrapper[4750]: I1008 18:13:07.991999 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-l6dqx"] Oct 08 18:13:08 crc kubenswrapper[4750]: I1008 18:13:08.015742 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s7qzd"] Oct 08 18:13:08 crc kubenswrapper[4750]: I1008 18:13:08.016871 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-apiserver/apiserver-76f77b778f-vwt7h"] Oct 08 18:13:08 crc kubenswrapper[4750]: W1008 18:13:08.038979 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f878d47_ed9f_4f96_a7c0_153c58ff33cf.slice/crio-933e752132f2a420d784301212554720687de6292e8cbdc538483a6709dc0ae5 WatchSource:0}: Error finding container 933e752132f2a420d784301212554720687de6292e8cbdc538483a6709dc0ae5: Status 404 returned error can't find the container with id 933e752132f2a420d784301212554720687de6292e8cbdc538483a6709dc0ae5 Oct 08 18:13:08 crc kubenswrapper[4750]: I1008 18:13:08.070023 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-fzfgf" podStartSLOduration=124.069994659 podStartE2EDuration="2m4.069994659s" podCreationTimestamp="2025-10-08 18:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:13:08.069671221 +0000 UTC m=+143.982642254" watchObservedRunningTime="2025-10-08 18:13:08.069994659 +0000 UTC m=+143.982965672" Oct 08 18:13:08 crc kubenswrapper[4750]: I1008 18:13:08.071309 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:08 crc kubenswrapper[4750]: E1008 18:13:08.076611 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-08 18:13:08.576594045 +0000 UTC m=+144.489565058 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2cxc5" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:08 crc kubenswrapper[4750]: I1008 18:13:08.125865 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-bvwfn" podStartSLOduration=124.125848911 podStartE2EDuration="2m4.125848911s" podCreationTimestamp="2025-10-08 18:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:13:08.123197808 +0000 UTC m=+144.036168831" watchObservedRunningTime="2025-10-08 18:13:08.125848911 +0000 UTC m=+144.038819924" Oct 08 18:13:08 crc kubenswrapper[4750]: I1008 18:13:08.172908 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:08 crc kubenswrapper[4750]: E1008 18:13:08.173207 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:13:08.673192562 +0000 UTC m=+144.586163575 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:08 crc kubenswrapper[4750]: I1008 18:13:08.197681 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gkppd" podStartSLOduration=124.197664231 podStartE2EDuration="2m4.197664231s" podCreationTimestamp="2025-10-08 18:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:13:08.19590941 +0000 UTC m=+144.108880423" watchObservedRunningTime="2025-10-08 18:13:08.197664231 +0000 UTC m=+144.110635244" Oct 08 18:13:08 crc kubenswrapper[4750]: I1008 18:13:08.243019 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4h2qz" podStartSLOduration=124.243005145 podStartE2EDuration="2m4.243005145s" podCreationTimestamp="2025-10-08 18:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:13:08.241774496 +0000 UTC m=+144.154745509" watchObservedRunningTime="2025-10-08 18:13:08.243005145 +0000 UTC m=+144.155976148" Oct 08 18:13:08 crc kubenswrapper[4750]: I1008 18:13:08.274703 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:08 crc kubenswrapper[4750]: E1008 18:13:08.275073 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 18:13:08.775060044 +0000 UTC m=+144.688031057 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2cxc5" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:08 crc kubenswrapper[4750]: I1008 18:13:08.311908 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-tdftm" podStartSLOduration=124.311888995 podStartE2EDuration="2m4.311888995s" podCreationTimestamp="2025-10-08 18:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:13:08.287221571 +0000 UTC m=+144.200192584" watchObservedRunningTime="2025-10-08 18:13:08.311888995 +0000 UTC m=+144.224860008" Oct 08 18:13:08 crc kubenswrapper[4750]: I1008 18:13:08.315248 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5ff9w"] Oct 08 18:13:08 crc kubenswrapper[4750]: I1008 18:13:08.350467 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-tdftm" Oct 08 18:13:08 crc kubenswrapper[4750]: I1008 
18:13:08.363367 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-xz2wz"] Oct 08 18:13:08 crc kubenswrapper[4750]: I1008 18:13:08.363464 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-kjc7k"] Oct 08 18:13:08 crc kubenswrapper[4750]: I1008 18:13:08.376404 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:08 crc kubenswrapper[4750]: E1008 18:13:08.376976 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:13:08.876958775 +0000 UTC m=+144.789929788 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:08 crc kubenswrapper[4750]: I1008 18:13:08.399303 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hg6dd"] Oct 08 18:13:08 crc kubenswrapper[4750]: I1008 18:13:08.406085 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fsv6k"] Oct 08 18:13:08 crc kubenswrapper[4750]: I1008 18:13:08.407678 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-4r4tz" podStartSLOduration=124.407659332 podStartE2EDuration="2m4.407659332s" podCreationTimestamp="2025-10-08 18:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:13:08.386136193 +0000 UTC m=+144.299107216" watchObservedRunningTime="2025-10-08 18:13:08.407659332 +0000 UTC m=+144.320630365" Oct 08 18:13:08 crc kubenswrapper[4750]: I1008 18:13:08.417899 4750 patch_prober.go:28] interesting pod/router-default-5444994796-tdftm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 18:13:08 crc kubenswrapper[4750]: [-]has-synced failed: reason withheld Oct 08 18:13:08 crc kubenswrapper[4750]: [+]process-running ok Oct 08 18:13:08 crc kubenswrapper[4750]: healthz check failed Oct 08 18:13:08 crc kubenswrapper[4750]: I1008 
18:13:08.417978 4750 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tdftm" podUID="08bc2767-2a02-4bc1-b3cd-47670db11792" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 18:13:08 crc kubenswrapper[4750]: I1008 18:13:08.441628 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-wprnh" podStartSLOduration=124.441602636 podStartE2EDuration="2m4.441602636s" podCreationTimestamp="2025-10-08 18:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:13:08.432220854 +0000 UTC m=+144.345191867" watchObservedRunningTime="2025-10-08 18:13:08.441602636 +0000 UTC m=+144.354573659" Oct 08 18:13:08 crc kubenswrapper[4750]: I1008 18:13:08.456602 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wrzwf"] Oct 08 18:13:08 crc kubenswrapper[4750]: I1008 18:13:08.465091 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-mfdcf" podStartSLOduration=5.465074621 podStartE2EDuration="5.465074621s" podCreationTimestamp="2025-10-08 18:13:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:13:08.463795091 +0000 UTC m=+144.376766114" watchObservedRunningTime="2025-10-08 18:13:08.465074621 +0000 UTC m=+144.378045634" Oct 08 18:13:08 crc kubenswrapper[4750]: I1008 18:13:08.468428 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-n87fc"] Oct 08 18:13:08 crc kubenswrapper[4750]: I1008 18:13:08.476906 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-glbv9"] Oct 08 
18:13:08 crc kubenswrapper[4750]: I1008 18:13:08.477711 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:08 crc kubenswrapper[4750]: E1008 18:13:08.478062 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 18:13:08.978051219 +0000 UTC m=+144.891022232 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2cxc5" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:08 crc kubenswrapper[4750]: I1008 18:13:08.510353 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-4zxlq" podStartSLOduration=124.510329313 podStartE2EDuration="2m4.510329313s" podCreationTimestamp="2025-10-08 18:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:13:08.50556302 +0000 UTC m=+144.418534033" watchObservedRunningTime="2025-10-08 18:13:08.510329313 +0000 UTC m=+144.423300326" Oct 08 18:13:08 crc kubenswrapper[4750]: I1008 18:13:08.517642 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-service-ca-operator/service-ca-operator-777779d784-2dhb9"] Oct 08 18:13:08 crc kubenswrapper[4750]: I1008 18:13:08.517761 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dkkbr"] Oct 08 18:13:08 crc kubenswrapper[4750]: I1008 18:13:08.546764 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wtkzp"] Oct 08 18:13:08 crc kubenswrapper[4750]: I1008 18:13:08.547207 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-qt2tv" podStartSLOduration=125.547196986 podStartE2EDuration="2m5.547196986s" podCreationTimestamp="2025-10-08 18:11:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:13:08.537503256 +0000 UTC m=+144.450474269" watchObservedRunningTime="2025-10-08 18:13:08.547196986 +0000 UTC m=+144.460167999" Oct 08 18:13:08 crc kubenswrapper[4750]: W1008 18:13:08.548755 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55eee756_293d_4ca8_91d0_6f95f22e72dc.slice/crio-c715d399e5d200848ac39826e9ed76d2ae90027b1d34effa635f0880fa68e4ed WatchSource:0}: Error finding container c715d399e5d200848ac39826e9ed76d2ae90027b1d34effa635f0880fa68e4ed: Status 404 returned error can't find the container with id c715d399e5d200848ac39826e9ed76d2ae90027b1d34effa635f0880fa68e4ed Oct 08 18:13:08 crc kubenswrapper[4750]: I1008 18:13:08.559418 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332440-n7qpc"] Oct 08 18:13:08 crc kubenswrapper[4750]: I1008 18:13:08.578921 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:08 crc kubenswrapper[4750]: E1008 18:13:08.579843 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:13:09.079197843 +0000 UTC m=+144.992168856 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:08 crc kubenswrapper[4750]: I1008 18:13:08.579978 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:08 crc kubenswrapper[4750]: E1008 18:13:08.580486 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 18:13:09.080478264 +0000 UTC m=+144.993449277 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2cxc5" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:08 crc kubenswrapper[4750]: I1008 18:13:08.590160 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-jw49l" podStartSLOduration=124.590139723 podStartE2EDuration="2m4.590139723s" podCreationTimestamp="2025-10-08 18:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:13:08.586994067 +0000 UTC m=+144.499965080" watchObservedRunningTime="2025-10-08 18:13:08.590139723 +0000 UTC m=+144.503110736" Oct 08 18:13:08 crc kubenswrapper[4750]: I1008 18:13:08.682587 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:08 crc kubenswrapper[4750]: E1008 18:13:08.684208 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:13:09.184183999 +0000 UTC m=+145.097155012 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:08 crc kubenswrapper[4750]: I1008 18:13:08.684501 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:08 crc kubenswrapper[4750]: E1008 18:13:08.684818 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 18:13:09.184811893 +0000 UTC m=+145.097782906 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2cxc5" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:08 crc kubenswrapper[4750]: I1008 18:13:08.786067 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:08 crc kubenswrapper[4750]: E1008 18:13:08.786222 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:13:09.286191074 +0000 UTC m=+145.199162087 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:08 crc kubenswrapper[4750]: I1008 18:13:08.786367 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:08 crc kubenswrapper[4750]: E1008 18:13:08.786745 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 18:13:09.286730736 +0000 UTC m=+145.199701749 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2cxc5" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:08 crc kubenswrapper[4750]: I1008 18:13:08.888298 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:08 crc kubenswrapper[4750]: E1008 18:13:08.888856 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:13:09.388833923 +0000 UTC m=+145.301804926 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:08 crc kubenswrapper[4750]: I1008 18:13:08.990607 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:08 crc kubenswrapper[4750]: E1008 18:13:08.990969 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 18:13:09.490951971 +0000 UTC m=+145.403922984 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2cxc5" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:08 crc kubenswrapper[4750]: I1008 18:13:08.992841 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-l6dqx" event={"ID":"b594b560-3db7-4be0-8799-f6eeac7d3aeb","Type":"ContainerStarted","Data":"7ab092077cf776f37cd48748ff3d8b1b5927e258b6cd5467062feb3362f68d3b"} Oct 08 18:13:08 crc kubenswrapper[4750]: I1008 18:13:08.992905 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-l6dqx" event={"ID":"b594b560-3db7-4be0-8799-f6eeac7d3aeb","Type":"ContainerStarted","Data":"6f872c26214a3f3251df8446be385ca18764f9009c98202cbf5efaa972fe9727"} Oct 08 18:13:08 crc kubenswrapper[4750]: I1008 18:13:08.995288 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4lzzt" event={"ID":"346b036e-ea6a-4cc3-ab4a-7dbbdf59405c","Type":"ContainerStarted","Data":"6f5a636a709f5bed4c12a5f73d5f8e96c25fb651ac15bdf74492094553941bc7"} Oct 08 18:13:08 crc kubenswrapper[4750]: I1008 18:13:08.997383 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-l6dqx" Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.002223 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wrzwf" 
event={"ID":"9fa6043a-28e1-436e-9d43-f4f37c431457","Type":"ContainerStarted","Data":"beec093a732dee01436f7ba35b8d5d767f08171d39ef557157eb3a7d325051f3"} Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.009575 4750 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-l6dqx container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.009634 4750 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-l6dqx" podUID="b594b560-3db7-4be0-8799-f6eeac7d3aeb" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.016136 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hzx59" event={"ID":"1ca846a3-8fac-4636-8a95-5c3b877d3477","Type":"ContainerStarted","Data":"f844755ae9549c20c99edc0d6ddcb3587526bc3796cc22390daae9a373acb318"} Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.018398 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hzx59" Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.016804 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-l6dqx" podStartSLOduration=125.016793372 podStartE2EDuration="2m5.016793372s" podCreationTimestamp="2025-10-08 18:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:13:09.01582285 +0000 UTC m=+144.928793893" 
watchObservedRunningTime="2025-10-08 18:13:09.016793372 +0000 UTC m=+144.929764375" Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.031822 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-n87fc" event={"ID":"2b48fe6f-87c9-43fc-aa46-4c2fa7e62fe8","Type":"ContainerStarted","Data":"7ebd39fce43d67be7278bbf3596b421e1a22d646f94752b95e962bbfc2e9d660"} Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.043919 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332440-n7qpc" event={"ID":"4227b4db-94b2-4e5f-a231-aa0b6bbe685c","Type":"ContainerStarted","Data":"c32ad658728639322e0ec7bf39719dc9f200b613ee715ada152065440ad53293"} Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.055798 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hg6dd" event={"ID":"884c29cd-b721-400c-b319-510f191f02dd","Type":"ContainerStarted","Data":"3cfce55127e4cf1976066d168746f6616ca371b83bf3acaebd6ddd49f7f2a57a"} Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.074467 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-n57nw" event={"ID":"f53f5c28-7650-4388-981f-4ae86cbf8068","Type":"ContainerStarted","Data":"7c7d525e199002a8826dad87b54609ca4e04a0445d25926bf74e7bfcd3a17cd7"} Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.074998 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-n57nw" event={"ID":"f53f5c28-7650-4388-981f-4ae86cbf8068","Type":"ContainerStarted","Data":"d0de86bb66542c8c8f9f969ada2185e023731797103df8d17aeeff87a500fd9c"} Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.075022 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-n57nw" Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.074856 4750 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hzx59" podStartSLOduration=125.074835257 podStartE2EDuration="2m5.074835257s" podCreationTimestamp="2025-10-08 18:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:13:09.074066068 +0000 UTC m=+144.987037091" watchObservedRunningTime="2025-10-08 18:13:09.074835257 +0000 UTC m=+144.987806270" Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.075395 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4lzzt" podStartSLOduration=125.075382059 podStartE2EDuration="2m5.075382059s" podCreationTimestamp="2025-10-08 18:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:13:09.041972478 +0000 UTC m=+144.954943511" watchObservedRunningTime="2025-10-08 18:13:09.075382059 +0000 UTC m=+144.988353072" Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.092645 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:09 crc kubenswrapper[4750]: E1008 18:13:09.093955 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:13:09.593925468 +0000 UTC m=+145.506896481 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.094486 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:09 crc kubenswrapper[4750]: E1008 18:13:09.097138 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 18:13:09.597114974 +0000 UTC m=+145.510085987 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2cxc5" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.111644 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-n57nw" podStartSLOduration=6.111619467 podStartE2EDuration="6.111619467s" podCreationTimestamp="2025-10-08 18:13:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:13:09.108373581 +0000 UTC m=+145.021344594" watchObservedRunningTime="2025-10-08 18:13:09.111619467 +0000 UTC m=+145.024590480" Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.119256 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-xj8d6" event={"ID":"988fa55b-c750-4c47-aaf6-fb602e6477d7","Type":"ContainerStarted","Data":"a3767fa7afb51fca7413385942e475132fd34dd8b4b0f04a4f970d79a01879b4"} Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.119439 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-xj8d6" event={"ID":"988fa55b-c750-4c47-aaf6-fb602e6477d7","Type":"ContainerStarted","Data":"991ae4fde37f692557dc73f3ee50eb40ac9be2b877ec682f9cab7ed56258823c"} Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.131384 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-kjc7k" 
event={"ID":"19b57874-b782-45bf-a257-3a162023a242","Type":"ContainerStarted","Data":"4e76bf1c3a0bb35c34948e169bc2a5c55f77989c64fb52396a4411c94f112a7b"} Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.142579 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zt7f6" event={"ID":"5408b201-966b-4e3e-b681-faaa2f32d633","Type":"ContainerStarted","Data":"7772b9b70e7e939286bb3787ce312db366846e6e361af069b63427a2f203c4ec"} Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.146361 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wtkzp" event={"ID":"59a547a6-ce7d-41ea-a985-0e477deec34a","Type":"ContainerStarted","Data":"b5dc696ec20aed9b4ef59a14ec25b59233f2cee56527f44297d814c854538a50"} Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.147576 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-59hsn" event={"ID":"8b2784aa-3090-4fd9-a1cb-28b99b181a64","Type":"ContainerStarted","Data":"f13ce92411c256968c3d07d18ef88c1c1d88722bebea23ad83e567ba86bcb4cd"} Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.149025 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5ff9w" event={"ID":"ce206052-32b9-4632-a7d7-4bd6630a6fd5","Type":"ContainerStarted","Data":"a9a71fdd0f83cf97d4754cdb52d6683c601d6d6dcb239fb8d8729fee1b77e0c5"} Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.171289 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zt7f6" podStartSLOduration=125.17126343 podStartE2EDuration="2m5.17126343s" podCreationTimestamp="2025-10-08 18:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-08 18:13:09.162972363 +0000 UTC m=+145.075943386" watchObservedRunningTime="2025-10-08 18:13:09.17126343 +0000 UTC m=+145.084234453" Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.185798 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w8527" event={"ID":"4e73ade2-f6bf-4f9b-98da-1b6a2c9a7d60","Type":"ContainerStarted","Data":"6e08366fbf47e362f69bc912a211fff2501d666e218fe9dbda0bae1af8aebe10"} Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.185985 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w8527" event={"ID":"4e73ade2-f6bf-4f9b-98da-1b6a2c9a7d60","Type":"ContainerStarted","Data":"7340e469656f8468fc01a4f0a5336db8d9dbbc04cf5704f95e9faa93a0daf00c"} Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.195197 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:09 crc kubenswrapper[4750]: E1008 18:13:09.196953 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:13:09.696907556 +0000 UTC m=+145.609878749 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.200151 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s7qzd" event={"ID":"4f16bfec-96c4-4e7b-83e1-aed47eecbefb","Type":"ContainerStarted","Data":"a4b6b41fe1ebdd83f426285d5c47c20efbd5a736ea9cb040a979b927e093a2df"} Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.200199 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s7qzd" event={"ID":"4f16bfec-96c4-4e7b-83e1-aed47eecbefb","Type":"ContainerStarted","Data":"0b40919af403a99e9746ee2fef1bcf464632ba742a6ecdf9177f93347c38c0b7"} Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.219666 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xz2wz" event={"ID":"1c3d5788-4162-4ca6-b3d4-80d5d3062ea3","Type":"ContainerStarted","Data":"5b6a95ea647ea492a36700ffc685b947f8c238d6008766efad8f0e9008094b05"} Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.258221 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-59hsn" podStartSLOduration=125.258193817 podStartE2EDuration="2m5.258193817s" podCreationTimestamp="2025-10-08 18:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:13:09.186059029 +0000 UTC 
m=+145.099030052" watchObservedRunningTime="2025-10-08 18:13:09.258193817 +0000 UTC m=+145.171164830" Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.259283 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s7qzd" podStartSLOduration=126.259273702 podStartE2EDuration="2m6.259273702s" podCreationTimestamp="2025-10-08 18:11:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:13:09.247875043 +0000 UTC m=+145.160846056" watchObservedRunningTime="2025-10-08 18:13:09.259273702 +0000 UTC m=+145.172244715" Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.298271 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hsr6t" event={"ID":"c5bf2447-4038-4736-999a-f5b87905a99a","Type":"ContainerStarted","Data":"e5e82172205f445d9528cd388a6bf5bed0802c48e3ac4c6e31311d4f0a425f5c"} Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.298885 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hsr6t" event={"ID":"c5bf2447-4038-4736-999a-f5b87905a99a","Type":"ContainerStarted","Data":"f9f8c51f0a68b3e7215c423ea0688373a683feb8487d0975c1bb00dfa1b699a6"} Oct 08 18:13:09 crc kubenswrapper[4750]: E1008 18:13:09.298628 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 18:13:09.798614344 +0000 UTC m=+145.711585357 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2cxc5" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.298328 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.315888 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5g5nc" event={"ID":"8f878d47-ed9f-4f96-a7c0-153c58ff33cf","Type":"ContainerStarted","Data":"933e752132f2a420d784301212554720687de6292e8cbdc538483a6709dc0ae5"} Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.331880 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-44fvq" event={"ID":"2254728a-2ece-43b6-8b5f-a16970277d33","Type":"ContainerStarted","Data":"2b20ed4aab5291df3aef72d8aaeb2e89eaecce26bf91b95c16791bdbe2b6022f"} Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.331926 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-44fvq" event={"ID":"2254728a-2ece-43b6-8b5f-a16970277d33","Type":"ContainerStarted","Data":"6b2563cfab14c601f608d642f1182e1c231dc9223e349c638e4ceb5a70f250b9"} Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 
18:13:09.345087 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rdtt7" event={"ID":"80400620-75a2-4d87-ad1a-2ef29346babc","Type":"ContainerStarted","Data":"cc60a502dfbec4f4ba456fa95191d171d2ae59b5aeeb03fc3d8d3da4b57cdcc7"} Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.345131 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rdtt7" event={"ID":"80400620-75a2-4d87-ad1a-2ef29346babc","Type":"ContainerStarted","Data":"e708be94fce0be8917f223d36a4a9a54f846f3e31c5406e6ee580b54992a11c5"} Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.352925 4750 patch_prober.go:28] interesting pod/router-default-5444994796-tdftm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 18:13:09 crc kubenswrapper[4750]: [-]has-synced failed: reason withheld Oct 08 18:13:09 crc kubenswrapper[4750]: [+]process-running ok Oct 08 18:13:09 crc kubenswrapper[4750]: healthz check failed Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.353030 4750 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tdftm" podUID="08bc2767-2a02-4bc1-b3cd-47670db11792" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.362434 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-44fvq" podStartSLOduration=125.362419854 podStartE2EDuration="2m5.362419854s" podCreationTimestamp="2025-10-08 18:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:13:09.360072288 +0000 UTC m=+145.273043311" 
watchObservedRunningTime="2025-10-08 18:13:09.362419854 +0000 UTC m=+145.275390867" Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.367239 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-vwt7h" event={"ID":"77eda81d-119c-4876-a951-e7262ed136b9","Type":"ContainerStarted","Data":"348fcf05ae06374553d26994bb0678f363fe8db50ed0873f2d18d59423917455"} Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.381793 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dkkbr" event={"ID":"79a05c13-3cd6-48ce-ba2f-6cfdf1a51093","Type":"ContainerStarted","Data":"435c1df279078687b0caff1b0a1836a7d388ab425107f7763cae724f3348cadc"} Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.405010 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:09 crc kubenswrapper[4750]: E1008 18:13:09.405404 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:13:09.905373132 +0000 UTC m=+145.818344155 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.494580 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fsv6k" event={"ID":"55eee756-293d-4ca8-91d0-6f95f22e72dc","Type":"ContainerStarted","Data":"c715d399e5d200848ac39826e9ed76d2ae90027b1d34effa635f0880fa68e4ed"} Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.507235 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:09 crc kubenswrapper[4750]: E1008 18:13:09.508221 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 18:13:10.008205215 +0000 UTC m=+145.921176228 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2cxc5" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.522239 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-2dhb9" event={"ID":"77a8f6e2-2a14-4876-b61c-4f9eb7581f90","Type":"ContainerStarted","Data":"29c7b14638f6629234f972ee1c76fc016fba28c47e08317487b37392fc9640fe"} Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.539757 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-glbv9" event={"ID":"e522b21c-5710-4f2e-bff5-b76226c88d2f","Type":"ContainerStarted","Data":"984cc9a5538b2d11e1c53312b7b561397abf125d77ad904bd7092b017db878c6"} Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.540960 4750 patch_prober.go:28] interesting pod/downloads-7954f5f757-jw49l container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.541012 4750 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jw49l" podUID="8d1619a1-f1c5-455f-814e-0cff00e053c0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.541145 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-console/downloads-7954f5f757-jw49l" Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.570630 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-fzfgf" Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.589051 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-4zxlq" Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.608698 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:09 crc kubenswrapper[4750]: E1008 18:13:09.610511 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:13:10.110496077 +0000 UTC m=+146.023467090 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.712737 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:09 crc kubenswrapper[4750]: E1008 18:13:09.713021 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 18:13:10.213008804 +0000 UTC m=+146.125979817 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2cxc5" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.814881 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:09 crc kubenswrapper[4750]: E1008 18:13:09.815036 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:13:10.315007038 +0000 UTC m=+146.227978051 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.815537 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:09 crc kubenswrapper[4750]: E1008 18:13:09.815849 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 18:13:10.315841128 +0000 UTC m=+146.228812141 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2cxc5" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:09 crc kubenswrapper[4750]: I1008 18:13:09.919264 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:09 crc kubenswrapper[4750]: E1008 18:13:09.919588 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:13:10.419571724 +0000 UTC m=+146.332542737 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.020480 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:10 crc kubenswrapper[4750]: E1008 18:13:10.020819 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 18:13:10.52080541 +0000 UTC m=+146.433776423 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2cxc5" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.121660 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:10 crc kubenswrapper[4750]: E1008 18:13:10.122057 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:13:10.622007787 +0000 UTC m=+146.534978800 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.122653 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:10 crc kubenswrapper[4750]: E1008 18:13:10.122991 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 18:13:10.622973499 +0000 UTC m=+146.535944512 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2cxc5" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.224165 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:10 crc kubenswrapper[4750]: E1008 18:13:10.224784 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:13:10.724767549 +0000 UTC m=+146.637738562 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.329811 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:10 crc kubenswrapper[4750]: E1008 18:13:10.330148 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 18:13:10.830134393 +0000 UTC m=+146.743105406 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2cxc5" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.345207 4750 patch_prober.go:28] interesting pod/router-default-5444994796-tdftm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 18:13:10 crc kubenswrapper[4750]: [-]has-synced failed: reason withheld Oct 08 18:13:10 crc kubenswrapper[4750]: [+]process-running ok Oct 08 18:13:10 crc kubenswrapper[4750]: healthz check failed Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.345563 4750 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tdftm" podUID="08bc2767-2a02-4bc1-b3cd-47670db11792" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.431069 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:10 crc kubenswrapper[4750]: E1008 18:13:10.431377 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-08 18:13:10.931316439 +0000 UTC m=+146.844287452 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.432010 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:10 crc kubenswrapper[4750]: E1008 18:13:10.432501 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 18:13:10.932492946 +0000 UTC m=+146.845463959 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2cxc5" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.532830 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:10 crc kubenswrapper[4750]: E1008 18:13:10.533202 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:13:11.03317822 +0000 UTC m=+146.946149233 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.546220 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w8527" event={"ID":"4e73ade2-f6bf-4f9b-98da-1b6a2c9a7d60","Type":"ContainerStarted","Data":"04281d0c3c4141480f468e1f6d9e11c469590732ead7d0057624047edabd40b9"} Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.549218 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-xj8d6" event={"ID":"988fa55b-c750-4c47-aaf6-fb602e6477d7","Type":"ContainerStarted","Data":"eb47534ad71eccbe9173f8cda0b3ae22d075421bc6de95f8b7138b73ac118fe4"} Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.552198 4750 generic.go:334] "Generic (PLEG): container finished" podID="77eda81d-119c-4876-a951-e7262ed136b9" containerID="215538a3ff7d7f3c31ffe79ad3ddeb96742f81d39334dc84fb00db4973bc5642" exitCode=0 Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.552236 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-vwt7h" event={"ID":"77eda81d-119c-4876-a951-e7262ed136b9","Type":"ContainerDied","Data":"215538a3ff7d7f3c31ffe79ad3ddeb96742f81d39334dc84fb00db4973bc5642"} Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.552262 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-vwt7h" 
event={"ID":"77eda81d-119c-4876-a951-e7262ed136b9","Type":"ContainerStarted","Data":"ecdf88d25ec09441765e731088d475ffa2f4800a3d5de987627337d7f16576ca"} Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.552272 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-vwt7h" event={"ID":"77eda81d-119c-4876-a951-e7262ed136b9","Type":"ContainerStarted","Data":"9502b614e43c6d2b515ac12468c5587ea474e0286eaad7e0f15463f79d459da4"} Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.554612 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xz2wz" event={"ID":"1c3d5788-4162-4ca6-b3d4-80d5d3062ea3","Type":"ContainerStarted","Data":"9d3e573b9f48ac1a430b6009805ebd323427d7e731de780259137f6a189e5b47"} Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.554644 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xz2wz" event={"ID":"1c3d5788-4162-4ca6-b3d4-80d5d3062ea3","Type":"ContainerStarted","Data":"c532a4540e755c497cdf6369eed09255e76e45d818e1d7679a10e434608aeecf"} Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.556237 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-glbv9" event={"ID":"e522b21c-5710-4f2e-bff5-b76226c88d2f","Type":"ContainerStarted","Data":"a6e2ca19a57f96d663b33ddf623b7bcc9b56e75874e947523e5fdfcd469fe83f"} Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.556355 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-glbv9" Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.558185 4750 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-glbv9 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get 
\"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body= Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.558291 4750 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-glbv9" podUID="e522b21c-5710-4f2e-bff5-b76226c88d2f" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.559238 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hsr6t" event={"ID":"c5bf2447-4038-4736-999a-f5b87905a99a","Type":"ContainerStarted","Data":"0026f3067f9134d75418a548d9dd29adb1516238745238c5a3d81d0d0c1beff3"} Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.561738 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-kjc7k" event={"ID":"19b57874-b782-45bf-a257-3a162023a242","Type":"ContainerStarted","Data":"8fc4dd91d91adbb2d24faed3dfa630420bac8f6a31611d3b7d5c20668c2a0916"} Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.564533 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dkkbr" event={"ID":"79a05c13-3cd6-48ce-ba2f-6cfdf1a51093","Type":"ContainerStarted","Data":"acd993d8fcbe04706580b5adb21eb0f597999f71b89fdfbea1a08388b60f368c"} Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.566816 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fsv6k" event={"ID":"55eee756-293d-4ca8-91d0-6f95f22e72dc","Type":"ContainerStarted","Data":"8e8b48a3228337402d2528ed51ea00b5b3f21ec561e493d843b81db90ecff329"} Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.570866 4750 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wrzwf" event={"ID":"9fa6043a-28e1-436e-9d43-f4f37c431457","Type":"ContainerStarted","Data":"99f01645f4a31cf13d3851cbc9781fea161b82984d324faa3ed2e5cdcf1b58a7"} Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.571047 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wrzwf" Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.573159 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332440-n7qpc" event={"ID":"4227b4db-94b2-4e5f-a231-aa0b6bbe685c","Type":"ContainerStarted","Data":"6b18ab44745452e81dd04f4fa6f42e21f155140ec9d7b446a428b84b4d6af79e"} Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.575127 4750 generic.go:334] "Generic (PLEG): container finished" podID="8f878d47-ed9f-4f96-a7c0-153c58ff33cf" containerID="c008fab2986065aa8b1e33cc4c79eaabdd31fb8791f9a7ed5974399c3bed44a2" exitCode=0 Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.575195 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5g5nc" event={"ID":"8f878d47-ed9f-4f96-a7c0-153c58ff33cf","Type":"ContainerDied","Data":"c008fab2986065aa8b1e33cc4c79eaabdd31fb8791f9a7ed5974399c3bed44a2"} Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.582872 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wrzwf" Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.587077 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-2dhb9" event={"ID":"77a8f6e2-2a14-4876-b61c-4f9eb7581f90","Type":"ContainerStarted","Data":"963c0cc3e0232c66684382ec4387e87117a4a05197d76b9c5ad39f8ad7791734"} Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.589498 
4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-n87fc" event={"ID":"2b48fe6f-87c9-43fc-aa46-4c2fa7e62fe8","Type":"ContainerStarted","Data":"686b46ef0c7bef6beabc7aa629d69aaae355ffac6dec54982e9d96482cb92b94"} Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.603086 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w8527" podStartSLOduration=126.603050944 podStartE2EDuration="2m6.603050944s" podCreationTimestamp="2025-10-08 18:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:13:10.597529953 +0000 UTC m=+146.510500966" watchObservedRunningTime="2025-10-08 18:13:10.603050944 +0000 UTC m=+146.516021977" Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.604817 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5ff9w" event={"ID":"ce206052-32b9-4632-a7d7-4bd6630a6fd5","Type":"ContainerStarted","Data":"da3302c298e7f93634bc9c6dedff27f1a2d96c00c2500e63b7ddc1c88dfab719"} Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.604867 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5ff9w" event={"ID":"ce206052-32b9-4632-a7d7-4bd6630a6fd5","Type":"ContainerStarted","Data":"870e141b3f70cc82cdc5b107d2b47c9e55cce8cf4845c64989b6380195e6d3f3"} Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.605329 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5ff9w" Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.607862 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wtkzp" 
event={"ID":"59a547a6-ce7d-41ea-a985-0e477deec34a","Type":"ContainerStarted","Data":"74463ffd2c8bf84d4f79fd7b960f00500a56c72b49d8dbf4bdd11103fd3825bf"} Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.614752 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hg6dd" event={"ID":"884c29cd-b721-400c-b319-510f191f02dd","Type":"ContainerStarted","Data":"4e4ae53c60ee1f99e54a7e08c9e66c23d9c1668ff837abebfa68c409de485b22"} Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.615643 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-hg6dd" Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.616923 4750 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-hg6dd container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.616960 4750 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-hg6dd" podUID="884c29cd-b721-400c-b319-510f191f02dd" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.619213 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rdtt7" event={"ID":"80400620-75a2-4d87-ad1a-2ef29346babc","Type":"ContainerStarted","Data":"7af753d488ecfcd235bcc421477eddb252ab09e393e5809d771a1cb192f4c6b6"} Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.623073 4750 patch_prober.go:28] interesting pod/downloads-7954f5f757-jw49l container/download-server namespace/openshift-console: Readiness probe status=failure 
output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.623164 4750 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jw49l" podUID="8d1619a1-f1c5-455f-814e-0cff00e053c0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.632926 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-l6dqx" Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.633856 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:10 crc kubenswrapper[4750]: E1008 18:13:10.637334 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 18:13:11.137312265 +0000 UTC m=+147.050283278 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2cxc5" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.733279 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-vwt7h" podStartSLOduration=127.733245586 podStartE2EDuration="2m7.733245586s" podCreationTimestamp="2025-10-08 18:11:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:13:10.69031176 +0000 UTC m=+146.603282793" watchObservedRunningTime="2025-10-08 18:13:10.733245586 +0000 UTC m=+146.646216609" Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.740254 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fsv6k" podStartSLOduration=126.740228881 podStartE2EDuration="2m6.740228881s" podCreationTimestamp="2025-10-08 18:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:13:10.7300234 +0000 UTC m=+146.642994433" watchObservedRunningTime="2025-10-08 18:13:10.740228881 +0000 UTC m=+146.653199904" Oct 08 18:13:10 crc kubenswrapper[4750]: E1008 18:13:10.745330 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-08 18:13:11.245310232 +0000 UTC m=+147.158281235 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.745361 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.745688 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:10 crc kubenswrapper[4750]: E1008 18:13:10.747953 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 18:13:11.247933544 +0000 UTC m=+147.160904557 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2cxc5" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.847965 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:10 crc kubenswrapper[4750]: E1008 18:13:10.848494 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:13:11.348473764 +0000 UTC m=+147.261444777 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.853648 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-kjc7k" podStartSLOduration=126.853633196 podStartE2EDuration="2m6.853633196s" podCreationTimestamp="2025-10-08 18:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:13:10.806330056 +0000 UTC m=+146.719301069" watchObservedRunningTime="2025-10-08 18:13:10.853633196 +0000 UTC m=+146.766604209" Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.907101 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-2dhb9" podStartSLOduration=126.907077681 podStartE2EDuration="2m6.907077681s" podCreationTimestamp="2025-10-08 18:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:13:10.900450225 +0000 UTC m=+146.813421238" watchObservedRunningTime="2025-10-08 18:13:10.907077681 +0000 UTC m=+146.820048684" Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.907843 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xz2wz" podStartSLOduration=126.907838519 podStartE2EDuration="2m6.907838519s" podCreationTimestamp="2025-10-08 18:11:04 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:13:10.865968338 +0000 UTC m=+146.778939351" watchObservedRunningTime="2025-10-08 18:13:10.907838519 +0000 UTC m=+146.820809532" Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.952112 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:10 crc kubenswrapper[4750]: E1008 18:13:10.953102 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 18:13:11.45308267 +0000 UTC m=+147.366053683 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2cxc5" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.955108 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-glbv9" podStartSLOduration=126.955085048 podStartE2EDuration="2m6.955085048s" podCreationTimestamp="2025-10-08 18:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:13:10.953174182 +0000 UTC m=+146.866145195" watchObservedRunningTime="2025-10-08 18:13:10.955085048 +0000 UTC m=+146.868056071" Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.983637 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-vwt7h" Oct 08 18:13:10 crc kubenswrapper[4750]: I1008 18:13:10.983956 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-vwt7h" Oct 08 18:13:11 crc kubenswrapper[4750]: I1008 18:13:11.014680 4750 patch_prober.go:28] interesting pod/apiserver-76f77b778f-vwt7h container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.7:8443/livez\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Oct 08 18:13:11 crc kubenswrapper[4750]: I1008 18:13:11.014746 4750 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-vwt7h" podUID="77eda81d-119c-4876-a951-e7262ed136b9" 
containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.7:8443/livez\": dial tcp 10.217.0.7:8443: connect: connection refused" Oct 08 18:13:11 crc kubenswrapper[4750]: I1008 18:13:11.054302 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:11 crc kubenswrapper[4750]: E1008 18:13:11.054811 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:13:11.554791248 +0000 UTC m=+147.467762261 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:11 crc kubenswrapper[4750]: I1008 18:13:11.110146 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29332440-n7qpc" podStartSLOduration=127.110130238 podStartE2EDuration="2m7.110130238s" podCreationTimestamp="2025-10-08 18:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:13:11.108228713 +0000 UTC m=+147.021199726" watchObservedRunningTime="2025-10-08 18:13:11.110130238 +0000 UTC 
m=+147.023101251" Oct 08 18:13:11 crc kubenswrapper[4750]: I1008 18:13:11.111518 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wrzwf" podStartSLOduration=127.11151249 podStartE2EDuration="2m7.11151249s" podCreationTimestamp="2025-10-08 18:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:13:11.033968825 +0000 UTC m=+146.946939858" watchObservedRunningTime="2025-10-08 18:13:11.11151249 +0000 UTC m=+147.024483503" Oct 08 18:13:11 crc kubenswrapper[4750]: I1008 18:13:11.156030 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:11 crc kubenswrapper[4750]: E1008 18:13:11.156465 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 18:13:11.656449744 +0000 UTC m=+147.569420757 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2cxc5" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:11 crc kubenswrapper[4750]: I1008 18:13:11.191470 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-xj8d6" podStartSLOduration=127.191440523 podStartE2EDuration="2m7.191440523s" podCreationTimestamp="2025-10-08 18:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:13:11.166489683 +0000 UTC m=+147.079460696" watchObservedRunningTime="2025-10-08 18:13:11.191440523 +0000 UTC m=+147.104411536" Oct 08 18:13:11 crc kubenswrapper[4750]: I1008 18:13:11.192463 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dkkbr" podStartSLOduration=127.192457067 podStartE2EDuration="2m7.192457067s" podCreationTimestamp="2025-10-08 18:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:13:11.19043883 +0000 UTC m=+147.103409843" watchObservedRunningTime="2025-10-08 18:13:11.192457067 +0000 UTC m=+147.105428100" Oct 08 18:13:11 crc kubenswrapper[4750]: I1008 18:13:11.261376 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:11 crc kubenswrapper[4750]: E1008 18:13:11.261998 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:13:11.761968643 +0000 UTC m=+147.674939656 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:11 crc kubenswrapper[4750]: I1008 18:13:11.262378 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:11 crc kubenswrapper[4750]: E1008 18:13:11.262859 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 18:13:11.762844313 +0000 UTC m=+147.675815326 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2cxc5" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:11 crc kubenswrapper[4750]: I1008 18:13:11.289577 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hsr6t" podStartSLOduration=127.289541556 podStartE2EDuration="2m7.289541556s" podCreationTimestamp="2025-10-08 18:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:13:11.221259099 +0000 UTC m=+147.134230112" watchObservedRunningTime="2025-10-08 18:13:11.289541556 +0000 UTC m=+147.202512569" Oct 08 18:13:11 crc kubenswrapper[4750]: I1008 18:13:11.339875 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-hg6dd" podStartSLOduration=127.339858186 podStartE2EDuration="2m7.339858186s" podCreationTimestamp="2025-10-08 18:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:13:11.289260868 +0000 UTC m=+147.202231871" watchObservedRunningTime="2025-10-08 18:13:11.339858186 +0000 UTC m=+147.252829199" Oct 08 18:13:11 crc kubenswrapper[4750]: I1008 18:13:11.340756 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-wtkzp" podStartSLOduration=8.340750368 podStartE2EDuration="8.340750368s" podCreationTimestamp="2025-10-08 18:13:03 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:13:11.339116479 +0000 UTC m=+147.252087512" watchObservedRunningTime="2025-10-08 18:13:11.340750368 +0000 UTC m=+147.253721381" Oct 08 18:13:11 crc kubenswrapper[4750]: I1008 18:13:11.350294 4750 patch_prober.go:28] interesting pod/router-default-5444994796-tdftm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 18:13:11 crc kubenswrapper[4750]: [-]has-synced failed: reason withheld Oct 08 18:13:11 crc kubenswrapper[4750]: [+]process-running ok Oct 08 18:13:11 crc kubenswrapper[4750]: healthz check failed Oct 08 18:13:11 crc kubenswrapper[4750]: I1008 18:13:11.350366 4750 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tdftm" podUID="08bc2767-2a02-4bc1-b3cd-47670db11792" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 18:13:11 crc kubenswrapper[4750]: I1008 18:13:11.364275 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:11 crc kubenswrapper[4750]: E1008 18:13:11.364433 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:13:11.864408198 +0000 UTC m=+147.777379211 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:11 crc kubenswrapper[4750]: I1008 18:13:11.364583 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:11 crc kubenswrapper[4750]: E1008 18:13:11.364897 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 18:13:11.864889429 +0000 UTC m=+147.777860442 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2cxc5" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:11 crc kubenswrapper[4750]: I1008 18:13:11.463938 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5ff9w" podStartSLOduration=127.463918463 podStartE2EDuration="2m7.463918463s" podCreationTimestamp="2025-10-08 18:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:13:11.463245218 +0000 UTC m=+147.376216221" watchObservedRunningTime="2025-10-08 18:13:11.463918463 +0000 UTC m=+147.376889476" Oct 08 18:13:11 crc kubenswrapper[4750]: I1008 18:13:11.465295 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rdtt7" podStartSLOduration=127.465288826 podStartE2EDuration="2m7.465288826s" podCreationTimestamp="2025-10-08 18:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:13:11.436900924 +0000 UTC m=+147.349871947" watchObservedRunningTime="2025-10-08 18:13:11.465288826 +0000 UTC m=+147.378259839" Oct 08 18:13:11 crc kubenswrapper[4750]: I1008 18:13:11.466021 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:11 crc kubenswrapper[4750]: E1008 18:13:11.466209 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:13:11.966180757 +0000 UTC m=+147.879151770 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:11 crc kubenswrapper[4750]: I1008 18:13:11.466491 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:11 crc kubenswrapper[4750]: E1008 18:13:11.466966 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 18:13:11.966948345 +0000 UTC m=+147.879919368 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2cxc5" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:11 crc kubenswrapper[4750]: I1008 18:13:11.567444 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:11 crc kubenswrapper[4750]: E1008 18:13:11.567619 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:13:12.067595108 +0000 UTC m=+147.980566121 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:11 crc kubenswrapper[4750]: I1008 18:13:11.567789 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:11 crc kubenswrapper[4750]: E1008 18:13:11.568145 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 18:13:12.068137421 +0000 UTC m=+147.981108434 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2cxc5" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:11 crc kubenswrapper[4750]: I1008 18:13:11.590241 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hzx59" Oct 08 18:13:11 crc kubenswrapper[4750]: I1008 18:13:11.639180 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-n87fc" event={"ID":"2b48fe6f-87c9-43fc-aa46-4c2fa7e62fe8","Type":"ContainerStarted","Data":"9c418eae1817349d997e9d83f7b846d1229b1b1e77e20b1a8c51058ea5b73582"} Oct 08 18:13:11 crc kubenswrapper[4750]: I1008 18:13:11.641877 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5g5nc" event={"ID":"8f878d47-ed9f-4f96-a7c0-153c58ff33cf","Type":"ContainerStarted","Data":"032d0ce26a5390cfe7b9e4c3442eb570c4dc9dfcea7c4559aaa347c9f39707ee"} Oct 08 18:13:11 crc kubenswrapper[4750]: I1008 18:13:11.644941 4750 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-hg6dd container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Oct 08 18:13:11 crc kubenswrapper[4750]: I1008 18:13:11.644993 4750 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-hg6dd" podUID="884c29cd-b721-400c-b319-510f191f02dd" containerName="marketplace-operator" probeResult="failure" output="Get 
\"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" Oct 08 18:13:11 crc kubenswrapper[4750]: I1008 18:13:11.671286 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:11 crc kubenswrapper[4750]: E1008 18:13:11.671725 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:13:12.171705802 +0000 UTC m=+148.084676825 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:11 crc kubenswrapper[4750]: I1008 18:13:11.693317 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5g5nc" podStartSLOduration=127.693297354 podStartE2EDuration="2m7.693297354s" podCreationTimestamp="2025-10-08 18:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:13:11.692343331 +0000 UTC m=+147.605314364" watchObservedRunningTime="2025-10-08 18:13:11.693297354 +0000 UTC m=+147.606268387" Oct 08 18:13:11 crc kubenswrapper[4750]: I1008 18:13:11.773307 4750 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:11 crc kubenswrapper[4750]: E1008 18:13:11.775251 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 18:13:12.275230003 +0000 UTC m=+148.188201206 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2cxc5" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:11 crc kubenswrapper[4750]: I1008 18:13:11.874328 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:11 crc kubenswrapper[4750]: E1008 18:13:11.874984 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:13:12.374967815 +0000 UTC m=+148.287938828 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:11 crc kubenswrapper[4750]: I1008 18:13:11.976675 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:11 crc kubenswrapper[4750]: E1008 18:13:11.977234 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 18:13:12.477208145 +0000 UTC m=+148.390179158 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2cxc5" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.078711 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:12 crc kubenswrapper[4750]: E1008 18:13:12.079135 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:13:12.579113977 +0000 UTC m=+148.492084990 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.180891 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:12 crc kubenswrapper[4750]: E1008 18:13:12.181339 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 18:13:12.681325566 +0000 UTC m=+148.594296579 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2cxc5" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.282448 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:12 crc kubenswrapper[4750]: E1008 18:13:12.283022 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:13:12.783000494 +0000 UTC m=+148.695971507 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.345283 4750 patch_prober.go:28] interesting pod/router-default-5444994796-tdftm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 18:13:12 crc kubenswrapper[4750]: [-]has-synced failed: reason withheld Oct 08 18:13:12 crc kubenswrapper[4750]: [+]process-running ok Oct 08 18:13:12 crc kubenswrapper[4750]: healthz check failed Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.345359 4750 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tdftm" podUID="08bc2767-2a02-4bc1-b3cd-47670db11792" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.385001 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:12 crc kubenswrapper[4750]: E1008 18:13:12.385637 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-08 18:13:12.885591932 +0000 UTC m=+148.798562945 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2cxc5" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.486014 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:12 crc kubenswrapper[4750]: E1008 18:13:12.486293 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:13:12.986244025 +0000 UTC m=+148.899215038 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.486520 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:12 crc kubenswrapper[4750]: E1008 18:13:12.486844 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 18:13:12.986828879 +0000 UTC m=+148.899799892 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2cxc5" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.519785 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mhkml"] Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.521026 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mhkml" Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.523360 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.529229 4750 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.544025 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mhkml"] Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.587999 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:12 crc kubenswrapper[4750]: E1008 18:13:12.588278 4750 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:13:13.088240369 +0000 UTC m=+149.001211382 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.588646 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.588702 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.588755 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.588810 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:12 crc kubenswrapper[4750]: E1008 18:13:12.589268 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 18:13:13.089257914 +0000 UTC m=+149.002228927 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2cxc5" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.592301 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.598672 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.599258 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.642920 4750 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-glbv9 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.642994 4750 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-glbv9" podUID="e522b21c-5710-4f2e-bff5-b76226c88d2f" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.649035 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-n87fc" event={"ID":"2b48fe6f-87c9-43fc-aa46-4c2fa7e62fe8","Type":"ContainerStarted","Data":"950d2aeccf741070c13f1cca54fdfba305f9d3d30fbbb94a64fede7f2a064c27"} Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.649084 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="hostpath-provisioner/csi-hostpathplugin-n87fc" event={"ID":"2b48fe6f-87c9-43fc-aa46-4c2fa7e62fe8","Type":"ContainerStarted","Data":"cacaa885214ca6fcad05bb473ef0a4ff73314f1cb8e96843c7ab8feedf0177b7"} Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.656837 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-hg6dd" Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.689839 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:12 crc kubenswrapper[4750]: E1008 18:13:12.690069 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:13:13.190028509 +0000 UTC m=+149.102999522 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.690149 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94fsm\" (UniqueName: \"kubernetes.io/projected/158d74a9-bbc8-4cd9-9507-eb477aa3a5a9-kube-api-access-94fsm\") pod \"community-operators-mhkml\" (UID: \"158d74a9-bbc8-4cd9-9507-eb477aa3a5a9\") " pod="openshift-marketplace/community-operators-mhkml" Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.690395 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.690448 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/158d74a9-bbc8-4cd9-9507-eb477aa3a5a9-utilities\") pod \"community-operators-mhkml\" (UID: \"158d74a9-bbc8-4cd9-9507-eb477aa3a5a9\") " pod="openshift-marketplace/community-operators-mhkml" Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.690528 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.690724 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/158d74a9-bbc8-4cd9-9507-eb477aa3a5a9-catalog-content\") pod \"community-operators-mhkml\" (UID: \"158d74a9-bbc8-4cd9-9507-eb477aa3a5a9\") " pod="openshift-marketplace/community-operators-mhkml" Oct 08 18:13:12 crc kubenswrapper[4750]: E1008 18:13:12.690882 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 18:13:13.190863089 +0000 UTC m=+149.103834102 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2cxc5" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.710125 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.714702 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-n87fc" podStartSLOduration=9.714677972 podStartE2EDuration="9.714677972s" podCreationTimestamp="2025-10-08 18:13:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:13:12.697960077 +0000 UTC m=+148.610931100" watchObservedRunningTime="2025-10-08 18:13:12.714677972 +0000 UTC m=+148.627648985" Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.716410 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-d966q"] Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.717850 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d966q" Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.720682 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.740961 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d966q"] Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.746663 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.761787 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.792431 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.792816 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/158d74a9-bbc8-4cd9-9507-eb477aa3a5a9-utilities\") pod \"community-operators-mhkml\" (UID: \"158d74a9-bbc8-4cd9-9507-eb477aa3a5a9\") " pod="openshift-marketplace/community-operators-mhkml" Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.792942 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/158d74a9-bbc8-4cd9-9507-eb477aa3a5a9-catalog-content\") pod \"community-operators-mhkml\" (UID: \"158d74a9-bbc8-4cd9-9507-eb477aa3a5a9\") " 
pod="openshift-marketplace/community-operators-mhkml" Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.793010 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a3b8246-69d2-405e-a414-c21b9cb3b31d-catalog-content\") pod \"certified-operators-d966q\" (UID: \"7a3b8246-69d2-405e-a414-c21b9cb3b31d\") " pod="openshift-marketplace/certified-operators-d966q" Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.793144 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqsl6\" (UniqueName: \"kubernetes.io/projected/7a3b8246-69d2-405e-a414-c21b9cb3b31d-kube-api-access-zqsl6\") pod \"certified-operators-d966q\" (UID: \"7a3b8246-69d2-405e-a414-c21b9cb3b31d\") " pod="openshift-marketplace/certified-operators-d966q" Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.793214 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a3b8246-69d2-405e-a414-c21b9cb3b31d-utilities\") pod \"certified-operators-d966q\" (UID: \"7a3b8246-69d2-405e-a414-c21b9cb3b31d\") " pod="openshift-marketplace/certified-operators-d966q" Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.793292 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94fsm\" (UniqueName: \"kubernetes.io/projected/158d74a9-bbc8-4cd9-9507-eb477aa3a5a9-kube-api-access-94fsm\") pod \"community-operators-mhkml\" (UID: \"158d74a9-bbc8-4cd9-9507-eb477aa3a5a9\") " pod="openshift-marketplace/community-operators-mhkml" Oct 08 18:13:12 crc kubenswrapper[4750]: E1008 18:13:12.794110 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-08 18:13:13.294091113 +0000 UTC m=+149.207062126 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.794540 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/158d74a9-bbc8-4cd9-9507-eb477aa3a5a9-utilities\") pod \"community-operators-mhkml\" (UID: \"158d74a9-bbc8-4cd9-9507-eb477aa3a5a9\") " pod="openshift-marketplace/community-operators-mhkml" Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.796318 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/158d74a9-bbc8-4cd9-9507-eb477aa3a5a9-catalog-content\") pod \"community-operators-mhkml\" (UID: \"158d74a9-bbc8-4cd9-9507-eb477aa3a5a9\") " pod="openshift-marketplace/community-operators-mhkml" Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.819717 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94fsm\" (UniqueName: \"kubernetes.io/projected/158d74a9-bbc8-4cd9-9507-eb477aa3a5a9-kube-api-access-94fsm\") pod \"community-operators-mhkml\" (UID: \"158d74a9-bbc8-4cd9-9507-eb477aa3a5a9\") " pod="openshift-marketplace/community-operators-mhkml" Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.833340 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mhkml" Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.847750 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.894415 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqsl6\" (UniqueName: \"kubernetes.io/projected/7a3b8246-69d2-405e-a414-c21b9cb3b31d-kube-api-access-zqsl6\") pod \"certified-operators-d966q\" (UID: \"7a3b8246-69d2-405e-a414-c21b9cb3b31d\") " pod="openshift-marketplace/certified-operators-d966q" Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.894467 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a3b8246-69d2-405e-a414-c21b9cb3b31d-utilities\") pod \"certified-operators-d966q\" (UID: \"7a3b8246-69d2-405e-a414-c21b9cb3b31d\") " pod="openshift-marketplace/certified-operators-d966q" Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.894522 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.894577 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a3b8246-69d2-405e-a414-c21b9cb3b31d-catalog-content\") pod \"certified-operators-d966q\" (UID: \"7a3b8246-69d2-405e-a414-c21b9cb3b31d\") " pod="openshift-marketplace/certified-operators-d966q" Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.894988 4750 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a3b8246-69d2-405e-a414-c21b9cb3b31d-catalog-content\") pod \"certified-operators-d966q\" (UID: \"7a3b8246-69d2-405e-a414-c21b9cb3b31d\") " pod="openshift-marketplace/certified-operators-d966q" Oct 08 18:13:12 crc kubenswrapper[4750]: E1008 18:13:12.895007 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 18:13:13.394991321 +0000 UTC m=+149.307962334 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2cxc5" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.895071 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a3b8246-69d2-405e-a414-c21b9cb3b31d-utilities\") pod \"certified-operators-d966q\" (UID: \"7a3b8246-69d2-405e-a414-c21b9cb3b31d\") " pod="openshift-marketplace/certified-operators-d966q" Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.901144 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-g59fz"] Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.902347 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g59fz" Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.962918 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g59fz"] Oct 08 18:13:12 crc kubenswrapper[4750]: I1008 18:13:12.963239 4750 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-08T18:13:12.529256843Z","Handler":null,"Name":""} Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.018368 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.018868 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e457a3fa-6164-4127-8a2d-ea7ea42e6da4-catalog-content\") pod \"community-operators-g59fz\" (UID: \"e457a3fa-6164-4127-8a2d-ea7ea42e6da4\") " pod="openshift-marketplace/community-operators-g59fz" Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.018924 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdlmg\" (UniqueName: \"kubernetes.io/projected/e457a3fa-6164-4127-8a2d-ea7ea42e6da4-kube-api-access-pdlmg\") pod \"community-operators-g59fz\" (UID: \"e457a3fa-6164-4127-8a2d-ea7ea42e6da4\") " pod="openshift-marketplace/community-operators-g59fz" Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.019067 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e457a3fa-6164-4127-8a2d-ea7ea42e6da4-utilities\") pod \"community-operators-g59fz\" (UID: \"e457a3fa-6164-4127-8a2d-ea7ea42e6da4\") " pod="openshift-marketplace/community-operators-g59fz" Oct 08 18:13:13 crc kubenswrapper[4750]: E1008 18:13:13.019251 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 18:13:13.519223872 +0000 UTC m=+149.432194885 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.020438 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqsl6\" (UniqueName: \"kubernetes.io/projected/7a3b8246-69d2-405e-a414-c21b9cb3b31d-kube-api-access-zqsl6\") pod \"certified-operators-d966q\" (UID: \"7a3b8246-69d2-405e-a414-c21b9cb3b31d\") " pod="openshift-marketplace/certified-operators-d966q" Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.035306 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d966q" Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.072042 4750 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.072081 4750 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.118103 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sxt5g"] Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.122283 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e457a3fa-6164-4127-8a2d-ea7ea42e6da4-utilities\") pod \"community-operators-g59fz\" (UID: \"e457a3fa-6164-4127-8a2d-ea7ea42e6da4\") " pod="openshift-marketplace/community-operators-g59fz" Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.122310 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e457a3fa-6164-4127-8a2d-ea7ea42e6da4-catalog-content\") pod \"community-operators-g59fz\" (UID: \"e457a3fa-6164-4127-8a2d-ea7ea42e6da4\") " pod="openshift-marketplace/community-operators-g59fz" Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.122337 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdlmg\" (UniqueName: \"kubernetes.io/projected/e457a3fa-6164-4127-8a2d-ea7ea42e6da4-kube-api-access-pdlmg\") pod \"community-operators-g59fz\" (UID: \"e457a3fa-6164-4127-8a2d-ea7ea42e6da4\") " pod="openshift-marketplace/community-operators-g59fz" Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.122373 4750 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.122993 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e457a3fa-6164-4127-8a2d-ea7ea42e6da4-utilities\") pod \"community-operators-g59fz\" (UID: \"e457a3fa-6164-4127-8a2d-ea7ea42e6da4\") " pod="openshift-marketplace/community-operators-g59fz" Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.123206 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e457a3fa-6164-4127-8a2d-ea7ea42e6da4-catalog-content\") pod \"community-operators-g59fz\" (UID: \"e457a3fa-6164-4127-8a2d-ea7ea42e6da4\") " pod="openshift-marketplace/community-operators-g59fz" Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.124066 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sxt5g" Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.158041 4750 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.158083 4750 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.162725 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdlmg\" (UniqueName: \"kubernetes.io/projected/e457a3fa-6164-4127-8a2d-ea7ea42e6da4-kube-api-access-pdlmg\") pod \"community-operators-g59fz\" (UID: \"e457a3fa-6164-4127-8a2d-ea7ea42e6da4\") " pod="openshift-marketplace/community-operators-g59fz" Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.189171 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sxt5g"] Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.225305 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dd4fe2f-ea7e-4a82-b868-375ec700de66-catalog-content\") pod \"certified-operators-sxt5g\" (UID: \"6dd4fe2f-ea7e-4a82-b868-375ec700de66\") " pod="openshift-marketplace/certified-operators-sxt5g" Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.225373 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vfrd\" (UniqueName: \"kubernetes.io/projected/6dd4fe2f-ea7e-4a82-b868-375ec700de66-kube-api-access-2vfrd\") pod \"certified-operators-sxt5g\" (UID: \"6dd4fe2f-ea7e-4a82-b868-375ec700de66\") " 
pod="openshift-marketplace/certified-operators-sxt5g" Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.225446 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dd4fe2f-ea7e-4a82-b868-375ec700de66-utilities\") pod \"certified-operators-sxt5g\" (UID: \"6dd4fe2f-ea7e-4a82-b868-375ec700de66\") " pod="openshift-marketplace/certified-operators-sxt5g" Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.226973 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g59fz" Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.244879 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2cxc5\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.266485 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.267264 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.272482 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.272659 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.280839 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.326058 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.326280 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dd4fe2f-ea7e-4a82-b868-375ec700de66-catalog-content\") pod \"certified-operators-sxt5g\" (UID: \"6dd4fe2f-ea7e-4a82-b868-375ec700de66\") " pod="openshift-marketplace/certified-operators-sxt5g" Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.326311 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vfrd\" (UniqueName: \"kubernetes.io/projected/6dd4fe2f-ea7e-4a82-b868-375ec700de66-kube-api-access-2vfrd\") pod \"certified-operators-sxt5g\" (UID: \"6dd4fe2f-ea7e-4a82-b868-375ec700de66\") " pod="openshift-marketplace/certified-operators-sxt5g" Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.326357 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dd4fe2f-ea7e-4a82-b868-375ec700de66-utilities\") pod \"certified-operators-sxt5g\" (UID: \"6dd4fe2f-ea7e-4a82-b868-375ec700de66\") " pod="openshift-marketplace/certified-operators-sxt5g" Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.326773 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dd4fe2f-ea7e-4a82-b868-375ec700de66-utilities\") pod \"certified-operators-sxt5g\" (UID: \"6dd4fe2f-ea7e-4a82-b868-375ec700de66\") " pod="openshift-marketplace/certified-operators-sxt5g" Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.327071 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dd4fe2f-ea7e-4a82-b868-375ec700de66-catalog-content\") pod \"certified-operators-sxt5g\" (UID: \"6dd4fe2f-ea7e-4a82-b868-375ec700de66\") " pod="openshift-marketplace/certified-operators-sxt5g" Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.349927 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vfrd\" (UniqueName: \"kubernetes.io/projected/6dd4fe2f-ea7e-4a82-b868-375ec700de66-kube-api-access-2vfrd\") pod \"certified-operators-sxt5g\" (UID: \"6dd4fe2f-ea7e-4a82-b868-375ec700de66\") " pod="openshift-marketplace/certified-operators-sxt5g" Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.350413 4750 patch_prober.go:28] interesting pod/router-default-5444994796-tdftm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 18:13:13 crc kubenswrapper[4750]: [-]has-synced failed: reason withheld Oct 08 18:13:13 crc kubenswrapper[4750]: [+]process-running ok Oct 08 18:13:13 crc kubenswrapper[4750]: healthz check failed Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.350467 4750 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tdftm" podUID="08bc2767-2a02-4bc1-b3cd-47670db11792" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.369900 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.422039 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.430010 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/348584fd-e515-4d6b-8476-fdfe7cb562a4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"348584fd-e515-4d6b-8476-fdfe7cb562a4\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.430053 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/348584fd-e515-4d6b-8476-fdfe7cb562a4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"348584fd-e515-4d6b-8476-fdfe7cb562a4\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.466644 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sxt5g" Oct 08 18:13:13 crc kubenswrapper[4750]: W1008 18:13:13.480919 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-aae647bf6cf863c00b943e7cd7177d187d93812f97d7330716d427ac39d4cf8b WatchSource:0}: Error finding container aae647bf6cf863c00b943e7cd7177d187d93812f97d7330716d427ac39d4cf8b: Status 404 returned error can't find the container with id aae647bf6cf863c00b943e7cd7177d187d93812f97d7330716d427ac39d4cf8b Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.507233 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d966q"] Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.531300 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/348584fd-e515-4d6b-8476-fdfe7cb562a4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"348584fd-e515-4d6b-8476-fdfe7cb562a4\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.531346 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/348584fd-e515-4d6b-8476-fdfe7cb562a4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"348584fd-e515-4d6b-8476-fdfe7cb562a4\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.531419 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/348584fd-e515-4d6b-8476-fdfe7cb562a4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"348584fd-e515-4d6b-8476-fdfe7cb562a4\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 08 18:13:13 crc 
kubenswrapper[4750]: W1008 18:13:13.536672 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a3b8246_69d2_405e_a414_c21b9cb3b31d.slice/crio-35a8a4afb9645c4fafa3eb57d19a93c0da01582b4b80f7bcab62d45b5cec5f6c WatchSource:0}: Error finding container 35a8a4afb9645c4fafa3eb57d19a93c0da01582b4b80f7bcab62d45b5cec5f6c: Status 404 returned error can't find the container with id 35a8a4afb9645c4fafa3eb57d19a93c0da01582b4b80f7bcab62d45b5cec5f6c Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.569774 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/348584fd-e515-4d6b-8476-fdfe7cb562a4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"348584fd-e515-4d6b-8476-fdfe7cb562a4\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.571908 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mhkml"] Oct 08 18:13:13 crc kubenswrapper[4750]: W1008 18:13:13.593366 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod158d74a9_bbc8_4cd9_9507_eb477aa3a5a9.slice/crio-6f31845f1acc1c240b1c1b1751759a7a20093283610b710cfcf724a4243197dd WatchSource:0}: Error finding container 6f31845f1acc1c240b1c1b1751759a7a20093283610b710cfcf724a4243197dd: Status 404 returned error can't find the container with id 6f31845f1acc1c240b1c1b1751759a7a20093283610b710cfcf724a4243197dd Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.607694 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.661113 4750 generic.go:334] "Generic (PLEG): container finished" podID="4227b4db-94b2-4e5f-a231-aa0b6bbe685c" containerID="6b18ab44745452e81dd04f4fa6f42e21f155140ec9d7b446a428b84b4d6af79e" exitCode=0 Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.661218 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332440-n7qpc" event={"ID":"4227b4db-94b2-4e5f-a231-aa0b6bbe685c","Type":"ContainerDied","Data":"6b18ab44745452e81dd04f4fa6f42e21f155140ec9d7b446a428b84b4d6af79e"} Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.670653 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mhkml" event={"ID":"158d74a9-bbc8-4cd9-9507-eb477aa3a5a9","Type":"ContainerStarted","Data":"6f31845f1acc1c240b1c1b1751759a7a20093283610b710cfcf724a4243197dd"} Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.672206 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d966q" event={"ID":"7a3b8246-69d2-405e-a414-c21b9cb3b31d","Type":"ContainerStarted","Data":"35a8a4afb9645c4fafa3eb57d19a93c0da01582b4b80f7bcab62d45b5cec5f6c"} Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.674013 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"aae647bf6cf863c00b943e7cd7177d187d93812f97d7330716d427ac39d4cf8b"} Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.676852 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"624c03a40e2ad17bd4c66ccfd456d71ccdcfdd36f409c30cb72822b24111b0ba"} Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.676888 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"2be982ce14ce0ab9d1ea15d6d817f9ea2a0a703dfec1b2bd3ec359d6fd9841c6"} Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.678991 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"f37e3afb2070abad6f80e979444ea46410ed0c788e85c06853b67922a0ced484"} Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.850751 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g59fz"] Oct 08 18:13:13 crc kubenswrapper[4750]: I1008 18:13:13.993035 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 08 18:13:14 crc kubenswrapper[4750]: I1008 18:13:14.052856 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sxt5g"] Oct 08 18:13:14 crc kubenswrapper[4750]: I1008 18:13:14.059966 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2cxc5"] Oct 08 18:13:14 crc kubenswrapper[4750]: W1008 18:13:14.157482 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2508573_4890_48b6_9119_93560ee4c5d9.slice/crio-57364b7dacfb9035c397438ead6b5118634ce30aa73e9551175634af65b84589 WatchSource:0}: Error finding container 57364b7dacfb9035c397438ead6b5118634ce30aa73e9551175634af65b84589: Status 404 returned error can't 
find the container with id 57364b7dacfb9035c397438ead6b5118634ce30aa73e9551175634af65b84589 Oct 08 18:13:14 crc kubenswrapper[4750]: W1008 18:13:14.157775 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6dd4fe2f_ea7e_4a82_b868_375ec700de66.slice/crio-f80e2f9cf28bf5b48fbbb5c410b8a7ff5996c138999224c09f89725cf262e1f2 WatchSource:0}: Error finding container f80e2f9cf28bf5b48fbbb5c410b8a7ff5996c138999224c09f89725cf262e1f2: Status 404 returned error can't find the container with id f80e2f9cf28bf5b48fbbb5c410b8a7ff5996c138999224c09f89725cf262e1f2 Oct 08 18:13:14 crc kubenswrapper[4750]: I1008 18:13:14.345976 4750 patch_prober.go:28] interesting pod/router-default-5444994796-tdftm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 18:13:14 crc kubenswrapper[4750]: [-]has-synced failed: reason withheld Oct 08 18:13:14 crc kubenswrapper[4750]: [+]process-running ok Oct 08 18:13:14 crc kubenswrapper[4750]: healthz check failed Oct 08 18:13:14 crc kubenswrapper[4750]: I1008 18:13:14.346384 4750 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tdftm" podUID="08bc2767-2a02-4bc1-b3cd-47670db11792" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 18:13:14 crc kubenswrapper[4750]: I1008 18:13:14.498196 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gpm4k"] Oct 08 18:13:14 crc kubenswrapper[4750]: I1008 18:13:14.499509 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gpm4k" Oct 08 18:13:14 crc kubenswrapper[4750]: I1008 18:13:14.501428 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 08 18:13:14 crc kubenswrapper[4750]: I1008 18:13:14.550825 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gpm4k"] Oct 08 18:13:14 crc kubenswrapper[4750]: I1008 18:13:14.662514 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57f5ce4d-cf29-4d65-aa17-f23c042a2602-utilities\") pod \"redhat-marketplace-gpm4k\" (UID: \"57f5ce4d-cf29-4d65-aa17-f23c042a2602\") " pod="openshift-marketplace/redhat-marketplace-gpm4k" Oct 08 18:13:14 crc kubenswrapper[4750]: I1008 18:13:14.662584 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psxtp\" (UniqueName: \"kubernetes.io/projected/57f5ce4d-cf29-4d65-aa17-f23c042a2602-kube-api-access-psxtp\") pod \"redhat-marketplace-gpm4k\" (UID: \"57f5ce4d-cf29-4d65-aa17-f23c042a2602\") " pod="openshift-marketplace/redhat-marketplace-gpm4k" Oct 08 18:13:14 crc kubenswrapper[4750]: I1008 18:13:14.662640 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57f5ce4d-cf29-4d65-aa17-f23c042a2602-catalog-content\") pod \"redhat-marketplace-gpm4k\" (UID: \"57f5ce4d-cf29-4d65-aa17-f23c042a2602\") " pod="openshift-marketplace/redhat-marketplace-gpm4k" Oct 08 18:13:14 crc kubenswrapper[4750]: I1008 18:13:14.686981 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"348584fd-e515-4d6b-8476-fdfe7cb562a4","Type":"ContainerStarted","Data":"7dafbe2bca0d29e463a6261ef1191c9fc8d3a3d01f491ed732a1fcf25f6c8ddd"} Oct 08 18:13:14 crc kubenswrapper[4750]: I1008 18:13:14.687031 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"348584fd-e515-4d6b-8476-fdfe7cb562a4","Type":"ContainerStarted","Data":"5bf52f296732e94606f002250e60917fe6b67f21360753d58e70159bedc325a7"} Oct 08 18:13:14 crc kubenswrapper[4750]: I1008 18:13:14.688785 4750 generic.go:334] "Generic (PLEG): container finished" podID="158d74a9-bbc8-4cd9-9507-eb477aa3a5a9" containerID="26053d35d305487354ca5045e53912c890b03f9eaa942522bbdc38571da40cdb" exitCode=0 Oct 08 18:13:14 crc kubenswrapper[4750]: I1008 18:13:14.688840 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mhkml" event={"ID":"158d74a9-bbc8-4cd9-9507-eb477aa3a5a9","Type":"ContainerDied","Data":"26053d35d305487354ca5045e53912c890b03f9eaa942522bbdc38571da40cdb"} Oct 08 18:13:14 crc kubenswrapper[4750]: I1008 18:13:14.690468 4750 generic.go:334] "Generic (PLEG): container finished" podID="7a3b8246-69d2-405e-a414-c21b9cb3b31d" containerID="6ce658966eefd0d2045d31d53ee58080b4196f8f721e431f9fc7772a7138bc2d" exitCode=0 Oct 08 18:13:14 crc kubenswrapper[4750]: I1008 18:13:14.690527 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d966q" event={"ID":"7a3b8246-69d2-405e-a414-c21b9cb3b31d","Type":"ContainerDied","Data":"6ce658966eefd0d2045d31d53ee58080b4196f8f721e431f9fc7772a7138bc2d"} Oct 08 18:13:14 crc kubenswrapper[4750]: I1008 18:13:14.692287 4750 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 18:13:14 crc kubenswrapper[4750]: I1008 18:13:14.693028 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"0bc426d847e5c557d74f6121ea1a104d820481d79642584ae007735404621235"} Oct 08 18:13:14 crc kubenswrapper[4750]: I1008 18:13:14.693604 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:13:14 crc kubenswrapper[4750]: I1008 18:13:14.695505 4750 generic.go:334] "Generic (PLEG): container finished" podID="6dd4fe2f-ea7e-4a82-b868-375ec700de66" containerID="12dd6680b9f0fd3260ebb76d86a40d6bee6eacf2778877f022d2be8cc9289060" exitCode=0 Oct 08 18:13:14 crc kubenswrapper[4750]: I1008 18:13:14.695581 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxt5g" event={"ID":"6dd4fe2f-ea7e-4a82-b868-375ec700de66","Type":"ContainerDied","Data":"12dd6680b9f0fd3260ebb76d86a40d6bee6eacf2778877f022d2be8cc9289060"} Oct 08 18:13:14 crc kubenswrapper[4750]: I1008 18:13:14.695640 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxt5g" event={"ID":"6dd4fe2f-ea7e-4a82-b868-375ec700de66","Type":"ContainerStarted","Data":"f80e2f9cf28bf5b48fbbb5c410b8a7ff5996c138999224c09f89725cf262e1f2"} Oct 08 18:13:14 crc kubenswrapper[4750]: I1008 18:13:14.697382 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" event={"ID":"e2508573-4890-48b6-9119-93560ee4c5d9","Type":"ContainerStarted","Data":"e3d9f8f9b1fb9e1fa49bbb7b7d63a207e853509d90d0de48fa49cf9b5ec305ae"} Oct 08 18:13:14 crc kubenswrapper[4750]: I1008 18:13:14.697416 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" event={"ID":"e2508573-4890-48b6-9119-93560ee4c5d9","Type":"ContainerStarted","Data":"57364b7dacfb9035c397438ead6b5118634ce30aa73e9551175634af65b84589"} Oct 08 18:13:14 crc kubenswrapper[4750]: I1008 18:13:14.697507 4750 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:14 crc kubenswrapper[4750]: I1008 18:13:14.698875 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"8777b4872383608c46102bcc9edca92006a6b2f6842f0eb2cc9a3d08caa9e6ab"} Oct 08 18:13:14 crc kubenswrapper[4750]: I1008 18:13:14.700903 4750 generic.go:334] "Generic (PLEG): container finished" podID="e457a3fa-6164-4127-8a2d-ea7ea42e6da4" containerID="f49bfe1ee334121dcaa235a7ebe8e647114295d35b897a74c12c47e539d3f6c5" exitCode=0 Oct 08 18:13:14 crc kubenswrapper[4750]: I1008 18:13:14.701060 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g59fz" event={"ID":"e457a3fa-6164-4127-8a2d-ea7ea42e6da4","Type":"ContainerDied","Data":"f49bfe1ee334121dcaa235a7ebe8e647114295d35b897a74c12c47e539d3f6c5"} Oct 08 18:13:14 crc kubenswrapper[4750]: I1008 18:13:14.701173 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g59fz" event={"ID":"e457a3fa-6164-4127-8a2d-ea7ea42e6da4","Type":"ContainerStarted","Data":"279922cee161442094ad06e5fbd001556e28e11dad9bc773e06abf9505004fd5"} Oct 08 18:13:14 crc kubenswrapper[4750]: I1008 18:13:14.730778 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=1.730758179 podStartE2EDuration="1.730758179s" podCreationTimestamp="2025-10-08 18:13:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:13:14.706429234 +0000 UTC m=+150.619400267" watchObservedRunningTime="2025-10-08 18:13:14.730758179 +0000 UTC m=+150.643729192" Oct 08 18:13:14 crc kubenswrapper[4750]: I1008 
18:13:14.747851 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 08 18:13:14 crc kubenswrapper[4750]: I1008 18:13:14.758020 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" podStartSLOduration=130.757985324 podStartE2EDuration="2m10.757985324s" podCreationTimestamp="2025-10-08 18:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:13:14.752947415 +0000 UTC m=+150.665918468" watchObservedRunningTime="2025-10-08 18:13:14.757985324 +0000 UTC m=+150.670956337" Oct 08 18:13:14 crc kubenswrapper[4750]: I1008 18:13:14.764142 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57f5ce4d-cf29-4d65-aa17-f23c042a2602-utilities\") pod \"redhat-marketplace-gpm4k\" (UID: \"57f5ce4d-cf29-4d65-aa17-f23c042a2602\") " pod="openshift-marketplace/redhat-marketplace-gpm4k" Oct 08 18:13:14 crc kubenswrapper[4750]: I1008 18:13:14.764212 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psxtp\" (UniqueName: \"kubernetes.io/projected/57f5ce4d-cf29-4d65-aa17-f23c042a2602-kube-api-access-psxtp\") pod \"redhat-marketplace-gpm4k\" (UID: \"57f5ce4d-cf29-4d65-aa17-f23c042a2602\") " pod="openshift-marketplace/redhat-marketplace-gpm4k" Oct 08 18:13:14 crc kubenswrapper[4750]: I1008 18:13:14.764244 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57f5ce4d-cf29-4d65-aa17-f23c042a2602-catalog-content\") pod \"redhat-marketplace-gpm4k\" (UID: \"57f5ce4d-cf29-4d65-aa17-f23c042a2602\") " pod="openshift-marketplace/redhat-marketplace-gpm4k" Oct 08 18:13:14 crc 
kubenswrapper[4750]: I1008 18:13:14.764673 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57f5ce4d-cf29-4d65-aa17-f23c042a2602-catalog-content\") pod \"redhat-marketplace-gpm4k\" (UID: \"57f5ce4d-cf29-4d65-aa17-f23c042a2602\") " pod="openshift-marketplace/redhat-marketplace-gpm4k" Oct 08 18:13:14 crc kubenswrapper[4750]: I1008 18:13:14.764881 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57f5ce4d-cf29-4d65-aa17-f23c042a2602-utilities\") pod \"redhat-marketplace-gpm4k\" (UID: \"57f5ce4d-cf29-4d65-aa17-f23c042a2602\") " pod="openshift-marketplace/redhat-marketplace-gpm4k" Oct 08 18:13:14 crc kubenswrapper[4750]: I1008 18:13:14.787500 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psxtp\" (UniqueName: \"kubernetes.io/projected/57f5ce4d-cf29-4d65-aa17-f23c042a2602-kube-api-access-psxtp\") pod \"redhat-marketplace-gpm4k\" (UID: \"57f5ce4d-cf29-4d65-aa17-f23c042a2602\") " pod="openshift-marketplace/redhat-marketplace-gpm4k" Oct 08 18:13:14 crc kubenswrapper[4750]: I1008 18:13:14.813016 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gpm4k" Oct 08 18:13:14 crc kubenswrapper[4750]: I1008 18:13:14.926116 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lrm8v"] Oct 08 18:13:14 crc kubenswrapper[4750]: I1008 18:13:14.928050 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lrm8v" Oct 08 18:13:14 crc kubenswrapper[4750]: I1008 18:13:14.939226 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lrm8v"] Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.055254 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332440-n7qpc" Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.073185 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9739a0d-d23a-4a49-8fe4-6257a3948210-utilities\") pod \"redhat-marketplace-lrm8v\" (UID: \"e9739a0d-d23a-4a49-8fe4-6257a3948210\") " pod="openshift-marketplace/redhat-marketplace-lrm8v" Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.073264 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq6xj\" (UniqueName: \"kubernetes.io/projected/e9739a0d-d23a-4a49-8fe4-6257a3948210-kube-api-access-hq6xj\") pod \"redhat-marketplace-lrm8v\" (UID: \"e9739a0d-d23a-4a49-8fe4-6257a3948210\") " pod="openshift-marketplace/redhat-marketplace-lrm8v" Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.073333 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9739a0d-d23a-4a49-8fe4-6257a3948210-catalog-content\") pod \"redhat-marketplace-lrm8v\" (UID: \"e9739a0d-d23a-4a49-8fe4-6257a3948210\") " pod="openshift-marketplace/redhat-marketplace-lrm8v" Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.175291 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-829vf\" (UniqueName: \"kubernetes.io/projected/4227b4db-94b2-4e5f-a231-aa0b6bbe685c-kube-api-access-829vf\") pod \"4227b4db-94b2-4e5f-a231-aa0b6bbe685c\" (UID: \"4227b4db-94b2-4e5f-a231-aa0b6bbe685c\") " Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.175709 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4227b4db-94b2-4e5f-a231-aa0b6bbe685c-config-volume\") pod 
\"4227b4db-94b2-4e5f-a231-aa0b6bbe685c\" (UID: \"4227b4db-94b2-4e5f-a231-aa0b6bbe685c\") " Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.175759 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4227b4db-94b2-4e5f-a231-aa0b6bbe685c-secret-volume\") pod \"4227b4db-94b2-4e5f-a231-aa0b6bbe685c\" (UID: \"4227b4db-94b2-4e5f-a231-aa0b6bbe685c\") " Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.175965 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9739a0d-d23a-4a49-8fe4-6257a3948210-catalog-content\") pod \"redhat-marketplace-lrm8v\" (UID: \"e9739a0d-d23a-4a49-8fe4-6257a3948210\") " pod="openshift-marketplace/redhat-marketplace-lrm8v" Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.176022 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9739a0d-d23a-4a49-8fe4-6257a3948210-utilities\") pod \"redhat-marketplace-lrm8v\" (UID: \"e9739a0d-d23a-4a49-8fe4-6257a3948210\") " pod="openshift-marketplace/redhat-marketplace-lrm8v" Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.176079 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq6xj\" (UniqueName: \"kubernetes.io/projected/e9739a0d-d23a-4a49-8fe4-6257a3948210-kube-api-access-hq6xj\") pod \"redhat-marketplace-lrm8v\" (UID: \"e9739a0d-d23a-4a49-8fe4-6257a3948210\") " pod="openshift-marketplace/redhat-marketplace-lrm8v" Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.181184 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4227b4db-94b2-4e5f-a231-aa0b6bbe685c-config-volume" (OuterVolumeSpecName: "config-volume") pod "4227b4db-94b2-4e5f-a231-aa0b6bbe685c" (UID: "4227b4db-94b2-4e5f-a231-aa0b6bbe685c"). 
InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.181648 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9739a0d-d23a-4a49-8fe4-6257a3948210-catalog-content\") pod \"redhat-marketplace-lrm8v\" (UID: \"e9739a0d-d23a-4a49-8fe4-6257a3948210\") " pod="openshift-marketplace/redhat-marketplace-lrm8v" Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.181920 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9739a0d-d23a-4a49-8fe4-6257a3948210-utilities\") pod \"redhat-marketplace-lrm8v\" (UID: \"e9739a0d-d23a-4a49-8fe4-6257a3948210\") " pod="openshift-marketplace/redhat-marketplace-lrm8v" Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.184509 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4227b4db-94b2-4e5f-a231-aa0b6bbe685c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4227b4db-94b2-4e5f-a231-aa0b6bbe685c" (UID: "4227b4db-94b2-4e5f-a231-aa0b6bbe685c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.196520 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4227b4db-94b2-4e5f-a231-aa0b6bbe685c-kube-api-access-829vf" (OuterVolumeSpecName: "kube-api-access-829vf") pod "4227b4db-94b2-4e5f-a231-aa0b6bbe685c" (UID: "4227b4db-94b2-4e5f-a231-aa0b6bbe685c"). InnerVolumeSpecName "kube-api-access-829vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.199231 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq6xj\" (UniqueName: \"kubernetes.io/projected/e9739a0d-d23a-4a49-8fe4-6257a3948210-kube-api-access-hq6xj\") pod \"redhat-marketplace-lrm8v\" (UID: \"e9739a0d-d23a-4a49-8fe4-6257a3948210\") " pod="openshift-marketplace/redhat-marketplace-lrm8v" Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.277254 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-829vf\" (UniqueName: \"kubernetes.io/projected/4227b4db-94b2-4e5f-a231-aa0b6bbe685c-kube-api-access-829vf\") on node \"crc\" DevicePath \"\"" Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.277289 4750 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4227b4db-94b2-4e5f-a231-aa0b6bbe685c-config-volume\") on node \"crc\" DevicePath \"\"" Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.277298 4750 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4227b4db-94b2-4e5f-a231-aa0b6bbe685c-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.339362 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lrm8v" Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.346013 4750 patch_prober.go:28] interesting pod/router-default-5444994796-tdftm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 18:13:15 crc kubenswrapper[4750]: [-]has-synced failed: reason withheld Oct 08 18:13:15 crc kubenswrapper[4750]: [+]process-running ok Oct 08 18:13:15 crc kubenswrapper[4750]: healthz check failed Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.346078 4750 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tdftm" podUID="08bc2767-2a02-4bc1-b3cd-47670db11792" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.387748 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gpm4k"] Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.539543 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lrm8v"] Oct 08 18:13:15 crc kubenswrapper[4750]: W1008 18:13:15.552311 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9739a0d_d23a_4a49_8fe4_6257a3948210.slice/crio-f53140f9f55fddd38f1ba7835d563807df40a7580a87e6a7d88e9dda0d3f7b65 WatchSource:0}: Error finding container f53140f9f55fddd38f1ba7835d563807df40a7580a87e6a7d88e9dda0d3f7b65: Status 404 returned error can't find the container with id f53140f9f55fddd38f1ba7835d563807df40a7580a87e6a7d88e9dda0d3f7b65 Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.585837 4750 patch_prober.go:28] interesting pod/downloads-7954f5f757-jw49l container/download-server namespace/openshift-console: Liveness probe 
status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.585885 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-jw49l" podUID="8d1619a1-f1c5-455f-814e-0cff00e053c0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.585909 4750 patch_prober.go:28] interesting pod/downloads-7954f5f757-jw49l container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.585957 4750 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jw49l" podUID="8d1619a1-f1c5-455f-814e-0cff00e053c0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.730573 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332440-n7qpc" event={"ID":"4227b4db-94b2-4e5f-a231-aa0b6bbe685c","Type":"ContainerDied","Data":"c32ad658728639322e0ec7bf39719dc9f200b613ee715ada152065440ad53293"} Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.730610 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332440-n7qpc" Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.730654 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c32ad658728639322e0ec7bf39719dc9f200b613ee715ada152065440ad53293" Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.735867 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lrm8v" event={"ID":"e9739a0d-d23a-4a49-8fe4-6257a3948210","Type":"ContainerStarted","Data":"f53140f9f55fddd38f1ba7835d563807df40a7580a87e6a7d88e9dda0d3f7b65"} Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.741695 4750 generic.go:334] "Generic (PLEG): container finished" podID="348584fd-e515-4d6b-8476-fdfe7cb562a4" containerID="7dafbe2bca0d29e463a6261ef1191c9fc8d3a3d01f491ed732a1fcf25f6c8ddd" exitCode=0 Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.741778 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"348584fd-e515-4d6b-8476-fdfe7cb562a4","Type":"ContainerDied","Data":"7dafbe2bca0d29e463a6261ef1191c9fc8d3a3d01f491ed732a1fcf25f6c8ddd"} Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.746695 4750 generic.go:334] "Generic (PLEG): container finished" podID="57f5ce4d-cf29-4d65-aa17-f23c042a2602" containerID="e11ce50ba6baf9c810ca6caf44f8ab624bdb32cd43e2a1ee059081a5dc44f6c2" exitCode=0 Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.746787 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gpm4k" event={"ID":"57f5ce4d-cf29-4d65-aa17-f23c042a2602","Type":"ContainerDied","Data":"e11ce50ba6baf9c810ca6caf44f8ab624bdb32cd43e2a1ee059081a5dc44f6c2"} Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.746835 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gpm4k" 
event={"ID":"57f5ce4d-cf29-4d65-aa17-f23c042a2602","Type":"ContainerStarted","Data":"9db0775e990729dd8d700fed93c810afec78115bfeda69f77db0f3a2e3ff8f81"} Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.864490 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-wprnh" Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.864595 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-wprnh" Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.868704 4750 patch_prober.go:28] interesting pod/console-f9d7485db-wprnh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.31:8443/health\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body= Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.868755 4750 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-wprnh" podUID="7c3552dc-a0cf-4072-91e1-030803f6014d" containerName="console" probeResult="failure" output="Get \"https://10.217.0.31:8443/health\": dial tcp 10.217.0.31:8443: connect: connection refused" Oct 08 18:13:15 crc kubenswrapper[4750]: E1008 18:13:15.889920 4750 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9739a0d_d23a_4a49_8fe4_6257a3948210.slice/crio-02412be4d72c099b7d82b8e98b4a83d5583a135785864b06508df10d82979c30.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4227b4db_94b2_4e5f_a231_aa0b6bbe685c.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9739a0d_d23a_4a49_8fe4_6257a3948210.slice/crio-conmon-02412be4d72c099b7d82b8e98b4a83d5583a135785864b06508df10d82979c30.scope\": RecentStats: unable to 
find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4227b4db_94b2_4e5f_a231_aa0b6bbe685c.slice/crio-c32ad658728639322e0ec7bf39719dc9f200b613ee715ada152065440ad53293\": RecentStats: unable to find data in memory cache]" Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.896655 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-d24c6"] Oct 08 18:13:15 crc kubenswrapper[4750]: E1008 18:13:15.896873 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4227b4db-94b2-4e5f-a231-aa0b6bbe685c" containerName="collect-profiles" Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.896886 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="4227b4db-94b2-4e5f-a231-aa0b6bbe685c" containerName="collect-profiles" Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.896988 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="4227b4db-94b2-4e5f-a231-aa0b6bbe685c" containerName="collect-profiles" Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.897727 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d24c6" Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.908113 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.927531 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d24c6"] Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.985815 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7f5e091-d02e-45aa-bc15-58841948bcd6-utilities\") pod \"redhat-operators-d24c6\" (UID: \"c7f5e091-d02e-45aa-bc15-58841948bcd6\") " pod="openshift-marketplace/redhat-operators-d24c6" Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.985863 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmz7j\" (UniqueName: \"kubernetes.io/projected/c7f5e091-d02e-45aa-bc15-58841948bcd6-kube-api-access-bmz7j\") pod \"redhat-operators-d24c6\" (UID: \"c7f5e091-d02e-45aa-bc15-58841948bcd6\") " pod="openshift-marketplace/redhat-operators-d24c6" Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.985907 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7f5e091-d02e-45aa-bc15-58841948bcd6-catalog-content\") pod \"redhat-operators-d24c6\" (UID: \"c7f5e091-d02e-45aa-bc15-58841948bcd6\") " pod="openshift-marketplace/redhat-operators-d24c6" Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.991412 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5g5nc" Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.992378 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5g5nc" Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.992635 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-vwt7h" Oct 08 18:13:15 crc kubenswrapper[4750]: I1008 18:13:15.998829 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-vwt7h" Oct 08 18:13:16 crc kubenswrapper[4750]: I1008 18:13:16.000946 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5g5nc" Oct 08 18:13:16 crc kubenswrapper[4750]: I1008 18:13:16.097296 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7f5e091-d02e-45aa-bc15-58841948bcd6-utilities\") pod \"redhat-operators-d24c6\" (UID: \"c7f5e091-d02e-45aa-bc15-58841948bcd6\") " pod="openshift-marketplace/redhat-operators-d24c6" Oct 08 18:13:16 crc kubenswrapper[4750]: I1008 18:13:16.097367 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmz7j\" (UniqueName: \"kubernetes.io/projected/c7f5e091-d02e-45aa-bc15-58841948bcd6-kube-api-access-bmz7j\") pod \"redhat-operators-d24c6\" (UID: \"c7f5e091-d02e-45aa-bc15-58841948bcd6\") " pod="openshift-marketplace/redhat-operators-d24c6" Oct 08 18:13:16 crc kubenswrapper[4750]: I1008 18:13:16.097434 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7f5e091-d02e-45aa-bc15-58841948bcd6-catalog-content\") pod \"redhat-operators-d24c6\" (UID: \"c7f5e091-d02e-45aa-bc15-58841948bcd6\") " pod="openshift-marketplace/redhat-operators-d24c6" Oct 08 18:13:16 crc kubenswrapper[4750]: I1008 18:13:16.100173 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/c7f5e091-d02e-45aa-bc15-58841948bcd6-utilities\") pod \"redhat-operators-d24c6\" (UID: \"c7f5e091-d02e-45aa-bc15-58841948bcd6\") " pod="openshift-marketplace/redhat-operators-d24c6" Oct 08 18:13:16 crc kubenswrapper[4750]: I1008 18:13:16.107068 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7f5e091-d02e-45aa-bc15-58841948bcd6-catalog-content\") pod \"redhat-operators-d24c6\" (UID: \"c7f5e091-d02e-45aa-bc15-58841948bcd6\") " pod="openshift-marketplace/redhat-operators-d24c6" Oct 08 18:13:16 crc kubenswrapper[4750]: I1008 18:13:16.145575 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmz7j\" (UniqueName: \"kubernetes.io/projected/c7f5e091-d02e-45aa-bc15-58841948bcd6-kube-api-access-bmz7j\") pod \"redhat-operators-d24c6\" (UID: \"c7f5e091-d02e-45aa-bc15-58841948bcd6\") " pod="openshift-marketplace/redhat-operators-d24c6" Oct 08 18:13:16 crc kubenswrapper[4750]: I1008 18:13:16.247045 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d24c6" Oct 08 18:13:16 crc kubenswrapper[4750]: I1008 18:13:16.299296 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xv4f7"] Oct 08 18:13:16 crc kubenswrapper[4750]: I1008 18:13:16.345909 4750 patch_prober.go:28] interesting pod/router-default-5444994796-tdftm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 18:13:16 crc kubenswrapper[4750]: [-]has-synced failed: reason withheld Oct 08 18:13:16 crc kubenswrapper[4750]: [+]process-running ok Oct 08 18:13:16 crc kubenswrapper[4750]: healthz check failed Oct 08 18:13:16 crc kubenswrapper[4750]: I1008 18:13:16.345979 4750 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tdftm" podUID="08bc2767-2a02-4bc1-b3cd-47670db11792" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 18:13:16 crc kubenswrapper[4750]: I1008 18:13:16.383801 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xv4f7"] Oct 08 18:13:16 crc kubenswrapper[4750]: I1008 18:13:16.383873 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-tdftm" Oct 08 18:13:16 crc kubenswrapper[4750]: I1008 18:13:16.383916 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-glbv9" Oct 08 18:13:16 crc kubenswrapper[4750]: I1008 18:13:16.383989 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xv4f7" Oct 08 18:13:16 crc kubenswrapper[4750]: I1008 18:13:16.504237 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4d14217-c4c4-4268-9f28-3ff28ff9b095-utilities\") pod \"redhat-operators-xv4f7\" (UID: \"d4d14217-c4c4-4268-9f28-3ff28ff9b095\") " pod="openshift-marketplace/redhat-operators-xv4f7" Oct 08 18:13:16 crc kubenswrapper[4750]: I1008 18:13:16.504364 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4d14217-c4c4-4268-9f28-3ff28ff9b095-catalog-content\") pod \"redhat-operators-xv4f7\" (UID: \"d4d14217-c4c4-4268-9f28-3ff28ff9b095\") " pod="openshift-marketplace/redhat-operators-xv4f7" Oct 08 18:13:16 crc kubenswrapper[4750]: I1008 18:13:16.504385 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsgbw\" (UniqueName: \"kubernetes.io/projected/d4d14217-c4c4-4268-9f28-3ff28ff9b095-kube-api-access-xsgbw\") pod \"redhat-operators-xv4f7\" (UID: \"d4d14217-c4c4-4268-9f28-3ff28ff9b095\") " pod="openshift-marketplace/redhat-operators-xv4f7" Oct 08 18:13:16 crc kubenswrapper[4750]: I1008 18:13:16.605532 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4d14217-c4c4-4268-9f28-3ff28ff9b095-utilities\") pod \"redhat-operators-xv4f7\" (UID: \"d4d14217-c4c4-4268-9f28-3ff28ff9b095\") " pod="openshift-marketplace/redhat-operators-xv4f7" Oct 08 18:13:16 crc kubenswrapper[4750]: I1008 18:13:16.605629 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4d14217-c4c4-4268-9f28-3ff28ff9b095-catalog-content\") pod \"redhat-operators-xv4f7\" (UID: 
\"d4d14217-c4c4-4268-9f28-3ff28ff9b095\") " pod="openshift-marketplace/redhat-operators-xv4f7" Oct 08 18:13:16 crc kubenswrapper[4750]: I1008 18:13:16.605653 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsgbw\" (UniqueName: \"kubernetes.io/projected/d4d14217-c4c4-4268-9f28-3ff28ff9b095-kube-api-access-xsgbw\") pod \"redhat-operators-xv4f7\" (UID: \"d4d14217-c4c4-4268-9f28-3ff28ff9b095\") " pod="openshift-marketplace/redhat-operators-xv4f7" Oct 08 18:13:16 crc kubenswrapper[4750]: I1008 18:13:16.606253 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4d14217-c4c4-4268-9f28-3ff28ff9b095-catalog-content\") pod \"redhat-operators-xv4f7\" (UID: \"d4d14217-c4c4-4268-9f28-3ff28ff9b095\") " pod="openshift-marketplace/redhat-operators-xv4f7" Oct 08 18:13:16 crc kubenswrapper[4750]: I1008 18:13:16.609780 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d24c6"] Oct 08 18:13:16 crc kubenswrapper[4750]: I1008 18:13:16.614752 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4d14217-c4c4-4268-9f28-3ff28ff9b095-utilities\") pod \"redhat-operators-xv4f7\" (UID: \"d4d14217-c4c4-4268-9f28-3ff28ff9b095\") " pod="openshift-marketplace/redhat-operators-xv4f7" Oct 08 18:13:16 crc kubenswrapper[4750]: I1008 18:13:16.623391 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsgbw\" (UniqueName: \"kubernetes.io/projected/d4d14217-c4c4-4268-9f28-3ff28ff9b095-kube-api-access-xsgbw\") pod \"redhat-operators-xv4f7\" (UID: \"d4d14217-c4c4-4268-9f28-3ff28ff9b095\") " pod="openshift-marketplace/redhat-operators-xv4f7" Oct 08 18:13:16 crc kubenswrapper[4750]: I1008 18:13:16.699753 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xv4f7" Oct 08 18:13:16 crc kubenswrapper[4750]: I1008 18:13:16.756004 4750 generic.go:334] "Generic (PLEG): container finished" podID="e9739a0d-d23a-4a49-8fe4-6257a3948210" containerID="02412be4d72c099b7d82b8e98b4a83d5583a135785864b06508df10d82979c30" exitCode=0 Oct 08 18:13:16 crc kubenswrapper[4750]: I1008 18:13:16.756075 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lrm8v" event={"ID":"e9739a0d-d23a-4a49-8fe4-6257a3948210","Type":"ContainerDied","Data":"02412be4d72c099b7d82b8e98b4a83d5583a135785864b06508df10d82979c30"} Oct 08 18:13:16 crc kubenswrapper[4750]: I1008 18:13:16.758953 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d24c6" event={"ID":"c7f5e091-d02e-45aa-bc15-58841948bcd6","Type":"ContainerStarted","Data":"3f0a507d8bfc00255263057823d679173972e475544e7e2b3ac7295a82421573"} Oct 08 18:13:16 crc kubenswrapper[4750]: I1008 18:13:16.763923 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5g5nc" Oct 08 18:13:17 crc kubenswrapper[4750]: I1008 18:13:17.105703 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 08 18:13:17 crc kubenswrapper[4750]: I1008 18:13:17.120488 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 08 18:13:17 crc kubenswrapper[4750]: E1008 18:13:17.122648 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="348584fd-e515-4d6b-8476-fdfe7cb562a4" containerName="pruner" Oct 08 18:13:17 crc kubenswrapper[4750]: I1008 18:13:17.122728 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="348584fd-e515-4d6b-8476-fdfe7cb562a4" containerName="pruner" Oct 08 18:13:17 crc kubenswrapper[4750]: I1008 18:13:17.122877 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="348584fd-e515-4d6b-8476-fdfe7cb562a4" containerName="pruner" Oct 08 18:13:17 crc kubenswrapper[4750]: I1008 18:13:17.124059 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 08 18:13:17 crc kubenswrapper[4750]: I1008 18:13:17.128119 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 08 18:13:17 crc kubenswrapper[4750]: I1008 18:13:17.128321 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 08 18:13:17 crc kubenswrapper[4750]: I1008 18:13:17.134417 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xv4f7"] Oct 08 18:13:17 crc kubenswrapper[4750]: I1008 18:13:17.143453 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 08 18:13:17 crc kubenswrapper[4750]: I1008 18:13:17.225240 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/348584fd-e515-4d6b-8476-fdfe7cb562a4-kube-api-access\") pod 
\"348584fd-e515-4d6b-8476-fdfe7cb562a4\" (UID: \"348584fd-e515-4d6b-8476-fdfe7cb562a4\") " Oct 08 18:13:17 crc kubenswrapper[4750]: I1008 18:13:17.225302 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/348584fd-e515-4d6b-8476-fdfe7cb562a4-kubelet-dir\") pod \"348584fd-e515-4d6b-8476-fdfe7cb562a4\" (UID: \"348584fd-e515-4d6b-8476-fdfe7cb562a4\") " Oct 08 18:13:17 crc kubenswrapper[4750]: I1008 18:13:17.225493 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/78f32507-fe14-474c-9e4b-ad45685df6f6-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"78f32507-fe14-474c-9e4b-ad45685df6f6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 08 18:13:17 crc kubenswrapper[4750]: I1008 18:13:17.225529 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/78f32507-fe14-474c-9e4b-ad45685df6f6-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"78f32507-fe14-474c-9e4b-ad45685df6f6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 08 18:13:17 crc kubenswrapper[4750]: I1008 18:13:17.226973 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/348584fd-e515-4d6b-8476-fdfe7cb562a4-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "348584fd-e515-4d6b-8476-fdfe7cb562a4" (UID: "348584fd-e515-4d6b-8476-fdfe7cb562a4"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 18:13:17 crc kubenswrapper[4750]: I1008 18:13:17.233490 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/348584fd-e515-4d6b-8476-fdfe7cb562a4-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "348584fd-e515-4d6b-8476-fdfe7cb562a4" (UID: "348584fd-e515-4d6b-8476-fdfe7cb562a4"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:13:17 crc kubenswrapper[4750]: I1008 18:13:17.326396 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/78f32507-fe14-474c-9e4b-ad45685df6f6-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"78f32507-fe14-474c-9e4b-ad45685df6f6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 08 18:13:17 crc kubenswrapper[4750]: I1008 18:13:17.326524 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/78f32507-fe14-474c-9e4b-ad45685df6f6-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"78f32507-fe14-474c-9e4b-ad45685df6f6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 08 18:13:17 crc kubenswrapper[4750]: I1008 18:13:17.326762 4750 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/348584fd-e515-4d6b-8476-fdfe7cb562a4-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 08 18:13:17 crc kubenswrapper[4750]: I1008 18:13:17.326787 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/348584fd-e515-4d6b-8476-fdfe7cb562a4-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 08 18:13:17 crc kubenswrapper[4750]: I1008 18:13:17.326485 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/78f32507-fe14-474c-9e4b-ad45685df6f6-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"78f32507-fe14-474c-9e4b-ad45685df6f6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 08 18:13:17 crc kubenswrapper[4750]: I1008 18:13:17.344516 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-tdftm" Oct 08 18:13:17 crc kubenswrapper[4750]: I1008 18:13:17.348627 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-tdftm" Oct 08 18:13:17 crc kubenswrapper[4750]: I1008 18:13:17.356803 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/78f32507-fe14-474c-9e4b-ad45685df6f6-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"78f32507-fe14-474c-9e4b-ad45685df6f6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 08 18:13:17 crc kubenswrapper[4750]: I1008 18:13:17.462311 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 08 18:13:17 crc kubenswrapper[4750]: I1008 18:13:17.735843 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 08 18:13:17 crc kubenswrapper[4750]: I1008 18:13:17.770161 4750 generic.go:334] "Generic (PLEG): container finished" podID="d4d14217-c4c4-4268-9f28-3ff28ff9b095" containerID="1d8dbdda642bfb5003c76831e0d0db760f7b52773b0a8b618fbfa922a888467b" exitCode=0 Oct 08 18:13:17 crc kubenswrapper[4750]: I1008 18:13:17.770532 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xv4f7" event={"ID":"d4d14217-c4c4-4268-9f28-3ff28ff9b095","Type":"ContainerDied","Data":"1d8dbdda642bfb5003c76831e0d0db760f7b52773b0a8b618fbfa922a888467b"} Oct 08 18:13:17 crc kubenswrapper[4750]: I1008 18:13:17.770586 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xv4f7" event={"ID":"d4d14217-c4c4-4268-9f28-3ff28ff9b095","Type":"ContainerStarted","Data":"8ce0555ed15faa120533f23deead70aad0c58d783b793f2875c1bff2004af2b7"} Oct 08 18:13:17 crc kubenswrapper[4750]: I1008 18:13:17.776351 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 08 18:13:17 crc kubenswrapper[4750]: I1008 18:13:17.776464 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"348584fd-e515-4d6b-8476-fdfe7cb562a4","Type":"ContainerDied","Data":"5bf52f296732e94606f002250e60917fe6b67f21360753d58e70159bedc325a7"} Oct 08 18:13:17 crc kubenswrapper[4750]: I1008 18:13:17.776498 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5bf52f296732e94606f002250e60917fe6b67f21360753d58e70159bedc325a7" Oct 08 18:13:17 crc kubenswrapper[4750]: I1008 18:13:17.891345 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-665b6dd947-w8527_4e73ade2-f6bf-4f9b-98da-1b6a2c9a7d60/cluster-samples-operator/0.log" Oct 08 18:13:17 crc kubenswrapper[4750]: I1008 18:13:17.891594 4750 generic.go:334] "Generic (PLEG): container finished" podID="4e73ade2-f6bf-4f9b-98da-1b6a2c9a7d60" containerID="6e08366fbf47e362f69bc912a211fff2501d666e218fe9dbda0bae1af8aebe10" exitCode=2 Oct 08 18:13:17 crc kubenswrapper[4750]: I1008 18:13:17.891708 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w8527" event={"ID":"4e73ade2-f6bf-4f9b-98da-1b6a2c9a7d60","Type":"ContainerDied","Data":"6e08366fbf47e362f69bc912a211fff2501d666e218fe9dbda0bae1af8aebe10"} Oct 08 18:13:17 crc kubenswrapper[4750]: I1008 18:13:17.892250 4750 scope.go:117] "RemoveContainer" containerID="6e08366fbf47e362f69bc912a211fff2501d666e218fe9dbda0bae1af8aebe10" Oct 08 18:13:17 crc kubenswrapper[4750]: I1008 18:13:17.899129 4750 generic.go:334] "Generic (PLEG): container finished" podID="c7f5e091-d02e-45aa-bc15-58841948bcd6" containerID="c5bfa34a77b12aca1bee68a411dbae7673d1a673c81e1748c6a1bf0926197a2f" exitCode=0 Oct 08 18:13:17 crc kubenswrapper[4750]: I1008 
18:13:17.899422 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d24c6" event={"ID":"c7f5e091-d02e-45aa-bc15-58841948bcd6","Type":"ContainerDied","Data":"c5bfa34a77b12aca1bee68a411dbae7673d1a673c81e1748c6a1bf0926197a2f"} Oct 08 18:13:17 crc kubenswrapper[4750]: I1008 18:13:17.959206 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-n57nw" Oct 08 18:13:18 crc kubenswrapper[4750]: I1008 18:13:18.920963 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"78f32507-fe14-474c-9e4b-ad45685df6f6","Type":"ContainerStarted","Data":"e86bcd6aef7fe226d11e2ecd950c9b828bbc23b0cae84801a38a576785be4c3f"} Oct 08 18:13:18 crc kubenswrapper[4750]: I1008 18:13:18.921378 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"78f32507-fe14-474c-9e4b-ad45685df6f6","Type":"ContainerStarted","Data":"f038d2b0c2d0178c7cec92c0b6d2cbadc60ca14a56b0aa2c8dc7146c566242ba"} Oct 08 18:13:18 crc kubenswrapper[4750]: I1008 18:13:18.926413 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-665b6dd947-w8527_4e73ade2-f6bf-4f9b-98da-1b6a2c9a7d60/cluster-samples-operator/0.log" Oct 08 18:13:18 crc kubenswrapper[4750]: I1008 18:13:18.927017 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w8527" event={"ID":"4e73ade2-f6bf-4f9b-98da-1b6a2c9a7d60","Type":"ContainerStarted","Data":"e413170566ce4a0b25756e7d5e2d51cde8ddd7f3274ef350b0588338eb98240a"} Oct 08 18:13:18 crc kubenswrapper[4750]: I1008 18:13:18.938538 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=1.9385205810000001 podStartE2EDuration="1.938520581s" 
podCreationTimestamp="2025-10-08 18:13:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:13:18.9346973 +0000 UTC m=+154.847668313" watchObservedRunningTime="2025-10-08 18:13:18.938520581 +0000 UTC m=+154.851491584" Oct 08 18:13:19 crc kubenswrapper[4750]: I1008 18:13:19.951414 4750 generic.go:334] "Generic (PLEG): container finished" podID="78f32507-fe14-474c-9e4b-ad45685df6f6" containerID="e86bcd6aef7fe226d11e2ecd950c9b828bbc23b0cae84801a38a576785be4c3f" exitCode=0 Oct 08 18:13:19 crc kubenswrapper[4750]: I1008 18:13:19.951509 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"78f32507-fe14-474c-9e4b-ad45685df6f6","Type":"ContainerDied","Data":"e86bcd6aef7fe226d11e2ecd950c9b828bbc23b0cae84801a38a576785be4c3f"} Oct 08 18:13:25 crc kubenswrapper[4750]: I1008 18:13:25.589836 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-jw49l" Oct 08 18:13:25 crc kubenswrapper[4750]: I1008 18:13:25.869465 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-wprnh" Oct 08 18:13:25 crc kubenswrapper[4750]: I1008 18:13:25.873215 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-wprnh" Oct 08 18:13:26 crc kubenswrapper[4750]: I1008 18:13:26.171177 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b67ae9d5-e575-45e7-913a-01f379b86416-metrics-certs\") pod \"network-metrics-daemon-7f9jd\" (UID: \"b67ae9d5-e575-45e7-913a-01f379b86416\") " pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:13:26 crc kubenswrapper[4750]: I1008 18:13:26.194793 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/b67ae9d5-e575-45e7-913a-01f379b86416-metrics-certs\") pod \"network-metrics-daemon-7f9jd\" (UID: \"b67ae9d5-e575-45e7-913a-01f379b86416\") " pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:13:26 crc kubenswrapper[4750]: I1008 18:13:26.253750 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7f9jd" Oct 08 18:13:29 crc kubenswrapper[4750]: I1008 18:13:29.706906 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 18:13:29 crc kubenswrapper[4750]: I1008 18:13:29.707289 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 18:13:33 crc kubenswrapper[4750]: I1008 18:13:33.429436 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:13:38 crc kubenswrapper[4750]: I1008 18:13:38.688768 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 08 18:13:38 crc kubenswrapper[4750]: I1008 18:13:38.747210 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/78f32507-fe14-474c-9e4b-ad45685df6f6-kube-api-access\") pod \"78f32507-fe14-474c-9e4b-ad45685df6f6\" (UID: \"78f32507-fe14-474c-9e4b-ad45685df6f6\") " Oct 08 18:13:38 crc kubenswrapper[4750]: I1008 18:13:38.747312 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/78f32507-fe14-474c-9e4b-ad45685df6f6-kubelet-dir\") pod \"78f32507-fe14-474c-9e4b-ad45685df6f6\" (UID: \"78f32507-fe14-474c-9e4b-ad45685df6f6\") " Oct 08 18:13:38 crc kubenswrapper[4750]: I1008 18:13:38.747529 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/78f32507-fe14-474c-9e4b-ad45685df6f6-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "78f32507-fe14-474c-9e4b-ad45685df6f6" (UID: "78f32507-fe14-474c-9e4b-ad45685df6f6"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 18:13:38 crc kubenswrapper[4750]: I1008 18:13:38.747682 4750 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/78f32507-fe14-474c-9e4b-ad45685df6f6-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 08 18:13:38 crc kubenswrapper[4750]: I1008 18:13:38.753150 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78f32507-fe14-474c-9e4b-ad45685df6f6-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "78f32507-fe14-474c-9e4b-ad45685df6f6" (UID: "78f32507-fe14-474c-9e4b-ad45685df6f6"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:13:38 crc kubenswrapper[4750]: I1008 18:13:38.848420 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/78f32507-fe14-474c-9e4b-ad45685df6f6-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 08 18:13:39 crc kubenswrapper[4750]: I1008 18:13:39.142466 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"78f32507-fe14-474c-9e4b-ad45685df6f6","Type":"ContainerDied","Data":"f038d2b0c2d0178c7cec92c0b6d2cbadc60ca14a56b0aa2c8dc7146c566242ba"} Oct 08 18:13:39 crc kubenswrapper[4750]: I1008 18:13:39.142508 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f038d2b0c2d0178c7cec92c0b6d2cbadc60ca14a56b0aa2c8dc7146c566242ba" Oct 08 18:13:39 crc kubenswrapper[4750]: I1008 18:13:39.142578 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 08 18:13:39 crc kubenswrapper[4750]: E1008 18:13:39.322774 4750 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 08 18:13:39 crc kubenswrapper[4750]: E1008 18:13:39.323003 4750 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-psxtp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-gpm4k_openshift-marketplace(57f5ce4d-cf29-4d65-aa17-f23c042a2602): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 08 18:13:39 crc kubenswrapper[4750]: E1008 18:13:39.324388 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-gpm4k" podUID="57f5ce4d-cf29-4d65-aa17-f23c042a2602" Oct 08 18:13:42 crc 
kubenswrapper[4750]: E1008 18:13:42.091951 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-gpm4k" podUID="57f5ce4d-cf29-4d65-aa17-f23c042a2602" Oct 08 18:13:43 crc kubenswrapper[4750]: E1008 18:13:43.209815 4750 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 08 18:13:43 crc kubenswrapper[4750]: E1008 18:13:43.210018 4750 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2vfrd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-sxt5g_openshift-marketplace(6dd4fe2f-ea7e-4a82-b868-375ec700de66): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 08 18:13:43 crc kubenswrapper[4750]: E1008 18:13:43.211250 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-sxt5g" podUID="6dd4fe2f-ea7e-4a82-b868-375ec700de66" Oct 08 18:13:44 crc 
kubenswrapper[4750]: E1008 18:13:44.289971 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-sxt5g" podUID="6dd4fe2f-ea7e-4a82-b868-375ec700de66" Oct 08 18:13:44 crc kubenswrapper[4750]: E1008 18:13:44.362885 4750 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 08 18:13:44 crc kubenswrapper[4750]: E1008 18:13:44.363246 4750 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pdlmg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-g59fz_openshift-marketplace(e457a3fa-6164-4127-8a2d-ea7ea42e6da4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 08 18:13:44 crc kubenswrapper[4750]: E1008 18:13:44.364753 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-g59fz" podUID="e457a3fa-6164-4127-8a2d-ea7ea42e6da4" Oct 08 18:13:44 crc 
kubenswrapper[4750]: E1008 18:13:44.414831 4750 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 08 18:13:44 crc kubenswrapper[4750]: E1008 18:13:44.415100 4750 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zqsl6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-d966q_openshift-marketplace(7a3b8246-69d2-405e-a414-c21b9cb3b31d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 08 18:13:44 crc kubenswrapper[4750]: E1008 18:13:44.416406 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-d966q" podUID="7a3b8246-69d2-405e-a414-c21b9cb3b31d" Oct 08 18:13:44 crc kubenswrapper[4750]: E1008 18:13:44.438510 4750 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 08 18:13:44 crc kubenswrapper[4750]: E1008 18:13:44.438716 4750 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hq6xj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-lrm8v_openshift-marketplace(e9739a0d-d23a-4a49-8fe4-6257a3948210): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 08 18:13:44 crc kubenswrapper[4750]: E1008 18:13:44.440222 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-lrm8v" podUID="e9739a0d-d23a-4a49-8fe4-6257a3948210" Oct 08 18:13:44 crc 
kubenswrapper[4750]: E1008 18:13:44.451800 4750 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 08 18:13:44 crc kubenswrapper[4750]: E1008 18:13:44.451986 4750 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-94fsm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-mhkml_openshift-marketplace(158d74a9-bbc8-4cd9-9507-eb477aa3a5a9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 08 18:13:44 crc kubenswrapper[4750]: E1008 18:13:44.453341 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-mhkml" podUID="158d74a9-bbc8-4cd9-9507-eb477aa3a5a9" Oct 08 18:13:44 crc kubenswrapper[4750]: I1008 18:13:44.694277 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7f9jd"] Oct 08 18:13:44 crc kubenswrapper[4750]: W1008 18:13:44.700507 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb67ae9d5_e575_45e7_913a_01f379b86416.slice/crio-df19cff2844abb72eec8c7bb9a685dd15bfaa58cc8792196e88f9afb8d22e549 WatchSource:0}: Error finding container df19cff2844abb72eec8c7bb9a685dd15bfaa58cc8792196e88f9afb8d22e549: Status 404 returned error can't find the container with id df19cff2844abb72eec8c7bb9a685dd15bfaa58cc8792196e88f9afb8d22e549 Oct 08 18:13:45 crc kubenswrapper[4750]: I1008 18:13:45.177258 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7f9jd" event={"ID":"b67ae9d5-e575-45e7-913a-01f379b86416","Type":"ContainerStarted","Data":"8a4029717421248363bd63543fff66d8e41a49c6c7ac25441c166ad723fefce3"} Oct 08 18:13:45 crc kubenswrapper[4750]: I1008 18:13:45.177707 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7f9jd" event={"ID":"b67ae9d5-e575-45e7-913a-01f379b86416","Type":"ContainerStarted","Data":"df19cff2844abb72eec8c7bb9a685dd15bfaa58cc8792196e88f9afb8d22e549"} 
Oct 08 18:13:45 crc kubenswrapper[4750]: I1008 18:13:45.179447 4750 generic.go:334] "Generic (PLEG): container finished" podID="c7f5e091-d02e-45aa-bc15-58841948bcd6" containerID="88e17cc51bbe973b7ff19ccafc42e9e585034c39f175335750b222631493b721" exitCode=0 Oct 08 18:13:45 crc kubenswrapper[4750]: I1008 18:13:45.179511 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d24c6" event={"ID":"c7f5e091-d02e-45aa-bc15-58841948bcd6","Type":"ContainerDied","Data":"88e17cc51bbe973b7ff19ccafc42e9e585034c39f175335750b222631493b721"} Oct 08 18:13:45 crc kubenswrapper[4750]: I1008 18:13:45.181762 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xv4f7" event={"ID":"d4d14217-c4c4-4268-9f28-3ff28ff9b095","Type":"ContainerStarted","Data":"3d55d1860ee61636abdf9f67cd63a64445ce2652f3816a15edfd2a7054f77dbc"} Oct 08 18:13:45 crc kubenswrapper[4750]: E1008 18:13:45.183496 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-d966q" podUID="7a3b8246-69d2-405e-a414-c21b9cb3b31d" Oct 08 18:13:45 crc kubenswrapper[4750]: E1008 18:13:45.183679 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-mhkml" podUID="158d74a9-bbc8-4cd9-9507-eb477aa3a5a9" Oct 08 18:13:45 crc kubenswrapper[4750]: E1008 18:13:45.183786 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" 
pod="openshift-marketplace/community-operators-g59fz" podUID="e457a3fa-6164-4127-8a2d-ea7ea42e6da4" Oct 08 18:13:45 crc kubenswrapper[4750]: E1008 18:13:45.184311 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-lrm8v" podUID="e9739a0d-d23a-4a49-8fe4-6257a3948210" Oct 08 18:13:46 crc kubenswrapper[4750]: I1008 18:13:46.190867 4750 generic.go:334] "Generic (PLEG): container finished" podID="d4d14217-c4c4-4268-9f28-3ff28ff9b095" containerID="3d55d1860ee61636abdf9f67cd63a64445ce2652f3816a15edfd2a7054f77dbc" exitCode=0 Oct 08 18:13:46 crc kubenswrapper[4750]: I1008 18:13:46.190986 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xv4f7" event={"ID":"d4d14217-c4c4-4268-9f28-3ff28ff9b095","Type":"ContainerDied","Data":"3d55d1860ee61636abdf9f67cd63a64445ce2652f3816a15edfd2a7054f77dbc"} Oct 08 18:13:46 crc kubenswrapper[4750]: I1008 18:13:46.194162 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7f9jd" event={"ID":"b67ae9d5-e575-45e7-913a-01f379b86416","Type":"ContainerStarted","Data":"d145587c371f3a1f72372b92f9f27a408c469838c5679151dfb463dd1498213a"} Oct 08 18:13:46 crc kubenswrapper[4750]: I1008 18:13:46.282460 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5ff9w" Oct 08 18:13:46 crc kubenswrapper[4750]: I1008 18:13:46.304208 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-7f9jd" podStartSLOduration=162.30419304 podStartE2EDuration="2m42.30419304s" podCreationTimestamp="2025-10-08 18:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:13:46.237117352 +0000 UTC m=+182.150088375" watchObservedRunningTime="2025-10-08 18:13:46.30419304 +0000 UTC m=+182.217164053" Oct 08 18:13:47 crc kubenswrapper[4750]: I1008 18:13:47.200170 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xv4f7" event={"ID":"d4d14217-c4c4-4268-9f28-3ff28ff9b095","Type":"ContainerStarted","Data":"9e0495eb96731b30d579e8c1f6a0209c4175e746067875021ac2d10345ed08d1"} Oct 08 18:13:47 crc kubenswrapper[4750]: I1008 18:13:47.202389 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d24c6" event={"ID":"c7f5e091-d02e-45aa-bc15-58841948bcd6","Type":"ContainerStarted","Data":"c3d543f50bda545a29dc5d7eb7a53b1cc2c620e17b921c48d730aa659c0edda2"} Oct 08 18:13:47 crc kubenswrapper[4750]: I1008 18:13:47.216991 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xv4f7" podStartSLOduration=2.227308743 podStartE2EDuration="31.216976548s" podCreationTimestamp="2025-10-08 18:13:16 +0000 UTC" firstStartedPulling="2025-10-08 18:13:17.773445339 +0000 UTC m=+153.686416362" lastFinishedPulling="2025-10-08 18:13:46.763113134 +0000 UTC m=+182.676084167" observedRunningTime="2025-10-08 18:13:47.215142975 +0000 UTC m=+183.128113998" watchObservedRunningTime="2025-10-08 18:13:47.216976548 +0000 UTC m=+183.129947561" Oct 08 18:13:47 crc kubenswrapper[4750]: I1008 18:13:47.229583 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-d24c6" podStartSLOduration=4.105039853 podStartE2EDuration="32.229566256s" podCreationTimestamp="2025-10-08 18:13:15 +0000 UTC" firstStartedPulling="2025-10-08 18:13:17.925784866 +0000 UTC m=+153.838755879" lastFinishedPulling="2025-10-08 18:13:46.050311279 +0000 UTC m=+181.963282282" observedRunningTime="2025-10-08 
18:13:47.229386312 +0000 UTC m=+183.142357355" watchObservedRunningTime="2025-10-08 18:13:47.229566256 +0000 UTC m=+183.142537269" Oct 08 18:13:52 crc kubenswrapper[4750]: I1008 18:13:52.852149 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 18:13:56 crc kubenswrapper[4750]: I1008 18:13:56.248799 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-d24c6" Oct 08 18:13:56 crc kubenswrapper[4750]: I1008 18:13:56.249474 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-d24c6" Oct 08 18:13:56 crc kubenswrapper[4750]: I1008 18:13:56.700402 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xv4f7" Oct 08 18:13:56 crc kubenswrapper[4750]: I1008 18:13:56.700470 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xv4f7" Oct 08 18:13:56 crc kubenswrapper[4750]: I1008 18:13:56.703773 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-d24c6" Oct 08 18:13:56 crc kubenswrapper[4750]: I1008 18:13:56.744196 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-d24c6" Oct 08 18:13:56 crc kubenswrapper[4750]: I1008 18:13:56.753312 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xv4f7" Oct 08 18:13:57 crc kubenswrapper[4750]: I1008 18:13:57.311079 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xv4f7" Oct 08 18:13:57 crc kubenswrapper[4750]: I1008 18:13:57.933865 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xv4f7"] 
Oct 08 18:13:59 crc kubenswrapper[4750]: I1008 18:13:59.278605 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xv4f7" podUID="d4d14217-c4c4-4268-9f28-3ff28ff9b095" containerName="registry-server" containerID="cri-o://9e0495eb96731b30d579e8c1f6a0209c4175e746067875021ac2d10345ed08d1" gracePeriod=2 Oct 08 18:13:59 crc kubenswrapper[4750]: I1008 18:13:59.706626 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 18:13:59 crc kubenswrapper[4750]: I1008 18:13:59.707059 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 18:14:00 crc kubenswrapper[4750]: I1008 18:14:00.284219 4750 generic.go:334] "Generic (PLEG): container finished" podID="d4d14217-c4c4-4268-9f28-3ff28ff9b095" containerID="9e0495eb96731b30d579e8c1f6a0209c4175e746067875021ac2d10345ed08d1" exitCode=0 Oct 08 18:14:00 crc kubenswrapper[4750]: I1008 18:14:00.284271 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xv4f7" event={"ID":"d4d14217-c4c4-4268-9f28-3ff28ff9b095","Type":"ContainerDied","Data":"9e0495eb96731b30d579e8c1f6a0209c4175e746067875021ac2d10345ed08d1"} Oct 08 18:14:00 crc kubenswrapper[4750]: I1008 18:14:00.284296 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xv4f7" 
event={"ID":"d4d14217-c4c4-4268-9f28-3ff28ff9b095","Type":"ContainerDied","Data":"8ce0555ed15faa120533f23deead70aad0c58d783b793f2875c1bff2004af2b7"} Oct 08 18:14:00 crc kubenswrapper[4750]: I1008 18:14:00.284306 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ce0555ed15faa120533f23deead70aad0c58d783b793f2875c1bff2004af2b7" Oct 08 18:14:00 crc kubenswrapper[4750]: I1008 18:14:00.285591 4750 generic.go:334] "Generic (PLEG): container finished" podID="57f5ce4d-cf29-4d65-aa17-f23c042a2602" containerID="a549ad25fa6c5b2c17ca7fbfdf3190d83f4ad2a6bb01e3725bb1c36e83ba02a9" exitCode=0 Oct 08 18:14:00 crc kubenswrapper[4750]: I1008 18:14:00.285616 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gpm4k" event={"ID":"57f5ce4d-cf29-4d65-aa17-f23c042a2602","Type":"ContainerDied","Data":"a549ad25fa6c5b2c17ca7fbfdf3190d83f4ad2a6bb01e3725bb1c36e83ba02a9"} Oct 08 18:14:00 crc kubenswrapper[4750]: I1008 18:14:00.288605 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xv4f7" Oct 08 18:14:00 crc kubenswrapper[4750]: I1008 18:14:00.323131 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsgbw\" (UniqueName: \"kubernetes.io/projected/d4d14217-c4c4-4268-9f28-3ff28ff9b095-kube-api-access-xsgbw\") pod \"d4d14217-c4c4-4268-9f28-3ff28ff9b095\" (UID: \"d4d14217-c4c4-4268-9f28-3ff28ff9b095\") " Oct 08 18:14:00 crc kubenswrapper[4750]: I1008 18:14:00.323205 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4d14217-c4c4-4268-9f28-3ff28ff9b095-catalog-content\") pod \"d4d14217-c4c4-4268-9f28-3ff28ff9b095\" (UID: \"d4d14217-c4c4-4268-9f28-3ff28ff9b095\") " Oct 08 18:14:00 crc kubenswrapper[4750]: I1008 18:14:00.323254 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4d14217-c4c4-4268-9f28-3ff28ff9b095-utilities\") pod \"d4d14217-c4c4-4268-9f28-3ff28ff9b095\" (UID: \"d4d14217-c4c4-4268-9f28-3ff28ff9b095\") " Oct 08 18:14:00 crc kubenswrapper[4750]: I1008 18:14:00.324759 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4d14217-c4c4-4268-9f28-3ff28ff9b095-utilities" (OuterVolumeSpecName: "utilities") pod "d4d14217-c4c4-4268-9f28-3ff28ff9b095" (UID: "d4d14217-c4c4-4268-9f28-3ff28ff9b095"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:14:00 crc kubenswrapper[4750]: I1008 18:14:00.329848 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4d14217-c4c4-4268-9f28-3ff28ff9b095-kube-api-access-xsgbw" (OuterVolumeSpecName: "kube-api-access-xsgbw") pod "d4d14217-c4c4-4268-9f28-3ff28ff9b095" (UID: "d4d14217-c4c4-4268-9f28-3ff28ff9b095"). InnerVolumeSpecName "kube-api-access-xsgbw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:14:00 crc kubenswrapper[4750]: I1008 18:14:00.403583 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4d14217-c4c4-4268-9f28-3ff28ff9b095-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d4d14217-c4c4-4268-9f28-3ff28ff9b095" (UID: "d4d14217-c4c4-4268-9f28-3ff28ff9b095"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:14:00 crc kubenswrapper[4750]: I1008 18:14:00.424769 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4d14217-c4c4-4268-9f28-3ff28ff9b095-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 18:14:00 crc kubenswrapper[4750]: I1008 18:14:00.424792 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4d14217-c4c4-4268-9f28-3ff28ff9b095-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 18:14:00 crc kubenswrapper[4750]: I1008 18:14:00.424803 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsgbw\" (UniqueName: \"kubernetes.io/projected/d4d14217-c4c4-4268-9f28-3ff28ff9b095-kube-api-access-xsgbw\") on node \"crc\" DevicePath \"\"" Oct 08 18:14:01 crc kubenswrapper[4750]: I1008 18:14:01.291074 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xv4f7" Oct 08 18:14:01 crc kubenswrapper[4750]: I1008 18:14:01.309339 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xv4f7"] Oct 08 18:14:01 crc kubenswrapper[4750]: I1008 18:14:01.312260 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xv4f7"] Oct 08 18:14:02 crc kubenswrapper[4750]: I1008 18:14:02.740729 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4d14217-c4c4-4268-9f28-3ff28ff9b095" path="/var/lib/kubelet/pods/d4d14217-c4c4-4268-9f28-3ff28ff9b095/volumes" Oct 08 18:14:03 crc kubenswrapper[4750]: I1008 18:14:03.301376 4750 generic.go:334] "Generic (PLEG): container finished" podID="e9739a0d-d23a-4a49-8fe4-6257a3948210" containerID="ad053b47e0931c1de657df50971661cb553ce6dc72c1bff50882cb24e09ea929" exitCode=0 Oct 08 18:14:03 crc kubenswrapper[4750]: I1008 18:14:03.301405 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lrm8v" event={"ID":"e9739a0d-d23a-4a49-8fe4-6257a3948210","Type":"ContainerDied","Data":"ad053b47e0931c1de657df50971661cb553ce6dc72c1bff50882cb24e09ea929"} Oct 08 18:14:03 crc kubenswrapper[4750]: I1008 18:14:03.302716 4750 generic.go:334] "Generic (PLEG): container finished" podID="158d74a9-bbc8-4cd9-9507-eb477aa3a5a9" containerID="f2959ab4a0ab540db14046db17a204910d3c6a3d33b91b7e042920f311139411" exitCode=0 Oct 08 18:14:03 crc kubenswrapper[4750]: I1008 18:14:03.302754 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mhkml" event={"ID":"158d74a9-bbc8-4cd9-9507-eb477aa3a5a9","Type":"ContainerDied","Data":"f2959ab4a0ab540db14046db17a204910d3c6a3d33b91b7e042920f311139411"} Oct 08 18:14:03 crc kubenswrapper[4750]: I1008 18:14:03.305809 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gpm4k" 
event={"ID":"57f5ce4d-cf29-4d65-aa17-f23c042a2602","Type":"ContainerStarted","Data":"de0d4c7a95db56ad197915a151efda43baf8e0711db4495c285fbc031a7dae20"} Oct 08 18:14:03 crc kubenswrapper[4750]: I1008 18:14:03.345034 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gpm4k" podStartSLOduration=2.5110745469999998 podStartE2EDuration="49.345016371s" podCreationTimestamp="2025-10-08 18:13:14 +0000 UTC" firstStartedPulling="2025-10-08 18:13:15.748262617 +0000 UTC m=+151.661233630" lastFinishedPulling="2025-10-08 18:14:02.582204441 +0000 UTC m=+198.495175454" observedRunningTime="2025-10-08 18:14:03.342185981 +0000 UTC m=+199.255157014" watchObservedRunningTime="2025-10-08 18:14:03.345016371 +0000 UTC m=+199.257987384" Oct 08 18:14:04 crc kubenswrapper[4750]: I1008 18:14:04.311956 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mhkml" event={"ID":"158d74a9-bbc8-4cd9-9507-eb477aa3a5a9","Type":"ContainerStarted","Data":"c2994da194a0aec8c1f131b849075d8d0e601bcfb43fe1bcffc306b471f4ef2a"} Oct 08 18:14:04 crc kubenswrapper[4750]: I1008 18:14:04.317810 4750 generic.go:334] "Generic (PLEG): container finished" podID="7a3b8246-69d2-405e-a414-c21b9cb3b31d" containerID="fda9aa38596a91e57398de8f2607cb22334f4ed276bec00f95d3f73ded75a2e8" exitCode=0 Oct 08 18:14:04 crc kubenswrapper[4750]: I1008 18:14:04.317896 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d966q" event={"ID":"7a3b8246-69d2-405e-a414-c21b9cb3b31d","Type":"ContainerDied","Data":"fda9aa38596a91e57398de8f2607cb22334f4ed276bec00f95d3f73ded75a2e8"} Oct 08 18:14:04 crc kubenswrapper[4750]: I1008 18:14:04.321209 4750 generic.go:334] "Generic (PLEG): container finished" podID="6dd4fe2f-ea7e-4a82-b868-375ec700de66" containerID="46956cca1215682668278e3b1532056ce8f892f144188136e809d38fcee31847" exitCode=0 Oct 08 18:14:04 crc kubenswrapper[4750]: I1008 
18:14:04.321273 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxt5g" event={"ID":"6dd4fe2f-ea7e-4a82-b868-375ec700de66","Type":"ContainerDied","Data":"46956cca1215682668278e3b1532056ce8f892f144188136e809d38fcee31847"} Oct 08 18:14:04 crc kubenswrapper[4750]: I1008 18:14:04.323532 4750 generic.go:334] "Generic (PLEG): container finished" podID="e457a3fa-6164-4127-8a2d-ea7ea42e6da4" containerID="3d6f87dd549e5e3baf4c8decf7628d999eab28fc170257afb83066bd01ce9dba" exitCode=0 Oct 08 18:14:04 crc kubenswrapper[4750]: I1008 18:14:04.323602 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g59fz" event={"ID":"e457a3fa-6164-4127-8a2d-ea7ea42e6da4","Type":"ContainerDied","Data":"3d6f87dd549e5e3baf4c8decf7628d999eab28fc170257afb83066bd01ce9dba"} Oct 08 18:14:04 crc kubenswrapper[4750]: I1008 18:14:04.326042 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lrm8v" event={"ID":"e9739a0d-d23a-4a49-8fe4-6257a3948210","Type":"ContainerStarted","Data":"6573a6c7f0eaf4f039806ef2733137cd2d7a2ba9dbae7fad254f815a313a35eb"} Oct 08 18:14:04 crc kubenswrapper[4750]: I1008 18:14:04.337229 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mhkml" podStartSLOduration=3.03294347 podStartE2EDuration="52.337210972s" podCreationTimestamp="2025-10-08 18:13:12 +0000 UTC" firstStartedPulling="2025-10-08 18:13:14.692078944 +0000 UTC m=+150.605049967" lastFinishedPulling="2025-10-08 18:14:03.996346456 +0000 UTC m=+199.909317469" observedRunningTime="2025-10-08 18:14:04.334302279 +0000 UTC m=+200.247273302" watchObservedRunningTime="2025-10-08 18:14:04.337210972 +0000 UTC m=+200.250181985" Oct 08 18:14:04 crc kubenswrapper[4750]: I1008 18:14:04.369681 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lrm8v" 
podStartSLOduration=3.039577184 podStartE2EDuration="50.369662126s" podCreationTimestamp="2025-10-08 18:13:14 +0000 UTC" firstStartedPulling="2025-10-08 18:13:16.759873346 +0000 UTC m=+152.672844359" lastFinishedPulling="2025-10-08 18:14:04.089958288 +0000 UTC m=+200.002929301" observedRunningTime="2025-10-08 18:14:04.366784455 +0000 UTC m=+200.279755478" watchObservedRunningTime="2025-10-08 18:14:04.369662126 +0000 UTC m=+200.282633159" Oct 08 18:14:04 crc kubenswrapper[4750]: I1008 18:14:04.813680 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gpm4k" Oct 08 18:14:04 crc kubenswrapper[4750]: I1008 18:14:04.814064 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gpm4k" Oct 08 18:14:04 crc kubenswrapper[4750]: I1008 18:14:04.868460 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gpm4k" Oct 08 18:14:05 crc kubenswrapper[4750]: I1008 18:14:05.332157 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g59fz" event={"ID":"e457a3fa-6164-4127-8a2d-ea7ea42e6da4","Type":"ContainerStarted","Data":"f3ffad554e20eff5197b42457a02457fd13c68925d3268617ad89f1ae4faec65"} Oct 08 18:14:05 crc kubenswrapper[4750]: I1008 18:14:05.334449 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d966q" event={"ID":"7a3b8246-69d2-405e-a414-c21b9cb3b31d","Type":"ContainerStarted","Data":"850aad94aeff9920e928e3b467de6f4b4dbbfe2403c6e53e84d34bc59dd99f37"} Oct 08 18:14:05 crc kubenswrapper[4750]: I1008 18:14:05.336268 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxt5g" event={"ID":"6dd4fe2f-ea7e-4a82-b868-375ec700de66","Type":"ContainerStarted","Data":"9cbd7aad6a016260bb4057b9f235512a62a22eb3013331100cb174df175eaef1"} Oct 08 18:14:05 
crc kubenswrapper[4750]: I1008 18:14:05.339856 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lrm8v" Oct 08 18:14:05 crc kubenswrapper[4750]: I1008 18:14:05.339901 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lrm8v" Oct 08 18:14:05 crc kubenswrapper[4750]: I1008 18:14:05.351441 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-g59fz" podStartSLOduration=3.042601616 podStartE2EDuration="53.351424477s" podCreationTimestamp="2025-10-08 18:13:12 +0000 UTC" firstStartedPulling="2025-10-08 18:13:14.702431629 +0000 UTC m=+150.615402662" lastFinishedPulling="2025-10-08 18:14:05.01125451 +0000 UTC m=+200.924225523" observedRunningTime="2025-10-08 18:14:05.346420064 +0000 UTC m=+201.259391087" watchObservedRunningTime="2025-10-08 18:14:05.351424477 +0000 UTC m=+201.264395500" Oct 08 18:14:05 crc kubenswrapper[4750]: I1008 18:14:05.369937 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sxt5g" podStartSLOduration=2.247170856 podStartE2EDuration="52.369918777s" podCreationTimestamp="2025-10-08 18:13:13 +0000 UTC" firstStartedPulling="2025-10-08 18:13:14.697384289 +0000 UTC m=+150.610355322" lastFinishedPulling="2025-10-08 18:14:04.82013223 +0000 UTC m=+200.733103243" observedRunningTime="2025-10-08 18:14:05.36968246 +0000 UTC m=+201.282653483" watchObservedRunningTime="2025-10-08 18:14:05.369918777 +0000 UTC m=+201.282889790" Oct 08 18:14:05 crc kubenswrapper[4750]: I1008 18:14:05.385001 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lrm8v" Oct 08 18:14:05 crc kubenswrapper[4750]: I1008 18:14:05.391567 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-d966q" podStartSLOduration=3.2186631549999998 podStartE2EDuration="53.391530292s" podCreationTimestamp="2025-10-08 18:13:12 +0000 UTC" firstStartedPulling="2025-10-08 18:13:14.691942971 +0000 UTC m=+150.604913994" lastFinishedPulling="2025-10-08 18:14:04.864810128 +0000 UTC m=+200.777781131" observedRunningTime="2025-10-08 18:14:05.386339323 +0000 UTC m=+201.299310356" watchObservedRunningTime="2025-10-08 18:14:05.391530292 +0000 UTC m=+201.304501315" Oct 08 18:14:12 crc kubenswrapper[4750]: I1008 18:14:12.833957 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mhkml" Oct 08 18:14:12 crc kubenswrapper[4750]: I1008 18:14:12.834461 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mhkml" Oct 08 18:14:12 crc kubenswrapper[4750]: I1008 18:14:12.875033 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mhkml" Oct 08 18:14:13 crc kubenswrapper[4750]: I1008 18:14:13.036728 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-d966q" Oct 08 18:14:13 crc kubenswrapper[4750]: I1008 18:14:13.036778 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-d966q" Oct 08 18:14:13 crc kubenswrapper[4750]: I1008 18:14:13.071666 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-d966q" Oct 08 18:14:13 crc kubenswrapper[4750]: I1008 18:14:13.228080 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-g59fz" Oct 08 18:14:13 crc kubenswrapper[4750]: I1008 18:14:13.228130 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-g59fz" Oct 08 18:14:13 crc kubenswrapper[4750]: I1008 18:14:13.265051 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-g59fz" Oct 08 18:14:13 crc kubenswrapper[4750]: I1008 18:14:13.420027 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-g59fz" Oct 08 18:14:13 crc kubenswrapper[4750]: I1008 18:14:13.421935 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mhkml" Oct 08 18:14:13 crc kubenswrapper[4750]: I1008 18:14:13.432142 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-d966q" Oct 08 18:14:13 crc kubenswrapper[4750]: I1008 18:14:13.467769 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sxt5g" Oct 08 18:14:13 crc kubenswrapper[4750]: I1008 18:14:13.467839 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sxt5g" Oct 08 18:14:13 crc kubenswrapper[4750]: I1008 18:14:13.505042 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sxt5g" Oct 08 18:14:14 crc kubenswrapper[4750]: I1008 18:14:14.426676 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sxt5g" Oct 08 18:14:14 crc kubenswrapper[4750]: I1008 18:14:14.896785 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gpm4k" Oct 08 18:14:15 crc kubenswrapper[4750]: I1008 18:14:15.133297 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g59fz"] Oct 08 18:14:15 crc kubenswrapper[4750]: I1008 
18:14:15.333227 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sxt5g"] Oct 08 18:14:15 crc kubenswrapper[4750]: I1008 18:14:15.378519 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lrm8v" Oct 08 18:14:15 crc kubenswrapper[4750]: I1008 18:14:15.395471 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-g59fz" podUID="e457a3fa-6164-4127-8a2d-ea7ea42e6da4" containerName="registry-server" containerID="cri-o://f3ffad554e20eff5197b42457a02457fd13c68925d3268617ad89f1ae4faec65" gracePeriod=2 Oct 08 18:14:15 crc kubenswrapper[4750]: I1008 18:14:15.849746 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g59fz" Oct 08 18:14:15 crc kubenswrapper[4750]: I1008 18:14:15.918397 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e457a3fa-6164-4127-8a2d-ea7ea42e6da4-catalog-content\") pod \"e457a3fa-6164-4127-8a2d-ea7ea42e6da4\" (UID: \"e457a3fa-6164-4127-8a2d-ea7ea42e6da4\") " Oct 08 18:14:15 crc kubenswrapper[4750]: I1008 18:14:15.918466 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e457a3fa-6164-4127-8a2d-ea7ea42e6da4-utilities\") pod \"e457a3fa-6164-4127-8a2d-ea7ea42e6da4\" (UID: \"e457a3fa-6164-4127-8a2d-ea7ea42e6da4\") " Oct 08 18:14:15 crc kubenswrapper[4750]: I1008 18:14:15.918560 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdlmg\" (UniqueName: \"kubernetes.io/projected/e457a3fa-6164-4127-8a2d-ea7ea42e6da4-kube-api-access-pdlmg\") pod \"e457a3fa-6164-4127-8a2d-ea7ea42e6da4\" (UID: \"e457a3fa-6164-4127-8a2d-ea7ea42e6da4\") " Oct 08 18:14:15 crc kubenswrapper[4750]: 
I1008 18:14:15.919115 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e457a3fa-6164-4127-8a2d-ea7ea42e6da4-utilities" (OuterVolumeSpecName: "utilities") pod "e457a3fa-6164-4127-8a2d-ea7ea42e6da4" (UID: "e457a3fa-6164-4127-8a2d-ea7ea42e6da4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:14:15 crc kubenswrapper[4750]: I1008 18:14:15.924738 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e457a3fa-6164-4127-8a2d-ea7ea42e6da4-kube-api-access-pdlmg" (OuterVolumeSpecName: "kube-api-access-pdlmg") pod "e457a3fa-6164-4127-8a2d-ea7ea42e6da4" (UID: "e457a3fa-6164-4127-8a2d-ea7ea42e6da4"). InnerVolumeSpecName "kube-api-access-pdlmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:14:16 crc kubenswrapper[4750]: I1008 18:14:16.020205 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e457a3fa-6164-4127-8a2d-ea7ea42e6da4-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 18:14:16 crc kubenswrapper[4750]: I1008 18:14:16.020245 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdlmg\" (UniqueName: \"kubernetes.io/projected/e457a3fa-6164-4127-8a2d-ea7ea42e6da4-kube-api-access-pdlmg\") on node \"crc\" DevicePath \"\"" Oct 08 18:14:16 crc kubenswrapper[4750]: I1008 18:14:16.375373 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e457a3fa-6164-4127-8a2d-ea7ea42e6da4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e457a3fa-6164-4127-8a2d-ea7ea42e6da4" (UID: "e457a3fa-6164-4127-8a2d-ea7ea42e6da4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:14:16 crc kubenswrapper[4750]: I1008 18:14:16.401666 4750 generic.go:334] "Generic (PLEG): container finished" podID="e457a3fa-6164-4127-8a2d-ea7ea42e6da4" containerID="f3ffad554e20eff5197b42457a02457fd13c68925d3268617ad89f1ae4faec65" exitCode=0 Oct 08 18:14:16 crc kubenswrapper[4750]: I1008 18:14:16.401751 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g59fz" Oct 08 18:14:16 crc kubenswrapper[4750]: I1008 18:14:16.401777 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g59fz" event={"ID":"e457a3fa-6164-4127-8a2d-ea7ea42e6da4","Type":"ContainerDied","Data":"f3ffad554e20eff5197b42457a02457fd13c68925d3268617ad89f1ae4faec65"} Oct 08 18:14:16 crc kubenswrapper[4750]: I1008 18:14:16.401851 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g59fz" event={"ID":"e457a3fa-6164-4127-8a2d-ea7ea42e6da4","Type":"ContainerDied","Data":"279922cee161442094ad06e5fbd001556e28e11dad9bc773e06abf9505004fd5"} Oct 08 18:14:16 crc kubenswrapper[4750]: I1008 18:14:16.401872 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sxt5g" podUID="6dd4fe2f-ea7e-4a82-b868-375ec700de66" containerName="registry-server" containerID="cri-o://9cbd7aad6a016260bb4057b9f235512a62a22eb3013331100cb174df175eaef1" gracePeriod=2 Oct 08 18:14:16 crc kubenswrapper[4750]: I1008 18:14:16.401881 4750 scope.go:117] "RemoveContainer" containerID="f3ffad554e20eff5197b42457a02457fd13c68925d3268617ad89f1ae4faec65" Oct 08 18:14:16 crc kubenswrapper[4750]: I1008 18:14:16.424345 4750 scope.go:117] "RemoveContainer" containerID="3d6f87dd549e5e3baf4c8decf7628d999eab28fc170257afb83066bd01ce9dba" Oct 08 18:14:16 crc kubenswrapper[4750]: I1008 18:14:16.424804 4750 reconciler_common.go:293] "Volume detached for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e457a3fa-6164-4127-8a2d-ea7ea42e6da4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 18:14:16 crc kubenswrapper[4750]: I1008 18:14:16.442751 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g59fz"] Oct 08 18:14:16 crc kubenswrapper[4750]: I1008 18:14:16.445019 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-g59fz"] Oct 08 18:14:16 crc kubenswrapper[4750]: I1008 18:14:16.470598 4750 scope.go:117] "RemoveContainer" containerID="f49bfe1ee334121dcaa235a7ebe8e647114295d35b897a74c12c47e539d3f6c5" Oct 08 18:14:16 crc kubenswrapper[4750]: I1008 18:14:16.482401 4750 scope.go:117] "RemoveContainer" containerID="f3ffad554e20eff5197b42457a02457fd13c68925d3268617ad89f1ae4faec65" Oct 08 18:14:16 crc kubenswrapper[4750]: E1008 18:14:16.482813 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3ffad554e20eff5197b42457a02457fd13c68925d3268617ad89f1ae4faec65\": container with ID starting with f3ffad554e20eff5197b42457a02457fd13c68925d3268617ad89f1ae4faec65 not found: ID does not exist" containerID="f3ffad554e20eff5197b42457a02457fd13c68925d3268617ad89f1ae4faec65" Oct 08 18:14:16 crc kubenswrapper[4750]: I1008 18:14:16.482847 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3ffad554e20eff5197b42457a02457fd13c68925d3268617ad89f1ae4faec65"} err="failed to get container status \"f3ffad554e20eff5197b42457a02457fd13c68925d3268617ad89f1ae4faec65\": rpc error: code = NotFound desc = could not find container \"f3ffad554e20eff5197b42457a02457fd13c68925d3268617ad89f1ae4faec65\": container with ID starting with f3ffad554e20eff5197b42457a02457fd13c68925d3268617ad89f1ae4faec65 not found: ID does not exist" Oct 08 18:14:16 crc kubenswrapper[4750]: I1008 18:14:16.482884 4750 scope.go:117] 
"RemoveContainer" containerID="3d6f87dd549e5e3baf4c8decf7628d999eab28fc170257afb83066bd01ce9dba" Oct 08 18:14:16 crc kubenswrapper[4750]: E1008 18:14:16.483216 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d6f87dd549e5e3baf4c8decf7628d999eab28fc170257afb83066bd01ce9dba\": container with ID starting with 3d6f87dd549e5e3baf4c8decf7628d999eab28fc170257afb83066bd01ce9dba not found: ID does not exist" containerID="3d6f87dd549e5e3baf4c8decf7628d999eab28fc170257afb83066bd01ce9dba" Oct 08 18:14:16 crc kubenswrapper[4750]: I1008 18:14:16.483255 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d6f87dd549e5e3baf4c8decf7628d999eab28fc170257afb83066bd01ce9dba"} err="failed to get container status \"3d6f87dd549e5e3baf4c8decf7628d999eab28fc170257afb83066bd01ce9dba\": rpc error: code = NotFound desc = could not find container \"3d6f87dd549e5e3baf4c8decf7628d999eab28fc170257afb83066bd01ce9dba\": container with ID starting with 3d6f87dd549e5e3baf4c8decf7628d999eab28fc170257afb83066bd01ce9dba not found: ID does not exist" Oct 08 18:14:16 crc kubenswrapper[4750]: I1008 18:14:16.483285 4750 scope.go:117] "RemoveContainer" containerID="f49bfe1ee334121dcaa235a7ebe8e647114295d35b897a74c12c47e539d3f6c5" Oct 08 18:14:16 crc kubenswrapper[4750]: E1008 18:14:16.483772 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f49bfe1ee334121dcaa235a7ebe8e647114295d35b897a74c12c47e539d3f6c5\": container with ID starting with f49bfe1ee334121dcaa235a7ebe8e647114295d35b897a74c12c47e539d3f6c5 not found: ID does not exist" containerID="f49bfe1ee334121dcaa235a7ebe8e647114295d35b897a74c12c47e539d3f6c5" Oct 08 18:14:16 crc kubenswrapper[4750]: I1008 18:14:16.483797 4750 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f49bfe1ee334121dcaa235a7ebe8e647114295d35b897a74c12c47e539d3f6c5"} err="failed to get container status \"f49bfe1ee334121dcaa235a7ebe8e647114295d35b897a74c12c47e539d3f6c5\": rpc error: code = NotFound desc = could not find container \"f49bfe1ee334121dcaa235a7ebe8e647114295d35b897a74c12c47e539d3f6c5\": container with ID starting with f49bfe1ee334121dcaa235a7ebe8e647114295d35b897a74c12c47e539d3f6c5 not found: ID does not exist" Oct 08 18:14:16 crc kubenswrapper[4750]: I1008 18:14:16.741483 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e457a3fa-6164-4127-8a2d-ea7ea42e6da4" path="/var/lib/kubelet/pods/e457a3fa-6164-4127-8a2d-ea7ea42e6da4/volumes" Oct 08 18:14:17 crc kubenswrapper[4750]: I1008 18:14:17.220924 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sxt5g" Oct 08 18:14:17 crc kubenswrapper[4750]: I1008 18:14:17.337904 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dd4fe2f-ea7e-4a82-b868-375ec700de66-utilities\") pod \"6dd4fe2f-ea7e-4a82-b868-375ec700de66\" (UID: \"6dd4fe2f-ea7e-4a82-b868-375ec700de66\") " Oct 08 18:14:17 crc kubenswrapper[4750]: I1008 18:14:17.337962 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vfrd\" (UniqueName: \"kubernetes.io/projected/6dd4fe2f-ea7e-4a82-b868-375ec700de66-kube-api-access-2vfrd\") pod \"6dd4fe2f-ea7e-4a82-b868-375ec700de66\" (UID: \"6dd4fe2f-ea7e-4a82-b868-375ec700de66\") " Oct 08 18:14:17 crc kubenswrapper[4750]: I1008 18:14:17.338081 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dd4fe2f-ea7e-4a82-b868-375ec700de66-catalog-content\") pod \"6dd4fe2f-ea7e-4a82-b868-375ec700de66\" (UID: \"6dd4fe2f-ea7e-4a82-b868-375ec700de66\") " Oct 08 18:14:17 crc 
kubenswrapper[4750]: I1008 18:14:17.338895 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dd4fe2f-ea7e-4a82-b868-375ec700de66-utilities" (OuterVolumeSpecName: "utilities") pod "6dd4fe2f-ea7e-4a82-b868-375ec700de66" (UID: "6dd4fe2f-ea7e-4a82-b868-375ec700de66"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:14:17 crc kubenswrapper[4750]: I1008 18:14:17.342360 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dd4fe2f-ea7e-4a82-b868-375ec700de66-kube-api-access-2vfrd" (OuterVolumeSpecName: "kube-api-access-2vfrd") pod "6dd4fe2f-ea7e-4a82-b868-375ec700de66" (UID: "6dd4fe2f-ea7e-4a82-b868-375ec700de66"). InnerVolumeSpecName "kube-api-access-2vfrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:14:17 crc kubenswrapper[4750]: I1008 18:14:17.381140 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dd4fe2f-ea7e-4a82-b868-375ec700de66-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6dd4fe2f-ea7e-4a82-b868-375ec700de66" (UID: "6dd4fe2f-ea7e-4a82-b868-375ec700de66"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:14:17 crc kubenswrapper[4750]: I1008 18:14:17.409347 4750 generic.go:334] "Generic (PLEG): container finished" podID="6dd4fe2f-ea7e-4a82-b868-375ec700de66" containerID="9cbd7aad6a016260bb4057b9f235512a62a22eb3013331100cb174df175eaef1" exitCode=0 Oct 08 18:14:17 crc kubenswrapper[4750]: I1008 18:14:17.409409 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxt5g" event={"ID":"6dd4fe2f-ea7e-4a82-b868-375ec700de66","Type":"ContainerDied","Data":"9cbd7aad6a016260bb4057b9f235512a62a22eb3013331100cb174df175eaef1"} Oct 08 18:14:17 crc kubenswrapper[4750]: I1008 18:14:17.409437 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxt5g" event={"ID":"6dd4fe2f-ea7e-4a82-b868-375ec700de66","Type":"ContainerDied","Data":"f80e2f9cf28bf5b48fbbb5c410b8a7ff5996c138999224c09f89725cf262e1f2"} Oct 08 18:14:17 crc kubenswrapper[4750]: I1008 18:14:17.409453 4750 scope.go:117] "RemoveContainer" containerID="9cbd7aad6a016260bb4057b9f235512a62a22eb3013331100cb174df175eaef1" Oct 08 18:14:17 crc kubenswrapper[4750]: I1008 18:14:17.409570 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sxt5g" Oct 08 18:14:17 crc kubenswrapper[4750]: I1008 18:14:17.432452 4750 scope.go:117] "RemoveContainer" containerID="46956cca1215682668278e3b1532056ce8f892f144188136e809d38fcee31847" Oct 08 18:14:17 crc kubenswrapper[4750]: I1008 18:14:17.434040 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sxt5g"] Oct 08 18:14:17 crc kubenswrapper[4750]: I1008 18:14:17.437995 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sxt5g"] Oct 08 18:14:17 crc kubenswrapper[4750]: I1008 18:14:17.439412 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vfrd\" (UniqueName: \"kubernetes.io/projected/6dd4fe2f-ea7e-4a82-b868-375ec700de66-kube-api-access-2vfrd\") on node \"crc\" DevicePath \"\"" Oct 08 18:14:17 crc kubenswrapper[4750]: I1008 18:14:17.439437 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dd4fe2f-ea7e-4a82-b868-375ec700de66-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 18:14:17 crc kubenswrapper[4750]: I1008 18:14:17.439452 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dd4fe2f-ea7e-4a82-b868-375ec700de66-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 18:14:17 crc kubenswrapper[4750]: I1008 18:14:17.455351 4750 scope.go:117] "RemoveContainer" containerID="12dd6680b9f0fd3260ebb76d86a40d6bee6eacf2778877f022d2be8cc9289060" Oct 08 18:14:17 crc kubenswrapper[4750]: I1008 18:14:17.466141 4750 scope.go:117] "RemoveContainer" containerID="9cbd7aad6a016260bb4057b9f235512a62a22eb3013331100cb174df175eaef1" Oct 08 18:14:17 crc kubenswrapper[4750]: E1008 18:14:17.466410 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"9cbd7aad6a016260bb4057b9f235512a62a22eb3013331100cb174df175eaef1\": container with ID starting with 9cbd7aad6a016260bb4057b9f235512a62a22eb3013331100cb174df175eaef1 not found: ID does not exist" containerID="9cbd7aad6a016260bb4057b9f235512a62a22eb3013331100cb174df175eaef1" Oct 08 18:14:17 crc kubenswrapper[4750]: I1008 18:14:17.466454 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cbd7aad6a016260bb4057b9f235512a62a22eb3013331100cb174df175eaef1"} err="failed to get container status \"9cbd7aad6a016260bb4057b9f235512a62a22eb3013331100cb174df175eaef1\": rpc error: code = NotFound desc = could not find container \"9cbd7aad6a016260bb4057b9f235512a62a22eb3013331100cb174df175eaef1\": container with ID starting with 9cbd7aad6a016260bb4057b9f235512a62a22eb3013331100cb174df175eaef1 not found: ID does not exist" Oct 08 18:14:17 crc kubenswrapper[4750]: I1008 18:14:17.466481 4750 scope.go:117] "RemoveContainer" containerID="46956cca1215682668278e3b1532056ce8f892f144188136e809d38fcee31847" Oct 08 18:14:17 crc kubenswrapper[4750]: E1008 18:14:17.466805 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46956cca1215682668278e3b1532056ce8f892f144188136e809d38fcee31847\": container with ID starting with 46956cca1215682668278e3b1532056ce8f892f144188136e809d38fcee31847 not found: ID does not exist" containerID="46956cca1215682668278e3b1532056ce8f892f144188136e809d38fcee31847" Oct 08 18:14:17 crc kubenswrapper[4750]: I1008 18:14:17.466832 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46956cca1215682668278e3b1532056ce8f892f144188136e809d38fcee31847"} err="failed to get container status \"46956cca1215682668278e3b1532056ce8f892f144188136e809d38fcee31847\": rpc error: code = NotFound desc = could not find container \"46956cca1215682668278e3b1532056ce8f892f144188136e809d38fcee31847\": container with ID 
starting with 46956cca1215682668278e3b1532056ce8f892f144188136e809d38fcee31847 not found: ID does not exist" Oct 08 18:14:17 crc kubenswrapper[4750]: I1008 18:14:17.466859 4750 scope.go:117] "RemoveContainer" containerID="12dd6680b9f0fd3260ebb76d86a40d6bee6eacf2778877f022d2be8cc9289060" Oct 08 18:14:17 crc kubenswrapper[4750]: E1008 18:14:17.467153 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12dd6680b9f0fd3260ebb76d86a40d6bee6eacf2778877f022d2be8cc9289060\": container with ID starting with 12dd6680b9f0fd3260ebb76d86a40d6bee6eacf2778877f022d2be8cc9289060 not found: ID does not exist" containerID="12dd6680b9f0fd3260ebb76d86a40d6bee6eacf2778877f022d2be8cc9289060" Oct 08 18:14:17 crc kubenswrapper[4750]: I1008 18:14:17.467182 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12dd6680b9f0fd3260ebb76d86a40d6bee6eacf2778877f022d2be8cc9289060"} err="failed to get container status \"12dd6680b9f0fd3260ebb76d86a40d6bee6eacf2778877f022d2be8cc9289060\": rpc error: code = NotFound desc = could not find container \"12dd6680b9f0fd3260ebb76d86a40d6bee6eacf2778877f022d2be8cc9289060\": container with ID starting with 12dd6680b9f0fd3260ebb76d86a40d6bee6eacf2778877f022d2be8cc9289060 not found: ID does not exist" Oct 08 18:14:17 crc kubenswrapper[4750]: I1008 18:14:17.532689 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lrm8v"] Oct 08 18:14:17 crc kubenswrapper[4750]: I1008 18:14:17.533115 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lrm8v" podUID="e9739a0d-d23a-4a49-8fe4-6257a3948210" containerName="registry-server" containerID="cri-o://6573a6c7f0eaf4f039806ef2733137cd2d7a2ba9dbae7fad254f815a313a35eb" gracePeriod=2 Oct 08 18:14:17 crc kubenswrapper[4750]: I1008 18:14:17.850775 4750 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lrm8v" Oct 08 18:14:17 crc kubenswrapper[4750]: I1008 18:14:17.947873 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9739a0d-d23a-4a49-8fe4-6257a3948210-catalog-content\") pod \"e9739a0d-d23a-4a49-8fe4-6257a3948210\" (UID: \"e9739a0d-d23a-4a49-8fe4-6257a3948210\") " Oct 08 18:14:17 crc kubenswrapper[4750]: I1008 18:14:17.947974 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hq6xj\" (UniqueName: \"kubernetes.io/projected/e9739a0d-d23a-4a49-8fe4-6257a3948210-kube-api-access-hq6xj\") pod \"e9739a0d-d23a-4a49-8fe4-6257a3948210\" (UID: \"e9739a0d-d23a-4a49-8fe4-6257a3948210\") " Oct 08 18:14:17 crc kubenswrapper[4750]: I1008 18:14:17.947995 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9739a0d-d23a-4a49-8fe4-6257a3948210-utilities\") pod \"e9739a0d-d23a-4a49-8fe4-6257a3948210\" (UID: \"e9739a0d-d23a-4a49-8fe4-6257a3948210\") " Oct 08 18:14:17 crc kubenswrapper[4750]: I1008 18:14:17.948768 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9739a0d-d23a-4a49-8fe4-6257a3948210-utilities" (OuterVolumeSpecName: "utilities") pod "e9739a0d-d23a-4a49-8fe4-6257a3948210" (UID: "e9739a0d-d23a-4a49-8fe4-6257a3948210"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:14:17 crc kubenswrapper[4750]: I1008 18:14:17.952150 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9739a0d-d23a-4a49-8fe4-6257a3948210-kube-api-access-hq6xj" (OuterVolumeSpecName: "kube-api-access-hq6xj") pod "e9739a0d-d23a-4a49-8fe4-6257a3948210" (UID: "e9739a0d-d23a-4a49-8fe4-6257a3948210"). InnerVolumeSpecName "kube-api-access-hq6xj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:14:17 crc kubenswrapper[4750]: I1008 18:14:17.959804 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9739a0d-d23a-4a49-8fe4-6257a3948210-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e9739a0d-d23a-4a49-8fe4-6257a3948210" (UID: "e9739a0d-d23a-4a49-8fe4-6257a3948210"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:14:18 crc kubenswrapper[4750]: I1008 18:14:18.048961 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hq6xj\" (UniqueName: \"kubernetes.io/projected/e9739a0d-d23a-4a49-8fe4-6257a3948210-kube-api-access-hq6xj\") on node \"crc\" DevicePath \"\"" Oct 08 18:14:18 crc kubenswrapper[4750]: I1008 18:14:18.048997 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9739a0d-d23a-4a49-8fe4-6257a3948210-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 18:14:18 crc kubenswrapper[4750]: I1008 18:14:18.049007 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9739a0d-d23a-4a49-8fe4-6257a3948210-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 18:14:18 crc kubenswrapper[4750]: I1008 18:14:18.422332 4750 generic.go:334] "Generic (PLEG): container finished" podID="e9739a0d-d23a-4a49-8fe4-6257a3948210" containerID="6573a6c7f0eaf4f039806ef2733137cd2d7a2ba9dbae7fad254f815a313a35eb" exitCode=0 Oct 08 18:14:18 crc kubenswrapper[4750]: I1008 18:14:18.422384 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lrm8v" event={"ID":"e9739a0d-d23a-4a49-8fe4-6257a3948210","Type":"ContainerDied","Data":"6573a6c7f0eaf4f039806ef2733137cd2d7a2ba9dbae7fad254f815a313a35eb"} Oct 08 18:14:18 crc kubenswrapper[4750]: I1008 18:14:18.422407 4750 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lrm8v" Oct 08 18:14:18 crc kubenswrapper[4750]: I1008 18:14:18.422437 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lrm8v" event={"ID":"e9739a0d-d23a-4a49-8fe4-6257a3948210","Type":"ContainerDied","Data":"f53140f9f55fddd38f1ba7835d563807df40a7580a87e6a7d88e9dda0d3f7b65"} Oct 08 18:14:18 crc kubenswrapper[4750]: I1008 18:14:18.422477 4750 scope.go:117] "RemoveContainer" containerID="6573a6c7f0eaf4f039806ef2733137cd2d7a2ba9dbae7fad254f815a313a35eb" Oct 08 18:14:18 crc kubenswrapper[4750]: I1008 18:14:18.437104 4750 scope.go:117] "RemoveContainer" containerID="ad053b47e0931c1de657df50971661cb553ce6dc72c1bff50882cb24e09ea929" Oct 08 18:14:18 crc kubenswrapper[4750]: I1008 18:14:18.447353 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lrm8v"] Oct 08 18:14:18 crc kubenswrapper[4750]: I1008 18:14:18.451301 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lrm8v"] Oct 08 18:14:18 crc kubenswrapper[4750]: I1008 18:14:18.470824 4750 scope.go:117] "RemoveContainer" containerID="02412be4d72c099b7d82b8e98b4a83d5583a135785864b06508df10d82979c30" Oct 08 18:14:18 crc kubenswrapper[4750]: I1008 18:14:18.482683 4750 scope.go:117] "RemoveContainer" containerID="6573a6c7f0eaf4f039806ef2733137cd2d7a2ba9dbae7fad254f815a313a35eb" Oct 08 18:14:18 crc kubenswrapper[4750]: E1008 18:14:18.482997 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6573a6c7f0eaf4f039806ef2733137cd2d7a2ba9dbae7fad254f815a313a35eb\": container with ID starting with 6573a6c7f0eaf4f039806ef2733137cd2d7a2ba9dbae7fad254f815a313a35eb not found: ID does not exist" containerID="6573a6c7f0eaf4f039806ef2733137cd2d7a2ba9dbae7fad254f815a313a35eb" Oct 08 18:14:18 crc kubenswrapper[4750]: I1008 18:14:18.483038 4750 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6573a6c7f0eaf4f039806ef2733137cd2d7a2ba9dbae7fad254f815a313a35eb"} err="failed to get container status \"6573a6c7f0eaf4f039806ef2733137cd2d7a2ba9dbae7fad254f815a313a35eb\": rpc error: code = NotFound desc = could not find container \"6573a6c7f0eaf4f039806ef2733137cd2d7a2ba9dbae7fad254f815a313a35eb\": container with ID starting with 6573a6c7f0eaf4f039806ef2733137cd2d7a2ba9dbae7fad254f815a313a35eb not found: ID does not exist" Oct 08 18:14:18 crc kubenswrapper[4750]: I1008 18:14:18.483068 4750 scope.go:117] "RemoveContainer" containerID="ad053b47e0931c1de657df50971661cb553ce6dc72c1bff50882cb24e09ea929" Oct 08 18:14:18 crc kubenswrapper[4750]: E1008 18:14:18.483348 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad053b47e0931c1de657df50971661cb553ce6dc72c1bff50882cb24e09ea929\": container with ID starting with ad053b47e0931c1de657df50971661cb553ce6dc72c1bff50882cb24e09ea929 not found: ID does not exist" containerID="ad053b47e0931c1de657df50971661cb553ce6dc72c1bff50882cb24e09ea929" Oct 08 18:14:18 crc kubenswrapper[4750]: I1008 18:14:18.483369 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad053b47e0931c1de657df50971661cb553ce6dc72c1bff50882cb24e09ea929"} err="failed to get container status \"ad053b47e0931c1de657df50971661cb553ce6dc72c1bff50882cb24e09ea929\": rpc error: code = NotFound desc = could not find container \"ad053b47e0931c1de657df50971661cb553ce6dc72c1bff50882cb24e09ea929\": container with ID starting with ad053b47e0931c1de657df50971661cb553ce6dc72c1bff50882cb24e09ea929 not found: ID does not exist" Oct 08 18:14:18 crc kubenswrapper[4750]: I1008 18:14:18.483381 4750 scope.go:117] "RemoveContainer" containerID="02412be4d72c099b7d82b8e98b4a83d5583a135785864b06508df10d82979c30" Oct 08 18:14:18 crc kubenswrapper[4750]: E1008 
18:14:18.483672 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02412be4d72c099b7d82b8e98b4a83d5583a135785864b06508df10d82979c30\": container with ID starting with 02412be4d72c099b7d82b8e98b4a83d5583a135785864b06508df10d82979c30 not found: ID does not exist" containerID="02412be4d72c099b7d82b8e98b4a83d5583a135785864b06508df10d82979c30" Oct 08 18:14:18 crc kubenswrapper[4750]: I1008 18:14:18.483694 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02412be4d72c099b7d82b8e98b4a83d5583a135785864b06508df10d82979c30"} err="failed to get container status \"02412be4d72c099b7d82b8e98b4a83d5583a135785864b06508df10d82979c30\": rpc error: code = NotFound desc = could not find container \"02412be4d72c099b7d82b8e98b4a83d5583a135785864b06508df10d82979c30\": container with ID starting with 02412be4d72c099b7d82b8e98b4a83d5583a135785864b06508df10d82979c30 not found: ID does not exist" Oct 08 18:14:18 crc kubenswrapper[4750]: I1008 18:14:18.741690 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dd4fe2f-ea7e-4a82-b868-375ec700de66" path="/var/lib/kubelet/pods/6dd4fe2f-ea7e-4a82-b868-375ec700de66/volumes" Oct 08 18:14:18 crc kubenswrapper[4750]: I1008 18:14:18.742494 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9739a0d-d23a-4a49-8fe4-6257a3948210" path="/var/lib/kubelet/pods/e9739a0d-d23a-4a49-8fe4-6257a3948210/volumes" Oct 08 18:14:24 crc kubenswrapper[4750]: I1008 18:14:24.777768 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-62m4q"] Oct 08 18:14:29 crc kubenswrapper[4750]: I1008 18:14:29.706975 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 18:14:29 crc kubenswrapper[4750]: I1008 18:14:29.707235 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 18:14:29 crc kubenswrapper[4750]: I1008 18:14:29.707287 4750 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-grddb" Oct 08 18:14:29 crc kubenswrapper[4750]: I1008 18:14:29.707777 4750 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1b75c94360320f8907419e9d7b9a237989702c6c66dd2e87832e18c2c4570c3f"} pod="openshift-machine-config-operator/machine-config-daemon-grddb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 18:14:29 crc kubenswrapper[4750]: I1008 18:14:29.707836 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" containerID="cri-o://1b75c94360320f8907419e9d7b9a237989702c6c66dd2e87832e18c2c4570c3f" gracePeriod=600 Oct 08 18:14:30 crc kubenswrapper[4750]: I1008 18:14:30.482435 4750 generic.go:334] "Generic (PLEG): container finished" podID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerID="1b75c94360320f8907419e9d7b9a237989702c6c66dd2e87832e18c2c4570c3f" exitCode=0 Oct 08 18:14:30 crc kubenswrapper[4750]: I1008 18:14:30.482479 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" 
event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerDied","Data":"1b75c94360320f8907419e9d7b9a237989702c6c66dd2e87832e18c2c4570c3f"} Oct 08 18:14:30 crc kubenswrapper[4750]: I1008 18:14:30.482506 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerStarted","Data":"66ef4ffb40ab7463261e1244d353dfe197b84351b1cf3ab02fb7a03a2d706a56"} Oct 08 18:14:49 crc kubenswrapper[4750]: I1008 18:14:49.815875 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" podUID="88aafa5f-15c0-43af-80be-1c01d844c9c9" containerName="oauth-openshift" containerID="cri-o://0d605adf3e0bb6b71ae476ac906730201231e451b172b21791cc5c251f40d9c8" gracePeriod=15 Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.212835 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.241381 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7c4675448c-gg4cw"] Oct 08 18:14:50 crc kubenswrapper[4750]: E1008 18:14:50.241592 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4d14217-c4c4-4268-9f28-3ff28ff9b095" containerName="extract-utilities" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.241604 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4d14217-c4c4-4268-9f28-3ff28ff9b095" containerName="extract-utilities" Oct 08 18:14:50 crc kubenswrapper[4750]: E1008 18:14:50.241614 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9739a0d-d23a-4a49-8fe4-6257a3948210" containerName="extract-utilities" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.241622 4750 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e9739a0d-d23a-4a49-8fe4-6257a3948210" containerName="extract-utilities" Oct 08 18:14:50 crc kubenswrapper[4750]: E1008 18:14:50.241632 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e457a3fa-6164-4127-8a2d-ea7ea42e6da4" containerName="registry-server" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.241638 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="e457a3fa-6164-4127-8a2d-ea7ea42e6da4" containerName="registry-server" Oct 08 18:14:50 crc kubenswrapper[4750]: E1008 18:14:50.241646 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4d14217-c4c4-4268-9f28-3ff28ff9b095" containerName="extract-content" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.241651 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4d14217-c4c4-4268-9f28-3ff28ff9b095" containerName="extract-content" Oct 08 18:14:50 crc kubenswrapper[4750]: E1008 18:14:50.241660 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9739a0d-d23a-4a49-8fe4-6257a3948210" containerName="extract-content" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.241666 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9739a0d-d23a-4a49-8fe4-6257a3948210" containerName="extract-content" Oct 08 18:14:50 crc kubenswrapper[4750]: E1008 18:14:50.241677 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78f32507-fe14-474c-9e4b-ad45685df6f6" containerName="pruner" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.241684 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="78f32507-fe14-474c-9e4b-ad45685df6f6" containerName="pruner" Oct 08 18:14:50 crc kubenswrapper[4750]: E1008 18:14:50.241693 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88aafa5f-15c0-43af-80be-1c01d844c9c9" containerName="oauth-openshift" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.241699 4750 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="88aafa5f-15c0-43af-80be-1c01d844c9c9" containerName="oauth-openshift" Oct 08 18:14:50 crc kubenswrapper[4750]: E1008 18:14:50.241710 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dd4fe2f-ea7e-4a82-b868-375ec700de66" containerName="registry-server" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.241716 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dd4fe2f-ea7e-4a82-b868-375ec700de66" containerName="registry-server" Oct 08 18:14:50 crc kubenswrapper[4750]: E1008 18:14:50.241723 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dd4fe2f-ea7e-4a82-b868-375ec700de66" containerName="extract-utilities" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.241730 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dd4fe2f-ea7e-4a82-b868-375ec700de66" containerName="extract-utilities" Oct 08 18:14:50 crc kubenswrapper[4750]: E1008 18:14:50.241744 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e457a3fa-6164-4127-8a2d-ea7ea42e6da4" containerName="extract-content" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.241753 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="e457a3fa-6164-4127-8a2d-ea7ea42e6da4" containerName="extract-content" Oct 08 18:14:50 crc kubenswrapper[4750]: E1008 18:14:50.241762 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e457a3fa-6164-4127-8a2d-ea7ea42e6da4" containerName="extract-utilities" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.241768 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="e457a3fa-6164-4127-8a2d-ea7ea42e6da4" containerName="extract-utilities" Oct 08 18:14:50 crc kubenswrapper[4750]: E1008 18:14:50.241774 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dd4fe2f-ea7e-4a82-b868-375ec700de66" containerName="extract-content" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.241780 4750 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6dd4fe2f-ea7e-4a82-b868-375ec700de66" containerName="extract-content" Oct 08 18:14:50 crc kubenswrapper[4750]: E1008 18:14:50.241789 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4d14217-c4c4-4268-9f28-3ff28ff9b095" containerName="registry-server" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.241795 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4d14217-c4c4-4268-9f28-3ff28ff9b095" containerName="registry-server" Oct 08 18:14:50 crc kubenswrapper[4750]: E1008 18:14:50.241803 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9739a0d-d23a-4a49-8fe4-6257a3948210" containerName="registry-server" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.241808 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9739a0d-d23a-4a49-8fe4-6257a3948210" containerName="registry-server" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.241884 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="e457a3fa-6164-4127-8a2d-ea7ea42e6da4" containerName="registry-server" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.241895 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="88aafa5f-15c0-43af-80be-1c01d844c9c9" containerName="oauth-openshift" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.241902 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4d14217-c4c4-4268-9f28-3ff28ff9b095" containerName="registry-server" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.241913 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dd4fe2f-ea7e-4a82-b868-375ec700de66" containerName="registry-server" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.241922 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9739a0d-d23a-4a49-8fe4-6257a3948210" containerName="registry-server" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.241928 4750 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="78f32507-fe14-474c-9e4b-ad45685df6f6" containerName="pruner" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.242260 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.252106 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-system-service-ca\") pod \"88aafa5f-15c0-43af-80be-1c01d844c9c9\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.252160 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-system-session\") pod \"88aafa5f-15c0-43af-80be-1c01d844c9c9\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.252192 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjxt8\" (UniqueName: \"kubernetes.io/projected/88aafa5f-15c0-43af-80be-1c01d844c9c9-kube-api-access-tjxt8\") pod \"88aafa5f-15c0-43af-80be-1c01d844c9c9\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.252237 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/88aafa5f-15c0-43af-80be-1c01d844c9c9-audit-policies\") pod \"88aafa5f-15c0-43af-80be-1c01d844c9c9\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.252263 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/88aafa5f-15c0-43af-80be-1c01d844c9c9-audit-dir\") pod \"88aafa5f-15c0-43af-80be-1c01d844c9c9\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.252282 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-user-template-error\") pod \"88aafa5f-15c0-43af-80be-1c01d844c9c9\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.252302 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-system-trusted-ca-bundle\") pod \"88aafa5f-15c0-43af-80be-1c01d844c9c9\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.252328 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-system-serving-cert\") pod \"88aafa5f-15c0-43af-80be-1c01d844c9c9\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.252347 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-user-idp-0-file-data\") pod \"88aafa5f-15c0-43af-80be-1c01d844c9c9\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.252367 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-user-template-login\") pod \"88aafa5f-15c0-43af-80be-1c01d844c9c9\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.252398 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-user-template-provider-selection\") pod \"88aafa5f-15c0-43af-80be-1c01d844c9c9\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.252422 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-system-cliconfig\") pod \"88aafa5f-15c0-43af-80be-1c01d844c9c9\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.252438 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-system-ocp-branding-template\") pod \"88aafa5f-15c0-43af-80be-1c01d844c9c9\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.252458 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-system-router-certs\") pod \"88aafa5f-15c0-43af-80be-1c01d844c9c9\" (UID: \"88aafa5f-15c0-43af-80be-1c01d844c9c9\") " Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.253233 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "88aafa5f-15c0-43af-80be-1c01d844c9c9" (UID: "88aafa5f-15c0-43af-80be-1c01d844c9c9"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.253465 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88aafa5f-15c0-43af-80be-1c01d844c9c9-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "88aafa5f-15c0-43af-80be-1c01d844c9c9" (UID: "88aafa5f-15c0-43af-80be-1c01d844c9c9"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.258603 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88aafa5f-15c0-43af-80be-1c01d844c9c9-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "88aafa5f-15c0-43af-80be-1c01d844c9c9" (UID: "88aafa5f-15c0-43af-80be-1c01d844c9c9"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.258628 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "88aafa5f-15c0-43af-80be-1c01d844c9c9" (UID: "88aafa5f-15c0-43af-80be-1c01d844c9c9"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.259678 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7c4675448c-gg4cw"] Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.260709 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "88aafa5f-15c0-43af-80be-1c01d844c9c9" (UID: "88aafa5f-15c0-43af-80be-1c01d844c9c9"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.270197 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88aafa5f-15c0-43af-80be-1c01d844c9c9-kube-api-access-tjxt8" (OuterVolumeSpecName: "kube-api-access-tjxt8") pod "88aafa5f-15c0-43af-80be-1c01d844c9c9" (UID: "88aafa5f-15c0-43af-80be-1c01d844c9c9"). InnerVolumeSpecName "kube-api-access-tjxt8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.270225 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "88aafa5f-15c0-43af-80be-1c01d844c9c9" (UID: "88aafa5f-15c0-43af-80be-1c01d844c9c9"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.270508 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "88aafa5f-15c0-43af-80be-1c01d844c9c9" (UID: "88aafa5f-15c0-43af-80be-1c01d844c9c9"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.275194 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "88aafa5f-15c0-43af-80be-1c01d844c9c9" (UID: "88aafa5f-15c0-43af-80be-1c01d844c9c9"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.276292 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "88aafa5f-15c0-43af-80be-1c01d844c9c9" (UID: "88aafa5f-15c0-43af-80be-1c01d844c9c9"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.286967 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "88aafa5f-15c0-43af-80be-1c01d844c9c9" (UID: "88aafa5f-15c0-43af-80be-1c01d844c9c9"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.287421 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "88aafa5f-15c0-43af-80be-1c01d844c9c9" (UID: "88aafa5f-15c0-43af-80be-1c01d844c9c9"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.287743 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "88aafa5f-15c0-43af-80be-1c01d844c9c9" (UID: "88aafa5f-15c0-43af-80be-1c01d844c9c9"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.288003 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "88aafa5f-15c0-43af-80be-1c01d844c9c9" (UID: "88aafa5f-15c0-43af-80be-1c01d844c9c9"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.355175 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ed493f58-fee2-4d1c-98fc-48d836548ec6-v4-0-config-system-service-ca\") pod \"oauth-openshift-7c4675448c-gg4cw\" (UID: \"ed493f58-fee2-4d1c-98fc-48d836548ec6\") " pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.355317 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ed493f58-fee2-4d1c-98fc-48d836548ec6-v4-0-config-system-session\") pod \"oauth-openshift-7c4675448c-gg4cw\" (UID: \"ed493f58-fee2-4d1c-98fc-48d836548ec6\") " pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.355367 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ed493f58-fee2-4d1c-98fc-48d836548ec6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7c4675448c-gg4cw\" (UID: \"ed493f58-fee2-4d1c-98fc-48d836548ec6\") " pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.355438 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ed493f58-fee2-4d1c-98fc-48d836548ec6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7c4675448c-gg4cw\" (UID: \"ed493f58-fee2-4d1c-98fc-48d836548ec6\") " pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.355488 4750 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed493f58-fee2-4d1c-98fc-48d836548ec6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7c4675448c-gg4cw\" (UID: \"ed493f58-fee2-4d1c-98fc-48d836548ec6\") " pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.355536 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ed493f58-fee2-4d1c-98fc-48d836548ec6-v4-0-config-user-template-login\") pod \"oauth-openshift-7c4675448c-gg4cw\" (UID: \"ed493f58-fee2-4d1c-98fc-48d836548ec6\") " pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.355627 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z547n\" (UniqueName: \"kubernetes.io/projected/ed493f58-fee2-4d1c-98fc-48d836548ec6-kube-api-access-z547n\") pod \"oauth-openshift-7c4675448c-gg4cw\" (UID: \"ed493f58-fee2-4d1c-98fc-48d836548ec6\") " pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.355679 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ed493f58-fee2-4d1c-98fc-48d836548ec6-audit-dir\") pod \"oauth-openshift-7c4675448c-gg4cw\" (UID: \"ed493f58-fee2-4d1c-98fc-48d836548ec6\") " pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.355737 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/ed493f58-fee2-4d1c-98fc-48d836548ec6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7c4675448c-gg4cw\" (UID: \"ed493f58-fee2-4d1c-98fc-48d836548ec6\") " pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.356242 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ed493f58-fee2-4d1c-98fc-48d836548ec6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7c4675448c-gg4cw\" (UID: \"ed493f58-fee2-4d1c-98fc-48d836548ec6\") " pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.356311 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ed493f58-fee2-4d1c-98fc-48d836548ec6-v4-0-config-system-router-certs\") pod \"oauth-openshift-7c4675448c-gg4cw\" (UID: \"ed493f58-fee2-4d1c-98fc-48d836548ec6\") " pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.356350 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ed493f58-fee2-4d1c-98fc-48d836548ec6-v4-0-config-user-template-error\") pod \"oauth-openshift-7c4675448c-gg4cw\" (UID: \"ed493f58-fee2-4d1c-98fc-48d836548ec6\") " pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.356385 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ed493f58-fee2-4d1c-98fc-48d836548ec6-audit-policies\") pod \"oauth-openshift-7c4675448c-gg4cw\" (UID: 
\"ed493f58-fee2-4d1c-98fc-48d836548ec6\") " pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.356431 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ed493f58-fee2-4d1c-98fc-48d836548ec6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7c4675448c-gg4cw\" (UID: \"ed493f58-fee2-4d1c-98fc-48d836548ec6\") " pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.356587 4750 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.356618 4750 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.356640 4750 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.356660 4750 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.356680 4750 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.356699 4750 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.356719 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjxt8\" (UniqueName: \"kubernetes.io/projected/88aafa5f-15c0-43af-80be-1c01d844c9c9-kube-api-access-tjxt8\") on node \"crc\" DevicePath \"\"" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.356742 4750 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/88aafa5f-15c0-43af-80be-1c01d844c9c9-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.356767 4750 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/88aafa5f-15c0-43af-80be-1c01d844c9c9-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.356792 4750 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.356815 4750 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.356837 4750 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.356856 4750 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.356875 4750 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/88aafa5f-15c0-43af-80be-1c01d844c9c9-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.458476 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ed493f58-fee2-4d1c-98fc-48d836548ec6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7c4675448c-gg4cw\" (UID: \"ed493f58-fee2-4d1c-98fc-48d836548ec6\") " pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.458820 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ed493f58-fee2-4d1c-98fc-48d836548ec6-v4-0-config-system-service-ca\") pod \"oauth-openshift-7c4675448c-gg4cw\" (UID: \"ed493f58-fee2-4d1c-98fc-48d836548ec6\") " pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.458882 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ed493f58-fee2-4d1c-98fc-48d836548ec6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7c4675448c-gg4cw\" (UID: 
\"ed493f58-fee2-4d1c-98fc-48d836548ec6\") " pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.458977 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ed493f58-fee2-4d1c-98fc-48d836548ec6-v4-0-config-system-session\") pod \"oauth-openshift-7c4675448c-gg4cw\" (UID: \"ed493f58-fee2-4d1c-98fc-48d836548ec6\") " pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.459106 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ed493f58-fee2-4d1c-98fc-48d836548ec6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7c4675448c-gg4cw\" (UID: \"ed493f58-fee2-4d1c-98fc-48d836548ec6\") " pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.459226 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed493f58-fee2-4d1c-98fc-48d836548ec6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7c4675448c-gg4cw\" (UID: \"ed493f58-fee2-4d1c-98fc-48d836548ec6\") " pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.459457 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ed493f58-fee2-4d1c-98fc-48d836548ec6-v4-0-config-user-template-login\") pod \"oauth-openshift-7c4675448c-gg4cw\" (UID: \"ed493f58-fee2-4d1c-98fc-48d836548ec6\") " pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.459516 4750 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-z547n\" (UniqueName: \"kubernetes.io/projected/ed493f58-fee2-4d1c-98fc-48d836548ec6-kube-api-access-z547n\") pod \"oauth-openshift-7c4675448c-gg4cw\" (UID: \"ed493f58-fee2-4d1c-98fc-48d836548ec6\") " pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.459614 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ed493f58-fee2-4d1c-98fc-48d836548ec6-audit-dir\") pod \"oauth-openshift-7c4675448c-gg4cw\" (UID: \"ed493f58-fee2-4d1c-98fc-48d836548ec6\") " pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.459666 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ed493f58-fee2-4d1c-98fc-48d836548ec6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7c4675448c-gg4cw\" (UID: \"ed493f58-fee2-4d1c-98fc-48d836548ec6\") " pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.459779 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ed493f58-fee2-4d1c-98fc-48d836548ec6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7c4675448c-gg4cw\" (UID: \"ed493f58-fee2-4d1c-98fc-48d836548ec6\") " pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.459851 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ed493f58-fee2-4d1c-98fc-48d836548ec6-v4-0-config-system-router-certs\") pod \"oauth-openshift-7c4675448c-gg4cw\" (UID: 
\"ed493f58-fee2-4d1c-98fc-48d836548ec6\") " pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.459903 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ed493f58-fee2-4d1c-98fc-48d836548ec6-v4-0-config-user-template-error\") pod \"oauth-openshift-7c4675448c-gg4cw\" (UID: \"ed493f58-fee2-4d1c-98fc-48d836548ec6\") " pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.459949 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ed493f58-fee2-4d1c-98fc-48d836548ec6-audit-policies\") pod \"oauth-openshift-7c4675448c-gg4cw\" (UID: \"ed493f58-fee2-4d1c-98fc-48d836548ec6\") " pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.460254 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ed493f58-fee2-4d1c-98fc-48d836548ec6-v4-0-config-system-service-ca\") pod \"oauth-openshift-7c4675448c-gg4cw\" (UID: \"ed493f58-fee2-4d1c-98fc-48d836548ec6\") " pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.460322 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed493f58-fee2-4d1c-98fc-48d836548ec6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7c4675448c-gg4cw\" (UID: \"ed493f58-fee2-4d1c-98fc-48d836548ec6\") " pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.460362 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" 
(UniqueName: \"kubernetes.io/host-path/ed493f58-fee2-4d1c-98fc-48d836548ec6-audit-dir\") pod \"oauth-openshift-7c4675448c-gg4cw\" (UID: \"ed493f58-fee2-4d1c-98fc-48d836548ec6\") " pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.460942 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ed493f58-fee2-4d1c-98fc-48d836548ec6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7c4675448c-gg4cw\" (UID: \"ed493f58-fee2-4d1c-98fc-48d836548ec6\") " pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.461493 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ed493f58-fee2-4d1c-98fc-48d836548ec6-audit-policies\") pod \"oauth-openshift-7c4675448c-gg4cw\" (UID: \"ed493f58-fee2-4d1c-98fc-48d836548ec6\") " pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.463000 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ed493f58-fee2-4d1c-98fc-48d836548ec6-v4-0-config-system-session\") pod \"oauth-openshift-7c4675448c-gg4cw\" (UID: \"ed493f58-fee2-4d1c-98fc-48d836548ec6\") " pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.463125 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ed493f58-fee2-4d1c-98fc-48d836548ec6-v4-0-config-user-template-login\") pod \"oauth-openshift-7c4675448c-gg4cw\" (UID: \"ed493f58-fee2-4d1c-98fc-48d836548ec6\") " pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 
18:14:50.463481 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ed493f58-fee2-4d1c-98fc-48d836548ec6-v4-0-config-user-template-error\") pod \"oauth-openshift-7c4675448c-gg4cw\" (UID: \"ed493f58-fee2-4d1c-98fc-48d836548ec6\") " pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.463614 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ed493f58-fee2-4d1c-98fc-48d836548ec6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7c4675448c-gg4cw\" (UID: \"ed493f58-fee2-4d1c-98fc-48d836548ec6\") " pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.463908 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ed493f58-fee2-4d1c-98fc-48d836548ec6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7c4675448c-gg4cw\" (UID: \"ed493f58-fee2-4d1c-98fc-48d836548ec6\") " pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.464411 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ed493f58-fee2-4d1c-98fc-48d836548ec6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7c4675448c-gg4cw\" (UID: \"ed493f58-fee2-4d1c-98fc-48d836548ec6\") " pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.465167 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/ed493f58-fee2-4d1c-98fc-48d836548ec6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7c4675448c-gg4cw\" (UID: \"ed493f58-fee2-4d1c-98fc-48d836548ec6\") " pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.465218 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ed493f58-fee2-4d1c-98fc-48d836548ec6-v4-0-config-system-router-certs\") pod \"oauth-openshift-7c4675448c-gg4cw\" (UID: \"ed493f58-fee2-4d1c-98fc-48d836548ec6\") " pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.480284 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z547n\" (UniqueName: \"kubernetes.io/projected/ed493f58-fee2-4d1c-98fc-48d836548ec6-kube-api-access-z547n\") pod \"oauth-openshift-7c4675448c-gg4cw\" (UID: \"ed493f58-fee2-4d1c-98fc-48d836548ec6\") " pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.608634 4750 generic.go:334] "Generic (PLEG): container finished" podID="88aafa5f-15c0-43af-80be-1c01d844c9c9" containerID="0d605adf3e0bb6b71ae476ac906730201231e451b172b21791cc5c251f40d9c8" exitCode=0 Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.608686 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" event={"ID":"88aafa5f-15c0-43af-80be-1c01d844c9c9","Type":"ContainerDied","Data":"0d605adf3e0bb6b71ae476ac906730201231e451b172b21791cc5c251f40d9c8"} Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.608716 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" 
event={"ID":"88aafa5f-15c0-43af-80be-1c01d844c9c9","Type":"ContainerDied","Data":"47cb1a78c9333f94b16886ddf51a50dfa2c78a55d94319d9a120ca9a2b5206bc"} Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.608736 4750 scope.go:117] "RemoveContainer" containerID="0d605adf3e0bb6b71ae476ac906730201231e451b172b21791cc5c251f40d9c8" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.608736 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-62m4q" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.622460 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.638119 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-62m4q"] Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.639007 4750 scope.go:117] "RemoveContainer" containerID="0d605adf3e0bb6b71ae476ac906730201231e451b172b21791cc5c251f40d9c8" Oct 08 18:14:50 crc kubenswrapper[4750]: E1008 18:14:50.639441 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d605adf3e0bb6b71ae476ac906730201231e451b172b21791cc5c251f40d9c8\": container with ID starting with 0d605adf3e0bb6b71ae476ac906730201231e451b172b21791cc5c251f40d9c8 not found: ID does not exist" containerID="0d605adf3e0bb6b71ae476ac906730201231e451b172b21791cc5c251f40d9c8" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.639473 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d605adf3e0bb6b71ae476ac906730201231e451b172b21791cc5c251f40d9c8"} err="failed to get container status \"0d605adf3e0bb6b71ae476ac906730201231e451b172b21791cc5c251f40d9c8\": rpc error: code = NotFound desc = could not find container 
\"0d605adf3e0bb6b71ae476ac906730201231e451b172b21791cc5c251f40d9c8\": container with ID starting with 0d605adf3e0bb6b71ae476ac906730201231e451b172b21791cc5c251f40d9c8 not found: ID does not exist" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.642576 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-62m4q"] Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.743411 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88aafa5f-15c0-43af-80be-1c01d844c9c9" path="/var/lib/kubelet/pods/88aafa5f-15c0-43af-80be-1c01d844c9c9/volumes" Oct 08 18:14:50 crc kubenswrapper[4750]: I1008 18:14:50.825427 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7c4675448c-gg4cw"] Oct 08 18:14:51 crc kubenswrapper[4750]: I1008 18:14:51.617352 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" event={"ID":"ed493f58-fee2-4d1c-98fc-48d836548ec6","Type":"ContainerStarted","Data":"f07210652e18db2259e1a62f749802690d91f04a9d1f9d3d31fadbd0c95258eb"} Oct 08 18:14:51 crc kubenswrapper[4750]: I1008 18:14:51.617422 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" event={"ID":"ed493f58-fee2-4d1c-98fc-48d836548ec6","Type":"ContainerStarted","Data":"dd5b5dca2104ac32d9ba10834a192b1eb1e41b17121f620341e5391ff26a209e"} Oct 08 18:14:51 crc kubenswrapper[4750]: I1008 18:14:51.617638 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" Oct 08 18:14:51 crc kubenswrapper[4750]: I1008 18:14:51.623600 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" Oct 08 18:14:51 crc kubenswrapper[4750]: I1008 18:14:51.647542 4750 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-authentication/oauth-openshift-7c4675448c-gg4cw" podStartSLOduration=27.64751658 podStartE2EDuration="27.64751658s" podCreationTimestamp="2025-10-08 18:14:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:14:51.645587322 +0000 UTC m=+247.558558405" watchObservedRunningTime="2025-10-08 18:14:51.64751658 +0000 UTC m=+247.560487623" Oct 08 18:15:00 crc kubenswrapper[4750]: I1008 18:15:00.134719 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332455-spsrl"] Oct 08 18:15:00 crc kubenswrapper[4750]: I1008 18:15:00.136010 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332455-spsrl" Oct 08 18:15:00 crc kubenswrapper[4750]: I1008 18:15:00.137940 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 08 18:15:00 crc kubenswrapper[4750]: I1008 18:15:00.139100 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 08 18:15:00 crc kubenswrapper[4750]: I1008 18:15:00.145564 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332455-spsrl"] Oct 08 18:15:00 crc kubenswrapper[4750]: I1008 18:15:00.175665 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eefe351d-f07f-47f7-97aa-4733bfd0f000-secret-volume\") pod \"collect-profiles-29332455-spsrl\" (UID: \"eefe351d-f07f-47f7-97aa-4733bfd0f000\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332455-spsrl" Oct 08 18:15:00 crc kubenswrapper[4750]: I1008 18:15:00.176066 4750 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eefe351d-f07f-47f7-97aa-4733bfd0f000-config-volume\") pod \"collect-profiles-29332455-spsrl\" (UID: \"eefe351d-f07f-47f7-97aa-4733bfd0f000\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332455-spsrl" Oct 08 18:15:00 crc kubenswrapper[4750]: I1008 18:15:00.176212 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb8cf\" (UniqueName: \"kubernetes.io/projected/eefe351d-f07f-47f7-97aa-4733bfd0f000-kube-api-access-qb8cf\") pod \"collect-profiles-29332455-spsrl\" (UID: \"eefe351d-f07f-47f7-97aa-4733bfd0f000\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332455-spsrl" Oct 08 18:15:00 crc kubenswrapper[4750]: I1008 18:15:00.277032 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb8cf\" (UniqueName: \"kubernetes.io/projected/eefe351d-f07f-47f7-97aa-4733bfd0f000-kube-api-access-qb8cf\") pod \"collect-profiles-29332455-spsrl\" (UID: \"eefe351d-f07f-47f7-97aa-4733bfd0f000\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332455-spsrl" Oct 08 18:15:00 crc kubenswrapper[4750]: I1008 18:15:00.277306 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eefe351d-f07f-47f7-97aa-4733bfd0f000-secret-volume\") pod \"collect-profiles-29332455-spsrl\" (UID: \"eefe351d-f07f-47f7-97aa-4733bfd0f000\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332455-spsrl" Oct 08 18:15:00 crc kubenswrapper[4750]: I1008 18:15:00.277424 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eefe351d-f07f-47f7-97aa-4733bfd0f000-config-volume\") pod \"collect-profiles-29332455-spsrl\" (UID: \"eefe351d-f07f-47f7-97aa-4733bfd0f000\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29332455-spsrl" Oct 08 18:15:00 crc kubenswrapper[4750]: I1008 18:15:00.278262 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eefe351d-f07f-47f7-97aa-4733bfd0f000-config-volume\") pod \"collect-profiles-29332455-spsrl\" (UID: \"eefe351d-f07f-47f7-97aa-4733bfd0f000\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332455-spsrl" Oct 08 18:15:00 crc kubenswrapper[4750]: I1008 18:15:00.283800 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eefe351d-f07f-47f7-97aa-4733bfd0f000-secret-volume\") pod \"collect-profiles-29332455-spsrl\" (UID: \"eefe351d-f07f-47f7-97aa-4733bfd0f000\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332455-spsrl" Oct 08 18:15:00 crc kubenswrapper[4750]: I1008 18:15:00.299122 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb8cf\" (UniqueName: \"kubernetes.io/projected/eefe351d-f07f-47f7-97aa-4733bfd0f000-kube-api-access-qb8cf\") pod \"collect-profiles-29332455-spsrl\" (UID: \"eefe351d-f07f-47f7-97aa-4733bfd0f000\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332455-spsrl" Oct 08 18:15:00 crc kubenswrapper[4750]: I1008 18:15:00.462898 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332455-spsrl" Oct 08 18:15:00 crc kubenswrapper[4750]: I1008 18:15:00.684855 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332455-spsrl"] Oct 08 18:15:01 crc kubenswrapper[4750]: I1008 18:15:01.673018 4750 generic.go:334] "Generic (PLEG): container finished" podID="eefe351d-f07f-47f7-97aa-4733bfd0f000" containerID="c074cde0c4c91a5fbf7c0332e7757099609651e02b1398cf79e1baeb9bb74d82" exitCode=0 Oct 08 18:15:01 crc kubenswrapper[4750]: I1008 18:15:01.673086 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332455-spsrl" event={"ID":"eefe351d-f07f-47f7-97aa-4733bfd0f000","Type":"ContainerDied","Data":"c074cde0c4c91a5fbf7c0332e7757099609651e02b1398cf79e1baeb9bb74d82"} Oct 08 18:15:01 crc kubenswrapper[4750]: I1008 18:15:01.673242 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332455-spsrl" event={"ID":"eefe351d-f07f-47f7-97aa-4733bfd0f000","Type":"ContainerStarted","Data":"66a744ad57e582ae4d633c7617689ad98b02dad09c6433e7f194e9f54fe98f2f"} Oct 08 18:15:02 crc kubenswrapper[4750]: I1008 18:15:02.890318 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332455-spsrl" Oct 08 18:15:03 crc kubenswrapper[4750]: I1008 18:15:03.005697 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eefe351d-f07f-47f7-97aa-4733bfd0f000-secret-volume\") pod \"eefe351d-f07f-47f7-97aa-4733bfd0f000\" (UID: \"eefe351d-f07f-47f7-97aa-4733bfd0f000\") " Oct 08 18:15:03 crc kubenswrapper[4750]: I1008 18:15:03.005787 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qb8cf\" (UniqueName: \"kubernetes.io/projected/eefe351d-f07f-47f7-97aa-4733bfd0f000-kube-api-access-qb8cf\") pod \"eefe351d-f07f-47f7-97aa-4733bfd0f000\" (UID: \"eefe351d-f07f-47f7-97aa-4733bfd0f000\") " Oct 08 18:15:03 crc kubenswrapper[4750]: I1008 18:15:03.005857 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eefe351d-f07f-47f7-97aa-4733bfd0f000-config-volume\") pod \"eefe351d-f07f-47f7-97aa-4733bfd0f000\" (UID: \"eefe351d-f07f-47f7-97aa-4733bfd0f000\") " Oct 08 18:15:03 crc kubenswrapper[4750]: I1008 18:15:03.006754 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eefe351d-f07f-47f7-97aa-4733bfd0f000-config-volume" (OuterVolumeSpecName: "config-volume") pod "eefe351d-f07f-47f7-97aa-4733bfd0f000" (UID: "eefe351d-f07f-47f7-97aa-4733bfd0f000"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:15:03 crc kubenswrapper[4750]: I1008 18:15:03.011265 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eefe351d-f07f-47f7-97aa-4733bfd0f000-kube-api-access-qb8cf" (OuterVolumeSpecName: "kube-api-access-qb8cf") pod "eefe351d-f07f-47f7-97aa-4733bfd0f000" (UID: "eefe351d-f07f-47f7-97aa-4733bfd0f000"). 
InnerVolumeSpecName "kube-api-access-qb8cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:15:03 crc kubenswrapper[4750]: I1008 18:15:03.011456 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eefe351d-f07f-47f7-97aa-4733bfd0f000-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "eefe351d-f07f-47f7-97aa-4733bfd0f000" (UID: "eefe351d-f07f-47f7-97aa-4733bfd0f000"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:15:03 crc kubenswrapper[4750]: I1008 18:15:03.107091 4750 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eefe351d-f07f-47f7-97aa-4733bfd0f000-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 08 18:15:03 crc kubenswrapper[4750]: I1008 18:15:03.107126 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qb8cf\" (UniqueName: \"kubernetes.io/projected/eefe351d-f07f-47f7-97aa-4733bfd0f000-kube-api-access-qb8cf\") on node \"crc\" DevicePath \"\"" Oct 08 18:15:03 crc kubenswrapper[4750]: I1008 18:15:03.107136 4750 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eefe351d-f07f-47f7-97aa-4733bfd0f000-config-volume\") on node \"crc\" DevicePath \"\"" Oct 08 18:15:03 crc kubenswrapper[4750]: I1008 18:15:03.683418 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332455-spsrl" event={"ID":"eefe351d-f07f-47f7-97aa-4733bfd0f000","Type":"ContainerDied","Data":"66a744ad57e582ae4d633c7617689ad98b02dad09c6433e7f194e9f54fe98f2f"} Oct 08 18:15:03 crc kubenswrapper[4750]: I1008 18:15:03.683733 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66a744ad57e582ae4d633c7617689ad98b02dad09c6433e7f194e9f54fe98f2f" Oct 08 18:15:03 crc kubenswrapper[4750]: I1008 18:15:03.683644 4750 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332455-spsrl" Oct 08 18:15:06 crc kubenswrapper[4750]: I1008 18:15:06.883868 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d966q"] Oct 08 18:15:06 crc kubenswrapper[4750]: I1008 18:15:06.884718 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-d966q" podUID="7a3b8246-69d2-405e-a414-c21b9cb3b31d" containerName="registry-server" containerID="cri-o://850aad94aeff9920e928e3b467de6f4b4dbbfe2403c6e53e84d34bc59dd99f37" gracePeriod=30 Oct 08 18:15:06 crc kubenswrapper[4750]: I1008 18:15:06.918529 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mhkml"] Oct 08 18:15:06 crc kubenswrapper[4750]: I1008 18:15:06.919116 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mhkml" podUID="158d74a9-bbc8-4cd9-9507-eb477aa3a5a9" containerName="registry-server" containerID="cri-o://c2994da194a0aec8c1f131b849075d8d0e601bcfb43fe1bcffc306b471f4ef2a" gracePeriod=30 Oct 08 18:15:06 crc kubenswrapper[4750]: I1008 18:15:06.931720 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hg6dd"] Oct 08 18:15:06 crc kubenswrapper[4750]: I1008 18:15:06.932094 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-hg6dd" podUID="884c29cd-b721-400c-b319-510f191f02dd" containerName="marketplace-operator" containerID="cri-o://4e4ae53c60ee1f99e54a7e08c9e66c23d9c1668ff837abebfa68c409de485b22" gracePeriod=30 Oct 08 18:15:06 crc kubenswrapper[4750]: I1008 18:15:06.973979 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gpm4k"] Oct 08 18:15:06 crc 
kubenswrapper[4750]: I1008 18:15:06.974215 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gpm4k" podUID="57f5ce4d-cf29-4d65-aa17-f23c042a2602" containerName="registry-server" containerID="cri-o://de0d4c7a95db56ad197915a151efda43baf8e0711db4495c285fbc031a7dae20" gracePeriod=30 Oct 08 18:15:06 crc kubenswrapper[4750]: I1008 18:15:06.981847 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xspt4"] Oct 08 18:15:06 crc kubenswrapper[4750]: E1008 18:15:06.982098 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eefe351d-f07f-47f7-97aa-4733bfd0f000" containerName="collect-profiles" Oct 08 18:15:06 crc kubenswrapper[4750]: I1008 18:15:06.982115 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="eefe351d-f07f-47f7-97aa-4733bfd0f000" containerName="collect-profiles" Oct 08 18:15:06 crc kubenswrapper[4750]: I1008 18:15:06.982214 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="eefe351d-f07f-47f7-97aa-4733bfd0f000" containerName="collect-profiles" Oct 08 18:15:06 crc kubenswrapper[4750]: I1008 18:15:06.982642 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xspt4" Oct 08 18:15:06 crc kubenswrapper[4750]: I1008 18:15:06.988464 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d24c6"] Oct 08 18:15:06 crc kubenswrapper[4750]: I1008 18:15:06.988729 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-d24c6" podUID="c7f5e091-d02e-45aa-bc15-58841948bcd6" containerName="registry-server" containerID="cri-o://c3d543f50bda545a29dc5d7eb7a53b1cc2c620e17b921c48d730aa659c0edda2" gracePeriod=30 Oct 08 18:15:06 crc kubenswrapper[4750]: I1008 18:15:06.990069 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xspt4"] Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.072836 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e5cb56e3-bb34-427d-b4c1-7dec95f40023-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xspt4\" (UID: \"e5cb56e3-bb34-427d-b4c1-7dec95f40023\") " pod="openshift-marketplace/marketplace-operator-79b997595-xspt4" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.072887 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxr7h\" (UniqueName: \"kubernetes.io/projected/e5cb56e3-bb34-427d-b4c1-7dec95f40023-kube-api-access-hxr7h\") pod \"marketplace-operator-79b997595-xspt4\" (UID: \"e5cb56e3-bb34-427d-b4c1-7dec95f40023\") " pod="openshift-marketplace/marketplace-operator-79b997595-xspt4" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.072991 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/e5cb56e3-bb34-427d-b4c1-7dec95f40023-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xspt4\" (UID: \"e5cb56e3-bb34-427d-b4c1-7dec95f40023\") " pod="openshift-marketplace/marketplace-operator-79b997595-xspt4" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.174267 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e5cb56e3-bb34-427d-b4c1-7dec95f40023-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xspt4\" (UID: \"e5cb56e3-bb34-427d-b4c1-7dec95f40023\") " pod="openshift-marketplace/marketplace-operator-79b997595-xspt4" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.174628 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxr7h\" (UniqueName: \"kubernetes.io/projected/e5cb56e3-bb34-427d-b4c1-7dec95f40023-kube-api-access-hxr7h\") pod \"marketplace-operator-79b997595-xspt4\" (UID: \"e5cb56e3-bb34-427d-b4c1-7dec95f40023\") " pod="openshift-marketplace/marketplace-operator-79b997595-xspt4" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.174681 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e5cb56e3-bb34-427d-b4c1-7dec95f40023-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xspt4\" (UID: \"e5cb56e3-bb34-427d-b4c1-7dec95f40023\") " pod="openshift-marketplace/marketplace-operator-79b997595-xspt4" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.176627 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e5cb56e3-bb34-427d-b4c1-7dec95f40023-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xspt4\" (UID: \"e5cb56e3-bb34-427d-b4c1-7dec95f40023\") " pod="openshift-marketplace/marketplace-operator-79b997595-xspt4" Oct 08 18:15:07 crc 
kubenswrapper[4750]: I1008 18:15:07.180968 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e5cb56e3-bb34-427d-b4c1-7dec95f40023-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xspt4\" (UID: \"e5cb56e3-bb34-427d-b4c1-7dec95f40023\") " pod="openshift-marketplace/marketplace-operator-79b997595-xspt4" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.192895 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxr7h\" (UniqueName: \"kubernetes.io/projected/e5cb56e3-bb34-427d-b4c1-7dec95f40023-kube-api-access-hxr7h\") pod \"marketplace-operator-79b997595-xspt4\" (UID: \"e5cb56e3-bb34-427d-b4c1-7dec95f40023\") " pod="openshift-marketplace/marketplace-operator-79b997595-xspt4" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.239233 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xspt4" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.303339 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d966q" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.376101 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a3b8246-69d2-405e-a414-c21b9cb3b31d-utilities\") pod \"7a3b8246-69d2-405e-a414-c21b9cb3b31d\" (UID: \"7a3b8246-69d2-405e-a414-c21b9cb3b31d\") " Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.376164 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqsl6\" (UniqueName: \"kubernetes.io/projected/7a3b8246-69d2-405e-a414-c21b9cb3b31d-kube-api-access-zqsl6\") pod \"7a3b8246-69d2-405e-a414-c21b9cb3b31d\" (UID: \"7a3b8246-69d2-405e-a414-c21b9cb3b31d\") " Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.376278 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a3b8246-69d2-405e-a414-c21b9cb3b31d-catalog-content\") pod \"7a3b8246-69d2-405e-a414-c21b9cb3b31d\" (UID: \"7a3b8246-69d2-405e-a414-c21b9cb3b31d\") " Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.377518 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a3b8246-69d2-405e-a414-c21b9cb3b31d-utilities" (OuterVolumeSpecName: "utilities") pod "7a3b8246-69d2-405e-a414-c21b9cb3b31d" (UID: "7a3b8246-69d2-405e-a414-c21b9cb3b31d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.384511 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a3b8246-69d2-405e-a414-c21b9cb3b31d-kube-api-access-zqsl6" (OuterVolumeSpecName: "kube-api-access-zqsl6") pod "7a3b8246-69d2-405e-a414-c21b9cb3b31d" (UID: "7a3b8246-69d2-405e-a414-c21b9cb3b31d"). InnerVolumeSpecName "kube-api-access-zqsl6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.392671 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hg6dd" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.427903 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gpm4k" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.430879 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a3b8246-69d2-405e-a414-c21b9cb3b31d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7a3b8246-69d2-405e-a414-c21b9cb3b31d" (UID: "7a3b8246-69d2-405e-a414-c21b9cb3b31d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.437270 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d24c6" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.476742 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/884c29cd-b721-400c-b319-510f191f02dd-marketplace-trusted-ca\") pod \"884c29cd-b721-400c-b319-510f191f02dd\" (UID: \"884c29cd-b721-400c-b319-510f191f02dd\") " Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.477050 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/884c29cd-b721-400c-b319-510f191f02dd-marketplace-operator-metrics\") pod \"884c29cd-b721-400c-b319-510f191f02dd\" (UID: \"884c29cd-b721-400c-b319-510f191f02dd\") " Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.477080 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7f5e091-d02e-45aa-bc15-58841948bcd6-catalog-content\") pod \"c7f5e091-d02e-45aa-bc15-58841948bcd6\" (UID: \"c7f5e091-d02e-45aa-bc15-58841948bcd6\") " Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.477115 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwhkq\" (UniqueName: \"kubernetes.io/projected/884c29cd-b721-400c-b319-510f191f02dd-kube-api-access-xwhkq\") pod \"884c29cd-b721-400c-b319-510f191f02dd\" (UID: \"884c29cd-b721-400c-b319-510f191f02dd\") " Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.477149 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7f5e091-d02e-45aa-bc15-58841948bcd6-utilities\") pod \"c7f5e091-d02e-45aa-bc15-58841948bcd6\" (UID: \"c7f5e091-d02e-45aa-bc15-58841948bcd6\") " Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.477183 4750 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57f5ce4d-cf29-4d65-aa17-f23c042a2602-catalog-content\") pod \"57f5ce4d-cf29-4d65-aa17-f23c042a2602\" (UID: \"57f5ce4d-cf29-4d65-aa17-f23c042a2602\") " Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.477215 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57f5ce4d-cf29-4d65-aa17-f23c042a2602-utilities\") pod \"57f5ce4d-cf29-4d65-aa17-f23c042a2602\" (UID: \"57f5ce4d-cf29-4d65-aa17-f23c042a2602\") " Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.477242 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psxtp\" (UniqueName: \"kubernetes.io/projected/57f5ce4d-cf29-4d65-aa17-f23c042a2602-kube-api-access-psxtp\") pod \"57f5ce4d-cf29-4d65-aa17-f23c042a2602\" (UID: \"57f5ce4d-cf29-4d65-aa17-f23c042a2602\") " Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.477283 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmz7j\" (UniqueName: \"kubernetes.io/projected/c7f5e091-d02e-45aa-bc15-58841948bcd6-kube-api-access-bmz7j\") pod \"c7f5e091-d02e-45aa-bc15-58841948bcd6\" (UID: \"c7f5e091-d02e-45aa-bc15-58841948bcd6\") " Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.477451 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a3b8246-69d2-405e-a414-c21b9cb3b31d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.477467 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a3b8246-69d2-405e-a414-c21b9cb3b31d-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.477480 4750 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-zqsl6\" (UniqueName: \"kubernetes.io/projected/7a3b8246-69d2-405e-a414-c21b9cb3b31d-kube-api-access-zqsl6\") on node \"crc\" DevicePath \"\"" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.478337 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7f5e091-d02e-45aa-bc15-58841948bcd6-utilities" (OuterVolumeSpecName: "utilities") pod "c7f5e091-d02e-45aa-bc15-58841948bcd6" (UID: "c7f5e091-d02e-45aa-bc15-58841948bcd6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.478478 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57f5ce4d-cf29-4d65-aa17-f23c042a2602-utilities" (OuterVolumeSpecName: "utilities") pod "57f5ce4d-cf29-4d65-aa17-f23c042a2602" (UID: "57f5ce4d-cf29-4d65-aa17-f23c042a2602"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.478607 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/884c29cd-b721-400c-b319-510f191f02dd-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "884c29cd-b721-400c-b319-510f191f02dd" (UID: "884c29cd-b721-400c-b319-510f191f02dd"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.481767 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57f5ce4d-cf29-4d65-aa17-f23c042a2602-kube-api-access-psxtp" (OuterVolumeSpecName: "kube-api-access-psxtp") pod "57f5ce4d-cf29-4d65-aa17-f23c042a2602" (UID: "57f5ce4d-cf29-4d65-aa17-f23c042a2602"). InnerVolumeSpecName "kube-api-access-psxtp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.481992 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/884c29cd-b721-400c-b319-510f191f02dd-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "884c29cd-b721-400c-b319-510f191f02dd" (UID: "884c29cd-b721-400c-b319-510f191f02dd"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.483194 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/884c29cd-b721-400c-b319-510f191f02dd-kube-api-access-xwhkq" (OuterVolumeSpecName: "kube-api-access-xwhkq") pod "884c29cd-b721-400c-b319-510f191f02dd" (UID: "884c29cd-b721-400c-b319-510f191f02dd"). InnerVolumeSpecName "kube-api-access-xwhkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.490696 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7f5e091-d02e-45aa-bc15-58841948bcd6-kube-api-access-bmz7j" (OuterVolumeSpecName: "kube-api-access-bmz7j") pod "c7f5e091-d02e-45aa-bc15-58841948bcd6" (UID: "c7f5e091-d02e-45aa-bc15-58841948bcd6"). InnerVolumeSpecName "kube-api-access-bmz7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.502105 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57f5ce4d-cf29-4d65-aa17-f23c042a2602-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57f5ce4d-cf29-4d65-aa17-f23c042a2602" (UID: "57f5ce4d-cf29-4d65-aa17-f23c042a2602"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.578253 4750 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/884c29cd-b721-400c-b319-510f191f02dd-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.578289 4750 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/884c29cd-b721-400c-b319-510f191f02dd-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.578299 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwhkq\" (UniqueName: \"kubernetes.io/projected/884c29cd-b721-400c-b319-510f191f02dd-kube-api-access-xwhkq\") on node \"crc\" DevicePath \"\"" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.578308 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7f5e091-d02e-45aa-bc15-58841948bcd6-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.578317 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57f5ce4d-cf29-4d65-aa17-f23c042a2602-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.578324 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57f5ce4d-cf29-4d65-aa17-f23c042a2602-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.578333 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psxtp\" (UniqueName: \"kubernetes.io/projected/57f5ce4d-cf29-4d65-aa17-f23c042a2602-kube-api-access-psxtp\") on node \"crc\" DevicePath \"\"" Oct 08 18:15:07 
crc kubenswrapper[4750]: I1008 18:15:07.578344 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmz7j\" (UniqueName: \"kubernetes.io/projected/c7f5e091-d02e-45aa-bc15-58841948bcd6-kube-api-access-bmz7j\") on node \"crc\" DevicePath \"\"" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.592207 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7f5e091-d02e-45aa-bc15-58841948bcd6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c7f5e091-d02e-45aa-bc15-58841948bcd6" (UID: "c7f5e091-d02e-45aa-bc15-58841948bcd6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.679267 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7f5e091-d02e-45aa-bc15-58841948bcd6-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.696898 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xspt4"] Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.705441 4750 generic.go:334] "Generic (PLEG): container finished" podID="884c29cd-b721-400c-b319-510f191f02dd" containerID="4e4ae53c60ee1f99e54a7e08c9e66c23d9c1668ff837abebfa68c409de485b22" exitCode=0 Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.705486 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hg6dd" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.705562 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hg6dd" event={"ID":"884c29cd-b721-400c-b319-510f191f02dd","Type":"ContainerDied","Data":"4e4ae53c60ee1f99e54a7e08c9e66c23d9c1668ff837abebfa68c409de485b22"} Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.705610 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hg6dd" event={"ID":"884c29cd-b721-400c-b319-510f191f02dd","Type":"ContainerDied","Data":"3cfce55127e4cf1976066d168746f6616ca371b83bf3acaebd6ddd49f7f2a57a"} Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.705632 4750 scope.go:117] "RemoveContainer" containerID="4e4ae53c60ee1f99e54a7e08c9e66c23d9c1668ff837abebfa68c409de485b22" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.710437 4750 generic.go:334] "Generic (PLEG): container finished" podID="c7f5e091-d02e-45aa-bc15-58841948bcd6" containerID="c3d543f50bda545a29dc5d7eb7a53b1cc2c620e17b921c48d730aa659c0edda2" exitCode=0 Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.710481 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d24c6" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.710512 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d24c6" event={"ID":"c7f5e091-d02e-45aa-bc15-58841948bcd6","Type":"ContainerDied","Data":"c3d543f50bda545a29dc5d7eb7a53b1cc2c620e17b921c48d730aa659c0edda2"} Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.710539 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d24c6" event={"ID":"c7f5e091-d02e-45aa-bc15-58841948bcd6","Type":"ContainerDied","Data":"3f0a507d8bfc00255263057823d679173972e475544e7e2b3ac7295a82421573"} Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.714003 4750 generic.go:334] "Generic (PLEG): container finished" podID="158d74a9-bbc8-4cd9-9507-eb477aa3a5a9" containerID="c2994da194a0aec8c1f131b849075d8d0e601bcfb43fe1bcffc306b471f4ef2a" exitCode=0 Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.714082 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mhkml" event={"ID":"158d74a9-bbc8-4cd9-9507-eb477aa3a5a9","Type":"ContainerDied","Data":"c2994da194a0aec8c1f131b849075d8d0e601bcfb43fe1bcffc306b471f4ef2a"} Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.716643 4750 generic.go:334] "Generic (PLEG): container finished" podID="7a3b8246-69d2-405e-a414-c21b9cb3b31d" containerID="850aad94aeff9920e928e3b467de6f4b4dbbfe2403c6e53e84d34bc59dd99f37" exitCode=0 Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.716690 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d966q" event={"ID":"7a3b8246-69d2-405e-a414-c21b9cb3b31d","Type":"ContainerDied","Data":"850aad94aeff9920e928e3b467de6f4b4dbbfe2403c6e53e84d34bc59dd99f37"} Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.716709 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-d966q" event={"ID":"7a3b8246-69d2-405e-a414-c21b9cb3b31d","Type":"ContainerDied","Data":"35a8a4afb9645c4fafa3eb57d19a93c0da01582b4b80f7bcab62d45b5cec5f6c"} Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.716706 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d966q" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.730267 4750 generic.go:334] "Generic (PLEG): container finished" podID="57f5ce4d-cf29-4d65-aa17-f23c042a2602" containerID="de0d4c7a95db56ad197915a151efda43baf8e0711db4495c285fbc031a7dae20" exitCode=0 Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.730331 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gpm4k" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.730337 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gpm4k" event={"ID":"57f5ce4d-cf29-4d65-aa17-f23c042a2602","Type":"ContainerDied","Data":"de0d4c7a95db56ad197915a151efda43baf8e0711db4495c285fbc031a7dae20"} Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.730435 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gpm4k" event={"ID":"57f5ce4d-cf29-4d65-aa17-f23c042a2602","Type":"ContainerDied","Data":"9db0775e990729dd8d700fed93c810afec78115bfeda69f77db0f3a2e3ff8f81"} Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.734744 4750 scope.go:117] "RemoveContainer" containerID="4e4ae53c60ee1f99e54a7e08c9e66c23d9c1668ff837abebfa68c409de485b22" Oct 08 18:15:07 crc kubenswrapper[4750]: E1008 18:15:07.735835 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e4ae53c60ee1f99e54a7e08c9e66c23d9c1668ff837abebfa68c409de485b22\": container with ID starting with 
4e4ae53c60ee1f99e54a7e08c9e66c23d9c1668ff837abebfa68c409de485b22 not found: ID does not exist" containerID="4e4ae53c60ee1f99e54a7e08c9e66c23d9c1668ff837abebfa68c409de485b22" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.735931 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e4ae53c60ee1f99e54a7e08c9e66c23d9c1668ff837abebfa68c409de485b22"} err="failed to get container status \"4e4ae53c60ee1f99e54a7e08c9e66c23d9c1668ff837abebfa68c409de485b22\": rpc error: code = NotFound desc = could not find container \"4e4ae53c60ee1f99e54a7e08c9e66c23d9c1668ff837abebfa68c409de485b22\": container with ID starting with 4e4ae53c60ee1f99e54a7e08c9e66c23d9c1668ff837abebfa68c409de485b22 not found: ID does not exist" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.735954 4750 scope.go:117] "RemoveContainer" containerID="c3d543f50bda545a29dc5d7eb7a53b1cc2c620e17b921c48d730aa659c0edda2" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.739343 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hg6dd"] Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.742119 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hg6dd"] Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.751713 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d24c6"] Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.753633 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-d24c6"] Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.758157 4750 scope.go:117] "RemoveContainer" containerID="88e17cc51bbe973b7ff19ccafc42e9e585034c39f175335750b222631493b721" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.766931 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-d966q"] Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.769571 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-d966q"] Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.781668 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gpm4k"] Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.781848 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gpm4k"] Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.787282 4750 scope.go:117] "RemoveContainer" containerID="c5bfa34a77b12aca1bee68a411dbae7673d1a673c81e1748c6a1bf0926197a2f" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.812083 4750 scope.go:117] "RemoveContainer" containerID="c3d543f50bda545a29dc5d7eb7a53b1cc2c620e17b921c48d730aa659c0edda2" Oct 08 18:15:07 crc kubenswrapper[4750]: E1008 18:15:07.812592 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3d543f50bda545a29dc5d7eb7a53b1cc2c620e17b921c48d730aa659c0edda2\": container with ID starting with c3d543f50bda545a29dc5d7eb7a53b1cc2c620e17b921c48d730aa659c0edda2 not found: ID does not exist" containerID="c3d543f50bda545a29dc5d7eb7a53b1cc2c620e17b921c48d730aa659c0edda2" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.812624 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3d543f50bda545a29dc5d7eb7a53b1cc2c620e17b921c48d730aa659c0edda2"} err="failed to get container status \"c3d543f50bda545a29dc5d7eb7a53b1cc2c620e17b921c48d730aa659c0edda2\": rpc error: code = NotFound desc = could not find container \"c3d543f50bda545a29dc5d7eb7a53b1cc2c620e17b921c48d730aa659c0edda2\": container with ID starting with c3d543f50bda545a29dc5d7eb7a53b1cc2c620e17b921c48d730aa659c0edda2 not found: ID does not exist" Oct 08 
18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.812645 4750 scope.go:117] "RemoveContainer" containerID="88e17cc51bbe973b7ff19ccafc42e9e585034c39f175335750b222631493b721" Oct 08 18:15:07 crc kubenswrapper[4750]: E1008 18:15:07.812959 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88e17cc51bbe973b7ff19ccafc42e9e585034c39f175335750b222631493b721\": container with ID starting with 88e17cc51bbe973b7ff19ccafc42e9e585034c39f175335750b222631493b721 not found: ID does not exist" containerID="88e17cc51bbe973b7ff19ccafc42e9e585034c39f175335750b222631493b721" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.813000 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88e17cc51bbe973b7ff19ccafc42e9e585034c39f175335750b222631493b721"} err="failed to get container status \"88e17cc51bbe973b7ff19ccafc42e9e585034c39f175335750b222631493b721\": rpc error: code = NotFound desc = could not find container \"88e17cc51bbe973b7ff19ccafc42e9e585034c39f175335750b222631493b721\": container with ID starting with 88e17cc51bbe973b7ff19ccafc42e9e585034c39f175335750b222631493b721 not found: ID does not exist" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.813028 4750 scope.go:117] "RemoveContainer" containerID="c5bfa34a77b12aca1bee68a411dbae7673d1a673c81e1748c6a1bf0926197a2f" Oct 08 18:15:07 crc kubenswrapper[4750]: E1008 18:15:07.813368 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5bfa34a77b12aca1bee68a411dbae7673d1a673c81e1748c6a1bf0926197a2f\": container with ID starting with c5bfa34a77b12aca1bee68a411dbae7673d1a673c81e1748c6a1bf0926197a2f not found: ID does not exist" containerID="c5bfa34a77b12aca1bee68a411dbae7673d1a673c81e1748c6a1bf0926197a2f" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.813410 4750 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"c5bfa34a77b12aca1bee68a411dbae7673d1a673c81e1748c6a1bf0926197a2f"} err="failed to get container status \"c5bfa34a77b12aca1bee68a411dbae7673d1a673c81e1748c6a1bf0926197a2f\": rpc error: code = NotFound desc = could not find container \"c5bfa34a77b12aca1bee68a411dbae7673d1a673c81e1748c6a1bf0926197a2f\": container with ID starting with c5bfa34a77b12aca1bee68a411dbae7673d1a673c81e1748c6a1bf0926197a2f not found: ID does not exist" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.813436 4750 scope.go:117] "RemoveContainer" containerID="850aad94aeff9920e928e3b467de6f4b4dbbfe2403c6e53e84d34bc59dd99f37" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.831089 4750 scope.go:117] "RemoveContainer" containerID="fda9aa38596a91e57398de8f2607cb22334f4ed276bec00f95d3f73ded75a2e8" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.844381 4750 scope.go:117] "RemoveContainer" containerID="6ce658966eefd0d2045d31d53ee58080b4196f8f721e431f9fc7772a7138bc2d" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.871170 4750 scope.go:117] "RemoveContainer" containerID="850aad94aeff9920e928e3b467de6f4b4dbbfe2403c6e53e84d34bc59dd99f37" Oct 08 18:15:07 crc kubenswrapper[4750]: E1008 18:15:07.871915 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"850aad94aeff9920e928e3b467de6f4b4dbbfe2403c6e53e84d34bc59dd99f37\": container with ID starting with 850aad94aeff9920e928e3b467de6f4b4dbbfe2403c6e53e84d34bc59dd99f37 not found: ID does not exist" containerID="850aad94aeff9920e928e3b467de6f4b4dbbfe2403c6e53e84d34bc59dd99f37" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.871974 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"850aad94aeff9920e928e3b467de6f4b4dbbfe2403c6e53e84d34bc59dd99f37"} err="failed to get container status \"850aad94aeff9920e928e3b467de6f4b4dbbfe2403c6e53e84d34bc59dd99f37\": rpc error: 
code = NotFound desc = could not find container \"850aad94aeff9920e928e3b467de6f4b4dbbfe2403c6e53e84d34bc59dd99f37\": container with ID starting with 850aad94aeff9920e928e3b467de6f4b4dbbfe2403c6e53e84d34bc59dd99f37 not found: ID does not exist" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.872018 4750 scope.go:117] "RemoveContainer" containerID="fda9aa38596a91e57398de8f2607cb22334f4ed276bec00f95d3f73ded75a2e8" Oct 08 18:15:07 crc kubenswrapper[4750]: E1008 18:15:07.872404 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fda9aa38596a91e57398de8f2607cb22334f4ed276bec00f95d3f73ded75a2e8\": container with ID starting with fda9aa38596a91e57398de8f2607cb22334f4ed276bec00f95d3f73ded75a2e8 not found: ID does not exist" containerID="fda9aa38596a91e57398de8f2607cb22334f4ed276bec00f95d3f73ded75a2e8" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.872431 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fda9aa38596a91e57398de8f2607cb22334f4ed276bec00f95d3f73ded75a2e8"} err="failed to get container status \"fda9aa38596a91e57398de8f2607cb22334f4ed276bec00f95d3f73ded75a2e8\": rpc error: code = NotFound desc = could not find container \"fda9aa38596a91e57398de8f2607cb22334f4ed276bec00f95d3f73ded75a2e8\": container with ID starting with fda9aa38596a91e57398de8f2607cb22334f4ed276bec00f95d3f73ded75a2e8 not found: ID does not exist" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.872453 4750 scope.go:117] "RemoveContainer" containerID="6ce658966eefd0d2045d31d53ee58080b4196f8f721e431f9fc7772a7138bc2d" Oct 08 18:15:07 crc kubenswrapper[4750]: E1008 18:15:07.872981 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ce658966eefd0d2045d31d53ee58080b4196f8f721e431f9fc7772a7138bc2d\": container with ID starting with 
6ce658966eefd0d2045d31d53ee58080b4196f8f721e431f9fc7772a7138bc2d not found: ID does not exist" containerID="6ce658966eefd0d2045d31d53ee58080b4196f8f721e431f9fc7772a7138bc2d" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.873016 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ce658966eefd0d2045d31d53ee58080b4196f8f721e431f9fc7772a7138bc2d"} err="failed to get container status \"6ce658966eefd0d2045d31d53ee58080b4196f8f721e431f9fc7772a7138bc2d\": rpc error: code = NotFound desc = could not find container \"6ce658966eefd0d2045d31d53ee58080b4196f8f721e431f9fc7772a7138bc2d\": container with ID starting with 6ce658966eefd0d2045d31d53ee58080b4196f8f721e431f9fc7772a7138bc2d not found: ID does not exist" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.873041 4750 scope.go:117] "RemoveContainer" containerID="de0d4c7a95db56ad197915a151efda43baf8e0711db4495c285fbc031a7dae20" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.887311 4750 scope.go:117] "RemoveContainer" containerID="a549ad25fa6c5b2c17ca7fbfdf3190d83f4ad2a6bb01e3725bb1c36e83ba02a9" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.902331 4750 scope.go:117] "RemoveContainer" containerID="e11ce50ba6baf9c810ca6caf44f8ab624bdb32cd43e2a1ee059081a5dc44f6c2" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.917449 4750 scope.go:117] "RemoveContainer" containerID="de0d4c7a95db56ad197915a151efda43baf8e0711db4495c285fbc031a7dae20" Oct 08 18:15:07 crc kubenswrapper[4750]: E1008 18:15:07.917999 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de0d4c7a95db56ad197915a151efda43baf8e0711db4495c285fbc031a7dae20\": container with ID starting with de0d4c7a95db56ad197915a151efda43baf8e0711db4495c285fbc031a7dae20 not found: ID does not exist" containerID="de0d4c7a95db56ad197915a151efda43baf8e0711db4495c285fbc031a7dae20" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 
18:15:07.918027 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de0d4c7a95db56ad197915a151efda43baf8e0711db4495c285fbc031a7dae20"} err="failed to get container status \"de0d4c7a95db56ad197915a151efda43baf8e0711db4495c285fbc031a7dae20\": rpc error: code = NotFound desc = could not find container \"de0d4c7a95db56ad197915a151efda43baf8e0711db4495c285fbc031a7dae20\": container with ID starting with de0d4c7a95db56ad197915a151efda43baf8e0711db4495c285fbc031a7dae20 not found: ID does not exist" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.918048 4750 scope.go:117] "RemoveContainer" containerID="a549ad25fa6c5b2c17ca7fbfdf3190d83f4ad2a6bb01e3725bb1c36e83ba02a9" Oct 08 18:15:07 crc kubenswrapper[4750]: E1008 18:15:07.918350 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a549ad25fa6c5b2c17ca7fbfdf3190d83f4ad2a6bb01e3725bb1c36e83ba02a9\": container with ID starting with a549ad25fa6c5b2c17ca7fbfdf3190d83f4ad2a6bb01e3725bb1c36e83ba02a9 not found: ID does not exist" containerID="a549ad25fa6c5b2c17ca7fbfdf3190d83f4ad2a6bb01e3725bb1c36e83ba02a9" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.918373 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a549ad25fa6c5b2c17ca7fbfdf3190d83f4ad2a6bb01e3725bb1c36e83ba02a9"} err="failed to get container status \"a549ad25fa6c5b2c17ca7fbfdf3190d83f4ad2a6bb01e3725bb1c36e83ba02a9\": rpc error: code = NotFound desc = could not find container \"a549ad25fa6c5b2c17ca7fbfdf3190d83f4ad2a6bb01e3725bb1c36e83ba02a9\": container with ID starting with a549ad25fa6c5b2c17ca7fbfdf3190d83f4ad2a6bb01e3725bb1c36e83ba02a9 not found: ID does not exist" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.918390 4750 scope.go:117] "RemoveContainer" containerID="e11ce50ba6baf9c810ca6caf44f8ab624bdb32cd43e2a1ee059081a5dc44f6c2" Oct 08 18:15:07 crc 
kubenswrapper[4750]: E1008 18:15:07.918675 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e11ce50ba6baf9c810ca6caf44f8ab624bdb32cd43e2a1ee059081a5dc44f6c2\": container with ID starting with e11ce50ba6baf9c810ca6caf44f8ab624bdb32cd43e2a1ee059081a5dc44f6c2 not found: ID does not exist" containerID="e11ce50ba6baf9c810ca6caf44f8ab624bdb32cd43e2a1ee059081a5dc44f6c2" Oct 08 18:15:07 crc kubenswrapper[4750]: I1008 18:15:07.918696 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e11ce50ba6baf9c810ca6caf44f8ab624bdb32cd43e2a1ee059081a5dc44f6c2"} err="failed to get container status \"e11ce50ba6baf9c810ca6caf44f8ab624bdb32cd43e2a1ee059081a5dc44f6c2\": rpc error: code = NotFound desc = could not find container \"e11ce50ba6baf9c810ca6caf44f8ab624bdb32cd43e2a1ee059081a5dc44f6c2\": container with ID starting with e11ce50ba6baf9c810ca6caf44f8ab624bdb32cd43e2a1ee059081a5dc44f6c2 not found: ID does not exist" Oct 08 18:15:08 crc kubenswrapper[4750]: I1008 18:15:08.135529 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mhkml" Oct 08 18:15:08 crc kubenswrapper[4750]: I1008 18:15:08.285919 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/158d74a9-bbc8-4cd9-9507-eb477aa3a5a9-utilities\") pod \"158d74a9-bbc8-4cd9-9507-eb477aa3a5a9\" (UID: \"158d74a9-bbc8-4cd9-9507-eb477aa3a5a9\") " Oct 08 18:15:08 crc kubenswrapper[4750]: I1008 18:15:08.285977 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94fsm\" (UniqueName: \"kubernetes.io/projected/158d74a9-bbc8-4cd9-9507-eb477aa3a5a9-kube-api-access-94fsm\") pod \"158d74a9-bbc8-4cd9-9507-eb477aa3a5a9\" (UID: \"158d74a9-bbc8-4cd9-9507-eb477aa3a5a9\") " Oct 08 18:15:08 crc kubenswrapper[4750]: I1008 18:15:08.286031 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/158d74a9-bbc8-4cd9-9507-eb477aa3a5a9-catalog-content\") pod \"158d74a9-bbc8-4cd9-9507-eb477aa3a5a9\" (UID: \"158d74a9-bbc8-4cd9-9507-eb477aa3a5a9\") " Oct 08 18:15:08 crc kubenswrapper[4750]: I1008 18:15:08.287509 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/158d74a9-bbc8-4cd9-9507-eb477aa3a5a9-utilities" (OuterVolumeSpecName: "utilities") pod "158d74a9-bbc8-4cd9-9507-eb477aa3a5a9" (UID: "158d74a9-bbc8-4cd9-9507-eb477aa3a5a9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:15:08 crc kubenswrapper[4750]: I1008 18:15:08.296165 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/158d74a9-bbc8-4cd9-9507-eb477aa3a5a9-kube-api-access-94fsm" (OuterVolumeSpecName: "kube-api-access-94fsm") pod "158d74a9-bbc8-4cd9-9507-eb477aa3a5a9" (UID: "158d74a9-bbc8-4cd9-9507-eb477aa3a5a9"). InnerVolumeSpecName "kube-api-access-94fsm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:15:08 crc kubenswrapper[4750]: I1008 18:15:08.343289 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/158d74a9-bbc8-4cd9-9507-eb477aa3a5a9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "158d74a9-bbc8-4cd9-9507-eb477aa3a5a9" (UID: "158d74a9-bbc8-4cd9-9507-eb477aa3a5a9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:15:08 crc kubenswrapper[4750]: I1008 18:15:08.387286 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/158d74a9-bbc8-4cd9-9507-eb477aa3a5a9-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 18:15:08 crc kubenswrapper[4750]: I1008 18:15:08.387323 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94fsm\" (UniqueName: \"kubernetes.io/projected/158d74a9-bbc8-4cd9-9507-eb477aa3a5a9-kube-api-access-94fsm\") on node \"crc\" DevicePath \"\"" Oct 08 18:15:08 crc kubenswrapper[4750]: I1008 18:15:08.387334 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/158d74a9-bbc8-4cd9-9507-eb477aa3a5a9-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 18:15:08 crc kubenswrapper[4750]: I1008 18:15:08.738764 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57f5ce4d-cf29-4d65-aa17-f23c042a2602" path="/var/lib/kubelet/pods/57f5ce4d-cf29-4d65-aa17-f23c042a2602/volumes" Oct 08 18:15:08 crc kubenswrapper[4750]: I1008 18:15:08.739372 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a3b8246-69d2-405e-a414-c21b9cb3b31d" path="/var/lib/kubelet/pods/7a3b8246-69d2-405e-a414-c21b9cb3b31d/volumes" Oct 08 18:15:08 crc kubenswrapper[4750]: I1008 18:15:08.740094 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="884c29cd-b721-400c-b319-510f191f02dd" 
path="/var/lib/kubelet/pods/884c29cd-b721-400c-b319-510f191f02dd/volumes" Oct 08 18:15:08 crc kubenswrapper[4750]: I1008 18:15:08.741039 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7f5e091-d02e-45aa-bc15-58841948bcd6" path="/var/lib/kubelet/pods/c7f5e091-d02e-45aa-bc15-58841948bcd6/volumes" Oct 08 18:15:08 crc kubenswrapper[4750]: I1008 18:15:08.743969 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mhkml" Oct 08 18:15:08 crc kubenswrapper[4750]: I1008 18:15:08.743968 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mhkml" event={"ID":"158d74a9-bbc8-4cd9-9507-eb477aa3a5a9","Type":"ContainerDied","Data":"6f31845f1acc1c240b1c1b1751759a7a20093283610b710cfcf724a4243197dd"} Oct 08 18:15:08 crc kubenswrapper[4750]: I1008 18:15:08.744118 4750 scope.go:117] "RemoveContainer" containerID="c2994da194a0aec8c1f131b849075d8d0e601bcfb43fe1bcffc306b471f4ef2a" Oct 08 18:15:08 crc kubenswrapper[4750]: I1008 18:15:08.746895 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xspt4" event={"ID":"e5cb56e3-bb34-427d-b4c1-7dec95f40023","Type":"ContainerStarted","Data":"63cd4a229c43912f9ec5e888353ac65ec95464ecfeb1379bcb8779924cb81993"} Oct 08 18:15:08 crc kubenswrapper[4750]: I1008 18:15:08.746929 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xspt4" event={"ID":"e5cb56e3-bb34-427d-b4c1-7dec95f40023","Type":"ContainerStarted","Data":"af495477745de2cb9e6026c4b8006335dc1681407d8125f55a59f01cb9112dac"} Oct 08 18:15:08 crc kubenswrapper[4750]: I1008 18:15:08.747253 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-xspt4" Oct 08 18:15:08 crc kubenswrapper[4750]: I1008 18:15:08.750603 4750 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-xspt4" Oct 08 18:15:08 crc kubenswrapper[4750]: I1008 18:15:08.761009 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-xspt4" podStartSLOduration=2.760993132 podStartE2EDuration="2.760993132s" podCreationTimestamp="2025-10-08 18:15:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:15:08.759015461 +0000 UTC m=+264.671986474" watchObservedRunningTime="2025-10-08 18:15:08.760993132 +0000 UTC m=+264.673964145" Oct 08 18:15:08 crc kubenswrapper[4750]: I1008 18:15:08.764932 4750 scope.go:117] "RemoveContainer" containerID="f2959ab4a0ab540db14046db17a204910d3c6a3d33b91b7e042920f311139411" Oct 08 18:15:08 crc kubenswrapper[4750]: I1008 18:15:08.788352 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mhkml"] Oct 08 18:15:08 crc kubenswrapper[4750]: I1008 18:15:08.794763 4750 scope.go:117] "RemoveContainer" containerID="26053d35d305487354ca5045e53912c890b03f9eaa942522bbdc38571da40cdb" Oct 08 18:15:08 crc kubenswrapper[4750]: I1008 18:15:08.797019 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mhkml"] Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.095622 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lnrjm"] Oct 08 18:15:09 crc kubenswrapper[4750]: E1008 18:15:09.096100 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57f5ce4d-cf29-4d65-aa17-f23c042a2602" containerName="registry-server" Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.096111 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="57f5ce4d-cf29-4d65-aa17-f23c042a2602" containerName="registry-server" Oct 08 18:15:09 crc kubenswrapper[4750]: 
E1008 18:15:09.096124 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57f5ce4d-cf29-4d65-aa17-f23c042a2602" containerName="extract-utilities" Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.096129 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="57f5ce4d-cf29-4d65-aa17-f23c042a2602" containerName="extract-utilities" Oct 08 18:15:09 crc kubenswrapper[4750]: E1008 18:15:09.096136 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7f5e091-d02e-45aa-bc15-58841948bcd6" containerName="registry-server" Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.096143 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7f5e091-d02e-45aa-bc15-58841948bcd6" containerName="registry-server" Oct 08 18:15:09 crc kubenswrapper[4750]: E1008 18:15:09.096153 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a3b8246-69d2-405e-a414-c21b9cb3b31d" containerName="registry-server" Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.096158 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a3b8246-69d2-405e-a414-c21b9cb3b31d" containerName="registry-server" Oct 08 18:15:09 crc kubenswrapper[4750]: E1008 18:15:09.096165 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="158d74a9-bbc8-4cd9-9507-eb477aa3a5a9" containerName="extract-utilities" Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.096170 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="158d74a9-bbc8-4cd9-9507-eb477aa3a5a9" containerName="extract-utilities" Oct 08 18:15:09 crc kubenswrapper[4750]: E1008 18:15:09.096179 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="884c29cd-b721-400c-b319-510f191f02dd" containerName="marketplace-operator" Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.096185 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="884c29cd-b721-400c-b319-510f191f02dd" containerName="marketplace-operator" Oct 08 18:15:09 crc 
kubenswrapper[4750]: E1008 18:15:09.096193 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7f5e091-d02e-45aa-bc15-58841948bcd6" containerName="extract-content" Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.096198 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7f5e091-d02e-45aa-bc15-58841948bcd6" containerName="extract-content" Oct 08 18:15:09 crc kubenswrapper[4750]: E1008 18:15:09.096204 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="158d74a9-bbc8-4cd9-9507-eb477aa3a5a9" containerName="extract-content" Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.096211 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="158d74a9-bbc8-4cd9-9507-eb477aa3a5a9" containerName="extract-content" Oct 08 18:15:09 crc kubenswrapper[4750]: E1008 18:15:09.096218 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="158d74a9-bbc8-4cd9-9507-eb477aa3a5a9" containerName="registry-server" Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.096223 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="158d74a9-bbc8-4cd9-9507-eb477aa3a5a9" containerName="registry-server" Oct 08 18:15:09 crc kubenswrapper[4750]: E1008 18:15:09.096233 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a3b8246-69d2-405e-a414-c21b9cb3b31d" containerName="extract-utilities" Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.096238 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a3b8246-69d2-405e-a414-c21b9cb3b31d" containerName="extract-utilities" Oct 08 18:15:09 crc kubenswrapper[4750]: E1008 18:15:09.096246 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7f5e091-d02e-45aa-bc15-58841948bcd6" containerName="extract-utilities" Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.096251 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7f5e091-d02e-45aa-bc15-58841948bcd6" containerName="extract-utilities" Oct 08 18:15:09 crc 
kubenswrapper[4750]: E1008 18:15:09.096258 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a3b8246-69d2-405e-a414-c21b9cb3b31d" containerName="extract-content" Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.096263 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a3b8246-69d2-405e-a414-c21b9cb3b31d" containerName="extract-content" Oct 08 18:15:09 crc kubenswrapper[4750]: E1008 18:15:09.096272 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57f5ce4d-cf29-4d65-aa17-f23c042a2602" containerName="extract-content" Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.096277 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="57f5ce4d-cf29-4d65-aa17-f23c042a2602" containerName="extract-content" Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.096357 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="57f5ce4d-cf29-4d65-aa17-f23c042a2602" containerName="registry-server" Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.096367 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="158d74a9-bbc8-4cd9-9507-eb477aa3a5a9" containerName="registry-server" Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.096376 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7f5e091-d02e-45aa-bc15-58841948bcd6" containerName="registry-server" Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.096384 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="884c29cd-b721-400c-b319-510f191f02dd" containerName="marketplace-operator" Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.096391 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a3b8246-69d2-405e-a414-c21b9cb3b31d" containerName="registry-server" Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.097227 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lnrjm" Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.099719 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.104306 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lnrjm"] Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.196529 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56032d2a-4cde-430a-a40e-ab1eed32b651-catalog-content\") pod \"certified-operators-lnrjm\" (UID: \"56032d2a-4cde-430a-a40e-ab1eed32b651\") " pod="openshift-marketplace/certified-operators-lnrjm" Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.196629 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56032d2a-4cde-430a-a40e-ab1eed32b651-utilities\") pod \"certified-operators-lnrjm\" (UID: \"56032d2a-4cde-430a-a40e-ab1eed32b651\") " pod="openshift-marketplace/certified-operators-lnrjm" Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.196667 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdkfg\" (UniqueName: \"kubernetes.io/projected/56032d2a-4cde-430a-a40e-ab1eed32b651-kube-api-access-sdkfg\") pod \"certified-operators-lnrjm\" (UID: \"56032d2a-4cde-430a-a40e-ab1eed32b651\") " pod="openshift-marketplace/certified-operators-lnrjm" Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.294697 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9trpl"] Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.295621 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9trpl" Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.297323 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56032d2a-4cde-430a-a40e-ab1eed32b651-catalog-content\") pod \"certified-operators-lnrjm\" (UID: \"56032d2a-4cde-430a-a40e-ab1eed32b651\") " pod="openshift-marketplace/certified-operators-lnrjm" Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.297383 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56032d2a-4cde-430a-a40e-ab1eed32b651-utilities\") pod \"certified-operators-lnrjm\" (UID: \"56032d2a-4cde-430a-a40e-ab1eed32b651\") " pod="openshift-marketplace/certified-operators-lnrjm" Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.297417 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdkfg\" (UniqueName: \"kubernetes.io/projected/56032d2a-4cde-430a-a40e-ab1eed32b651-kube-api-access-sdkfg\") pod \"certified-operators-lnrjm\" (UID: \"56032d2a-4cde-430a-a40e-ab1eed32b651\") " pod="openshift-marketplace/certified-operators-lnrjm" Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.298199 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56032d2a-4cde-430a-a40e-ab1eed32b651-catalog-content\") pod \"certified-operators-lnrjm\" (UID: \"56032d2a-4cde-430a-a40e-ab1eed32b651\") " pod="openshift-marketplace/certified-operators-lnrjm" Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.298451 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56032d2a-4cde-430a-a40e-ab1eed32b651-utilities\") pod \"certified-operators-lnrjm\" (UID: \"56032d2a-4cde-430a-a40e-ab1eed32b651\") " 
pod="openshift-marketplace/certified-operators-lnrjm" Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.298850 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.305140 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9trpl"] Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.316883 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdkfg\" (UniqueName: \"kubernetes.io/projected/56032d2a-4cde-430a-a40e-ab1eed32b651-kube-api-access-sdkfg\") pod \"certified-operators-lnrjm\" (UID: \"56032d2a-4cde-430a-a40e-ab1eed32b651\") " pod="openshift-marketplace/certified-operators-lnrjm" Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.398763 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3a62523-66bd-4167-890d-2cddc40f8695-utilities\") pod \"redhat-marketplace-9trpl\" (UID: \"e3a62523-66bd-4167-890d-2cddc40f8695\") " pod="openshift-marketplace/redhat-marketplace-9trpl" Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.398845 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjsv4\" (UniqueName: \"kubernetes.io/projected/e3a62523-66bd-4167-890d-2cddc40f8695-kube-api-access-hjsv4\") pod \"redhat-marketplace-9trpl\" (UID: \"e3a62523-66bd-4167-890d-2cddc40f8695\") " pod="openshift-marketplace/redhat-marketplace-9trpl" Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.398866 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3a62523-66bd-4167-890d-2cddc40f8695-catalog-content\") pod \"redhat-marketplace-9trpl\" (UID: 
\"e3a62523-66bd-4167-890d-2cddc40f8695\") " pod="openshift-marketplace/redhat-marketplace-9trpl" Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.424103 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lnrjm" Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.499837 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3a62523-66bd-4167-890d-2cddc40f8695-utilities\") pod \"redhat-marketplace-9trpl\" (UID: \"e3a62523-66bd-4167-890d-2cddc40f8695\") " pod="openshift-marketplace/redhat-marketplace-9trpl" Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.499897 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjsv4\" (UniqueName: \"kubernetes.io/projected/e3a62523-66bd-4167-890d-2cddc40f8695-kube-api-access-hjsv4\") pod \"redhat-marketplace-9trpl\" (UID: \"e3a62523-66bd-4167-890d-2cddc40f8695\") " pod="openshift-marketplace/redhat-marketplace-9trpl" Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.499928 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3a62523-66bd-4167-890d-2cddc40f8695-catalog-content\") pod \"redhat-marketplace-9trpl\" (UID: \"e3a62523-66bd-4167-890d-2cddc40f8695\") " pod="openshift-marketplace/redhat-marketplace-9trpl" Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.500349 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3a62523-66bd-4167-890d-2cddc40f8695-catalog-content\") pod \"redhat-marketplace-9trpl\" (UID: \"e3a62523-66bd-4167-890d-2cddc40f8695\") " pod="openshift-marketplace/redhat-marketplace-9trpl" Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.500563 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3a62523-66bd-4167-890d-2cddc40f8695-utilities\") pod \"redhat-marketplace-9trpl\" (UID: \"e3a62523-66bd-4167-890d-2cddc40f8695\") " pod="openshift-marketplace/redhat-marketplace-9trpl" Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.527622 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjsv4\" (UniqueName: \"kubernetes.io/projected/e3a62523-66bd-4167-890d-2cddc40f8695-kube-api-access-hjsv4\") pod \"redhat-marketplace-9trpl\" (UID: \"e3a62523-66bd-4167-890d-2cddc40f8695\") " pod="openshift-marketplace/redhat-marketplace-9trpl" Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.608745 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9trpl" Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.641897 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lnrjm"] Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.762694 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lnrjm" event={"ID":"56032d2a-4cde-430a-a40e-ab1eed32b651","Type":"ContainerStarted","Data":"b1b9941a82d85e59ae4fad18f7d044c404f11c43023d7c25acc730615bf7af07"} Oct 08 18:15:09 crc kubenswrapper[4750]: I1008 18:15:09.817489 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9trpl"] Oct 08 18:15:10 crc kubenswrapper[4750]: I1008 18:15:10.741487 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="158d74a9-bbc8-4cd9-9507-eb477aa3a5a9" path="/var/lib/kubelet/pods/158d74a9-bbc8-4cd9-9507-eb477aa3a5a9/volumes" Oct 08 18:15:10 crc kubenswrapper[4750]: I1008 18:15:10.783505 4750 generic.go:334] "Generic (PLEG): container finished" podID="56032d2a-4cde-430a-a40e-ab1eed32b651" containerID="d711ece9361a65b8ab54604285a951d77eaf0277b1457d36c3cc091a7ee8a54a" 
exitCode=0 Oct 08 18:15:10 crc kubenswrapper[4750]: I1008 18:15:10.784145 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lnrjm" event={"ID":"56032d2a-4cde-430a-a40e-ab1eed32b651","Type":"ContainerDied","Data":"d711ece9361a65b8ab54604285a951d77eaf0277b1457d36c3cc091a7ee8a54a"} Oct 08 18:15:10 crc kubenswrapper[4750]: I1008 18:15:10.786680 4750 generic.go:334] "Generic (PLEG): container finished" podID="e3a62523-66bd-4167-890d-2cddc40f8695" containerID="364539b867c6530f0958f4ab645b2372068c487c1fa908c535d57ed948beb90f" exitCode=0 Oct 08 18:15:10 crc kubenswrapper[4750]: I1008 18:15:10.787227 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9trpl" event={"ID":"e3a62523-66bd-4167-890d-2cddc40f8695","Type":"ContainerDied","Data":"364539b867c6530f0958f4ab645b2372068c487c1fa908c535d57ed948beb90f"} Oct 08 18:15:10 crc kubenswrapper[4750]: I1008 18:15:10.787258 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9trpl" event={"ID":"e3a62523-66bd-4167-890d-2cddc40f8695","Type":"ContainerStarted","Data":"6ac6139ced131c9d63014ed75821eb00dc2216ba54ecd46c16f75bf211b53dbc"} Oct 08 18:15:11 crc kubenswrapper[4750]: I1008 18:15:11.498621 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-v4wpt"] Oct 08 18:15:11 crc kubenswrapper[4750]: I1008 18:15:11.499927 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v4wpt" Oct 08 18:15:11 crc kubenswrapper[4750]: I1008 18:15:11.503019 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 08 18:15:11 crc kubenswrapper[4750]: I1008 18:15:11.506367 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v4wpt"] Oct 08 18:15:11 crc kubenswrapper[4750]: I1008 18:15:11.626009 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19c82dda-1fe3-489d-a6bb-28bd646fc3ad-utilities\") pod \"redhat-operators-v4wpt\" (UID: \"19c82dda-1fe3-489d-a6bb-28bd646fc3ad\") " pod="openshift-marketplace/redhat-operators-v4wpt" Oct 08 18:15:11 crc kubenswrapper[4750]: I1008 18:15:11.626521 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19c82dda-1fe3-489d-a6bb-28bd646fc3ad-catalog-content\") pod \"redhat-operators-v4wpt\" (UID: \"19c82dda-1fe3-489d-a6bb-28bd646fc3ad\") " pod="openshift-marketplace/redhat-operators-v4wpt" Oct 08 18:15:11 crc kubenswrapper[4750]: I1008 18:15:11.626629 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bqnr\" (UniqueName: \"kubernetes.io/projected/19c82dda-1fe3-489d-a6bb-28bd646fc3ad-kube-api-access-4bqnr\") pod \"redhat-operators-v4wpt\" (UID: \"19c82dda-1fe3-489d-a6bb-28bd646fc3ad\") " pod="openshift-marketplace/redhat-operators-v4wpt" Oct 08 18:15:11 crc kubenswrapper[4750]: I1008 18:15:11.703763 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vn7zk"] Oct 08 18:15:11 crc kubenswrapper[4750]: I1008 18:15:11.705087 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vn7zk" Oct 08 18:15:11 crc kubenswrapper[4750]: I1008 18:15:11.708384 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 08 18:15:11 crc kubenswrapper[4750]: I1008 18:15:11.711709 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vn7zk"] Oct 08 18:15:11 crc kubenswrapper[4750]: I1008 18:15:11.727675 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bqnr\" (UniqueName: \"kubernetes.io/projected/19c82dda-1fe3-489d-a6bb-28bd646fc3ad-kube-api-access-4bqnr\") pod \"redhat-operators-v4wpt\" (UID: \"19c82dda-1fe3-489d-a6bb-28bd646fc3ad\") " pod="openshift-marketplace/redhat-operators-v4wpt" Oct 08 18:15:11 crc kubenswrapper[4750]: I1008 18:15:11.728344 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19c82dda-1fe3-489d-a6bb-28bd646fc3ad-utilities\") pod \"redhat-operators-v4wpt\" (UID: \"19c82dda-1fe3-489d-a6bb-28bd646fc3ad\") " pod="openshift-marketplace/redhat-operators-v4wpt" Oct 08 18:15:11 crc kubenswrapper[4750]: I1008 18:15:11.728406 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19c82dda-1fe3-489d-a6bb-28bd646fc3ad-catalog-content\") pod \"redhat-operators-v4wpt\" (UID: \"19c82dda-1fe3-489d-a6bb-28bd646fc3ad\") " pod="openshift-marketplace/redhat-operators-v4wpt" Oct 08 18:15:11 crc kubenswrapper[4750]: I1008 18:15:11.728917 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19c82dda-1fe3-489d-a6bb-28bd646fc3ad-catalog-content\") pod \"redhat-operators-v4wpt\" (UID: \"19c82dda-1fe3-489d-a6bb-28bd646fc3ad\") " pod="openshift-marketplace/redhat-operators-v4wpt" 
Oct 08 18:15:11 crc kubenswrapper[4750]: I1008 18:15:11.731846 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19c82dda-1fe3-489d-a6bb-28bd646fc3ad-utilities\") pod \"redhat-operators-v4wpt\" (UID: \"19c82dda-1fe3-489d-a6bb-28bd646fc3ad\") " pod="openshift-marketplace/redhat-operators-v4wpt" Oct 08 18:15:11 crc kubenswrapper[4750]: I1008 18:15:11.747769 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bqnr\" (UniqueName: \"kubernetes.io/projected/19c82dda-1fe3-489d-a6bb-28bd646fc3ad-kube-api-access-4bqnr\") pod \"redhat-operators-v4wpt\" (UID: \"19c82dda-1fe3-489d-a6bb-28bd646fc3ad\") " pod="openshift-marketplace/redhat-operators-v4wpt" Oct 08 18:15:11 crc kubenswrapper[4750]: I1008 18:15:11.792686 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lnrjm" event={"ID":"56032d2a-4cde-430a-a40e-ab1eed32b651","Type":"ContainerStarted","Data":"db9cf5b8f2610a5fbc420a3f83cbc9bfa1ff61b998d7d2dc98c87530d73cd233"} Oct 08 18:15:11 crc kubenswrapper[4750]: I1008 18:15:11.793959 4750 generic.go:334] "Generic (PLEG): container finished" podID="e3a62523-66bd-4167-890d-2cddc40f8695" containerID="01e4ea4172483548f0f9e04864ca821fbceca418c5a25fd74bda010752c28043" exitCode=0 Oct 08 18:15:11 crc kubenswrapper[4750]: I1008 18:15:11.794049 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9trpl" event={"ID":"e3a62523-66bd-4167-890d-2cddc40f8695","Type":"ContainerDied","Data":"01e4ea4172483548f0f9e04864ca821fbceca418c5a25fd74bda010752c28043"} Oct 08 18:15:11 crc kubenswrapper[4750]: I1008 18:15:11.829134 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d789a75-07c0-4e1d-889b-8d2221b1ff95-utilities\") pod \"community-operators-vn7zk\" (UID: 
\"0d789a75-07c0-4e1d-889b-8d2221b1ff95\") " pod="openshift-marketplace/community-operators-vn7zk" Oct 08 18:15:11 crc kubenswrapper[4750]: I1008 18:15:11.829264 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5679d\" (UniqueName: \"kubernetes.io/projected/0d789a75-07c0-4e1d-889b-8d2221b1ff95-kube-api-access-5679d\") pod \"community-operators-vn7zk\" (UID: \"0d789a75-07c0-4e1d-889b-8d2221b1ff95\") " pod="openshift-marketplace/community-operators-vn7zk" Oct 08 18:15:11 crc kubenswrapper[4750]: I1008 18:15:11.829532 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d789a75-07c0-4e1d-889b-8d2221b1ff95-catalog-content\") pod \"community-operators-vn7zk\" (UID: \"0d789a75-07c0-4e1d-889b-8d2221b1ff95\") " pod="openshift-marketplace/community-operators-vn7zk" Oct 08 18:15:11 crc kubenswrapper[4750]: I1008 18:15:11.857421 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v4wpt" Oct 08 18:15:11 crc kubenswrapper[4750]: I1008 18:15:11.930405 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5679d\" (UniqueName: \"kubernetes.io/projected/0d789a75-07c0-4e1d-889b-8d2221b1ff95-kube-api-access-5679d\") pod \"community-operators-vn7zk\" (UID: \"0d789a75-07c0-4e1d-889b-8d2221b1ff95\") " pod="openshift-marketplace/community-operators-vn7zk" Oct 08 18:15:11 crc kubenswrapper[4750]: I1008 18:15:11.930707 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d789a75-07c0-4e1d-889b-8d2221b1ff95-catalog-content\") pod \"community-operators-vn7zk\" (UID: \"0d789a75-07c0-4e1d-889b-8d2221b1ff95\") " pod="openshift-marketplace/community-operators-vn7zk" Oct 08 18:15:11 crc kubenswrapper[4750]: I1008 18:15:11.930751 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d789a75-07c0-4e1d-889b-8d2221b1ff95-utilities\") pod \"community-operators-vn7zk\" (UID: \"0d789a75-07c0-4e1d-889b-8d2221b1ff95\") " pod="openshift-marketplace/community-operators-vn7zk" Oct 08 18:15:11 crc kubenswrapper[4750]: I1008 18:15:11.931323 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d789a75-07c0-4e1d-889b-8d2221b1ff95-catalog-content\") pod \"community-operators-vn7zk\" (UID: \"0d789a75-07c0-4e1d-889b-8d2221b1ff95\") " pod="openshift-marketplace/community-operators-vn7zk" Oct 08 18:15:11 crc kubenswrapper[4750]: I1008 18:15:11.931372 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d789a75-07c0-4e1d-889b-8d2221b1ff95-utilities\") pod \"community-operators-vn7zk\" (UID: \"0d789a75-07c0-4e1d-889b-8d2221b1ff95\") " 
pod="openshift-marketplace/community-operators-vn7zk" Oct 08 18:15:11 crc kubenswrapper[4750]: I1008 18:15:11.952579 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5679d\" (UniqueName: \"kubernetes.io/projected/0d789a75-07c0-4e1d-889b-8d2221b1ff95-kube-api-access-5679d\") pod \"community-operators-vn7zk\" (UID: \"0d789a75-07c0-4e1d-889b-8d2221b1ff95\") " pod="openshift-marketplace/community-operators-vn7zk" Oct 08 18:15:12 crc kubenswrapper[4750]: I1008 18:15:12.018369 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vn7zk" Oct 08 18:15:12 crc kubenswrapper[4750]: I1008 18:15:12.221126 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v4wpt"] Oct 08 18:15:12 crc kubenswrapper[4750]: W1008 18:15:12.228317 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19c82dda_1fe3_489d_a6bb_28bd646fc3ad.slice/crio-91e932170281ad49f5c05080c2d08e2a3bfce1638847ed9c5078b2f9dbd21280 WatchSource:0}: Error finding container 91e932170281ad49f5c05080c2d08e2a3bfce1638847ed9c5078b2f9dbd21280: Status 404 returned error can't find the container with id 91e932170281ad49f5c05080c2d08e2a3bfce1638847ed9c5078b2f9dbd21280 Oct 08 18:15:12 crc kubenswrapper[4750]: I1008 18:15:12.434337 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vn7zk"] Oct 08 18:15:12 crc kubenswrapper[4750]: W1008 18:15:12.464225 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d789a75_07c0_4e1d_889b_8d2221b1ff95.slice/crio-e9fd58fa70d1fb6e4448c86f99d9f2c39214feee9c0b9233ec3ce8030a0e22cb WatchSource:0}: Error finding container e9fd58fa70d1fb6e4448c86f99d9f2c39214feee9c0b9233ec3ce8030a0e22cb: Status 404 returned error can't find the container 
with id e9fd58fa70d1fb6e4448c86f99d9f2c39214feee9c0b9233ec3ce8030a0e22cb Oct 08 18:15:12 crc kubenswrapper[4750]: I1008 18:15:12.799802 4750 generic.go:334] "Generic (PLEG): container finished" podID="0d789a75-07c0-4e1d-889b-8d2221b1ff95" containerID="abe2b4abb9918629e30584aef1cd873380dbf087fe64acb08b1b530253652354" exitCode=0 Oct 08 18:15:12 crc kubenswrapper[4750]: I1008 18:15:12.799903 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vn7zk" event={"ID":"0d789a75-07c0-4e1d-889b-8d2221b1ff95","Type":"ContainerDied","Data":"abe2b4abb9918629e30584aef1cd873380dbf087fe64acb08b1b530253652354"} Oct 08 18:15:12 crc kubenswrapper[4750]: I1008 18:15:12.801693 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vn7zk" event={"ID":"0d789a75-07c0-4e1d-889b-8d2221b1ff95","Type":"ContainerStarted","Data":"e9fd58fa70d1fb6e4448c86f99d9f2c39214feee9c0b9233ec3ce8030a0e22cb"} Oct 08 18:15:12 crc kubenswrapper[4750]: I1008 18:15:12.804584 4750 generic.go:334] "Generic (PLEG): container finished" podID="19c82dda-1fe3-489d-a6bb-28bd646fc3ad" containerID="3760d25c4c31506ef5be8ab160723efbea7c8c634b15a478e0add61b6a612a91" exitCode=0 Oct 08 18:15:12 crc kubenswrapper[4750]: I1008 18:15:12.804673 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v4wpt" event={"ID":"19c82dda-1fe3-489d-a6bb-28bd646fc3ad","Type":"ContainerDied","Data":"3760d25c4c31506ef5be8ab160723efbea7c8c634b15a478e0add61b6a612a91"} Oct 08 18:15:12 crc kubenswrapper[4750]: I1008 18:15:12.804803 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v4wpt" event={"ID":"19c82dda-1fe3-489d-a6bb-28bd646fc3ad","Type":"ContainerStarted","Data":"91e932170281ad49f5c05080c2d08e2a3bfce1638847ed9c5078b2f9dbd21280"} Oct 08 18:15:12 crc kubenswrapper[4750]: I1008 18:15:12.809895 4750 generic.go:334] "Generic (PLEG): container finished" 
podID="56032d2a-4cde-430a-a40e-ab1eed32b651" containerID="db9cf5b8f2610a5fbc420a3f83cbc9bfa1ff61b998d7d2dc98c87530d73cd233" exitCode=0 Oct 08 18:15:12 crc kubenswrapper[4750]: I1008 18:15:12.809965 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lnrjm" event={"ID":"56032d2a-4cde-430a-a40e-ab1eed32b651","Type":"ContainerDied","Data":"db9cf5b8f2610a5fbc420a3f83cbc9bfa1ff61b998d7d2dc98c87530d73cd233"} Oct 08 18:15:12 crc kubenswrapper[4750]: I1008 18:15:12.813126 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9trpl" event={"ID":"e3a62523-66bd-4167-890d-2cddc40f8695","Type":"ContainerStarted","Data":"360a7118788e2d9de6838f5fb6446fee8c6674738b59012801cf72e2db7d6307"} Oct 08 18:15:12 crc kubenswrapper[4750]: I1008 18:15:12.865315 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9trpl" podStartSLOduration=2.414913449 podStartE2EDuration="3.865296597s" podCreationTimestamp="2025-10-08 18:15:09 +0000 UTC" firstStartedPulling="2025-10-08 18:15:10.789699898 +0000 UTC m=+266.702670911" lastFinishedPulling="2025-10-08 18:15:12.240083046 +0000 UTC m=+268.153054059" observedRunningTime="2025-10-08 18:15:12.864696493 +0000 UTC m=+268.777667526" watchObservedRunningTime="2025-10-08 18:15:12.865296597 +0000 UTC m=+268.778267630" Oct 08 18:15:13 crc kubenswrapper[4750]: I1008 18:15:13.819181 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v4wpt" event={"ID":"19c82dda-1fe3-489d-a6bb-28bd646fc3ad","Type":"ContainerStarted","Data":"ce7412b64302b6ed08433b867ac075c889e8d68a191608754b5e6a2a455bc003"} Oct 08 18:15:13 crc kubenswrapper[4750]: I1008 18:15:13.823344 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lnrjm" 
event={"ID":"56032d2a-4cde-430a-a40e-ab1eed32b651","Type":"ContainerStarted","Data":"fc0458d44b9dd84fe6de76c0d3dd8aa9108a04f82f602a2122c2477ae14e1558"} Oct 08 18:15:13 crc kubenswrapper[4750]: I1008 18:15:13.825090 4750 generic.go:334] "Generic (PLEG): container finished" podID="0d789a75-07c0-4e1d-889b-8d2221b1ff95" containerID="ddd8a75be226bc6e37ef4bc21b0ba324d3588cb38c3d55324779c3ad3227b98e" exitCode=0 Oct 08 18:15:13 crc kubenswrapper[4750]: I1008 18:15:13.825517 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vn7zk" event={"ID":"0d789a75-07c0-4e1d-889b-8d2221b1ff95","Type":"ContainerDied","Data":"ddd8a75be226bc6e37ef4bc21b0ba324d3588cb38c3d55324779c3ad3227b98e"} Oct 08 18:15:13 crc kubenswrapper[4750]: I1008 18:15:13.859342 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lnrjm" podStartSLOduration=2.120315493 podStartE2EDuration="4.859328532s" podCreationTimestamp="2025-10-08 18:15:09 +0000 UTC" firstStartedPulling="2025-10-08 18:15:10.786477889 +0000 UTC m=+266.699448902" lastFinishedPulling="2025-10-08 18:15:13.525490928 +0000 UTC m=+269.438461941" observedRunningTime="2025-10-08 18:15:13.856180043 +0000 UTC m=+269.769151056" watchObservedRunningTime="2025-10-08 18:15:13.859328532 +0000 UTC m=+269.772299545" Oct 08 18:15:14 crc kubenswrapper[4750]: I1008 18:15:14.831806 4750 generic.go:334] "Generic (PLEG): container finished" podID="19c82dda-1fe3-489d-a6bb-28bd646fc3ad" containerID="ce7412b64302b6ed08433b867ac075c889e8d68a191608754b5e6a2a455bc003" exitCode=0 Oct 08 18:15:14 crc kubenswrapper[4750]: I1008 18:15:14.831883 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v4wpt" event={"ID":"19c82dda-1fe3-489d-a6bb-28bd646fc3ad","Type":"ContainerDied","Data":"ce7412b64302b6ed08433b867ac075c889e8d68a191608754b5e6a2a455bc003"} Oct 08 18:15:15 crc kubenswrapper[4750]: I1008 18:15:15.838363 
4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v4wpt" event={"ID":"19c82dda-1fe3-489d-a6bb-28bd646fc3ad","Type":"ContainerStarted","Data":"4a812d957df60f92d56a8e090a56598004a3808c2ad463e79950f5aa3110de38"} Oct 08 18:15:15 crc kubenswrapper[4750]: I1008 18:15:15.843145 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vn7zk" event={"ID":"0d789a75-07c0-4e1d-889b-8d2221b1ff95","Type":"ContainerStarted","Data":"2b7b2af906eedd820f1771e3ff73c9bf2116be14f180ec2eb3c6beadb3960f72"} Oct 08 18:15:15 crc kubenswrapper[4750]: I1008 18:15:15.859679 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-v4wpt" podStartSLOduration=2.175477107 podStartE2EDuration="4.859662166s" podCreationTimestamp="2025-10-08 18:15:11 +0000 UTC" firstStartedPulling="2025-10-08 18:15:12.806762864 +0000 UTC m=+268.719733877" lastFinishedPulling="2025-10-08 18:15:15.490947913 +0000 UTC m=+271.403918936" observedRunningTime="2025-10-08 18:15:15.857808095 +0000 UTC m=+271.770779128" watchObservedRunningTime="2025-10-08 18:15:15.859662166 +0000 UTC m=+271.772633179" Oct 08 18:15:15 crc kubenswrapper[4750]: I1008 18:15:15.877110 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vn7zk" podStartSLOduration=3.125238796 podStartE2EDuration="4.877092132s" podCreationTimestamp="2025-10-08 18:15:11 +0000 UTC" firstStartedPulling="2025-10-08 18:15:12.800972669 +0000 UTC m=+268.713943692" lastFinishedPulling="2025-10-08 18:15:14.552826015 +0000 UTC m=+270.465797028" observedRunningTime="2025-10-08 18:15:15.873085158 +0000 UTC m=+271.786056181" watchObservedRunningTime="2025-10-08 18:15:15.877092132 +0000 UTC m=+271.790063145" Oct 08 18:15:19 crc kubenswrapper[4750]: I1008 18:15:19.424506 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-lnrjm" Oct 08 18:15:19 crc kubenswrapper[4750]: I1008 18:15:19.425139 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lnrjm" Oct 08 18:15:19 crc kubenswrapper[4750]: I1008 18:15:19.463795 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lnrjm" Oct 08 18:15:19 crc kubenswrapper[4750]: I1008 18:15:19.609239 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9trpl" Oct 08 18:15:19 crc kubenswrapper[4750]: I1008 18:15:19.609481 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9trpl" Oct 08 18:15:19 crc kubenswrapper[4750]: I1008 18:15:19.643662 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9trpl" Oct 08 18:15:19 crc kubenswrapper[4750]: I1008 18:15:19.893565 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lnrjm" Oct 08 18:15:19 crc kubenswrapper[4750]: I1008 18:15:19.893632 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9trpl" Oct 08 18:15:21 crc kubenswrapper[4750]: I1008 18:15:21.857784 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-v4wpt" Oct 08 18:15:21 crc kubenswrapper[4750]: I1008 18:15:21.858181 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-v4wpt" Oct 08 18:15:21 crc kubenswrapper[4750]: I1008 18:15:21.900269 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-v4wpt" Oct 08 18:15:21 crc kubenswrapper[4750]: I1008 
18:15:21.940669 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-v4wpt" Oct 08 18:15:22 crc kubenswrapper[4750]: I1008 18:15:22.018584 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vn7zk" Oct 08 18:15:22 crc kubenswrapper[4750]: I1008 18:15:22.018826 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vn7zk" Oct 08 18:15:22 crc kubenswrapper[4750]: I1008 18:15:22.058530 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vn7zk" Oct 08 18:15:22 crc kubenswrapper[4750]: I1008 18:15:22.910404 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vn7zk" Oct 08 18:16:29 crc kubenswrapper[4750]: I1008 18:16:29.707630 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 18:16:29 crc kubenswrapper[4750]: I1008 18:16:29.708410 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 18:16:59 crc kubenswrapper[4750]: I1008 18:16:59.707258 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Oct 08 18:16:59 crc kubenswrapper[4750]: I1008 18:16:59.707842 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 18:17:29 crc kubenswrapper[4750]: I1008 18:17:29.707486 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 18:17:29 crc kubenswrapper[4750]: I1008 18:17:29.708111 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 18:17:29 crc kubenswrapper[4750]: I1008 18:17:29.708164 4750 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-grddb" Oct 08 18:17:29 crc kubenswrapper[4750]: I1008 18:17:29.708924 4750 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"66ef4ffb40ab7463261e1244d353dfe197b84351b1cf3ab02fb7a03a2d706a56"} pod="openshift-machine-config-operator/machine-config-daemon-grddb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 18:17:29 crc kubenswrapper[4750]: I1008 18:17:29.708996 4750 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" containerID="cri-o://66ef4ffb40ab7463261e1244d353dfe197b84351b1cf3ab02fb7a03a2d706a56" gracePeriod=600 Oct 08 18:17:30 crc kubenswrapper[4750]: I1008 18:17:30.547788 4750 generic.go:334] "Generic (PLEG): container finished" podID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerID="66ef4ffb40ab7463261e1244d353dfe197b84351b1cf3ab02fb7a03a2d706a56" exitCode=0 Oct 08 18:17:30 crc kubenswrapper[4750]: I1008 18:17:30.547934 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerDied","Data":"66ef4ffb40ab7463261e1244d353dfe197b84351b1cf3ab02fb7a03a2d706a56"} Oct 08 18:17:30 crc kubenswrapper[4750]: I1008 18:17:30.548908 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerStarted","Data":"2b873e94d0aa696b6eab6a4a655b8fb4893434625aed946509deb5ffaef26cc6"} Oct 08 18:17:30 crc kubenswrapper[4750]: I1008 18:17:30.548957 4750 scope.go:117] "RemoveContainer" containerID="1b75c94360320f8907419e9d7b9a237989702c6c66dd2e87832e18c2c4570c3f" Oct 08 18:18:48 crc kubenswrapper[4750]: I1008 18:18:48.179138 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-wz76c"] Oct 08 18:18:48 crc kubenswrapper[4750]: I1008 18:18:48.180825 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-wz76c" Oct 08 18:18:48 crc kubenswrapper[4750]: I1008 18:18:48.202381 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-wz76c"] Oct 08 18:18:48 crc kubenswrapper[4750]: I1008 18:18:48.367867 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e11199c0-2867-4d77-b184-fd0d1a78fe98-installation-pull-secrets\") pod \"image-registry-66df7c8f76-wz76c\" (UID: \"e11199c0-2867-4d77-b184-fd0d1a78fe98\") " pod="openshift-image-registry/image-registry-66df7c8f76-wz76c" Oct 08 18:18:48 crc kubenswrapper[4750]: I1008 18:18:48.367943 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-wz76c\" (UID: \"e11199c0-2867-4d77-b184-fd0d1a78fe98\") " pod="openshift-image-registry/image-registry-66df7c8f76-wz76c" Oct 08 18:18:48 crc kubenswrapper[4750]: I1008 18:18:48.367982 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e11199c0-2867-4d77-b184-fd0d1a78fe98-ca-trust-extracted\") pod \"image-registry-66df7c8f76-wz76c\" (UID: \"e11199c0-2867-4d77-b184-fd0d1a78fe98\") " pod="openshift-image-registry/image-registry-66df7c8f76-wz76c" Oct 08 18:18:48 crc kubenswrapper[4750]: I1008 18:18:48.368007 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e11199c0-2867-4d77-b184-fd0d1a78fe98-registry-tls\") pod \"image-registry-66df7c8f76-wz76c\" (UID: \"e11199c0-2867-4d77-b184-fd0d1a78fe98\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-wz76c" Oct 08 18:18:48 crc kubenswrapper[4750]: I1008 18:18:48.368033 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e11199c0-2867-4d77-b184-fd0d1a78fe98-registry-certificates\") pod \"image-registry-66df7c8f76-wz76c\" (UID: \"e11199c0-2867-4d77-b184-fd0d1a78fe98\") " pod="openshift-image-registry/image-registry-66df7c8f76-wz76c" Oct 08 18:18:48 crc kubenswrapper[4750]: I1008 18:18:48.368063 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e11199c0-2867-4d77-b184-fd0d1a78fe98-bound-sa-token\") pod \"image-registry-66df7c8f76-wz76c\" (UID: \"e11199c0-2867-4d77-b184-fd0d1a78fe98\") " pod="openshift-image-registry/image-registry-66df7c8f76-wz76c" Oct 08 18:18:48 crc kubenswrapper[4750]: I1008 18:18:48.368267 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e11199c0-2867-4d77-b184-fd0d1a78fe98-trusted-ca\") pod \"image-registry-66df7c8f76-wz76c\" (UID: \"e11199c0-2867-4d77-b184-fd0d1a78fe98\") " pod="openshift-image-registry/image-registry-66df7c8f76-wz76c" Oct 08 18:18:48 crc kubenswrapper[4750]: I1008 18:18:48.368342 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cph47\" (UniqueName: \"kubernetes.io/projected/e11199c0-2867-4d77-b184-fd0d1a78fe98-kube-api-access-cph47\") pod \"image-registry-66df7c8f76-wz76c\" (UID: \"e11199c0-2867-4d77-b184-fd0d1a78fe98\") " pod="openshift-image-registry/image-registry-66df7c8f76-wz76c" Oct 08 18:18:48 crc kubenswrapper[4750]: I1008 18:18:48.398185 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-wz76c\" (UID: \"e11199c0-2867-4d77-b184-fd0d1a78fe98\") " pod="openshift-image-registry/image-registry-66df7c8f76-wz76c" Oct 08 18:18:48 crc kubenswrapper[4750]: I1008 18:18:48.469515 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cph47\" (UniqueName: \"kubernetes.io/projected/e11199c0-2867-4d77-b184-fd0d1a78fe98-kube-api-access-cph47\") pod \"image-registry-66df7c8f76-wz76c\" (UID: \"e11199c0-2867-4d77-b184-fd0d1a78fe98\") " pod="openshift-image-registry/image-registry-66df7c8f76-wz76c" Oct 08 18:18:48 crc kubenswrapper[4750]: I1008 18:18:48.469642 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e11199c0-2867-4d77-b184-fd0d1a78fe98-installation-pull-secrets\") pod \"image-registry-66df7c8f76-wz76c\" (UID: \"e11199c0-2867-4d77-b184-fd0d1a78fe98\") " pod="openshift-image-registry/image-registry-66df7c8f76-wz76c" Oct 08 18:18:48 crc kubenswrapper[4750]: I1008 18:18:48.469704 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e11199c0-2867-4d77-b184-fd0d1a78fe98-ca-trust-extracted\") pod \"image-registry-66df7c8f76-wz76c\" (UID: \"e11199c0-2867-4d77-b184-fd0d1a78fe98\") " pod="openshift-image-registry/image-registry-66df7c8f76-wz76c" Oct 08 18:18:48 crc kubenswrapper[4750]: I1008 18:18:48.469729 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e11199c0-2867-4d77-b184-fd0d1a78fe98-registry-tls\") pod \"image-registry-66df7c8f76-wz76c\" (UID: \"e11199c0-2867-4d77-b184-fd0d1a78fe98\") " pod="openshift-image-registry/image-registry-66df7c8f76-wz76c" Oct 08 18:18:48 crc kubenswrapper[4750]: I1008 18:18:48.469757 4750 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e11199c0-2867-4d77-b184-fd0d1a78fe98-registry-certificates\") pod \"image-registry-66df7c8f76-wz76c\" (UID: \"e11199c0-2867-4d77-b184-fd0d1a78fe98\") " pod="openshift-image-registry/image-registry-66df7c8f76-wz76c" Oct 08 18:18:48 crc kubenswrapper[4750]: I1008 18:18:48.469785 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e11199c0-2867-4d77-b184-fd0d1a78fe98-bound-sa-token\") pod \"image-registry-66df7c8f76-wz76c\" (UID: \"e11199c0-2867-4d77-b184-fd0d1a78fe98\") " pod="openshift-image-registry/image-registry-66df7c8f76-wz76c" Oct 08 18:18:48 crc kubenswrapper[4750]: I1008 18:18:48.469893 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e11199c0-2867-4d77-b184-fd0d1a78fe98-trusted-ca\") pod \"image-registry-66df7c8f76-wz76c\" (UID: \"e11199c0-2867-4d77-b184-fd0d1a78fe98\") " pod="openshift-image-registry/image-registry-66df7c8f76-wz76c" Oct 08 18:18:48 crc kubenswrapper[4750]: I1008 18:18:48.470308 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e11199c0-2867-4d77-b184-fd0d1a78fe98-ca-trust-extracted\") pod \"image-registry-66df7c8f76-wz76c\" (UID: \"e11199c0-2867-4d77-b184-fd0d1a78fe98\") " pod="openshift-image-registry/image-registry-66df7c8f76-wz76c" Oct 08 18:18:48 crc kubenswrapper[4750]: I1008 18:18:48.471621 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e11199c0-2867-4d77-b184-fd0d1a78fe98-registry-certificates\") pod \"image-registry-66df7c8f76-wz76c\" (UID: \"e11199c0-2867-4d77-b184-fd0d1a78fe98\") " pod="openshift-image-registry/image-registry-66df7c8f76-wz76c" Oct 
08 18:18:48 crc kubenswrapper[4750]: I1008 18:18:48.471692 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e11199c0-2867-4d77-b184-fd0d1a78fe98-trusted-ca\") pod \"image-registry-66df7c8f76-wz76c\" (UID: \"e11199c0-2867-4d77-b184-fd0d1a78fe98\") " pod="openshift-image-registry/image-registry-66df7c8f76-wz76c" Oct 08 18:18:48 crc kubenswrapper[4750]: I1008 18:18:48.481069 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e11199c0-2867-4d77-b184-fd0d1a78fe98-installation-pull-secrets\") pod \"image-registry-66df7c8f76-wz76c\" (UID: \"e11199c0-2867-4d77-b184-fd0d1a78fe98\") " pod="openshift-image-registry/image-registry-66df7c8f76-wz76c" Oct 08 18:18:48 crc kubenswrapper[4750]: I1008 18:18:48.481190 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e11199c0-2867-4d77-b184-fd0d1a78fe98-registry-tls\") pod \"image-registry-66df7c8f76-wz76c\" (UID: \"e11199c0-2867-4d77-b184-fd0d1a78fe98\") " pod="openshift-image-registry/image-registry-66df7c8f76-wz76c" Oct 08 18:18:48 crc kubenswrapper[4750]: I1008 18:18:48.492599 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e11199c0-2867-4d77-b184-fd0d1a78fe98-bound-sa-token\") pod \"image-registry-66df7c8f76-wz76c\" (UID: \"e11199c0-2867-4d77-b184-fd0d1a78fe98\") " pod="openshift-image-registry/image-registry-66df7c8f76-wz76c" Oct 08 18:18:48 crc kubenswrapper[4750]: I1008 18:18:48.500486 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cph47\" (UniqueName: \"kubernetes.io/projected/e11199c0-2867-4d77-b184-fd0d1a78fe98-kube-api-access-cph47\") pod \"image-registry-66df7c8f76-wz76c\" (UID: \"e11199c0-2867-4d77-b184-fd0d1a78fe98\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-wz76c" Oct 08 18:18:48 crc kubenswrapper[4750]: I1008 18:18:48.799996 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-wz76c" Oct 08 18:18:49 crc kubenswrapper[4750]: I1008 18:18:49.259167 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-wz76c"] Oct 08 18:18:50 crc kubenswrapper[4750]: I1008 18:18:50.014743 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-wz76c" event={"ID":"e11199c0-2867-4d77-b184-fd0d1a78fe98","Type":"ContainerStarted","Data":"203f4f93a5ae095f5c4fd8f1caeb3d32b804850a8f454cac7b27aa8a935eb55d"} Oct 08 18:18:50 crc kubenswrapper[4750]: I1008 18:18:50.014812 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-wz76c" event={"ID":"e11199c0-2867-4d77-b184-fd0d1a78fe98","Type":"ContainerStarted","Data":"976ea304ddf43ffe89cf5070c3ede6ec2915a7ad747aa189ab7429fe6c911e47"} Oct 08 18:18:50 crc kubenswrapper[4750]: I1008 18:18:50.014932 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-wz76c" Oct 08 18:18:50 crc kubenswrapper[4750]: I1008 18:18:50.038098 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-wz76c" podStartSLOduration=2.038059115 podStartE2EDuration="2.038059115s" podCreationTimestamp="2025-10-08 18:18:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:18:50.035342886 +0000 UTC m=+485.948313919" watchObservedRunningTime="2025-10-08 18:18:50.038059115 +0000 UTC m=+485.951030168" Oct 08 18:19:08 crc kubenswrapper[4750]: I1008 18:19:08.807584 4750 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-wz76c" Oct 08 18:19:08 crc kubenswrapper[4750]: I1008 18:19:08.893415 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2cxc5"] Oct 08 18:19:29 crc kubenswrapper[4750]: I1008 18:19:29.706784 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 18:19:29 crc kubenswrapper[4750]: I1008 18:19:29.707308 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 18:19:33 crc kubenswrapper[4750]: I1008 18:19:33.948223 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" podUID="e2508573-4890-48b6-9119-93560ee4c5d9" containerName="registry" containerID="cri-o://e3d9f8f9b1fb9e1fa49bbb7b7d63a207e853509d90d0de48fa49cf9b5ec305ae" gracePeriod=30 Oct 08 18:19:34 crc kubenswrapper[4750]: I1008 18:19:34.275286 4750 generic.go:334] "Generic (PLEG): container finished" podID="e2508573-4890-48b6-9119-93560ee4c5d9" containerID="e3d9f8f9b1fb9e1fa49bbb7b7d63a207e853509d90d0de48fa49cf9b5ec305ae" exitCode=0 Oct 08 18:19:34 crc kubenswrapper[4750]: I1008 18:19:34.275340 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" event={"ID":"e2508573-4890-48b6-9119-93560ee4c5d9","Type":"ContainerDied","Data":"e3d9f8f9b1fb9e1fa49bbb7b7d63a207e853509d90d0de48fa49cf9b5ec305ae"} Oct 08 
18:19:34 crc kubenswrapper[4750]: I1008 18:19:34.367622 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:19:34 crc kubenswrapper[4750]: I1008 18:19:34.509933 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsm5j\" (UniqueName: \"kubernetes.io/projected/e2508573-4890-48b6-9119-93560ee4c5d9-kube-api-access-jsm5j\") pod \"e2508573-4890-48b6-9119-93560ee4c5d9\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " Oct 08 18:19:34 crc kubenswrapper[4750]: I1008 18:19:34.509987 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2508573-4890-48b6-9119-93560ee4c5d9-bound-sa-token\") pod \"e2508573-4890-48b6-9119-93560ee4c5d9\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " Oct 08 18:19:34 crc kubenswrapper[4750]: I1008 18:19:34.510036 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2508573-4890-48b6-9119-93560ee4c5d9-installation-pull-secrets\") pod \"e2508573-4890-48b6-9119-93560ee4c5d9\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " Oct 08 18:19:34 crc kubenswrapper[4750]: I1008 18:19:34.510090 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2508573-4890-48b6-9119-93560ee4c5d9-registry-certificates\") pod \"e2508573-4890-48b6-9119-93560ee4c5d9\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " Oct 08 18:19:34 crc kubenswrapper[4750]: I1008 18:19:34.510308 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"e2508573-4890-48b6-9119-93560ee4c5d9\" (UID: 
\"e2508573-4890-48b6-9119-93560ee4c5d9\") " Oct 08 18:19:34 crc kubenswrapper[4750]: I1008 18:19:34.510350 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2508573-4890-48b6-9119-93560ee4c5d9-trusted-ca\") pod \"e2508573-4890-48b6-9119-93560ee4c5d9\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " Oct 08 18:19:34 crc kubenswrapper[4750]: I1008 18:19:34.510377 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2508573-4890-48b6-9119-93560ee4c5d9-registry-tls\") pod \"e2508573-4890-48b6-9119-93560ee4c5d9\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " Oct 08 18:19:34 crc kubenswrapper[4750]: I1008 18:19:34.510426 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2508573-4890-48b6-9119-93560ee4c5d9-ca-trust-extracted\") pod \"e2508573-4890-48b6-9119-93560ee4c5d9\" (UID: \"e2508573-4890-48b6-9119-93560ee4c5d9\") " Oct 08 18:19:34 crc kubenswrapper[4750]: I1008 18:19:34.511647 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2508573-4890-48b6-9119-93560ee4c5d9-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "e2508573-4890-48b6-9119-93560ee4c5d9" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:19:34 crc kubenswrapper[4750]: I1008 18:19:34.511709 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2508573-4890-48b6-9119-93560ee4c5d9-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "e2508573-4890-48b6-9119-93560ee4c5d9" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:19:34 crc kubenswrapper[4750]: I1008 18:19:34.516737 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2508573-4890-48b6-9119-93560ee4c5d9-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "e2508573-4890-48b6-9119-93560ee4c5d9" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:19:34 crc kubenswrapper[4750]: I1008 18:19:34.520849 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2508573-4890-48b6-9119-93560ee4c5d9-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "e2508573-4890-48b6-9119-93560ee4c5d9" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:19:34 crc kubenswrapper[4750]: I1008 18:19:34.522031 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2508573-4890-48b6-9119-93560ee4c5d9-kube-api-access-jsm5j" (OuterVolumeSpecName: "kube-api-access-jsm5j") pod "e2508573-4890-48b6-9119-93560ee4c5d9" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9"). InnerVolumeSpecName "kube-api-access-jsm5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:19:34 crc kubenswrapper[4750]: I1008 18:19:34.522390 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2508573-4890-48b6-9119-93560ee4c5d9-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "e2508573-4890-48b6-9119-93560ee4c5d9" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:19:34 crc kubenswrapper[4750]: I1008 18:19:34.523415 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "e2508573-4890-48b6-9119-93560ee4c5d9" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 08 18:19:34 crc kubenswrapper[4750]: I1008 18:19:34.534231 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2508573-4890-48b6-9119-93560ee4c5d9-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "e2508573-4890-48b6-9119-93560ee4c5d9" (UID: "e2508573-4890-48b6-9119-93560ee4c5d9"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:19:34 crc kubenswrapper[4750]: I1008 18:19:34.611710 4750 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2508573-4890-48b6-9119-93560ee4c5d9-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 08 18:19:34 crc kubenswrapper[4750]: I1008 18:19:34.611747 4750 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2508573-4890-48b6-9119-93560ee4c5d9-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 08 18:19:34 crc kubenswrapper[4750]: I1008 18:19:34.611758 4750 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2508573-4890-48b6-9119-93560ee4c5d9-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 08 18:19:34 crc kubenswrapper[4750]: I1008 18:19:34.611767 4750 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/e2508573-4890-48b6-9119-93560ee4c5d9-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 08 18:19:34 crc kubenswrapper[4750]: I1008 18:19:34.611776 4750 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2508573-4890-48b6-9119-93560ee4c5d9-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 08 18:19:34 crc kubenswrapper[4750]: I1008 18:19:34.611785 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsm5j\" (UniqueName: \"kubernetes.io/projected/e2508573-4890-48b6-9119-93560ee4c5d9-kube-api-access-jsm5j\") on node \"crc\" DevicePath \"\"" Oct 08 18:19:34 crc kubenswrapper[4750]: I1008 18:19:34.611794 4750 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2508573-4890-48b6-9119-93560ee4c5d9-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 08 18:19:35 crc kubenswrapper[4750]: I1008 18:19:35.283000 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" event={"ID":"e2508573-4890-48b6-9119-93560ee4c5d9","Type":"ContainerDied","Data":"57364b7dacfb9035c397438ead6b5118634ce30aa73e9551175634af65b84589"} Oct 08 18:19:35 crc kubenswrapper[4750]: I1008 18:19:35.283301 4750 scope.go:117] "RemoveContainer" containerID="e3d9f8f9b1fb9e1fa49bbb7b7d63a207e853509d90d0de48fa49cf9b5ec305ae" Oct 08 18:19:35 crc kubenswrapper[4750]: I1008 18:19:35.283090 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-2cxc5" Oct 08 18:19:35 crc kubenswrapper[4750]: I1008 18:19:35.304473 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2cxc5"] Oct 08 18:19:35 crc kubenswrapper[4750]: I1008 18:19:35.323922 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2cxc5"] Oct 08 18:19:36 crc kubenswrapper[4750]: I1008 18:19:36.746727 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2508573-4890-48b6-9119-93560ee4c5d9" path="/var/lib/kubelet/pods/e2508573-4890-48b6-9119-93560ee4c5d9/volumes" Oct 08 18:19:44 crc kubenswrapper[4750]: I1008 18:19:44.883239 4750 scope.go:117] "RemoveContainer" containerID="3d55d1860ee61636abdf9f67cd63a64445ce2652f3816a15edfd2a7054f77dbc" Oct 08 18:19:44 crc kubenswrapper[4750]: I1008 18:19:44.903892 4750 scope.go:117] "RemoveContainer" containerID="1d8dbdda642bfb5003c76831e0d0db760f7b52773b0a8b618fbfa922a888467b" Oct 08 18:19:59 crc kubenswrapper[4750]: I1008 18:19:59.707066 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 18:19:59 crc kubenswrapper[4750]: I1008 18:19:59.708315 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 18:20:29 crc kubenswrapper[4750]: I1008 18:20:29.707657 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 18:20:29 crc kubenswrapper[4750]: I1008 18:20:29.708240 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 18:20:29 crc kubenswrapper[4750]: I1008 18:20:29.708301 4750 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-grddb" Oct 08 18:20:29 crc kubenswrapper[4750]: I1008 18:20:29.709153 4750 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2b873e94d0aa696b6eab6a4a655b8fb4893434625aed946509deb5ffaef26cc6"} pod="openshift-machine-config-operator/machine-config-daemon-grddb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 18:20:29 crc kubenswrapper[4750]: I1008 18:20:29.709258 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" containerID="cri-o://2b873e94d0aa696b6eab6a4a655b8fb4893434625aed946509deb5ffaef26cc6" gracePeriod=600 Oct 08 18:20:30 crc kubenswrapper[4750]: I1008 18:20:30.575170 4750 generic.go:334] "Generic (PLEG): container finished" podID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerID="2b873e94d0aa696b6eab6a4a655b8fb4893434625aed946509deb5ffaef26cc6" exitCode=0 Oct 08 18:20:30 crc kubenswrapper[4750]: I1008 18:20:30.575259 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerDied","Data":"2b873e94d0aa696b6eab6a4a655b8fb4893434625aed946509deb5ffaef26cc6"} Oct 08 18:20:30 crc kubenswrapper[4750]: I1008 18:20:30.575791 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerStarted","Data":"4aa9c398c94477f10b8d76ef065deabe11c48fe9c89856d25a3a57b78914e105"} Oct 08 18:20:30 crc kubenswrapper[4750]: I1008 18:20:30.575843 4750 scope.go:117] "RemoveContainer" containerID="66ef4ffb40ab7463261e1244d353dfe197b84351b1cf3ab02fb7a03a2d706a56" Oct 08 18:20:44 crc kubenswrapper[4750]: I1008 18:20:44.946341 4750 scope.go:117] "RemoveContainer" containerID="9e0495eb96731b30d579e8c1f6a0209c4175e746067875021ac2d10345ed08d1" Oct 08 18:22:05 crc kubenswrapper[4750]: I1008 18:22:05.321847 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-p4srs"] Oct 08 18:22:05 crc kubenswrapper[4750]: E1008 18:22:05.324126 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2508573-4890-48b6-9119-93560ee4c5d9" containerName="registry" Oct 08 18:22:05 crc kubenswrapper[4750]: I1008 18:22:05.324289 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2508573-4890-48b6-9119-93560ee4c5d9" containerName="registry" Oct 08 18:22:05 crc kubenswrapper[4750]: I1008 18:22:05.324541 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2508573-4890-48b6-9119-93560ee4c5d9" containerName="registry" Oct 08 18:22:05 crc kubenswrapper[4750]: I1008 18:22:05.325332 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-p4srs" Oct 08 18:22:05 crc kubenswrapper[4750]: I1008 18:22:05.327842 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-p4srs"] Oct 08 18:22:05 crc kubenswrapper[4750]: I1008 18:22:05.329201 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Oct 08 18:22:05 crc kubenswrapper[4750]: I1008 18:22:05.329392 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Oct 08 18:22:05 crc kubenswrapper[4750]: I1008 18:22:05.329517 4750 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-nchbf" Oct 08 18:22:05 crc kubenswrapper[4750]: I1008 18:22:05.329640 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Oct 08 18:22:05 crc kubenswrapper[4750]: I1008 18:22:05.401193 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp9n8\" (UniqueName: \"kubernetes.io/projected/42cddc68-0cc7-4d38-ab9d-cb01de038724-kube-api-access-vp9n8\") pod \"crc-storage-crc-p4srs\" (UID: \"42cddc68-0cc7-4d38-ab9d-cb01de038724\") " pod="crc-storage/crc-storage-crc-p4srs" Oct 08 18:22:05 crc kubenswrapper[4750]: I1008 18:22:05.401250 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/42cddc68-0cc7-4d38-ab9d-cb01de038724-node-mnt\") pod \"crc-storage-crc-p4srs\" (UID: \"42cddc68-0cc7-4d38-ab9d-cb01de038724\") " pod="crc-storage/crc-storage-crc-p4srs" Oct 08 18:22:05 crc kubenswrapper[4750]: I1008 18:22:05.401357 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/42cddc68-0cc7-4d38-ab9d-cb01de038724-crc-storage\") pod \"crc-storage-crc-p4srs\" (UID: 
\"42cddc68-0cc7-4d38-ab9d-cb01de038724\") " pod="crc-storage/crc-storage-crc-p4srs" Oct 08 18:22:05 crc kubenswrapper[4750]: I1008 18:22:05.501713 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vp9n8\" (UniqueName: \"kubernetes.io/projected/42cddc68-0cc7-4d38-ab9d-cb01de038724-kube-api-access-vp9n8\") pod \"crc-storage-crc-p4srs\" (UID: \"42cddc68-0cc7-4d38-ab9d-cb01de038724\") " pod="crc-storage/crc-storage-crc-p4srs" Oct 08 18:22:05 crc kubenswrapper[4750]: I1008 18:22:05.501759 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/42cddc68-0cc7-4d38-ab9d-cb01de038724-node-mnt\") pod \"crc-storage-crc-p4srs\" (UID: \"42cddc68-0cc7-4d38-ab9d-cb01de038724\") " pod="crc-storage/crc-storage-crc-p4srs" Oct 08 18:22:05 crc kubenswrapper[4750]: I1008 18:22:05.501803 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/42cddc68-0cc7-4d38-ab9d-cb01de038724-crc-storage\") pod \"crc-storage-crc-p4srs\" (UID: \"42cddc68-0cc7-4d38-ab9d-cb01de038724\") " pod="crc-storage/crc-storage-crc-p4srs" Oct 08 18:22:05 crc kubenswrapper[4750]: I1008 18:22:05.502233 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/42cddc68-0cc7-4d38-ab9d-cb01de038724-node-mnt\") pod \"crc-storage-crc-p4srs\" (UID: \"42cddc68-0cc7-4d38-ab9d-cb01de038724\") " pod="crc-storage/crc-storage-crc-p4srs" Oct 08 18:22:05 crc kubenswrapper[4750]: I1008 18:22:05.502796 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/42cddc68-0cc7-4d38-ab9d-cb01de038724-crc-storage\") pod \"crc-storage-crc-p4srs\" (UID: \"42cddc68-0cc7-4d38-ab9d-cb01de038724\") " pod="crc-storage/crc-storage-crc-p4srs" Oct 08 18:22:05 crc kubenswrapper[4750]: I1008 18:22:05.521685 4750 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp9n8\" (UniqueName: \"kubernetes.io/projected/42cddc68-0cc7-4d38-ab9d-cb01de038724-kube-api-access-vp9n8\") pod \"crc-storage-crc-p4srs\" (UID: \"42cddc68-0cc7-4d38-ab9d-cb01de038724\") " pod="crc-storage/crc-storage-crc-p4srs" Oct 08 18:22:05 crc kubenswrapper[4750]: I1008 18:22:05.644915 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-p4srs" Oct 08 18:22:05 crc kubenswrapper[4750]: I1008 18:22:05.792925 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-rl7f4"] Oct 08 18:22:05 crc kubenswrapper[4750]: I1008 18:22:05.793685 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerName="ovn-controller" containerID="cri-o://ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372" gracePeriod=30 Oct 08 18:22:05 crc kubenswrapper[4750]: I1008 18:22:05.793754 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerName="nbdb" containerID="cri-o://48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a" gracePeriod=30 Oct 08 18:22:05 crc kubenswrapper[4750]: I1008 18:22:05.793833 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerName="northd" containerID="cri-o://234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9" gracePeriod=30 Oct 08 18:22:05 crc kubenswrapper[4750]: I1008 18:22:05.793904 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" 
containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6" gracePeriod=30 Oct 08 18:22:05 crc kubenswrapper[4750]: I1008 18:22:05.793899 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerName="sbdb" containerID="cri-o://621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47" gracePeriod=30 Oct 08 18:22:05 crc kubenswrapper[4750]: I1008 18:22:05.793991 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerName="ovn-acl-logging" containerID="cri-o://fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384" gracePeriod=30 Oct 08 18:22:05 crc kubenswrapper[4750]: I1008 18:22:05.794065 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerName="kube-rbac-proxy-node" containerID="cri-o://7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76" gracePeriod=30 Oct 08 18:22:05 crc kubenswrapper[4750]: I1008 18:22:05.824331 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerName="ovnkube-controller" containerID="cri-o://8f19195c19ae150418ac7dee9a0af05a97794e1f8e3f7361478405c08cafad53" gracePeriod=30 Oct 08 18:22:05 crc kubenswrapper[4750]: E1008 18:22:05.843611 4750 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-p4srs_crc-storage_42cddc68-0cc7-4d38-ab9d-cb01de038724_0(6fea9e8d5891bd1845a17bdb4ee886212ba17123851a1c6d10cbf5372b00e118): error adding pod 
crc-storage_crc-storage-crc-p4srs to CNI network \"multus-cni-network\": plugin type=\"multus-shim\" name=\"multus-cni-network\" failed (add): CmdAdd (shim): failed to send CNI request: Post \"http://dummy/cni\": EOF: StdinData: {\"binDir\":\"/var/lib/cni/bin\",\"clusterNetwork\":\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\",\"cniVersion\":\"0.3.1\",\"daemonSocketDir\":\"/run/multus/socket\",\"globalNamespaces\":\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\",\"logLevel\":\"verbose\",\"logToStderr\":true,\"name\":\"multus-cni-network\",\"namespaceIsolation\":true,\"type\":\"multus-shim\"}" Oct 08 18:22:05 crc kubenswrapper[4750]: E1008 18:22:05.843688 4750 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-p4srs_crc-storage_42cddc68-0cc7-4d38-ab9d-cb01de038724_0(6fea9e8d5891bd1845a17bdb4ee886212ba17123851a1c6d10cbf5372b00e118): error adding pod crc-storage_crc-storage-crc-p4srs to CNI network \"multus-cni-network\": plugin type=\"multus-shim\" name=\"multus-cni-network\" failed (add): CmdAdd (shim): failed to send CNI request: Post \"http://dummy/cni\": EOF: StdinData: {\"binDir\":\"/var/lib/cni/bin\",\"clusterNetwork\":\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\",\"cniVersion\":\"0.3.1\",\"daemonSocketDir\":\"/run/multus/socket\",\"globalNamespaces\":\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\",\"logLevel\":\"verbose\",\"logToStderr\":true,\"name\":\"multus-cni-network\",\"namespaceIsolation\":true,\"type\":\"multus-shim\"}" pod="crc-storage/crc-storage-crc-p4srs" Oct 08 18:22:05 crc kubenswrapper[4750]: E1008 18:22:05.843711 4750 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_crc-storage-crc-p4srs_crc-storage_42cddc68-0cc7-4d38-ab9d-cb01de038724_0(6fea9e8d5891bd1845a17bdb4ee886212ba17123851a1c6d10cbf5372b00e118): error adding pod crc-storage_crc-storage-crc-p4srs to CNI network \"multus-cni-network\": plugin type=\"multus-shim\" name=\"multus-cni-network\" failed (add): CmdAdd (shim): failed to send CNI request: Post \"http://dummy/cni\": EOF: StdinData: {\"binDir\":\"/var/lib/cni/bin\",\"clusterNetwork\":\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\",\"cniVersion\":\"0.3.1\",\"daemonSocketDir\":\"/run/multus/socket\",\"globalNamespaces\":\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\",\"logLevel\":\"verbose\",\"logToStderr\":true,\"name\":\"multus-cni-network\",\"namespaceIsolation\":true,\"type\":\"multus-shim\"}" pod="crc-storage/crc-storage-crc-p4srs" Oct 08 18:22:05 crc kubenswrapper[4750]: E1008 18:22:05.843756 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-p4srs_crc-storage(42cddc68-0cc7-4d38-ab9d-cb01de038724)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-p4srs_crc-storage(42cddc68-0cc7-4d38-ab9d-cb01de038724)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-p4srs_crc-storage_42cddc68-0cc7-4d38-ab9d-cb01de038724_0(6fea9e8d5891bd1845a17bdb4ee886212ba17123851a1c6d10cbf5372b00e118): error adding pod crc-storage_crc-storage-crc-p4srs to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): failed to send CNI request: Post \\\"http://dummy/cni\\\": EOF: StdinData: 
{\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="crc-storage/crc-storage-crc-p4srs" podUID="42cddc68-0cc7-4d38-ab9d-cb01de038724" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.137989 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rl7f4_25d63a44-9fd7-4c19-8715-6ddec94d1806/ovnkube-controller/3.log" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.139982 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rl7f4_25d63a44-9fd7-4c19-8715-6ddec94d1806/ovnkube-controller/3.log" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.143344 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rl7f4_25d63a44-9fd7-4c19-8715-6ddec94d1806/ovn-acl-logging/0.log" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.143624 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rl7f4_25d63a44-9fd7-4c19-8715-6ddec94d1806/ovn-acl-logging/0.log" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.143816 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rl7f4_25d63a44-9fd7-4c19-8715-6ddec94d1806/ovn-controller/0.log" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144028 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rl7f4_25d63a44-9fd7-4c19-8715-6ddec94d1806/ovn-controller/0.log" Oct 08 18:22:06 crc kubenswrapper[4750]: 
I1008 18:22:06.144212 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144351 4750 generic.go:334] "Generic (PLEG): container finished" podID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerID="8f19195c19ae150418ac7dee9a0af05a97794e1f8e3f7361478405c08cafad53" exitCode=0 Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144375 4750 generic.go:334] "Generic (PLEG): container finished" podID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerID="621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47" exitCode=0 Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144385 4750 generic.go:334] "Generic (PLEG): container finished" podID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerID="48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a" exitCode=0 Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144392 4750 generic.go:334] "Generic (PLEG): container finished" podID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerID="234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9" exitCode=0 Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144398 4750 generic.go:334] "Generic (PLEG): container finished" podID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerID="a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6" exitCode=0 Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144404 4750 generic.go:334] "Generic (PLEG): container finished" podID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerID="7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76" exitCode=0 Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144410 4750 generic.go:334] "Generic (PLEG): container finished" podID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerID="fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384" exitCode=143 Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 
18:22:06.144418 4750 generic.go:334] "Generic (PLEG): container finished" podID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerID="ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372" exitCode=143 Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144457 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" event={"ID":"25d63a44-9fd7-4c19-8715-6ddec94d1806","Type":"ContainerDied","Data":"8f19195c19ae150418ac7dee9a0af05a97794e1f8e3f7361478405c08cafad53"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144485 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" event={"ID":"25d63a44-9fd7-4c19-8715-6ddec94d1806","Type":"ContainerDied","Data":"621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144496 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" event={"ID":"25d63a44-9fd7-4c19-8715-6ddec94d1806","Type":"ContainerDied","Data":"48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144506 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" event={"ID":"25d63a44-9fd7-4c19-8715-6ddec94d1806","Type":"ContainerDied","Data":"234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144515 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" event={"ID":"25d63a44-9fd7-4c19-8715-6ddec94d1806","Type":"ContainerDied","Data":"a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144528 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" 
event={"ID":"25d63a44-9fd7-4c19-8715-6ddec94d1806","Type":"ContainerDied","Data":"7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144538 4750 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ff4be8fc1ad41cb1ad19143f14269fbe77825e622005ecab4d4731e0f3ed6cdd"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144566 4750 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144572 4750 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144578 4750 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144583 4750 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144588 4750 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144593 4750 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144598 4750 pod_container_deletor.go:114] "Failed 
to issue the request to remove container" containerID={"Type":"cri-o","ID":"ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144603 4750 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144607 4750 scope.go:117] "RemoveContainer" containerID="8f19195c19ae150418ac7dee9a0af05a97794e1f8e3f7361478405c08cafad53" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144610 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" event={"ID":"25d63a44-9fd7-4c19-8715-6ddec94d1806","Type":"ContainerDied","Data":"fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144719 4750 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8f19195c19ae150418ac7dee9a0af05a97794e1f8e3f7361478405c08cafad53"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144736 4750 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ff4be8fc1ad41cb1ad19143f14269fbe77825e622005ecab4d4731e0f3ed6cdd"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144743 4750 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144749 4750 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144756 4750 pod_container_deletor.go:114] "Failed to issue 
the request to remove container" containerID={"Type":"cri-o","ID":"234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144762 4750 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144767 4750 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144773 4750 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144779 4750 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144785 4750 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144800 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" event={"ID":"25d63a44-9fd7-4c19-8715-6ddec94d1806","Type":"ContainerDied","Data":"ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144815 4750 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8f19195c19ae150418ac7dee9a0af05a97794e1f8e3f7361478405c08cafad53"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 
18:22:06.144823 4750 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ff4be8fc1ad41cb1ad19143f14269fbe77825e622005ecab4d4731e0f3ed6cdd"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144829 4750 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144834 4750 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144840 4750 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144846 4750 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144851 4750 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144857 4750 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144863 4750 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 
18:22:06.144869 4750 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144878 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" event={"ID":"25d63a44-9fd7-4c19-8715-6ddec94d1806","Type":"ContainerDied","Data":"f3fb66acce2cf8a97dd45af21d66399a5265db61cb8523c7af31e17c2fe8a342"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144887 4750 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8f19195c19ae150418ac7dee9a0af05a97794e1f8e3f7361478405c08cafad53"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144895 4750 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ff4be8fc1ad41cb1ad19143f14269fbe77825e622005ecab4d4731e0f3ed6cdd"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144901 4750 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144908 4750 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144914 4750 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144920 4750 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144925 4750 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144931 4750 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144936 4750 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.144942 4750 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.145753 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mzb5c_cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444/kube-multus/2.log" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.146020 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mzb5c_cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444/kube-multus/1.log" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.146046 4750 generic.go:334] "Generic (PLEG): container finished" podID="cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444" containerID="848a1d5ba28488cc14f38af4ab07698a61b006fdb10eea6b5fd3da909bf89bdc" exitCode=2 Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.146108 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-p4srs" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.146509 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-p4srs" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.146910 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mzb5c" event={"ID":"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444","Type":"ContainerDied","Data":"848a1d5ba28488cc14f38af4ab07698a61b006fdb10eea6b5fd3da909bf89bdc"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.146925 4750 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"da637e4e5c4e6a32490a03baffc399a92670a4d4e08bf3f7f23f23ab4aed5c1e"} Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.147128 4750 scope.go:117] "RemoveContainer" containerID="848a1d5ba28488cc14f38af4ab07698a61b006fdb10eea6b5fd3da909bf89bdc" Oct 08 18:22:06 crc kubenswrapper[4750]: E1008 18:22:06.147265 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-mzb5c_openshift-multus(cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444)\"" pod="openshift-multus/multus-mzb5c" podUID="cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.163517 4750 scope.go:117] "RemoveContainer" containerID="ff4be8fc1ad41cb1ad19143f14269fbe77825e622005ecab4d4731e0f3ed6cdd" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.182750 4750 scope.go:117] "RemoveContainer" containerID="621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47" Oct 08 18:22:06 crc kubenswrapper[4750]: E1008 18:22:06.187877 4750 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_crc-storage-crc-p4srs_crc-storage_42cddc68-0cc7-4d38-ab9d-cb01de038724_0(d6799659f9460de4ec22ece11c4087ea645b6d977d7af6c433f98adcc98d7962): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 08 18:22:06 crc kubenswrapper[4750]: E1008 18:22:06.187952 4750 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-p4srs_crc-storage_42cddc68-0cc7-4d38-ab9d-cb01de038724_0(d6799659f9460de4ec22ece11c4087ea645b6d977d7af6c433f98adcc98d7962): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-p4srs" Oct 08 18:22:06 crc kubenswrapper[4750]: E1008 18:22:06.187995 4750 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-p4srs_crc-storage_42cddc68-0cc7-4d38-ab9d-cb01de038724_0(d6799659f9460de4ec22ece11c4087ea645b6d977d7af6c433f98adcc98d7962): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-p4srs" Oct 08 18:22:06 crc kubenswrapper[4750]: E1008 18:22:06.188048 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-p4srs_crc-storage(42cddc68-0cc7-4d38-ab9d-cb01de038724)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-p4srs_crc-storage(42cddc68-0cc7-4d38-ab9d-cb01de038724)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-p4srs_crc-storage_42cddc68-0cc7-4d38-ab9d-cb01de038724_0(d6799659f9460de4ec22ece11c4087ea645b6d977d7af6c433f98adcc98d7962): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="crc-storage/crc-storage-crc-p4srs" podUID="42cddc68-0cc7-4d38-ab9d-cb01de038724" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.196589 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vkkv6"] Oct 08 18:22:06 crc kubenswrapper[4750]: E1008 18:22:06.196801 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerName="northd" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.196818 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerName="northd" Oct 08 18:22:06 crc kubenswrapper[4750]: E1008 18:22:06.196827 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerName="kube-rbac-proxy-ovn-metrics" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.196833 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerName="kube-rbac-proxy-ovn-metrics" Oct 08 18:22:06 crc kubenswrapper[4750]: E1008 18:22:06.196842 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerName="kubecfg-setup" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.196848 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerName="kubecfg-setup" Oct 08 18:22:06 crc kubenswrapper[4750]: E1008 18:22:06.196856 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerName="ovn-controller" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.196862 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerName="ovn-controller" Oct 08 18:22:06 crc kubenswrapper[4750]: E1008 18:22:06.196869 4750 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerName="sbdb" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.196874 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerName="sbdb" Oct 08 18:22:06 crc kubenswrapper[4750]: E1008 18:22:06.196882 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerName="ovnkube-controller" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.196887 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerName="ovnkube-controller" Oct 08 18:22:06 crc kubenswrapper[4750]: E1008 18:22:06.196896 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerName="ovnkube-controller" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.196901 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerName="ovnkube-controller" Oct 08 18:22:06 crc kubenswrapper[4750]: E1008 18:22:06.196910 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerName="kube-rbac-proxy-node" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.196917 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerName="kube-rbac-proxy-node" Oct 08 18:22:06 crc kubenswrapper[4750]: E1008 18:22:06.196925 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerName="ovnkube-controller" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.196930 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerName="ovnkube-controller" Oct 08 18:22:06 crc kubenswrapper[4750]: E1008 18:22:06.196938 4750 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerName="ovnkube-controller" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.196943 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerName="ovnkube-controller" Oct 08 18:22:06 crc kubenswrapper[4750]: E1008 18:22:06.196952 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerName="ovn-acl-logging" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.196959 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerName="ovn-acl-logging" Oct 08 18:22:06 crc kubenswrapper[4750]: E1008 18:22:06.196970 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerName="nbdb" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.196975 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerName="nbdb" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.197099 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerName="ovnkube-controller" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.197111 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerName="ovnkube-controller" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.197121 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerName="ovn-controller" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.197130 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerName="northd" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.197142 4750 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerName="ovnkube-controller" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.197151 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerName="ovnkube-controller" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.197162 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerName="nbdb" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.197171 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerName="kube-rbac-proxy-ovn-metrics" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.197185 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerName="ovn-acl-logging" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.197194 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerName="sbdb" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.197203 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerName="kube-rbac-proxy-node" Oct 08 18:22:06 crc kubenswrapper[4750]: E1008 18:22:06.197323 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerName="ovnkube-controller" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.197334 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerName="ovnkube-controller" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.197454 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" containerName="ovnkube-controller" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.199870 4750 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.210705 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/25d63a44-9fd7-4c19-8715-6ddec94d1806-env-overrides\") pod \"25d63a44-9fd7-4c19-8715-6ddec94d1806\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.210747 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-host-var-lib-cni-networks-ovn-kubernetes\") pod \"25d63a44-9fd7-4c19-8715-6ddec94d1806\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.210794 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-etc-openvswitch\") pod \"25d63a44-9fd7-4c19-8715-6ddec94d1806\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.210819 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-host-cni-bin\") pod \"25d63a44-9fd7-4c19-8715-6ddec94d1806\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.210853 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-log-socket\") pod \"25d63a44-9fd7-4c19-8715-6ddec94d1806\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.210869 4750 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-node-log\") pod \"25d63a44-9fd7-4c19-8715-6ddec94d1806\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.210895 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/25d63a44-9fd7-4c19-8715-6ddec94d1806-ovn-node-metrics-cert\") pod \"25d63a44-9fd7-4c19-8715-6ddec94d1806\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.210924 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-host-slash\") pod \"25d63a44-9fd7-4c19-8715-6ddec94d1806\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.210943 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-host-run-netns\") pod \"25d63a44-9fd7-4c19-8715-6ddec94d1806\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.210968 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-var-lib-openvswitch\") pod \"25d63a44-9fd7-4c19-8715-6ddec94d1806\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.210994 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-host-kubelet\") pod 
\"25d63a44-9fd7-4c19-8715-6ddec94d1806\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.210999 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "25d63a44-9fd7-4c19-8715-6ddec94d1806" (UID: "25d63a44-9fd7-4c19-8715-6ddec94d1806"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.211031 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/25d63a44-9fd7-4c19-8715-6ddec94d1806-ovnkube-script-lib\") pod \"25d63a44-9fd7-4c19-8715-6ddec94d1806\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.211058 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "25d63a44-9fd7-4c19-8715-6ddec94d1806" (UID: "25d63a44-9fd7-4c19-8715-6ddec94d1806"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.211071 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sxl9\" (UniqueName: \"kubernetes.io/projected/25d63a44-9fd7-4c19-8715-6ddec94d1806-kube-api-access-8sxl9\") pod \"25d63a44-9fd7-4c19-8715-6ddec94d1806\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.211073 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-node-log" (OuterVolumeSpecName: "node-log") pod "25d63a44-9fd7-4c19-8715-6ddec94d1806" (UID: "25d63a44-9fd7-4c19-8715-6ddec94d1806"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.211099 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "25d63a44-9fd7-4c19-8715-6ddec94d1806" (UID: "25d63a44-9fd7-4c19-8715-6ddec94d1806"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.211095 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-host-cni-netd\") pod \"25d63a44-9fd7-4c19-8715-6ddec94d1806\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.211139 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "25d63a44-9fd7-4c19-8715-6ddec94d1806" (UID: "25d63a44-9fd7-4c19-8715-6ddec94d1806"). 
InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.211143 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-host-slash" (OuterVolumeSpecName: "host-slash") pod "25d63a44-9fd7-4c19-8715-6ddec94d1806" (UID: "25d63a44-9fd7-4c19-8715-6ddec94d1806"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.211165 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-run-systemd\") pod \"25d63a44-9fd7-4c19-8715-6ddec94d1806\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.211179 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "25d63a44-9fd7-4c19-8715-6ddec94d1806" (UID: "25d63a44-9fd7-4c19-8715-6ddec94d1806"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.211179 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "25d63a44-9fd7-4c19-8715-6ddec94d1806" (UID: "25d63a44-9fd7-4c19-8715-6ddec94d1806"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.211201 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "25d63a44-9fd7-4c19-8715-6ddec94d1806" (UID: "25d63a44-9fd7-4c19-8715-6ddec94d1806"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.211202 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-host-run-ovn-kubernetes\") pod \"25d63a44-9fd7-4c19-8715-6ddec94d1806\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.211222 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "25d63a44-9fd7-4c19-8715-6ddec94d1806" (UID: "25d63a44-9fd7-4c19-8715-6ddec94d1806"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.211242 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-systemd-units\") pod \"25d63a44-9fd7-4c19-8715-6ddec94d1806\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.211252 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-log-socket" (OuterVolumeSpecName: "log-socket") pod "25d63a44-9fd7-4c19-8715-6ddec94d1806" (UID: "25d63a44-9fd7-4c19-8715-6ddec94d1806"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.211319 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/25d63a44-9fd7-4c19-8715-6ddec94d1806-ovnkube-config\") pod \"25d63a44-9fd7-4c19-8715-6ddec94d1806\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.211345 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-run-openvswitch\") pod \"25d63a44-9fd7-4c19-8715-6ddec94d1806\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.211371 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-run-ovn\") pod \"25d63a44-9fd7-4c19-8715-6ddec94d1806\" (UID: \"25d63a44-9fd7-4c19-8715-6ddec94d1806\") " Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.211613 4750 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e488347d-65cf-481e-bcad-cd8e47e6e139-log-socket\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.211651 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e488347d-65cf-481e-bcad-cd8e47e6e139-host-cni-bin\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.211671 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e488347d-65cf-481e-bcad-cd8e47e6e139-env-overrides\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.211696 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e488347d-65cf-481e-bcad-cd8e47e6e139-host-cni-netd\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.211717 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e488347d-65cf-481e-bcad-cd8e47e6e139-host-run-netns\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.211741 4750 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e488347d-65cf-481e-bcad-cd8e47e6e139-run-ovn\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.211764 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e488347d-65cf-481e-bcad-cd8e47e6e139-run-openvswitch\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.211856 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e488347d-65cf-481e-bcad-cd8e47e6e139-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.211888 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e488347d-65cf-481e-bcad-cd8e47e6e139-node-log\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.211922 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e488347d-65cf-481e-bcad-cd8e47e6e139-ovnkube-script-lib\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: 
I1008 18:22:06.211958 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e488347d-65cf-481e-bcad-cd8e47e6e139-ovn-node-metrics-cert\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.211983 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e488347d-65cf-481e-bcad-cd8e47e6e139-host-kubelet\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.212012 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e488347d-65cf-481e-bcad-cd8e47e6e139-etc-openvswitch\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.212057 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e488347d-65cf-481e-bcad-cd8e47e6e139-host-run-ovn-kubernetes\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.212093 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e488347d-65cf-481e-bcad-cd8e47e6e139-host-slash\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 
18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.212120 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtz6q\" (UniqueName: \"kubernetes.io/projected/e488347d-65cf-481e-bcad-cd8e47e6e139-kube-api-access-mtz6q\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.212139 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e488347d-65cf-481e-bcad-cd8e47e6e139-systemd-units\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.212160 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e488347d-65cf-481e-bcad-cd8e47e6e139-var-lib-openvswitch\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.212181 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e488347d-65cf-481e-bcad-cd8e47e6e139-ovnkube-config\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.212210 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e488347d-65cf-481e-bcad-cd8e47e6e139-run-systemd\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.212254 4750 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.212269 4750 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.212281 4750 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.212293 4750 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-log-socket\") on node \"crc\" DevicePath \"\"" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.212306 4750 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-node-log\") on node \"crc\" DevicePath \"\"" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.212318 4750 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-host-slash\") on node \"crc\" DevicePath \"\"" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.212329 4750 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 
18:22:06.212341 4750 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.212353 4750 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.212365 4750 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.212379 4750 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.211764 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25d63a44-9fd7-4c19-8715-6ddec94d1806-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "25d63a44-9fd7-4c19-8715-6ddec94d1806" (UID: "25d63a44-9fd7-4c19-8715-6ddec94d1806"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.211889 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25d63a44-9fd7-4c19-8715-6ddec94d1806-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "25d63a44-9fd7-4c19-8715-6ddec94d1806" (UID: "25d63a44-9fd7-4c19-8715-6ddec94d1806"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.211958 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "25d63a44-9fd7-4c19-8715-6ddec94d1806" (UID: "25d63a44-9fd7-4c19-8715-6ddec94d1806"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.211995 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "25d63a44-9fd7-4c19-8715-6ddec94d1806" (UID: "25d63a44-9fd7-4c19-8715-6ddec94d1806"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.212706 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25d63a44-9fd7-4c19-8715-6ddec94d1806-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "25d63a44-9fd7-4c19-8715-6ddec94d1806" (UID: "25d63a44-9fd7-4c19-8715-6ddec94d1806"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.213786 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "25d63a44-9fd7-4c19-8715-6ddec94d1806" (UID: "25d63a44-9fd7-4c19-8715-6ddec94d1806"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.214883 4750 scope.go:117] "RemoveContainer" containerID="48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.218076 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25d63a44-9fd7-4c19-8715-6ddec94d1806-kube-api-access-8sxl9" (OuterVolumeSpecName: "kube-api-access-8sxl9") pod "25d63a44-9fd7-4c19-8715-6ddec94d1806" (UID: "25d63a44-9fd7-4c19-8715-6ddec94d1806"). InnerVolumeSpecName "kube-api-access-8sxl9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.219600 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25d63a44-9fd7-4c19-8715-6ddec94d1806-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "25d63a44-9fd7-4c19-8715-6ddec94d1806" (UID: "25d63a44-9fd7-4c19-8715-6ddec94d1806"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.227746 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "25d63a44-9fd7-4c19-8715-6ddec94d1806" (UID: "25d63a44-9fd7-4c19-8715-6ddec94d1806"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.233927 4750 scope.go:117] "RemoveContainer" containerID="234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.244808 4750 scope.go:117] "RemoveContainer" containerID="a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.257501 4750 scope.go:117] "RemoveContainer" containerID="7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.270204 4750 scope.go:117] "RemoveContainer" containerID="fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.286373 4750 scope.go:117] "RemoveContainer" containerID="ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.299902 4750 scope.go:117] "RemoveContainer" containerID="56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.311321 4750 scope.go:117] "RemoveContainer" containerID="8f19195c19ae150418ac7dee9a0af05a97794e1f8e3f7361478405c08cafad53" Oct 08 18:22:06 crc kubenswrapper[4750]: E1008 18:22:06.311754 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f19195c19ae150418ac7dee9a0af05a97794e1f8e3f7361478405c08cafad53\": container with ID starting with 8f19195c19ae150418ac7dee9a0af05a97794e1f8e3f7361478405c08cafad53 not found: ID does not exist" containerID="8f19195c19ae150418ac7dee9a0af05a97794e1f8e3f7361478405c08cafad53" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.311838 4750 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8f19195c19ae150418ac7dee9a0af05a97794e1f8e3f7361478405c08cafad53"} err="failed to get container status \"8f19195c19ae150418ac7dee9a0af05a97794e1f8e3f7361478405c08cafad53\": rpc error: code = NotFound desc = could not find container \"8f19195c19ae150418ac7dee9a0af05a97794e1f8e3f7361478405c08cafad53\": container with ID starting with 8f19195c19ae150418ac7dee9a0af05a97794e1f8e3f7361478405c08cafad53 not found: ID does not exist" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.311880 4750 scope.go:117] "RemoveContainer" containerID="ff4be8fc1ad41cb1ad19143f14269fbe77825e622005ecab4d4731e0f3ed6cdd" Oct 08 18:22:06 crc kubenswrapper[4750]: E1008 18:22:06.312371 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff4be8fc1ad41cb1ad19143f14269fbe77825e622005ecab4d4731e0f3ed6cdd\": container with ID starting with ff4be8fc1ad41cb1ad19143f14269fbe77825e622005ecab4d4731e0f3ed6cdd not found: ID does not exist" containerID="ff4be8fc1ad41cb1ad19143f14269fbe77825e622005ecab4d4731e0f3ed6cdd" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.312408 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff4be8fc1ad41cb1ad19143f14269fbe77825e622005ecab4d4731e0f3ed6cdd"} err="failed to get container status \"ff4be8fc1ad41cb1ad19143f14269fbe77825e622005ecab4d4731e0f3ed6cdd\": rpc error: code = NotFound desc = could not find container \"ff4be8fc1ad41cb1ad19143f14269fbe77825e622005ecab4d4731e0f3ed6cdd\": container with ID starting with ff4be8fc1ad41cb1ad19143f14269fbe77825e622005ecab4d4731e0f3ed6cdd not found: ID does not exist" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.312436 4750 scope.go:117] "RemoveContainer" containerID="621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.312673 4750 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e488347d-65cf-481e-bcad-cd8e47e6e139-log-socket\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.312710 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e488347d-65cf-481e-bcad-cd8e47e6e139-env-overrides\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.312730 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e488347d-65cf-481e-bcad-cd8e47e6e139-host-cni-bin\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.312744 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e488347d-65cf-481e-bcad-cd8e47e6e139-host-cni-netd\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.312745 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e488347d-65cf-481e-bcad-cd8e47e6e139-log-socket\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.312759 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/e488347d-65cf-481e-bcad-cd8e47e6e139-host-run-netns\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.312773 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e488347d-65cf-481e-bcad-cd8e47e6e139-run-ovn\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.312793 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e488347d-65cf-481e-bcad-cd8e47e6e139-run-openvswitch\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.312802 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e488347d-65cf-481e-bcad-cd8e47e6e139-host-run-netns\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.312819 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e488347d-65cf-481e-bcad-cd8e47e6e139-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.312825 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/e488347d-65cf-481e-bcad-cd8e47e6e139-host-cni-netd\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.312840 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e488347d-65cf-481e-bcad-cd8e47e6e139-node-log\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.312844 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e488347d-65cf-481e-bcad-cd8e47e6e139-run-ovn\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.312874 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e488347d-65cf-481e-bcad-cd8e47e6e139-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.312878 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e488347d-65cf-481e-bcad-cd8e47e6e139-run-openvswitch\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.312882 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/e488347d-65cf-481e-bcad-cd8e47e6e139-ovnkube-script-lib\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.312903 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e488347d-65cf-481e-bcad-cd8e47e6e139-node-log\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.312928 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e488347d-65cf-481e-bcad-cd8e47e6e139-host-kubelet\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.312910 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e488347d-65cf-481e-bcad-cd8e47e6e139-host-kubelet\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.312956 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e488347d-65cf-481e-bcad-cd8e47e6e139-ovn-node-metrics-cert\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.312904 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e488347d-65cf-481e-bcad-cd8e47e6e139-host-cni-bin\") pod \"ovnkube-node-vkkv6\" 
(UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.312973 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e488347d-65cf-481e-bcad-cd8e47e6e139-etc-openvswitch\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.313002 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e488347d-65cf-481e-bcad-cd8e47e6e139-host-run-ovn-kubernetes\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.313017 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e488347d-65cf-481e-bcad-cd8e47e6e139-host-slash\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.313035 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtz6q\" (UniqueName: \"kubernetes.io/projected/e488347d-65cf-481e-bcad-cd8e47e6e139-kube-api-access-mtz6q\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.313037 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e488347d-65cf-481e-bcad-cd8e47e6e139-etc-openvswitch\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.313050 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e488347d-65cf-481e-bcad-cd8e47e6e139-ovnkube-config\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.313066 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e488347d-65cf-481e-bcad-cd8e47e6e139-systemd-units\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.313081 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e488347d-65cf-481e-bcad-cd8e47e6e139-var-lib-openvswitch\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.313098 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e488347d-65cf-481e-bcad-cd8e47e6e139-systemd-units\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.313103 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e488347d-65cf-481e-bcad-cd8e47e6e139-run-systemd\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc 
kubenswrapper[4750]: I1008 18:22:06.313144 4750 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/25d63a44-9fd7-4c19-8715-6ddec94d1806-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.313155 4750 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.313164 4750 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.313174 4750 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/25d63a44-9fd7-4c19-8715-6ddec94d1806-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.313183 4750 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/25d63a44-9fd7-4c19-8715-6ddec94d1806-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.313195 4750 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/25d63a44-9fd7-4c19-8715-6ddec94d1806-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.313204 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8sxl9\" (UniqueName: \"kubernetes.io/projected/25d63a44-9fd7-4c19-8715-6ddec94d1806-kube-api-access-8sxl9\") on node \"crc\" DevicePath \"\"" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.313211 4750 reconciler_common.go:293] "Volume detached for volume 
\"run-systemd\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.313220 4750 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/25d63a44-9fd7-4c19-8715-6ddec94d1806-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.313242 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e488347d-65cf-481e-bcad-cd8e47e6e139-run-systemd\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.313066 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e488347d-65cf-481e-bcad-cd8e47e6e139-host-slash\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.313268 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e488347d-65cf-481e-bcad-cd8e47e6e139-var-lib-openvswitch\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.313082 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e488347d-65cf-481e-bcad-cd8e47e6e139-host-run-ovn-kubernetes\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: E1008 18:22:06.313345 4750 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47\": container with ID starting with 621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47 not found: ID does not exist" containerID="621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.313370 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47"} err="failed to get container status \"621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47\": rpc error: code = NotFound desc = could not find container \"621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47\": container with ID starting with 621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47 not found: ID does not exist" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.313391 4750 scope.go:117] "RemoveContainer" containerID="48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.313476 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e488347d-65cf-481e-bcad-cd8e47e6e139-env-overrides\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.313764 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e488347d-65cf-481e-bcad-cd8e47e6e139-ovnkube-script-lib\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.313863 4750 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e488347d-65cf-481e-bcad-cd8e47e6e139-ovnkube-config\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: E1008 18:22:06.314663 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a\": container with ID starting with 48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a not found: ID does not exist" containerID="48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.314688 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a"} err="failed to get container status \"48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a\": rpc error: code = NotFound desc = could not find container \"48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a\": container with ID starting with 48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a not found: ID does not exist" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.314705 4750 scope.go:117] "RemoveContainer" containerID="234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9" Oct 08 18:22:06 crc kubenswrapper[4750]: E1008 18:22:06.315520 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9\": container with ID starting with 234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9 not found: ID does not exist" 
containerID="234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.315549 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9"} err="failed to get container status \"234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9\": rpc error: code = NotFound desc = could not find container \"234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9\": container with ID starting with 234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9 not found: ID does not exist" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.315582 4750 scope.go:117] "RemoveContainer" containerID="a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6" Oct 08 18:22:06 crc kubenswrapper[4750]: E1008 18:22:06.315831 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6\": container with ID starting with a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6 not found: ID does not exist" containerID="a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.315847 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6"} err="failed to get container status \"a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6\": rpc error: code = NotFound desc = could not find container \"a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6\": container with ID starting with a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6 not found: ID does not exist" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.315858 4750 scope.go:117] 
"RemoveContainer" containerID="7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.315865 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e488347d-65cf-481e-bcad-cd8e47e6e139-ovn-node-metrics-cert\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: E1008 18:22:06.316103 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76\": container with ID starting with 7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76 not found: ID does not exist" containerID="7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.316129 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76"} err="failed to get container status \"7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76\": rpc error: code = NotFound desc = could not find container \"7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76\": container with ID starting with 7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76 not found: ID does not exist" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.316148 4750 scope.go:117] "RemoveContainer" containerID="fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384" Oct 08 18:22:06 crc kubenswrapper[4750]: E1008 18:22:06.316429 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384\": container with ID 
starting with fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384 not found: ID does not exist" containerID="fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.316470 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384"} err="failed to get container status \"fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384\": rpc error: code = NotFound desc = could not find container \"fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384\": container with ID starting with fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384 not found: ID does not exist" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.316502 4750 scope.go:117] "RemoveContainer" containerID="ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372" Oct 08 18:22:06 crc kubenswrapper[4750]: E1008 18:22:06.316957 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372\": container with ID starting with ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372 not found: ID does not exist" containerID="ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.316986 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372"} err="failed to get container status \"ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372\": rpc error: code = NotFound desc = could not find container \"ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372\": container with ID starting with ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372 not found: 
ID does not exist" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.317004 4750 scope.go:117] "RemoveContainer" containerID="56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d" Oct 08 18:22:06 crc kubenswrapper[4750]: E1008 18:22:06.317364 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\": container with ID starting with 56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d not found: ID does not exist" containerID="56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.317393 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d"} err="failed to get container status \"56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\": rpc error: code = NotFound desc = could not find container \"56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\": container with ID starting with 56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d not found: ID does not exist" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.317409 4750 scope.go:117] "RemoveContainer" containerID="8f19195c19ae150418ac7dee9a0af05a97794e1f8e3f7361478405c08cafad53" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.317730 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f19195c19ae150418ac7dee9a0af05a97794e1f8e3f7361478405c08cafad53"} err="failed to get container status \"8f19195c19ae150418ac7dee9a0af05a97794e1f8e3f7361478405c08cafad53\": rpc error: code = NotFound desc = could not find container \"8f19195c19ae150418ac7dee9a0af05a97794e1f8e3f7361478405c08cafad53\": container with ID starting with 8f19195c19ae150418ac7dee9a0af05a97794e1f8e3f7361478405c08cafad53 
not found: ID does not exist" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.317771 4750 scope.go:117] "RemoveContainer" containerID="ff4be8fc1ad41cb1ad19143f14269fbe77825e622005ecab4d4731e0f3ed6cdd" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.318243 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff4be8fc1ad41cb1ad19143f14269fbe77825e622005ecab4d4731e0f3ed6cdd"} err="failed to get container status \"ff4be8fc1ad41cb1ad19143f14269fbe77825e622005ecab4d4731e0f3ed6cdd\": rpc error: code = NotFound desc = could not find container \"ff4be8fc1ad41cb1ad19143f14269fbe77825e622005ecab4d4731e0f3ed6cdd\": container with ID starting with ff4be8fc1ad41cb1ad19143f14269fbe77825e622005ecab4d4731e0f3ed6cdd not found: ID does not exist" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.318298 4750 scope.go:117] "RemoveContainer" containerID="621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.318609 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47"} err="failed to get container status \"621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47\": rpc error: code = NotFound desc = could not find container \"621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47\": container with ID starting with 621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47 not found: ID does not exist" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.318630 4750 scope.go:117] "RemoveContainer" containerID="48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.319146 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a"} err="failed to 
get container status \"48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a\": rpc error: code = NotFound desc = could not find container \"48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a\": container with ID starting with 48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a not found: ID does not exist" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.319170 4750 scope.go:117] "RemoveContainer" containerID="234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.319435 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9"} err="failed to get container status \"234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9\": rpc error: code = NotFound desc = could not find container \"234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9\": container with ID starting with 234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9 not found: ID does not exist" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.319457 4750 scope.go:117] "RemoveContainer" containerID="a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.319739 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6"} err="failed to get container status \"a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6\": rpc error: code = NotFound desc = could not find container \"a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6\": container with ID starting with a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6 not found: ID does not exist" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.319795 4750 scope.go:117] "RemoveContainer" 
containerID="7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.320036 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76"} err="failed to get container status \"7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76\": rpc error: code = NotFound desc = could not find container \"7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76\": container with ID starting with 7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76 not found: ID does not exist" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.320058 4750 scope.go:117] "RemoveContainer" containerID="fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.320295 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384"} err="failed to get container status \"fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384\": rpc error: code = NotFound desc = could not find container \"fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384\": container with ID starting with fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384 not found: ID does not exist" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.320321 4750 scope.go:117] "RemoveContainer" containerID="ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.321523 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372"} err="failed to get container status \"ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372\": rpc error: code = NotFound desc = could 
not find container \"ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372\": container with ID starting with ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372 not found: ID does not exist" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.321569 4750 scope.go:117] "RemoveContainer" containerID="56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.321820 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d"} err="failed to get container status \"56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\": rpc error: code = NotFound desc = could not find container \"56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\": container with ID starting with 56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d not found: ID does not exist" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.321844 4750 scope.go:117] "RemoveContainer" containerID="8f19195c19ae150418ac7dee9a0af05a97794e1f8e3f7361478405c08cafad53" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.322093 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f19195c19ae150418ac7dee9a0af05a97794e1f8e3f7361478405c08cafad53"} err="failed to get container status \"8f19195c19ae150418ac7dee9a0af05a97794e1f8e3f7361478405c08cafad53\": rpc error: code = NotFound desc = could not find container \"8f19195c19ae150418ac7dee9a0af05a97794e1f8e3f7361478405c08cafad53\": container with ID starting with 8f19195c19ae150418ac7dee9a0af05a97794e1f8e3f7361478405c08cafad53 not found: ID does not exist" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.322267 4750 scope.go:117] "RemoveContainer" containerID="ff4be8fc1ad41cb1ad19143f14269fbe77825e622005ecab4d4731e0f3ed6cdd" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 
18:22:06.322618 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff4be8fc1ad41cb1ad19143f14269fbe77825e622005ecab4d4731e0f3ed6cdd"} err="failed to get container status \"ff4be8fc1ad41cb1ad19143f14269fbe77825e622005ecab4d4731e0f3ed6cdd\": rpc error: code = NotFound desc = could not find container \"ff4be8fc1ad41cb1ad19143f14269fbe77825e622005ecab4d4731e0f3ed6cdd\": container with ID starting with ff4be8fc1ad41cb1ad19143f14269fbe77825e622005ecab4d4731e0f3ed6cdd not found: ID does not exist" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.322638 4750 scope.go:117] "RemoveContainer" containerID="621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.322899 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47"} err="failed to get container status \"621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47\": rpc error: code = NotFound desc = could not find container \"621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47\": container with ID starting with 621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47 not found: ID does not exist" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.322913 4750 scope.go:117] "RemoveContainer" containerID="48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.323159 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a"} err="failed to get container status \"48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a\": rpc error: code = NotFound desc = could not find container \"48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a\": container with ID starting with 
48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a not found: ID does not exist" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.323172 4750 scope.go:117] "RemoveContainer" containerID="234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.323401 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9"} err="failed to get container status \"234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9\": rpc error: code = NotFound desc = could not find container \"234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9\": container with ID starting with 234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9 not found: ID does not exist" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.323418 4750 scope.go:117] "RemoveContainer" containerID="a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.323715 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6"} err="failed to get container status \"a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6\": rpc error: code = NotFound desc = could not find container \"a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6\": container with ID starting with a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6 not found: ID does not exist" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.323731 4750 scope.go:117] "RemoveContainer" containerID="7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.323987 4750 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76"} err="failed to get container status \"7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76\": rpc error: code = NotFound desc = could not find container \"7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76\": container with ID starting with 7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76 not found: ID does not exist" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.324003 4750 scope.go:117] "RemoveContainer" containerID="fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.324720 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384"} err="failed to get container status \"fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384\": rpc error: code = NotFound desc = could not find container \"fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384\": container with ID starting with fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384 not found: ID does not exist" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.324746 4750 scope.go:117] "RemoveContainer" containerID="ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.325025 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372"} err="failed to get container status \"ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372\": rpc error: code = NotFound desc = could not find container \"ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372\": container with ID starting with ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372 not found: ID does not 
exist" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.325047 4750 scope.go:117] "RemoveContainer" containerID="56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.325353 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d"} err="failed to get container status \"56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\": rpc error: code = NotFound desc = could not find container \"56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\": container with ID starting with 56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d not found: ID does not exist" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.325371 4750 scope.go:117] "RemoveContainer" containerID="8f19195c19ae150418ac7dee9a0af05a97794e1f8e3f7361478405c08cafad53" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.325586 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f19195c19ae150418ac7dee9a0af05a97794e1f8e3f7361478405c08cafad53"} err="failed to get container status \"8f19195c19ae150418ac7dee9a0af05a97794e1f8e3f7361478405c08cafad53\": rpc error: code = NotFound desc = could not find container \"8f19195c19ae150418ac7dee9a0af05a97794e1f8e3f7361478405c08cafad53\": container with ID starting with 8f19195c19ae150418ac7dee9a0af05a97794e1f8e3f7361478405c08cafad53 not found: ID does not exist" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.325606 4750 scope.go:117] "RemoveContainer" containerID="ff4be8fc1ad41cb1ad19143f14269fbe77825e622005ecab4d4731e0f3ed6cdd" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.325820 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff4be8fc1ad41cb1ad19143f14269fbe77825e622005ecab4d4731e0f3ed6cdd"} err="failed to get container status 
\"ff4be8fc1ad41cb1ad19143f14269fbe77825e622005ecab4d4731e0f3ed6cdd\": rpc error: code = NotFound desc = could not find container \"ff4be8fc1ad41cb1ad19143f14269fbe77825e622005ecab4d4731e0f3ed6cdd\": container with ID starting with ff4be8fc1ad41cb1ad19143f14269fbe77825e622005ecab4d4731e0f3ed6cdd not found: ID does not exist" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.325853 4750 scope.go:117] "RemoveContainer" containerID="621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.326119 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47"} err="failed to get container status \"621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47\": rpc error: code = NotFound desc = could not find container \"621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47\": container with ID starting with 621a2573470ccc555bce7d285c9e2993e6f60ef3a94648226a33af265a9bbd47 not found: ID does not exist" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.326157 4750 scope.go:117] "RemoveContainer" containerID="48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.326374 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a"} err="failed to get container status \"48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a\": rpc error: code = NotFound desc = could not find container \"48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a\": container with ID starting with 48609be00c54bb65ccc3cdc82255e0c6a9b90b72fe96e544b91d6666124d918a not found: ID does not exist" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.326398 4750 scope.go:117] "RemoveContainer" 
containerID="234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.326635 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9"} err="failed to get container status \"234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9\": rpc error: code = NotFound desc = could not find container \"234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9\": container with ID starting with 234a3aae9e366e21ce468764e222a32a1dfc0f173e7409367c61ac60b37adad9 not found: ID does not exist" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.326658 4750 scope.go:117] "RemoveContainer" containerID="a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.326923 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6"} err="failed to get container status \"a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6\": rpc error: code = NotFound desc = could not find container \"a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6\": container with ID starting with a38cadee5a25558af056cc4a85e6455de521706c548bc8a4dfe21ad87c4044d6 not found: ID does not exist" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.326950 4750 scope.go:117] "RemoveContainer" containerID="7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.327402 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76"} err="failed to get container status \"7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76\": rpc error: code = NotFound desc = could 
not find container \"7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76\": container with ID starting with 7fdb54ffdfe92bac0251abc103900e72679e0d2d1f5db7472b2970c1f1a95e76 not found: ID does not exist" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.327428 4750 scope.go:117] "RemoveContainer" containerID="fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.327743 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384"} err="failed to get container status \"fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384\": rpc error: code = NotFound desc = could not find container \"fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384\": container with ID starting with fee860458e87b87c470d23ba71c66004e046cca4946dc37eba8e0d015a953384 not found: ID does not exist" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.327768 4750 scope.go:117] "RemoveContainer" containerID="ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.328020 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372"} err="failed to get container status \"ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372\": rpc error: code = NotFound desc = could not find container \"ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372\": container with ID starting with ef52de87e4fdb8b8f53be59f329e2769104b3d576b0f4c7842f58afa0ce9e372 not found: ID does not exist" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.328048 4750 scope.go:117] "RemoveContainer" containerID="56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 
18:22:06.328240 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtz6q\" (UniqueName: \"kubernetes.io/projected/e488347d-65cf-481e-bcad-cd8e47e6e139-kube-api-access-mtz6q\") pod \"ovnkube-node-vkkv6\" (UID: \"e488347d-65cf-481e-bcad-cd8e47e6e139\") " pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.328257 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d"} err="failed to get container status \"56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\": rpc error: code = NotFound desc = could not find container \"56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d\": container with ID starting with 56139024e410dd6e34d0286edc4e05e9d7f5a7eb08e1236302d666ffdc48d64d not found: ID does not exist" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.328324 4750 scope.go:117] "RemoveContainer" containerID="8f19195c19ae150418ac7dee9a0af05a97794e1f8e3f7361478405c08cafad53" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.328838 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f19195c19ae150418ac7dee9a0af05a97794e1f8e3f7361478405c08cafad53"} err="failed to get container status \"8f19195c19ae150418ac7dee9a0af05a97794e1f8e3f7361478405c08cafad53\": rpc error: code = NotFound desc = could not find container \"8f19195c19ae150418ac7dee9a0af05a97794e1f8e3f7361478405c08cafad53\": container with ID starting with 8f19195c19ae150418ac7dee9a0af05a97794e1f8e3f7361478405c08cafad53 not found: ID does not exist" Oct 08 18:22:06 crc kubenswrapper[4750]: I1008 18:22:06.519950 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:07 crc kubenswrapper[4750]: I1008 18:22:07.152385 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rl7f4" Oct 08 18:22:07 crc kubenswrapper[4750]: I1008 18:22:07.155261 4750 generic.go:334] "Generic (PLEG): container finished" podID="e488347d-65cf-481e-bcad-cd8e47e6e139" containerID="63859a74f2453789af82cdd152db53fcd7ffc3b823c8e546f2f675aec51333f6" exitCode=0 Oct 08 18:22:07 crc kubenswrapper[4750]: I1008 18:22:07.155352 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" event={"ID":"e488347d-65cf-481e-bcad-cd8e47e6e139","Type":"ContainerDied","Data":"63859a74f2453789af82cdd152db53fcd7ffc3b823c8e546f2f675aec51333f6"} Oct 08 18:22:07 crc kubenswrapper[4750]: I1008 18:22:07.155485 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" event={"ID":"e488347d-65cf-481e-bcad-cd8e47e6e139","Type":"ContainerStarted","Data":"3fe390968693b15ef106c77526a701a1a3d2cff811c52258ceb4dba5bcdbd132"} Oct 08 18:22:07 crc kubenswrapper[4750]: I1008 18:22:07.215219 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-rl7f4"] Oct 08 18:22:07 crc kubenswrapper[4750]: I1008 18:22:07.219024 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-rl7f4"] Oct 08 18:22:08 crc kubenswrapper[4750]: I1008 18:22:08.167745 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" event={"ID":"e488347d-65cf-481e-bcad-cd8e47e6e139","Type":"ContainerStarted","Data":"c364a05a0a0f6dd4caceb54b508ee3f1296c142922695b93191691405c68a706"} Oct 08 18:22:08 crc kubenswrapper[4750]: I1008 18:22:08.169528 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" 
event={"ID":"e488347d-65cf-481e-bcad-cd8e47e6e139","Type":"ContainerStarted","Data":"5e2b4469ad0dd03e6576ad6462648d9d7a331cd50f728972f85a1e3ac829a531"} Oct 08 18:22:08 crc kubenswrapper[4750]: I1008 18:22:08.169601 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" event={"ID":"e488347d-65cf-481e-bcad-cd8e47e6e139","Type":"ContainerStarted","Data":"bb1fb92150d7874e55930d6d9e7bee8ff50c57cebf01ce65b4cb0320f593caf5"} Oct 08 18:22:08 crc kubenswrapper[4750]: I1008 18:22:08.169624 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" event={"ID":"e488347d-65cf-481e-bcad-cd8e47e6e139","Type":"ContainerStarted","Data":"f99453279a3ab6cfb1bdf54880b978f89a65d8fe476fa02888fc32ec3e1b0b50"} Oct 08 18:22:08 crc kubenswrapper[4750]: I1008 18:22:08.169650 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" event={"ID":"e488347d-65cf-481e-bcad-cd8e47e6e139","Type":"ContainerStarted","Data":"8a2b86002ff3bd2f7a0484ee09fc0e95eb84e48d93b9a9cfe24ac6acc42301e3"} Oct 08 18:22:08 crc kubenswrapper[4750]: I1008 18:22:08.169671 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" event={"ID":"e488347d-65cf-481e-bcad-cd8e47e6e139","Type":"ContainerStarted","Data":"e4189f848be823111ea560d3cee924a4192992ffc61917cb90918f0899ae9610"} Oct 08 18:22:08 crc kubenswrapper[4750]: I1008 18:22:08.746682 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25d63a44-9fd7-4c19-8715-6ddec94d1806" path="/var/lib/kubelet/pods/25d63a44-9fd7-4c19-8715-6ddec94d1806/volumes" Oct 08 18:22:10 crc kubenswrapper[4750]: I1008 18:22:10.184923 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" 
event={"ID":"e488347d-65cf-481e-bcad-cd8e47e6e139","Type":"ContainerStarted","Data":"366a7c5ad8b9aedaf4b5fcbaad505fb049f20b19032f620aa29ac47a18b79b39"} Oct 08 18:22:13 crc kubenswrapper[4750]: I1008 18:22:13.206201 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" event={"ID":"e488347d-65cf-481e-bcad-cd8e47e6e139","Type":"ContainerStarted","Data":"12faebcff7a32a271bba3384434294d34ab76cde868c684db814acf47fdcf9ad"} Oct 08 18:22:13 crc kubenswrapper[4750]: I1008 18:22:13.206860 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:13 crc kubenswrapper[4750]: I1008 18:22:13.243889 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" podStartSLOduration=7.243871532 podStartE2EDuration="7.243871532s" podCreationTimestamp="2025-10-08 18:22:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:22:13.239465663 +0000 UTC m=+689.152436696" watchObservedRunningTime="2025-10-08 18:22:13.243871532 +0000 UTC m=+689.156842565" Oct 08 18:22:13 crc kubenswrapper[4750]: I1008 18:22:13.255068 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:14 crc kubenswrapper[4750]: I1008 18:22:14.215317 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:14 crc kubenswrapper[4750]: I1008 18:22:14.215356 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:14 crc kubenswrapper[4750]: I1008 18:22:14.245269 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:17 crc 
kubenswrapper[4750]: I1008 18:22:17.734578 4750 scope.go:117] "RemoveContainer" containerID="848a1d5ba28488cc14f38af4ab07698a61b006fdb10eea6b5fd3da909bf89bdc" Oct 08 18:22:17 crc kubenswrapper[4750]: E1008 18:22:17.734969 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-mzb5c_openshift-multus(cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444)\"" pod="openshift-multus/multus-mzb5c" podUID="cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444" Oct 08 18:22:18 crc kubenswrapper[4750]: I1008 18:22:18.733640 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-p4srs" Oct 08 18:22:18 crc kubenswrapper[4750]: I1008 18:22:18.734406 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-p4srs" Oct 08 18:22:18 crc kubenswrapper[4750]: E1008 18:22:18.766256 4750 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-p4srs_crc-storage_42cddc68-0cc7-4d38-ab9d-cb01de038724_0(137c97dcc34b0febeebce83c9caf6385c6970d2ec4260674bce720746c19dcea): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 08 18:22:18 crc kubenswrapper[4750]: E1008 18:22:18.766679 4750 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-p4srs_crc-storage_42cddc68-0cc7-4d38-ab9d-cb01de038724_0(137c97dcc34b0febeebce83c9caf6385c6970d2ec4260674bce720746c19dcea): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-p4srs" Oct 08 18:22:18 crc kubenswrapper[4750]: E1008 18:22:18.766717 4750 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-p4srs_crc-storage_42cddc68-0cc7-4d38-ab9d-cb01de038724_0(137c97dcc34b0febeebce83c9caf6385c6970d2ec4260674bce720746c19dcea): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-p4srs" Oct 08 18:22:18 crc kubenswrapper[4750]: E1008 18:22:18.766783 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-p4srs_crc-storage(42cddc68-0cc7-4d38-ab9d-cb01de038724)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-p4srs_crc-storage(42cddc68-0cc7-4d38-ab9d-cb01de038724)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-p4srs_crc-storage_42cddc68-0cc7-4d38-ab9d-cb01de038724_0(137c97dcc34b0febeebce83c9caf6385c6970d2ec4260674bce720746c19dcea): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="crc-storage/crc-storage-crc-p4srs" podUID="42cddc68-0cc7-4d38-ab9d-cb01de038724" Oct 08 18:22:29 crc kubenswrapper[4750]: I1008 18:22:29.706795 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 18:22:29 crc kubenswrapper[4750]: I1008 18:22:29.707515 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 18:22:29 crc kubenswrapper[4750]: I1008 18:22:29.734464 4750 scope.go:117] "RemoveContainer" containerID="848a1d5ba28488cc14f38af4ab07698a61b006fdb10eea6b5fd3da909bf89bdc" Oct 08 18:22:30 crc kubenswrapper[4750]: I1008 18:22:30.312804 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mzb5c_cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444/kube-multus/2.log" Oct 08 18:22:30 crc kubenswrapper[4750]: I1008 18:22:30.313488 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mzb5c_cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444/kube-multus/1.log" Oct 08 18:22:30 crc kubenswrapper[4750]: I1008 18:22:30.313523 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mzb5c" event={"ID":"cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444","Type":"ContainerStarted","Data":"67bb603d1f5756e9ad46e363dde21a7ce21648ec9c2716f1d2432970af85898f"} Oct 08 18:22:31 crc kubenswrapper[4750]: I1008 18:22:31.733621 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-p4srs" Oct 08 18:22:31 crc kubenswrapper[4750]: I1008 18:22:31.734858 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-p4srs" Oct 08 18:22:31 crc kubenswrapper[4750]: I1008 18:22:31.955822 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-p4srs"] Oct 08 18:22:31 crc kubenswrapper[4750]: I1008 18:22:31.965640 4750 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 18:22:32 crc kubenswrapper[4750]: I1008 18:22:32.325367 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-p4srs" event={"ID":"42cddc68-0cc7-4d38-ab9d-cb01de038724","Type":"ContainerStarted","Data":"891cc877d2949a523965edf8560d42e47d3610d34272049616fc23939b8bc6a7"} Oct 08 18:22:33 crc kubenswrapper[4750]: I1008 18:22:33.331142 4750 generic.go:334] "Generic (PLEG): container finished" podID="42cddc68-0cc7-4d38-ab9d-cb01de038724" containerID="a40408f8da3c343fa2267b44a9121e1dff8310915a7c556e894ca9bb049a70d0" exitCode=0 Oct 08 18:22:33 crc kubenswrapper[4750]: I1008 18:22:33.331209 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-p4srs" event={"ID":"42cddc68-0cc7-4d38-ab9d-cb01de038724","Type":"ContainerDied","Data":"a40408f8da3c343fa2267b44a9121e1dff8310915a7c556e894ca9bb049a70d0"} Oct 08 18:22:34 crc kubenswrapper[4750]: I1008 18:22:34.536489 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-p4srs" Oct 08 18:22:34 crc kubenswrapper[4750]: I1008 18:22:34.622766 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/42cddc68-0cc7-4d38-ab9d-cb01de038724-crc-storage\") pod \"42cddc68-0cc7-4d38-ab9d-cb01de038724\" (UID: \"42cddc68-0cc7-4d38-ab9d-cb01de038724\") " Oct 08 18:22:34 crc kubenswrapper[4750]: I1008 18:22:34.622820 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/42cddc68-0cc7-4d38-ab9d-cb01de038724-node-mnt\") pod \"42cddc68-0cc7-4d38-ab9d-cb01de038724\" (UID: \"42cddc68-0cc7-4d38-ab9d-cb01de038724\") " Oct 08 18:22:34 crc kubenswrapper[4750]: I1008 18:22:34.622938 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vp9n8\" (UniqueName: \"kubernetes.io/projected/42cddc68-0cc7-4d38-ab9d-cb01de038724-kube-api-access-vp9n8\") pod \"42cddc68-0cc7-4d38-ab9d-cb01de038724\" (UID: \"42cddc68-0cc7-4d38-ab9d-cb01de038724\") " Oct 08 18:22:34 crc kubenswrapper[4750]: I1008 18:22:34.623291 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/42cddc68-0cc7-4d38-ab9d-cb01de038724-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "42cddc68-0cc7-4d38-ab9d-cb01de038724" (UID: "42cddc68-0cc7-4d38-ab9d-cb01de038724"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 18:22:34 crc kubenswrapper[4750]: I1008 18:22:34.627119 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42cddc68-0cc7-4d38-ab9d-cb01de038724-kube-api-access-vp9n8" (OuterVolumeSpecName: "kube-api-access-vp9n8") pod "42cddc68-0cc7-4d38-ab9d-cb01de038724" (UID: "42cddc68-0cc7-4d38-ab9d-cb01de038724"). InnerVolumeSpecName "kube-api-access-vp9n8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:22:34 crc kubenswrapper[4750]: I1008 18:22:34.635237 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42cddc68-0cc7-4d38-ab9d-cb01de038724-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "42cddc68-0cc7-4d38-ab9d-cb01de038724" (UID: "42cddc68-0cc7-4d38-ab9d-cb01de038724"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:22:34 crc kubenswrapper[4750]: I1008 18:22:34.724414 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vp9n8\" (UniqueName: \"kubernetes.io/projected/42cddc68-0cc7-4d38-ab9d-cb01de038724-kube-api-access-vp9n8\") on node \"crc\" DevicePath \"\"" Oct 08 18:22:34 crc kubenswrapper[4750]: I1008 18:22:34.724448 4750 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/42cddc68-0cc7-4d38-ab9d-cb01de038724-crc-storage\") on node \"crc\" DevicePath \"\"" Oct 08 18:22:34 crc kubenswrapper[4750]: I1008 18:22:34.724457 4750 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/42cddc68-0cc7-4d38-ab9d-cb01de038724-node-mnt\") on node \"crc\" DevicePath \"\"" Oct 08 18:22:35 crc kubenswrapper[4750]: I1008 18:22:35.340897 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-p4srs" event={"ID":"42cddc68-0cc7-4d38-ab9d-cb01de038724","Type":"ContainerDied","Data":"891cc877d2949a523965edf8560d42e47d3610d34272049616fc23939b8bc6a7"} Oct 08 18:22:35 crc kubenswrapper[4750]: I1008 18:22:35.340936 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="891cc877d2949a523965edf8560d42e47d3610d34272049616fc23939b8bc6a7" Oct 08 18:22:35 crc kubenswrapper[4750]: I1008 18:22:35.340968 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-p4srs" Oct 08 18:22:36 crc kubenswrapper[4750]: I1008 18:22:36.543997 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vkkv6" Oct 08 18:22:43 crc kubenswrapper[4750]: I1008 18:22:43.166152 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835chghm9"] Oct 08 18:22:43 crc kubenswrapper[4750]: E1008 18:22:43.170084 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42cddc68-0cc7-4d38-ab9d-cb01de038724" containerName="storage" Oct 08 18:22:43 crc kubenswrapper[4750]: I1008 18:22:43.170125 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="42cddc68-0cc7-4d38-ab9d-cb01de038724" containerName="storage" Oct 08 18:22:43 crc kubenswrapper[4750]: I1008 18:22:43.170298 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="42cddc68-0cc7-4d38-ab9d-cb01de038724" containerName="storage" Oct 08 18:22:43 crc kubenswrapper[4750]: I1008 18:22:43.171410 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835chghm9" Oct 08 18:22:43 crc kubenswrapper[4750]: I1008 18:22:43.174002 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 08 18:22:43 crc kubenswrapper[4750]: I1008 18:22:43.178452 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835chghm9"] Oct 08 18:22:43 crc kubenswrapper[4750]: I1008 18:22:43.232690 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wptdg\" (UniqueName: \"kubernetes.io/projected/0073f23d-f9bf-4c55-b2d1-8106cde0bf97-kube-api-access-wptdg\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835chghm9\" (UID: \"0073f23d-f9bf-4c55-b2d1-8106cde0bf97\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835chghm9" Oct 08 18:22:43 crc kubenswrapper[4750]: I1008 18:22:43.232746 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0073f23d-f9bf-4c55-b2d1-8106cde0bf97-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835chghm9\" (UID: \"0073f23d-f9bf-4c55-b2d1-8106cde0bf97\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835chghm9" Oct 08 18:22:43 crc kubenswrapper[4750]: I1008 18:22:43.233092 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0073f23d-f9bf-4c55-b2d1-8106cde0bf97-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835chghm9\" (UID: \"0073f23d-f9bf-4c55-b2d1-8106cde0bf97\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835chghm9" Oct 08 18:22:43 crc kubenswrapper[4750]: 
I1008 18:22:43.334246 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wptdg\" (UniqueName: \"kubernetes.io/projected/0073f23d-f9bf-4c55-b2d1-8106cde0bf97-kube-api-access-wptdg\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835chghm9\" (UID: \"0073f23d-f9bf-4c55-b2d1-8106cde0bf97\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835chghm9" Oct 08 18:22:43 crc kubenswrapper[4750]: I1008 18:22:43.334382 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0073f23d-f9bf-4c55-b2d1-8106cde0bf97-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835chghm9\" (UID: \"0073f23d-f9bf-4c55-b2d1-8106cde0bf97\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835chghm9" Oct 08 18:22:43 crc kubenswrapper[4750]: I1008 18:22:43.334994 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0073f23d-f9bf-4c55-b2d1-8106cde0bf97-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835chghm9\" (UID: \"0073f23d-f9bf-4c55-b2d1-8106cde0bf97\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835chghm9" Oct 08 18:22:43 crc kubenswrapper[4750]: I1008 18:22:43.335110 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0073f23d-f9bf-4c55-b2d1-8106cde0bf97-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835chghm9\" (UID: \"0073f23d-f9bf-4c55-b2d1-8106cde0bf97\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835chghm9" Oct 08 18:22:43 crc kubenswrapper[4750]: I1008 18:22:43.335404 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/0073f23d-f9bf-4c55-b2d1-8106cde0bf97-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835chghm9\" (UID: \"0073f23d-f9bf-4c55-b2d1-8106cde0bf97\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835chghm9" Oct 08 18:22:43 crc kubenswrapper[4750]: I1008 18:22:43.361349 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wptdg\" (UniqueName: \"kubernetes.io/projected/0073f23d-f9bf-4c55-b2d1-8106cde0bf97-kube-api-access-wptdg\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835chghm9\" (UID: \"0073f23d-f9bf-4c55-b2d1-8106cde0bf97\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835chghm9" Oct 08 18:22:43 crc kubenswrapper[4750]: I1008 18:22:43.491942 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835chghm9" Oct 08 18:22:43 crc kubenswrapper[4750]: I1008 18:22:43.924676 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835chghm9"] Oct 08 18:22:44 crc kubenswrapper[4750]: I1008 18:22:44.401148 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835chghm9" event={"ID":"0073f23d-f9bf-4c55-b2d1-8106cde0bf97","Type":"ContainerStarted","Data":"100185b72d16407bdcef7a17d4ae15391af0ed5525eca1ea9eae7b41266e874e"} Oct 08 18:22:44 crc kubenswrapper[4750]: I1008 18:22:44.401464 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835chghm9" event={"ID":"0073f23d-f9bf-4c55-b2d1-8106cde0bf97","Type":"ContainerStarted","Data":"56d1a5b1109a08a946c7ffc207f4966ca1bfb910087c8a85fd7ed5870d9ea1fc"} Oct 08 18:22:45 crc kubenswrapper[4750]: I1008 18:22:45.006748 4750 
scope.go:117] "RemoveContainer" containerID="da637e4e5c4e6a32490a03baffc399a92670a4d4e08bf3f7f23f23ab4aed5c1e" Oct 08 18:22:45 crc kubenswrapper[4750]: I1008 18:22:45.407388 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mzb5c_cd6d64f0-b369-4d7c-a70e-e5fc0a4e7444/kube-multus/2.log" Oct 08 18:22:45 crc kubenswrapper[4750]: I1008 18:22:45.409197 4750 generic.go:334] "Generic (PLEG): container finished" podID="0073f23d-f9bf-4c55-b2d1-8106cde0bf97" containerID="100185b72d16407bdcef7a17d4ae15391af0ed5525eca1ea9eae7b41266e874e" exitCode=0 Oct 08 18:22:45 crc kubenswrapper[4750]: I1008 18:22:45.409245 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835chghm9" event={"ID":"0073f23d-f9bf-4c55-b2d1-8106cde0bf97","Type":"ContainerDied","Data":"100185b72d16407bdcef7a17d4ae15391af0ed5525eca1ea9eae7b41266e874e"} Oct 08 18:22:47 crc kubenswrapper[4750]: I1008 18:22:47.423116 4750 generic.go:334] "Generic (PLEG): container finished" podID="0073f23d-f9bf-4c55-b2d1-8106cde0bf97" containerID="52975d1f4b1e5dfbe919e3bf6adcb3b29d42f30108f98055e2e6374d6e558752" exitCode=0 Oct 08 18:22:47 crc kubenswrapper[4750]: I1008 18:22:47.423623 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835chghm9" event={"ID":"0073f23d-f9bf-4c55-b2d1-8106cde0bf97","Type":"ContainerDied","Data":"52975d1f4b1e5dfbe919e3bf6adcb3b29d42f30108f98055e2e6374d6e558752"} Oct 08 18:22:48 crc kubenswrapper[4750]: I1008 18:22:48.430686 4750 generic.go:334] "Generic (PLEG): container finished" podID="0073f23d-f9bf-4c55-b2d1-8106cde0bf97" containerID="847a8755402368802a5bf55121d1879b53fb0dcdfa41d0f48f4812744e8351a5" exitCode=0 Oct 08 18:22:48 crc kubenswrapper[4750]: I1008 18:22:48.430786 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835chghm9" event={"ID":"0073f23d-f9bf-4c55-b2d1-8106cde0bf97","Type":"ContainerDied","Data":"847a8755402368802a5bf55121d1879b53fb0dcdfa41d0f48f4812744e8351a5"} Oct 08 18:22:49 crc kubenswrapper[4750]: I1008 18:22:49.638245 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835chghm9" Oct 08 18:22:49 crc kubenswrapper[4750]: I1008 18:22:49.828035 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wptdg\" (UniqueName: \"kubernetes.io/projected/0073f23d-f9bf-4c55-b2d1-8106cde0bf97-kube-api-access-wptdg\") pod \"0073f23d-f9bf-4c55-b2d1-8106cde0bf97\" (UID: \"0073f23d-f9bf-4c55-b2d1-8106cde0bf97\") " Oct 08 18:22:49 crc kubenswrapper[4750]: I1008 18:22:49.828384 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0073f23d-f9bf-4c55-b2d1-8106cde0bf97-bundle\") pod \"0073f23d-f9bf-4c55-b2d1-8106cde0bf97\" (UID: \"0073f23d-f9bf-4c55-b2d1-8106cde0bf97\") " Oct 08 18:22:49 crc kubenswrapper[4750]: I1008 18:22:49.828415 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0073f23d-f9bf-4c55-b2d1-8106cde0bf97-util\") pod \"0073f23d-f9bf-4c55-b2d1-8106cde0bf97\" (UID: \"0073f23d-f9bf-4c55-b2d1-8106cde0bf97\") " Oct 08 18:22:49 crc kubenswrapper[4750]: I1008 18:22:49.829334 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0073f23d-f9bf-4c55-b2d1-8106cde0bf97-bundle" (OuterVolumeSpecName: "bundle") pod "0073f23d-f9bf-4c55-b2d1-8106cde0bf97" (UID: "0073f23d-f9bf-4c55-b2d1-8106cde0bf97"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:22:49 crc kubenswrapper[4750]: I1008 18:22:49.833471 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0073f23d-f9bf-4c55-b2d1-8106cde0bf97-kube-api-access-wptdg" (OuterVolumeSpecName: "kube-api-access-wptdg") pod "0073f23d-f9bf-4c55-b2d1-8106cde0bf97" (UID: "0073f23d-f9bf-4c55-b2d1-8106cde0bf97"). InnerVolumeSpecName "kube-api-access-wptdg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:22:49 crc kubenswrapper[4750]: I1008 18:22:49.838703 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0073f23d-f9bf-4c55-b2d1-8106cde0bf97-util" (OuterVolumeSpecName: "util") pod "0073f23d-f9bf-4c55-b2d1-8106cde0bf97" (UID: "0073f23d-f9bf-4c55-b2d1-8106cde0bf97"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:22:49 crc kubenswrapper[4750]: I1008 18:22:49.929291 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wptdg\" (UniqueName: \"kubernetes.io/projected/0073f23d-f9bf-4c55-b2d1-8106cde0bf97-kube-api-access-wptdg\") on node \"crc\" DevicePath \"\"" Oct 08 18:22:49 crc kubenswrapper[4750]: I1008 18:22:49.929323 4750 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0073f23d-f9bf-4c55-b2d1-8106cde0bf97-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:22:49 crc kubenswrapper[4750]: I1008 18:22:49.929332 4750 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0073f23d-f9bf-4c55-b2d1-8106cde0bf97-util\") on node \"crc\" DevicePath \"\"" Oct 08 18:22:50 crc kubenswrapper[4750]: I1008 18:22:50.443842 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835chghm9" 
event={"ID":"0073f23d-f9bf-4c55-b2d1-8106cde0bf97","Type":"ContainerDied","Data":"56d1a5b1109a08a946c7ffc207f4966ca1bfb910087c8a85fd7ed5870d9ea1fc"} Oct 08 18:22:50 crc kubenswrapper[4750]: I1008 18:22:50.443897 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56d1a5b1109a08a946c7ffc207f4966ca1bfb910087c8a85fd7ed5870d9ea1fc" Oct 08 18:22:50 crc kubenswrapper[4750]: I1008 18:22:50.443972 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835chghm9" Oct 08 18:22:52 crc kubenswrapper[4750]: I1008 18:22:52.194608 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-bjcrl"] Oct 08 18:22:52 crc kubenswrapper[4750]: E1008 18:22:52.194796 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0073f23d-f9bf-4c55-b2d1-8106cde0bf97" containerName="pull" Oct 08 18:22:52 crc kubenswrapper[4750]: I1008 18:22:52.194807 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="0073f23d-f9bf-4c55-b2d1-8106cde0bf97" containerName="pull" Oct 08 18:22:52 crc kubenswrapper[4750]: E1008 18:22:52.194818 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0073f23d-f9bf-4c55-b2d1-8106cde0bf97" containerName="util" Oct 08 18:22:52 crc kubenswrapper[4750]: I1008 18:22:52.194824 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="0073f23d-f9bf-4c55-b2d1-8106cde0bf97" containerName="util" Oct 08 18:22:52 crc kubenswrapper[4750]: E1008 18:22:52.194840 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0073f23d-f9bf-4c55-b2d1-8106cde0bf97" containerName="extract" Oct 08 18:22:52 crc kubenswrapper[4750]: I1008 18:22:52.194848 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="0073f23d-f9bf-4c55-b2d1-8106cde0bf97" containerName="extract" Oct 08 18:22:52 crc kubenswrapper[4750]: I1008 18:22:52.194933 4750 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="0073f23d-f9bf-4c55-b2d1-8106cde0bf97" containerName="extract" Oct 08 18:22:52 crc kubenswrapper[4750]: I1008 18:22:52.204595 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-bjcrl" Oct 08 18:22:52 crc kubenswrapper[4750]: I1008 18:22:52.208688 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-j4q2j" Oct 08 18:22:52 crc kubenswrapper[4750]: I1008 18:22:52.216096 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 08 18:22:52 crc kubenswrapper[4750]: I1008 18:22:52.216088 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 08 18:22:52 crc kubenswrapper[4750]: I1008 18:22:52.221060 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-bjcrl"] Oct 08 18:22:52 crc kubenswrapper[4750]: I1008 18:22:52.353420 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjdll\" (UniqueName: \"kubernetes.io/projected/2915c99e-fa94-482d-ab1d-c506bc61b0ed-kube-api-access-cjdll\") pod \"nmstate-operator-858ddd8f98-bjcrl\" (UID: \"2915c99e-fa94-482d-ab1d-c506bc61b0ed\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-bjcrl" Oct 08 18:22:52 crc kubenswrapper[4750]: I1008 18:22:52.454114 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjdll\" (UniqueName: \"kubernetes.io/projected/2915c99e-fa94-482d-ab1d-c506bc61b0ed-kube-api-access-cjdll\") pod \"nmstate-operator-858ddd8f98-bjcrl\" (UID: \"2915c99e-fa94-482d-ab1d-c506bc61b0ed\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-bjcrl" Oct 08 18:22:52 crc kubenswrapper[4750]: I1008 18:22:52.471949 4750 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-cjdll\" (UniqueName: \"kubernetes.io/projected/2915c99e-fa94-482d-ab1d-c506bc61b0ed-kube-api-access-cjdll\") pod \"nmstate-operator-858ddd8f98-bjcrl\" (UID: \"2915c99e-fa94-482d-ab1d-c506bc61b0ed\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-bjcrl" Oct 08 18:22:52 crc kubenswrapper[4750]: I1008 18:22:52.526830 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-bjcrl" Oct 08 18:22:52 crc kubenswrapper[4750]: I1008 18:22:52.712682 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-bjcrl"] Oct 08 18:22:53 crc kubenswrapper[4750]: I1008 18:22:53.458724 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-bjcrl" event={"ID":"2915c99e-fa94-482d-ab1d-c506bc61b0ed","Type":"ContainerStarted","Data":"79a2f642e7cbd15f07eb8c543a28e2d6ddf1327ddeccaf556592d13f61bcf232"} Oct 08 18:22:55 crc kubenswrapper[4750]: I1008 18:22:55.472318 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-bjcrl" event={"ID":"2915c99e-fa94-482d-ab1d-c506bc61b0ed","Type":"ContainerStarted","Data":"0cf78935bd6a2e660f95bf7d480be808dbcd252d4af630044ce2ebae2021f7f9"} Oct 08 18:22:55 crc kubenswrapper[4750]: I1008 18:22:55.492146 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-bjcrl" podStartSLOduration=0.92693137 podStartE2EDuration="3.492124559s" podCreationTimestamp="2025-10-08 18:22:52 +0000 UTC" firstStartedPulling="2025-10-08 18:22:52.718202424 +0000 UTC m=+728.631173437" lastFinishedPulling="2025-10-08 18:22:55.283395613 +0000 UTC m=+731.196366626" observedRunningTime="2025-10-08 18:22:55.490056615 +0000 UTC m=+731.403027628" watchObservedRunningTime="2025-10-08 18:22:55.492124559 +0000 UTC m=+731.405095612" Oct 08 18:22:59 crc 
kubenswrapper[4750]: I1008 18:22:59.707716 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 18:22:59 crc kubenswrapper[4750]: I1008 18:22:59.708067 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.487720 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-8j7zl"] Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.489910 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-8j7zl" Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.493827 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-nltgk" Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.497580 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-8j7zl"] Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.500015 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-pzxbn"] Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.500887 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-pzxbn" Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.505137 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.529993 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-pzxbn"] Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.533366 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-zgswq"] Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.534324 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-zgswq" Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.588714 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fskw5\" (UniqueName: \"kubernetes.io/projected/eb4084db-bdd2-4c22-9eaa-0919b73e0977-kube-api-access-fskw5\") pod \"nmstate-metrics-fdff9cb8d-8j7zl\" (UID: \"eb4084db-bdd2-4c22-9eaa-0919b73e0977\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-8j7zl" Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.623059 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-v69zk"] Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.623947 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-v69zk" Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.626848 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-zlgkd" Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.627177 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.627577 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.634138 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-v69zk"] Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.689579 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/83a1d5d8-1631-4df6-8b0e-a48e52fd4972-nmstate-lock\") pod \"nmstate-handler-zgswq\" (UID: \"83a1d5d8-1631-4df6-8b0e-a48e52fd4972\") " pod="openshift-nmstate/nmstate-handler-zgswq" Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.689623 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/83a1d5d8-1631-4df6-8b0e-a48e52fd4972-ovs-socket\") pod \"nmstate-handler-zgswq\" (UID: \"83a1d5d8-1631-4df6-8b0e-a48e52fd4972\") " pod="openshift-nmstate/nmstate-handler-zgswq" Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.689652 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z2kj\" (UniqueName: \"kubernetes.io/projected/83a1d5d8-1631-4df6-8b0e-a48e52fd4972-kube-api-access-4z2kj\") pod \"nmstate-handler-zgswq\" (UID: \"83a1d5d8-1631-4df6-8b0e-a48e52fd4972\") " pod="openshift-nmstate/nmstate-handler-zgswq" Oct 
08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.689837 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bpl7\" (UniqueName: \"kubernetes.io/projected/157527e2-af03-406e-8757-abd19563422d-kube-api-access-4bpl7\") pod \"nmstate-webhook-6cdbc54649-pzxbn\" (UID: \"157527e2-af03-406e-8757-abd19563422d\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-pzxbn" Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.689910 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/83a1d5d8-1631-4df6-8b0e-a48e52fd4972-dbus-socket\") pod \"nmstate-handler-zgswq\" (UID: \"83a1d5d8-1631-4df6-8b0e-a48e52fd4972\") " pod="openshift-nmstate/nmstate-handler-zgswq" Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.689962 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/157527e2-af03-406e-8757-abd19563422d-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-pzxbn\" (UID: \"157527e2-af03-406e-8757-abd19563422d\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-pzxbn" Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.690029 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fskw5\" (UniqueName: \"kubernetes.io/projected/eb4084db-bdd2-4c22-9eaa-0919b73e0977-kube-api-access-fskw5\") pod \"nmstate-metrics-fdff9cb8d-8j7zl\" (UID: \"eb4084db-bdd2-4c22-9eaa-0919b73e0977\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-8j7zl" Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.712193 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fskw5\" (UniqueName: \"kubernetes.io/projected/eb4084db-bdd2-4c22-9eaa-0919b73e0977-kube-api-access-fskw5\") pod \"nmstate-metrics-fdff9cb8d-8j7zl\" (UID: 
\"eb4084db-bdd2-4c22-9eaa-0919b73e0977\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-8j7zl" Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.792001 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcqn6\" (UniqueName: \"kubernetes.io/projected/2cb56acc-cb75-4140-82db-412d149f4ba0-kube-api-access-bcqn6\") pod \"nmstate-console-plugin-6b874cbd85-v69zk\" (UID: \"2cb56acc-cb75-4140-82db-412d149f4ba0\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-v69zk" Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.792102 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/83a1d5d8-1631-4df6-8b0e-a48e52fd4972-nmstate-lock\") pod \"nmstate-handler-zgswq\" (UID: \"83a1d5d8-1631-4df6-8b0e-a48e52fd4972\") " pod="openshift-nmstate/nmstate-handler-zgswq" Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.792134 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/83a1d5d8-1631-4df6-8b0e-a48e52fd4972-ovs-socket\") pod \"nmstate-handler-zgswq\" (UID: \"83a1d5d8-1631-4df6-8b0e-a48e52fd4972\") " pod="openshift-nmstate/nmstate-handler-zgswq" Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.792172 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4z2kj\" (UniqueName: \"kubernetes.io/projected/83a1d5d8-1631-4df6-8b0e-a48e52fd4972-kube-api-access-4z2kj\") pod \"nmstate-handler-zgswq\" (UID: \"83a1d5d8-1631-4df6-8b0e-a48e52fd4972\") " pod="openshift-nmstate/nmstate-handler-zgswq" Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.792217 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bpl7\" (UniqueName: \"kubernetes.io/projected/157527e2-af03-406e-8757-abd19563422d-kube-api-access-4bpl7\") pod 
\"nmstate-webhook-6cdbc54649-pzxbn\" (UID: \"157527e2-af03-406e-8757-abd19563422d\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-pzxbn" Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.792255 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/83a1d5d8-1631-4df6-8b0e-a48e52fd4972-dbus-socket\") pod \"nmstate-handler-zgswq\" (UID: \"83a1d5d8-1631-4df6-8b0e-a48e52fd4972\") " pod="openshift-nmstate/nmstate-handler-zgswq" Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.792262 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/83a1d5d8-1631-4df6-8b0e-a48e52fd4972-nmstate-lock\") pod \"nmstate-handler-zgswq\" (UID: \"83a1d5d8-1631-4df6-8b0e-a48e52fd4972\") " pod="openshift-nmstate/nmstate-handler-zgswq" Oct 08 18:23:03 crc kubenswrapper[4750]: E1008 18:23:03.792445 4750 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.792267 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/83a1d5d8-1631-4df6-8b0e-a48e52fd4972-ovs-socket\") pod \"nmstate-handler-zgswq\" (UID: \"83a1d5d8-1631-4df6-8b0e-a48e52fd4972\") " pod="openshift-nmstate/nmstate-handler-zgswq" Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.792292 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/157527e2-af03-406e-8757-abd19563422d-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-pzxbn\" (UID: \"157527e2-af03-406e-8757-abd19563422d\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-pzxbn" Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.792608 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/83a1d5d8-1631-4df6-8b0e-a48e52fd4972-dbus-socket\") pod \"nmstate-handler-zgswq\" (UID: \"83a1d5d8-1631-4df6-8b0e-a48e52fd4972\") " pod="openshift-nmstate/nmstate-handler-zgswq" Oct 08 18:23:03 crc kubenswrapper[4750]: E1008 18:23:03.792538 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/157527e2-af03-406e-8757-abd19563422d-tls-key-pair podName:157527e2-af03-406e-8757-abd19563422d nodeName:}" failed. No retries permitted until 2025-10-08 18:23:04.292504531 +0000 UTC m=+740.205475554 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/157527e2-af03-406e-8757-abd19563422d-tls-key-pair") pod "nmstate-webhook-6cdbc54649-pzxbn" (UID: "157527e2-af03-406e-8757-abd19563422d") : secret "openshift-nmstate-webhook" not found Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.792742 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2cb56acc-cb75-4140-82db-412d149f4ba0-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-v69zk\" (UID: \"2cb56acc-cb75-4140-82db-412d149f4ba0\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-v69zk" Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.792769 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2cb56acc-cb75-4140-82db-412d149f4ba0-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-v69zk\" (UID: \"2cb56acc-cb75-4140-82db-412d149f4ba0\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-v69zk" Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.812372 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-8j7zl" Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.814772 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z2kj\" (UniqueName: \"kubernetes.io/projected/83a1d5d8-1631-4df6-8b0e-a48e52fd4972-kube-api-access-4z2kj\") pod \"nmstate-handler-zgswq\" (UID: \"83a1d5d8-1631-4df6-8b0e-a48e52fd4972\") " pod="openshift-nmstate/nmstate-handler-zgswq" Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.827394 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bpl7\" (UniqueName: \"kubernetes.io/projected/157527e2-af03-406e-8757-abd19563422d-kube-api-access-4bpl7\") pod \"nmstate-webhook-6cdbc54649-pzxbn\" (UID: \"157527e2-af03-406e-8757-abd19563422d\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-pzxbn" Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.831178 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-766b49dfb7-zdlzb"] Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.832191 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-766b49dfb7-zdlzb" Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.854454 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-766b49dfb7-zdlzb"] Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.857510 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-zgswq" Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.894017 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2cb56acc-cb75-4140-82db-412d149f4ba0-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-v69zk\" (UID: \"2cb56acc-cb75-4140-82db-412d149f4ba0\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-v69zk" Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.894057 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2cb56acc-cb75-4140-82db-412d149f4ba0-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-v69zk\" (UID: \"2cb56acc-cb75-4140-82db-412d149f4ba0\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-v69zk" Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.894090 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcqn6\" (UniqueName: \"kubernetes.io/projected/2cb56acc-cb75-4140-82db-412d149f4ba0-kube-api-access-bcqn6\") pod \"nmstate-console-plugin-6b874cbd85-v69zk\" (UID: \"2cb56acc-cb75-4140-82db-412d149f4ba0\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-v69zk" Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.899143 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2cb56acc-cb75-4140-82db-412d149f4ba0-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-v69zk\" (UID: \"2cb56acc-cb75-4140-82db-412d149f4ba0\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-v69zk" Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.899382 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2cb56acc-cb75-4140-82db-412d149f4ba0-nginx-conf\") 
pod \"nmstate-console-plugin-6b874cbd85-v69zk\" (UID: \"2cb56acc-cb75-4140-82db-412d149f4ba0\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-v69zk" Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.917306 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcqn6\" (UniqueName: \"kubernetes.io/projected/2cb56acc-cb75-4140-82db-412d149f4ba0-kube-api-access-bcqn6\") pod \"nmstate-console-plugin-6b874cbd85-v69zk\" (UID: \"2cb56acc-cb75-4140-82db-412d149f4ba0\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-v69zk" Oct 08 18:23:03 crc kubenswrapper[4750]: I1008 18:23:03.939580 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-v69zk" Oct 08 18:23:04 crc kubenswrapper[4750]: I1008 18:23:04.007864 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bdfefd5-db52-4117-9f61-ef80c9f816c0-trusted-ca-bundle\") pod \"console-766b49dfb7-zdlzb\" (UID: \"6bdfefd5-db52-4117-9f61-ef80c9f816c0\") " pod="openshift-console/console-766b49dfb7-zdlzb" Oct 08 18:23:04 crc kubenswrapper[4750]: I1008 18:23:04.007966 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6bdfefd5-db52-4117-9f61-ef80c9f816c0-service-ca\") pod \"console-766b49dfb7-zdlzb\" (UID: \"6bdfefd5-db52-4117-9f61-ef80c9f816c0\") " pod="openshift-console/console-766b49dfb7-zdlzb" Oct 08 18:23:04 crc kubenswrapper[4750]: I1008 18:23:04.008022 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6bdfefd5-db52-4117-9f61-ef80c9f816c0-console-serving-cert\") pod \"console-766b49dfb7-zdlzb\" (UID: \"6bdfefd5-db52-4117-9f61-ef80c9f816c0\") " 
pod="openshift-console/console-766b49dfb7-zdlzb" Oct 08 18:23:04 crc kubenswrapper[4750]: I1008 18:23:04.008046 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6bdfefd5-db52-4117-9f61-ef80c9f816c0-console-config\") pod \"console-766b49dfb7-zdlzb\" (UID: \"6bdfefd5-db52-4117-9f61-ef80c9f816c0\") " pod="openshift-console/console-766b49dfb7-zdlzb" Oct 08 18:23:04 crc kubenswrapper[4750]: I1008 18:23:04.008072 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6bdfefd5-db52-4117-9f61-ef80c9f816c0-console-oauth-config\") pod \"console-766b49dfb7-zdlzb\" (UID: \"6bdfefd5-db52-4117-9f61-ef80c9f816c0\") " pod="openshift-console/console-766b49dfb7-zdlzb" Oct 08 18:23:04 crc kubenswrapper[4750]: I1008 18:23:04.008119 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrw5z\" (UniqueName: \"kubernetes.io/projected/6bdfefd5-db52-4117-9f61-ef80c9f816c0-kube-api-access-wrw5z\") pod \"console-766b49dfb7-zdlzb\" (UID: \"6bdfefd5-db52-4117-9f61-ef80c9f816c0\") " pod="openshift-console/console-766b49dfb7-zdlzb" Oct 08 18:23:04 crc kubenswrapper[4750]: I1008 18:23:04.008143 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6bdfefd5-db52-4117-9f61-ef80c9f816c0-oauth-serving-cert\") pod \"console-766b49dfb7-zdlzb\" (UID: \"6bdfefd5-db52-4117-9f61-ef80c9f816c0\") " pod="openshift-console/console-766b49dfb7-zdlzb" Oct 08 18:23:04 crc kubenswrapper[4750]: I1008 18:23:04.101981 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-8j7zl"] Oct 08 18:23:04 crc kubenswrapper[4750]: I1008 18:23:04.109890 4750 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bdfefd5-db52-4117-9f61-ef80c9f816c0-trusted-ca-bundle\") pod \"console-766b49dfb7-zdlzb\" (UID: \"6bdfefd5-db52-4117-9f61-ef80c9f816c0\") " pod="openshift-console/console-766b49dfb7-zdlzb" Oct 08 18:23:04 crc kubenswrapper[4750]: I1008 18:23:04.109964 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6bdfefd5-db52-4117-9f61-ef80c9f816c0-service-ca\") pod \"console-766b49dfb7-zdlzb\" (UID: \"6bdfefd5-db52-4117-9f61-ef80c9f816c0\") " pod="openshift-console/console-766b49dfb7-zdlzb" Oct 08 18:23:04 crc kubenswrapper[4750]: I1008 18:23:04.110014 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6bdfefd5-db52-4117-9f61-ef80c9f816c0-console-serving-cert\") pod \"console-766b49dfb7-zdlzb\" (UID: \"6bdfefd5-db52-4117-9f61-ef80c9f816c0\") " pod="openshift-console/console-766b49dfb7-zdlzb" Oct 08 18:23:04 crc kubenswrapper[4750]: I1008 18:23:04.110034 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6bdfefd5-db52-4117-9f61-ef80c9f816c0-console-config\") pod \"console-766b49dfb7-zdlzb\" (UID: \"6bdfefd5-db52-4117-9f61-ef80c9f816c0\") " pod="openshift-console/console-766b49dfb7-zdlzb" Oct 08 18:23:04 crc kubenswrapper[4750]: I1008 18:23:04.110050 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6bdfefd5-db52-4117-9f61-ef80c9f816c0-console-oauth-config\") pod \"console-766b49dfb7-zdlzb\" (UID: \"6bdfefd5-db52-4117-9f61-ef80c9f816c0\") " pod="openshift-console/console-766b49dfb7-zdlzb" Oct 08 18:23:04 crc kubenswrapper[4750]: I1008 18:23:04.110079 4750 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-wrw5z\" (UniqueName: \"kubernetes.io/projected/6bdfefd5-db52-4117-9f61-ef80c9f816c0-kube-api-access-wrw5z\") pod \"console-766b49dfb7-zdlzb\" (UID: \"6bdfefd5-db52-4117-9f61-ef80c9f816c0\") " pod="openshift-console/console-766b49dfb7-zdlzb" Oct 08 18:23:04 crc kubenswrapper[4750]: I1008 18:23:04.110100 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6bdfefd5-db52-4117-9f61-ef80c9f816c0-oauth-serving-cert\") pod \"console-766b49dfb7-zdlzb\" (UID: \"6bdfefd5-db52-4117-9f61-ef80c9f816c0\") " pod="openshift-console/console-766b49dfb7-zdlzb" Oct 08 18:23:04 crc kubenswrapper[4750]: I1008 18:23:04.110913 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6bdfefd5-db52-4117-9f61-ef80c9f816c0-oauth-serving-cert\") pod \"console-766b49dfb7-zdlzb\" (UID: \"6bdfefd5-db52-4117-9f61-ef80c9f816c0\") " pod="openshift-console/console-766b49dfb7-zdlzb" Oct 08 18:23:04 crc kubenswrapper[4750]: I1008 18:23:04.111775 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6bdfefd5-db52-4117-9f61-ef80c9f816c0-console-config\") pod \"console-766b49dfb7-zdlzb\" (UID: \"6bdfefd5-db52-4117-9f61-ef80c9f816c0\") " pod="openshift-console/console-766b49dfb7-zdlzb" Oct 08 18:23:04 crc kubenswrapper[4750]: I1008 18:23:04.111955 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bdfefd5-db52-4117-9f61-ef80c9f816c0-trusted-ca-bundle\") pod \"console-766b49dfb7-zdlzb\" (UID: \"6bdfefd5-db52-4117-9f61-ef80c9f816c0\") " pod="openshift-console/console-766b49dfb7-zdlzb" Oct 08 18:23:04 crc kubenswrapper[4750]: I1008 18:23:04.112191 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/6bdfefd5-db52-4117-9f61-ef80c9f816c0-service-ca\") pod \"console-766b49dfb7-zdlzb\" (UID: \"6bdfefd5-db52-4117-9f61-ef80c9f816c0\") " pod="openshift-console/console-766b49dfb7-zdlzb" Oct 08 18:23:04 crc kubenswrapper[4750]: I1008 18:23:04.114480 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6bdfefd5-db52-4117-9f61-ef80c9f816c0-console-serving-cert\") pod \"console-766b49dfb7-zdlzb\" (UID: \"6bdfefd5-db52-4117-9f61-ef80c9f816c0\") " pod="openshift-console/console-766b49dfb7-zdlzb" Oct 08 18:23:04 crc kubenswrapper[4750]: I1008 18:23:04.115791 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6bdfefd5-db52-4117-9f61-ef80c9f816c0-console-oauth-config\") pod \"console-766b49dfb7-zdlzb\" (UID: \"6bdfefd5-db52-4117-9f61-ef80c9f816c0\") " pod="openshift-console/console-766b49dfb7-zdlzb" Oct 08 18:23:04 crc kubenswrapper[4750]: I1008 18:23:04.136839 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrw5z\" (UniqueName: \"kubernetes.io/projected/6bdfefd5-db52-4117-9f61-ef80c9f816c0-kube-api-access-wrw5z\") pod \"console-766b49dfb7-zdlzb\" (UID: \"6bdfefd5-db52-4117-9f61-ef80c9f816c0\") " pod="openshift-console/console-766b49dfb7-zdlzb" Oct 08 18:23:04 crc kubenswrapper[4750]: I1008 18:23:04.173060 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-v69zk"] Oct 08 18:23:04 crc kubenswrapper[4750]: W1008 18:23:04.176475 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cb56acc_cb75_4140_82db_412d149f4ba0.slice/crio-cf9b9d55ecc1cd91be0be0682bf206e8b23407b4364fc4da84e02fc62f61a3ae WatchSource:0}: Error finding container cf9b9d55ecc1cd91be0be0682bf206e8b23407b4364fc4da84e02fc62f61a3ae: Status 404 
returned error can't find the container with id cf9b9d55ecc1cd91be0be0682bf206e8b23407b4364fc4da84e02fc62f61a3ae Oct 08 18:23:04 crc kubenswrapper[4750]: I1008 18:23:04.221185 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-766b49dfb7-zdlzb" Oct 08 18:23:04 crc kubenswrapper[4750]: I1008 18:23:04.312788 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/157527e2-af03-406e-8757-abd19563422d-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-pzxbn\" (UID: \"157527e2-af03-406e-8757-abd19563422d\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-pzxbn" Oct 08 18:23:04 crc kubenswrapper[4750]: I1008 18:23:04.317006 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/157527e2-af03-406e-8757-abd19563422d-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-pzxbn\" (UID: \"157527e2-af03-406e-8757-abd19563422d\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-pzxbn" Oct 08 18:23:04 crc kubenswrapper[4750]: I1008 18:23:04.429373 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-pzxbn" Oct 08 18:23:04 crc kubenswrapper[4750]: I1008 18:23:04.455151 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-766b49dfb7-zdlzb"] Oct 08 18:23:04 crc kubenswrapper[4750]: W1008 18:23:04.462666 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bdfefd5_db52_4117_9f61_ef80c9f816c0.slice/crio-087e030da71ddcfddb474d727a7ae75e856f24b5cdad9f854b59f025b926757c WatchSource:0}: Error finding container 087e030da71ddcfddb474d727a7ae75e856f24b5cdad9f854b59f025b926757c: Status 404 returned error can't find the container with id 087e030da71ddcfddb474d727a7ae75e856f24b5cdad9f854b59f025b926757c Oct 08 18:23:04 crc kubenswrapper[4750]: I1008 18:23:04.546812 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-8j7zl" event={"ID":"eb4084db-bdd2-4c22-9eaa-0919b73e0977","Type":"ContainerStarted","Data":"7d7f8177711a3849350ef8734c65c56a974208cf53a8cda8a791e8a55414de79"} Oct 08 18:23:04 crc kubenswrapper[4750]: I1008 18:23:04.548399 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-766b49dfb7-zdlzb" event={"ID":"6bdfefd5-db52-4117-9f61-ef80c9f816c0","Type":"ContainerStarted","Data":"087e030da71ddcfddb474d727a7ae75e856f24b5cdad9f854b59f025b926757c"} Oct 08 18:23:04 crc kubenswrapper[4750]: I1008 18:23:04.550148 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-zgswq" event={"ID":"83a1d5d8-1631-4df6-8b0e-a48e52fd4972","Type":"ContainerStarted","Data":"199db084f96752a767747b47310fb285fb463da145f339e5cb57a4803f4d1f2e"} Oct 08 18:23:04 crc kubenswrapper[4750]: I1008 18:23:04.552115 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-v69zk" 
event={"ID":"2cb56acc-cb75-4140-82db-412d149f4ba0","Type":"ContainerStarted","Data":"cf9b9d55ecc1cd91be0be0682bf206e8b23407b4364fc4da84e02fc62f61a3ae"} Oct 08 18:23:04 crc kubenswrapper[4750]: I1008 18:23:04.888576 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-pzxbn"] Oct 08 18:23:05 crc kubenswrapper[4750]: I1008 18:23:05.569475 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-pzxbn" event={"ID":"157527e2-af03-406e-8757-abd19563422d","Type":"ContainerStarted","Data":"aebd57429b2f4f808726a7616463f1d65fafa88bd7dd8a69ba0eadcba3f2843b"} Oct 08 18:23:05 crc kubenswrapper[4750]: I1008 18:23:05.571856 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-766b49dfb7-zdlzb" event={"ID":"6bdfefd5-db52-4117-9f61-ef80c9f816c0","Type":"ContainerStarted","Data":"786b5ea0c0a24c192a2380e3c8de4134d90512a38cca7bcd222c5cdbf5c1a98a"} Oct 08 18:23:07 crc kubenswrapper[4750]: I1008 18:23:07.586539 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-pzxbn" event={"ID":"157527e2-af03-406e-8757-abd19563422d","Type":"ContainerStarted","Data":"7c451b62830d6148aa4be42bf8f0cda0814c70027ffed5ea4b68bfa63066ae51"} Oct 08 18:23:07 crc kubenswrapper[4750]: I1008 18:23:07.587604 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-pzxbn" Oct 08 18:23:07 crc kubenswrapper[4750]: I1008 18:23:07.596870 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-zgswq" event={"ID":"83a1d5d8-1631-4df6-8b0e-a48e52fd4972","Type":"ContainerStarted","Data":"87fd698ad2b580a4ef3ce06fb67913cebd2b4c6659b532cd43a09f3a21775720"} Oct 08 18:23:07 crc kubenswrapper[4750]: I1008 18:23:07.597443 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-zgswq" Oct 08 18:23:07 
crc kubenswrapper[4750]: I1008 18:23:07.600104 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-v69zk" event={"ID":"2cb56acc-cb75-4140-82db-412d149f4ba0","Type":"ContainerStarted","Data":"100024ae2760c5cebcbc65e072381d48355e5bb02e6f6ca2c54a4a6327cd6476"} Oct 08 18:23:07 crc kubenswrapper[4750]: I1008 18:23:07.602606 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-8j7zl" event={"ID":"eb4084db-bdd2-4c22-9eaa-0919b73e0977","Type":"ContainerStarted","Data":"0316d3c147a0d71e18df4c1a066ab4cccf8b04fee80b08d16ffef7210656e48c"} Oct 08 18:23:07 crc kubenswrapper[4750]: I1008 18:23:07.614158 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-766b49dfb7-zdlzb" podStartSLOduration=4.614139706 podStartE2EDuration="4.614139706s" podCreationTimestamp="2025-10-08 18:23:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:23:05.593826248 +0000 UTC m=+741.506797291" watchObservedRunningTime="2025-10-08 18:23:07.614139706 +0000 UTC m=+743.527110749" Oct 08 18:23:07 crc kubenswrapper[4750]: I1008 18:23:07.614781 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-pzxbn" podStartSLOduration=2.849224407 podStartE2EDuration="4.614772913s" podCreationTimestamp="2025-10-08 18:23:03 +0000 UTC" firstStartedPulling="2025-10-08 18:23:04.899094868 +0000 UTC m=+740.812065881" lastFinishedPulling="2025-10-08 18:23:06.664643374 +0000 UTC m=+742.577614387" observedRunningTime="2025-10-08 18:23:07.611265002 +0000 UTC m=+743.524236015" watchObservedRunningTime="2025-10-08 18:23:07.614772913 +0000 UTC m=+743.527743956" Oct 08 18:23:07 crc kubenswrapper[4750]: I1008 18:23:07.641840 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-v69zk" podStartSLOduration=2.15971679 podStartE2EDuration="4.641805138s" podCreationTimestamp="2025-10-08 18:23:03 +0000 UTC" firstStartedPulling="2025-10-08 18:23:04.178710707 +0000 UTC m=+740.091681720" lastFinishedPulling="2025-10-08 18:23:06.660799035 +0000 UTC m=+742.573770068" observedRunningTime="2025-10-08 18:23:07.639060868 +0000 UTC m=+743.552031911" watchObservedRunningTime="2025-10-08 18:23:07.641805138 +0000 UTC m=+743.554776201" Oct 08 18:23:07 crc kubenswrapper[4750]: I1008 18:23:07.667974 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-zgswq" podStartSLOduration=1.914808774 podStartE2EDuration="4.667946182s" podCreationTimestamp="2025-10-08 18:23:03 +0000 UTC" firstStartedPulling="2025-10-08 18:23:03.908673873 +0000 UTC m=+739.821644886" lastFinishedPulling="2025-10-08 18:23:06.661811281 +0000 UTC m=+742.574782294" observedRunningTime="2025-10-08 18:23:07.665504989 +0000 UTC m=+743.578476012" watchObservedRunningTime="2025-10-08 18:23:07.667946182 +0000 UTC m=+743.580917195" Oct 08 18:23:07 crc kubenswrapper[4750]: I1008 18:23:07.805440 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4zxlq"] Oct 08 18:23:07 crc kubenswrapper[4750]: I1008 18:23:07.805797 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-4zxlq" podUID="90f5c875-3507-453d-9901-fd60a1476f71" containerName="controller-manager" containerID="cri-o://00781f350fb9e6c881c670fa305232fcc6f4f516dd647ac20adadf17d821a8a6" gracePeriod=30 Oct 08 18:23:07 crc kubenswrapper[4750]: I1008 18:23:07.906251 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-69h7b"] Oct 08 18:23:07 crc kubenswrapper[4750]: I1008 18:23:07.906536 4750 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-69h7b" podUID="a6530eb5-a257-4381-a176-b0e0972181ac" containerName="route-controller-manager" containerID="cri-o://22f7ad3ba4a1396ba8fd55914dcd52bcdfda04d24e6ac10c58e2e09902536da4" gracePeriod=30 Oct 08 18:23:08 crc kubenswrapper[4750]: I1008 18:23:08.249629 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-4zxlq" Oct 08 18:23:08 crc kubenswrapper[4750]: I1008 18:23:08.302741 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-69h7b" Oct 08 18:23:08 crc kubenswrapper[4750]: I1008 18:23:08.370299 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90f5c875-3507-453d-9901-fd60a1476f71-config\") pod \"90f5c875-3507-453d-9901-fd60a1476f71\" (UID: \"90f5c875-3507-453d-9901-fd60a1476f71\") " Oct 08 18:23:08 crc kubenswrapper[4750]: I1008 18:23:08.370406 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90f5c875-3507-453d-9901-fd60a1476f71-serving-cert\") pod \"90f5c875-3507-453d-9901-fd60a1476f71\" (UID: \"90f5c875-3507-453d-9901-fd60a1476f71\") " Oct 08 18:23:08 crc kubenswrapper[4750]: I1008 18:23:08.370590 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/90f5c875-3507-453d-9901-fd60a1476f71-client-ca\") pod \"90f5c875-3507-453d-9901-fd60a1476f71\" (UID: \"90f5c875-3507-453d-9901-fd60a1476f71\") " Oct 08 18:23:08 crc kubenswrapper[4750]: I1008 18:23:08.370635 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bzw6\" (UniqueName: 
\"kubernetes.io/projected/90f5c875-3507-453d-9901-fd60a1476f71-kube-api-access-5bzw6\") pod \"90f5c875-3507-453d-9901-fd60a1476f71\" (UID: \"90f5c875-3507-453d-9901-fd60a1476f71\") " Oct 08 18:23:08 crc kubenswrapper[4750]: I1008 18:23:08.370699 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/90f5c875-3507-453d-9901-fd60a1476f71-proxy-ca-bundles\") pod \"90f5c875-3507-453d-9901-fd60a1476f71\" (UID: \"90f5c875-3507-453d-9901-fd60a1476f71\") " Oct 08 18:23:08 crc kubenswrapper[4750]: I1008 18:23:08.371310 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90f5c875-3507-453d-9901-fd60a1476f71-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "90f5c875-3507-453d-9901-fd60a1476f71" (UID: "90f5c875-3507-453d-9901-fd60a1476f71"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:23:08 crc kubenswrapper[4750]: I1008 18:23:08.371372 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90f5c875-3507-453d-9901-fd60a1476f71-config" (OuterVolumeSpecName: "config") pod "90f5c875-3507-453d-9901-fd60a1476f71" (UID: "90f5c875-3507-453d-9901-fd60a1476f71"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:23:08 crc kubenswrapper[4750]: I1008 18:23:08.371444 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90f5c875-3507-453d-9901-fd60a1476f71-client-ca" (OuterVolumeSpecName: "client-ca") pod "90f5c875-3507-453d-9901-fd60a1476f71" (UID: "90f5c875-3507-453d-9901-fd60a1476f71"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:23:08 crc kubenswrapper[4750]: I1008 18:23:08.377332 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90f5c875-3507-453d-9901-fd60a1476f71-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "90f5c875-3507-453d-9901-fd60a1476f71" (UID: "90f5c875-3507-453d-9901-fd60a1476f71"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:23:08 crc kubenswrapper[4750]: I1008 18:23:08.381833 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90f5c875-3507-453d-9901-fd60a1476f71-kube-api-access-5bzw6" (OuterVolumeSpecName: "kube-api-access-5bzw6") pod "90f5c875-3507-453d-9901-fd60a1476f71" (UID: "90f5c875-3507-453d-9901-fd60a1476f71"). InnerVolumeSpecName "kube-api-access-5bzw6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:23:08 crc kubenswrapper[4750]: I1008 18:23:08.471872 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5fcm\" (UniqueName: \"kubernetes.io/projected/a6530eb5-a257-4381-a176-b0e0972181ac-kube-api-access-t5fcm\") pod \"a6530eb5-a257-4381-a176-b0e0972181ac\" (UID: \"a6530eb5-a257-4381-a176-b0e0972181ac\") " Oct 08 18:23:08 crc kubenswrapper[4750]: I1008 18:23:08.471988 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a6530eb5-a257-4381-a176-b0e0972181ac-client-ca\") pod \"a6530eb5-a257-4381-a176-b0e0972181ac\" (UID: \"a6530eb5-a257-4381-a176-b0e0972181ac\") " Oct 08 18:23:08 crc kubenswrapper[4750]: I1008 18:23:08.473181 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6530eb5-a257-4381-a176-b0e0972181ac-client-ca" (OuterVolumeSpecName: "client-ca") pod "a6530eb5-a257-4381-a176-b0e0972181ac" (UID: "a6530eb5-a257-4381-a176-b0e0972181ac"). 
InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:23:08 crc kubenswrapper[4750]: I1008 18:23:08.473280 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6530eb5-a257-4381-a176-b0e0972181ac-serving-cert\") pod \"a6530eb5-a257-4381-a176-b0e0972181ac\" (UID: \"a6530eb5-a257-4381-a176-b0e0972181ac\") " Oct 08 18:23:08 crc kubenswrapper[4750]: I1008 18:23:08.473337 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6530eb5-a257-4381-a176-b0e0972181ac-config\") pod \"a6530eb5-a257-4381-a176-b0e0972181ac\" (UID: \"a6530eb5-a257-4381-a176-b0e0972181ac\") " Oct 08 18:23:08 crc kubenswrapper[4750]: I1008 18:23:08.474131 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6530eb5-a257-4381-a176-b0e0972181ac-config" (OuterVolumeSpecName: "config") pod "a6530eb5-a257-4381-a176-b0e0972181ac" (UID: "a6530eb5-a257-4381-a176-b0e0972181ac"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:23:08 crc kubenswrapper[4750]: I1008 18:23:08.474388 4750 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a6530eb5-a257-4381-a176-b0e0972181ac-client-ca\") on node \"crc\" DevicePath \"\"" Oct 08 18:23:08 crc kubenswrapper[4750]: I1008 18:23:08.474405 4750 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/90f5c875-3507-453d-9901-fd60a1476f71-client-ca\") on node \"crc\" DevicePath \"\"" Oct 08 18:23:08 crc kubenswrapper[4750]: I1008 18:23:08.474418 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bzw6\" (UniqueName: \"kubernetes.io/projected/90f5c875-3507-453d-9901-fd60a1476f71-kube-api-access-5bzw6\") on node \"crc\" DevicePath \"\"" Oct 08 18:23:08 crc kubenswrapper[4750]: I1008 18:23:08.474430 4750 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/90f5c875-3507-453d-9901-fd60a1476f71-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 08 18:23:08 crc kubenswrapper[4750]: I1008 18:23:08.474441 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6530eb5-a257-4381-a176-b0e0972181ac-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:23:08 crc kubenswrapper[4750]: I1008 18:23:08.474452 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90f5c875-3507-453d-9901-fd60a1476f71-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:23:08 crc kubenswrapper[4750]: I1008 18:23:08.474463 4750 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90f5c875-3507-453d-9901-fd60a1476f71-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 18:23:08 crc kubenswrapper[4750]: I1008 18:23:08.476494 4750 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6530eb5-a257-4381-a176-b0e0972181ac-kube-api-access-t5fcm" (OuterVolumeSpecName: "kube-api-access-t5fcm") pod "a6530eb5-a257-4381-a176-b0e0972181ac" (UID: "a6530eb5-a257-4381-a176-b0e0972181ac"). InnerVolumeSpecName "kube-api-access-t5fcm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:23:08 crc kubenswrapper[4750]: I1008 18:23:08.476761 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6530eb5-a257-4381-a176-b0e0972181ac-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a6530eb5-a257-4381-a176-b0e0972181ac" (UID: "a6530eb5-a257-4381-a176-b0e0972181ac"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:23:08 crc kubenswrapper[4750]: I1008 18:23:08.576388 4750 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6530eb5-a257-4381-a176-b0e0972181ac-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 18:23:08 crc kubenswrapper[4750]: I1008 18:23:08.576424 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5fcm\" (UniqueName: \"kubernetes.io/projected/a6530eb5-a257-4381-a176-b0e0972181ac-kube-api-access-t5fcm\") on node \"crc\" DevicePath \"\"" Oct 08 18:23:08 crc kubenswrapper[4750]: I1008 18:23:08.614292 4750 generic.go:334] "Generic (PLEG): container finished" podID="a6530eb5-a257-4381-a176-b0e0972181ac" containerID="22f7ad3ba4a1396ba8fd55914dcd52bcdfda04d24e6ac10c58e2e09902536da4" exitCode=0 Oct 08 18:23:08 crc kubenswrapper[4750]: I1008 18:23:08.614374 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-69h7b" event={"ID":"a6530eb5-a257-4381-a176-b0e0972181ac","Type":"ContainerDied","Data":"22f7ad3ba4a1396ba8fd55914dcd52bcdfda04d24e6ac10c58e2e09902536da4"} Oct 08 18:23:08 crc kubenswrapper[4750]: I1008 
18:23:08.614390 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-69h7b" Oct 08 18:23:08 crc kubenswrapper[4750]: I1008 18:23:08.614421 4750 scope.go:117] "RemoveContainer" containerID="22f7ad3ba4a1396ba8fd55914dcd52bcdfda04d24e6ac10c58e2e09902536da4" Oct 08 18:23:08 crc kubenswrapper[4750]: I1008 18:23:08.614407 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-69h7b" event={"ID":"a6530eb5-a257-4381-a176-b0e0972181ac","Type":"ContainerDied","Data":"188d58463632b1247c6c47ced7a84c2e795d9ae56a4a2dbd2c26460b1c38cb05"} Oct 08 18:23:08 crc kubenswrapper[4750]: I1008 18:23:08.622489 4750 generic.go:334] "Generic (PLEG): container finished" podID="90f5c875-3507-453d-9901-fd60a1476f71" containerID="00781f350fb9e6c881c670fa305232fcc6f4f516dd647ac20adadf17d821a8a6" exitCode=0 Oct 08 18:23:08 crc kubenswrapper[4750]: I1008 18:23:08.623277 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-4zxlq" Oct 08 18:23:08 crc kubenswrapper[4750]: I1008 18:23:08.624599 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-4zxlq" event={"ID":"90f5c875-3507-453d-9901-fd60a1476f71","Type":"ContainerDied","Data":"00781f350fb9e6c881c670fa305232fcc6f4f516dd647ac20adadf17d821a8a6"} Oct 08 18:23:08 crc kubenswrapper[4750]: I1008 18:23:08.624738 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-4zxlq" event={"ID":"90f5c875-3507-453d-9901-fd60a1476f71","Type":"ContainerDied","Data":"37d3972a4ef95ee2f606e183a67b5b6aece3fad52bef027f385080c76eb38284"} Oct 08 18:23:08 crc kubenswrapper[4750]: I1008 18:23:08.658409 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-69h7b"] Oct 08 18:23:08 crc kubenswrapper[4750]: I1008 18:23:08.675887 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-69h7b"] Oct 08 18:23:08 crc kubenswrapper[4750]: I1008 18:23:08.681965 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4zxlq"] Oct 08 18:23:08 crc kubenswrapper[4750]: I1008 18:23:08.689236 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4zxlq"] Oct 08 18:23:08 crc kubenswrapper[4750]: I1008 18:23:08.741503 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90f5c875-3507-453d-9901-fd60a1476f71" path="/var/lib/kubelet/pods/90f5c875-3507-453d-9901-fd60a1476f71/volumes" Oct 08 18:23:08 crc kubenswrapper[4750]: I1008 18:23:08.742209 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6530eb5-a257-4381-a176-b0e0972181ac" 
path="/var/lib/kubelet/pods/a6530eb5-a257-4381-a176-b0e0972181ac/volumes" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.106891 4750 scope.go:117] "RemoveContainer" containerID="22f7ad3ba4a1396ba8fd55914dcd52bcdfda04d24e6ac10c58e2e09902536da4" Oct 08 18:23:09 crc kubenswrapper[4750]: E1008 18:23:09.109188 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22f7ad3ba4a1396ba8fd55914dcd52bcdfda04d24e6ac10c58e2e09902536da4\": container with ID starting with 22f7ad3ba4a1396ba8fd55914dcd52bcdfda04d24e6ac10c58e2e09902536da4 not found: ID does not exist" containerID="22f7ad3ba4a1396ba8fd55914dcd52bcdfda04d24e6ac10c58e2e09902536da4" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.109241 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22f7ad3ba4a1396ba8fd55914dcd52bcdfda04d24e6ac10c58e2e09902536da4"} err="failed to get container status \"22f7ad3ba4a1396ba8fd55914dcd52bcdfda04d24e6ac10c58e2e09902536da4\": rpc error: code = NotFound desc = could not find container \"22f7ad3ba4a1396ba8fd55914dcd52bcdfda04d24e6ac10c58e2e09902536da4\": container with ID starting with 22f7ad3ba4a1396ba8fd55914dcd52bcdfda04d24e6ac10c58e2e09902536da4 not found: ID does not exist" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.109284 4750 scope.go:117] "RemoveContainer" containerID="00781f350fb9e6c881c670fa305232fcc6f4f516dd647ac20adadf17d821a8a6" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.170635 4750 scope.go:117] "RemoveContainer" containerID="00781f350fb9e6c881c670fa305232fcc6f4f516dd647ac20adadf17d821a8a6" Oct 08 18:23:09 crc kubenswrapper[4750]: E1008 18:23:09.171123 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00781f350fb9e6c881c670fa305232fcc6f4f516dd647ac20adadf17d821a8a6\": container with ID starting with 
00781f350fb9e6c881c670fa305232fcc6f4f516dd647ac20adadf17d821a8a6 not found: ID does not exist" containerID="00781f350fb9e6c881c670fa305232fcc6f4f516dd647ac20adadf17d821a8a6" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.171152 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00781f350fb9e6c881c670fa305232fcc6f4f516dd647ac20adadf17d821a8a6"} err="failed to get container status \"00781f350fb9e6c881c670fa305232fcc6f4f516dd647ac20adadf17d821a8a6\": rpc error: code = NotFound desc = could not find container \"00781f350fb9e6c881c670fa305232fcc6f4f516dd647ac20adadf17d821a8a6\": container with ID starting with 00781f350fb9e6c881c670fa305232fcc6f4f516dd647ac20adadf17d821a8a6 not found: ID does not exist" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.311952 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-857fbb49f9-njdgx"] Oct 08 18:23:09 crc kubenswrapper[4750]: E1008 18:23:09.312358 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6530eb5-a257-4381-a176-b0e0972181ac" containerName="route-controller-manager" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.312375 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6530eb5-a257-4381-a176-b0e0972181ac" containerName="route-controller-manager" Oct 08 18:23:09 crc kubenswrapper[4750]: E1008 18:23:09.312392 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90f5c875-3507-453d-9901-fd60a1476f71" containerName="controller-manager" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.312398 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="90f5c875-3507-453d-9901-fd60a1476f71" containerName="controller-manager" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.312529 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="90f5c875-3507-453d-9901-fd60a1476f71" containerName="controller-manager" Oct 08 18:23:09 crc 
kubenswrapper[4750]: I1008 18:23:09.312541 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6530eb5-a257-4381-a176-b0e0972181ac" containerName="route-controller-manager" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.315008 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-857fbb49f9-njdgx" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.317884 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.318082 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.318269 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.318826 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.318950 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-645b494dc4-s6hl4"] Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.319006 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.319224 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.320000 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-645b494dc4-s6hl4" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.322624 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.322628 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.322710 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.323255 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.323364 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.323458 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-857fbb49f9-njdgx"] Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.324803 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.327748 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-645b494dc4-s6hl4"] Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.329270 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.390371 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/732c1ced-a511-45fc-ad45-5f7e657cb837-config\") pod \"controller-manager-857fbb49f9-njdgx\" (UID: \"732c1ced-a511-45fc-ad45-5f7e657cb837\") " pod="openshift-controller-manager/controller-manager-857fbb49f9-njdgx" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.390439 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/732c1ced-a511-45fc-ad45-5f7e657cb837-serving-cert\") pod \"controller-manager-857fbb49f9-njdgx\" (UID: \"732c1ced-a511-45fc-ad45-5f7e657cb837\") " pod="openshift-controller-manager/controller-manager-857fbb49f9-njdgx" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.390534 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cab4f02-a4c4-40f7-805c-f050988162c4-config\") pod \"route-controller-manager-645b494dc4-s6hl4\" (UID: \"4cab4f02-a4c4-40f7-805c-f050988162c4\") " pod="openshift-route-controller-manager/route-controller-manager-645b494dc4-s6hl4" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.390580 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/732c1ced-a511-45fc-ad45-5f7e657cb837-client-ca\") pod \"controller-manager-857fbb49f9-njdgx\" (UID: \"732c1ced-a511-45fc-ad45-5f7e657cb837\") " pod="openshift-controller-manager/controller-manager-857fbb49f9-njdgx" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.390656 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4cab4f02-a4c4-40f7-805c-f050988162c4-client-ca\") pod \"route-controller-manager-645b494dc4-s6hl4\" (UID: \"4cab4f02-a4c4-40f7-805c-f050988162c4\") " 
pod="openshift-route-controller-manager/route-controller-manager-645b494dc4-s6hl4" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.390696 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skcfp\" (UniqueName: \"kubernetes.io/projected/4cab4f02-a4c4-40f7-805c-f050988162c4-kube-api-access-skcfp\") pod \"route-controller-manager-645b494dc4-s6hl4\" (UID: \"4cab4f02-a4c4-40f7-805c-f050988162c4\") " pod="openshift-route-controller-manager/route-controller-manager-645b494dc4-s6hl4" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.390738 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/732c1ced-a511-45fc-ad45-5f7e657cb837-proxy-ca-bundles\") pod \"controller-manager-857fbb49f9-njdgx\" (UID: \"732c1ced-a511-45fc-ad45-5f7e657cb837\") " pod="openshift-controller-manager/controller-manager-857fbb49f9-njdgx" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.390785 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4cab4f02-a4c4-40f7-805c-f050988162c4-serving-cert\") pod \"route-controller-manager-645b494dc4-s6hl4\" (UID: \"4cab4f02-a4c4-40f7-805c-f050988162c4\") " pod="openshift-route-controller-manager/route-controller-manager-645b494dc4-s6hl4" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.390856 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmhgg\" (UniqueName: \"kubernetes.io/projected/732c1ced-a511-45fc-ad45-5f7e657cb837-kube-api-access-zmhgg\") pod \"controller-manager-857fbb49f9-njdgx\" (UID: \"732c1ced-a511-45fc-ad45-5f7e657cb837\") " pod="openshift-controller-manager/controller-manager-857fbb49f9-njdgx" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.493950 4750 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmhgg\" (UniqueName: \"kubernetes.io/projected/732c1ced-a511-45fc-ad45-5f7e657cb837-kube-api-access-zmhgg\") pod \"controller-manager-857fbb49f9-njdgx\" (UID: \"732c1ced-a511-45fc-ad45-5f7e657cb837\") " pod="openshift-controller-manager/controller-manager-857fbb49f9-njdgx" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.494019 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/732c1ced-a511-45fc-ad45-5f7e657cb837-config\") pod \"controller-manager-857fbb49f9-njdgx\" (UID: \"732c1ced-a511-45fc-ad45-5f7e657cb837\") " pod="openshift-controller-manager/controller-manager-857fbb49f9-njdgx" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.494054 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/732c1ced-a511-45fc-ad45-5f7e657cb837-serving-cert\") pod \"controller-manager-857fbb49f9-njdgx\" (UID: \"732c1ced-a511-45fc-ad45-5f7e657cb837\") " pod="openshift-controller-manager/controller-manager-857fbb49f9-njdgx" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.494091 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cab4f02-a4c4-40f7-805c-f050988162c4-config\") pod \"route-controller-manager-645b494dc4-s6hl4\" (UID: \"4cab4f02-a4c4-40f7-805c-f050988162c4\") " pod="openshift-route-controller-manager/route-controller-manager-645b494dc4-s6hl4" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.494112 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/732c1ced-a511-45fc-ad45-5f7e657cb837-client-ca\") pod \"controller-manager-857fbb49f9-njdgx\" (UID: \"732c1ced-a511-45fc-ad45-5f7e657cb837\") " 
pod="openshift-controller-manager/controller-manager-857fbb49f9-njdgx" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.494140 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4cab4f02-a4c4-40f7-805c-f050988162c4-client-ca\") pod \"route-controller-manager-645b494dc4-s6hl4\" (UID: \"4cab4f02-a4c4-40f7-805c-f050988162c4\") " pod="openshift-route-controller-manager/route-controller-manager-645b494dc4-s6hl4" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.494161 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skcfp\" (UniqueName: \"kubernetes.io/projected/4cab4f02-a4c4-40f7-805c-f050988162c4-kube-api-access-skcfp\") pod \"route-controller-manager-645b494dc4-s6hl4\" (UID: \"4cab4f02-a4c4-40f7-805c-f050988162c4\") " pod="openshift-route-controller-manager/route-controller-manager-645b494dc4-s6hl4" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.494190 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/732c1ced-a511-45fc-ad45-5f7e657cb837-proxy-ca-bundles\") pod \"controller-manager-857fbb49f9-njdgx\" (UID: \"732c1ced-a511-45fc-ad45-5f7e657cb837\") " pod="openshift-controller-manager/controller-manager-857fbb49f9-njdgx" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.494223 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4cab4f02-a4c4-40f7-805c-f050988162c4-serving-cert\") pod \"route-controller-manager-645b494dc4-s6hl4\" (UID: \"4cab4f02-a4c4-40f7-805c-f050988162c4\") " pod="openshift-route-controller-manager/route-controller-manager-645b494dc4-s6hl4" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.497122 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/732c1ced-a511-45fc-ad45-5f7e657cb837-client-ca\") pod \"controller-manager-857fbb49f9-njdgx\" (UID: \"732c1ced-a511-45fc-ad45-5f7e657cb837\") " pod="openshift-controller-manager/controller-manager-857fbb49f9-njdgx" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.499120 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4cab4f02-a4c4-40f7-805c-f050988162c4-client-ca\") pod \"route-controller-manager-645b494dc4-s6hl4\" (UID: \"4cab4f02-a4c4-40f7-805c-f050988162c4\") " pod="openshift-route-controller-manager/route-controller-manager-645b494dc4-s6hl4" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.505053 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cab4f02-a4c4-40f7-805c-f050988162c4-config\") pod \"route-controller-manager-645b494dc4-s6hl4\" (UID: \"4cab4f02-a4c4-40f7-805c-f050988162c4\") " pod="openshift-route-controller-manager/route-controller-manager-645b494dc4-s6hl4" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.506816 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/732c1ced-a511-45fc-ad45-5f7e657cb837-proxy-ca-bundles\") pod \"controller-manager-857fbb49f9-njdgx\" (UID: \"732c1ced-a511-45fc-ad45-5f7e657cb837\") " pod="openshift-controller-manager/controller-manager-857fbb49f9-njdgx" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.506830 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4cab4f02-a4c4-40f7-805c-f050988162c4-serving-cert\") pod \"route-controller-manager-645b494dc4-s6hl4\" (UID: \"4cab4f02-a4c4-40f7-805c-f050988162c4\") " pod="openshift-route-controller-manager/route-controller-manager-645b494dc4-s6hl4" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.507688 4750 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/732c1ced-a511-45fc-ad45-5f7e657cb837-config\") pod \"controller-manager-857fbb49f9-njdgx\" (UID: \"732c1ced-a511-45fc-ad45-5f7e657cb837\") " pod="openshift-controller-manager/controller-manager-857fbb49f9-njdgx" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.508581 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/732c1ced-a511-45fc-ad45-5f7e657cb837-serving-cert\") pod \"controller-manager-857fbb49f9-njdgx\" (UID: \"732c1ced-a511-45fc-ad45-5f7e657cb837\") " pod="openshift-controller-manager/controller-manager-857fbb49f9-njdgx" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.516674 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skcfp\" (UniqueName: \"kubernetes.io/projected/4cab4f02-a4c4-40f7-805c-f050988162c4-kube-api-access-skcfp\") pod \"route-controller-manager-645b494dc4-s6hl4\" (UID: \"4cab4f02-a4c4-40f7-805c-f050988162c4\") " pod="openshift-route-controller-manager/route-controller-manager-645b494dc4-s6hl4" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.525076 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmhgg\" (UniqueName: \"kubernetes.io/projected/732c1ced-a511-45fc-ad45-5f7e657cb837-kube-api-access-zmhgg\") pod \"controller-manager-857fbb49f9-njdgx\" (UID: \"732c1ced-a511-45fc-ad45-5f7e657cb837\") " pod="openshift-controller-manager/controller-manager-857fbb49f9-njdgx" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.629219 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-8j7zl" event={"ID":"eb4084db-bdd2-4c22-9eaa-0919b73e0977","Type":"ContainerStarted","Data":"3751481bdcbf8df53726dab5930a3ff00c932edcfab672fb6f8093939fddd088"} Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.646816 
4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-8j7zl" podStartSLOduration=1.571368828 podStartE2EDuration="6.646799841s" podCreationTimestamp="2025-10-08 18:23:03 +0000 UTC" firstStartedPulling="2025-10-08 18:23:04.115809727 +0000 UTC m=+740.028780740" lastFinishedPulling="2025-10-08 18:23:09.19124074 +0000 UTC m=+745.104211753" observedRunningTime="2025-10-08 18:23:09.645692522 +0000 UTC m=+745.558663545" watchObservedRunningTime="2025-10-08 18:23:09.646799841 +0000 UTC m=+745.559770874" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.683925 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-857fbb49f9-njdgx" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.692328 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-645b494dc4-s6hl4" Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.807276 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-645b494dc4-s6hl4"] Oct 08 18:23:09 crc kubenswrapper[4750]: I1008 18:23:09.949354 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-645b494dc4-s6hl4"] Oct 08 18:23:10 crc kubenswrapper[4750]: I1008 18:23:10.004958 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-857fbb49f9-njdgx"] Oct 08 18:23:10 crc kubenswrapper[4750]: W1008 18:23:10.009836 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod732c1ced_a511_45fc_ad45_5f7e657cb837.slice/crio-685c1d9fbeee9494696c167e293fcfb3b0c86ec9a327cbb36d3a1414784b5cf0 WatchSource:0}: Error finding container 
685c1d9fbeee9494696c167e293fcfb3b0c86ec9a327cbb36d3a1414784b5cf0: Status 404 returned error can't find the container with id 685c1d9fbeee9494696c167e293fcfb3b0c86ec9a327cbb36d3a1414784b5cf0 Oct 08 18:23:10 crc kubenswrapper[4750]: I1008 18:23:10.641794 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-857fbb49f9-njdgx" event={"ID":"732c1ced-a511-45fc-ad45-5f7e657cb837","Type":"ContainerStarted","Data":"f7ca413084868031e83021da4ea107701e0ab3b7b7b266becbf74c80317b9786"} Oct 08 18:23:10 crc kubenswrapper[4750]: I1008 18:23:10.642227 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-857fbb49f9-njdgx" event={"ID":"732c1ced-a511-45fc-ad45-5f7e657cb837","Type":"ContainerStarted","Data":"685c1d9fbeee9494696c167e293fcfb3b0c86ec9a327cbb36d3a1414784b5cf0"} Oct 08 18:23:10 crc kubenswrapper[4750]: I1008 18:23:10.644115 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-857fbb49f9-njdgx" Oct 08 18:23:10 crc kubenswrapper[4750]: I1008 18:23:10.647503 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-645b494dc4-s6hl4" podUID="4cab4f02-a4c4-40f7-805c-f050988162c4" containerName="route-controller-manager" containerID="cri-o://2acd7039166a9ac27556bdc996fc38c64d8a6989decb082eda0ac5729ea879fe" gracePeriod=30 Oct 08 18:23:10 crc kubenswrapper[4750]: I1008 18:23:10.647779 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-645b494dc4-s6hl4" event={"ID":"4cab4f02-a4c4-40f7-805c-f050988162c4","Type":"ContainerStarted","Data":"2acd7039166a9ac27556bdc996fc38c64d8a6989decb082eda0ac5729ea879fe"} Oct 08 18:23:10 crc kubenswrapper[4750]: I1008 18:23:10.647804 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-645b494dc4-s6hl4" event={"ID":"4cab4f02-a4c4-40f7-805c-f050988162c4","Type":"ContainerStarted","Data":"3a5f77cd411352c2995933294c9882a3b98d448c136ca2e6c4c84ec30bda9877"} Oct 08 18:23:10 crc kubenswrapper[4750]: I1008 18:23:10.648439 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-645b494dc4-s6hl4" Oct 08 18:23:10 crc kubenswrapper[4750]: I1008 18:23:10.648872 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-857fbb49f9-njdgx" Oct 08 18:23:10 crc kubenswrapper[4750]: I1008 18:23:10.651748 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-645b494dc4-s6hl4" Oct 08 18:23:10 crc kubenswrapper[4750]: I1008 18:23:10.676191 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-645b494dc4-s6hl4" podStartSLOduration=3.676157289 podStartE2EDuration="3.676157289s" podCreationTimestamp="2025-10-08 18:23:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:23:10.674763713 +0000 UTC m=+746.587734726" watchObservedRunningTime="2025-10-08 18:23:10.676157289 +0000 UTC m=+746.589128302" Oct 08 18:23:10 crc kubenswrapper[4750]: I1008 18:23:10.676547 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-857fbb49f9-njdgx" podStartSLOduration=3.676540369 podStartE2EDuration="3.676540369s" podCreationTimestamp="2025-10-08 18:23:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:23:10.659900331 +0000 UTC m=+746.572871344" 
watchObservedRunningTime="2025-10-08 18:23:10.676540369 +0000 UTC m=+746.589511382" Oct 08 18:23:11 crc kubenswrapper[4750]: I1008 18:23:11.205864 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-645b494dc4-s6hl4" Oct 08 18:23:11 crc kubenswrapper[4750]: I1008 18:23:11.239783 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8684d5f8f9-qlnlq"] Oct 08 18:23:11 crc kubenswrapper[4750]: E1008 18:23:11.243064 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cab4f02-a4c4-40f7-805c-f050988162c4" containerName="route-controller-manager" Oct 08 18:23:11 crc kubenswrapper[4750]: I1008 18:23:11.243457 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cab4f02-a4c4-40f7-805c-f050988162c4" containerName="route-controller-manager" Oct 08 18:23:11 crc kubenswrapper[4750]: I1008 18:23:11.243951 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cab4f02-a4c4-40f7-805c-f050988162c4" containerName="route-controller-manager" Oct 08 18:23:11 crc kubenswrapper[4750]: I1008 18:23:11.245070 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8684d5f8f9-qlnlq" Oct 08 18:23:11 crc kubenswrapper[4750]: I1008 18:23:11.246882 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8684d5f8f9-qlnlq"] Oct 08 18:23:11 crc kubenswrapper[4750]: I1008 18:23:11.319906 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skcfp\" (UniqueName: \"kubernetes.io/projected/4cab4f02-a4c4-40f7-805c-f050988162c4-kube-api-access-skcfp\") pod \"4cab4f02-a4c4-40f7-805c-f050988162c4\" (UID: \"4cab4f02-a4c4-40f7-805c-f050988162c4\") " Oct 08 18:23:11 crc kubenswrapper[4750]: I1008 18:23:11.319965 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4cab4f02-a4c4-40f7-805c-f050988162c4-client-ca\") pod \"4cab4f02-a4c4-40f7-805c-f050988162c4\" (UID: \"4cab4f02-a4c4-40f7-805c-f050988162c4\") " Oct 08 18:23:11 crc kubenswrapper[4750]: I1008 18:23:11.320011 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4cab4f02-a4c4-40f7-805c-f050988162c4-serving-cert\") pod \"4cab4f02-a4c4-40f7-805c-f050988162c4\" (UID: \"4cab4f02-a4c4-40f7-805c-f050988162c4\") " Oct 08 18:23:11 crc kubenswrapper[4750]: I1008 18:23:11.320220 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cab4f02-a4c4-40f7-805c-f050988162c4-config\") pod \"4cab4f02-a4c4-40f7-805c-f050988162c4\" (UID: \"4cab4f02-a4c4-40f7-805c-f050988162c4\") " Oct 08 18:23:11 crc kubenswrapper[4750]: I1008 18:23:11.320725 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a26b160-1c37-4b45-b185-4968e659cccb-serving-cert\") pod 
\"route-controller-manager-8684d5f8f9-qlnlq\" (UID: \"4a26b160-1c37-4b45-b185-4968e659cccb\") " pod="openshift-route-controller-manager/route-controller-manager-8684d5f8f9-qlnlq" Oct 08 18:23:11 crc kubenswrapper[4750]: I1008 18:23:11.320790 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a26b160-1c37-4b45-b185-4968e659cccb-config\") pod \"route-controller-manager-8684d5f8f9-qlnlq\" (UID: \"4a26b160-1c37-4b45-b185-4968e659cccb\") " pod="openshift-route-controller-manager/route-controller-manager-8684d5f8f9-qlnlq" Oct 08 18:23:11 crc kubenswrapper[4750]: I1008 18:23:11.320826 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt957\" (UniqueName: \"kubernetes.io/projected/4a26b160-1c37-4b45-b185-4968e659cccb-kube-api-access-zt957\") pod \"route-controller-manager-8684d5f8f9-qlnlq\" (UID: \"4a26b160-1c37-4b45-b185-4968e659cccb\") " pod="openshift-route-controller-manager/route-controller-manager-8684d5f8f9-qlnlq" Oct 08 18:23:11 crc kubenswrapper[4750]: I1008 18:23:11.320861 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4a26b160-1c37-4b45-b185-4968e659cccb-client-ca\") pod \"route-controller-manager-8684d5f8f9-qlnlq\" (UID: \"4a26b160-1c37-4b45-b185-4968e659cccb\") " pod="openshift-route-controller-manager/route-controller-manager-8684d5f8f9-qlnlq" Oct 08 18:23:11 crc kubenswrapper[4750]: I1008 18:23:11.322087 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cab4f02-a4c4-40f7-805c-f050988162c4-client-ca" (OuterVolumeSpecName: "client-ca") pod "4cab4f02-a4c4-40f7-805c-f050988162c4" (UID: "4cab4f02-a4c4-40f7-805c-f050988162c4"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:23:11 crc kubenswrapper[4750]: I1008 18:23:11.322395 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cab4f02-a4c4-40f7-805c-f050988162c4-config" (OuterVolumeSpecName: "config") pod "4cab4f02-a4c4-40f7-805c-f050988162c4" (UID: "4cab4f02-a4c4-40f7-805c-f050988162c4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:23:11 crc kubenswrapper[4750]: I1008 18:23:11.325545 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cab4f02-a4c4-40f7-805c-f050988162c4-kube-api-access-skcfp" (OuterVolumeSpecName: "kube-api-access-skcfp") pod "4cab4f02-a4c4-40f7-805c-f050988162c4" (UID: "4cab4f02-a4c4-40f7-805c-f050988162c4"). InnerVolumeSpecName "kube-api-access-skcfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:23:11 crc kubenswrapper[4750]: I1008 18:23:11.325630 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cab4f02-a4c4-40f7-805c-f050988162c4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4cab4f02-a4c4-40f7-805c-f050988162c4" (UID: "4cab4f02-a4c4-40f7-805c-f050988162c4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:23:11 crc kubenswrapper[4750]: I1008 18:23:11.421682 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a26b160-1c37-4b45-b185-4968e659cccb-serving-cert\") pod \"route-controller-manager-8684d5f8f9-qlnlq\" (UID: \"4a26b160-1c37-4b45-b185-4968e659cccb\") " pod="openshift-route-controller-manager/route-controller-manager-8684d5f8f9-qlnlq" Oct 08 18:23:11 crc kubenswrapper[4750]: I1008 18:23:11.421740 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a26b160-1c37-4b45-b185-4968e659cccb-config\") pod \"route-controller-manager-8684d5f8f9-qlnlq\" (UID: \"4a26b160-1c37-4b45-b185-4968e659cccb\") " pod="openshift-route-controller-manager/route-controller-manager-8684d5f8f9-qlnlq" Oct 08 18:23:11 crc kubenswrapper[4750]: I1008 18:23:11.421778 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zt957\" (UniqueName: \"kubernetes.io/projected/4a26b160-1c37-4b45-b185-4968e659cccb-kube-api-access-zt957\") pod \"route-controller-manager-8684d5f8f9-qlnlq\" (UID: \"4a26b160-1c37-4b45-b185-4968e659cccb\") " pod="openshift-route-controller-manager/route-controller-manager-8684d5f8f9-qlnlq" Oct 08 18:23:11 crc kubenswrapper[4750]: I1008 18:23:11.421805 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4a26b160-1c37-4b45-b185-4968e659cccb-client-ca\") pod \"route-controller-manager-8684d5f8f9-qlnlq\" (UID: \"4a26b160-1c37-4b45-b185-4968e659cccb\") " pod="openshift-route-controller-manager/route-controller-manager-8684d5f8f9-qlnlq" Oct 08 18:23:11 crc kubenswrapper[4750]: I1008 18:23:11.421838 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skcfp\" (UniqueName: 
\"kubernetes.io/projected/4cab4f02-a4c4-40f7-805c-f050988162c4-kube-api-access-skcfp\") on node \"crc\" DevicePath \"\"" Oct 08 18:23:11 crc kubenswrapper[4750]: I1008 18:23:11.421848 4750 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4cab4f02-a4c4-40f7-805c-f050988162c4-client-ca\") on node \"crc\" DevicePath \"\"" Oct 08 18:23:11 crc kubenswrapper[4750]: I1008 18:23:11.421858 4750 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4cab4f02-a4c4-40f7-805c-f050988162c4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 18:23:11 crc kubenswrapper[4750]: I1008 18:23:11.421867 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cab4f02-a4c4-40f7-805c-f050988162c4-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:23:11 crc kubenswrapper[4750]: I1008 18:23:11.423014 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4a26b160-1c37-4b45-b185-4968e659cccb-client-ca\") pod \"route-controller-manager-8684d5f8f9-qlnlq\" (UID: \"4a26b160-1c37-4b45-b185-4968e659cccb\") " pod="openshift-route-controller-manager/route-controller-manager-8684d5f8f9-qlnlq" Oct 08 18:23:11 crc kubenswrapper[4750]: I1008 18:23:11.424278 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a26b160-1c37-4b45-b185-4968e659cccb-config\") pod \"route-controller-manager-8684d5f8f9-qlnlq\" (UID: \"4a26b160-1c37-4b45-b185-4968e659cccb\") " pod="openshift-route-controller-manager/route-controller-manager-8684d5f8f9-qlnlq" Oct 08 18:23:11 crc kubenswrapper[4750]: I1008 18:23:11.425466 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a26b160-1c37-4b45-b185-4968e659cccb-serving-cert\") pod 
\"route-controller-manager-8684d5f8f9-qlnlq\" (UID: \"4a26b160-1c37-4b45-b185-4968e659cccb\") " pod="openshift-route-controller-manager/route-controller-manager-8684d5f8f9-qlnlq" Oct 08 18:23:11 crc kubenswrapper[4750]: I1008 18:23:11.440321 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt957\" (UniqueName: \"kubernetes.io/projected/4a26b160-1c37-4b45-b185-4968e659cccb-kube-api-access-zt957\") pod \"route-controller-manager-8684d5f8f9-qlnlq\" (UID: \"4a26b160-1c37-4b45-b185-4968e659cccb\") " pod="openshift-route-controller-manager/route-controller-manager-8684d5f8f9-qlnlq" Oct 08 18:23:11 crc kubenswrapper[4750]: I1008 18:23:11.561870 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8684d5f8f9-qlnlq" Oct 08 18:23:11 crc kubenswrapper[4750]: I1008 18:23:11.661869 4750 generic.go:334] "Generic (PLEG): container finished" podID="4cab4f02-a4c4-40f7-805c-f050988162c4" containerID="2acd7039166a9ac27556bdc996fc38c64d8a6989decb082eda0ac5729ea879fe" exitCode=0 Oct 08 18:23:11 crc kubenswrapper[4750]: I1008 18:23:11.662273 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-645b494dc4-s6hl4" Oct 08 18:23:11 crc kubenswrapper[4750]: I1008 18:23:11.662189 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-645b494dc4-s6hl4" event={"ID":"4cab4f02-a4c4-40f7-805c-f050988162c4","Type":"ContainerDied","Data":"2acd7039166a9ac27556bdc996fc38c64d8a6989decb082eda0ac5729ea879fe"} Oct 08 18:23:11 crc kubenswrapper[4750]: I1008 18:23:11.662342 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-645b494dc4-s6hl4" event={"ID":"4cab4f02-a4c4-40f7-805c-f050988162c4","Type":"ContainerDied","Data":"3a5f77cd411352c2995933294c9882a3b98d448c136ca2e6c4c84ec30bda9877"} Oct 08 18:23:11 crc kubenswrapper[4750]: I1008 18:23:11.662503 4750 scope.go:117] "RemoveContainer" containerID="2acd7039166a9ac27556bdc996fc38c64d8a6989decb082eda0ac5729ea879fe" Oct 08 18:23:11 crc kubenswrapper[4750]: I1008 18:23:11.682352 4750 scope.go:117] "RemoveContainer" containerID="2acd7039166a9ac27556bdc996fc38c64d8a6989decb082eda0ac5729ea879fe" Oct 08 18:23:11 crc kubenswrapper[4750]: E1008 18:23:11.683508 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2acd7039166a9ac27556bdc996fc38c64d8a6989decb082eda0ac5729ea879fe\": container with ID starting with 2acd7039166a9ac27556bdc996fc38c64d8a6989decb082eda0ac5729ea879fe not found: ID does not exist" containerID="2acd7039166a9ac27556bdc996fc38c64d8a6989decb082eda0ac5729ea879fe" Oct 08 18:23:11 crc kubenswrapper[4750]: I1008 18:23:11.683571 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2acd7039166a9ac27556bdc996fc38c64d8a6989decb082eda0ac5729ea879fe"} err="failed to get container status \"2acd7039166a9ac27556bdc996fc38c64d8a6989decb082eda0ac5729ea879fe\": rpc error: code = NotFound desc 
= could not find container \"2acd7039166a9ac27556bdc996fc38c64d8a6989decb082eda0ac5729ea879fe\": container with ID starting with 2acd7039166a9ac27556bdc996fc38c64d8a6989decb082eda0ac5729ea879fe not found: ID does not exist" Oct 08 18:23:11 crc kubenswrapper[4750]: I1008 18:23:11.693689 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-645b494dc4-s6hl4"] Oct 08 18:23:11 crc kubenswrapper[4750]: I1008 18:23:11.693734 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-645b494dc4-s6hl4"] Oct 08 18:23:12 crc kubenswrapper[4750]: I1008 18:23:12.006851 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8684d5f8f9-qlnlq"] Oct 08 18:23:12 crc kubenswrapper[4750]: W1008 18:23:12.016395 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a26b160_1c37_4b45_b185_4968e659cccb.slice/crio-4e69ad2901cd886f154f1d97cfd9a39275453b5a9ff6f264db639872752035ac WatchSource:0}: Error finding container 4e69ad2901cd886f154f1d97cfd9a39275453b5a9ff6f264db639872752035ac: Status 404 returned error can't find the container with id 4e69ad2901cd886f154f1d97cfd9a39275453b5a9ff6f264db639872752035ac Oct 08 18:23:12 crc kubenswrapper[4750]: I1008 18:23:12.669094 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8684d5f8f9-qlnlq" event={"ID":"4a26b160-1c37-4b45-b185-4968e659cccb","Type":"ContainerStarted","Data":"e97540314908a4a296722e60105e4df6e55d752bc080aa06af539aba2be3971d"} Oct 08 18:23:12 crc kubenswrapper[4750]: I1008 18:23:12.669446 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8684d5f8f9-qlnlq" 
event={"ID":"4a26b160-1c37-4b45-b185-4968e659cccb","Type":"ContainerStarted","Data":"4e69ad2901cd886f154f1d97cfd9a39275453b5a9ff6f264db639872752035ac"} Oct 08 18:23:12 crc kubenswrapper[4750]: I1008 18:23:12.669469 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-8684d5f8f9-qlnlq" Oct 08 18:23:12 crc kubenswrapper[4750]: I1008 18:23:12.673971 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-8684d5f8f9-qlnlq" Oct 08 18:23:12 crc kubenswrapper[4750]: I1008 18:23:12.688310 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-8684d5f8f9-qlnlq" podStartSLOduration=3.6882885549999997 podStartE2EDuration="3.688288555s" podCreationTimestamp="2025-10-08 18:23:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:23:12.684822407 +0000 UTC m=+748.597793430" watchObservedRunningTime="2025-10-08 18:23:12.688288555 +0000 UTC m=+748.601259578" Oct 08 18:23:12 crc kubenswrapper[4750]: I1008 18:23:12.743108 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cab4f02-a4c4-40f7-805c-f050988162c4" path="/var/lib/kubelet/pods/4cab4f02-a4c4-40f7-805c-f050988162c4/volumes" Oct 08 18:23:13 crc kubenswrapper[4750]: I1008 18:23:13.883184 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-zgswq" Oct 08 18:23:14 crc kubenswrapper[4750]: I1008 18:23:14.222184 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-766b49dfb7-zdlzb" Oct 08 18:23:14 crc kubenswrapper[4750]: I1008 18:23:14.222244 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-console/console-766b49dfb7-zdlzb" Oct 08 18:23:14 crc kubenswrapper[4750]: I1008 18:23:14.227124 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-766b49dfb7-zdlzb" Oct 08 18:23:14 crc kubenswrapper[4750]: I1008 18:23:14.704564 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-766b49dfb7-zdlzb" Oct 08 18:23:14 crc kubenswrapper[4750]: I1008 18:23:14.763435 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-wprnh"] Oct 08 18:23:18 crc kubenswrapper[4750]: I1008 18:23:18.033635 4750 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 08 18:23:24 crc kubenswrapper[4750]: I1008 18:23:24.435463 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-pzxbn" Oct 08 18:23:29 crc kubenswrapper[4750]: I1008 18:23:29.706872 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 18:23:29 crc kubenswrapper[4750]: I1008 18:23:29.707827 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 18:23:29 crc kubenswrapper[4750]: I1008 18:23:29.707901 4750 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-grddb" Oct 08 18:23:29 crc kubenswrapper[4750]: I1008 18:23:29.708515 4750 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4aa9c398c94477f10b8d76ef065deabe11c48fe9c89856d25a3a57b78914e105"} pod="openshift-machine-config-operator/machine-config-daemon-grddb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 18:23:29 crc kubenswrapper[4750]: I1008 18:23:29.708603 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" containerID="cri-o://4aa9c398c94477f10b8d76ef065deabe11c48fe9c89856d25a3a57b78914e105" gracePeriod=600 Oct 08 18:23:30 crc kubenswrapper[4750]: I1008 18:23:30.780180 4750 generic.go:334] "Generic (PLEG): container finished" podID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerID="4aa9c398c94477f10b8d76ef065deabe11c48fe9c89856d25a3a57b78914e105" exitCode=0 Oct 08 18:23:30 crc kubenswrapper[4750]: I1008 18:23:30.780238 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerDied","Data":"4aa9c398c94477f10b8d76ef065deabe11c48fe9c89856d25a3a57b78914e105"} Oct 08 18:23:30 crc kubenswrapper[4750]: I1008 18:23:30.780753 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerStarted","Data":"dee2d35a9b3eba166b103d6a720e7cbf72b0876a67fbdc37629a8900d4d09d57"} Oct 08 18:23:30 crc kubenswrapper[4750]: I1008 18:23:30.780773 4750 scope.go:117] "RemoveContainer" containerID="2b873e94d0aa696b6eab6a4a655b8fb4893434625aed946509deb5ffaef26cc6" Oct 08 18:23:36 crc kubenswrapper[4750]: I1008 18:23:36.647948 4750 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zsm94"] Oct 08 18:23:36 crc kubenswrapper[4750]: I1008 18:23:36.650270 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zsm94" Oct 08 18:23:36 crc kubenswrapper[4750]: I1008 18:23:36.652470 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 08 18:23:36 crc kubenswrapper[4750]: I1008 18:23:36.662149 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zsm94"] Oct 08 18:23:36 crc kubenswrapper[4750]: I1008 18:23:36.769442 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt4nc\" (UniqueName: \"kubernetes.io/projected/99f391ca-4dc5-402b-ab7c-916433ee0b9e-kube-api-access-jt4nc\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zsm94\" (UID: \"99f391ca-4dc5-402b-ab7c-916433ee0b9e\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zsm94" Oct 08 18:23:36 crc kubenswrapper[4750]: I1008 18:23:36.769515 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99f391ca-4dc5-402b-ab7c-916433ee0b9e-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zsm94\" (UID: \"99f391ca-4dc5-402b-ab7c-916433ee0b9e\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zsm94" Oct 08 18:23:36 crc kubenswrapper[4750]: I1008 18:23:36.769565 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99f391ca-4dc5-402b-ab7c-916433ee0b9e-util\") pod 
\"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zsm94\" (UID: \"99f391ca-4dc5-402b-ab7c-916433ee0b9e\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zsm94" Oct 08 18:23:36 crc kubenswrapper[4750]: I1008 18:23:36.871261 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99f391ca-4dc5-402b-ab7c-916433ee0b9e-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zsm94\" (UID: \"99f391ca-4dc5-402b-ab7c-916433ee0b9e\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zsm94" Oct 08 18:23:36 crc kubenswrapper[4750]: I1008 18:23:36.871345 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99f391ca-4dc5-402b-ab7c-916433ee0b9e-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zsm94\" (UID: \"99f391ca-4dc5-402b-ab7c-916433ee0b9e\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zsm94" Oct 08 18:23:36 crc kubenswrapper[4750]: I1008 18:23:36.871406 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt4nc\" (UniqueName: \"kubernetes.io/projected/99f391ca-4dc5-402b-ab7c-916433ee0b9e-kube-api-access-jt4nc\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zsm94\" (UID: \"99f391ca-4dc5-402b-ab7c-916433ee0b9e\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zsm94" Oct 08 18:23:36 crc kubenswrapper[4750]: I1008 18:23:36.871765 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99f391ca-4dc5-402b-ab7c-916433ee0b9e-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zsm94\" (UID: \"99f391ca-4dc5-402b-ab7c-916433ee0b9e\") " 
pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zsm94" Oct 08 18:23:36 crc kubenswrapper[4750]: I1008 18:23:36.871918 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99f391ca-4dc5-402b-ab7c-916433ee0b9e-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zsm94\" (UID: \"99f391ca-4dc5-402b-ab7c-916433ee0b9e\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zsm94" Oct 08 18:23:36 crc kubenswrapper[4750]: I1008 18:23:36.896881 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt4nc\" (UniqueName: \"kubernetes.io/projected/99f391ca-4dc5-402b-ab7c-916433ee0b9e-kube-api-access-jt4nc\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zsm94\" (UID: \"99f391ca-4dc5-402b-ab7c-916433ee0b9e\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zsm94" Oct 08 18:23:36 crc kubenswrapper[4750]: I1008 18:23:36.971272 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zsm94" Oct 08 18:23:37 crc kubenswrapper[4750]: I1008 18:23:37.398447 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zsm94"] Oct 08 18:23:37 crc kubenswrapper[4750]: I1008 18:23:37.828962 4750 generic.go:334] "Generic (PLEG): container finished" podID="99f391ca-4dc5-402b-ab7c-916433ee0b9e" containerID="6ca2940093291b9e4c964c2419a835f69b6bb7da43ee30416f4148886f41ff34" exitCode=0 Oct 08 18:23:37 crc kubenswrapper[4750]: I1008 18:23:37.829013 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zsm94" event={"ID":"99f391ca-4dc5-402b-ab7c-916433ee0b9e","Type":"ContainerDied","Data":"6ca2940093291b9e4c964c2419a835f69b6bb7da43ee30416f4148886f41ff34"} Oct 08 18:23:37 crc kubenswrapper[4750]: I1008 18:23:37.829053 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zsm94" event={"ID":"99f391ca-4dc5-402b-ab7c-916433ee0b9e","Type":"ContainerStarted","Data":"f2703a68809d141bb41a19da0ecb6c5cb34470c38d5a8abd15028992aa822287"} Oct 08 18:23:39 crc kubenswrapper[4750]: I1008 18:23:39.007670 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-q8hbp"] Oct 08 18:23:39 crc kubenswrapper[4750]: I1008 18:23:39.009317 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-q8hbp" Oct 08 18:23:39 crc kubenswrapper[4750]: I1008 18:23:39.023005 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q8hbp"] Oct 08 18:23:39 crc kubenswrapper[4750]: I1008 18:23:39.102986 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52bce191-16d3-43cc-929f-dde01687a4cd-utilities\") pod \"redhat-operators-q8hbp\" (UID: \"52bce191-16d3-43cc-929f-dde01687a4cd\") " pod="openshift-marketplace/redhat-operators-q8hbp" Oct 08 18:23:39 crc kubenswrapper[4750]: I1008 18:23:39.103044 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52bce191-16d3-43cc-929f-dde01687a4cd-catalog-content\") pod \"redhat-operators-q8hbp\" (UID: \"52bce191-16d3-43cc-929f-dde01687a4cd\") " pod="openshift-marketplace/redhat-operators-q8hbp" Oct 08 18:23:39 crc kubenswrapper[4750]: I1008 18:23:39.103097 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2wxx\" (UniqueName: \"kubernetes.io/projected/52bce191-16d3-43cc-929f-dde01687a4cd-kube-api-access-x2wxx\") pod \"redhat-operators-q8hbp\" (UID: \"52bce191-16d3-43cc-929f-dde01687a4cd\") " pod="openshift-marketplace/redhat-operators-q8hbp" Oct 08 18:23:39 crc kubenswrapper[4750]: I1008 18:23:39.204576 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2wxx\" (UniqueName: \"kubernetes.io/projected/52bce191-16d3-43cc-929f-dde01687a4cd-kube-api-access-x2wxx\") pod \"redhat-operators-q8hbp\" (UID: \"52bce191-16d3-43cc-929f-dde01687a4cd\") " pod="openshift-marketplace/redhat-operators-q8hbp" Oct 08 18:23:39 crc kubenswrapper[4750]: I1008 18:23:39.204644 4750 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52bce191-16d3-43cc-929f-dde01687a4cd-utilities\") pod \"redhat-operators-q8hbp\" (UID: \"52bce191-16d3-43cc-929f-dde01687a4cd\") " pod="openshift-marketplace/redhat-operators-q8hbp" Oct 08 18:23:39 crc kubenswrapper[4750]: I1008 18:23:39.204673 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52bce191-16d3-43cc-929f-dde01687a4cd-catalog-content\") pod \"redhat-operators-q8hbp\" (UID: \"52bce191-16d3-43cc-929f-dde01687a4cd\") " pod="openshift-marketplace/redhat-operators-q8hbp" Oct 08 18:23:39 crc kubenswrapper[4750]: I1008 18:23:39.205203 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52bce191-16d3-43cc-929f-dde01687a4cd-catalog-content\") pod \"redhat-operators-q8hbp\" (UID: \"52bce191-16d3-43cc-929f-dde01687a4cd\") " pod="openshift-marketplace/redhat-operators-q8hbp" Oct 08 18:23:39 crc kubenswrapper[4750]: I1008 18:23:39.205206 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52bce191-16d3-43cc-929f-dde01687a4cd-utilities\") pod \"redhat-operators-q8hbp\" (UID: \"52bce191-16d3-43cc-929f-dde01687a4cd\") " pod="openshift-marketplace/redhat-operators-q8hbp" Oct 08 18:23:39 crc kubenswrapper[4750]: I1008 18:23:39.224416 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2wxx\" (UniqueName: \"kubernetes.io/projected/52bce191-16d3-43cc-929f-dde01687a4cd-kube-api-access-x2wxx\") pod \"redhat-operators-q8hbp\" (UID: \"52bce191-16d3-43cc-929f-dde01687a4cd\") " pod="openshift-marketplace/redhat-operators-q8hbp" Oct 08 18:23:39 crc kubenswrapper[4750]: I1008 18:23:39.342628 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-q8hbp" Oct 08 18:23:39 crc kubenswrapper[4750]: I1008 18:23:39.759402 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q8hbp"] Oct 08 18:23:39 crc kubenswrapper[4750]: W1008 18:23:39.769203 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52bce191_16d3_43cc_929f_dde01687a4cd.slice/crio-d4dbc606560dc5380c6092310d93f587bb180aea8caa419dbc426dd7323c7604 WatchSource:0}: Error finding container d4dbc606560dc5380c6092310d93f587bb180aea8caa419dbc426dd7323c7604: Status 404 returned error can't find the container with id d4dbc606560dc5380c6092310d93f587bb180aea8caa419dbc426dd7323c7604 Oct 08 18:23:39 crc kubenswrapper[4750]: I1008 18:23:39.806132 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-wprnh" podUID="7c3552dc-a0cf-4072-91e1-030803f6014d" containerName="console" containerID="cri-o://1e7135e57dec73c13124fe8beeb48ff648f3b79424fb4b45bbb1a54dc2efe3ef" gracePeriod=15 Oct 08 18:23:39 crc kubenswrapper[4750]: I1008 18:23:39.840408 4750 generic.go:334] "Generic (PLEG): container finished" podID="99f391ca-4dc5-402b-ab7c-916433ee0b9e" containerID="71da952c7bdbd82d6358bc6c69761b7911895886074172fcc7b16abf11770942" exitCode=0 Oct 08 18:23:39 crc kubenswrapper[4750]: I1008 18:23:39.840484 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zsm94" event={"ID":"99f391ca-4dc5-402b-ab7c-916433ee0b9e","Type":"ContainerDied","Data":"71da952c7bdbd82d6358bc6c69761b7911895886074172fcc7b16abf11770942"} Oct 08 18:23:39 crc kubenswrapper[4750]: I1008 18:23:39.841482 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q8hbp" 
event={"ID":"52bce191-16d3-43cc-929f-dde01687a4cd","Type":"ContainerStarted","Data":"d4dbc606560dc5380c6092310d93f587bb180aea8caa419dbc426dd7323c7604"} Oct 08 18:23:40 crc kubenswrapper[4750]: I1008 18:23:40.269083 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-wprnh_7c3552dc-a0cf-4072-91e1-030803f6014d/console/0.log" Oct 08 18:23:40 crc kubenswrapper[4750]: I1008 18:23:40.269404 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-wprnh" Oct 08 18:23:40 crc kubenswrapper[4750]: I1008 18:23:40.316142 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c3552dc-a0cf-4072-91e1-030803f6014d-console-serving-cert\") pod \"7c3552dc-a0cf-4072-91e1-030803f6014d\" (UID: \"7c3552dc-a0cf-4072-91e1-030803f6014d\") " Oct 08 18:23:40 crc kubenswrapper[4750]: I1008 18:23:40.316217 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcm5r\" (UniqueName: \"kubernetes.io/projected/7c3552dc-a0cf-4072-91e1-030803f6014d-kube-api-access-tcm5r\") pod \"7c3552dc-a0cf-4072-91e1-030803f6014d\" (UID: \"7c3552dc-a0cf-4072-91e1-030803f6014d\") " Oct 08 18:23:40 crc kubenswrapper[4750]: I1008 18:23:40.316282 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c3552dc-a0cf-4072-91e1-030803f6014d-trusted-ca-bundle\") pod \"7c3552dc-a0cf-4072-91e1-030803f6014d\" (UID: \"7c3552dc-a0cf-4072-91e1-030803f6014d\") " Oct 08 18:23:40 crc kubenswrapper[4750]: I1008 18:23:40.316311 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7c3552dc-a0cf-4072-91e1-030803f6014d-oauth-serving-cert\") pod \"7c3552dc-a0cf-4072-91e1-030803f6014d\" (UID: 
\"7c3552dc-a0cf-4072-91e1-030803f6014d\") " Oct 08 18:23:40 crc kubenswrapper[4750]: I1008 18:23:40.316343 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7c3552dc-a0cf-4072-91e1-030803f6014d-service-ca\") pod \"7c3552dc-a0cf-4072-91e1-030803f6014d\" (UID: \"7c3552dc-a0cf-4072-91e1-030803f6014d\") " Oct 08 18:23:40 crc kubenswrapper[4750]: I1008 18:23:40.316465 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7c3552dc-a0cf-4072-91e1-030803f6014d-console-config\") pod \"7c3552dc-a0cf-4072-91e1-030803f6014d\" (UID: \"7c3552dc-a0cf-4072-91e1-030803f6014d\") " Oct 08 18:23:40 crc kubenswrapper[4750]: I1008 18:23:40.316491 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7c3552dc-a0cf-4072-91e1-030803f6014d-console-oauth-config\") pod \"7c3552dc-a0cf-4072-91e1-030803f6014d\" (UID: \"7c3552dc-a0cf-4072-91e1-030803f6014d\") " Oct 08 18:23:40 crc kubenswrapper[4750]: I1008 18:23:40.317709 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c3552dc-a0cf-4072-91e1-030803f6014d-service-ca" (OuterVolumeSpecName: "service-ca") pod "7c3552dc-a0cf-4072-91e1-030803f6014d" (UID: "7c3552dc-a0cf-4072-91e1-030803f6014d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:23:40 crc kubenswrapper[4750]: I1008 18:23:40.317733 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c3552dc-a0cf-4072-91e1-030803f6014d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "7c3552dc-a0cf-4072-91e1-030803f6014d" (UID: "7c3552dc-a0cf-4072-91e1-030803f6014d"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:23:40 crc kubenswrapper[4750]: I1008 18:23:40.317908 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c3552dc-a0cf-4072-91e1-030803f6014d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "7c3552dc-a0cf-4072-91e1-030803f6014d" (UID: "7c3552dc-a0cf-4072-91e1-030803f6014d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:23:40 crc kubenswrapper[4750]: I1008 18:23:40.318103 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c3552dc-a0cf-4072-91e1-030803f6014d-console-config" (OuterVolumeSpecName: "console-config") pod "7c3552dc-a0cf-4072-91e1-030803f6014d" (UID: "7c3552dc-a0cf-4072-91e1-030803f6014d"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:23:40 crc kubenswrapper[4750]: I1008 18:23:40.322396 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c3552dc-a0cf-4072-91e1-030803f6014d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "7c3552dc-a0cf-4072-91e1-030803f6014d" (UID: "7c3552dc-a0cf-4072-91e1-030803f6014d"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:23:40 crc kubenswrapper[4750]: I1008 18:23:40.322506 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c3552dc-a0cf-4072-91e1-030803f6014d-kube-api-access-tcm5r" (OuterVolumeSpecName: "kube-api-access-tcm5r") pod "7c3552dc-a0cf-4072-91e1-030803f6014d" (UID: "7c3552dc-a0cf-4072-91e1-030803f6014d"). InnerVolumeSpecName "kube-api-access-tcm5r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:23:40 crc kubenswrapper[4750]: I1008 18:23:40.322797 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c3552dc-a0cf-4072-91e1-030803f6014d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "7c3552dc-a0cf-4072-91e1-030803f6014d" (UID: "7c3552dc-a0cf-4072-91e1-030803f6014d"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:23:40 crc kubenswrapper[4750]: I1008 18:23:40.418372 4750 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c3552dc-a0cf-4072-91e1-030803f6014d-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:23:40 crc kubenswrapper[4750]: I1008 18:23:40.418452 4750 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7c3552dc-a0cf-4072-91e1-030803f6014d-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 18:23:40 crc kubenswrapper[4750]: I1008 18:23:40.418471 4750 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7c3552dc-a0cf-4072-91e1-030803f6014d-service-ca\") on node \"crc\" DevicePath \"\"" Oct 08 18:23:40 crc kubenswrapper[4750]: I1008 18:23:40.418487 4750 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7c3552dc-a0cf-4072-91e1-030803f6014d-console-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:23:40 crc kubenswrapper[4750]: I1008 18:23:40.418506 4750 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7c3552dc-a0cf-4072-91e1-030803f6014d-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:23:40 crc kubenswrapper[4750]: I1008 18:23:40.418524 4750 reconciler_common.go:293] "Volume detached for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c3552dc-a0cf-4072-91e1-030803f6014d-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 18:23:40 crc kubenswrapper[4750]: I1008 18:23:40.418617 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcm5r\" (UniqueName: \"kubernetes.io/projected/7c3552dc-a0cf-4072-91e1-030803f6014d-kube-api-access-tcm5r\") on node \"crc\" DevicePath \"\"" Oct 08 18:23:40 crc kubenswrapper[4750]: I1008 18:23:40.846899 4750 generic.go:334] "Generic (PLEG): container finished" podID="52bce191-16d3-43cc-929f-dde01687a4cd" containerID="007a522c2d1ab3750a5a108fd5f49f9df0088dbc305d628eea0397a72afde605" exitCode=0 Oct 08 18:23:40 crc kubenswrapper[4750]: I1008 18:23:40.846963 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q8hbp" event={"ID":"52bce191-16d3-43cc-929f-dde01687a4cd","Type":"ContainerDied","Data":"007a522c2d1ab3750a5a108fd5f49f9df0088dbc305d628eea0397a72afde605"} Oct 08 18:23:40 crc kubenswrapper[4750]: I1008 18:23:40.851756 4750 generic.go:334] "Generic (PLEG): container finished" podID="99f391ca-4dc5-402b-ab7c-916433ee0b9e" containerID="2795e5e8043c07214f232177bd29723ad8f6fe1621076d2e8e86c8c68ca6bece" exitCode=0 Oct 08 18:23:40 crc kubenswrapper[4750]: I1008 18:23:40.851815 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zsm94" event={"ID":"99f391ca-4dc5-402b-ab7c-916433ee0b9e","Type":"ContainerDied","Data":"2795e5e8043c07214f232177bd29723ad8f6fe1621076d2e8e86c8c68ca6bece"} Oct 08 18:23:40 crc kubenswrapper[4750]: I1008 18:23:40.853181 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-wprnh_7c3552dc-a0cf-4072-91e1-030803f6014d/console/0.log" Oct 08 18:23:40 crc kubenswrapper[4750]: I1008 18:23:40.853234 4750 generic.go:334] "Generic (PLEG): container finished" 
podID="7c3552dc-a0cf-4072-91e1-030803f6014d" containerID="1e7135e57dec73c13124fe8beeb48ff648f3b79424fb4b45bbb1a54dc2efe3ef" exitCode=2 Oct 08 18:23:40 crc kubenswrapper[4750]: I1008 18:23:40.853264 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-wprnh" event={"ID":"7c3552dc-a0cf-4072-91e1-030803f6014d","Type":"ContainerDied","Data":"1e7135e57dec73c13124fe8beeb48ff648f3b79424fb4b45bbb1a54dc2efe3ef"} Oct 08 18:23:40 crc kubenswrapper[4750]: I1008 18:23:40.853294 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-wprnh" event={"ID":"7c3552dc-a0cf-4072-91e1-030803f6014d","Type":"ContainerDied","Data":"bd034864bd4459958435f81400ed7f11faeb47aae5ce740cd76749145fac5d1c"} Oct 08 18:23:40 crc kubenswrapper[4750]: I1008 18:23:40.853266 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-wprnh" Oct 08 18:23:40 crc kubenswrapper[4750]: I1008 18:23:40.853314 4750 scope.go:117] "RemoveContainer" containerID="1e7135e57dec73c13124fe8beeb48ff648f3b79424fb4b45bbb1a54dc2efe3ef" Oct 08 18:23:40 crc kubenswrapper[4750]: I1008 18:23:40.873384 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-wprnh"] Oct 08 18:23:40 crc kubenswrapper[4750]: I1008 18:23:40.876327 4750 scope.go:117] "RemoveContainer" containerID="1e7135e57dec73c13124fe8beeb48ff648f3b79424fb4b45bbb1a54dc2efe3ef" Oct 08 18:23:40 crc kubenswrapper[4750]: E1008 18:23:40.876775 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e7135e57dec73c13124fe8beeb48ff648f3b79424fb4b45bbb1a54dc2efe3ef\": container with ID starting with 1e7135e57dec73c13124fe8beeb48ff648f3b79424fb4b45bbb1a54dc2efe3ef not found: ID does not exist" containerID="1e7135e57dec73c13124fe8beeb48ff648f3b79424fb4b45bbb1a54dc2efe3ef" Oct 08 18:23:40 crc kubenswrapper[4750]: I1008 
18:23:40.876806 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e7135e57dec73c13124fe8beeb48ff648f3b79424fb4b45bbb1a54dc2efe3ef"} err="failed to get container status \"1e7135e57dec73c13124fe8beeb48ff648f3b79424fb4b45bbb1a54dc2efe3ef\": rpc error: code = NotFound desc = could not find container \"1e7135e57dec73c13124fe8beeb48ff648f3b79424fb4b45bbb1a54dc2efe3ef\": container with ID starting with 1e7135e57dec73c13124fe8beeb48ff648f3b79424fb4b45bbb1a54dc2efe3ef not found: ID does not exist" Oct 08 18:23:40 crc kubenswrapper[4750]: I1008 18:23:40.877386 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-wprnh"] Oct 08 18:23:42 crc kubenswrapper[4750]: I1008 18:23:42.448047 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zsm94" Oct 08 18:23:42 crc kubenswrapper[4750]: I1008 18:23:42.544319 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jt4nc\" (UniqueName: \"kubernetes.io/projected/99f391ca-4dc5-402b-ab7c-916433ee0b9e-kube-api-access-jt4nc\") pod \"99f391ca-4dc5-402b-ab7c-916433ee0b9e\" (UID: \"99f391ca-4dc5-402b-ab7c-916433ee0b9e\") " Oct 08 18:23:42 crc kubenswrapper[4750]: I1008 18:23:42.544523 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99f391ca-4dc5-402b-ab7c-916433ee0b9e-bundle\") pod \"99f391ca-4dc5-402b-ab7c-916433ee0b9e\" (UID: \"99f391ca-4dc5-402b-ab7c-916433ee0b9e\") " Oct 08 18:23:42 crc kubenswrapper[4750]: I1008 18:23:42.544601 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99f391ca-4dc5-402b-ab7c-916433ee0b9e-util\") pod \"99f391ca-4dc5-402b-ab7c-916433ee0b9e\" (UID: \"99f391ca-4dc5-402b-ab7c-916433ee0b9e\") " Oct 08 
18:23:42 crc kubenswrapper[4750]: I1008 18:23:42.545823 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99f391ca-4dc5-402b-ab7c-916433ee0b9e-bundle" (OuterVolumeSpecName: "bundle") pod "99f391ca-4dc5-402b-ab7c-916433ee0b9e" (UID: "99f391ca-4dc5-402b-ab7c-916433ee0b9e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:23:42 crc kubenswrapper[4750]: I1008 18:23:42.551575 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99f391ca-4dc5-402b-ab7c-916433ee0b9e-kube-api-access-jt4nc" (OuterVolumeSpecName: "kube-api-access-jt4nc") pod "99f391ca-4dc5-402b-ab7c-916433ee0b9e" (UID: "99f391ca-4dc5-402b-ab7c-916433ee0b9e"). InnerVolumeSpecName "kube-api-access-jt4nc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:23:42 crc kubenswrapper[4750]: I1008 18:23:42.576030 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99f391ca-4dc5-402b-ab7c-916433ee0b9e-util" (OuterVolumeSpecName: "util") pod "99f391ca-4dc5-402b-ab7c-916433ee0b9e" (UID: "99f391ca-4dc5-402b-ab7c-916433ee0b9e"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:23:42 crc kubenswrapper[4750]: I1008 18:23:42.646475 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jt4nc\" (UniqueName: \"kubernetes.io/projected/99f391ca-4dc5-402b-ab7c-916433ee0b9e-kube-api-access-jt4nc\") on node \"crc\" DevicePath \"\"" Oct 08 18:23:42 crc kubenswrapper[4750]: I1008 18:23:42.646510 4750 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99f391ca-4dc5-402b-ab7c-916433ee0b9e-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:23:42 crc kubenswrapper[4750]: I1008 18:23:42.646522 4750 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99f391ca-4dc5-402b-ab7c-916433ee0b9e-util\") on node \"crc\" DevicePath \"\"" Oct 08 18:23:42 crc kubenswrapper[4750]: I1008 18:23:42.748045 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c3552dc-a0cf-4072-91e1-030803f6014d" path="/var/lib/kubelet/pods/7c3552dc-a0cf-4072-91e1-030803f6014d/volumes" Oct 08 18:23:42 crc kubenswrapper[4750]: I1008 18:23:42.869112 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zsm94" Oct 08 18:23:42 crc kubenswrapper[4750]: I1008 18:23:42.869109 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zsm94" event={"ID":"99f391ca-4dc5-402b-ab7c-916433ee0b9e","Type":"ContainerDied","Data":"f2703a68809d141bb41a19da0ecb6c5cb34470c38d5a8abd15028992aa822287"} Oct 08 18:23:42 crc kubenswrapper[4750]: I1008 18:23:42.869255 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2703a68809d141bb41a19da0ecb6c5cb34470c38d5a8abd15028992aa822287" Oct 08 18:23:42 crc kubenswrapper[4750]: I1008 18:23:42.872706 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q8hbp" event={"ID":"52bce191-16d3-43cc-929f-dde01687a4cd","Type":"ContainerStarted","Data":"da5691cd2a5ff7dbd9ae39b81d2ddfb45cb4335ab6df71a1c232bf616f7f5316"} Oct 08 18:23:43 crc kubenswrapper[4750]: I1008 18:23:43.881492 4750 generic.go:334] "Generic (PLEG): container finished" podID="52bce191-16d3-43cc-929f-dde01687a4cd" containerID="da5691cd2a5ff7dbd9ae39b81d2ddfb45cb4335ab6df71a1c232bf616f7f5316" exitCode=0 Oct 08 18:23:43 crc kubenswrapper[4750]: I1008 18:23:43.881707 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q8hbp" event={"ID":"52bce191-16d3-43cc-929f-dde01687a4cd","Type":"ContainerDied","Data":"da5691cd2a5ff7dbd9ae39b81d2ddfb45cb4335ab6df71a1c232bf616f7f5316"} Oct 08 18:23:44 crc kubenswrapper[4750]: I1008 18:23:44.889061 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q8hbp" event={"ID":"52bce191-16d3-43cc-929f-dde01687a4cd","Type":"ContainerStarted","Data":"2cdc2a2abcfb5717eee4984b4bdc77ab7529185f35880fcdb27b91d750dd43a9"} Oct 08 18:23:44 crc kubenswrapper[4750]: I1008 18:23:44.906908 4750 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-q8hbp" podStartSLOduration=3.42516181 podStartE2EDuration="6.906893512s" podCreationTimestamp="2025-10-08 18:23:38 +0000 UTC" firstStartedPulling="2025-10-08 18:23:40.848252333 +0000 UTC m=+776.761223346" lastFinishedPulling="2025-10-08 18:23:44.329983995 +0000 UTC m=+780.242955048" observedRunningTime="2025-10-08 18:23:44.906054521 +0000 UTC m=+780.819025534" watchObservedRunningTime="2025-10-08 18:23:44.906893512 +0000 UTC m=+780.819864525" Oct 08 18:23:49 crc kubenswrapper[4750]: I1008 18:23:49.343560 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-q8hbp" Oct 08 18:23:49 crc kubenswrapper[4750]: I1008 18:23:49.344047 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-q8hbp" Oct 08 18:23:49 crc kubenswrapper[4750]: I1008 18:23:49.388618 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-q8hbp" Oct 08 18:23:49 crc kubenswrapper[4750]: I1008 18:23:49.965932 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-q8hbp" Oct 08 18:23:50 crc kubenswrapper[4750]: I1008 18:23:50.796247 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q8hbp"] Oct 08 18:23:51 crc kubenswrapper[4750]: I1008 18:23:51.928838 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-q8hbp" podUID="52bce191-16d3-43cc-929f-dde01687a4cd" containerName="registry-server" containerID="cri-o://2cdc2a2abcfb5717eee4984b4bdc77ab7529185f35880fcdb27b91d750dd43a9" gracePeriod=2 Oct 08 18:23:52 crc kubenswrapper[4750]: I1008 18:23:52.780925 4750 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["metallb-system/metallb-operator-controller-manager-7767688d85-nqdwk"] Oct 08 18:23:52 crc kubenswrapper[4750]: E1008 18:23:52.782435 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99f391ca-4dc5-402b-ab7c-916433ee0b9e" containerName="extract" Oct 08 18:23:52 crc kubenswrapper[4750]: I1008 18:23:52.782454 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="99f391ca-4dc5-402b-ab7c-916433ee0b9e" containerName="extract" Oct 08 18:23:52 crc kubenswrapper[4750]: E1008 18:23:52.782485 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99f391ca-4dc5-402b-ab7c-916433ee0b9e" containerName="util" Oct 08 18:23:52 crc kubenswrapper[4750]: I1008 18:23:52.782492 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="99f391ca-4dc5-402b-ab7c-916433ee0b9e" containerName="util" Oct 08 18:23:52 crc kubenswrapper[4750]: E1008 18:23:52.782510 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99f391ca-4dc5-402b-ab7c-916433ee0b9e" containerName="pull" Oct 08 18:23:52 crc kubenswrapper[4750]: I1008 18:23:52.782516 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="99f391ca-4dc5-402b-ab7c-916433ee0b9e" containerName="pull" Oct 08 18:23:52 crc kubenswrapper[4750]: E1008 18:23:52.782523 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c3552dc-a0cf-4072-91e1-030803f6014d" containerName="console" Oct 08 18:23:52 crc kubenswrapper[4750]: I1008 18:23:52.782529 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c3552dc-a0cf-4072-91e1-030803f6014d" containerName="console" Oct 08 18:23:52 crc kubenswrapper[4750]: I1008 18:23:52.782724 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c3552dc-a0cf-4072-91e1-030803f6014d" containerName="console" Oct 08 18:23:52 crc kubenswrapper[4750]: I1008 18:23:52.782751 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="99f391ca-4dc5-402b-ab7c-916433ee0b9e" containerName="extract" Oct 08 18:23:52 crc 
kubenswrapper[4750]: I1008 18:23:52.786412 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7767688d85-nqdwk" Oct 08 18:23:52 crc kubenswrapper[4750]: I1008 18:23:52.800843 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 08 18:23:52 crc kubenswrapper[4750]: I1008 18:23:52.800927 4750 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 08 18:23:52 crc kubenswrapper[4750]: I1008 18:23:52.801117 4750 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-pptzf" Oct 08 18:23:52 crc kubenswrapper[4750]: I1008 18:23:52.802181 4750 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 08 18:23:52 crc kubenswrapper[4750]: I1008 18:23:52.802379 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 08 18:23:52 crc kubenswrapper[4750]: I1008 18:23:52.811155 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7767688d85-nqdwk"] Oct 08 18:23:52 crc kubenswrapper[4750]: I1008 18:23:52.870786 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9b578cc7-f656-4f3c-ae80-a54d325a597e-apiservice-cert\") pod \"metallb-operator-controller-manager-7767688d85-nqdwk\" (UID: \"9b578cc7-f656-4f3c-ae80-a54d325a597e\") " pod="metallb-system/metallb-operator-controller-manager-7767688d85-nqdwk" Oct 08 18:23:52 crc kubenswrapper[4750]: I1008 18:23:52.870878 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/9b578cc7-f656-4f3c-ae80-a54d325a597e-webhook-cert\") pod \"metallb-operator-controller-manager-7767688d85-nqdwk\" (UID: \"9b578cc7-f656-4f3c-ae80-a54d325a597e\") " pod="metallb-system/metallb-operator-controller-manager-7767688d85-nqdwk" Oct 08 18:23:52 crc kubenswrapper[4750]: I1008 18:23:52.870918 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmkqx\" (UniqueName: \"kubernetes.io/projected/9b578cc7-f656-4f3c-ae80-a54d325a597e-kube-api-access-pmkqx\") pod \"metallb-operator-controller-manager-7767688d85-nqdwk\" (UID: \"9b578cc7-f656-4f3c-ae80-a54d325a597e\") " pod="metallb-system/metallb-operator-controller-manager-7767688d85-nqdwk" Oct 08 18:23:52 crc kubenswrapper[4750]: I1008 18:23:52.910971 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q8hbp" Oct 08 18:23:52 crc kubenswrapper[4750]: I1008 18:23:52.935562 4750 generic.go:334] "Generic (PLEG): container finished" podID="52bce191-16d3-43cc-929f-dde01687a4cd" containerID="2cdc2a2abcfb5717eee4984b4bdc77ab7529185f35880fcdb27b91d750dd43a9" exitCode=0 Oct 08 18:23:52 crc kubenswrapper[4750]: I1008 18:23:52.935616 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q8hbp" event={"ID":"52bce191-16d3-43cc-929f-dde01687a4cd","Type":"ContainerDied","Data":"2cdc2a2abcfb5717eee4984b4bdc77ab7529185f35880fcdb27b91d750dd43a9"} Oct 08 18:23:52 crc kubenswrapper[4750]: I1008 18:23:52.935657 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-q8hbp" Oct 08 18:23:52 crc kubenswrapper[4750]: I1008 18:23:52.935679 4750 scope.go:117] "RemoveContainer" containerID="2cdc2a2abcfb5717eee4984b4bdc77ab7529185f35880fcdb27b91d750dd43a9" Oct 08 18:23:52 crc kubenswrapper[4750]: I1008 18:23:52.935663 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q8hbp" event={"ID":"52bce191-16d3-43cc-929f-dde01687a4cd","Type":"ContainerDied","Data":"d4dbc606560dc5380c6092310d93f587bb180aea8caa419dbc426dd7323c7604"} Oct 08 18:23:52 crc kubenswrapper[4750]: I1008 18:23:52.950686 4750 scope.go:117] "RemoveContainer" containerID="da5691cd2a5ff7dbd9ae39b81d2ddfb45cb4335ab6df71a1c232bf616f7f5316" Oct 08 18:23:52 crc kubenswrapper[4750]: I1008 18:23:52.967842 4750 scope.go:117] "RemoveContainer" containerID="007a522c2d1ab3750a5a108fd5f49f9df0088dbc305d628eea0397a72afde605" Oct 08 18:23:52 crc kubenswrapper[4750]: I1008 18:23:52.972192 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52bce191-16d3-43cc-929f-dde01687a4cd-utilities\") pod \"52bce191-16d3-43cc-929f-dde01687a4cd\" (UID: \"52bce191-16d3-43cc-929f-dde01687a4cd\") " Oct 08 18:23:52 crc kubenswrapper[4750]: I1008 18:23:52.972343 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2wxx\" (UniqueName: \"kubernetes.io/projected/52bce191-16d3-43cc-929f-dde01687a4cd-kube-api-access-x2wxx\") pod \"52bce191-16d3-43cc-929f-dde01687a4cd\" (UID: \"52bce191-16d3-43cc-929f-dde01687a4cd\") " Oct 08 18:23:52 crc kubenswrapper[4750]: I1008 18:23:52.972424 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52bce191-16d3-43cc-929f-dde01687a4cd-catalog-content\") pod \"52bce191-16d3-43cc-929f-dde01687a4cd\" (UID: \"52bce191-16d3-43cc-929f-dde01687a4cd\") " 
Oct 08 18:23:52 crc kubenswrapper[4750]: I1008 18:23:52.977706 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52bce191-16d3-43cc-929f-dde01687a4cd-kube-api-access-x2wxx" (OuterVolumeSpecName: "kube-api-access-x2wxx") pod "52bce191-16d3-43cc-929f-dde01687a4cd" (UID: "52bce191-16d3-43cc-929f-dde01687a4cd"). InnerVolumeSpecName "kube-api-access-x2wxx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:23:52 crc kubenswrapper[4750]: I1008 18:23:52.982389 4750 scope.go:117] "RemoveContainer" containerID="2cdc2a2abcfb5717eee4984b4bdc77ab7529185f35880fcdb27b91d750dd43a9" Oct 08 18:23:52 crc kubenswrapper[4750]: I1008 18:23:52.982624 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52bce191-16d3-43cc-929f-dde01687a4cd-utilities" (OuterVolumeSpecName: "utilities") pod "52bce191-16d3-43cc-929f-dde01687a4cd" (UID: "52bce191-16d3-43cc-929f-dde01687a4cd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:23:52 crc kubenswrapper[4750]: I1008 18:23:52.991061 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmkqx\" (UniqueName: \"kubernetes.io/projected/9b578cc7-f656-4f3c-ae80-a54d325a597e-kube-api-access-pmkqx\") pod \"metallb-operator-controller-manager-7767688d85-nqdwk\" (UID: \"9b578cc7-f656-4f3c-ae80-a54d325a597e\") " pod="metallb-system/metallb-operator-controller-manager-7767688d85-nqdwk" Oct 08 18:23:52 crc kubenswrapper[4750]: I1008 18:23:52.991279 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9b578cc7-f656-4f3c-ae80-a54d325a597e-apiservice-cert\") pod \"metallb-operator-controller-manager-7767688d85-nqdwk\" (UID: \"9b578cc7-f656-4f3c-ae80-a54d325a597e\") " pod="metallb-system/metallb-operator-controller-manager-7767688d85-nqdwk" Oct 08 18:23:52 crc kubenswrapper[4750]: I1008 18:23:52.991385 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9b578cc7-f656-4f3c-ae80-a54d325a597e-webhook-cert\") pod \"metallb-operator-controller-manager-7767688d85-nqdwk\" (UID: \"9b578cc7-f656-4f3c-ae80-a54d325a597e\") " pod="metallb-system/metallb-operator-controller-manager-7767688d85-nqdwk" Oct 08 18:23:52 crc kubenswrapper[4750]: I1008 18:23:52.991452 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2wxx\" (UniqueName: \"kubernetes.io/projected/52bce191-16d3-43cc-929f-dde01687a4cd-kube-api-access-x2wxx\") on node \"crc\" DevicePath \"\"" Oct 08 18:23:53 crc kubenswrapper[4750]: E1008 18:23:53.001327 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cdc2a2abcfb5717eee4984b4bdc77ab7529185f35880fcdb27b91d750dd43a9\": container with ID starting with 
2cdc2a2abcfb5717eee4984b4bdc77ab7529185f35880fcdb27b91d750dd43a9 not found: ID does not exist" containerID="2cdc2a2abcfb5717eee4984b4bdc77ab7529185f35880fcdb27b91d750dd43a9" Oct 08 18:23:53 crc kubenswrapper[4750]: I1008 18:23:53.001382 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cdc2a2abcfb5717eee4984b4bdc77ab7529185f35880fcdb27b91d750dd43a9"} err="failed to get container status \"2cdc2a2abcfb5717eee4984b4bdc77ab7529185f35880fcdb27b91d750dd43a9\": rpc error: code = NotFound desc = could not find container \"2cdc2a2abcfb5717eee4984b4bdc77ab7529185f35880fcdb27b91d750dd43a9\": container with ID starting with 2cdc2a2abcfb5717eee4984b4bdc77ab7529185f35880fcdb27b91d750dd43a9 not found: ID does not exist" Oct 08 18:23:53 crc kubenswrapper[4750]: I1008 18:23:53.001414 4750 scope.go:117] "RemoveContainer" containerID="da5691cd2a5ff7dbd9ae39b81d2ddfb45cb4335ab6df71a1c232bf616f7f5316" Oct 08 18:23:53 crc kubenswrapper[4750]: I1008 18:23:53.002375 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9b578cc7-f656-4f3c-ae80-a54d325a597e-webhook-cert\") pod \"metallb-operator-controller-manager-7767688d85-nqdwk\" (UID: \"9b578cc7-f656-4f3c-ae80-a54d325a597e\") " pod="metallb-system/metallb-operator-controller-manager-7767688d85-nqdwk" Oct 08 18:23:53 crc kubenswrapper[4750]: I1008 18:23:53.004520 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9b578cc7-f656-4f3c-ae80-a54d325a597e-apiservice-cert\") pod \"metallb-operator-controller-manager-7767688d85-nqdwk\" (UID: \"9b578cc7-f656-4f3c-ae80-a54d325a597e\") " pod="metallb-system/metallb-operator-controller-manager-7767688d85-nqdwk" Oct 08 18:23:53 crc kubenswrapper[4750]: E1008 18:23:53.004976 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"da5691cd2a5ff7dbd9ae39b81d2ddfb45cb4335ab6df71a1c232bf616f7f5316\": container with ID starting with da5691cd2a5ff7dbd9ae39b81d2ddfb45cb4335ab6df71a1c232bf616f7f5316 not found: ID does not exist" containerID="da5691cd2a5ff7dbd9ae39b81d2ddfb45cb4335ab6df71a1c232bf616f7f5316" Oct 08 18:23:53 crc kubenswrapper[4750]: I1008 18:23:53.005036 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da5691cd2a5ff7dbd9ae39b81d2ddfb45cb4335ab6df71a1c232bf616f7f5316"} err="failed to get container status \"da5691cd2a5ff7dbd9ae39b81d2ddfb45cb4335ab6df71a1c232bf616f7f5316\": rpc error: code = NotFound desc = could not find container \"da5691cd2a5ff7dbd9ae39b81d2ddfb45cb4335ab6df71a1c232bf616f7f5316\": container with ID starting with da5691cd2a5ff7dbd9ae39b81d2ddfb45cb4335ab6df71a1c232bf616f7f5316 not found: ID does not exist" Oct 08 18:23:53 crc kubenswrapper[4750]: I1008 18:23:53.005072 4750 scope.go:117] "RemoveContainer" containerID="007a522c2d1ab3750a5a108fd5f49f9df0088dbc305d628eea0397a72afde605" Oct 08 18:23:53 crc kubenswrapper[4750]: E1008 18:23:53.005536 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"007a522c2d1ab3750a5a108fd5f49f9df0088dbc305d628eea0397a72afde605\": container with ID starting with 007a522c2d1ab3750a5a108fd5f49f9df0088dbc305d628eea0397a72afde605 not found: ID does not exist" containerID="007a522c2d1ab3750a5a108fd5f49f9df0088dbc305d628eea0397a72afde605" Oct 08 18:23:53 crc kubenswrapper[4750]: I1008 18:23:53.005659 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"007a522c2d1ab3750a5a108fd5f49f9df0088dbc305d628eea0397a72afde605"} err="failed to get container status \"007a522c2d1ab3750a5a108fd5f49f9df0088dbc305d628eea0397a72afde605\": rpc error: code = NotFound desc = could not find container \"007a522c2d1ab3750a5a108fd5f49f9df0088dbc305d628eea0397a72afde605\": container with ID 
starting with 007a522c2d1ab3750a5a108fd5f49f9df0088dbc305d628eea0397a72afde605 not found: ID does not exist" Oct 08 18:23:53 crc kubenswrapper[4750]: I1008 18:23:53.019158 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmkqx\" (UniqueName: \"kubernetes.io/projected/9b578cc7-f656-4f3c-ae80-a54d325a597e-kube-api-access-pmkqx\") pod \"metallb-operator-controller-manager-7767688d85-nqdwk\" (UID: \"9b578cc7-f656-4f3c-ae80-a54d325a597e\") " pod="metallb-system/metallb-operator-controller-manager-7767688d85-nqdwk" Oct 08 18:23:53 crc kubenswrapper[4750]: I1008 18:23:53.041439 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-66f8f8cbb5-65c29"] Oct 08 18:23:53 crc kubenswrapper[4750]: E1008 18:23:53.044921 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52bce191-16d3-43cc-929f-dde01687a4cd" containerName="extract-utilities" Oct 08 18:23:53 crc kubenswrapper[4750]: I1008 18:23:53.044948 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="52bce191-16d3-43cc-929f-dde01687a4cd" containerName="extract-utilities" Oct 08 18:23:53 crc kubenswrapper[4750]: E1008 18:23:53.044958 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52bce191-16d3-43cc-929f-dde01687a4cd" containerName="registry-server" Oct 08 18:23:53 crc kubenswrapper[4750]: I1008 18:23:53.044965 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="52bce191-16d3-43cc-929f-dde01687a4cd" containerName="registry-server" Oct 08 18:23:53 crc kubenswrapper[4750]: E1008 18:23:53.044973 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52bce191-16d3-43cc-929f-dde01687a4cd" containerName="extract-content" Oct 08 18:23:53 crc kubenswrapper[4750]: I1008 18:23:53.044981 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="52bce191-16d3-43cc-929f-dde01687a4cd" containerName="extract-content" Oct 08 18:23:53 crc kubenswrapper[4750]: I1008 
18:23:53.045124 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="52bce191-16d3-43cc-929f-dde01687a4cd" containerName="registry-server" Oct 08 18:23:53 crc kubenswrapper[4750]: I1008 18:23:53.045579 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-66f8f8cbb5-65c29" Oct 08 18:23:53 crc kubenswrapper[4750]: I1008 18:23:53.048213 4750 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-t52th" Oct 08 18:23:53 crc kubenswrapper[4750]: I1008 18:23:53.048478 4750 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 08 18:23:53 crc kubenswrapper[4750]: I1008 18:23:53.048760 4750 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 08 18:23:53 crc kubenswrapper[4750]: I1008 18:23:53.055541 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-66f8f8cbb5-65c29"] Oct 08 18:23:53 crc kubenswrapper[4750]: I1008 18:23:53.092283 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5n8g\" (UniqueName: \"kubernetes.io/projected/02480923-67a5-453b-a317-2230fa281c71-kube-api-access-c5n8g\") pod \"metallb-operator-webhook-server-66f8f8cbb5-65c29\" (UID: \"02480923-67a5-453b-a317-2230fa281c71\") " pod="metallb-system/metallb-operator-webhook-server-66f8f8cbb5-65c29" Oct 08 18:23:53 crc kubenswrapper[4750]: I1008 18:23:53.092338 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/02480923-67a5-453b-a317-2230fa281c71-webhook-cert\") pod \"metallb-operator-webhook-server-66f8f8cbb5-65c29\" (UID: \"02480923-67a5-453b-a317-2230fa281c71\") " 
pod="metallb-system/metallb-operator-webhook-server-66f8f8cbb5-65c29" Oct 08 18:23:53 crc kubenswrapper[4750]: I1008 18:23:53.092387 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/02480923-67a5-453b-a317-2230fa281c71-apiservice-cert\") pod \"metallb-operator-webhook-server-66f8f8cbb5-65c29\" (UID: \"02480923-67a5-453b-a317-2230fa281c71\") " pod="metallb-system/metallb-operator-webhook-server-66f8f8cbb5-65c29" Oct 08 18:23:53 crc kubenswrapper[4750]: I1008 18:23:53.092434 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52bce191-16d3-43cc-929f-dde01687a4cd-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 18:23:53 crc kubenswrapper[4750]: I1008 18:23:53.117103 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52bce191-16d3-43cc-929f-dde01687a4cd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "52bce191-16d3-43cc-929f-dde01687a4cd" (UID: "52bce191-16d3-43cc-929f-dde01687a4cd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:23:53 crc kubenswrapper[4750]: I1008 18:23:53.121027 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7767688d85-nqdwk" Oct 08 18:23:53 crc kubenswrapper[4750]: I1008 18:23:53.193264 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5n8g\" (UniqueName: \"kubernetes.io/projected/02480923-67a5-453b-a317-2230fa281c71-kube-api-access-c5n8g\") pod \"metallb-operator-webhook-server-66f8f8cbb5-65c29\" (UID: \"02480923-67a5-453b-a317-2230fa281c71\") " pod="metallb-system/metallb-operator-webhook-server-66f8f8cbb5-65c29" Oct 08 18:23:53 crc kubenswrapper[4750]: I1008 18:23:53.193300 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/02480923-67a5-453b-a317-2230fa281c71-webhook-cert\") pod \"metallb-operator-webhook-server-66f8f8cbb5-65c29\" (UID: \"02480923-67a5-453b-a317-2230fa281c71\") " pod="metallb-system/metallb-operator-webhook-server-66f8f8cbb5-65c29" Oct 08 18:23:53 crc kubenswrapper[4750]: I1008 18:23:53.193329 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/02480923-67a5-453b-a317-2230fa281c71-apiservice-cert\") pod \"metallb-operator-webhook-server-66f8f8cbb5-65c29\" (UID: \"02480923-67a5-453b-a317-2230fa281c71\") " pod="metallb-system/metallb-operator-webhook-server-66f8f8cbb5-65c29" Oct 08 18:23:53 crc kubenswrapper[4750]: I1008 18:23:53.193387 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52bce191-16d3-43cc-929f-dde01687a4cd-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 18:23:53 crc kubenswrapper[4750]: I1008 18:23:53.198247 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/02480923-67a5-453b-a317-2230fa281c71-webhook-cert\") pod \"metallb-operator-webhook-server-66f8f8cbb5-65c29\" (UID: 
\"02480923-67a5-453b-a317-2230fa281c71\") " pod="metallb-system/metallb-operator-webhook-server-66f8f8cbb5-65c29" Oct 08 18:23:53 crc kubenswrapper[4750]: I1008 18:23:53.198469 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/02480923-67a5-453b-a317-2230fa281c71-apiservice-cert\") pod \"metallb-operator-webhook-server-66f8f8cbb5-65c29\" (UID: \"02480923-67a5-453b-a317-2230fa281c71\") " pod="metallb-system/metallb-operator-webhook-server-66f8f8cbb5-65c29" Oct 08 18:23:53 crc kubenswrapper[4750]: I1008 18:23:53.218214 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5n8g\" (UniqueName: \"kubernetes.io/projected/02480923-67a5-453b-a317-2230fa281c71-kube-api-access-c5n8g\") pod \"metallb-operator-webhook-server-66f8f8cbb5-65c29\" (UID: \"02480923-67a5-453b-a317-2230fa281c71\") " pod="metallb-system/metallb-operator-webhook-server-66f8f8cbb5-65c29" Oct 08 18:23:53 crc kubenswrapper[4750]: I1008 18:23:53.316641 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q8hbp"] Oct 08 18:23:53 crc kubenswrapper[4750]: I1008 18:23:53.325299 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-q8hbp"] Oct 08 18:23:53 crc kubenswrapper[4750]: I1008 18:23:53.390424 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-66f8f8cbb5-65c29" Oct 08 18:23:53 crc kubenswrapper[4750]: W1008 18:23:53.597953 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b578cc7_f656_4f3c_ae80_a54d325a597e.slice/crio-4bde9bbe8a98f483620f935695292994e57a3e47b90f3bdbdaa1c5b6ccfb3d27 WatchSource:0}: Error finding container 4bde9bbe8a98f483620f935695292994e57a3e47b90f3bdbdaa1c5b6ccfb3d27: Status 404 returned error can't find the container with id 4bde9bbe8a98f483620f935695292994e57a3e47b90f3bdbdaa1c5b6ccfb3d27 Oct 08 18:23:53 crc kubenswrapper[4750]: I1008 18:23:53.600028 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7767688d85-nqdwk"] Oct 08 18:23:53 crc kubenswrapper[4750]: I1008 18:23:53.876102 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-66f8f8cbb5-65c29"] Oct 08 18:23:53 crc kubenswrapper[4750]: W1008 18:23:53.885268 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02480923_67a5_453b_a317_2230fa281c71.slice/crio-29d6510f74326b659c1d00782ffe1d61750074ee7d83250e08118a007efcc695 WatchSource:0}: Error finding container 29d6510f74326b659c1d00782ffe1d61750074ee7d83250e08118a007efcc695: Status 404 returned error can't find the container with id 29d6510f74326b659c1d00782ffe1d61750074ee7d83250e08118a007efcc695 Oct 08 18:23:53 crc kubenswrapper[4750]: I1008 18:23:53.940940 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-66f8f8cbb5-65c29" event={"ID":"02480923-67a5-453b-a317-2230fa281c71","Type":"ContainerStarted","Data":"29d6510f74326b659c1d00782ffe1d61750074ee7d83250e08118a007efcc695"} Oct 08 18:23:53 crc kubenswrapper[4750]: I1008 18:23:53.941714 4750 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="metallb-system/metallb-operator-controller-manager-7767688d85-nqdwk" event={"ID":"9b578cc7-f656-4f3c-ae80-a54d325a597e","Type":"ContainerStarted","Data":"4bde9bbe8a98f483620f935695292994e57a3e47b90f3bdbdaa1c5b6ccfb3d27"} Oct 08 18:23:54 crc kubenswrapper[4750]: I1008 18:23:54.743136 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52bce191-16d3-43cc-929f-dde01687a4cd" path="/var/lib/kubelet/pods/52bce191-16d3-43cc-929f-dde01687a4cd/volumes" Oct 08 18:23:56 crc kubenswrapper[4750]: I1008 18:23:56.616051 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mszh9"] Oct 08 18:23:56 crc kubenswrapper[4750]: I1008 18:23:56.617644 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mszh9" Oct 08 18:23:56 crc kubenswrapper[4750]: I1008 18:23:56.630523 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mszh9"] Oct 08 18:23:56 crc kubenswrapper[4750]: I1008 18:23:56.741075 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03080e5c-9e1c-4948-92a9-f9d160880ba4-utilities\") pod \"community-operators-mszh9\" (UID: \"03080e5c-9e1c-4948-92a9-f9d160880ba4\") " pod="openshift-marketplace/community-operators-mszh9" Oct 08 18:23:56 crc kubenswrapper[4750]: I1008 18:23:56.741114 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qtzw\" (UniqueName: \"kubernetes.io/projected/03080e5c-9e1c-4948-92a9-f9d160880ba4-kube-api-access-4qtzw\") pod \"community-operators-mszh9\" (UID: \"03080e5c-9e1c-4948-92a9-f9d160880ba4\") " pod="openshift-marketplace/community-operators-mszh9" Oct 08 18:23:56 crc kubenswrapper[4750]: I1008 18:23:56.741139 4750 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03080e5c-9e1c-4948-92a9-f9d160880ba4-catalog-content\") pod \"community-operators-mszh9\" (UID: \"03080e5c-9e1c-4948-92a9-f9d160880ba4\") " pod="openshift-marketplace/community-operators-mszh9" Oct 08 18:23:56 crc kubenswrapper[4750]: I1008 18:23:56.841952 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03080e5c-9e1c-4948-92a9-f9d160880ba4-utilities\") pod \"community-operators-mszh9\" (UID: \"03080e5c-9e1c-4948-92a9-f9d160880ba4\") " pod="openshift-marketplace/community-operators-mszh9" Oct 08 18:23:56 crc kubenswrapper[4750]: I1008 18:23:56.842017 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qtzw\" (UniqueName: \"kubernetes.io/projected/03080e5c-9e1c-4948-92a9-f9d160880ba4-kube-api-access-4qtzw\") pod \"community-operators-mszh9\" (UID: \"03080e5c-9e1c-4948-92a9-f9d160880ba4\") " pod="openshift-marketplace/community-operators-mszh9" Oct 08 18:23:56 crc kubenswrapper[4750]: I1008 18:23:56.842051 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03080e5c-9e1c-4948-92a9-f9d160880ba4-catalog-content\") pod \"community-operators-mszh9\" (UID: \"03080e5c-9e1c-4948-92a9-f9d160880ba4\") " pod="openshift-marketplace/community-operators-mszh9" Oct 08 18:23:56 crc kubenswrapper[4750]: I1008 18:23:56.842674 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03080e5c-9e1c-4948-92a9-f9d160880ba4-catalog-content\") pod \"community-operators-mszh9\" (UID: \"03080e5c-9e1c-4948-92a9-f9d160880ba4\") " pod="openshift-marketplace/community-operators-mszh9" Oct 08 18:23:56 crc kubenswrapper[4750]: I1008 18:23:56.842690 4750 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03080e5c-9e1c-4948-92a9-f9d160880ba4-utilities\") pod \"community-operators-mszh9\" (UID: \"03080e5c-9e1c-4948-92a9-f9d160880ba4\") " pod="openshift-marketplace/community-operators-mszh9" Oct 08 18:23:56 crc kubenswrapper[4750]: I1008 18:23:56.884289 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qtzw\" (UniqueName: \"kubernetes.io/projected/03080e5c-9e1c-4948-92a9-f9d160880ba4-kube-api-access-4qtzw\") pod \"community-operators-mszh9\" (UID: \"03080e5c-9e1c-4948-92a9-f9d160880ba4\") " pod="openshift-marketplace/community-operators-mszh9" Oct 08 18:23:56 crc kubenswrapper[4750]: I1008 18:23:56.945963 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mszh9" Oct 08 18:24:00 crc kubenswrapper[4750]: I1008 18:24:00.271413 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mszh9"] Oct 08 18:24:00 crc kubenswrapper[4750]: W1008 18:24:00.279704 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03080e5c_9e1c_4948_92a9_f9d160880ba4.slice/crio-5d285b0720006ba482ca9f66759684cb13cac1b49df00ed95ad049036ca19e0f WatchSource:0}: Error finding container 5d285b0720006ba482ca9f66759684cb13cac1b49df00ed95ad049036ca19e0f: Status 404 returned error can't find the container with id 5d285b0720006ba482ca9f66759684cb13cac1b49df00ed95ad049036ca19e0f Oct 08 18:24:00 crc kubenswrapper[4750]: I1008 18:24:00.987765 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7767688d85-nqdwk" event={"ID":"9b578cc7-f656-4f3c-ae80-a54d325a597e","Type":"ContainerStarted","Data":"c1bcff6c1475eadfea78189257d86e8b152ca59c12291548049ae795931306f0"} Oct 08 18:24:00 crc kubenswrapper[4750]: I1008 18:24:00.988094 4750 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7767688d85-nqdwk" Oct 08 18:24:00 crc kubenswrapper[4750]: I1008 18:24:00.989237 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-66f8f8cbb5-65c29" event={"ID":"02480923-67a5-453b-a317-2230fa281c71","Type":"ContainerStarted","Data":"06008215823f8125691b9455a69d142dfc533ac2d943c25875dc08ca54edf732"} Oct 08 18:24:00 crc kubenswrapper[4750]: I1008 18:24:00.989391 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-66f8f8cbb5-65c29" Oct 08 18:24:00 crc kubenswrapper[4750]: I1008 18:24:00.991146 4750 generic.go:334] "Generic (PLEG): container finished" podID="03080e5c-9e1c-4948-92a9-f9d160880ba4" containerID="46b070d6024ba5166533eb27d7befd624139bc944850105636bc9de5367e6ed5" exitCode=0 Oct 08 18:24:00 crc kubenswrapper[4750]: I1008 18:24:00.991196 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mszh9" event={"ID":"03080e5c-9e1c-4948-92a9-f9d160880ba4","Type":"ContainerDied","Data":"46b070d6024ba5166533eb27d7befd624139bc944850105636bc9de5367e6ed5"} Oct 08 18:24:00 crc kubenswrapper[4750]: I1008 18:24:00.991222 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mszh9" event={"ID":"03080e5c-9e1c-4948-92a9-f9d160880ba4","Type":"ContainerStarted","Data":"5d285b0720006ba482ca9f66759684cb13cac1b49df00ed95ad049036ca19e0f"} Oct 08 18:24:01 crc kubenswrapper[4750]: I1008 18:24:01.010019 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7767688d85-nqdwk" podStartSLOduration=2.738090284 podStartE2EDuration="9.010002729s" podCreationTimestamp="2025-10-08 18:23:52 +0000 UTC" firstStartedPulling="2025-10-08 18:23:53.600614972 +0000 UTC m=+789.513585985" 
lastFinishedPulling="2025-10-08 18:23:59.872527417 +0000 UTC m=+795.785498430" observedRunningTime="2025-10-08 18:24:01.007231978 +0000 UTC m=+796.920203011" watchObservedRunningTime="2025-10-08 18:24:01.010002729 +0000 UTC m=+796.922973752" Oct 08 18:24:01 crc kubenswrapper[4750]: I1008 18:24:01.047904 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-66f8f8cbb5-65c29" podStartSLOduration=2.042892354 podStartE2EDuration="8.047881115s" podCreationTimestamp="2025-10-08 18:23:53 +0000 UTC" firstStartedPulling="2025-10-08 18:23:53.886816453 +0000 UTC m=+789.799787466" lastFinishedPulling="2025-10-08 18:23:59.891805214 +0000 UTC m=+795.804776227" observedRunningTime="2025-10-08 18:24:01.043330428 +0000 UTC m=+796.956301451" watchObservedRunningTime="2025-10-08 18:24:01.047881115 +0000 UTC m=+796.960852128" Oct 08 18:24:03 crc kubenswrapper[4750]: I1008 18:24:03.002296 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mszh9" event={"ID":"03080e5c-9e1c-4948-92a9-f9d160880ba4","Type":"ContainerStarted","Data":"7c4c1e5b004ed5bb1320a914648edb4d8307dd89f935c48cbeb9b8917ed98414"} Oct 08 18:24:04 crc kubenswrapper[4750]: I1008 18:24:04.008786 4750 generic.go:334] "Generic (PLEG): container finished" podID="03080e5c-9e1c-4948-92a9-f9d160880ba4" containerID="7c4c1e5b004ed5bb1320a914648edb4d8307dd89f935c48cbeb9b8917ed98414" exitCode=0 Oct 08 18:24:04 crc kubenswrapper[4750]: I1008 18:24:04.008834 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mszh9" event={"ID":"03080e5c-9e1c-4948-92a9-f9d160880ba4","Type":"ContainerDied","Data":"7c4c1e5b004ed5bb1320a914648edb4d8307dd89f935c48cbeb9b8917ed98414"} Oct 08 18:24:06 crc kubenswrapper[4750]: I1008 18:24:06.020011 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mszh9" 
event={"ID":"03080e5c-9e1c-4948-92a9-f9d160880ba4","Type":"ContainerStarted","Data":"a54731614529f5dbef50204b0f04656afcf8e50ff22e5eb4fa4450365cbcb7f8"} Oct 08 18:24:06 crc kubenswrapper[4750]: I1008 18:24:06.037948 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mszh9" podStartSLOduration=5.839547052 podStartE2EDuration="10.037930979s" podCreationTimestamp="2025-10-08 18:23:56 +0000 UTC" firstStartedPulling="2025-10-08 18:24:00.993500044 +0000 UTC m=+796.906471057" lastFinishedPulling="2025-10-08 18:24:05.191883971 +0000 UTC m=+801.104854984" observedRunningTime="2025-10-08 18:24:06.034346677 +0000 UTC m=+801.947317690" watchObservedRunningTime="2025-10-08 18:24:06.037930979 +0000 UTC m=+801.950901992" Oct 08 18:24:06 crc kubenswrapper[4750]: I1008 18:24:06.946836 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mszh9" Oct 08 18:24:06 crc kubenswrapper[4750]: I1008 18:24:06.946895 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mszh9" Oct 08 18:24:07 crc kubenswrapper[4750]: I1008 18:24:07.981780 4750 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-mszh9" podUID="03080e5c-9e1c-4948-92a9-f9d160880ba4" containerName="registry-server" probeResult="failure" output=< Oct 08 18:24:07 crc kubenswrapper[4750]: timeout: failed to connect service ":50051" within 1s Oct 08 18:24:07 crc kubenswrapper[4750]: > Oct 08 18:24:13 crc kubenswrapper[4750]: I1008 18:24:13.394487 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-66f8f8cbb5-65c29" Oct 08 18:24:16 crc kubenswrapper[4750]: I1008 18:24:16.412174 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zdmgh"] Oct 08 18:24:16 crc kubenswrapper[4750]: 
I1008 18:24:16.413899 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zdmgh" Oct 08 18:24:16 crc kubenswrapper[4750]: I1008 18:24:16.429205 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zdmgh"] Oct 08 18:24:16 crc kubenswrapper[4750]: I1008 18:24:16.492689 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/307381ad-744a-427b-a1b7-2b57d79c1598-utilities\") pod \"certified-operators-zdmgh\" (UID: \"307381ad-744a-427b-a1b7-2b57d79c1598\") " pod="openshift-marketplace/certified-operators-zdmgh" Oct 08 18:24:16 crc kubenswrapper[4750]: I1008 18:24:16.492812 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/307381ad-744a-427b-a1b7-2b57d79c1598-catalog-content\") pod \"certified-operators-zdmgh\" (UID: \"307381ad-744a-427b-a1b7-2b57d79c1598\") " pod="openshift-marketplace/certified-operators-zdmgh" Oct 08 18:24:16 crc kubenswrapper[4750]: I1008 18:24:16.492992 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlprz\" (UniqueName: \"kubernetes.io/projected/307381ad-744a-427b-a1b7-2b57d79c1598-kube-api-access-mlprz\") pod \"certified-operators-zdmgh\" (UID: \"307381ad-744a-427b-a1b7-2b57d79c1598\") " pod="openshift-marketplace/certified-operators-zdmgh" Oct 08 18:24:16 crc kubenswrapper[4750]: I1008 18:24:16.594322 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlprz\" (UniqueName: \"kubernetes.io/projected/307381ad-744a-427b-a1b7-2b57d79c1598-kube-api-access-mlprz\") pod \"certified-operators-zdmgh\" (UID: \"307381ad-744a-427b-a1b7-2b57d79c1598\") " pod="openshift-marketplace/certified-operators-zdmgh" Oct 08 18:24:16 
crc kubenswrapper[4750]: I1008 18:24:16.594388 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/307381ad-744a-427b-a1b7-2b57d79c1598-utilities\") pod \"certified-operators-zdmgh\" (UID: \"307381ad-744a-427b-a1b7-2b57d79c1598\") " pod="openshift-marketplace/certified-operators-zdmgh" Oct 08 18:24:16 crc kubenswrapper[4750]: I1008 18:24:16.594420 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/307381ad-744a-427b-a1b7-2b57d79c1598-catalog-content\") pod \"certified-operators-zdmgh\" (UID: \"307381ad-744a-427b-a1b7-2b57d79c1598\") " pod="openshift-marketplace/certified-operators-zdmgh" Oct 08 18:24:16 crc kubenswrapper[4750]: I1008 18:24:16.595008 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/307381ad-744a-427b-a1b7-2b57d79c1598-catalog-content\") pod \"certified-operators-zdmgh\" (UID: \"307381ad-744a-427b-a1b7-2b57d79c1598\") " pod="openshift-marketplace/certified-operators-zdmgh" Oct 08 18:24:16 crc kubenswrapper[4750]: I1008 18:24:16.595276 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/307381ad-744a-427b-a1b7-2b57d79c1598-utilities\") pod \"certified-operators-zdmgh\" (UID: \"307381ad-744a-427b-a1b7-2b57d79c1598\") " pod="openshift-marketplace/certified-operators-zdmgh" Oct 08 18:24:16 crc kubenswrapper[4750]: I1008 18:24:16.619390 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlprz\" (UniqueName: \"kubernetes.io/projected/307381ad-744a-427b-a1b7-2b57d79c1598-kube-api-access-mlprz\") pod \"certified-operators-zdmgh\" (UID: \"307381ad-744a-427b-a1b7-2b57d79c1598\") " pod="openshift-marketplace/certified-operators-zdmgh" Oct 08 18:24:16 crc kubenswrapper[4750]: I1008 18:24:16.734274 
4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zdmgh" Oct 08 18:24:16 crc kubenswrapper[4750]: I1008 18:24:16.995890 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mszh9" Oct 08 18:24:17 crc kubenswrapper[4750]: I1008 18:24:17.056508 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mszh9" Oct 08 18:24:17 crc kubenswrapper[4750]: I1008 18:24:17.210872 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zdmgh"] Oct 08 18:24:18 crc kubenswrapper[4750]: I1008 18:24:18.091512 4750 generic.go:334] "Generic (PLEG): container finished" podID="307381ad-744a-427b-a1b7-2b57d79c1598" containerID="27eddfbc9e3274c74919d0024b5fcb12a2786e5740eca5743b90d9570150fd39" exitCode=0 Oct 08 18:24:18 crc kubenswrapper[4750]: I1008 18:24:18.091565 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zdmgh" event={"ID":"307381ad-744a-427b-a1b7-2b57d79c1598","Type":"ContainerDied","Data":"27eddfbc9e3274c74919d0024b5fcb12a2786e5740eca5743b90d9570150fd39"} Oct 08 18:24:18 crc kubenswrapper[4750]: I1008 18:24:18.092105 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zdmgh" event={"ID":"307381ad-744a-427b-a1b7-2b57d79c1598","Type":"ContainerStarted","Data":"5108d94b6079a933e330ca741b3e3fc594d9c832be7427ae26a791a1871e9df4"} Oct 08 18:24:20 crc kubenswrapper[4750]: I1008 18:24:20.105673 4750 generic.go:334] "Generic (PLEG): container finished" podID="307381ad-744a-427b-a1b7-2b57d79c1598" containerID="62fa3545dd1a60e8699694a6b546e4bca3f96d27fd84630ef2e206e220178f5c" exitCode=0 Oct 08 18:24:20 crc kubenswrapper[4750]: I1008 18:24:20.105764 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-zdmgh" event={"ID":"307381ad-744a-427b-a1b7-2b57d79c1598","Type":"ContainerDied","Data":"62fa3545dd1a60e8699694a6b546e4bca3f96d27fd84630ef2e206e220178f5c"} Oct 08 18:24:20 crc kubenswrapper[4750]: I1008 18:24:20.601700 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mszh9"] Oct 08 18:24:20 crc kubenswrapper[4750]: I1008 18:24:20.602008 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mszh9" podUID="03080e5c-9e1c-4948-92a9-f9d160880ba4" containerName="registry-server" containerID="cri-o://a54731614529f5dbef50204b0f04656afcf8e50ff22e5eb4fa4450365cbcb7f8" gracePeriod=2 Oct 08 18:24:21 crc kubenswrapper[4750]: I1008 18:24:21.081216 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mszh9" Oct 08 18:24:21 crc kubenswrapper[4750]: I1008 18:24:21.132778 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zdmgh" event={"ID":"307381ad-744a-427b-a1b7-2b57d79c1598","Type":"ContainerStarted","Data":"fdaaa2a75b64cf4704ffde2ece7d076894c285380ef6f306238ac17d2f3d958a"} Oct 08 18:24:21 crc kubenswrapper[4750]: I1008 18:24:21.134764 4750 generic.go:334] "Generic (PLEG): container finished" podID="03080e5c-9e1c-4948-92a9-f9d160880ba4" containerID="a54731614529f5dbef50204b0f04656afcf8e50ff22e5eb4fa4450365cbcb7f8" exitCode=0 Oct 08 18:24:21 crc kubenswrapper[4750]: I1008 18:24:21.134807 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mszh9" event={"ID":"03080e5c-9e1c-4948-92a9-f9d160880ba4","Type":"ContainerDied","Data":"a54731614529f5dbef50204b0f04656afcf8e50ff22e5eb4fa4450365cbcb7f8"} Oct 08 18:24:21 crc kubenswrapper[4750]: I1008 18:24:21.134837 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-mszh9" event={"ID":"03080e5c-9e1c-4948-92a9-f9d160880ba4","Type":"ContainerDied","Data":"5d285b0720006ba482ca9f66759684cb13cac1b49df00ed95ad049036ca19e0f"} Oct 08 18:24:21 crc kubenswrapper[4750]: I1008 18:24:21.134855 4750 scope.go:117] "RemoveContainer" containerID="a54731614529f5dbef50204b0f04656afcf8e50ff22e5eb4fa4450365cbcb7f8" Oct 08 18:24:21 crc kubenswrapper[4750]: I1008 18:24:21.134867 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mszh9" Oct 08 18:24:21 crc kubenswrapper[4750]: I1008 18:24:21.157302 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03080e5c-9e1c-4948-92a9-f9d160880ba4-utilities\") pod \"03080e5c-9e1c-4948-92a9-f9d160880ba4\" (UID: \"03080e5c-9e1c-4948-92a9-f9d160880ba4\") " Oct 08 18:24:21 crc kubenswrapper[4750]: I1008 18:24:21.157387 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qtzw\" (UniqueName: \"kubernetes.io/projected/03080e5c-9e1c-4948-92a9-f9d160880ba4-kube-api-access-4qtzw\") pod \"03080e5c-9e1c-4948-92a9-f9d160880ba4\" (UID: \"03080e5c-9e1c-4948-92a9-f9d160880ba4\") " Oct 08 18:24:21 crc kubenswrapper[4750]: I1008 18:24:21.157433 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03080e5c-9e1c-4948-92a9-f9d160880ba4-catalog-content\") pod \"03080e5c-9e1c-4948-92a9-f9d160880ba4\" (UID: \"03080e5c-9e1c-4948-92a9-f9d160880ba4\") " Oct 08 18:24:21 crc kubenswrapper[4750]: I1008 18:24:21.158365 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03080e5c-9e1c-4948-92a9-f9d160880ba4-utilities" (OuterVolumeSpecName: "utilities") pod "03080e5c-9e1c-4948-92a9-f9d160880ba4" (UID: "03080e5c-9e1c-4948-92a9-f9d160880ba4"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:24:21 crc kubenswrapper[4750]: I1008 18:24:21.162793 4750 scope.go:117] "RemoveContainer" containerID="7c4c1e5b004ed5bb1320a914648edb4d8307dd89f935c48cbeb9b8917ed98414" Oct 08 18:24:21 crc kubenswrapper[4750]: I1008 18:24:21.163998 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03080e5c-9e1c-4948-92a9-f9d160880ba4-kube-api-access-4qtzw" (OuterVolumeSpecName: "kube-api-access-4qtzw") pod "03080e5c-9e1c-4948-92a9-f9d160880ba4" (UID: "03080e5c-9e1c-4948-92a9-f9d160880ba4"). InnerVolumeSpecName "kube-api-access-4qtzw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:24:21 crc kubenswrapper[4750]: I1008 18:24:21.166435 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zdmgh" podStartSLOduration=2.654163803 podStartE2EDuration="5.166418278s" podCreationTimestamp="2025-10-08 18:24:16 +0000 UTC" firstStartedPulling="2025-10-08 18:24:18.09292571 +0000 UTC m=+814.005896723" lastFinishedPulling="2025-10-08 18:24:20.605180185 +0000 UTC m=+816.518151198" observedRunningTime="2025-10-08 18:24:21.163065452 +0000 UTC m=+817.076036465" watchObservedRunningTime="2025-10-08 18:24:21.166418278 +0000 UTC m=+817.079389291" Oct 08 18:24:21 crc kubenswrapper[4750]: I1008 18:24:21.212037 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03080e5c-9e1c-4948-92a9-f9d160880ba4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "03080e5c-9e1c-4948-92a9-f9d160880ba4" (UID: "03080e5c-9e1c-4948-92a9-f9d160880ba4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:24:21 crc kubenswrapper[4750]: I1008 18:24:21.221794 4750 scope.go:117] "RemoveContainer" containerID="46b070d6024ba5166533eb27d7befd624139bc944850105636bc9de5367e6ed5" Oct 08 18:24:21 crc kubenswrapper[4750]: I1008 18:24:21.257430 4750 scope.go:117] "RemoveContainer" containerID="a54731614529f5dbef50204b0f04656afcf8e50ff22e5eb4fa4450365cbcb7f8" Oct 08 18:24:21 crc kubenswrapper[4750]: I1008 18:24:21.259262 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qtzw\" (UniqueName: \"kubernetes.io/projected/03080e5c-9e1c-4948-92a9-f9d160880ba4-kube-api-access-4qtzw\") on node \"crc\" DevicePath \"\"" Oct 08 18:24:21 crc kubenswrapper[4750]: I1008 18:24:21.259304 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03080e5c-9e1c-4948-92a9-f9d160880ba4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 18:24:21 crc kubenswrapper[4750]: I1008 18:24:21.259317 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03080e5c-9e1c-4948-92a9-f9d160880ba4-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 18:24:21 crc kubenswrapper[4750]: E1008 18:24:21.261096 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a54731614529f5dbef50204b0f04656afcf8e50ff22e5eb4fa4450365cbcb7f8\": container with ID starting with a54731614529f5dbef50204b0f04656afcf8e50ff22e5eb4fa4450365cbcb7f8 not found: ID does not exist" containerID="a54731614529f5dbef50204b0f04656afcf8e50ff22e5eb4fa4450365cbcb7f8" Oct 08 18:24:21 crc kubenswrapper[4750]: I1008 18:24:21.261166 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a54731614529f5dbef50204b0f04656afcf8e50ff22e5eb4fa4450365cbcb7f8"} err="failed to get container status 
\"a54731614529f5dbef50204b0f04656afcf8e50ff22e5eb4fa4450365cbcb7f8\": rpc error: code = NotFound desc = could not find container \"a54731614529f5dbef50204b0f04656afcf8e50ff22e5eb4fa4450365cbcb7f8\": container with ID starting with a54731614529f5dbef50204b0f04656afcf8e50ff22e5eb4fa4450365cbcb7f8 not found: ID does not exist" Oct 08 18:24:21 crc kubenswrapper[4750]: I1008 18:24:21.261200 4750 scope.go:117] "RemoveContainer" containerID="7c4c1e5b004ed5bb1320a914648edb4d8307dd89f935c48cbeb9b8917ed98414" Oct 08 18:24:21 crc kubenswrapper[4750]: E1008 18:24:21.264799 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c4c1e5b004ed5bb1320a914648edb4d8307dd89f935c48cbeb9b8917ed98414\": container with ID starting with 7c4c1e5b004ed5bb1320a914648edb4d8307dd89f935c48cbeb9b8917ed98414 not found: ID does not exist" containerID="7c4c1e5b004ed5bb1320a914648edb4d8307dd89f935c48cbeb9b8917ed98414" Oct 08 18:24:21 crc kubenswrapper[4750]: I1008 18:24:21.264831 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c4c1e5b004ed5bb1320a914648edb4d8307dd89f935c48cbeb9b8917ed98414"} err="failed to get container status \"7c4c1e5b004ed5bb1320a914648edb4d8307dd89f935c48cbeb9b8917ed98414\": rpc error: code = NotFound desc = could not find container \"7c4c1e5b004ed5bb1320a914648edb4d8307dd89f935c48cbeb9b8917ed98414\": container with ID starting with 7c4c1e5b004ed5bb1320a914648edb4d8307dd89f935c48cbeb9b8917ed98414 not found: ID does not exist" Oct 08 18:24:21 crc kubenswrapper[4750]: I1008 18:24:21.264852 4750 scope.go:117] "RemoveContainer" containerID="46b070d6024ba5166533eb27d7befd624139bc944850105636bc9de5367e6ed5" Oct 08 18:24:21 crc kubenswrapper[4750]: E1008 18:24:21.268410 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"46b070d6024ba5166533eb27d7befd624139bc944850105636bc9de5367e6ed5\": container with ID starting with 46b070d6024ba5166533eb27d7befd624139bc944850105636bc9de5367e6ed5 not found: ID does not exist" containerID="46b070d6024ba5166533eb27d7befd624139bc944850105636bc9de5367e6ed5" Oct 08 18:24:21 crc kubenswrapper[4750]: I1008 18:24:21.268445 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46b070d6024ba5166533eb27d7befd624139bc944850105636bc9de5367e6ed5"} err="failed to get container status \"46b070d6024ba5166533eb27d7befd624139bc944850105636bc9de5367e6ed5\": rpc error: code = NotFound desc = could not find container \"46b070d6024ba5166533eb27d7befd624139bc944850105636bc9de5367e6ed5\": container with ID starting with 46b070d6024ba5166533eb27d7befd624139bc944850105636bc9de5367e6ed5 not found: ID does not exist" Oct 08 18:24:21 crc kubenswrapper[4750]: I1008 18:24:21.463280 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mszh9"] Oct 08 18:24:21 crc kubenswrapper[4750]: I1008 18:24:21.471401 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mszh9"] Oct 08 18:24:22 crc kubenswrapper[4750]: I1008 18:24:22.401289 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fgnf7"] Oct 08 18:24:22 crc kubenswrapper[4750]: E1008 18:24:22.401726 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03080e5c-9e1c-4948-92a9-f9d160880ba4" containerName="registry-server" Oct 08 18:24:22 crc kubenswrapper[4750]: I1008 18:24:22.401737 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="03080e5c-9e1c-4948-92a9-f9d160880ba4" containerName="registry-server" Oct 08 18:24:22 crc kubenswrapper[4750]: E1008 18:24:22.401747 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03080e5c-9e1c-4948-92a9-f9d160880ba4" containerName="extract-utilities" Oct 08 
18:24:22 crc kubenswrapper[4750]: I1008 18:24:22.401753 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="03080e5c-9e1c-4948-92a9-f9d160880ba4" containerName="extract-utilities" Oct 08 18:24:22 crc kubenswrapper[4750]: E1008 18:24:22.401761 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03080e5c-9e1c-4948-92a9-f9d160880ba4" containerName="extract-content" Oct 08 18:24:22 crc kubenswrapper[4750]: I1008 18:24:22.401768 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="03080e5c-9e1c-4948-92a9-f9d160880ba4" containerName="extract-content" Oct 08 18:24:22 crc kubenswrapper[4750]: I1008 18:24:22.401890 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="03080e5c-9e1c-4948-92a9-f9d160880ba4" containerName="registry-server" Oct 08 18:24:22 crc kubenswrapper[4750]: I1008 18:24:22.402600 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fgnf7" Oct 08 18:24:22 crc kubenswrapper[4750]: I1008 18:24:22.414921 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fgnf7"] Oct 08 18:24:22 crc kubenswrapper[4750]: I1008 18:24:22.474258 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b97c98a-f56a-4654-a44b-f7c1ba5f006c-catalog-content\") pod \"redhat-marketplace-fgnf7\" (UID: \"6b97c98a-f56a-4654-a44b-f7c1ba5f006c\") " pod="openshift-marketplace/redhat-marketplace-fgnf7" Oct 08 18:24:22 crc kubenswrapper[4750]: I1008 18:24:22.474299 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b97c98a-f56a-4654-a44b-f7c1ba5f006c-utilities\") pod \"redhat-marketplace-fgnf7\" (UID: \"6b97c98a-f56a-4654-a44b-f7c1ba5f006c\") " pod="openshift-marketplace/redhat-marketplace-fgnf7" Oct 08 18:24:22 crc 
kubenswrapper[4750]: I1008 18:24:22.474321 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4t6d\" (UniqueName: \"kubernetes.io/projected/6b97c98a-f56a-4654-a44b-f7c1ba5f006c-kube-api-access-b4t6d\") pod \"redhat-marketplace-fgnf7\" (UID: \"6b97c98a-f56a-4654-a44b-f7c1ba5f006c\") " pod="openshift-marketplace/redhat-marketplace-fgnf7" Oct 08 18:24:22 crc kubenswrapper[4750]: I1008 18:24:22.576107 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b97c98a-f56a-4654-a44b-f7c1ba5f006c-catalog-content\") pod \"redhat-marketplace-fgnf7\" (UID: \"6b97c98a-f56a-4654-a44b-f7c1ba5f006c\") " pod="openshift-marketplace/redhat-marketplace-fgnf7" Oct 08 18:24:22 crc kubenswrapper[4750]: I1008 18:24:22.576162 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b97c98a-f56a-4654-a44b-f7c1ba5f006c-utilities\") pod \"redhat-marketplace-fgnf7\" (UID: \"6b97c98a-f56a-4654-a44b-f7c1ba5f006c\") " pod="openshift-marketplace/redhat-marketplace-fgnf7" Oct 08 18:24:22 crc kubenswrapper[4750]: I1008 18:24:22.576191 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4t6d\" (UniqueName: \"kubernetes.io/projected/6b97c98a-f56a-4654-a44b-f7c1ba5f006c-kube-api-access-b4t6d\") pod \"redhat-marketplace-fgnf7\" (UID: \"6b97c98a-f56a-4654-a44b-f7c1ba5f006c\") " pod="openshift-marketplace/redhat-marketplace-fgnf7" Oct 08 18:24:22 crc kubenswrapper[4750]: I1008 18:24:22.576693 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b97c98a-f56a-4654-a44b-f7c1ba5f006c-catalog-content\") pod \"redhat-marketplace-fgnf7\" (UID: \"6b97c98a-f56a-4654-a44b-f7c1ba5f006c\") " pod="openshift-marketplace/redhat-marketplace-fgnf7" Oct 08 18:24:22 crc 
kubenswrapper[4750]: I1008 18:24:22.576751 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b97c98a-f56a-4654-a44b-f7c1ba5f006c-utilities\") pod \"redhat-marketplace-fgnf7\" (UID: \"6b97c98a-f56a-4654-a44b-f7c1ba5f006c\") " pod="openshift-marketplace/redhat-marketplace-fgnf7" Oct 08 18:24:22 crc kubenswrapper[4750]: I1008 18:24:22.594134 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4t6d\" (UniqueName: \"kubernetes.io/projected/6b97c98a-f56a-4654-a44b-f7c1ba5f006c-kube-api-access-b4t6d\") pod \"redhat-marketplace-fgnf7\" (UID: \"6b97c98a-f56a-4654-a44b-f7c1ba5f006c\") " pod="openshift-marketplace/redhat-marketplace-fgnf7" Oct 08 18:24:22 crc kubenswrapper[4750]: I1008 18:24:22.716362 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fgnf7" Oct 08 18:24:22 crc kubenswrapper[4750]: I1008 18:24:22.752490 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03080e5c-9e1c-4948-92a9-f9d160880ba4" path="/var/lib/kubelet/pods/03080e5c-9e1c-4948-92a9-f9d160880ba4/volumes" Oct 08 18:24:23 crc kubenswrapper[4750]: I1008 18:24:23.135281 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fgnf7"] Oct 08 18:24:23 crc kubenswrapper[4750]: I1008 18:24:23.156464 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fgnf7" event={"ID":"6b97c98a-f56a-4654-a44b-f7c1ba5f006c","Type":"ContainerStarted","Data":"1b70c5c86f27dc1dc3e736254f7773e95cc96faee598390ce556fd8592779a16"} Oct 08 18:24:24 crc kubenswrapper[4750]: I1008 18:24:24.163070 4750 generic.go:334] "Generic (PLEG): container finished" podID="6b97c98a-f56a-4654-a44b-f7c1ba5f006c" containerID="740c9cc85a6b58d95eb5c828dafce7fe2e0f7e18bb2360611d265ad50efef345" exitCode=0 Oct 08 18:24:24 crc kubenswrapper[4750]: I1008 
18:24:24.163122 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fgnf7" event={"ID":"6b97c98a-f56a-4654-a44b-f7c1ba5f006c","Type":"ContainerDied","Data":"740c9cc85a6b58d95eb5c828dafce7fe2e0f7e18bb2360611d265ad50efef345"} Oct 08 18:24:26 crc kubenswrapper[4750]: I1008 18:24:26.192341 4750 generic.go:334] "Generic (PLEG): container finished" podID="6b97c98a-f56a-4654-a44b-f7c1ba5f006c" containerID="8bcacb09927e3c74974f42e188d66c054b2c362087969282739b8c94f1cbcc62" exitCode=0 Oct 08 18:24:26 crc kubenswrapper[4750]: I1008 18:24:26.193169 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fgnf7" event={"ID":"6b97c98a-f56a-4654-a44b-f7c1ba5f006c","Type":"ContainerDied","Data":"8bcacb09927e3c74974f42e188d66c054b2c362087969282739b8c94f1cbcc62"} Oct 08 18:24:26 crc kubenswrapper[4750]: I1008 18:24:26.741865 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zdmgh" Oct 08 18:24:26 crc kubenswrapper[4750]: I1008 18:24:26.741905 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zdmgh" Oct 08 18:24:26 crc kubenswrapper[4750]: I1008 18:24:26.769753 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zdmgh" Oct 08 18:24:27 crc kubenswrapper[4750]: I1008 18:24:27.201239 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fgnf7" event={"ID":"6b97c98a-f56a-4654-a44b-f7c1ba5f006c","Type":"ContainerStarted","Data":"17a41ebae7c1086dca72c7b0736b5648052df1242b75ed74019b74252bfcfe83"} Oct 08 18:24:27 crc kubenswrapper[4750]: I1008 18:24:27.221423 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fgnf7" podStartSLOduration=2.5601967759999997 
podStartE2EDuration="5.221403927s" podCreationTimestamp="2025-10-08 18:24:22 +0000 UTC" firstStartedPulling="2025-10-08 18:24:24.164945167 +0000 UTC m=+820.077916180" lastFinishedPulling="2025-10-08 18:24:26.826152328 +0000 UTC m=+822.739123331" observedRunningTime="2025-10-08 18:24:27.21723592 +0000 UTC m=+823.130206933" watchObservedRunningTime="2025-10-08 18:24:27.221403927 +0000 UTC m=+823.134374950" Oct 08 18:24:27 crc kubenswrapper[4750]: I1008 18:24:27.250241 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zdmgh" Oct 08 18:24:29 crc kubenswrapper[4750]: I1008 18:24:29.201904 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zdmgh"] Oct 08 18:24:29 crc kubenswrapper[4750]: I1008 18:24:29.210797 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zdmgh" podUID="307381ad-744a-427b-a1b7-2b57d79c1598" containerName="registry-server" containerID="cri-o://fdaaa2a75b64cf4704ffde2ece7d076894c285380ef6f306238ac17d2f3d958a" gracePeriod=2 Oct 08 18:24:29 crc kubenswrapper[4750]: I1008 18:24:29.553166 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zdmgh" Oct 08 18:24:29 crc kubenswrapper[4750]: I1008 18:24:29.659488 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlprz\" (UniqueName: \"kubernetes.io/projected/307381ad-744a-427b-a1b7-2b57d79c1598-kube-api-access-mlprz\") pod \"307381ad-744a-427b-a1b7-2b57d79c1598\" (UID: \"307381ad-744a-427b-a1b7-2b57d79c1598\") " Oct 08 18:24:29 crc kubenswrapper[4750]: I1008 18:24:29.659578 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/307381ad-744a-427b-a1b7-2b57d79c1598-utilities\") pod \"307381ad-744a-427b-a1b7-2b57d79c1598\" (UID: \"307381ad-744a-427b-a1b7-2b57d79c1598\") " Oct 08 18:24:29 crc kubenswrapper[4750]: I1008 18:24:29.659601 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/307381ad-744a-427b-a1b7-2b57d79c1598-catalog-content\") pod \"307381ad-744a-427b-a1b7-2b57d79c1598\" (UID: \"307381ad-744a-427b-a1b7-2b57d79c1598\") " Oct 08 18:24:29 crc kubenswrapper[4750]: I1008 18:24:29.660244 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/307381ad-744a-427b-a1b7-2b57d79c1598-utilities" (OuterVolumeSpecName: "utilities") pod "307381ad-744a-427b-a1b7-2b57d79c1598" (UID: "307381ad-744a-427b-a1b7-2b57d79c1598"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:24:29 crc kubenswrapper[4750]: I1008 18:24:29.667802 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/307381ad-744a-427b-a1b7-2b57d79c1598-kube-api-access-mlprz" (OuterVolumeSpecName: "kube-api-access-mlprz") pod "307381ad-744a-427b-a1b7-2b57d79c1598" (UID: "307381ad-744a-427b-a1b7-2b57d79c1598"). InnerVolumeSpecName "kube-api-access-mlprz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:24:29 crc kubenswrapper[4750]: I1008 18:24:29.706279 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/307381ad-744a-427b-a1b7-2b57d79c1598-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "307381ad-744a-427b-a1b7-2b57d79c1598" (UID: "307381ad-744a-427b-a1b7-2b57d79c1598"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:24:29 crc kubenswrapper[4750]: I1008 18:24:29.761691 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlprz\" (UniqueName: \"kubernetes.io/projected/307381ad-744a-427b-a1b7-2b57d79c1598-kube-api-access-mlprz\") on node \"crc\" DevicePath \"\"" Oct 08 18:24:29 crc kubenswrapper[4750]: I1008 18:24:29.761729 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/307381ad-744a-427b-a1b7-2b57d79c1598-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 18:24:29 crc kubenswrapper[4750]: I1008 18:24:29.761745 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/307381ad-744a-427b-a1b7-2b57d79c1598-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 18:24:30 crc kubenswrapper[4750]: I1008 18:24:30.217902 4750 generic.go:334] "Generic (PLEG): container finished" podID="307381ad-744a-427b-a1b7-2b57d79c1598" containerID="fdaaa2a75b64cf4704ffde2ece7d076894c285380ef6f306238ac17d2f3d958a" exitCode=0 Oct 08 18:24:30 crc kubenswrapper[4750]: I1008 18:24:30.217952 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zdmgh" event={"ID":"307381ad-744a-427b-a1b7-2b57d79c1598","Type":"ContainerDied","Data":"fdaaa2a75b64cf4704ffde2ece7d076894c285380ef6f306238ac17d2f3d958a"} Oct 08 18:24:30 crc kubenswrapper[4750]: I1008 18:24:30.217980 4750 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zdmgh" Oct 08 18:24:30 crc kubenswrapper[4750]: I1008 18:24:30.217984 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zdmgh" event={"ID":"307381ad-744a-427b-a1b7-2b57d79c1598","Type":"ContainerDied","Data":"5108d94b6079a933e330ca741b3e3fc594d9c832be7427ae26a791a1871e9df4"} Oct 08 18:24:30 crc kubenswrapper[4750]: I1008 18:24:30.218021 4750 scope.go:117] "RemoveContainer" containerID="fdaaa2a75b64cf4704ffde2ece7d076894c285380ef6f306238ac17d2f3d958a" Oct 08 18:24:30 crc kubenswrapper[4750]: I1008 18:24:30.230734 4750 scope.go:117] "RemoveContainer" containerID="62fa3545dd1a60e8699694a6b546e4bca3f96d27fd84630ef2e206e220178f5c" Oct 08 18:24:30 crc kubenswrapper[4750]: I1008 18:24:30.242468 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zdmgh"] Oct 08 18:24:30 crc kubenswrapper[4750]: I1008 18:24:30.250687 4750 scope.go:117] "RemoveContainer" containerID="27eddfbc9e3274c74919d0024b5fcb12a2786e5740eca5743b90d9570150fd39" Oct 08 18:24:30 crc kubenswrapper[4750]: I1008 18:24:30.257169 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zdmgh"] Oct 08 18:24:30 crc kubenswrapper[4750]: I1008 18:24:30.271976 4750 scope.go:117] "RemoveContainer" containerID="fdaaa2a75b64cf4704ffde2ece7d076894c285380ef6f306238ac17d2f3d958a" Oct 08 18:24:30 crc kubenswrapper[4750]: E1008 18:24:30.272538 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdaaa2a75b64cf4704ffde2ece7d076894c285380ef6f306238ac17d2f3d958a\": container with ID starting with fdaaa2a75b64cf4704ffde2ece7d076894c285380ef6f306238ac17d2f3d958a not found: ID does not exist" containerID="fdaaa2a75b64cf4704ffde2ece7d076894c285380ef6f306238ac17d2f3d958a" Oct 08 18:24:30 crc kubenswrapper[4750]: I1008 18:24:30.272592 
4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdaaa2a75b64cf4704ffde2ece7d076894c285380ef6f306238ac17d2f3d958a"} err="failed to get container status \"fdaaa2a75b64cf4704ffde2ece7d076894c285380ef6f306238ac17d2f3d958a\": rpc error: code = NotFound desc = could not find container \"fdaaa2a75b64cf4704ffde2ece7d076894c285380ef6f306238ac17d2f3d958a\": container with ID starting with fdaaa2a75b64cf4704ffde2ece7d076894c285380ef6f306238ac17d2f3d958a not found: ID does not exist" Oct 08 18:24:30 crc kubenswrapper[4750]: I1008 18:24:30.272633 4750 scope.go:117] "RemoveContainer" containerID="62fa3545dd1a60e8699694a6b546e4bca3f96d27fd84630ef2e206e220178f5c" Oct 08 18:24:30 crc kubenswrapper[4750]: E1008 18:24:30.272915 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62fa3545dd1a60e8699694a6b546e4bca3f96d27fd84630ef2e206e220178f5c\": container with ID starting with 62fa3545dd1a60e8699694a6b546e4bca3f96d27fd84630ef2e206e220178f5c not found: ID does not exist" containerID="62fa3545dd1a60e8699694a6b546e4bca3f96d27fd84630ef2e206e220178f5c" Oct 08 18:24:30 crc kubenswrapper[4750]: I1008 18:24:30.272933 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62fa3545dd1a60e8699694a6b546e4bca3f96d27fd84630ef2e206e220178f5c"} err="failed to get container status \"62fa3545dd1a60e8699694a6b546e4bca3f96d27fd84630ef2e206e220178f5c\": rpc error: code = NotFound desc = could not find container \"62fa3545dd1a60e8699694a6b546e4bca3f96d27fd84630ef2e206e220178f5c\": container with ID starting with 62fa3545dd1a60e8699694a6b546e4bca3f96d27fd84630ef2e206e220178f5c not found: ID does not exist" Oct 08 18:24:30 crc kubenswrapper[4750]: I1008 18:24:30.272948 4750 scope.go:117] "RemoveContainer" containerID="27eddfbc9e3274c74919d0024b5fcb12a2786e5740eca5743b90d9570150fd39" Oct 08 18:24:30 crc kubenswrapper[4750]: E1008 
18:24:30.273334 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27eddfbc9e3274c74919d0024b5fcb12a2786e5740eca5743b90d9570150fd39\": container with ID starting with 27eddfbc9e3274c74919d0024b5fcb12a2786e5740eca5743b90d9570150fd39 not found: ID does not exist" containerID="27eddfbc9e3274c74919d0024b5fcb12a2786e5740eca5743b90d9570150fd39" Oct 08 18:24:30 crc kubenswrapper[4750]: I1008 18:24:30.273382 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27eddfbc9e3274c74919d0024b5fcb12a2786e5740eca5743b90d9570150fd39"} err="failed to get container status \"27eddfbc9e3274c74919d0024b5fcb12a2786e5740eca5743b90d9570150fd39\": rpc error: code = NotFound desc = could not find container \"27eddfbc9e3274c74919d0024b5fcb12a2786e5740eca5743b90d9570150fd39\": container with ID starting with 27eddfbc9e3274c74919d0024b5fcb12a2786e5740eca5743b90d9570150fd39 not found: ID does not exist" Oct 08 18:24:30 crc kubenswrapper[4750]: I1008 18:24:30.746872 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="307381ad-744a-427b-a1b7-2b57d79c1598" path="/var/lib/kubelet/pods/307381ad-744a-427b-a1b7-2b57d79c1598/volumes" Oct 08 18:24:32 crc kubenswrapper[4750]: I1008 18:24:32.717235 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fgnf7" Oct 08 18:24:32 crc kubenswrapper[4750]: I1008 18:24:32.717299 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fgnf7" Oct 08 18:24:32 crc kubenswrapper[4750]: I1008 18:24:32.753414 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fgnf7" Oct 08 18:24:33 crc kubenswrapper[4750]: I1008 18:24:33.125087 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/metallb-operator-controller-manager-7767688d85-nqdwk" Oct 08 18:24:33 crc kubenswrapper[4750]: I1008 18:24:33.292211 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fgnf7" Oct 08 18:24:33 crc kubenswrapper[4750]: I1008 18:24:33.945708 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-4z9bv"] Oct 08 18:24:33 crc kubenswrapper[4750]: E1008 18:24:33.945925 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="307381ad-744a-427b-a1b7-2b57d79c1598" containerName="extract-content" Oct 08 18:24:33 crc kubenswrapper[4750]: I1008 18:24:33.945940 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="307381ad-744a-427b-a1b7-2b57d79c1598" containerName="extract-content" Oct 08 18:24:33 crc kubenswrapper[4750]: E1008 18:24:33.945955 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="307381ad-744a-427b-a1b7-2b57d79c1598" containerName="registry-server" Oct 08 18:24:33 crc kubenswrapper[4750]: I1008 18:24:33.945962 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="307381ad-744a-427b-a1b7-2b57d79c1598" containerName="registry-server" Oct 08 18:24:33 crc kubenswrapper[4750]: E1008 18:24:33.945969 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="307381ad-744a-427b-a1b7-2b57d79c1598" containerName="extract-utilities" Oct 08 18:24:33 crc kubenswrapper[4750]: I1008 18:24:33.945975 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="307381ad-744a-427b-a1b7-2b57d79c1598" containerName="extract-utilities" Oct 08 18:24:33 crc kubenswrapper[4750]: I1008 18:24:33.946087 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="307381ad-744a-427b-a1b7-2b57d79c1598" containerName="registry-server" Oct 08 18:24:33 crc kubenswrapper[4750]: I1008 18:24:33.948403 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-4z9bv" Oct 08 18:24:33 crc kubenswrapper[4750]: I1008 18:24:33.950505 4750 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-4mwzc" Oct 08 18:24:33 crc kubenswrapper[4750]: I1008 18:24:33.951654 4750 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 08 18:24:33 crc kubenswrapper[4750]: I1008 18:24:33.955041 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-xwpph"] Oct 08 18:24:33 crc kubenswrapper[4750]: I1008 18:24:33.955899 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-xwpph" Oct 08 18:24:33 crc kubenswrapper[4750]: I1008 18:24:33.955915 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 08 18:24:33 crc kubenswrapper[4750]: I1008 18:24:33.957499 4750 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 08 18:24:33 crc kubenswrapper[4750]: I1008 18:24:33.965545 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-xwpph"] Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.006335 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f040b00-d031-47f3-be72-8bdece6ddf78-cert\") pod \"frr-k8s-webhook-server-64bf5d555-xwpph\" (UID: \"3f040b00-d031-47f3-be72-8bdece6ddf78\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-xwpph" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.006374 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1-frr-conf\") pod 
\"frr-k8s-4z9bv\" (UID: \"c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1\") " pod="metallb-system/frr-k8s-4z9bv" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.006418 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1-frr-sockets\") pod \"frr-k8s-4z9bv\" (UID: \"c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1\") " pod="metallb-system/frr-k8s-4z9bv" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.006482 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1-metrics\") pod \"frr-k8s-4z9bv\" (UID: \"c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1\") " pod="metallb-system/frr-k8s-4z9bv" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.006511 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvzc9\" (UniqueName: \"kubernetes.io/projected/3f040b00-d031-47f3-be72-8bdece6ddf78-kube-api-access-dvzc9\") pod \"frr-k8s-webhook-server-64bf5d555-xwpph\" (UID: \"3f040b00-d031-47f3-be72-8bdece6ddf78\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-xwpph" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.006541 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1-frr-startup\") pod \"frr-k8s-4z9bv\" (UID: \"c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1\") " pod="metallb-system/frr-k8s-4z9bv" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.006662 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1-metrics-certs\") pod \"frr-k8s-4z9bv\" (UID: 
\"c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1\") " pod="metallb-system/frr-k8s-4z9bv" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.006737 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1-reloader\") pod \"frr-k8s-4z9bv\" (UID: \"c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1\") " pod="metallb-system/frr-k8s-4z9bv" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.006798 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7mbw\" (UniqueName: \"kubernetes.io/projected/c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1-kube-api-access-v7mbw\") pod \"frr-k8s-4z9bv\" (UID: \"c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1\") " pod="metallb-system/frr-k8s-4z9bv" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.043633 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-sqjmd"] Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.044669 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-sqjmd" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.066485 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.067822 4750 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-h2hlh" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.068238 4750 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.068284 4750 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.108470 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/85a087dd-81ec-4b22-bed2-e64d25106913-memberlist\") pod \"speaker-sqjmd\" (UID: \"85a087dd-81ec-4b22-bed2-e64d25106913\") " pod="metallb-system/speaker-sqjmd" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.108543 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1-metrics-certs\") pod \"frr-k8s-4z9bv\" (UID: \"c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1\") " pod="metallb-system/frr-k8s-4z9bv" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.108593 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85a087dd-81ec-4b22-bed2-e64d25106913-metrics-certs\") pod \"speaker-sqjmd\" (UID: \"85a087dd-81ec-4b22-bed2-e64d25106913\") " pod="metallb-system/speaker-sqjmd" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.108620 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"reloader\" (UniqueName: \"kubernetes.io/empty-dir/c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1-reloader\") pod \"frr-k8s-4z9bv\" (UID: \"c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1\") " pod="metallb-system/frr-k8s-4z9bv" Oct 08 18:24:34 crc kubenswrapper[4750]: E1008 18:24:34.108653 4750 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Oct 08 18:24:34 crc kubenswrapper[4750]: E1008 18:24:34.108701 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1-metrics-certs podName:c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1 nodeName:}" failed. No retries permitted until 2025-10-08 18:24:34.608684219 +0000 UTC m=+830.521655232 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1-metrics-certs") pod "frr-k8s-4z9bv" (UID: "c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1") : secret "frr-k8s-certs-secret" not found Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.108658 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7mbw\" (UniqueName: \"kubernetes.io/projected/c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1-kube-api-access-v7mbw\") pod \"frr-k8s-4z9bv\" (UID: \"c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1\") " pod="metallb-system/frr-k8s-4z9bv" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.108843 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f040b00-d031-47f3-be72-8bdece6ddf78-cert\") pod \"frr-k8s-webhook-server-64bf5d555-xwpph\" (UID: \"3f040b00-d031-47f3-be72-8bdece6ddf78\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-xwpph" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.108875 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm54l\" (UniqueName: 
\"kubernetes.io/projected/85a087dd-81ec-4b22-bed2-e64d25106913-kube-api-access-bm54l\") pod \"speaker-sqjmd\" (UID: \"85a087dd-81ec-4b22-bed2-e64d25106913\") " pod="metallb-system/speaker-sqjmd" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.108900 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1-frr-conf\") pod \"frr-k8s-4z9bv\" (UID: \"c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1\") " pod="metallb-system/frr-k8s-4z9bv" Oct 08 18:24:34 crc kubenswrapper[4750]: E1008 18:24:34.108973 4750 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.108991 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1-frr-sockets\") pod \"frr-k8s-4z9bv\" (UID: \"c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1\") " pod="metallb-system/frr-k8s-4z9bv" Oct 08 18:24:34 crc kubenswrapper[4750]: E1008 18:24:34.109027 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f040b00-d031-47f3-be72-8bdece6ddf78-cert podName:3f040b00-d031-47f3-be72-8bdece6ddf78 nodeName:}" failed. No retries permitted until 2025-10-08 18:24:34.609009507 +0000 UTC m=+830.521980520 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3f040b00-d031-47f3-be72-8bdece6ddf78-cert") pod "frr-k8s-webhook-server-64bf5d555-xwpph" (UID: "3f040b00-d031-47f3-be72-8bdece6ddf78") : secret "frr-k8s-webhook-server-cert" not found Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.109042 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1-metrics\") pod \"frr-k8s-4z9bv\" (UID: \"c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1\") " pod="metallb-system/frr-k8s-4z9bv" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.109070 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/85a087dd-81ec-4b22-bed2-e64d25106913-metallb-excludel2\") pod \"speaker-sqjmd\" (UID: \"85a087dd-81ec-4b22-bed2-e64d25106913\") " pod="metallb-system/speaker-sqjmd" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.109083 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1-reloader\") pod \"frr-k8s-4z9bv\" (UID: \"c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1\") " pod="metallb-system/frr-k8s-4z9bv" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.109101 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvzc9\" (UniqueName: \"kubernetes.io/projected/3f040b00-d031-47f3-be72-8bdece6ddf78-kube-api-access-dvzc9\") pod \"frr-k8s-webhook-server-64bf5d555-xwpph\" (UID: \"3f040b00-d031-47f3-be72-8bdece6ddf78\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-xwpph" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.109152 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: 
\"kubernetes.io/configmap/c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1-frr-startup\") pod \"frr-k8s-4z9bv\" (UID: \"c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1\") " pod="metallb-system/frr-k8s-4z9bv" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.109290 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1-frr-sockets\") pod \"frr-k8s-4z9bv\" (UID: \"c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1\") " pod="metallb-system/frr-k8s-4z9bv" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.109416 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1-frr-conf\") pod \"frr-k8s-4z9bv\" (UID: \"c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1\") " pod="metallb-system/frr-k8s-4z9bv" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.109468 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1-metrics\") pod \"frr-k8s-4z9bv\" (UID: \"c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1\") " pod="metallb-system/frr-k8s-4z9bv" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.109998 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1-frr-startup\") pod \"frr-k8s-4z9bv\" (UID: \"c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1\") " pod="metallb-system/frr-k8s-4z9bv" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.112703 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-8dgbs"] Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.113586 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-8dgbs" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.116903 4750 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.132432 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-8dgbs"] Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.144415 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7mbw\" (UniqueName: \"kubernetes.io/projected/c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1-kube-api-access-v7mbw\") pod \"frr-k8s-4z9bv\" (UID: \"c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1\") " pod="metallb-system/frr-k8s-4z9bv" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.155822 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvzc9\" (UniqueName: \"kubernetes.io/projected/3f040b00-d031-47f3-be72-8bdece6ddf78-kube-api-access-dvzc9\") pod \"frr-k8s-webhook-server-64bf5d555-xwpph\" (UID: \"3f040b00-d031-47f3-be72-8bdece6ddf78\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-xwpph" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.210237 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bm54l\" (UniqueName: \"kubernetes.io/projected/85a087dd-81ec-4b22-bed2-e64d25106913-kube-api-access-bm54l\") pod \"speaker-sqjmd\" (UID: \"85a087dd-81ec-4b22-bed2-e64d25106913\") " pod="metallb-system/speaker-sqjmd" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.210296 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/85a087dd-81ec-4b22-bed2-e64d25106913-metallb-excludel2\") pod \"speaker-sqjmd\" (UID: \"85a087dd-81ec-4b22-bed2-e64d25106913\") " pod="metallb-system/speaker-sqjmd" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 
18:24:34.210324 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/85a087dd-81ec-4b22-bed2-e64d25106913-memberlist\") pod \"speaker-sqjmd\" (UID: \"85a087dd-81ec-4b22-bed2-e64d25106913\") " pod="metallb-system/speaker-sqjmd" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.210359 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85a087dd-81ec-4b22-bed2-e64d25106913-metrics-certs\") pod \"speaker-sqjmd\" (UID: \"85a087dd-81ec-4b22-bed2-e64d25106913\") " pod="metallb-system/speaker-sqjmd" Oct 08 18:24:34 crc kubenswrapper[4750]: E1008 18:24:34.210508 4750 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Oct 08 18:24:34 crc kubenswrapper[4750]: E1008 18:24:34.210574 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85a087dd-81ec-4b22-bed2-e64d25106913-metrics-certs podName:85a087dd-81ec-4b22-bed2-e64d25106913 nodeName:}" failed. No retries permitted until 2025-10-08 18:24:34.710542801 +0000 UTC m=+830.623513814 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/85a087dd-81ec-4b22-bed2-e64d25106913-metrics-certs") pod "speaker-sqjmd" (UID: "85a087dd-81ec-4b22-bed2-e64d25106913") : secret "speaker-certs-secret" not found Oct 08 18:24:34 crc kubenswrapper[4750]: E1008 18:24:34.211465 4750 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 08 18:24:34 crc kubenswrapper[4750]: E1008 18:24:34.211505 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85a087dd-81ec-4b22-bed2-e64d25106913-memberlist podName:85a087dd-81ec-4b22-bed2-e64d25106913 nodeName:}" failed. 
No retries permitted until 2025-10-08 18:24:34.711496726 +0000 UTC m=+830.624467739 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/85a087dd-81ec-4b22-bed2-e64d25106913-memberlist") pod "speaker-sqjmd" (UID: "85a087dd-81ec-4b22-bed2-e64d25106913") : secret "metallb-memberlist" not found Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.211423 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/85a087dd-81ec-4b22-bed2-e64d25106913-metallb-excludel2\") pod \"speaker-sqjmd\" (UID: \"85a087dd-81ec-4b22-bed2-e64d25106913\") " pod="metallb-system/speaker-sqjmd" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.213920 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fgnf7"] Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.254889 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm54l\" (UniqueName: \"kubernetes.io/projected/85a087dd-81ec-4b22-bed2-e64d25106913-kube-api-access-bm54l\") pod \"speaker-sqjmd\" (UID: \"85a087dd-81ec-4b22-bed2-e64d25106913\") " pod="metallb-system/speaker-sqjmd" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.311641 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9094b73f-9ae7-436d-85cd-4d3feade13ae-metrics-certs\") pod \"controller-68d546b9d8-8dgbs\" (UID: \"9094b73f-9ae7-436d-85cd-4d3feade13ae\") " pod="metallb-system/controller-68d546b9d8-8dgbs" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.311984 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9094b73f-9ae7-436d-85cd-4d3feade13ae-cert\") pod \"controller-68d546b9d8-8dgbs\" (UID: 
\"9094b73f-9ae7-436d-85cd-4d3feade13ae\") " pod="metallb-system/controller-68d546b9d8-8dgbs" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.312329 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hb4c\" (UniqueName: \"kubernetes.io/projected/9094b73f-9ae7-436d-85cd-4d3feade13ae-kube-api-access-7hb4c\") pod \"controller-68d546b9d8-8dgbs\" (UID: \"9094b73f-9ae7-436d-85cd-4d3feade13ae\") " pod="metallb-system/controller-68d546b9d8-8dgbs" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.413338 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hb4c\" (UniqueName: \"kubernetes.io/projected/9094b73f-9ae7-436d-85cd-4d3feade13ae-kube-api-access-7hb4c\") pod \"controller-68d546b9d8-8dgbs\" (UID: \"9094b73f-9ae7-436d-85cd-4d3feade13ae\") " pod="metallb-system/controller-68d546b9d8-8dgbs" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.413433 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9094b73f-9ae7-436d-85cd-4d3feade13ae-metrics-certs\") pod \"controller-68d546b9d8-8dgbs\" (UID: \"9094b73f-9ae7-436d-85cd-4d3feade13ae\") " pod="metallb-system/controller-68d546b9d8-8dgbs" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.413502 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9094b73f-9ae7-436d-85cd-4d3feade13ae-cert\") pod \"controller-68d546b9d8-8dgbs\" (UID: \"9094b73f-9ae7-436d-85cd-4d3feade13ae\") " pod="metallb-system/controller-68d546b9d8-8dgbs" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.418752 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9094b73f-9ae7-436d-85cd-4d3feade13ae-metrics-certs\") pod \"controller-68d546b9d8-8dgbs\" (UID: 
\"9094b73f-9ae7-436d-85cd-4d3feade13ae\") " pod="metallb-system/controller-68d546b9d8-8dgbs" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.419314 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9094b73f-9ae7-436d-85cd-4d3feade13ae-cert\") pod \"controller-68d546b9d8-8dgbs\" (UID: \"9094b73f-9ae7-436d-85cd-4d3feade13ae\") " pod="metallb-system/controller-68d546b9d8-8dgbs" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.430740 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hb4c\" (UniqueName: \"kubernetes.io/projected/9094b73f-9ae7-436d-85cd-4d3feade13ae-kube-api-access-7hb4c\") pod \"controller-68d546b9d8-8dgbs\" (UID: \"9094b73f-9ae7-436d-85cd-4d3feade13ae\") " pod="metallb-system/controller-68d546b9d8-8dgbs" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.616810 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f040b00-d031-47f3-be72-8bdece6ddf78-cert\") pod \"frr-k8s-webhook-server-64bf5d555-xwpph\" (UID: \"3f040b00-d031-47f3-be72-8bdece6ddf78\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-xwpph" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.616908 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1-metrics-certs\") pod \"frr-k8s-4z9bv\" (UID: \"c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1\") " pod="metallb-system/frr-k8s-4z9bv" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.620403 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1-metrics-certs\") pod \"frr-k8s-4z9bv\" (UID: \"c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1\") " pod="metallb-system/frr-k8s-4z9bv" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 
18:24:34.621123 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f040b00-d031-47f3-be72-8bdece6ddf78-cert\") pod \"frr-k8s-webhook-server-64bf5d555-xwpph\" (UID: \"3f040b00-d031-47f3-be72-8bdece6ddf78\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-xwpph" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.717922 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/85a087dd-81ec-4b22-bed2-e64d25106913-memberlist\") pod \"speaker-sqjmd\" (UID: \"85a087dd-81ec-4b22-bed2-e64d25106913\") " pod="metallb-system/speaker-sqjmd" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.717986 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85a087dd-81ec-4b22-bed2-e64d25106913-metrics-certs\") pod \"speaker-sqjmd\" (UID: \"85a087dd-81ec-4b22-bed2-e64d25106913\") " pod="metallb-system/speaker-sqjmd" Oct 08 18:24:34 crc kubenswrapper[4750]: E1008 18:24:34.718191 4750 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 08 18:24:34 crc kubenswrapper[4750]: E1008 18:24:34.718330 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85a087dd-81ec-4b22-bed2-e64d25106913-memberlist podName:85a087dd-81ec-4b22-bed2-e64d25106913 nodeName:}" failed. No retries permitted until 2025-10-08 18:24:35.718296347 +0000 UTC m=+831.631267370 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/85a087dd-81ec-4b22-bed2-e64d25106913-memberlist") pod "speaker-sqjmd" (UID: "85a087dd-81ec-4b22-bed2-e64d25106913") : secret "metallb-memberlist" not found Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.722379 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85a087dd-81ec-4b22-bed2-e64d25106913-metrics-certs\") pod \"speaker-sqjmd\" (UID: \"85a087dd-81ec-4b22-bed2-e64d25106913\") " pod="metallb-system/speaker-sqjmd" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.728079 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-8dgbs" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.878200 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-4z9bv" Oct 08 18:24:34 crc kubenswrapper[4750]: I1008 18:24:34.886733 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-xwpph" Oct 08 18:24:35 crc kubenswrapper[4750]: I1008 18:24:35.222149 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-8dgbs"] Oct 08 18:24:35 crc kubenswrapper[4750]: W1008 18:24:35.226970 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9094b73f_9ae7_436d_85cd_4d3feade13ae.slice/crio-70716b789a55f6d330c4cd47cbd65cd6018f43e9c27943bb177ccb571fcf1163 WatchSource:0}: Error finding container 70716b789a55f6d330c4cd47cbd65cd6018f43e9c27943bb177ccb571fcf1163: Status 404 returned error can't find the container with id 70716b789a55f6d330c4cd47cbd65cd6018f43e9c27943bb177ccb571fcf1163 Oct 08 18:24:35 crc kubenswrapper[4750]: I1008 18:24:35.252307 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4z9bv" event={"ID":"c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1","Type":"ContainerStarted","Data":"2bdeca2bb65c81f80d5d0b3a85406a6786f1272039be18d48a7b4ded806d65b1"} Oct 08 18:24:35 crc kubenswrapper[4750]: I1008 18:24:35.253849 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fgnf7" podUID="6b97c98a-f56a-4654-a44b-f7c1ba5f006c" containerName="registry-server" containerID="cri-o://17a41ebae7c1086dca72c7b0736b5648052df1242b75ed74019b74252bfcfe83" gracePeriod=2 Oct 08 18:24:35 crc kubenswrapper[4750]: I1008 18:24:35.254256 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-8dgbs" event={"ID":"9094b73f-9ae7-436d-85cd-4d3feade13ae","Type":"ContainerStarted","Data":"70716b789a55f6d330c4cd47cbd65cd6018f43e9c27943bb177ccb571fcf1163"} Oct 08 18:24:35 crc kubenswrapper[4750]: I1008 18:24:35.334863 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-xwpph"] Oct 08 18:24:35 crc 
kubenswrapper[4750]: W1008 18:24:35.372997 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f040b00_d031_47f3_be72_8bdece6ddf78.slice/crio-59eedf0f9fd88cc66870f093c5b0a38b673954378b0ca3d2f6bb0c4c85b72fe6 WatchSource:0}: Error finding container 59eedf0f9fd88cc66870f093c5b0a38b673954378b0ca3d2f6bb0c4c85b72fe6: Status 404 returned error can't find the container with id 59eedf0f9fd88cc66870f093c5b0a38b673954378b0ca3d2f6bb0c4c85b72fe6 Oct 08 18:24:35 crc kubenswrapper[4750]: I1008 18:24:35.761305 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/85a087dd-81ec-4b22-bed2-e64d25106913-memberlist\") pod \"speaker-sqjmd\" (UID: \"85a087dd-81ec-4b22-bed2-e64d25106913\") " pod="metallb-system/speaker-sqjmd" Oct 08 18:24:35 crc kubenswrapper[4750]: I1008 18:24:35.769690 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/85a087dd-81ec-4b22-bed2-e64d25106913-memberlist\") pod \"speaker-sqjmd\" (UID: \"85a087dd-81ec-4b22-bed2-e64d25106913\") " pod="metallb-system/speaker-sqjmd" Oct 08 18:24:35 crc kubenswrapper[4750]: I1008 18:24:35.784488 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fgnf7" Oct 08 18:24:35 crc kubenswrapper[4750]: I1008 18:24:35.858674 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-sqjmd" Oct 08 18:24:35 crc kubenswrapper[4750]: I1008 18:24:35.863069 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b97c98a-f56a-4654-a44b-f7c1ba5f006c-catalog-content\") pod \"6b97c98a-f56a-4654-a44b-f7c1ba5f006c\" (UID: \"6b97c98a-f56a-4654-a44b-f7c1ba5f006c\") " Oct 08 18:24:35 crc kubenswrapper[4750]: I1008 18:24:35.863240 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4t6d\" (UniqueName: \"kubernetes.io/projected/6b97c98a-f56a-4654-a44b-f7c1ba5f006c-kube-api-access-b4t6d\") pod \"6b97c98a-f56a-4654-a44b-f7c1ba5f006c\" (UID: \"6b97c98a-f56a-4654-a44b-f7c1ba5f006c\") " Oct 08 18:24:35 crc kubenswrapper[4750]: I1008 18:24:35.863441 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b97c98a-f56a-4654-a44b-f7c1ba5f006c-utilities\") pod \"6b97c98a-f56a-4654-a44b-f7c1ba5f006c\" (UID: \"6b97c98a-f56a-4654-a44b-f7c1ba5f006c\") " Oct 08 18:24:35 crc kubenswrapper[4750]: I1008 18:24:35.864842 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b97c98a-f56a-4654-a44b-f7c1ba5f006c-utilities" (OuterVolumeSpecName: "utilities") pod "6b97c98a-f56a-4654-a44b-f7c1ba5f006c" (UID: "6b97c98a-f56a-4654-a44b-f7c1ba5f006c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:24:35 crc kubenswrapper[4750]: I1008 18:24:35.867977 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b97c98a-f56a-4654-a44b-f7c1ba5f006c-kube-api-access-b4t6d" (OuterVolumeSpecName: "kube-api-access-b4t6d") pod "6b97c98a-f56a-4654-a44b-f7c1ba5f006c" (UID: "6b97c98a-f56a-4654-a44b-f7c1ba5f006c"). InnerVolumeSpecName "kube-api-access-b4t6d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:24:35 crc kubenswrapper[4750]: I1008 18:24:35.877589 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b97c98a-f56a-4654-a44b-f7c1ba5f006c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6b97c98a-f56a-4654-a44b-f7c1ba5f006c" (UID: "6b97c98a-f56a-4654-a44b-f7c1ba5f006c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:24:35 crc kubenswrapper[4750]: I1008 18:24:35.965165 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4t6d\" (UniqueName: \"kubernetes.io/projected/6b97c98a-f56a-4654-a44b-f7c1ba5f006c-kube-api-access-b4t6d\") on node \"crc\" DevicePath \"\"" Oct 08 18:24:35 crc kubenswrapper[4750]: I1008 18:24:35.965201 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b97c98a-f56a-4654-a44b-f7c1ba5f006c-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 18:24:35 crc kubenswrapper[4750]: I1008 18:24:35.965213 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b97c98a-f56a-4654-a44b-f7c1ba5f006c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 18:24:36 crc kubenswrapper[4750]: I1008 18:24:36.261585 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-sqjmd" event={"ID":"85a087dd-81ec-4b22-bed2-e64d25106913","Type":"ContainerStarted","Data":"a808b3d133da93d6998657fc60a0b87c69fcd62d058ddae6cf98d6db55a31300"} Oct 08 18:24:36 crc kubenswrapper[4750]: I1008 18:24:36.265806 4750 generic.go:334] "Generic (PLEG): container finished" podID="6b97c98a-f56a-4654-a44b-f7c1ba5f006c" containerID="17a41ebae7c1086dca72c7b0736b5648052df1242b75ed74019b74252bfcfe83" exitCode=0 Oct 08 18:24:36 crc kubenswrapper[4750]: I1008 18:24:36.265851 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-fgnf7" event={"ID":"6b97c98a-f56a-4654-a44b-f7c1ba5f006c","Type":"ContainerDied","Data":"17a41ebae7c1086dca72c7b0736b5648052df1242b75ed74019b74252bfcfe83"} Oct 08 18:24:36 crc kubenswrapper[4750]: I1008 18:24:36.265870 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fgnf7" event={"ID":"6b97c98a-f56a-4654-a44b-f7c1ba5f006c","Type":"ContainerDied","Data":"1b70c5c86f27dc1dc3e736254f7773e95cc96faee598390ce556fd8592779a16"} Oct 08 18:24:36 crc kubenswrapper[4750]: I1008 18:24:36.265889 4750 scope.go:117] "RemoveContainer" containerID="17a41ebae7c1086dca72c7b0736b5648052df1242b75ed74019b74252bfcfe83" Oct 08 18:24:36 crc kubenswrapper[4750]: I1008 18:24:36.265992 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fgnf7" Oct 08 18:24:36 crc kubenswrapper[4750]: I1008 18:24:36.274954 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-xwpph" event={"ID":"3f040b00-d031-47f3-be72-8bdece6ddf78","Type":"ContainerStarted","Data":"59eedf0f9fd88cc66870f093c5b0a38b673954378b0ca3d2f6bb0c4c85b72fe6"} Oct 08 18:24:36 crc kubenswrapper[4750]: I1008 18:24:36.278599 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-8dgbs" event={"ID":"9094b73f-9ae7-436d-85cd-4d3feade13ae","Type":"ContainerStarted","Data":"5a1f53f239cab609abbea463ed74ac606d285428ce58f33d69a0ca21cb46ff3d"} Oct 08 18:24:36 crc kubenswrapper[4750]: I1008 18:24:36.278633 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-8dgbs" event={"ID":"9094b73f-9ae7-436d-85cd-4d3feade13ae","Type":"ContainerStarted","Data":"d3a15a5daa50ef0fc21514521e7919669d2f8a361514a325d52dc5aae14cb0ed"} Oct 08 18:24:36 crc kubenswrapper[4750]: I1008 18:24:36.293324 4750 scope.go:117] "RemoveContainer" 
containerID="8bcacb09927e3c74974f42e188d66c054b2c362087969282739b8c94f1cbcc62" Oct 08 18:24:36 crc kubenswrapper[4750]: I1008 18:24:36.299501 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fgnf7"] Oct 08 18:24:36 crc kubenswrapper[4750]: I1008 18:24:36.304289 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fgnf7"] Oct 08 18:24:36 crc kubenswrapper[4750]: I1008 18:24:36.332533 4750 scope.go:117] "RemoveContainer" containerID="740c9cc85a6b58d95eb5c828dafce7fe2e0f7e18bb2360611d265ad50efef345" Oct 08 18:24:36 crc kubenswrapper[4750]: I1008 18:24:36.350822 4750 scope.go:117] "RemoveContainer" containerID="17a41ebae7c1086dca72c7b0736b5648052df1242b75ed74019b74252bfcfe83" Oct 08 18:24:36 crc kubenswrapper[4750]: E1008 18:24:36.354245 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17a41ebae7c1086dca72c7b0736b5648052df1242b75ed74019b74252bfcfe83\": container with ID starting with 17a41ebae7c1086dca72c7b0736b5648052df1242b75ed74019b74252bfcfe83 not found: ID does not exist" containerID="17a41ebae7c1086dca72c7b0736b5648052df1242b75ed74019b74252bfcfe83" Oct 08 18:24:36 crc kubenswrapper[4750]: I1008 18:24:36.354295 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17a41ebae7c1086dca72c7b0736b5648052df1242b75ed74019b74252bfcfe83"} err="failed to get container status \"17a41ebae7c1086dca72c7b0736b5648052df1242b75ed74019b74252bfcfe83\": rpc error: code = NotFound desc = could not find container \"17a41ebae7c1086dca72c7b0736b5648052df1242b75ed74019b74252bfcfe83\": container with ID starting with 17a41ebae7c1086dca72c7b0736b5648052df1242b75ed74019b74252bfcfe83 not found: ID does not exist" Oct 08 18:24:36 crc kubenswrapper[4750]: I1008 18:24:36.354330 4750 scope.go:117] "RemoveContainer" 
containerID="8bcacb09927e3c74974f42e188d66c054b2c362087969282739b8c94f1cbcc62" Oct 08 18:24:36 crc kubenswrapper[4750]: E1008 18:24:36.354854 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bcacb09927e3c74974f42e188d66c054b2c362087969282739b8c94f1cbcc62\": container with ID starting with 8bcacb09927e3c74974f42e188d66c054b2c362087969282739b8c94f1cbcc62 not found: ID does not exist" containerID="8bcacb09927e3c74974f42e188d66c054b2c362087969282739b8c94f1cbcc62" Oct 08 18:24:36 crc kubenswrapper[4750]: I1008 18:24:36.354939 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bcacb09927e3c74974f42e188d66c054b2c362087969282739b8c94f1cbcc62"} err="failed to get container status \"8bcacb09927e3c74974f42e188d66c054b2c362087969282739b8c94f1cbcc62\": rpc error: code = NotFound desc = could not find container \"8bcacb09927e3c74974f42e188d66c054b2c362087969282739b8c94f1cbcc62\": container with ID starting with 8bcacb09927e3c74974f42e188d66c054b2c362087969282739b8c94f1cbcc62 not found: ID does not exist" Oct 08 18:24:36 crc kubenswrapper[4750]: I1008 18:24:36.354980 4750 scope.go:117] "RemoveContainer" containerID="740c9cc85a6b58d95eb5c828dafce7fe2e0f7e18bb2360611d265ad50efef345" Oct 08 18:24:36 crc kubenswrapper[4750]: E1008 18:24:36.355400 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"740c9cc85a6b58d95eb5c828dafce7fe2e0f7e18bb2360611d265ad50efef345\": container with ID starting with 740c9cc85a6b58d95eb5c828dafce7fe2e0f7e18bb2360611d265ad50efef345 not found: ID does not exist" containerID="740c9cc85a6b58d95eb5c828dafce7fe2e0f7e18bb2360611d265ad50efef345" Oct 08 18:24:36 crc kubenswrapper[4750]: I1008 18:24:36.355441 4750 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"740c9cc85a6b58d95eb5c828dafce7fe2e0f7e18bb2360611d265ad50efef345"} err="failed to get container status \"740c9cc85a6b58d95eb5c828dafce7fe2e0f7e18bb2360611d265ad50efef345\": rpc error: code = NotFound desc = could not find container \"740c9cc85a6b58d95eb5c828dafce7fe2e0f7e18bb2360611d265ad50efef345\": container with ID starting with 740c9cc85a6b58d95eb5c828dafce7fe2e0f7e18bb2360611d265ad50efef345 not found: ID does not exist" Oct 08 18:24:36 crc kubenswrapper[4750]: I1008 18:24:36.746694 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b97c98a-f56a-4654-a44b-f7c1ba5f006c" path="/var/lib/kubelet/pods/6b97c98a-f56a-4654-a44b-f7c1ba5f006c/volumes" Oct 08 18:24:37 crc kubenswrapper[4750]: I1008 18:24:37.289907 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-sqjmd" event={"ID":"85a087dd-81ec-4b22-bed2-e64d25106913","Type":"ContainerStarted","Data":"e4ac159552093c20c520c1138e2d7aef0b4b721abab137e3c13bfc1d6d6d1dd6"} Oct 08 18:24:37 crc kubenswrapper[4750]: I1008 18:24:37.290274 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-sqjmd" Oct 08 18:24:37 crc kubenswrapper[4750]: I1008 18:24:37.290293 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-sqjmd" event={"ID":"85a087dd-81ec-4b22-bed2-e64d25106913","Type":"ContainerStarted","Data":"1cc1dca6a86bd9a42e7ac83f6fe89b227027a4420d6936d86fe4f01253efccfe"} Oct 08 18:24:37 crc kubenswrapper[4750]: I1008 18:24:37.292447 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-8dgbs" Oct 08 18:24:37 crc kubenswrapper[4750]: I1008 18:24:37.307111 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-sqjmd" podStartSLOduration=3.3070939839999998 podStartE2EDuration="3.307093984s" podCreationTimestamp="2025-10-08 18:24:34 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:24:37.304304322 +0000 UTC m=+833.217275335" watchObservedRunningTime="2025-10-08 18:24:37.307093984 +0000 UTC m=+833.220064987" Oct 08 18:24:37 crc kubenswrapper[4750]: I1008 18:24:37.307880 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-8dgbs" podStartSLOduration=3.307873005 podStartE2EDuration="3.307873005s" podCreationTimestamp="2025-10-08 18:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:24:36.320929799 +0000 UTC m=+832.233900822" watchObservedRunningTime="2025-10-08 18:24:37.307873005 +0000 UTC m=+833.220844018" Oct 08 18:24:43 crc kubenswrapper[4750]: I1008 18:24:43.352968 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-xwpph" event={"ID":"3f040b00-d031-47f3-be72-8bdece6ddf78","Type":"ContainerStarted","Data":"4719abda79dd39819247a27933b8307b5398d056a36bbeced4c33458050637f3"} Oct 08 18:24:43 crc kubenswrapper[4750]: I1008 18:24:43.353630 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-xwpph" Oct 08 18:24:43 crc kubenswrapper[4750]: I1008 18:24:43.355355 4750 generic.go:334] "Generic (PLEG): container finished" podID="c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1" containerID="55599ed9618c4d0424c387a7e18dec842f9dc02fe19bf5a79b2f0b8a53be4313" exitCode=0 Oct 08 18:24:43 crc kubenswrapper[4750]: I1008 18:24:43.355396 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4z9bv" event={"ID":"c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1","Type":"ContainerDied","Data":"55599ed9618c4d0424c387a7e18dec842f9dc02fe19bf5a79b2f0b8a53be4313"} Oct 08 18:24:43 crc kubenswrapper[4750]: I1008 18:24:43.368576 4750 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-xwpph" podStartSLOduration=3.284658674 podStartE2EDuration="10.368532438s" podCreationTimestamp="2025-10-08 18:24:33 +0000 UTC" firstStartedPulling="2025-10-08 18:24:35.376928449 +0000 UTC m=+831.289899482" lastFinishedPulling="2025-10-08 18:24:42.460802233 +0000 UTC m=+838.373773246" observedRunningTime="2025-10-08 18:24:43.366431565 +0000 UTC m=+839.279402588" watchObservedRunningTime="2025-10-08 18:24:43.368532438 +0000 UTC m=+839.281503451" Oct 08 18:24:44 crc kubenswrapper[4750]: I1008 18:24:44.364888 4750 generic.go:334] "Generic (PLEG): container finished" podID="c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1" containerID="01a6fbac992865d2f4118fd956a5f00c3c281ce226aedfd4308209c8bab82d2c" exitCode=0 Oct 08 18:24:44 crc kubenswrapper[4750]: I1008 18:24:44.364987 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4z9bv" event={"ID":"c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1","Type":"ContainerDied","Data":"01a6fbac992865d2f4118fd956a5f00c3c281ce226aedfd4308209c8bab82d2c"} Oct 08 18:24:44 crc kubenswrapper[4750]: E1008 18:24:44.643440 4750 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc10359b5_cc1f_4ac5_a360_4cf1a6e7bec1.slice/crio-conmon-fa318f3467e5081c26fc0459cf714f7f4fb13b1a5bfe3aa02f710ecc68e25072.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc10359b5_cc1f_4ac5_a360_4cf1a6e7bec1.slice/crio-fa318f3467e5081c26fc0459cf714f7f4fb13b1a5bfe3aa02f710ecc68e25072.scope\": RecentStats: unable to find data in memory cache]" Oct 08 18:24:45 crc kubenswrapper[4750]: I1008 18:24:45.378628 4750 generic.go:334] "Generic (PLEG): container finished" podID="c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1" 
containerID="fa318f3467e5081c26fc0459cf714f7f4fb13b1a5bfe3aa02f710ecc68e25072" exitCode=0 Oct 08 18:24:45 crc kubenswrapper[4750]: I1008 18:24:45.378686 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4z9bv" event={"ID":"c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1","Type":"ContainerDied","Data":"fa318f3467e5081c26fc0459cf714f7f4fb13b1a5bfe3aa02f710ecc68e25072"} Oct 08 18:24:46 crc kubenswrapper[4750]: I1008 18:24:46.392825 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4z9bv" event={"ID":"c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1","Type":"ContainerStarted","Data":"88d1ee3421b8dd7fec0de17758a197fc631a4b856ce42fd3e5a8b36736dcb6dd"} Oct 08 18:24:46 crc kubenswrapper[4750]: I1008 18:24:46.394494 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4z9bv" event={"ID":"c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1","Type":"ContainerStarted","Data":"92ac99797f9c16841238d4eac1e3082c5228be42babc0eaa21da9a8c35ee052b"} Oct 08 18:24:46 crc kubenswrapper[4750]: I1008 18:24:46.394643 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4z9bv" event={"ID":"c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1","Type":"ContainerStarted","Data":"38b4de4f0388b413fd4ae024684835922f6f6d686815019b1c58aa36b656c45c"} Oct 08 18:24:46 crc kubenswrapper[4750]: I1008 18:24:46.394744 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4z9bv" event={"ID":"c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1","Type":"ContainerStarted","Data":"859932973c26667d2c8f724c969e0e08500d7559c96c40b6decce8126609f24b"} Oct 08 18:24:46 crc kubenswrapper[4750]: I1008 18:24:46.394849 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4z9bv" event={"ID":"c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1","Type":"ContainerStarted","Data":"da4aca5d0f62b863e436fa815cd065ea9737f829682d0789af0890a9a052be52"} Oct 08 18:24:47 crc kubenswrapper[4750]: I1008 18:24:47.412722 4750 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4z9bv" event={"ID":"c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1","Type":"ContainerStarted","Data":"306ebe10c15d67654e3d004a0cc22de8598956ba6b52acdad4b4be9f086b7151"} Oct 08 18:24:47 crc kubenswrapper[4750]: I1008 18:24:47.413137 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-4z9bv" Oct 08 18:24:47 crc kubenswrapper[4750]: I1008 18:24:47.445337 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-4z9bv" podStartSLOduration=7.092830084 podStartE2EDuration="14.445291784s" podCreationTimestamp="2025-10-08 18:24:33 +0000 UTC" firstStartedPulling="2025-10-08 18:24:35.088676546 +0000 UTC m=+831.001647569" lastFinishedPulling="2025-10-08 18:24:42.441138256 +0000 UTC m=+838.354109269" observedRunningTime="2025-10-08 18:24:47.440988033 +0000 UTC m=+843.353959066" watchObservedRunningTime="2025-10-08 18:24:47.445291784 +0000 UTC m=+843.358262797" Oct 08 18:24:49 crc kubenswrapper[4750]: I1008 18:24:49.879288 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-4z9bv" Oct 08 18:24:49 crc kubenswrapper[4750]: I1008 18:24:49.918380 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-4z9bv" Oct 08 18:24:54 crc kubenswrapper[4750]: I1008 18:24:54.732176 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-8dgbs" Oct 08 18:24:54 crc kubenswrapper[4750]: I1008 18:24:54.893930 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-xwpph" Oct 08 18:24:55 crc kubenswrapper[4750]: I1008 18:24:55.865576 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-sqjmd" Oct 08 18:24:57 crc kubenswrapper[4750]: I1008 18:24:57.588159 4750 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69qhhgw"] Oct 08 18:24:57 crc kubenswrapper[4750]: E1008 18:24:57.588433 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b97c98a-f56a-4654-a44b-f7c1ba5f006c" containerName="extract-content" Oct 08 18:24:57 crc kubenswrapper[4750]: I1008 18:24:57.588448 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b97c98a-f56a-4654-a44b-f7c1ba5f006c" containerName="extract-content" Oct 08 18:24:57 crc kubenswrapper[4750]: E1008 18:24:57.588481 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b97c98a-f56a-4654-a44b-f7c1ba5f006c" containerName="registry-server" Oct 08 18:24:57 crc kubenswrapper[4750]: I1008 18:24:57.588489 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b97c98a-f56a-4654-a44b-f7c1ba5f006c" containerName="registry-server" Oct 08 18:24:57 crc kubenswrapper[4750]: E1008 18:24:57.588498 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b97c98a-f56a-4654-a44b-f7c1ba5f006c" containerName="extract-utilities" Oct 08 18:24:57 crc kubenswrapper[4750]: I1008 18:24:57.588507 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b97c98a-f56a-4654-a44b-f7c1ba5f006c" containerName="extract-utilities" Oct 08 18:24:57 crc kubenswrapper[4750]: I1008 18:24:57.588647 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b97c98a-f56a-4654-a44b-f7c1ba5f006c" containerName="registry-server" Oct 08 18:24:57 crc kubenswrapper[4750]: I1008 18:24:57.589635 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69qhhgw" Oct 08 18:24:57 crc kubenswrapper[4750]: I1008 18:24:57.595126 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 08 18:24:57 crc kubenswrapper[4750]: I1008 18:24:57.606189 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69qhhgw"] Oct 08 18:24:57 crc kubenswrapper[4750]: I1008 18:24:57.766656 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmj9t\" (UniqueName: \"kubernetes.io/projected/3911a0af-c9a8-477b-b9d1-77fac1ed4441-kube-api-access-hmj9t\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69qhhgw\" (UID: \"3911a0af-c9a8-477b-b9d1-77fac1ed4441\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69qhhgw" Oct 08 18:24:57 crc kubenswrapper[4750]: I1008 18:24:57.766743 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3911a0af-c9a8-477b-b9d1-77fac1ed4441-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69qhhgw\" (UID: \"3911a0af-c9a8-477b-b9d1-77fac1ed4441\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69qhhgw" Oct 08 18:24:57 crc kubenswrapper[4750]: I1008 18:24:57.766771 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3911a0af-c9a8-477b-b9d1-77fac1ed4441-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69qhhgw\" (UID: \"3911a0af-c9a8-477b-b9d1-77fac1ed4441\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69qhhgw" Oct 08 18:24:57 crc kubenswrapper[4750]: 
I1008 18:24:57.868246 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmj9t\" (UniqueName: \"kubernetes.io/projected/3911a0af-c9a8-477b-b9d1-77fac1ed4441-kube-api-access-hmj9t\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69qhhgw\" (UID: \"3911a0af-c9a8-477b-b9d1-77fac1ed4441\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69qhhgw" Oct 08 18:24:57 crc kubenswrapper[4750]: I1008 18:24:57.868314 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3911a0af-c9a8-477b-b9d1-77fac1ed4441-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69qhhgw\" (UID: \"3911a0af-c9a8-477b-b9d1-77fac1ed4441\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69qhhgw" Oct 08 18:24:57 crc kubenswrapper[4750]: I1008 18:24:57.868336 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3911a0af-c9a8-477b-b9d1-77fac1ed4441-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69qhhgw\" (UID: \"3911a0af-c9a8-477b-b9d1-77fac1ed4441\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69qhhgw" Oct 08 18:24:57 crc kubenswrapper[4750]: I1008 18:24:57.868785 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3911a0af-c9a8-477b-b9d1-77fac1ed4441-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69qhhgw\" (UID: \"3911a0af-c9a8-477b-b9d1-77fac1ed4441\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69qhhgw" Oct 08 18:24:57 crc kubenswrapper[4750]: I1008 18:24:57.868866 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/3911a0af-c9a8-477b-b9d1-77fac1ed4441-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69qhhgw\" (UID: \"3911a0af-c9a8-477b-b9d1-77fac1ed4441\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69qhhgw" Oct 08 18:24:57 crc kubenswrapper[4750]: I1008 18:24:57.887050 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmj9t\" (UniqueName: \"kubernetes.io/projected/3911a0af-c9a8-477b-b9d1-77fac1ed4441-kube-api-access-hmj9t\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69qhhgw\" (UID: \"3911a0af-c9a8-477b-b9d1-77fac1ed4441\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69qhhgw" Oct 08 18:24:57 crc kubenswrapper[4750]: I1008 18:24:57.906493 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69qhhgw" Oct 08 18:24:58 crc kubenswrapper[4750]: I1008 18:24:58.350840 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69qhhgw"] Oct 08 18:24:58 crc kubenswrapper[4750]: I1008 18:24:58.476918 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69qhhgw" event={"ID":"3911a0af-c9a8-477b-b9d1-77fac1ed4441","Type":"ContainerStarted","Data":"9e40d2271c9eae7acf44dbb3d98b9876d7b063fceb3d4b7397acf697226e4504"} Oct 08 18:24:59 crc kubenswrapper[4750]: I1008 18:24:59.484148 4750 generic.go:334] "Generic (PLEG): container finished" podID="3911a0af-c9a8-477b-b9d1-77fac1ed4441" containerID="36968bd683e9fe44a1cee5b9573b90ce8aabdca9850cec95abe745eb5efb3a3c" exitCode=0 Oct 08 18:24:59 crc kubenswrapper[4750]: I1008 18:24:59.484218 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69qhhgw" event={"ID":"3911a0af-c9a8-477b-b9d1-77fac1ed4441","Type":"ContainerDied","Data":"36968bd683e9fe44a1cee5b9573b90ce8aabdca9850cec95abe745eb5efb3a3c"} Oct 08 18:25:04 crc kubenswrapper[4750]: I1008 18:25:04.881186 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-4z9bv" Oct 08 18:25:04 crc kubenswrapper[4750]: E1008 18:25:04.947396 4750 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/systemd-tmpfiles-clean.service\": RecentStats: unable to find data in memory cache]" Oct 08 18:25:06 crc kubenswrapper[4750]: I1008 18:25:06.524899 4750 generic.go:334] "Generic (PLEG): container finished" podID="3911a0af-c9a8-477b-b9d1-77fac1ed4441" containerID="3fa7059803e9235efe0004a6906b832dcda22c8e66f0c78fc0767682c00b69ea" exitCode=0 Oct 08 18:25:06 crc kubenswrapper[4750]: I1008 18:25:06.524999 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69qhhgw" event={"ID":"3911a0af-c9a8-477b-b9d1-77fac1ed4441","Type":"ContainerDied","Data":"3fa7059803e9235efe0004a6906b832dcda22c8e66f0c78fc0767682c00b69ea"} Oct 08 18:25:07 crc kubenswrapper[4750]: I1008 18:25:07.533567 4750 generic.go:334] "Generic (PLEG): container finished" podID="3911a0af-c9a8-477b-b9d1-77fac1ed4441" containerID="8137b80409b25109d07bb708fdc9afb718d08faabce0db97f2aef790f7ac7875" exitCode=0 Oct 08 18:25:07 crc kubenswrapper[4750]: I1008 18:25:07.533651 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69qhhgw" event={"ID":"3911a0af-c9a8-477b-b9d1-77fac1ed4441","Type":"ContainerDied","Data":"8137b80409b25109d07bb708fdc9afb718d08faabce0db97f2aef790f7ac7875"} Oct 08 18:25:08 crc kubenswrapper[4750]: I1008 18:25:08.797992 4750 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69qhhgw" Oct 08 18:25:08 crc kubenswrapper[4750]: I1008 18:25:08.915720 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmj9t\" (UniqueName: \"kubernetes.io/projected/3911a0af-c9a8-477b-b9d1-77fac1ed4441-kube-api-access-hmj9t\") pod \"3911a0af-c9a8-477b-b9d1-77fac1ed4441\" (UID: \"3911a0af-c9a8-477b-b9d1-77fac1ed4441\") " Oct 08 18:25:08 crc kubenswrapper[4750]: I1008 18:25:08.915820 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3911a0af-c9a8-477b-b9d1-77fac1ed4441-bundle\") pod \"3911a0af-c9a8-477b-b9d1-77fac1ed4441\" (UID: \"3911a0af-c9a8-477b-b9d1-77fac1ed4441\") " Oct 08 18:25:08 crc kubenswrapper[4750]: I1008 18:25:08.915980 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3911a0af-c9a8-477b-b9d1-77fac1ed4441-util\") pod \"3911a0af-c9a8-477b-b9d1-77fac1ed4441\" (UID: \"3911a0af-c9a8-477b-b9d1-77fac1ed4441\") " Oct 08 18:25:08 crc kubenswrapper[4750]: I1008 18:25:08.917960 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3911a0af-c9a8-477b-b9d1-77fac1ed4441-bundle" (OuterVolumeSpecName: "bundle") pod "3911a0af-c9a8-477b-b9d1-77fac1ed4441" (UID: "3911a0af-c9a8-477b-b9d1-77fac1ed4441"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:25:08 crc kubenswrapper[4750]: I1008 18:25:08.924371 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3911a0af-c9a8-477b-b9d1-77fac1ed4441-kube-api-access-hmj9t" (OuterVolumeSpecName: "kube-api-access-hmj9t") pod "3911a0af-c9a8-477b-b9d1-77fac1ed4441" (UID: "3911a0af-c9a8-477b-b9d1-77fac1ed4441"). 
InnerVolumeSpecName "kube-api-access-hmj9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:25:08 crc kubenswrapper[4750]: I1008 18:25:08.930206 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3911a0af-c9a8-477b-b9d1-77fac1ed4441-util" (OuterVolumeSpecName: "util") pod "3911a0af-c9a8-477b-b9d1-77fac1ed4441" (UID: "3911a0af-c9a8-477b-b9d1-77fac1ed4441"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:25:09 crc kubenswrapper[4750]: I1008 18:25:09.018114 4750 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3911a0af-c9a8-477b-b9d1-77fac1ed4441-util\") on node \"crc\" DevicePath \"\"" Oct 08 18:25:09 crc kubenswrapper[4750]: I1008 18:25:09.018223 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmj9t\" (UniqueName: \"kubernetes.io/projected/3911a0af-c9a8-477b-b9d1-77fac1ed4441-kube-api-access-hmj9t\") on node \"crc\" DevicePath \"\"" Oct 08 18:25:09 crc kubenswrapper[4750]: I1008 18:25:09.018243 4750 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3911a0af-c9a8-477b-b9d1-77fac1ed4441-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:25:09 crc kubenswrapper[4750]: I1008 18:25:09.547499 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69qhhgw" event={"ID":"3911a0af-c9a8-477b-b9d1-77fac1ed4441","Type":"ContainerDied","Data":"9e40d2271c9eae7acf44dbb3d98b9876d7b063fceb3d4b7397acf697226e4504"} Oct 08 18:25:09 crc kubenswrapper[4750]: I1008 18:25:09.547568 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e40d2271c9eae7acf44dbb3d98b9876d7b063fceb3d4b7397acf697226e4504" Oct 08 18:25:09 crc kubenswrapper[4750]: I1008 18:25:09.547602 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69qhhgw" Oct 08 18:25:15 crc kubenswrapper[4750]: I1008 18:25:15.528341 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-5bkgz"] Oct 08 18:25:15 crc kubenswrapper[4750]: E1008 18:25:15.529173 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3911a0af-c9a8-477b-b9d1-77fac1ed4441" containerName="extract" Oct 08 18:25:15 crc kubenswrapper[4750]: I1008 18:25:15.529191 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="3911a0af-c9a8-477b-b9d1-77fac1ed4441" containerName="extract" Oct 08 18:25:15 crc kubenswrapper[4750]: E1008 18:25:15.529201 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3911a0af-c9a8-477b-b9d1-77fac1ed4441" containerName="pull" Oct 08 18:25:15 crc kubenswrapper[4750]: I1008 18:25:15.529209 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="3911a0af-c9a8-477b-b9d1-77fac1ed4441" containerName="pull" Oct 08 18:25:15 crc kubenswrapper[4750]: E1008 18:25:15.529231 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3911a0af-c9a8-477b-b9d1-77fac1ed4441" containerName="util" Oct 08 18:25:15 crc kubenswrapper[4750]: I1008 18:25:15.529239 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="3911a0af-c9a8-477b-b9d1-77fac1ed4441" containerName="util" Oct 08 18:25:15 crc kubenswrapper[4750]: I1008 18:25:15.529372 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="3911a0af-c9a8-477b-b9d1-77fac1ed4441" containerName="extract" Oct 08 18:25:15 crc kubenswrapper[4750]: I1008 18:25:15.529880 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-5bkgz" Oct 08 18:25:15 crc kubenswrapper[4750]: I1008 18:25:15.532094 4750 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-x9szk" Oct 08 18:25:15 crc kubenswrapper[4750]: I1008 18:25:15.532486 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Oct 08 18:25:15 crc kubenswrapper[4750]: I1008 18:25:15.536596 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Oct 08 18:25:15 crc kubenswrapper[4750]: I1008 18:25:15.554125 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-5bkgz"] Oct 08 18:25:15 crc kubenswrapper[4750]: I1008 18:25:15.706634 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zckw7\" (UniqueName: \"kubernetes.io/projected/5fea6a79-4a87-4ee6-b49e-b86b374eb054-kube-api-access-zckw7\") pod \"cert-manager-operator-controller-manager-57cd46d6d-5bkgz\" (UID: \"5fea6a79-4a87-4ee6-b49e-b86b374eb054\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-5bkgz" Oct 08 18:25:15 crc kubenswrapper[4750]: I1008 18:25:15.807515 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zckw7\" (UniqueName: \"kubernetes.io/projected/5fea6a79-4a87-4ee6-b49e-b86b374eb054-kube-api-access-zckw7\") pod \"cert-manager-operator-controller-manager-57cd46d6d-5bkgz\" (UID: \"5fea6a79-4a87-4ee6-b49e-b86b374eb054\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-5bkgz" Oct 08 18:25:15 crc kubenswrapper[4750]: I1008 18:25:15.826835 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-zckw7\" (UniqueName: \"kubernetes.io/projected/5fea6a79-4a87-4ee6-b49e-b86b374eb054-kube-api-access-zckw7\") pod \"cert-manager-operator-controller-manager-57cd46d6d-5bkgz\" (UID: \"5fea6a79-4a87-4ee6-b49e-b86b374eb054\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-5bkgz" Oct 08 18:25:15 crc kubenswrapper[4750]: I1008 18:25:15.844687 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-5bkgz" Oct 08 18:25:16 crc kubenswrapper[4750]: I1008 18:25:16.325620 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-5bkgz"] Oct 08 18:25:16 crc kubenswrapper[4750]: I1008 18:25:16.598112 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-5bkgz" event={"ID":"5fea6a79-4a87-4ee6-b49e-b86b374eb054","Type":"ContainerStarted","Data":"6ff9ac7c4dff8a3b70296ac9b4750e357722c949017494c4a43b4c6bde8663e9"} Oct 08 18:25:23 crc kubenswrapper[4750]: I1008 18:25:23.645187 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-5bkgz" event={"ID":"5fea6a79-4a87-4ee6-b49e-b86b374eb054","Type":"ContainerStarted","Data":"dbcd1b14070f781c254c219e6076d5bc6870a2c3b357e569ad323acd893701ed"} Oct 08 18:25:24 crc kubenswrapper[4750]: I1008 18:25:24.671594 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-5bkgz" podStartSLOduration=2.5790920760000002 podStartE2EDuration="9.671570625s" podCreationTimestamp="2025-10-08 18:25:15 +0000 UTC" firstStartedPulling="2025-10-08 18:25:16.345247358 +0000 UTC m=+872.258218371" lastFinishedPulling="2025-10-08 18:25:23.437725907 +0000 UTC m=+879.350696920" 
observedRunningTime="2025-10-08 18:25:24.668100982 +0000 UTC m=+880.581072015" watchObservedRunningTime="2025-10-08 18:25:24.671570625 +0000 UTC m=+880.584541658" Oct 08 18:25:26 crc kubenswrapper[4750]: I1008 18:25:26.943870 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-zjr7p"] Oct 08 18:25:26 crc kubenswrapper[4750]: I1008 18:25:26.944852 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-d969966f-zjr7p" Oct 08 18:25:26 crc kubenswrapper[4750]: I1008 18:25:26.947698 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 08 18:25:26 crc kubenswrapper[4750]: I1008 18:25:26.947856 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 08 18:25:26 crc kubenswrapper[4750]: I1008 18:25:26.948602 4750 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-559k8" Oct 08 18:25:26 crc kubenswrapper[4750]: I1008 18:25:26.961718 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-zjr7p"] Oct 08 18:25:26 crc kubenswrapper[4750]: I1008 18:25:26.973337 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-892cz\" (UniqueName: \"kubernetes.io/projected/a29a394f-e073-44c0-a666-9370883743bf-kube-api-access-892cz\") pod \"cert-manager-webhook-d969966f-zjr7p\" (UID: \"a29a394f-e073-44c0-a666-9370883743bf\") " pod="cert-manager/cert-manager-webhook-d969966f-zjr7p" Oct 08 18:25:26 crc kubenswrapper[4750]: I1008 18:25:26.973421 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a29a394f-e073-44c0-a666-9370883743bf-bound-sa-token\") pod \"cert-manager-webhook-d969966f-zjr7p\" (UID: 
\"a29a394f-e073-44c0-a666-9370883743bf\") " pod="cert-manager/cert-manager-webhook-d969966f-zjr7p" Oct 08 18:25:27 crc kubenswrapper[4750]: I1008 18:25:27.075122 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-892cz\" (UniqueName: \"kubernetes.io/projected/a29a394f-e073-44c0-a666-9370883743bf-kube-api-access-892cz\") pod \"cert-manager-webhook-d969966f-zjr7p\" (UID: \"a29a394f-e073-44c0-a666-9370883743bf\") " pod="cert-manager/cert-manager-webhook-d969966f-zjr7p" Oct 08 18:25:27 crc kubenswrapper[4750]: I1008 18:25:27.075193 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a29a394f-e073-44c0-a666-9370883743bf-bound-sa-token\") pod \"cert-manager-webhook-d969966f-zjr7p\" (UID: \"a29a394f-e073-44c0-a666-9370883743bf\") " pod="cert-manager/cert-manager-webhook-d969966f-zjr7p" Oct 08 18:25:27 crc kubenswrapper[4750]: I1008 18:25:27.097571 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a29a394f-e073-44c0-a666-9370883743bf-bound-sa-token\") pod \"cert-manager-webhook-d969966f-zjr7p\" (UID: \"a29a394f-e073-44c0-a666-9370883743bf\") " pod="cert-manager/cert-manager-webhook-d969966f-zjr7p" Oct 08 18:25:27 crc kubenswrapper[4750]: I1008 18:25:27.109792 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-892cz\" (UniqueName: \"kubernetes.io/projected/a29a394f-e073-44c0-a666-9370883743bf-kube-api-access-892cz\") pod \"cert-manager-webhook-d969966f-zjr7p\" (UID: \"a29a394f-e073-44c0-a666-9370883743bf\") " pod="cert-manager/cert-manager-webhook-d969966f-zjr7p" Oct 08 18:25:27 crc kubenswrapper[4750]: I1008 18:25:27.264282 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-d969966f-zjr7p" Oct 08 18:25:27 crc kubenswrapper[4750]: I1008 18:25:27.692605 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-zjr7p"] Oct 08 18:25:27 crc kubenswrapper[4750]: W1008 18:25:27.704747 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda29a394f_e073_44c0_a666_9370883743bf.slice/crio-a13bf60df0fb879bac425b4babf8f76b695e38922bd6b74ed27dfeeb238c6f48 WatchSource:0}: Error finding container a13bf60df0fb879bac425b4babf8f76b695e38922bd6b74ed27dfeeb238c6f48: Status 404 returned error can't find the container with id a13bf60df0fb879bac425b4babf8f76b695e38922bd6b74ed27dfeeb238c6f48 Oct 08 18:25:28 crc kubenswrapper[4750]: I1008 18:25:28.675748 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-d969966f-zjr7p" event={"ID":"a29a394f-e073-44c0-a666-9370883743bf","Type":"ContainerStarted","Data":"a13bf60df0fb879bac425b4babf8f76b695e38922bd6b74ed27dfeeb238c6f48"} Oct 08 18:25:29 crc kubenswrapper[4750]: I1008 18:25:29.706896 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 18:25:29 crc kubenswrapper[4750]: I1008 18:25:29.706973 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 18:25:31 crc kubenswrapper[4750]: I1008 18:25:31.117921 4750 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-wg2g2"] Oct 08 18:25:31 crc kubenswrapper[4750]: I1008 18:25:31.118792 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-wg2g2" Oct 08 18:25:31 crc kubenswrapper[4750]: I1008 18:25:31.121570 4750 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-bzbrg" Oct 08 18:25:31 crc kubenswrapper[4750]: I1008 18:25:31.124152 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-wg2g2"] Oct 08 18:25:31 crc kubenswrapper[4750]: I1008 18:25:31.269356 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8177296f-6138-431e-808b-7f4b643a2a90-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-wg2g2\" (UID: \"8177296f-6138-431e-808b-7f4b643a2a90\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-wg2g2" Oct 08 18:25:31 crc kubenswrapper[4750]: I1008 18:25:31.269703 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bv4d\" (UniqueName: \"kubernetes.io/projected/8177296f-6138-431e-808b-7f4b643a2a90-kube-api-access-4bv4d\") pod \"cert-manager-cainjector-7d9f95dbf-wg2g2\" (UID: \"8177296f-6138-431e-808b-7f4b643a2a90\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-wg2g2" Oct 08 18:25:31 crc kubenswrapper[4750]: I1008 18:25:31.370299 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8177296f-6138-431e-808b-7f4b643a2a90-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-wg2g2\" (UID: \"8177296f-6138-431e-808b-7f4b643a2a90\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-wg2g2" Oct 08 18:25:31 crc kubenswrapper[4750]: I1008 18:25:31.370360 4750 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bv4d\" (UniqueName: \"kubernetes.io/projected/8177296f-6138-431e-808b-7f4b643a2a90-kube-api-access-4bv4d\") pod \"cert-manager-cainjector-7d9f95dbf-wg2g2\" (UID: \"8177296f-6138-431e-808b-7f4b643a2a90\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-wg2g2" Oct 08 18:25:31 crc kubenswrapper[4750]: I1008 18:25:31.389114 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bv4d\" (UniqueName: \"kubernetes.io/projected/8177296f-6138-431e-808b-7f4b643a2a90-kube-api-access-4bv4d\") pod \"cert-manager-cainjector-7d9f95dbf-wg2g2\" (UID: \"8177296f-6138-431e-808b-7f4b643a2a90\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-wg2g2" Oct 08 18:25:31 crc kubenswrapper[4750]: I1008 18:25:31.395236 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8177296f-6138-431e-808b-7f4b643a2a90-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-wg2g2\" (UID: \"8177296f-6138-431e-808b-7f4b643a2a90\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-wg2g2" Oct 08 18:25:31 crc kubenswrapper[4750]: I1008 18:25:31.440973 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-wg2g2" Oct 08 18:25:32 crc kubenswrapper[4750]: I1008 18:25:32.994968 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-wg2g2"] Oct 08 18:25:33 crc kubenswrapper[4750]: I1008 18:25:33.726352 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-wg2g2" event={"ID":"8177296f-6138-431e-808b-7f4b643a2a90","Type":"ContainerStarted","Data":"63ad2a5c6cab90300db3e190cc9d4c8d9ba8bf7e9cadc32ed4fe66286a116a5c"} Oct 08 18:25:34 crc kubenswrapper[4750]: I1008 18:25:34.745293 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-d969966f-zjr7p" event={"ID":"a29a394f-e073-44c0-a666-9370883743bf","Type":"ContainerStarted","Data":"83f39459a3916b783c0d9721e560f396b422c37dff396b4ae9b35d17f24a2305"} Oct 08 18:25:34 crc kubenswrapper[4750]: I1008 18:25:34.745349 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-d969966f-zjr7p" Oct 08 18:25:34 crc kubenswrapper[4750]: I1008 18:25:34.745365 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-wg2g2" event={"ID":"8177296f-6138-431e-808b-7f4b643a2a90","Type":"ContainerStarted","Data":"9b7adae2565ffda534ebbbd5507734295c0b401889c6936381fd0ee2c56c0791"} Oct 08 18:25:34 crc kubenswrapper[4750]: I1008 18:25:34.780585 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-wg2g2" podStartSLOduration=3.780566457 podStartE2EDuration="3.780566457s" podCreationTimestamp="2025-10-08 18:25:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:25:34.778493167 +0000 UTC m=+890.691464180" watchObservedRunningTime="2025-10-08 18:25:34.780566457 +0000 UTC 
m=+890.693537470" Oct 08 18:25:34 crc kubenswrapper[4750]: I1008 18:25:34.795855 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-d969966f-zjr7p" podStartSLOduration=3.041892304 podStartE2EDuration="8.795835433s" podCreationTimestamp="2025-10-08 18:25:26 +0000 UTC" firstStartedPulling="2025-10-08 18:25:27.706354076 +0000 UTC m=+883.619325089" lastFinishedPulling="2025-10-08 18:25:33.460297205 +0000 UTC m=+889.373268218" observedRunningTime="2025-10-08 18:25:34.793859766 +0000 UTC m=+890.706830779" watchObservedRunningTime="2025-10-08 18:25:34.795835433 +0000 UTC m=+890.708806436" Oct 08 18:25:42 crc kubenswrapper[4750]: I1008 18:25:42.267186 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-d969966f-zjr7p" Oct 08 18:25:45 crc kubenswrapper[4750]: I1008 18:25:45.730314 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-g5pnf"] Oct 08 18:25:45 crc kubenswrapper[4750]: I1008 18:25:45.739181 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-7d4cc89fcb-g5pnf" Oct 08 18:25:45 crc kubenswrapper[4750]: I1008 18:25:45.741050 4750 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-vhj4d" Oct 08 18:25:45 crc kubenswrapper[4750]: I1008 18:25:45.741445 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-g5pnf"] Oct 08 18:25:45 crc kubenswrapper[4750]: I1008 18:25:45.859590 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/94b58e9d-c56f-4476-9734-9a12e840e26b-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-g5pnf\" (UID: \"94b58e9d-c56f-4476-9734-9a12e840e26b\") " pod="cert-manager/cert-manager-7d4cc89fcb-g5pnf" Oct 08 18:25:45 crc kubenswrapper[4750]: I1008 18:25:45.859657 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfr68\" (UniqueName: \"kubernetes.io/projected/94b58e9d-c56f-4476-9734-9a12e840e26b-kube-api-access-xfr68\") pod \"cert-manager-7d4cc89fcb-g5pnf\" (UID: \"94b58e9d-c56f-4476-9734-9a12e840e26b\") " pod="cert-manager/cert-manager-7d4cc89fcb-g5pnf" Oct 08 18:25:45 crc kubenswrapper[4750]: I1008 18:25:45.961606 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/94b58e9d-c56f-4476-9734-9a12e840e26b-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-g5pnf\" (UID: \"94b58e9d-c56f-4476-9734-9a12e840e26b\") " pod="cert-manager/cert-manager-7d4cc89fcb-g5pnf" Oct 08 18:25:45 crc kubenswrapper[4750]: I1008 18:25:45.961685 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfr68\" (UniqueName: \"kubernetes.io/projected/94b58e9d-c56f-4476-9734-9a12e840e26b-kube-api-access-xfr68\") pod \"cert-manager-7d4cc89fcb-g5pnf\" (UID: 
\"94b58e9d-c56f-4476-9734-9a12e840e26b\") " pod="cert-manager/cert-manager-7d4cc89fcb-g5pnf" Oct 08 18:25:45 crc kubenswrapper[4750]: I1008 18:25:45.981369 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfr68\" (UniqueName: \"kubernetes.io/projected/94b58e9d-c56f-4476-9734-9a12e840e26b-kube-api-access-xfr68\") pod \"cert-manager-7d4cc89fcb-g5pnf\" (UID: \"94b58e9d-c56f-4476-9734-9a12e840e26b\") " pod="cert-manager/cert-manager-7d4cc89fcb-g5pnf" Oct 08 18:25:45 crc kubenswrapper[4750]: I1008 18:25:45.982583 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/94b58e9d-c56f-4476-9734-9a12e840e26b-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-g5pnf\" (UID: \"94b58e9d-c56f-4476-9734-9a12e840e26b\") " pod="cert-manager/cert-manager-7d4cc89fcb-g5pnf" Oct 08 18:25:46 crc kubenswrapper[4750]: I1008 18:25:46.060120 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-7d4cc89fcb-g5pnf" Oct 08 18:25:46 crc kubenswrapper[4750]: I1008 18:25:46.498287 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-g5pnf"] Oct 08 18:25:46 crc kubenswrapper[4750]: I1008 18:25:46.805165 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-7d4cc89fcb-g5pnf" event={"ID":"94b58e9d-c56f-4476-9734-9a12e840e26b","Type":"ContainerStarted","Data":"347f2ac29c4a8900d328e939803f2ddd9cb399975725f86036c739b591fb7140"} Oct 08 18:25:46 crc kubenswrapper[4750]: I1008 18:25:46.805478 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-7d4cc89fcb-g5pnf" event={"ID":"94b58e9d-c56f-4476-9734-9a12e840e26b","Type":"ContainerStarted","Data":"112abaf8afecfa4dbf88fff791864496fb1325f7d0032a9b5a6ea6d1bad90296"} Oct 08 18:25:46 crc kubenswrapper[4750]: I1008 18:25:46.819412 4750 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="cert-manager/cert-manager-7d4cc89fcb-g5pnf" podStartSLOduration=1.819390464 podStartE2EDuration="1.819390464s" podCreationTimestamp="2025-10-08 18:25:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:25:46.81630542 +0000 UTC m=+902.729276433" watchObservedRunningTime="2025-10-08 18:25:46.819390464 +0000 UTC m=+902.732361477" Oct 08 18:25:56 crc kubenswrapper[4750]: I1008 18:25:56.055730 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-5j6sl"] Oct 08 18:25:56 crc kubenswrapper[4750]: I1008 18:25:56.057319 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-5j6sl" Oct 08 18:25:56 crc kubenswrapper[4750]: I1008 18:25:56.062903 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-wwt9t" Oct 08 18:25:56 crc kubenswrapper[4750]: I1008 18:25:56.062919 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 08 18:25:56 crc kubenswrapper[4750]: I1008 18:25:56.062928 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 08 18:25:56 crc kubenswrapper[4750]: I1008 18:25:56.068127 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-5j6sl"] Oct 08 18:25:56 crc kubenswrapper[4750]: I1008 18:25:56.203669 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4zzg\" (UniqueName: \"kubernetes.io/projected/94099d10-d29c-4917-82e3-64d211cbba8b-kube-api-access-b4zzg\") pod \"openstack-operator-index-5j6sl\" (UID: \"94099d10-d29c-4917-82e3-64d211cbba8b\") " pod="openstack-operators/openstack-operator-index-5j6sl" Oct 08 18:25:56 crc 
kubenswrapper[4750]: I1008 18:25:56.304756 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4zzg\" (UniqueName: \"kubernetes.io/projected/94099d10-d29c-4917-82e3-64d211cbba8b-kube-api-access-b4zzg\") pod \"openstack-operator-index-5j6sl\" (UID: \"94099d10-d29c-4917-82e3-64d211cbba8b\") " pod="openstack-operators/openstack-operator-index-5j6sl" Oct 08 18:25:56 crc kubenswrapper[4750]: I1008 18:25:56.323445 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4zzg\" (UniqueName: \"kubernetes.io/projected/94099d10-d29c-4917-82e3-64d211cbba8b-kube-api-access-b4zzg\") pod \"openstack-operator-index-5j6sl\" (UID: \"94099d10-d29c-4917-82e3-64d211cbba8b\") " pod="openstack-operators/openstack-operator-index-5j6sl" Oct 08 18:25:56 crc kubenswrapper[4750]: I1008 18:25:56.382253 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-5j6sl" Oct 08 18:25:56 crc kubenswrapper[4750]: I1008 18:25:56.797890 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-5j6sl"] Oct 08 18:25:56 crc kubenswrapper[4750]: I1008 18:25:56.865617 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5j6sl" event={"ID":"94099d10-d29c-4917-82e3-64d211cbba8b","Type":"ContainerStarted","Data":"92a26d060d2908af39605d77242c5cc6d0787d8ba9f1bd4f50a00801466b930e"} Oct 08 18:25:57 crc kubenswrapper[4750]: I1008 18:25:57.873234 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5j6sl" event={"ID":"94099d10-d29c-4917-82e3-64d211cbba8b","Type":"ContainerStarted","Data":"886e239a4883bc04b5fd22b2e57ffb9c7693de33fc748819efe5d644492bb6b3"} Oct 08 18:25:59 crc kubenswrapper[4750]: I1008 18:25:59.707425 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 18:25:59 crc kubenswrapper[4750]: I1008 18:25:59.707806 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 18:25:59 crc kubenswrapper[4750]: I1008 18:25:59.828334 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-5j6sl" podStartSLOduration=2.980804224 podStartE2EDuration="3.828313295s" podCreationTimestamp="2025-10-08 18:25:56 +0000 UTC" firstStartedPulling="2025-10-08 18:25:56.805373877 +0000 UTC m=+912.718344890" lastFinishedPulling="2025-10-08 18:25:57.652882948 +0000 UTC m=+913.565853961" observedRunningTime="2025-10-08 18:25:57.890835696 +0000 UTC m=+913.803806709" watchObservedRunningTime="2025-10-08 18:25:59.828313295 +0000 UTC m=+915.741284308" Oct 08 18:25:59 crc kubenswrapper[4750]: I1008 18:25:59.829124 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-5j6sl"] Oct 08 18:25:59 crc kubenswrapper[4750]: I1008 18:25:59.899757 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-5j6sl" podUID="94099d10-d29c-4917-82e3-64d211cbba8b" containerName="registry-server" containerID="cri-o://886e239a4883bc04b5fd22b2e57ffb9c7693de33fc748819efe5d644492bb6b3" gracePeriod=2 Oct 08 18:26:00 crc kubenswrapper[4750]: I1008 18:26:00.232513 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-5j6sl" Oct 08 18:26:00 crc kubenswrapper[4750]: I1008 18:26:00.352345 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4zzg\" (UniqueName: \"kubernetes.io/projected/94099d10-d29c-4917-82e3-64d211cbba8b-kube-api-access-b4zzg\") pod \"94099d10-d29c-4917-82e3-64d211cbba8b\" (UID: \"94099d10-d29c-4917-82e3-64d211cbba8b\") " Oct 08 18:26:00 crc kubenswrapper[4750]: I1008 18:26:00.358185 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94099d10-d29c-4917-82e3-64d211cbba8b-kube-api-access-b4zzg" (OuterVolumeSpecName: "kube-api-access-b4zzg") pod "94099d10-d29c-4917-82e3-64d211cbba8b" (UID: "94099d10-d29c-4917-82e3-64d211cbba8b"). InnerVolumeSpecName "kube-api-access-b4zzg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:26:00 crc kubenswrapper[4750]: I1008 18:26:00.454456 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4zzg\" (UniqueName: \"kubernetes.io/projected/94099d10-d29c-4917-82e3-64d211cbba8b-kube-api-access-b4zzg\") on node \"crc\" DevicePath \"\"" Oct 08 18:26:00 crc kubenswrapper[4750]: I1008 18:26:00.633154 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-wvq8j"] Oct 08 18:26:00 crc kubenswrapper[4750]: E1008 18:26:00.633393 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94099d10-d29c-4917-82e3-64d211cbba8b" containerName="registry-server" Oct 08 18:26:00 crc kubenswrapper[4750]: I1008 18:26:00.633405 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="94099d10-d29c-4917-82e3-64d211cbba8b" containerName="registry-server" Oct 08 18:26:00 crc kubenswrapper[4750]: I1008 18:26:00.633516 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="94099d10-d29c-4917-82e3-64d211cbba8b" containerName="registry-server" Oct 08 18:26:00 crc 
kubenswrapper[4750]: I1008 18:26:00.633915 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wvq8j" Oct 08 18:26:00 crc kubenswrapper[4750]: I1008 18:26:00.646579 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-wvq8j"] Oct 08 18:26:00 crc kubenswrapper[4750]: I1008 18:26:00.758191 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sngql\" (UniqueName: \"kubernetes.io/projected/0a3ef1a8-10f5-4292-84de-04bc8c2cff44-kube-api-access-sngql\") pod \"openstack-operator-index-wvq8j\" (UID: \"0a3ef1a8-10f5-4292-84de-04bc8c2cff44\") " pod="openstack-operators/openstack-operator-index-wvq8j" Oct 08 18:26:00 crc kubenswrapper[4750]: I1008 18:26:00.859916 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sngql\" (UniqueName: \"kubernetes.io/projected/0a3ef1a8-10f5-4292-84de-04bc8c2cff44-kube-api-access-sngql\") pod \"openstack-operator-index-wvq8j\" (UID: \"0a3ef1a8-10f5-4292-84de-04bc8c2cff44\") " pod="openstack-operators/openstack-operator-index-wvq8j" Oct 08 18:26:00 crc kubenswrapper[4750]: I1008 18:26:00.876643 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sngql\" (UniqueName: \"kubernetes.io/projected/0a3ef1a8-10f5-4292-84de-04bc8c2cff44-kube-api-access-sngql\") pod \"openstack-operator-index-wvq8j\" (UID: \"0a3ef1a8-10f5-4292-84de-04bc8c2cff44\") " pod="openstack-operators/openstack-operator-index-wvq8j" Oct 08 18:26:00 crc kubenswrapper[4750]: I1008 18:26:00.907250 4750 generic.go:334] "Generic (PLEG): container finished" podID="94099d10-d29c-4917-82e3-64d211cbba8b" containerID="886e239a4883bc04b5fd22b2e57ffb9c7693de33fc748819efe5d644492bb6b3" exitCode=0 Oct 08 18:26:00 crc kubenswrapper[4750]: I1008 18:26:00.907317 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-5j6sl" Oct 08 18:26:00 crc kubenswrapper[4750]: I1008 18:26:00.907323 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5j6sl" event={"ID":"94099d10-d29c-4917-82e3-64d211cbba8b","Type":"ContainerDied","Data":"886e239a4883bc04b5fd22b2e57ffb9c7693de33fc748819efe5d644492bb6b3"} Oct 08 18:26:00 crc kubenswrapper[4750]: I1008 18:26:00.907575 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5j6sl" event={"ID":"94099d10-d29c-4917-82e3-64d211cbba8b","Type":"ContainerDied","Data":"92a26d060d2908af39605d77242c5cc6d0787d8ba9f1bd4f50a00801466b930e"} Oct 08 18:26:00 crc kubenswrapper[4750]: I1008 18:26:00.907620 4750 scope.go:117] "RemoveContainer" containerID="886e239a4883bc04b5fd22b2e57ffb9c7693de33fc748819efe5d644492bb6b3" Oct 08 18:26:00 crc kubenswrapper[4750]: I1008 18:26:00.925510 4750 scope.go:117] "RemoveContainer" containerID="886e239a4883bc04b5fd22b2e57ffb9c7693de33fc748819efe5d644492bb6b3" Oct 08 18:26:00 crc kubenswrapper[4750]: E1008 18:26:00.926075 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"886e239a4883bc04b5fd22b2e57ffb9c7693de33fc748819efe5d644492bb6b3\": container with ID starting with 886e239a4883bc04b5fd22b2e57ffb9c7693de33fc748819efe5d644492bb6b3 not found: ID does not exist" containerID="886e239a4883bc04b5fd22b2e57ffb9c7693de33fc748819efe5d644492bb6b3" Oct 08 18:26:00 crc kubenswrapper[4750]: I1008 18:26:00.926153 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"886e239a4883bc04b5fd22b2e57ffb9c7693de33fc748819efe5d644492bb6b3"} err="failed to get container status \"886e239a4883bc04b5fd22b2e57ffb9c7693de33fc748819efe5d644492bb6b3\": rpc error: code = NotFound desc = could not find container 
\"886e239a4883bc04b5fd22b2e57ffb9c7693de33fc748819efe5d644492bb6b3\": container with ID starting with 886e239a4883bc04b5fd22b2e57ffb9c7693de33fc748819efe5d644492bb6b3 not found: ID does not exist" Oct 08 18:26:00 crc kubenswrapper[4750]: I1008 18:26:00.928191 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-5j6sl"] Oct 08 18:26:00 crc kubenswrapper[4750]: I1008 18:26:00.931624 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-5j6sl"] Oct 08 18:26:00 crc kubenswrapper[4750]: I1008 18:26:00.969007 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wvq8j" Oct 08 18:26:01 crc kubenswrapper[4750]: I1008 18:26:01.161777 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-wvq8j"] Oct 08 18:26:01 crc kubenswrapper[4750]: I1008 18:26:01.916314 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wvq8j" event={"ID":"0a3ef1a8-10f5-4292-84de-04bc8c2cff44","Type":"ContainerStarted","Data":"a75f808b12412db01cd6b2343f97bd5e088b684bfd6c117c6fd5ff63a8e02d19"} Oct 08 18:26:02 crc kubenswrapper[4750]: I1008 18:26:02.748361 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94099d10-d29c-4917-82e3-64d211cbba8b" path="/var/lib/kubelet/pods/94099d10-d29c-4917-82e3-64d211cbba8b/volumes" Oct 08 18:26:02 crc kubenswrapper[4750]: I1008 18:26:02.925013 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wvq8j" event={"ID":"0a3ef1a8-10f5-4292-84de-04bc8c2cff44","Type":"ContainerStarted","Data":"4bf7ebc2b6b623c1c5b38ca1b00afb68dd8a672699d676761a4a16d80db42362"} Oct 08 18:26:10 crc kubenswrapper[4750]: I1008 18:26:10.969154 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack-operators/openstack-operator-index-wvq8j" Oct 08 18:26:10 crc kubenswrapper[4750]: I1008 18:26:10.969844 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-wvq8j" Oct 08 18:26:10 crc kubenswrapper[4750]: I1008 18:26:10.998983 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-wvq8j" Oct 08 18:26:11 crc kubenswrapper[4750]: I1008 18:26:11.012812 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-wvq8j" podStartSLOduration=9.846587932 podStartE2EDuration="11.012792208s" podCreationTimestamp="2025-10-08 18:26:00 +0000 UTC" firstStartedPulling="2025-10-08 18:26:01.177378147 +0000 UTC m=+917.090349160" lastFinishedPulling="2025-10-08 18:26:02.343582423 +0000 UTC m=+918.256553436" observedRunningTime="2025-10-08 18:26:02.943277089 +0000 UTC m=+918.856248122" watchObservedRunningTime="2025-10-08 18:26:11.012792208 +0000 UTC m=+926.925763221" Oct 08 18:26:12 crc kubenswrapper[4750]: I1008 18:26:12.003824 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-wvq8j" Oct 08 18:26:14 crc kubenswrapper[4750]: I1008 18:26:14.859978 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/1bfea55cf6f540853d0c0d919aed51b5fd6e2d105cc44a68cb64399f8bnvgdl"] Oct 08 18:26:14 crc kubenswrapper[4750]: I1008 18:26:14.861580 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/1bfea55cf6f540853d0c0d919aed51b5fd6e2d105cc44a68cb64399f8bnvgdl" Oct 08 18:26:14 crc kubenswrapper[4750]: I1008 18:26:14.864279 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-nzl7v" Oct 08 18:26:14 crc kubenswrapper[4750]: I1008 18:26:14.881616 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/1bfea55cf6f540853d0c0d919aed51b5fd6e2d105cc44a68cb64399f8bnvgdl"] Oct 08 18:26:14 crc kubenswrapper[4750]: I1008 18:26:14.962955 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxht9\" (UniqueName: \"kubernetes.io/projected/f62fe45d-4c16-42a3-8445-075a733b6a13-kube-api-access-pxht9\") pod \"1bfea55cf6f540853d0c0d919aed51b5fd6e2d105cc44a68cb64399f8bnvgdl\" (UID: \"f62fe45d-4c16-42a3-8445-075a733b6a13\") " pod="openstack-operators/1bfea55cf6f540853d0c0d919aed51b5fd6e2d105cc44a68cb64399f8bnvgdl" Oct 08 18:26:14 crc kubenswrapper[4750]: I1008 18:26:14.963012 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f62fe45d-4c16-42a3-8445-075a733b6a13-util\") pod \"1bfea55cf6f540853d0c0d919aed51b5fd6e2d105cc44a68cb64399f8bnvgdl\" (UID: \"f62fe45d-4c16-42a3-8445-075a733b6a13\") " pod="openstack-operators/1bfea55cf6f540853d0c0d919aed51b5fd6e2d105cc44a68cb64399f8bnvgdl" Oct 08 18:26:14 crc kubenswrapper[4750]: I1008 18:26:14.963181 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f62fe45d-4c16-42a3-8445-075a733b6a13-bundle\") pod \"1bfea55cf6f540853d0c0d919aed51b5fd6e2d105cc44a68cb64399f8bnvgdl\" (UID: \"f62fe45d-4c16-42a3-8445-075a733b6a13\") " pod="openstack-operators/1bfea55cf6f540853d0c0d919aed51b5fd6e2d105cc44a68cb64399f8bnvgdl" Oct 08 18:26:15 crc kubenswrapper[4750]: I1008 
18:26:15.065032 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxht9\" (UniqueName: \"kubernetes.io/projected/f62fe45d-4c16-42a3-8445-075a733b6a13-kube-api-access-pxht9\") pod \"1bfea55cf6f540853d0c0d919aed51b5fd6e2d105cc44a68cb64399f8bnvgdl\" (UID: \"f62fe45d-4c16-42a3-8445-075a733b6a13\") " pod="openstack-operators/1bfea55cf6f540853d0c0d919aed51b5fd6e2d105cc44a68cb64399f8bnvgdl" Oct 08 18:26:15 crc kubenswrapper[4750]: I1008 18:26:15.065081 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f62fe45d-4c16-42a3-8445-075a733b6a13-util\") pod \"1bfea55cf6f540853d0c0d919aed51b5fd6e2d105cc44a68cb64399f8bnvgdl\" (UID: \"f62fe45d-4c16-42a3-8445-075a733b6a13\") " pod="openstack-operators/1bfea55cf6f540853d0c0d919aed51b5fd6e2d105cc44a68cb64399f8bnvgdl" Oct 08 18:26:15 crc kubenswrapper[4750]: I1008 18:26:15.065125 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f62fe45d-4c16-42a3-8445-075a733b6a13-bundle\") pod \"1bfea55cf6f540853d0c0d919aed51b5fd6e2d105cc44a68cb64399f8bnvgdl\" (UID: \"f62fe45d-4c16-42a3-8445-075a733b6a13\") " pod="openstack-operators/1bfea55cf6f540853d0c0d919aed51b5fd6e2d105cc44a68cb64399f8bnvgdl" Oct 08 18:26:15 crc kubenswrapper[4750]: I1008 18:26:15.065583 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f62fe45d-4c16-42a3-8445-075a733b6a13-bundle\") pod \"1bfea55cf6f540853d0c0d919aed51b5fd6e2d105cc44a68cb64399f8bnvgdl\" (UID: \"f62fe45d-4c16-42a3-8445-075a733b6a13\") " pod="openstack-operators/1bfea55cf6f540853d0c0d919aed51b5fd6e2d105cc44a68cb64399f8bnvgdl" Oct 08 18:26:15 crc kubenswrapper[4750]: I1008 18:26:15.066040 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/f62fe45d-4c16-42a3-8445-075a733b6a13-util\") pod \"1bfea55cf6f540853d0c0d919aed51b5fd6e2d105cc44a68cb64399f8bnvgdl\" (UID: \"f62fe45d-4c16-42a3-8445-075a733b6a13\") " pod="openstack-operators/1bfea55cf6f540853d0c0d919aed51b5fd6e2d105cc44a68cb64399f8bnvgdl" Oct 08 18:26:15 crc kubenswrapper[4750]: I1008 18:26:15.097747 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxht9\" (UniqueName: \"kubernetes.io/projected/f62fe45d-4c16-42a3-8445-075a733b6a13-kube-api-access-pxht9\") pod \"1bfea55cf6f540853d0c0d919aed51b5fd6e2d105cc44a68cb64399f8bnvgdl\" (UID: \"f62fe45d-4c16-42a3-8445-075a733b6a13\") " pod="openstack-operators/1bfea55cf6f540853d0c0d919aed51b5fd6e2d105cc44a68cb64399f8bnvgdl" Oct 08 18:26:15 crc kubenswrapper[4750]: I1008 18:26:15.181639 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/1bfea55cf6f540853d0c0d919aed51b5fd6e2d105cc44a68cb64399f8bnvgdl" Oct 08 18:26:15 crc kubenswrapper[4750]: I1008 18:26:15.585205 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/1bfea55cf6f540853d0c0d919aed51b5fd6e2d105cc44a68cb64399f8bnvgdl"] Oct 08 18:26:16 crc kubenswrapper[4750]: I1008 18:26:16.002020 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1bfea55cf6f540853d0c0d919aed51b5fd6e2d105cc44a68cb64399f8bnvgdl" event={"ID":"f62fe45d-4c16-42a3-8445-075a733b6a13","Type":"ContainerStarted","Data":"6a241fcec774e8ffeab660a212dee05a6c42d0eaca8f04e805f1ff20cdd9b626"} Oct 08 18:26:17 crc kubenswrapper[4750]: I1008 18:26:17.019410 4750 generic.go:334] "Generic (PLEG): container finished" podID="f62fe45d-4c16-42a3-8445-075a733b6a13" containerID="ad503e48d0e5a33ae9bcb65e90160c9d9477d38292f49afb74f4d496a7ce0221" exitCode=0 Oct 08 18:26:17 crc kubenswrapper[4750]: I1008 18:26:17.019627 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/1bfea55cf6f540853d0c0d919aed51b5fd6e2d105cc44a68cb64399f8bnvgdl" event={"ID":"f62fe45d-4c16-42a3-8445-075a733b6a13","Type":"ContainerDied","Data":"ad503e48d0e5a33ae9bcb65e90160c9d9477d38292f49afb74f4d496a7ce0221"} Oct 08 18:26:18 crc kubenswrapper[4750]: I1008 18:26:18.029672 4750 generic.go:334] "Generic (PLEG): container finished" podID="f62fe45d-4c16-42a3-8445-075a733b6a13" containerID="0496375070f88d0af241e566cd4106b57eec8a09b64da21eaa2fca0130da215e" exitCode=0 Oct 08 18:26:18 crc kubenswrapper[4750]: I1008 18:26:18.029754 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1bfea55cf6f540853d0c0d919aed51b5fd6e2d105cc44a68cb64399f8bnvgdl" event={"ID":"f62fe45d-4c16-42a3-8445-075a733b6a13","Type":"ContainerDied","Data":"0496375070f88d0af241e566cd4106b57eec8a09b64da21eaa2fca0130da215e"} Oct 08 18:26:19 crc kubenswrapper[4750]: I1008 18:26:19.039828 4750 generic.go:334] "Generic (PLEG): container finished" podID="f62fe45d-4c16-42a3-8445-075a733b6a13" containerID="0b6659ab9d74728f55cd7cd2cff7e3f02b92f0d7a12df3124c182ff10f30c794" exitCode=0 Oct 08 18:26:19 crc kubenswrapper[4750]: I1008 18:26:19.039937 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1bfea55cf6f540853d0c0d919aed51b5fd6e2d105cc44a68cb64399f8bnvgdl" event={"ID":"f62fe45d-4c16-42a3-8445-075a733b6a13","Type":"ContainerDied","Data":"0b6659ab9d74728f55cd7cd2cff7e3f02b92f0d7a12df3124c182ff10f30c794"} Oct 08 18:26:20 crc kubenswrapper[4750]: I1008 18:26:20.338622 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/1bfea55cf6f540853d0c0d919aed51b5fd6e2d105cc44a68cb64399f8bnvgdl"
Oct 08 18:26:20 crc kubenswrapper[4750]: I1008 18:26:20.437889 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f62fe45d-4c16-42a3-8445-075a733b6a13-util\") pod \"f62fe45d-4c16-42a3-8445-075a733b6a13\" (UID: \"f62fe45d-4c16-42a3-8445-075a733b6a13\") "
Oct 08 18:26:20 crc kubenswrapper[4750]: I1008 18:26:20.438012 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f62fe45d-4c16-42a3-8445-075a733b6a13-bundle\") pod \"f62fe45d-4c16-42a3-8445-075a733b6a13\" (UID: \"f62fe45d-4c16-42a3-8445-075a733b6a13\") "
Oct 08 18:26:20 crc kubenswrapper[4750]: I1008 18:26:20.438060 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxht9\" (UniqueName: \"kubernetes.io/projected/f62fe45d-4c16-42a3-8445-075a733b6a13-kube-api-access-pxht9\") pod \"f62fe45d-4c16-42a3-8445-075a733b6a13\" (UID: \"f62fe45d-4c16-42a3-8445-075a733b6a13\") "
Oct 08 18:26:20 crc kubenswrapper[4750]: I1008 18:26:20.438759 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f62fe45d-4c16-42a3-8445-075a733b6a13-bundle" (OuterVolumeSpecName: "bundle") pod "f62fe45d-4c16-42a3-8445-075a733b6a13" (UID: "f62fe45d-4c16-42a3-8445-075a733b6a13"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 18:26:20 crc kubenswrapper[4750]: I1008 18:26:20.445863 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f62fe45d-4c16-42a3-8445-075a733b6a13-kube-api-access-pxht9" (OuterVolumeSpecName: "kube-api-access-pxht9") pod "f62fe45d-4c16-42a3-8445-075a733b6a13" (UID: "f62fe45d-4c16-42a3-8445-075a733b6a13"). InnerVolumeSpecName "kube-api-access-pxht9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 18:26:20 crc kubenswrapper[4750]: I1008 18:26:20.471313 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f62fe45d-4c16-42a3-8445-075a733b6a13-util" (OuterVolumeSpecName: "util") pod "f62fe45d-4c16-42a3-8445-075a733b6a13" (UID: "f62fe45d-4c16-42a3-8445-075a733b6a13"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 18:26:20 crc kubenswrapper[4750]: I1008 18:26:20.540718 4750 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f62fe45d-4c16-42a3-8445-075a733b6a13-util\") on node \"crc\" DevicePath \"\""
Oct 08 18:26:20 crc kubenswrapper[4750]: I1008 18:26:20.540772 4750 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f62fe45d-4c16-42a3-8445-075a733b6a13-bundle\") on node \"crc\" DevicePath \"\""
Oct 08 18:26:20 crc kubenswrapper[4750]: I1008 18:26:20.540792 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxht9\" (UniqueName: \"kubernetes.io/projected/f62fe45d-4c16-42a3-8445-075a733b6a13-kube-api-access-pxht9\") on node \"crc\" DevicePath \"\""
Oct 08 18:26:21 crc kubenswrapper[4750]: I1008 18:26:21.056876 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1bfea55cf6f540853d0c0d919aed51b5fd6e2d105cc44a68cb64399f8bnvgdl" event={"ID":"f62fe45d-4c16-42a3-8445-075a733b6a13","Type":"ContainerDied","Data":"6a241fcec774e8ffeab660a212dee05a6c42d0eaca8f04e805f1ff20cdd9b626"}
Oct 08 18:26:21 crc kubenswrapper[4750]: I1008 18:26:21.056937 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a241fcec774e8ffeab660a212dee05a6c42d0eaca8f04e805f1ff20cdd9b626"
Oct 08 18:26:21 crc kubenswrapper[4750]: I1008 18:26:21.056985 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/1bfea55cf6f540853d0c0d919aed51b5fd6e2d105cc44a68cb64399f8bnvgdl"
Oct 08 18:26:23 crc kubenswrapper[4750]: I1008 18:26:23.993624 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-bd6bc67fb-f8sjn"]
Oct 08 18:26:23 crc kubenswrapper[4750]: E1008 18:26:23.995408 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f62fe45d-4c16-42a3-8445-075a733b6a13" containerName="util"
Oct 08 18:26:23 crc kubenswrapper[4750]: I1008 18:26:23.995481 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="f62fe45d-4c16-42a3-8445-075a733b6a13" containerName="util"
Oct 08 18:26:23 crc kubenswrapper[4750]: E1008 18:26:23.995602 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f62fe45d-4c16-42a3-8445-075a733b6a13" containerName="extract"
Oct 08 18:26:23 crc kubenswrapper[4750]: I1008 18:26:23.995665 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="f62fe45d-4c16-42a3-8445-075a733b6a13" containerName="extract"
Oct 08 18:26:23 crc kubenswrapper[4750]: E1008 18:26:23.995742 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f62fe45d-4c16-42a3-8445-075a733b6a13" containerName="pull"
Oct 08 18:26:23 crc kubenswrapper[4750]: I1008 18:26:23.995821 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="f62fe45d-4c16-42a3-8445-075a733b6a13" containerName="pull"
Oct 08 18:26:23 crc kubenswrapper[4750]: I1008 18:26:23.996026 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="f62fe45d-4c16-42a3-8445-075a733b6a13" containerName="extract"
Oct 08 18:26:23 crc kubenswrapper[4750]: I1008 18:26:23.996748 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-bd6bc67fb-f8sjn"
Oct 08 18:26:23 crc kubenswrapper[4750]: I1008 18:26:23.998490 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-sj8j4"
Oct 08 18:26:24 crc kubenswrapper[4750]: I1008 18:26:24.023897 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-bd6bc67fb-f8sjn"]
Oct 08 18:26:24 crc kubenswrapper[4750]: I1008 18:26:24.109477 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzc5g\" (UniqueName: \"kubernetes.io/projected/f5fcb06d-f818-4f9f-abe4-0f272bdf2681-kube-api-access-kzc5g\") pod \"openstack-operator-controller-operator-bd6bc67fb-f8sjn\" (UID: \"f5fcb06d-f818-4f9f-abe4-0f272bdf2681\") " pod="openstack-operators/openstack-operator-controller-operator-bd6bc67fb-f8sjn"
Oct 08 18:26:24 crc kubenswrapper[4750]: I1008 18:26:24.211039 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzc5g\" (UniqueName: \"kubernetes.io/projected/f5fcb06d-f818-4f9f-abe4-0f272bdf2681-kube-api-access-kzc5g\") pod \"openstack-operator-controller-operator-bd6bc67fb-f8sjn\" (UID: \"f5fcb06d-f818-4f9f-abe4-0f272bdf2681\") " pod="openstack-operators/openstack-operator-controller-operator-bd6bc67fb-f8sjn"
Oct 08 18:26:24 crc kubenswrapper[4750]: I1008 18:26:24.247587 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzc5g\" (UniqueName: \"kubernetes.io/projected/f5fcb06d-f818-4f9f-abe4-0f272bdf2681-kube-api-access-kzc5g\") pod \"openstack-operator-controller-operator-bd6bc67fb-f8sjn\" (UID: \"f5fcb06d-f818-4f9f-abe4-0f272bdf2681\") " pod="openstack-operators/openstack-operator-controller-operator-bd6bc67fb-f8sjn"
Oct 08 18:26:24 crc kubenswrapper[4750]: I1008 18:26:24.316519 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-bd6bc67fb-f8sjn"
Oct 08 18:26:24 crc kubenswrapper[4750]: I1008 18:26:24.532650 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-bd6bc67fb-f8sjn"]
Oct 08 18:26:25 crc kubenswrapper[4750]: I1008 18:26:25.079353 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-bd6bc67fb-f8sjn" event={"ID":"f5fcb06d-f818-4f9f-abe4-0f272bdf2681","Type":"ContainerStarted","Data":"8fbb7be2b3656c967a6c058e6d387349ff5f5663c359d0a90ee87bab44eec244"}
Oct 08 18:26:29 crc kubenswrapper[4750]: I1008 18:26:29.707491 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 08 18:26:29 crc kubenswrapper[4750]: I1008 18:26:29.707979 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 08 18:26:29 crc kubenswrapper[4750]: I1008 18:26:29.708017 4750 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-grddb"
Oct 08 18:26:29 crc kubenswrapper[4750]: I1008 18:26:29.708526 4750 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dee2d35a9b3eba166b103d6a720e7cbf72b0876a67fbdc37629a8900d4d09d57"} pod="openshift-machine-config-operator/machine-config-daemon-grddb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 08 18:26:29 crc kubenswrapper[4750]: I1008 18:26:29.708602 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" containerID="cri-o://dee2d35a9b3eba166b103d6a720e7cbf72b0876a67fbdc37629a8900d4d09d57" gracePeriod=600
Oct 08 18:26:30 crc kubenswrapper[4750]: I1008 18:26:30.113309 4750 generic.go:334] "Generic (PLEG): container finished" podID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerID="dee2d35a9b3eba166b103d6a720e7cbf72b0876a67fbdc37629a8900d4d09d57" exitCode=0
Oct 08 18:26:30 crc kubenswrapper[4750]: I1008 18:26:30.113524 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerDied","Data":"dee2d35a9b3eba166b103d6a720e7cbf72b0876a67fbdc37629a8900d4d09d57"}
Oct 08 18:26:30 crc kubenswrapper[4750]: I1008 18:26:30.113665 4750 scope.go:117] "RemoveContainer" containerID="4aa9c398c94477f10b8d76ef065deabe11c48fe9c89856d25a3a57b78914e105"
Oct 08 18:26:31 crc kubenswrapper[4750]: I1008 18:26:31.121362 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerStarted","Data":"a27f8518311deef574465704a4c93c21d7cb4e76fec24d95f01a5d7c9febd08d"}
Oct 08 18:26:31 crc kubenswrapper[4750]: I1008 18:26:31.122648 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-bd6bc67fb-f8sjn" event={"ID":"f5fcb06d-f818-4f9f-abe4-0f272bdf2681","Type":"ContainerStarted","Data":"a7b24b161ab3a6c89d87508f842a2809ae9773b01f280a94bb2cd3e164fe3ac2"}
Oct 08 18:26:33 crc kubenswrapper[4750]: I1008 18:26:33.135139 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-bd6bc67fb-f8sjn" event={"ID":"f5fcb06d-f818-4f9f-abe4-0f272bdf2681","Type":"ContainerStarted","Data":"2e5933526b1164a35771bdf28387c0cc59a4c7f5e959ae1c53be5dc34e765879"}
Oct 08 18:26:33 crc kubenswrapper[4750]: I1008 18:26:33.136516 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-bd6bc67fb-f8sjn"
Oct 08 18:26:33 crc kubenswrapper[4750]: I1008 18:26:33.165245 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-bd6bc67fb-f8sjn" podStartSLOduration=2.137454852 podStartE2EDuration="10.165230109s" podCreationTimestamp="2025-10-08 18:26:23 +0000 UTC" firstStartedPulling="2025-10-08 18:26:24.544453686 +0000 UTC m=+940.457424699" lastFinishedPulling="2025-10-08 18:26:32.572228943 +0000 UTC m=+948.485199956" observedRunningTime="2025-10-08 18:26:33.162997915 +0000 UTC m=+949.075968938" watchObservedRunningTime="2025-10-08 18:26:33.165230109 +0000 UTC m=+949.078201132"
Oct 08 18:26:35 crc kubenswrapper[4750]: I1008 18:26:35.152876 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-bd6bc67fb-f8sjn"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.264727 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-658bdf4b74-vwnz5"]
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.266061 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-vwnz5"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.269196 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-jlx4x"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.273526 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7b7fb68549-k8v2z"]
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.274586 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-k8v2z"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.281721 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-658bdf4b74-vwnz5"]
Oct 08 18:26:51 crc kubenswrapper[4750]: W1008 18:26:51.282474 4750 reflector.go:561] object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-8q5dm": failed to list *v1.Secret: secrets "cinder-operator-controller-manager-dockercfg-8q5dm" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack-operators": no relationship found between node 'crc' and this object
Oct 08 18:26:51 crc kubenswrapper[4750]: E1008 18:26:51.282574 4750 reflector.go:158] "Unhandled Error" err="object-\"openstack-operators\"/\"cinder-operator-controller-manager-dockercfg-8q5dm\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"cinder-operator-controller-manager-dockercfg-8q5dm\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack-operators\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.304925 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-85d5d9dd78-2flss"]
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.306035 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-2flss"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.307967 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-84b9b84486-pcb44"]
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.308335 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-5pl7b"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.308977 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-pcb44"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.311354 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-2p7kp"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.327480 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7b7fb68549-k8v2z"]
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.334303 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-85d5d9dd78-2flss"]
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.339523 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-858f76bbdd-km9s8"]
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.340560 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-km9s8"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.342949 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-hbf9j"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.351759 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84b9b84486-pcb44"]
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.361262 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-858f76bbdd-km9s8"]
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.365973 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7ffbcb7588-nm5np"]
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.366995 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-nm5np"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.369430 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-656bcbd775-drhrw"]
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.370408 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-drhrw"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.376442 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7ffbcb7588-nm5np"]
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.378068 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-dlhgr"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.378113 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-xbf5q"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.378272 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.386272 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsh8v\" (UniqueName: \"kubernetes.io/projected/b22d34d7-5452-4982-96ff-100a0fbdf514-kube-api-access-bsh8v\") pod \"barbican-operator-controller-manager-658bdf4b74-vwnz5\" (UID: \"b22d34d7-5452-4982-96ff-100a0fbdf514\") " pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-vwnz5"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.386308 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8k7gj\" (UniqueName: \"kubernetes.io/projected/07dc6c54-47cf-48e9-aed2-3016f69594de-kube-api-access-8k7gj\") pod \"cinder-operator-controller-manager-7b7fb68549-k8v2z\" (UID: \"07dc6c54-47cf-48e9-aed2-3016f69594de\") " pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-k8v2z"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.396730 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-656bcbd775-drhrw"]
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.401657 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-9c5c78d49-s454n"]
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.402697 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-s454n"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.405686 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-f857v"]
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.406926 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-jdgn7"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.407075 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-f857v"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.408902 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-f857v"]
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.410398 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-6tvms"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.420482 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-9c5c78d49-s454n"]
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.441453 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-5f67fbc655-prrpn"]
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.442678 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-prrpn"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.445735 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-h4kct"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.447597 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5f67fbc655-prrpn"]
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.460600 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-k6wbx"]
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.461570 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-k6wbx"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.466653 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-zh2rk"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.482654 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-79d585cb66-9wgqj"]
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.509000 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-9wgqj"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.509787 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv6cf\" (UniqueName: \"kubernetes.io/projected/3e9cf3d8-0e7a-40ce-a32e-25e8700d8307-kube-api-access-rv6cf\") pod \"glance-operator-controller-manager-84b9b84486-pcb44\" (UID: \"3e9cf3d8-0e7a-40ce-a32e-25e8700d8307\") " pod="openstack-operators/glance-operator-controller-manager-84b9b84486-pcb44"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.509879 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbbd9\" (UniqueName: \"kubernetes.io/projected/716ce9a4-ee82-45d0-bb63-fab1eed3c1db-kube-api-access-bbbd9\") pod \"infra-operator-controller-manager-656bcbd775-drhrw\" (UID: \"716ce9a4-ee82-45d0-bb63-fab1eed3c1db\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-drhrw"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.509905 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt6cm\" (UniqueName: \"kubernetes.io/projected/6b0569ca-c2be-4209-bce5-79feea80203c-kube-api-access-nt6cm\") pod \"horizon-operator-controller-manager-7ffbcb7588-nm5np\" (UID: \"6b0569ca-c2be-4209-bce5-79feea80203c\") " pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-nm5np"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.509936 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75f2c\" (UniqueName: \"kubernetes.io/projected/3e418980-7ef9-45e8-8d7d-5bd866af0d26-kube-api-access-75f2c\") pod \"heat-operator-controller-manager-858f76bbdd-km9s8\" (UID: \"3e418980-7ef9-45e8-8d7d-5bd866af0d26\") " pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-km9s8"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.509956 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkpft\" (UniqueName: \"kubernetes.io/projected/b41e76ff-471c-4874-88ab-66b6fae3a84a-kube-api-access-hkpft\") pod \"designate-operator-controller-manager-85d5d9dd78-2flss\" (UID: \"b41e76ff-471c-4874-88ab-66b6fae3a84a\") " pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-2flss"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.510057 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/716ce9a4-ee82-45d0-bb63-fab1eed3c1db-cert\") pod \"infra-operator-controller-manager-656bcbd775-drhrw\" (UID: \"716ce9a4-ee82-45d0-bb63-fab1eed3c1db\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-drhrw"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.510089 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsh8v\" (UniqueName: \"kubernetes.io/projected/b22d34d7-5452-4982-96ff-100a0fbdf514-kube-api-access-bsh8v\") pod \"barbican-operator-controller-manager-658bdf4b74-vwnz5\" (UID: \"b22d34d7-5452-4982-96ff-100a0fbdf514\") " pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-vwnz5"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.510112 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8k7gj\" (UniqueName: \"kubernetes.io/projected/07dc6c54-47cf-48e9-aed2-3016f69594de-kube-api-access-8k7gj\") pod \"cinder-operator-controller-manager-7b7fb68549-k8v2z\" (UID: \"07dc6c54-47cf-48e9-aed2-3016f69594de\") " pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-k8v2z"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.520048 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5df598886f-w4sqr"]
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.522200 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5df598886f-w4sqr"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.522773 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-5jb5x"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.542689 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-5l7v4"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.548422 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-k6wbx"]
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.559334 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8k7gj\" (UniqueName: \"kubernetes.io/projected/07dc6c54-47cf-48e9-aed2-3016f69594de-kube-api-access-8k7gj\") pod \"cinder-operator-controller-manager-7b7fb68549-k8v2z\" (UID: \"07dc6c54-47cf-48e9-aed2-3016f69594de\") " pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-k8v2z"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.575004 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsh8v\" (UniqueName: \"kubernetes.io/projected/b22d34d7-5452-4982-96ff-100a0fbdf514-kube-api-access-bsh8v\") pod \"barbican-operator-controller-manager-658bdf4b74-vwnz5\" (UID: \"b22d34d7-5452-4982-96ff-100a0fbdf514\") " pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-vwnz5"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.585860 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-79d585cb66-9wgqj"]
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.586261 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-vwnz5"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.591720 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5df598886f-w4sqr"]
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.610588 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-xhnm8"]
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.612303 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99vtr\" (UniqueName: \"kubernetes.io/projected/47b84278-c660-43eb-8c7d-686ceed80afd-kube-api-access-99vtr\") pod \"ironic-operator-controller-manager-9c5c78d49-s454n\" (UID: \"47b84278-c660-43eb-8c7d-686ceed80afd\") " pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-s454n"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.612371 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/716ce9a4-ee82-45d0-bb63-fab1eed3c1db-cert\") pod \"infra-operator-controller-manager-656bcbd775-drhrw\" (UID: \"716ce9a4-ee82-45d0-bb63-fab1eed3c1db\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-drhrw"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.612401 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxbl8\" (UniqueName: \"kubernetes.io/projected/1f01efe4-3233-44d1-8def-b5080cc50aac-kube-api-access-fxbl8\") pod \"manila-operator-controller-manager-5f67fbc655-prrpn\" (UID: \"1f01efe4-3233-44d1-8def-b5080cc50aac\") " pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-prrpn"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.612422 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv9g4\" (UniqueName: \"kubernetes.io/projected/6bc7e757-dd1c-4334-b83e-c4bc73e96658-kube-api-access-pv9g4\") pod \"keystone-operator-controller-manager-55b6b7c7b8-f857v\" (UID: \"6bc7e757-dd1c-4334-b83e-c4bc73e96658\") " pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-f857v"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.612471 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv6cf\" (UniqueName: \"kubernetes.io/projected/3e9cf3d8-0e7a-40ce-a32e-25e8700d8307-kube-api-access-rv6cf\") pod \"glance-operator-controller-manager-84b9b84486-pcb44\" (UID: \"3e9cf3d8-0e7a-40ce-a32e-25e8700d8307\") " pod="openstack-operators/glance-operator-controller-manager-84b9b84486-pcb44"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.612491 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxtxz\" (UniqueName: \"kubernetes.io/projected/e03ff9b9-1896-4759-8bf2-66cb9e1a5a55-kube-api-access-cxtxz\") pod \"mariadb-operator-controller-manager-f9fb45f8f-k6wbx\" (UID: \"e03ff9b9-1896-4759-8bf2-66cb9e1a5a55\") " pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-k6wbx"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.612511 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5knx8\" (UniqueName: \"kubernetes.io/projected/280ba6f4-ec1f-4afe-8a1a-db5f030495ae-kube-api-access-5knx8\") pod \"neutron-operator-controller-manager-79d585cb66-9wgqj\" (UID: \"280ba6f4-ec1f-4afe-8a1a-db5f030495ae\") " pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-9wgqj"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.612558 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbbd9\" (UniqueName: \"kubernetes.io/projected/716ce9a4-ee82-45d0-bb63-fab1eed3c1db-kube-api-access-bbbd9\") pod \"infra-operator-controller-manager-656bcbd775-drhrw\" (UID: \"716ce9a4-ee82-45d0-bb63-fab1eed3c1db\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-drhrw"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.612577 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nt6cm\" (UniqueName: \"kubernetes.io/projected/6b0569ca-c2be-4209-bce5-79feea80203c-kube-api-access-nt6cm\") pod \"horizon-operator-controller-manager-7ffbcb7588-nm5np\" (UID: \"6b0569ca-c2be-4209-bce5-79feea80203c\") " pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-nm5np"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.612597 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75f2c\" (UniqueName: \"kubernetes.io/projected/3e418980-7ef9-45e8-8d7d-5bd866af0d26-kube-api-access-75f2c\") pod \"heat-operator-controller-manager-858f76bbdd-km9s8\" (UID: \"3e418980-7ef9-45e8-8d7d-5bd866af0d26\") " pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-km9s8"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.612614 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkpft\" (UniqueName: \"kubernetes.io/projected/b41e76ff-471c-4874-88ab-66b6fae3a84a-kube-api-access-hkpft\") pod \"designate-operator-controller-manager-85d5d9dd78-2flss\" (UID: \"b41e76ff-471c-4874-88ab-66b6fae3a84a\") " pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-2flss"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.612745 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-xhnm8"
Oct 08 18:26:51 crc kubenswrapper[4750]: E1008 18:26:51.612813 4750 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Oct 08 18:26:51 crc kubenswrapper[4750]: E1008 18:26:51.612923 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/716ce9a4-ee82-45d0-bb63-fab1eed3c1db-cert podName:716ce9a4-ee82-45d0-bb63-fab1eed3c1db nodeName:}" failed. No retries permitted until 2025-10-08 18:26:52.112897128 +0000 UTC m=+968.025868141 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/716ce9a4-ee82-45d0-bb63-fab1eed3c1db-cert") pod "infra-operator-controller-manager-656bcbd775-drhrw" (UID: "716ce9a4-ee82-45d0-bb63-fab1eed3c1db") : secret "infra-operator-webhook-server-cert" not found
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.616401 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-ht6x7"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.619642 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-xhnm8"]
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.641390 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-79db49b9fb-bs9x4"]
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.642761 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-79db49b9fb-bs9x4"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.650690 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-677c5f5bff5bxpg"]
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.651153 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkpft\" (UniqueName: \"kubernetes.io/projected/b41e76ff-471c-4874-88ab-66b6fae3a84a-kube-api-access-hkpft\") pod \"designate-operator-controller-manager-85d5d9dd78-2flss\" (UID: \"b41e76ff-471c-4874-88ab-66b6fae3a84a\") " pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-2flss"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.651975 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-79db49b9fb-bs9x4"]
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.652073 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-677c5f5bff5bxpg"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.655241 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.656541 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-4c6tv"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.656856 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75f2c\" (UniqueName: \"kubernetes.io/projected/3e418980-7ef9-45e8-8d7d-5bd866af0d26-kube-api-access-75f2c\") pod \"heat-operator-controller-manager-858f76bbdd-km9s8\" (UID: \"3e418980-7ef9-45e8-8d7d-5bd866af0d26\") " pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-km9s8"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.657416 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv6cf\" (UniqueName: \"kubernetes.io/projected/3e9cf3d8-0e7a-40ce-a32e-25e8700d8307-kube-api-access-rv6cf\") pod \"glance-operator-controller-manager-84b9b84486-pcb44\" (UID: \"3e9cf3d8-0e7a-40ce-a32e-25e8700d8307\") " pod="openstack-operators/glance-operator-controller-manager-84b9b84486-pcb44"
Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.657779 4750 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-km9s8" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.665022 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt6cm\" (UniqueName: \"kubernetes.io/projected/6b0569ca-c2be-4209-bce5-79feea80203c-kube-api-access-nt6cm\") pod \"horizon-operator-controller-manager-7ffbcb7588-nm5np\" (UID: \"6b0569ca-c2be-4209-bce5-79feea80203c\") " pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-nm5np" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.665385 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-b44p6" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.665613 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-677c5f5bff5bxpg"] Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.673097 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-68b6c87b68-kn2bm"] Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.673134 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbbd9\" (UniqueName: \"kubernetes.io/projected/716ce9a4-ee82-45d0-bb63-fab1eed3c1db-kube-api-access-bbbd9\") pod \"infra-operator-controller-manager-656bcbd775-drhrw\" (UID: \"716ce9a4-ee82-45d0-bb63-fab1eed3c1db\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-drhrw" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.674226 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-db6d7f97b-qnpbb"] Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.674257 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-kn2bm" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.675751 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-qnpbb" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.684945 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-68b6c87b68-kn2bm"] Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.688338 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-xq9t6" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.688574 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-v9bmx" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.692926 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-db6d7f97b-qnpbb"] Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.697159 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-nm5np" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.713921 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxbl8\" (UniqueName: \"kubernetes.io/projected/1f01efe4-3233-44d1-8def-b5080cc50aac-kube-api-access-fxbl8\") pod \"manila-operator-controller-manager-5f67fbc655-prrpn\" (UID: \"1f01efe4-3233-44d1-8def-b5080cc50aac\") " pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-prrpn" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.713959 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pv9g4\" (UniqueName: \"kubernetes.io/projected/6bc7e757-dd1c-4334-b83e-c4bc73e96658-kube-api-access-pv9g4\") pod \"keystone-operator-controller-manager-55b6b7c7b8-f857v\" (UID: \"6bc7e757-dd1c-4334-b83e-c4bc73e96658\") " pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-f857v" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.713987 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-459w5\" (UniqueName: \"kubernetes.io/projected/909207ba-552c-4172-94c1-2e660e86e568-kube-api-access-459w5\") pod \"octavia-operator-controller-manager-69fdcfc5f5-xhnm8\" (UID: \"909207ba-552c-4172-94c1-2e660e86e568\") " pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-xhnm8" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.714020 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxtxz\" (UniqueName: \"kubernetes.io/projected/e03ff9b9-1896-4759-8bf2-66cb9e1a5a55-kube-api-access-cxtxz\") pod \"mariadb-operator-controller-manager-f9fb45f8f-k6wbx\" (UID: \"e03ff9b9-1896-4759-8bf2-66cb9e1a5a55\") " pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-k6wbx" Oct 08 18:26:51 crc 
kubenswrapper[4750]: I1008 18:26:51.714039 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5knx8\" (UniqueName: \"kubernetes.io/projected/280ba6f4-ec1f-4afe-8a1a-db5f030495ae-kube-api-access-5knx8\") pod \"neutron-operator-controller-manager-79d585cb66-9wgqj\" (UID: \"280ba6f4-ec1f-4afe-8a1a-db5f030495ae\") " pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-9wgqj" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.714066 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbz2c\" (UniqueName: \"kubernetes.io/projected/c53530b4-353e-4c78-87ad-baacd725dc79-kube-api-access-rbz2c\") pod \"nova-operator-controller-manager-5df598886f-w4sqr\" (UID: \"c53530b4-353e-4c78-87ad-baacd725dc79\") " pod="openstack-operators/nova-operator-controller-manager-5df598886f-w4sqr" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.714088 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-864d8\" (UniqueName: \"kubernetes.io/projected/9db814e8-3488-4ae4-ad24-b6f93f70c232-kube-api-access-864d8\") pod \"ovn-operator-controller-manager-79db49b9fb-bs9x4\" (UID: \"9db814e8-3488-4ae4-ad24-b6f93f70c232\") " pod="openstack-operators/ovn-operator-controller-manager-79db49b9fb-bs9x4" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.714126 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99vtr\" (UniqueName: \"kubernetes.io/projected/47b84278-c660-43eb-8c7d-686ceed80afd-kube-api-access-99vtr\") pod \"ironic-operator-controller-manager-9c5c78d49-s454n\" (UID: \"47b84278-c660-43eb-8c7d-686ceed80afd\") " pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-s454n" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.721653 4750 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/telemetry-operator-controller-manager-76796d4c6b-vrwkf"] Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.723373 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76796d4c6b-vrwkf" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.727936 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-k777g" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.732090 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76796d4c6b-vrwkf"] Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.745645 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pv9g4\" (UniqueName: \"kubernetes.io/projected/6bc7e757-dd1c-4334-b83e-c4bc73e96658-kube-api-access-pv9g4\") pod \"keystone-operator-controller-manager-55b6b7c7b8-f857v\" (UID: \"6bc7e757-dd1c-4334-b83e-c4bc73e96658\") " pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-f857v" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.752149 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxtxz\" (UniqueName: \"kubernetes.io/projected/e03ff9b9-1896-4759-8bf2-66cb9e1a5a55-kube-api-access-cxtxz\") pod \"mariadb-operator-controller-manager-f9fb45f8f-k6wbx\" (UID: \"e03ff9b9-1896-4759-8bf2-66cb9e1a5a55\") " pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-k6wbx" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.762568 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5knx8\" (UniqueName: \"kubernetes.io/projected/280ba6f4-ec1f-4afe-8a1a-db5f030495ae-kube-api-access-5knx8\") pod \"neutron-operator-controller-manager-79d585cb66-9wgqj\" (UID: \"280ba6f4-ec1f-4afe-8a1a-db5f030495ae\") " 
pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-9wgqj" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.765335 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxbl8\" (UniqueName: \"kubernetes.io/projected/1f01efe4-3233-44d1-8def-b5080cc50aac-kube-api-access-fxbl8\") pod \"manila-operator-controller-manager-5f67fbc655-prrpn\" (UID: \"1f01efe4-3233-44d1-8def-b5080cc50aac\") " pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-prrpn" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.772120 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99vtr\" (UniqueName: \"kubernetes.io/projected/47b84278-c660-43eb-8c7d-686ceed80afd-kube-api-access-99vtr\") pod \"ironic-operator-controller-manager-9c5c78d49-s454n\" (UID: \"47b84278-c660-43eb-8c7d-686ceed80afd\") " pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-s454n" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.792179 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-56c698c775-jsq4j"] Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.793404 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-56c698c775-jsq4j" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.801622 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-k6wbx" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.802585 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-x598t" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.815738 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7smk\" (UniqueName: \"kubernetes.io/projected/2239d744-45d5-40ac-b30a-929c4edd8489-kube-api-access-g7smk\") pod \"telemetry-operator-controller-manager-76796d4c6b-vrwkf\" (UID: \"2239d744-45d5-40ac-b30a-929c4edd8489\") " pod="openstack-operators/telemetry-operator-controller-manager-76796d4c6b-vrwkf" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.815772 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rkfl\" (UniqueName: \"kubernetes.io/projected/18f1f857-f60c-4c47-b5c9-d60207e60ef5-kube-api-access-2rkfl\") pod \"swift-operator-controller-manager-db6d7f97b-qnpbb\" (UID: \"18f1f857-f60c-4c47-b5c9-d60207e60ef5\") " pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-qnpbb" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.815807 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-459w5\" (UniqueName: \"kubernetes.io/projected/909207ba-552c-4172-94c1-2e660e86e568-kube-api-access-459w5\") pod \"octavia-operator-controller-manager-69fdcfc5f5-xhnm8\" (UID: \"909207ba-552c-4172-94c1-2e660e86e568\") " pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-xhnm8" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.815851 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7bda6fa0-aa4a-4e4b-9165-c50b976bdce7-cert\") pod 
\"openstack-baremetal-operator-controller-manager-677c5f5bff5bxpg\" (UID: \"7bda6fa0-aa4a-4e4b-9165-c50b976bdce7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-677c5f5bff5bxpg" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.815908 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbz2c\" (UniqueName: \"kubernetes.io/projected/c53530b4-353e-4c78-87ad-baacd725dc79-kube-api-access-rbz2c\") pod \"nova-operator-controller-manager-5df598886f-w4sqr\" (UID: \"c53530b4-353e-4c78-87ad-baacd725dc79\") " pod="openstack-operators/nova-operator-controller-manager-5df598886f-w4sqr" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.815931 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjxfv\" (UniqueName: \"kubernetes.io/projected/7bda6fa0-aa4a-4e4b-9165-c50b976bdce7-kube-api-access-zjxfv\") pod \"openstack-baremetal-operator-controller-manager-677c5f5bff5bxpg\" (UID: \"7bda6fa0-aa4a-4e4b-9165-c50b976bdce7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-677c5f5bff5bxpg" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.815957 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-864d8\" (UniqueName: \"kubernetes.io/projected/9db814e8-3488-4ae4-ad24-b6f93f70c232-kube-api-access-864d8\") pod \"ovn-operator-controller-manager-79db49b9fb-bs9x4\" (UID: \"9db814e8-3488-4ae4-ad24-b6f93f70c232\") " pod="openstack-operators/ovn-operator-controller-manager-79db49b9fb-bs9x4" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.815988 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmnd6\" (UniqueName: \"kubernetes.io/projected/9323399b-7f22-41aa-9db7-99772d8d6719-kube-api-access-vmnd6\") pod \"placement-operator-controller-manager-68b6c87b68-kn2bm\" (UID: 
\"9323399b-7f22-41aa-9db7-99772d8d6719\") " pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-kn2bm" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.836944 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-56c698c775-jsq4j"] Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.843630 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-864d8\" (UniqueName: \"kubernetes.io/projected/9db814e8-3488-4ae4-ad24-b6f93f70c232-kube-api-access-864d8\") pod \"ovn-operator-controller-manager-79db49b9fb-bs9x4\" (UID: \"9db814e8-3488-4ae4-ad24-b6f93f70c232\") " pod="openstack-operators/ovn-operator-controller-manager-79db49b9fb-bs9x4" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.858017 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-459w5\" (UniqueName: \"kubernetes.io/projected/909207ba-552c-4172-94c1-2e660e86e568-kube-api-access-459w5\") pod \"octavia-operator-controller-manager-69fdcfc5f5-xhnm8\" (UID: \"909207ba-552c-4172-94c1-2e660e86e568\") " pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-xhnm8" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.885022 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7794bc6bd-7876q"] Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.886850 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-7794bc6bd-7876q" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.898463 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-w4cmt" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.916982 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7794bc6bd-7876q"] Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.917577 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjxfv\" (UniqueName: \"kubernetes.io/projected/7bda6fa0-aa4a-4e4b-9165-c50b976bdce7-kube-api-access-zjxfv\") pod \"openstack-baremetal-operator-controller-manager-677c5f5bff5bxpg\" (UID: \"7bda6fa0-aa4a-4e4b-9165-c50b976bdce7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-677c5f5bff5bxpg" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.917679 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmnd6\" (UniqueName: \"kubernetes.io/projected/9323399b-7f22-41aa-9db7-99772d8d6719-kube-api-access-vmnd6\") pod \"placement-operator-controller-manager-68b6c87b68-kn2bm\" (UID: \"9323399b-7f22-41aa-9db7-99772d8d6719\") " pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-kn2bm" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.917801 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7smk\" (UniqueName: \"kubernetes.io/projected/2239d744-45d5-40ac-b30a-929c4edd8489-kube-api-access-g7smk\") pod \"telemetry-operator-controller-manager-76796d4c6b-vrwkf\" (UID: \"2239d744-45d5-40ac-b30a-929c4edd8489\") " pod="openstack-operators/telemetry-operator-controller-manager-76796d4c6b-vrwkf" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.917822 4750 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rkfl\" (UniqueName: \"kubernetes.io/projected/18f1f857-f60c-4c47-b5c9-d60207e60ef5-kube-api-access-2rkfl\") pod \"swift-operator-controller-manager-db6d7f97b-qnpbb\" (UID: \"18f1f857-f60c-4c47-b5c9-d60207e60ef5\") " pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-qnpbb" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.917887 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7bda6fa0-aa4a-4e4b-9165-c50b976bdce7-cert\") pod \"openstack-baremetal-operator-controller-manager-677c5f5bff5bxpg\" (UID: \"7bda6fa0-aa4a-4e4b-9165-c50b976bdce7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-677c5f5bff5bxpg" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.917959 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjgbv\" (UniqueName: \"kubernetes.io/projected/9b3555bd-2455-4844-a0a7-16ddfca85ce5-kube-api-access-fjgbv\") pod \"test-operator-controller-manager-56c698c775-jsq4j\" (UID: \"9b3555bd-2455-4844-a0a7-16ddfca85ce5\") " pod="openstack-operators/test-operator-controller-manager-56c698c775-jsq4j" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.918437 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbz2c\" (UniqueName: \"kubernetes.io/projected/c53530b4-353e-4c78-87ad-baacd725dc79-kube-api-access-rbz2c\") pod \"nova-operator-controller-manager-5df598886f-w4sqr\" (UID: \"c53530b4-353e-4c78-87ad-baacd725dc79\") " pod="openstack-operators/nova-operator-controller-manager-5df598886f-w4sqr" Oct 08 18:26:51 crc kubenswrapper[4750]: E1008 18:26:51.919177 4750 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 
08 18:26:51 crc kubenswrapper[4750]: E1008 18:26:51.919928 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bda6fa0-aa4a-4e4b-9165-c50b976bdce7-cert podName:7bda6fa0-aa4a-4e4b-9165-c50b976bdce7 nodeName:}" failed. No retries permitted until 2025-10-08 18:26:52.419908893 +0000 UTC m=+968.332879906 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7bda6fa0-aa4a-4e4b-9165-c50b976bdce7-cert") pod "openstack-baremetal-operator-controller-manager-677c5f5bff5bxpg" (UID: "7bda6fa0-aa4a-4e4b-9165-c50b976bdce7") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.929834 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-2flss" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.942840 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-pcb44" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.947359 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmnd6\" (UniqueName: \"kubernetes.io/projected/9323399b-7f22-41aa-9db7-99772d8d6719-kube-api-access-vmnd6\") pod \"placement-operator-controller-manager-68b6c87b68-kn2bm\" (UID: \"9323399b-7f22-41aa-9db7-99772d8d6719\") " pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-kn2bm" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.952685 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjxfv\" (UniqueName: \"kubernetes.io/projected/7bda6fa0-aa4a-4e4b-9165-c50b976bdce7-kube-api-access-zjxfv\") pod \"openstack-baremetal-operator-controller-manager-677c5f5bff5bxpg\" (UID: \"7bda6fa0-aa4a-4e4b-9165-c50b976bdce7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-677c5f5bff5bxpg" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.964742 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rkfl\" (UniqueName: \"kubernetes.io/projected/18f1f857-f60c-4c47-b5c9-d60207e60ef5-kube-api-access-2rkfl\") pod \"swift-operator-controller-manager-db6d7f97b-qnpbb\" (UID: \"18f1f857-f60c-4c47-b5c9-d60207e60ef5\") " pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-qnpbb" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.974438 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-9wgqj" Oct 08 18:26:51 crc kubenswrapper[4750]: I1008 18:26:51.975995 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7smk\" (UniqueName: \"kubernetes.io/projected/2239d744-45d5-40ac-b30a-929c4edd8489-kube-api-access-g7smk\") pod \"telemetry-operator-controller-manager-76796d4c6b-vrwkf\" (UID: \"2239d744-45d5-40ac-b30a-929c4edd8489\") " pod="openstack-operators/telemetry-operator-controller-manager-76796d4c6b-vrwkf" Oct 08 18:26:52 crc kubenswrapper[4750]: I1008 18:26:52.026911 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5df598886f-w4sqr" Oct 08 18:26:52 crc kubenswrapper[4750]: I1008 18:26:52.028205 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjgbv\" (UniqueName: \"kubernetes.io/projected/9b3555bd-2455-4844-a0a7-16ddfca85ce5-kube-api-access-fjgbv\") pod \"test-operator-controller-manager-56c698c775-jsq4j\" (UID: \"9b3555bd-2455-4844-a0a7-16ddfca85ce5\") " pod="openstack-operators/test-operator-controller-manager-56c698c775-jsq4j" Oct 08 18:26:52 crc kubenswrapper[4750]: I1008 18:26:52.028335 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sd8n\" (UniqueName: \"kubernetes.io/projected/96e0828f-b805-4aa0-b0fc-e64c3aeb6120-kube-api-access-6sd8n\") pod \"watcher-operator-controller-manager-7794bc6bd-7876q\" (UID: \"96e0828f-b805-4aa0-b0fc-e64c3aeb6120\") " pod="openstack-operators/watcher-operator-controller-manager-7794bc6bd-7876q" Oct 08 18:26:52 crc kubenswrapper[4750]: I1008 18:26:52.030529 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-s454n" Oct 08 18:26:52 crc kubenswrapper[4750]: I1008 18:26:52.047013 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-f857v" Oct 08 18:26:52 crc kubenswrapper[4750]: I1008 18:26:52.048584 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-xhnm8" Oct 08 18:26:52 crc kubenswrapper[4750]: I1008 18:26:52.061777 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjgbv\" (UniqueName: \"kubernetes.io/projected/9b3555bd-2455-4844-a0a7-16ddfca85ce5-kube-api-access-fjgbv\") pod \"test-operator-controller-manager-56c698c775-jsq4j\" (UID: \"9b3555bd-2455-4844-a0a7-16ddfca85ce5\") " pod="openstack-operators/test-operator-controller-manager-56c698c775-jsq4j" Oct 08 18:26:52 crc kubenswrapper[4750]: I1008 18:26:52.062875 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-prrpn" Oct 08 18:26:52 crc kubenswrapper[4750]: I1008 18:26:52.094470 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-598c4c5b5-s82d2"] Oct 08 18:26:52 crc kubenswrapper[4750]: I1008 18:26:52.097596 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-598c4c5b5-s82d2" Oct 08 18:26:52 crc kubenswrapper[4750]: I1008 18:26:52.099144 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-79db49b9fb-bs9x4" Oct 08 18:26:52 crc kubenswrapper[4750]: I1008 18:26:52.103461 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-qlzbh" Oct 08 18:26:52 crc kubenswrapper[4750]: I1008 18:26:52.103495 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 08 18:26:52 crc kubenswrapper[4750]: I1008 18:26:52.130605 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/716ce9a4-ee82-45d0-bb63-fab1eed3c1db-cert\") pod \"infra-operator-controller-manager-656bcbd775-drhrw\" (UID: \"716ce9a4-ee82-45d0-bb63-fab1eed3c1db\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-drhrw" Oct 08 18:26:52 crc kubenswrapper[4750]: I1008 18:26:52.130899 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sd8n\" (UniqueName: \"kubernetes.io/projected/96e0828f-b805-4aa0-b0fc-e64c3aeb6120-kube-api-access-6sd8n\") pod \"watcher-operator-controller-manager-7794bc6bd-7876q\" (UID: \"96e0828f-b805-4aa0-b0fc-e64c3aeb6120\") " pod="openstack-operators/watcher-operator-controller-manager-7794bc6bd-7876q" Oct 08 18:26:52 crc kubenswrapper[4750]: E1008 18:26:52.133635 4750 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 08 18:26:52 crc kubenswrapper[4750]: E1008 18:26:52.133752 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/716ce9a4-ee82-45d0-bb63-fab1eed3c1db-cert podName:716ce9a4-ee82-45d0-bb63-fab1eed3c1db nodeName:}" failed. No retries permitted until 2025-10-08 18:26:53.133712572 +0000 UTC m=+969.046683585 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/716ce9a4-ee82-45d0-bb63-fab1eed3c1db-cert") pod "infra-operator-controller-manager-656bcbd775-drhrw" (UID: "716ce9a4-ee82-45d0-bb63-fab1eed3c1db") : secret "infra-operator-webhook-server-cert" not found Oct 08 18:26:52 crc kubenswrapper[4750]: I1008 18:26:52.159525 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-598c4c5b5-s82d2"] Oct 08 18:26:52 crc kubenswrapper[4750]: I1008 18:26:52.159659 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sd8n\" (UniqueName: \"kubernetes.io/projected/96e0828f-b805-4aa0-b0fc-e64c3aeb6120-kube-api-access-6sd8n\") pod \"watcher-operator-controller-manager-7794bc6bd-7876q\" (UID: \"96e0828f-b805-4aa0-b0fc-e64c3aeb6120\") " pod="openstack-operators/watcher-operator-controller-manager-7794bc6bd-7876q" Oct 08 18:26:52 crc kubenswrapper[4750]: I1008 18:26:52.163580 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-tcnvk"] Oct 08 18:26:52 crc kubenswrapper[4750]: I1008 18:26:52.166728 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-tcnvk"] Oct 08 18:26:52 crc kubenswrapper[4750]: I1008 18:26:52.166839 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-tcnvk" Oct 08 18:26:52 crc kubenswrapper[4750]: I1008 18:26:52.172483 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-57hql" Oct 08 18:26:52 crc kubenswrapper[4750]: I1008 18:26:52.182129 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-kn2bm" Oct 08 18:26:52 crc kubenswrapper[4750]: I1008 18:26:52.212511 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-8q5dm" Oct 08 18:26:52 crc kubenswrapper[4750]: I1008 18:26:52.213599 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-k8v2z" Oct 08 18:26:52 crc kubenswrapper[4750]: I1008 18:26:52.228757 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-qnpbb" Oct 08 18:26:52 crc kubenswrapper[4750]: I1008 18:26:52.232514 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6115ecee-36ba-4281-afa3-21685170393b-cert\") pod \"openstack-operator-controller-manager-598c4c5b5-s82d2\" (UID: \"6115ecee-36ba-4281-afa3-21685170393b\") " pod="openstack-operators/openstack-operator-controller-manager-598c4c5b5-s82d2" Oct 08 18:26:52 crc kubenswrapper[4750]: I1008 18:26:52.232622 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dm6pb\" (UniqueName: \"kubernetes.io/projected/6115ecee-36ba-4281-afa3-21685170393b-kube-api-access-dm6pb\") pod \"openstack-operator-controller-manager-598c4c5b5-s82d2\" (UID: \"6115ecee-36ba-4281-afa3-21685170393b\") " pod="openstack-operators/openstack-operator-controller-manager-598c4c5b5-s82d2" Oct 08 18:26:52 crc kubenswrapper[4750]: I1008 18:26:52.247993 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76796d4c6b-vrwkf" Oct 08 18:26:52 crc kubenswrapper[4750]: I1008 18:26:52.277482 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-56c698c775-jsq4j" Oct 08 18:26:52 crc kubenswrapper[4750]: I1008 18:26:52.292526 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-7794bc6bd-7876q" Oct 08 18:26:52 crc kubenswrapper[4750]: I1008 18:26:52.298989 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-658bdf4b74-vwnz5"] Oct 08 18:26:52 crc kubenswrapper[4750]: I1008 18:26:52.336159 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dm6pb\" (UniqueName: \"kubernetes.io/projected/6115ecee-36ba-4281-afa3-21685170393b-kube-api-access-dm6pb\") pod \"openstack-operator-controller-manager-598c4c5b5-s82d2\" (UID: \"6115ecee-36ba-4281-afa3-21685170393b\") " pod="openstack-operators/openstack-operator-controller-manager-598c4c5b5-s82d2" Oct 08 18:26:52 crc kubenswrapper[4750]: I1008 18:26:52.336247 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f29tk\" (UniqueName: \"kubernetes.io/projected/bc8e10c4-5871-4909-af47-4b2c1c78d3be-kube-api-access-f29tk\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-tcnvk\" (UID: \"bc8e10c4-5871-4909-af47-4b2c1c78d3be\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-tcnvk" Oct 08 18:26:52 crc kubenswrapper[4750]: I1008 18:26:52.336321 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6115ecee-36ba-4281-afa3-21685170393b-cert\") pod \"openstack-operator-controller-manager-598c4c5b5-s82d2\" (UID: 
\"6115ecee-36ba-4281-afa3-21685170393b\") " pod="openstack-operators/openstack-operator-controller-manager-598c4c5b5-s82d2" Oct 08 18:26:52 crc kubenswrapper[4750]: I1008 18:26:52.355915 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dm6pb\" (UniqueName: \"kubernetes.io/projected/6115ecee-36ba-4281-afa3-21685170393b-kube-api-access-dm6pb\") pod \"openstack-operator-controller-manager-598c4c5b5-s82d2\" (UID: \"6115ecee-36ba-4281-afa3-21685170393b\") " pod="openstack-operators/openstack-operator-controller-manager-598c4c5b5-s82d2" Oct 08 18:26:52 crc kubenswrapper[4750]: I1008 18:26:52.376421 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6115ecee-36ba-4281-afa3-21685170393b-cert\") pod \"openstack-operator-controller-manager-598c4c5b5-s82d2\" (UID: \"6115ecee-36ba-4281-afa3-21685170393b\") " pod="openstack-operators/openstack-operator-controller-manager-598c4c5b5-s82d2" Oct 08 18:26:52 crc kubenswrapper[4750]: I1008 18:26:52.430488 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-598c4c5b5-s82d2" Oct 08 18:26:52 crc kubenswrapper[4750]: I1008 18:26:52.438008 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f29tk\" (UniqueName: \"kubernetes.io/projected/bc8e10c4-5871-4909-af47-4b2c1c78d3be-kube-api-access-f29tk\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-tcnvk\" (UID: \"bc8e10c4-5871-4909-af47-4b2c1c78d3be\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-tcnvk" Oct 08 18:26:52 crc kubenswrapper[4750]: I1008 18:26:52.438115 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7bda6fa0-aa4a-4e4b-9165-c50b976bdce7-cert\") pod \"openstack-baremetal-operator-controller-manager-677c5f5bff5bxpg\" (UID: \"7bda6fa0-aa4a-4e4b-9165-c50b976bdce7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-677c5f5bff5bxpg" Oct 08 18:26:52 crc kubenswrapper[4750]: E1008 18:26:52.438268 4750 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 08 18:26:52 crc kubenswrapper[4750]: E1008 18:26:52.438327 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bda6fa0-aa4a-4e4b-9165-c50b976bdce7-cert podName:7bda6fa0-aa4a-4e4b-9165-c50b976bdce7 nodeName:}" failed. No retries permitted until 2025-10-08 18:26:53.438309279 +0000 UTC m=+969.351280292 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7bda6fa0-aa4a-4e4b-9165-c50b976bdce7-cert") pod "openstack-baremetal-operator-controller-manager-677c5f5bff5bxpg" (UID: "7bda6fa0-aa4a-4e4b-9165-c50b976bdce7") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 08 18:26:52 crc kubenswrapper[4750]: I1008 18:26:52.453878 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7ffbcb7588-nm5np"] Oct 08 18:26:52 crc kubenswrapper[4750]: I1008 18:26:52.457493 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f29tk\" (UniqueName: \"kubernetes.io/projected/bc8e10c4-5871-4909-af47-4b2c1c78d3be-kube-api-access-f29tk\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-tcnvk\" (UID: \"bc8e10c4-5871-4909-af47-4b2c1c78d3be\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-tcnvk" Oct 08 18:26:52 crc kubenswrapper[4750]: W1008 18:26:52.498401 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b0569ca_c2be_4209_bce5_79feea80203c.slice/crio-4dbc667026039964dd3a17378ef0a363a516c0ef76173b0cab7d728625ee775b WatchSource:0}: Error finding container 4dbc667026039964dd3a17378ef0a363a516c0ef76173b0cab7d728625ee775b: Status 404 returned error can't find the container with id 4dbc667026039964dd3a17378ef0a363a516c0ef76173b0cab7d728625ee775b Oct 08 18:26:52 crc kubenswrapper[4750]: I1008 18:26:52.544653 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-tcnvk" Oct 08 18:26:52 crc kubenswrapper[4750]: I1008 18:26:52.605650 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-858f76bbdd-km9s8"] Oct 08 18:26:52 crc kubenswrapper[4750]: I1008 18:26:52.904099 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-85d5d9dd78-2flss"] Oct 08 18:26:52 crc kubenswrapper[4750]: I1008 18:26:52.934837 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84b9b84486-pcb44"] Oct 08 18:26:52 crc kubenswrapper[4750]: I1008 18:26:52.941517 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-k6wbx"] Oct 08 18:26:52 crc kubenswrapper[4750]: W1008 18:26:52.942529 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e9cf3d8_0e7a_40ce_a32e_25e8700d8307.slice/crio-fda7346f3be68b84fbee6eaf1268084a464817c9d464f5b53eff61b248ca4bf6 WatchSource:0}: Error finding container fda7346f3be68b84fbee6eaf1268084a464817c9d464f5b53eff61b248ca4bf6: Status 404 returned error can't find the container with id fda7346f3be68b84fbee6eaf1268084a464817c9d464f5b53eff61b248ca4bf6 Oct 08 18:26:52 crc kubenswrapper[4750]: W1008 18:26:52.946336 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode03ff9b9_1896_4759_8bf2_66cb9e1a5a55.slice/crio-1e58711c9f5917c4759ed82137d13897a23993828c496b07c9917087a9450cf8 WatchSource:0}: Error finding container 1e58711c9f5917c4759ed82137d13897a23993828c496b07c9917087a9450cf8: Status 404 returned error can't find the container with id 1e58711c9f5917c4759ed82137d13897a23993828c496b07c9917087a9450cf8 Oct 08 18:26:53 crc 
kubenswrapper[4750]: I1008 18:26:53.151759 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/716ce9a4-ee82-45d0-bb63-fab1eed3c1db-cert\") pod \"infra-operator-controller-manager-656bcbd775-drhrw\" (UID: \"716ce9a4-ee82-45d0-bb63-fab1eed3c1db\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-drhrw" Oct 08 18:26:53 crc kubenswrapper[4750]: I1008 18:26:53.161215 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/716ce9a4-ee82-45d0-bb63-fab1eed3c1db-cert\") pod \"infra-operator-controller-manager-656bcbd775-drhrw\" (UID: \"716ce9a4-ee82-45d0-bb63-fab1eed3c1db\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-drhrw" Oct 08 18:26:53 crc kubenswrapper[4750]: I1008 18:26:53.221312 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-drhrw" Oct 08 18:26:53 crc kubenswrapper[4750]: I1008 18:26:53.250617 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-2flss" event={"ID":"b41e76ff-471c-4874-88ab-66b6fae3a84a","Type":"ContainerStarted","Data":"e0a641176d2ca3bb863f600b31745afa9effe7c91bc17ee37d692c67505f5b02"} Oct 08 18:26:53 crc kubenswrapper[4750]: I1008 18:26:53.253197 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-vwnz5" event={"ID":"b22d34d7-5452-4982-96ff-100a0fbdf514","Type":"ContainerStarted","Data":"b6e8e244c0f3a2d4c10a14be97817b857f8582c563d8311a3d0a1ffa2f7c6cb8"} Oct 08 18:26:53 crc kubenswrapper[4750]: I1008 18:26:53.254521 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-k6wbx" 
event={"ID":"e03ff9b9-1896-4759-8bf2-66cb9e1a5a55","Type":"ContainerStarted","Data":"1e58711c9f5917c4759ed82137d13897a23993828c496b07c9917087a9450cf8"} Oct 08 18:26:53 crc kubenswrapper[4750]: I1008 18:26:53.255544 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-pcb44" event={"ID":"3e9cf3d8-0e7a-40ce-a32e-25e8700d8307","Type":"ContainerStarted","Data":"fda7346f3be68b84fbee6eaf1268084a464817c9d464f5b53eff61b248ca4bf6"} Oct 08 18:26:53 crc kubenswrapper[4750]: I1008 18:26:53.256879 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-nm5np" event={"ID":"6b0569ca-c2be-4209-bce5-79feea80203c","Type":"ContainerStarted","Data":"4dbc667026039964dd3a17378ef0a363a516c0ef76173b0cab7d728625ee775b"} Oct 08 18:26:53 crc kubenswrapper[4750]: I1008 18:26:53.257860 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-km9s8" event={"ID":"3e418980-7ef9-45e8-8d7d-5bd866af0d26","Type":"ContainerStarted","Data":"f42d9f87614a62bcacf84bd7f86fd3a69852302d38bc6f30fc8e6e32d9468304"} Oct 08 18:26:53 crc kubenswrapper[4750]: I1008 18:26:53.328175 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-68b6c87b68-kn2bm"] Oct 08 18:26:53 crc kubenswrapper[4750]: I1008 18:26:53.344483 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5f67fbc655-prrpn"] Oct 08 18:26:53 crc kubenswrapper[4750]: I1008 18:26:53.353717 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-f857v"] Oct 08 18:26:53 crc kubenswrapper[4750]: I1008 18:26:53.362638 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-xhnm8"] Oct 08 
18:26:53 crc kubenswrapper[4750]: I1008 18:26:53.365348 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-598c4c5b5-s82d2"] Oct 08 18:26:53 crc kubenswrapper[4750]: I1008 18:26:53.372196 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5df598886f-w4sqr"] Oct 08 18:26:53 crc kubenswrapper[4750]: I1008 18:26:53.376596 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-9c5c78d49-s454n"] Oct 08 18:26:53 crc kubenswrapper[4750]: I1008 18:26:53.381110 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-79db49b9fb-bs9x4"] Oct 08 18:26:53 crc kubenswrapper[4750]: I1008 18:26:53.384611 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76796d4c6b-vrwkf"] Oct 08 18:26:53 crc kubenswrapper[4750]: I1008 18:26:53.388594 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7b7fb68549-k8v2z"] Oct 08 18:26:53 crc kubenswrapper[4750]: I1008 18:26:53.392396 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-db6d7f97b-qnpbb"] Oct 08 18:26:53 crc kubenswrapper[4750]: I1008 18:26:53.396307 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-79d585cb66-9wgqj"] Oct 08 18:26:53 crc kubenswrapper[4750]: I1008 18:26:53.407883 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-tcnvk"] Oct 08 18:26:53 crc kubenswrapper[4750]: I1008 18:26:53.429824 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7794bc6bd-7876q"] Oct 08 18:26:53 crc 
kubenswrapper[4750]: I1008 18:26:53.435035 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-56c698c775-jsq4j"] Oct 08 18:26:53 crc kubenswrapper[4750]: W1008 18:26:53.453062 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod280ba6f4_ec1f_4afe_8a1a_db5f030495ae.slice/crio-67ea2aed37b4540494f5e78769fbb9f325c8a0a0ce0cdb49053c38335be79462 WatchSource:0}: Error finding container 67ea2aed37b4540494f5e78769fbb9f325c8a0a0ce0cdb49053c38335be79462: Status 404 returned error can't find the container with id 67ea2aed37b4540494f5e78769fbb9f325c8a0a0ce0cdb49053c38335be79462 Oct 08 18:26:53 crc kubenswrapper[4750]: I1008 18:26:53.456336 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7bda6fa0-aa4a-4e4b-9165-c50b976bdce7-cert\") pod \"openstack-baremetal-operator-controller-manager-677c5f5bff5bxpg\" (UID: \"7bda6fa0-aa4a-4e4b-9165-c50b976bdce7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-677c5f5bff5bxpg" Oct 08 18:26:53 crc kubenswrapper[4750]: W1008 18:26:53.456837 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2239d744_45d5_40ac_b30a_929c4edd8489.slice/crio-4882a80190149f2bef98d505727dd0ac0cbc69d5a8693f479650332f33aa43c5 WatchSource:0}: Error finding container 4882a80190149f2bef98d505727dd0ac0cbc69d5a8693f479650332f33aa43c5: Status 404 returned error can't find the container with id 4882a80190149f2bef98d505727dd0ac0cbc69d5a8693f479650332f33aa43c5 Oct 08 18:26:53 crc kubenswrapper[4750]: E1008 18:26:53.462007 4750 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:9d26476523320d70d6d457b91663e8c233ed320d77032a7c57a89ce1aedd3931,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g7smk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76796d4c6b-vrwkf_openstack-operators(2239d744-45d5-40ac-b30a-929c4edd8489): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 08 18:26:53 crc kubenswrapper[4750]: I1008 18:26:53.463772 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7bda6fa0-aa4a-4e4b-9165-c50b976bdce7-cert\") pod \"openstack-baremetal-operator-controller-manager-677c5f5bff5bxpg\" (UID: \"7bda6fa0-aa4a-4e4b-9165-c50b976bdce7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-677c5f5bff5bxpg" Oct 08 18:26:53 crc kubenswrapper[4750]: W1008 18:26:53.466771 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc8e10c4_5871_4909_af47_4b2c1c78d3be.slice/crio-cf0c78a22a8a93ede8100869c34916f1e13722b7bdc1ad8a28b2c3f11c7ed43f WatchSource:0}: Error finding container cf0c78a22a8a93ede8100869c34916f1e13722b7bdc1ad8a28b2c3f11c7ed43f: Status 404 returned error can't find the container with id cf0c78a22a8a93ede8100869c34916f1e13722b7bdc1ad8a28b2c3f11c7ed43f Oct 08 18:26:53 crc kubenswrapper[4750]: 
W1008 18:26:53.476787 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b3555bd_2455_4844_a0a7_16ddfca85ce5.slice/crio-de58aef24b764044e8751fb4d2e6ba70ae314c9cfa1bb9bf9d98cc15af440371 WatchSource:0}: Error finding container de58aef24b764044e8751fb4d2e6ba70ae314c9cfa1bb9bf9d98cc15af440371: Status 404 returned error can't find the container with id de58aef24b764044e8751fb4d2e6ba70ae314c9cfa1bb9bf9d98cc15af440371 Oct 08 18:26:53 crc kubenswrapper[4750]: E1008 18:26:53.478018 4750 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-f29tk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-tcnvk_openstack-operators(bc8e10c4-5871-4909-af47-4b2c1c78d3be): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 08 18:26:53 crc kubenswrapper[4750]: E1008 18:26:53.478489 4750 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:e4ae07e859166fc5e2cb4f8e0e2c3358b9d2e2d6721a3864d2e0c651d36698ca,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6sd8n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-7794bc6bd-7876q_openstack-operators(96e0828f-b805-4aa0-b0fc-e64c3aeb6120): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 08 18:26:53 crc kubenswrapper[4750]: E1008 18:26:53.479222 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-tcnvk" podUID="bc8e10c4-5871-4909-af47-4b2c1c78d3be" Oct 08 18:26:53 crc 
kubenswrapper[4750]: W1008 18:26:53.479384 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07dc6c54_47cf_48e9_aed2_3016f69594de.slice/crio-0f62f4131d0f04543ad73c16351368b0ab1974ec70dd91bbd176e001eac19c4f WatchSource:0}: Error finding container 0f62f4131d0f04543ad73c16351368b0ab1974ec70dd91bbd176e001eac19c4f: Status 404 returned error can't find the container with id 0f62f4131d0f04543ad73c16351368b0ab1974ec70dd91bbd176e001eac19c4f Oct 08 18:26:53 crc kubenswrapper[4750]: E1008 18:26:53.490303 4750 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:c487a793648e64af2d64df5f6efbda2d4fd586acd7aee6838d3ec2b3edd9efb9,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8k7gj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-7b7fb68549-k8v2z_openstack-operators(07dc6c54-47cf-48e9-aed2-3016f69594de): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 08 18:26:53 crc kubenswrapper[4750]: E1008 18:26:53.490423 4750 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:efa8fb78cffb573d299ffcc7bab1099affd2dbbab222152092b313074306e0a9,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 
-3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fjgbv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-56c698c775-jsq4j_openstack-operators(9b3555bd-2455-4844-a0a7-16ddfca85ce5): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 08 18:26:53 crc kubenswrapper[4750]: I1008 18:26:53.629043 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-677c5f5bff5bxpg" Oct 08 18:26:53 crc kubenswrapper[4750]: E1008 18:26:53.643796 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-76796d4c6b-vrwkf" podUID="2239d744-45d5-40ac-b30a-929c4edd8489" Oct 08 18:26:53 crc kubenswrapper[4750]: I1008 18:26:53.703927 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-656bcbd775-drhrw"] Oct 08 18:26:53 crc kubenswrapper[4750]: E1008 18:26:53.772077 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-7794bc6bd-7876q" podUID="96e0828f-b805-4aa0-b0fc-e64c3aeb6120" Oct 08 18:26:53 crc kubenswrapper[4750]: E1008 18:26:53.804293 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-56c698c775-jsq4j" podUID="9b3555bd-2455-4844-a0a7-16ddfca85ce5" Oct 08 18:26:53 crc kubenswrapper[4750]: I1008 18:26:53.900752 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-677c5f5bff5bxpg"] Oct 08 18:26:53 crc kubenswrapper[4750]: E1008 18:26:53.906693 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-k8v2z" podUID="07dc6c54-47cf-48e9-aed2-3016f69594de" Oct 08 18:26:53 crc kubenswrapper[4750]: W1008 18:26:53.915294 4750 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bda6fa0_aa4a_4e4b_9165_c50b976bdce7.slice/crio-51d937fd2e14111281039250b1ba644b8b8519563c92a3695990a85da063dbc3 WatchSource:0}: Error finding container 51d937fd2e14111281039250b1ba644b8b8519563c92a3695990a85da063dbc3: Status 404 returned error can't find the container with id 51d937fd2e14111281039250b1ba644b8b8519563c92a3695990a85da063dbc3 Oct 08 18:26:54 crc kubenswrapper[4750]: I1008 18:26:54.279314 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-9wgqj" event={"ID":"280ba6f4-ec1f-4afe-8a1a-db5f030495ae","Type":"ContainerStarted","Data":"67ea2aed37b4540494f5e78769fbb9f325c8a0a0ce0cdb49053c38335be79462"} Oct 08 18:26:54 crc kubenswrapper[4750]: I1008 18:26:54.280595 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5df598886f-w4sqr" event={"ID":"c53530b4-353e-4c78-87ad-baacd725dc79","Type":"ContainerStarted","Data":"a1eb6e837f9bc093e6c515c87e312592308c667fe16b59a7fe177c57cf34d0f5"} Oct 08 18:26:54 crc kubenswrapper[4750]: I1008 18:26:54.295509 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76796d4c6b-vrwkf" event={"ID":"2239d744-45d5-40ac-b30a-929c4edd8489","Type":"ContainerStarted","Data":"b1984be23c6dce286441167cc8f202ecb3f84c503ce4d625c6150b83eb3ca1c9"} Oct 08 18:26:54 crc kubenswrapper[4750]: I1008 18:26:54.295588 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76796d4c6b-vrwkf" event={"ID":"2239d744-45d5-40ac-b30a-929c4edd8489","Type":"ContainerStarted","Data":"4882a80190149f2bef98d505727dd0ac0cbc69d5a8693f479650332f33aa43c5"} Oct 08 18:26:54 crc kubenswrapper[4750]: E1008 18:26:54.297281 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: 
\"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:9d26476523320d70d6d457b91663e8c233ed320d77032a7c57a89ce1aedd3931\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-76796d4c6b-vrwkf" podUID="2239d744-45d5-40ac-b30a-929c4edd8489" Oct 08 18:26:54 crc kubenswrapper[4750]: I1008 18:26:54.297877 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-prrpn" event={"ID":"1f01efe4-3233-44d1-8def-b5080cc50aac","Type":"ContainerStarted","Data":"36dac011e63f170293709f060e86a918b2b36a0a8a319957e20e5e812579bb60"} Oct 08 18:26:54 crc kubenswrapper[4750]: I1008 18:26:54.302940 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-56c698c775-jsq4j" event={"ID":"9b3555bd-2455-4844-a0a7-16ddfca85ce5","Type":"ContainerStarted","Data":"534a94e2000f40125340aaa3704db71dbf7e333e9567d4df6557654bb7b6eae0"} Oct 08 18:26:54 crc kubenswrapper[4750]: I1008 18:26:54.302967 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-56c698c775-jsq4j" event={"ID":"9b3555bd-2455-4844-a0a7-16ddfca85ce5","Type":"ContainerStarted","Data":"de58aef24b764044e8751fb4d2e6ba70ae314c9cfa1bb9bf9d98cc15af440371"} Oct 08 18:26:54 crc kubenswrapper[4750]: I1008 18:26:54.307585 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-598c4c5b5-s82d2" event={"ID":"6115ecee-36ba-4281-afa3-21685170393b","Type":"ContainerStarted","Data":"9e6fabf97880243fa150632f0dbed10a1befa44b5b822b2455745d7d8960b1fc"} Oct 08 18:26:54 crc kubenswrapper[4750]: I1008 18:26:54.307622 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-598c4c5b5-s82d2" 
event={"ID":"6115ecee-36ba-4281-afa3-21685170393b","Type":"ContainerStarted","Data":"02f4e9528c6c241e6f63e91e1d7dd0faa6b754e025bff1142d23e278b6cd2f1a"} Oct 08 18:26:54 crc kubenswrapper[4750]: I1008 18:26:54.307634 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-598c4c5b5-s82d2" event={"ID":"6115ecee-36ba-4281-afa3-21685170393b","Type":"ContainerStarted","Data":"b7007141b6fef83df13d979c6c79cc48c972f6dd4e3d3133f293354e59d0c18e"} Oct 08 18:26:54 crc kubenswrapper[4750]: I1008 18:26:54.308164 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-598c4c5b5-s82d2" Oct 08 18:26:54 crc kubenswrapper[4750]: I1008 18:26:54.309210 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-677c5f5bff5bxpg" event={"ID":"7bda6fa0-aa4a-4e4b-9165-c50b976bdce7","Type":"ContainerStarted","Data":"51d937fd2e14111281039250b1ba644b8b8519563c92a3695990a85da063dbc3"} Oct 08 18:26:54 crc kubenswrapper[4750]: E1008 18:26:54.318074 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:efa8fb78cffb573d299ffcc7bab1099affd2dbbab222152092b313074306e0a9\\\"\"" pod="openstack-operators/test-operator-controller-manager-56c698c775-jsq4j" podUID="9b3555bd-2455-4844-a0a7-16ddfca85ce5" Oct 08 18:26:54 crc kubenswrapper[4750]: I1008 18:26:54.329529 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-f857v" event={"ID":"6bc7e757-dd1c-4334-b83e-c4bc73e96658","Type":"ContainerStarted","Data":"6acf3c8baa9374579fc943ad4025288e6cc1fd6cea63b2616c08a3235e223d23"} Oct 08 18:26:54 crc kubenswrapper[4750]: I1008 18:26:54.344882 4750 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-s454n" event={"ID":"47b84278-c660-43eb-8c7d-686ceed80afd","Type":"ContainerStarted","Data":"6969aa315ac23759d8154bf0000e399735e1f700dfc6e4a30f2f3e068a24c38b"} Oct 08 18:26:54 crc kubenswrapper[4750]: I1008 18:26:54.351200 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-k8v2z" event={"ID":"07dc6c54-47cf-48e9-aed2-3016f69594de","Type":"ContainerStarted","Data":"324ac617f280e55433f6450e92cd785d63cf2fc80a3fd33e648889661be5bc23"} Oct 08 18:26:54 crc kubenswrapper[4750]: I1008 18:26:54.351235 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-k8v2z" event={"ID":"07dc6c54-47cf-48e9-aed2-3016f69594de","Type":"ContainerStarted","Data":"0f62f4131d0f04543ad73c16351368b0ab1974ec70dd91bbd176e001eac19c4f"} Oct 08 18:26:54 crc kubenswrapper[4750]: E1008 18:26:54.355720 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:c487a793648e64af2d64df5f6efbda2d4fd586acd7aee6838d3ec2b3edd9efb9\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-k8v2z" podUID="07dc6c54-47cf-48e9-aed2-3016f69594de" Oct 08 18:26:54 crc kubenswrapper[4750]: I1008 18:26:54.379598 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7794bc6bd-7876q" event={"ID":"96e0828f-b805-4aa0-b0fc-e64c3aeb6120","Type":"ContainerStarted","Data":"18d7ccfe590968e3ef91786c37167479a3ec46d2bd6036cae6c2fe62646432c2"} Oct 08 18:26:54 crc kubenswrapper[4750]: I1008 18:26:54.379652 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7794bc6bd-7876q" 
event={"ID":"96e0828f-b805-4aa0-b0fc-e64c3aeb6120","Type":"ContainerStarted","Data":"1ade5ac65648f585b8d7210e0814d919d5ff17d49a28ba06b22d5f3817645b3a"} Oct 08 18:26:54 crc kubenswrapper[4750]: I1008 18:26:54.382155 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-598c4c5b5-s82d2" podStartSLOduration=2.382138929 podStartE2EDuration="2.382138929s" podCreationTimestamp="2025-10-08 18:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:26:54.380436608 +0000 UTC m=+970.293407621" watchObservedRunningTime="2025-10-08 18:26:54.382138929 +0000 UTC m=+970.295109942" Oct 08 18:26:54 crc kubenswrapper[4750]: E1008 18:26:54.384041 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:e4ae07e859166fc5e2cb4f8e0e2c3358b9d2e2d6721a3864d2e0c651d36698ca\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-7794bc6bd-7876q" podUID="96e0828f-b805-4aa0-b0fc-e64c3aeb6120" Oct 08 18:26:54 crc kubenswrapper[4750]: I1008 18:26:54.391500 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-qnpbb" event={"ID":"18f1f857-f60c-4c47-b5c9-d60207e60ef5","Type":"ContainerStarted","Data":"1a2ec9104f4c0b0fd705947893535a3c4499ae1ce65eac2bc23f02723c24bf50"} Oct 08 18:26:54 crc kubenswrapper[4750]: I1008 18:26:54.404433 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-79db49b9fb-bs9x4" event={"ID":"9db814e8-3488-4ae4-ad24-b6f93f70c232","Type":"ContainerStarted","Data":"c992f5a14c341e59107077c17045f9e5b41f6d65b4358c2b8024dfe909e29948"} Oct 08 18:26:54 crc kubenswrapper[4750]: I1008 18:26:54.412881 4750 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-drhrw" event={"ID":"716ce9a4-ee82-45d0-bb63-fab1eed3c1db","Type":"ContainerStarted","Data":"75c891b095aef10799507868e0dcdb5d8c0a78681629e7692b6110ad7e887597"} Oct 08 18:26:54 crc kubenswrapper[4750]: I1008 18:26:54.425685 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-kn2bm" event={"ID":"9323399b-7f22-41aa-9db7-99772d8d6719","Type":"ContainerStarted","Data":"3bdb915934a86241e7db2990077a88733a0af9f8902a7713bd0a062f6adc3bc4"} Oct 08 18:26:54 crc kubenswrapper[4750]: I1008 18:26:54.453661 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-xhnm8" event={"ID":"909207ba-552c-4172-94c1-2e660e86e568","Type":"ContainerStarted","Data":"453a67c7e49be4141cd35c63cd8e48fd19b65d5188a586b8c4f5be441a391a3f"} Oct 08 18:26:54 crc kubenswrapper[4750]: I1008 18:26:54.456927 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-tcnvk" event={"ID":"bc8e10c4-5871-4909-af47-4b2c1c78d3be","Type":"ContainerStarted","Data":"cf0c78a22a8a93ede8100869c34916f1e13722b7bdc1ad8a28b2c3f11c7ed43f"} Oct 08 18:26:54 crc kubenswrapper[4750]: E1008 18:26:54.461342 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-tcnvk" podUID="bc8e10c4-5871-4909-af47-4b2c1c78d3be" Oct 08 18:26:55 crc kubenswrapper[4750]: E1008 18:26:55.469273 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: 
\"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:9d26476523320d70d6d457b91663e8c233ed320d77032a7c57a89ce1aedd3931\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-76796d4c6b-vrwkf" podUID="2239d744-45d5-40ac-b30a-929c4edd8489" Oct 08 18:26:55 crc kubenswrapper[4750]: E1008 18:26:55.469349 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:efa8fb78cffb573d299ffcc7bab1099affd2dbbab222152092b313074306e0a9\\\"\"" pod="openstack-operators/test-operator-controller-manager-56c698c775-jsq4j" podUID="9b3555bd-2455-4844-a0a7-16ddfca85ce5" Oct 08 18:26:55 crc kubenswrapper[4750]: E1008 18:26:55.469480 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:c487a793648e64af2d64df5f6efbda2d4fd586acd7aee6838d3ec2b3edd9efb9\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-k8v2z" podUID="07dc6c54-47cf-48e9-aed2-3016f69594de" Oct 08 18:26:55 crc kubenswrapper[4750]: E1008 18:26:55.469466 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-tcnvk" podUID="bc8e10c4-5871-4909-af47-4b2c1c78d3be" Oct 08 18:26:55 crc kubenswrapper[4750]: E1008 18:26:55.473891 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:e4ae07e859166fc5e2cb4f8e0e2c3358b9d2e2d6721a3864d2e0c651d36698ca\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-7794bc6bd-7876q" podUID="96e0828f-b805-4aa0-b0fc-e64c3aeb6120" Oct 08 18:27:02 crc kubenswrapper[4750]: I1008 18:27:02.440112 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-598c4c5b5-s82d2" Oct 08 18:27:03 crc kubenswrapper[4750]: I1008 18:27:03.526980 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5df598886f-w4sqr" event={"ID":"c53530b4-353e-4c78-87ad-baacd725dc79","Type":"ContainerStarted","Data":"eecbf5f06ea11a9eccc8cea044141ab946bd77d8f6533a4e8a5639453396f571"} Oct 08 18:27:03 crc kubenswrapper[4750]: I1008 18:27:03.537138 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-drhrw" event={"ID":"716ce9a4-ee82-45d0-bb63-fab1eed3c1db","Type":"ContainerStarted","Data":"ad7507adcc43c8615bdab56c01f99b3e820e91e255cbfd8d8a5b5c1ba32ab76f"} Oct 08 18:27:03 crc kubenswrapper[4750]: I1008 18:27:03.537182 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-drhrw" event={"ID":"716ce9a4-ee82-45d0-bb63-fab1eed3c1db","Type":"ContainerStarted","Data":"b7efc0ddd0dba91e8a7b124309d37b0cbeb405c7814c5bf26b8040835d77cb5c"} Oct 08 18:27:03 crc kubenswrapper[4750]: I1008 18:27:03.538204 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-drhrw" Oct 08 18:27:03 crc kubenswrapper[4750]: I1008 18:27:03.544853 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-prrpn" 
event={"ID":"1f01efe4-3233-44d1-8def-b5080cc50aac","Type":"ContainerStarted","Data":"3e7714d84643a79c31c0aeb1c5dde7bf9c28fe0377b98c53ac6d70a8181ba45e"} Oct 08 18:27:03 crc kubenswrapper[4750]: I1008 18:27:03.561853 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-drhrw" podStartSLOduration=3.915071784 podStartE2EDuration="12.561830612s" podCreationTimestamp="2025-10-08 18:26:51 +0000 UTC" firstStartedPulling="2025-10-08 18:26:53.738016277 +0000 UTC m=+969.650987290" lastFinishedPulling="2025-10-08 18:27:02.384775105 +0000 UTC m=+978.297746118" observedRunningTime="2025-10-08 18:27:03.556360328 +0000 UTC m=+979.469331341" watchObservedRunningTime="2025-10-08 18:27:03.561830612 +0000 UTC m=+979.474801615" Oct 08 18:27:03 crc kubenswrapper[4750]: I1008 18:27:03.574793 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-vwnz5" event={"ID":"b22d34d7-5452-4982-96ff-100a0fbdf514","Type":"ContainerStarted","Data":"7124bc39a45771800b9b8376dc845053b8b5c282a9c4caf6462898da5bb06863"} Oct 08 18:27:03 crc kubenswrapper[4750]: I1008 18:27:03.574832 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-vwnz5" event={"ID":"b22d34d7-5452-4982-96ff-100a0fbdf514","Type":"ContainerStarted","Data":"5fe05c246d5f943d3009c9f049fcd860cffccab13fa8c7efab37cac224bb24a5"} Oct 08 18:27:03 crc kubenswrapper[4750]: I1008 18:27:03.575290 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-vwnz5" Oct 08 18:27:03 crc kubenswrapper[4750]: I1008 18:27:03.585149 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-2flss" 
event={"ID":"b41e76ff-471c-4874-88ab-66b6fae3a84a","Type":"ContainerStarted","Data":"e329c4930bbe6b848ab3c033448cd677448b63e886ff12672570fb212aa8a308"} Oct 08 18:27:03 crc kubenswrapper[4750]: I1008 18:27:03.594318 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-k6wbx" event={"ID":"e03ff9b9-1896-4759-8bf2-66cb9e1a5a55","Type":"ContainerStarted","Data":"60805222464a01ba43bc42b1d914506dd6e7622eb8cc48e31a48afcb761be107"} Oct 08 18:27:03 crc kubenswrapper[4750]: I1008 18:27:03.615534 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-vwnz5" podStartSLOduration=2.5975449040000003 podStartE2EDuration="12.61551746s" podCreationTimestamp="2025-10-08 18:26:51 +0000 UTC" firstStartedPulling="2025-10-08 18:26:52.356843054 +0000 UTC m=+968.269814067" lastFinishedPulling="2025-10-08 18:27:02.37481561 +0000 UTC m=+978.287786623" observedRunningTime="2025-10-08 18:27:03.615329664 +0000 UTC m=+979.528300677" watchObservedRunningTime="2025-10-08 18:27:03.61551746 +0000 UTC m=+979.528488463" Oct 08 18:27:03 crc kubenswrapper[4750]: I1008 18:27:03.627746 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-677c5f5bff5bxpg" event={"ID":"7bda6fa0-aa4a-4e4b-9165-c50b976bdce7","Type":"ContainerStarted","Data":"5aec41e0e59c257e87168a69d3643d8d1d2f4162d78ad1e6515d4384fe2443cf"} Oct 08 18:27:03 crc kubenswrapper[4750]: I1008 18:27:03.645716 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-9wgqj" event={"ID":"280ba6f4-ec1f-4afe-8a1a-db5f030495ae","Type":"ContainerStarted","Data":"a5056343f21e3dc7e6ad04802687ec2e28413a7a5262ec8c2dee68bdb417124b"} Oct 08 18:27:03 crc kubenswrapper[4750]: I1008 18:27:03.669867 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ovn-operator-controller-manager-79db49b9fb-bs9x4" event={"ID":"9db814e8-3488-4ae4-ad24-b6f93f70c232","Type":"ContainerStarted","Data":"447d4f428b42c85cd9e4cfae2c2ef5472812434c5ed5e3979f931f4678033530"} Oct 08 18:27:03 crc kubenswrapper[4750]: I1008 18:27:03.701956 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-kn2bm" event={"ID":"9323399b-7f22-41aa-9db7-99772d8d6719","Type":"ContainerStarted","Data":"cab9133720d24a006034f03c0a07a52bf6371e65a29d89e5d98bf46f3a7a798b"} Oct 08 18:27:03 crc kubenswrapper[4750]: I1008 18:27:03.703202 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-f857v" event={"ID":"6bc7e757-dd1c-4334-b83e-c4bc73e96658","Type":"ContainerStarted","Data":"cc87676d32b627fdc44bf50f75bfab5f4ddc780f708eb7c871ae592ff8cbee55"} Oct 08 18:27:03 crc kubenswrapper[4750]: I1008 18:27:03.704270 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-s454n" event={"ID":"47b84278-c660-43eb-8c7d-686ceed80afd","Type":"ContainerStarted","Data":"b5b8c8e26a53a9115bb94143bc35a1e4739d4222224c4652f0da93f30756d6ac"} Oct 08 18:27:03 crc kubenswrapper[4750]: I1008 18:27:03.704287 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-s454n" event={"ID":"47b84278-c660-43eb-8c7d-686ceed80afd","Type":"ContainerStarted","Data":"244f7abe0499a5ff730b90b5b2c5ac63532fa76f7dab0b5bd45c28bc20d8d5c2"} Oct 08 18:27:03 crc kubenswrapper[4750]: I1008 18:27:03.705273 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-s454n" Oct 08 18:27:03 crc kubenswrapper[4750]: I1008 18:27:03.726720 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-nm5np" event={"ID":"6b0569ca-c2be-4209-bce5-79feea80203c","Type":"ContainerStarted","Data":"3dec86482a88702d3035635445e93ff9425bbd1c64c49f8f48b0dd6de60779b5"} Oct 08 18:27:03 crc kubenswrapper[4750]: I1008 18:27:03.726764 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-nm5np" event={"ID":"6b0569ca-c2be-4209-bce5-79feea80203c","Type":"ContainerStarted","Data":"36f0989887ba8ed318ecdce54fbeeb7a10f5b4383e5c5ee628d1f6ccdb788e69"} Oct 08 18:27:03 crc kubenswrapper[4750]: I1008 18:27:03.727469 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-nm5np" Oct 08 18:27:03 crc kubenswrapper[4750]: I1008 18:27:03.736808 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-qnpbb" event={"ID":"18f1f857-f60c-4c47-b5c9-d60207e60ef5","Type":"ContainerStarted","Data":"48df3ca7f09076de14b317f6b3868980893edd035032c4109d88e8ecfbf8d099"} Oct 08 18:27:03 crc kubenswrapper[4750]: I1008 18:27:03.756603 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-s454n" podStartSLOduration=3.8087962319999997 podStartE2EDuration="12.756586359s" podCreationTimestamp="2025-10-08 18:26:51 +0000 UTC" firstStartedPulling="2025-10-08 18:26:53.433334988 +0000 UTC m=+969.346305991" lastFinishedPulling="2025-10-08 18:27:02.381125105 +0000 UTC m=+978.294096118" observedRunningTime="2025-10-08 18:27:03.754374085 +0000 UTC m=+979.667345098" watchObservedRunningTime="2025-10-08 18:27:03.756586359 +0000 UTC m=+979.669557382" Oct 08 18:27:03 crc kubenswrapper[4750]: I1008 18:27:03.773779 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-xhnm8" 
event={"ID":"909207ba-552c-4172-94c1-2e660e86e568","Type":"ContainerStarted","Data":"73433cfbe87538dec47ca3e4f33c58a417530ee200e320601ed18e6b1960866a"} Oct 08 18:27:03 crc kubenswrapper[4750]: I1008 18:27:03.811049 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-pcb44" event={"ID":"3e9cf3d8-0e7a-40ce-a32e-25e8700d8307","Type":"ContainerStarted","Data":"d6590f84ffefca77053cf65d482ca078cc65d4d876d6757c7e8f6d0e5c441910"} Oct 08 18:27:04 crc kubenswrapper[4750]: I1008 18:27:04.818111 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5df598886f-w4sqr" event={"ID":"c53530b4-353e-4c78-87ad-baacd725dc79","Type":"ContainerStarted","Data":"abdbefdfdbbef6d2834fcc6c06f46a0df753a97a5bf2450380b91ea9c1786704"} Oct 08 18:27:04 crc kubenswrapper[4750]: I1008 18:27:04.819382 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5df598886f-w4sqr" Oct 08 18:27:04 crc kubenswrapper[4750]: I1008 18:27:04.820953 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-79db49b9fb-bs9x4" event={"ID":"9db814e8-3488-4ae4-ad24-b6f93f70c232","Type":"ContainerStarted","Data":"37606d266a007d8e8fb4cd027f4940647f189bd703862a48d31affc54fb6da1f"} Oct 08 18:27:04 crc kubenswrapper[4750]: I1008 18:27:04.821586 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-79db49b9fb-bs9x4" Oct 08 18:27:04 crc kubenswrapper[4750]: I1008 18:27:04.822998 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-pcb44" event={"ID":"3e9cf3d8-0e7a-40ce-a32e-25e8700d8307","Type":"ContainerStarted","Data":"0aec86e084e8b6dab82152efc9faae399234d1b8385df56953adfc150b0fad95"} Oct 08 18:27:04 crc kubenswrapper[4750]: 
I1008 18:27:04.823441 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-pcb44" Oct 08 18:27:04 crc kubenswrapper[4750]: I1008 18:27:04.825490 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-xhnm8" event={"ID":"909207ba-552c-4172-94c1-2e660e86e568","Type":"ContainerStarted","Data":"1be2b53367abed21d0289aba530f905efb2c00b951522b9531fa4d4ba5521223"} Oct 08 18:27:04 crc kubenswrapper[4750]: I1008 18:27:04.825608 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-xhnm8" Oct 08 18:27:04 crc kubenswrapper[4750]: I1008 18:27:04.827801 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-km9s8" event={"ID":"3e418980-7ef9-45e8-8d7d-5bd866af0d26","Type":"ContainerStarted","Data":"270ebcb2cb04f452f5946135861fb11a05f76435304e713a183a62e779961b7a"} Oct 08 18:27:04 crc kubenswrapper[4750]: I1008 18:27:04.827839 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-km9s8" Oct 08 18:27:04 crc kubenswrapper[4750]: I1008 18:27:04.827851 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-km9s8" event={"ID":"3e418980-7ef9-45e8-8d7d-5bd866af0d26","Type":"ContainerStarted","Data":"90176d0a0deb350b736ee14a7f702f0fd44e6f1e0953cf923d283af99e0338f8"} Oct 08 18:27:04 crc kubenswrapper[4750]: I1008 18:27:04.830176 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-677c5f5bff5bxpg" event={"ID":"7bda6fa0-aa4a-4e4b-9165-c50b976bdce7","Type":"ContainerStarted","Data":"3aad17c112ae689d91b1ab178ce03d919aeff25bc9c1c1eff1a91201e194d37c"} Oct 08 
18:27:04 crc kubenswrapper[4750]: I1008 18:27:04.830805 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-677c5f5bff5bxpg" Oct 08 18:27:04 crc kubenswrapper[4750]: I1008 18:27:04.832331 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-9wgqj" event={"ID":"280ba6f4-ec1f-4afe-8a1a-db5f030495ae","Type":"ContainerStarted","Data":"7f39d78497175fcd850a953e6ca13f234ab417cf8c7a9583c73c5dc70d7e8456"} Oct 08 18:27:04 crc kubenswrapper[4750]: I1008 18:27:04.832797 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-9wgqj" Oct 08 18:27:04 crc kubenswrapper[4750]: I1008 18:27:04.834347 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-f857v" event={"ID":"6bc7e757-dd1c-4334-b83e-c4bc73e96658","Type":"ContainerStarted","Data":"9d74710d6df364c997fffb2d4d75dbafc3ba9f297a5cf289011187e1c7fe23da"} Oct 08 18:27:04 crc kubenswrapper[4750]: I1008 18:27:04.834776 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-f857v" Oct 08 18:27:04 crc kubenswrapper[4750]: I1008 18:27:04.836125 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-k6wbx" event={"ID":"e03ff9b9-1896-4759-8bf2-66cb9e1a5a55","Type":"ContainerStarted","Data":"d60e8044defec343bd0a973f8f61fa381f4517da259876228c92a5aec79a719d"} Oct 08 18:27:04 crc kubenswrapper[4750]: I1008 18:27:04.836586 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-k6wbx" Oct 08 18:27:04 crc kubenswrapper[4750]: I1008 18:27:04.838572 4750 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-kn2bm" event={"ID":"9323399b-7f22-41aa-9db7-99772d8d6719","Type":"ContainerStarted","Data":"ab7b6a9dcb5dfe8711f85bd54f3040f6a20f277eb3d6f44af1cd5e44cf598f32"} Oct 08 18:27:04 crc kubenswrapper[4750]: I1008 18:27:04.839019 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-kn2bm" Oct 08 18:27:04 crc kubenswrapper[4750]: I1008 18:27:04.844732 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-prrpn" event={"ID":"1f01efe4-3233-44d1-8def-b5080cc50aac","Type":"ContainerStarted","Data":"cb6ac0d8c8cd3d841de342b0c5bf76ddc949c067269bacebf49488c18fe90017"} Oct 08 18:27:04 crc kubenswrapper[4750]: I1008 18:27:04.844888 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-nm5np" podStartSLOduration=3.9610307860000002 podStartE2EDuration="13.84486708s" podCreationTimestamp="2025-10-08 18:26:51 +0000 UTC" firstStartedPulling="2025-10-08 18:26:52.501987476 +0000 UTC m=+968.414958479" lastFinishedPulling="2025-10-08 18:27:02.38582377 +0000 UTC m=+978.298794773" observedRunningTime="2025-10-08 18:27:03.784072113 +0000 UTC m=+979.697043126" watchObservedRunningTime="2025-10-08 18:27:04.84486708 +0000 UTC m=+980.757838083" Oct 08 18:27:04 crc kubenswrapper[4750]: I1008 18:27:04.845258 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-prrpn" Oct 08 18:27:04 crc kubenswrapper[4750]: I1008 18:27:04.845888 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5df598886f-w4sqr" podStartSLOduration=4.896122718 podStartE2EDuration="13.845883694s" podCreationTimestamp="2025-10-08 
18:26:51 +0000 UTC" firstStartedPulling="2025-10-08 18:26:53.435133582 +0000 UTC m=+969.348104595" lastFinishedPulling="2025-10-08 18:27:02.384894518 +0000 UTC m=+978.297865571" observedRunningTime="2025-10-08 18:27:04.844689405 +0000 UTC m=+980.757660438" watchObservedRunningTime="2025-10-08 18:27:04.845883694 +0000 UTC m=+980.758854707" Oct 08 18:27:04 crc kubenswrapper[4750]: I1008 18:27:04.847013 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-2flss" event={"ID":"b41e76ff-471c-4874-88ab-66b6fae3a84a","Type":"ContainerStarted","Data":"603788c2ddd8bcc62a81cbbaf3df8e0cd76b7be13ca942fcffc843a791c8a9ff"} Oct 08 18:27:04 crc kubenswrapper[4750]: I1008 18:27:04.849440 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-qnpbb" event={"ID":"18f1f857-f60c-4c47-b5c9-d60207e60ef5","Type":"ContainerStarted","Data":"370777fe196fc9553ed243de68246485e1543d32c540170c4eeb51cbbe9c0b13"} Oct 08 18:27:04 crc kubenswrapper[4750]: I1008 18:27:04.874640 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-kn2bm" podStartSLOduration=4.90463943 podStartE2EDuration="13.87461926s" podCreationTimestamp="2025-10-08 18:26:51 +0000 UTC" firstStartedPulling="2025-10-08 18:26:53.432507839 +0000 UTC m=+969.345478852" lastFinishedPulling="2025-10-08 18:27:02.402487669 +0000 UTC m=+978.315458682" observedRunningTime="2025-10-08 18:27:04.87263069 +0000 UTC m=+980.785601703" watchObservedRunningTime="2025-10-08 18:27:04.87461926 +0000 UTC m=+980.787590273" Oct 08 18:27:04 crc kubenswrapper[4750]: I1008 18:27:04.903803 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-km9s8" podStartSLOduration=4.135168496 podStartE2EDuration="13.903786705s" 
podCreationTimestamp="2025-10-08 18:26:51 +0000 UTC" firstStartedPulling="2025-10-08 18:26:52.634035694 +0000 UTC m=+968.547006707" lastFinishedPulling="2025-10-08 18:27:02.402653913 +0000 UTC m=+978.315624916" observedRunningTime="2025-10-08 18:27:04.89911491 +0000 UTC m=+980.812085933" watchObservedRunningTime="2025-10-08 18:27:04.903786705 +0000 UTC m=+980.816757718" Oct 08 18:27:04 crc kubenswrapper[4750]: I1008 18:27:04.926319 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-k6wbx" podStartSLOduration=4.4361978 podStartE2EDuration="13.926298927s" podCreationTimestamp="2025-10-08 18:26:51 +0000 UTC" firstStartedPulling="2025-10-08 18:26:52.948324524 +0000 UTC m=+968.861295537" lastFinishedPulling="2025-10-08 18:27:02.438425651 +0000 UTC m=+978.351396664" observedRunningTime="2025-10-08 18:27:04.919022229 +0000 UTC m=+980.831993232" watchObservedRunningTime="2025-10-08 18:27:04.926298927 +0000 UTC m=+980.839269940" Oct 08 18:27:04 crc kubenswrapper[4750]: I1008 18:27:04.954891 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-677c5f5bff5bxpg" podStartSLOduration=5.457267734 podStartE2EDuration="13.954875138s" podCreationTimestamp="2025-10-08 18:26:51 +0000 UTC" firstStartedPulling="2025-10-08 18:26:53.93323497 +0000 UTC m=+969.846205983" lastFinishedPulling="2025-10-08 18:27:02.430842374 +0000 UTC m=+978.343813387" observedRunningTime="2025-10-08 18:27:04.949525906 +0000 UTC m=+980.862496919" watchObservedRunningTime="2025-10-08 18:27:04.954875138 +0000 UTC m=+980.867846151" Oct 08 18:27:04 crc kubenswrapper[4750]: I1008 18:27:04.975495 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-pcb44" podStartSLOduration=4.499120316 podStartE2EDuration="13.975473803s" 
podCreationTimestamp="2025-10-08 18:26:51 +0000 UTC" firstStartedPulling="2025-10-08 18:26:52.944376949 +0000 UTC m=+968.857347962" lastFinishedPulling="2025-10-08 18:27:02.420730436 +0000 UTC m=+978.333701449" observedRunningTime="2025-10-08 18:27:04.970714596 +0000 UTC m=+980.883685609" watchObservedRunningTime="2025-10-08 18:27:04.975473803 +0000 UTC m=+980.888444806" Oct 08 18:27:04 crc kubenswrapper[4750]: I1008 18:27:04.989737 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-79db49b9fb-bs9x4" podStartSLOduration=4.940223928 podStartE2EDuration="13.989718682s" podCreationTimestamp="2025-10-08 18:26:51 +0000 UTC" firstStartedPulling="2025-10-08 18:26:53.400747406 +0000 UTC m=+969.313718419" lastFinishedPulling="2025-10-08 18:27:02.45024215 +0000 UTC m=+978.363213173" observedRunningTime="2025-10-08 18:27:04.984984827 +0000 UTC m=+980.897955840" watchObservedRunningTime="2025-10-08 18:27:04.989718682 +0000 UTC m=+980.902689695" Oct 08 18:27:05 crc kubenswrapper[4750]: I1008 18:27:05.005727 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-9wgqj" podStartSLOduration=5.033549366 podStartE2EDuration="14.005707214s" podCreationTimestamp="2025-10-08 18:26:51 +0000 UTC" firstStartedPulling="2025-10-08 18:26:53.458627695 +0000 UTC m=+969.371598708" lastFinishedPulling="2025-10-08 18:27:02.430785553 +0000 UTC m=+978.343756556" observedRunningTime="2025-10-08 18:27:05.000381004 +0000 UTC m=+980.913352027" watchObservedRunningTime="2025-10-08 18:27:05.005707214 +0000 UTC m=+980.918678227" Oct 08 18:27:05 crc kubenswrapper[4750]: I1008 18:27:05.014586 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-prrpn" podStartSLOduration=5.007096573 podStartE2EDuration="14.014569362s" podCreationTimestamp="2025-10-08 
18:26:51 +0000 UTC" firstStartedPulling="2025-10-08 18:26:53.432245493 +0000 UTC m=+969.345216496" lastFinishedPulling="2025-10-08 18:27:02.439718272 +0000 UTC m=+978.352689285" observedRunningTime="2025-10-08 18:27:05.013479055 +0000 UTC m=+980.926450088" watchObservedRunningTime="2025-10-08 18:27:05.014569362 +0000 UTC m=+980.927540395" Oct 08 18:27:05 crc kubenswrapper[4750]: I1008 18:27:05.049076 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-f857v" podStartSLOduration=5.032048869 podStartE2EDuration="14.049048187s" podCreationTimestamp="2025-10-08 18:26:51 +0000 UTC" firstStartedPulling="2025-10-08 18:26:53.403223246 +0000 UTC m=+969.316194259" lastFinishedPulling="2025-10-08 18:27:02.420222564 +0000 UTC m=+978.333193577" observedRunningTime="2025-10-08 18:27:05.035853374 +0000 UTC m=+980.948824387" watchObservedRunningTime="2025-10-08 18:27:05.049048187 +0000 UTC m=+980.962019200" Oct 08 18:27:05 crc kubenswrapper[4750]: I1008 18:27:05.060578 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-xhnm8" podStartSLOduration=5.043189757 podStartE2EDuration="14.06056114s" podCreationTimestamp="2025-10-08 18:26:51 +0000 UTC" firstStartedPulling="2025-10-08 18:26:53.432897838 +0000 UTC m=+969.345868851" lastFinishedPulling="2025-10-08 18:27:02.450269221 +0000 UTC m=+978.363240234" observedRunningTime="2025-10-08 18:27:05.057467403 +0000 UTC m=+980.970438416" watchObservedRunningTime="2025-10-08 18:27:05.06056114 +0000 UTC m=+980.973532153" Oct 08 18:27:05 crc kubenswrapper[4750]: I1008 18:27:05.080026 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-qnpbb" podStartSLOduration=5.128468907 podStartE2EDuration="14.080005807s" podCreationTimestamp="2025-10-08 18:26:51 +0000 UTC" 
firstStartedPulling="2025-10-08 18:26:53.433228335 +0000 UTC m=+969.346199348" lastFinishedPulling="2025-10-08 18:27:02.384765235 +0000 UTC m=+978.297736248" observedRunningTime="2025-10-08 18:27:05.079575036 +0000 UTC m=+980.992546069" watchObservedRunningTime="2025-10-08 18:27:05.080005807 +0000 UTC m=+980.992976810" Oct 08 18:27:05 crc kubenswrapper[4750]: I1008 18:27:05.095028 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-2flss" podStartSLOduration=4.602779073 podStartE2EDuration="14.095009654s" podCreationTimestamp="2025-10-08 18:26:51 +0000 UTC" firstStartedPulling="2025-10-08 18:26:52.913973679 +0000 UTC m=+968.826944692" lastFinishedPulling="2025-10-08 18:27:02.40620426 +0000 UTC m=+978.319175273" observedRunningTime="2025-10-08 18:27:05.094581024 +0000 UTC m=+981.007552037" watchObservedRunningTime="2025-10-08 18:27:05.095009654 +0000 UTC m=+981.007980667" Oct 08 18:27:05 crc kubenswrapper[4750]: I1008 18:27:05.856368 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-qnpbb" Oct 08 18:27:05 crc kubenswrapper[4750]: I1008 18:27:05.856405 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-2flss" Oct 08 18:27:11 crc kubenswrapper[4750]: I1008 18:27:11.589910 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-vwnz5" Oct 08 18:27:11 crc kubenswrapper[4750]: I1008 18:27:11.661258 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-km9s8" Oct 08 18:27:11 crc kubenswrapper[4750]: I1008 18:27:11.700996 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-nm5np" Oct 08 18:27:11 crc kubenswrapper[4750]: I1008 18:27:11.805698 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-k6wbx" Oct 08 18:27:11 crc kubenswrapper[4750]: I1008 18:27:11.933040 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-2flss" Oct 08 18:27:11 crc kubenswrapper[4750]: I1008 18:27:11.949521 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-pcb44" Oct 08 18:27:11 crc kubenswrapper[4750]: I1008 18:27:11.978475 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-9wgqj" Oct 08 18:27:12 crc kubenswrapper[4750]: I1008 18:27:12.030736 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5df598886f-w4sqr" Oct 08 18:27:12 crc kubenswrapper[4750]: I1008 18:27:12.033391 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-s454n" Oct 08 18:27:12 crc kubenswrapper[4750]: I1008 18:27:12.054729 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-xhnm8" Oct 08 18:27:12 crc kubenswrapper[4750]: I1008 18:27:12.057318 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-f857v" Oct 08 18:27:12 crc kubenswrapper[4750]: I1008 18:27:12.079608 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-prrpn" Oct 08 18:27:12 crc kubenswrapper[4750]: I1008 18:27:12.101746 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-79db49b9fb-bs9x4" Oct 08 18:27:12 crc kubenswrapper[4750]: I1008 18:27:12.185617 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-kn2bm" Oct 08 18:27:12 crc kubenswrapper[4750]: I1008 18:27:12.233747 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-qnpbb" Oct 08 18:27:13 crc kubenswrapper[4750]: I1008 18:27:13.227726 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-drhrw" Oct 08 18:27:13 crc kubenswrapper[4750]: I1008 18:27:13.635951 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-677c5f5bff5bxpg" Oct 08 18:27:16 crc kubenswrapper[4750]: I1008 18:27:16.935306 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7794bc6bd-7876q" event={"ID":"96e0828f-b805-4aa0-b0fc-e64c3aeb6120","Type":"ContainerStarted","Data":"0ab83b1bfa3bca301edeb9a68e8dc7b7e79da147ae120f64ba8bf6e440ab09d8"} Oct 08 18:27:16 crc kubenswrapper[4750]: I1008 18:27:16.935891 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-7794bc6bd-7876q" Oct 08 18:27:16 crc kubenswrapper[4750]: I1008 18:27:16.943164 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-56c698c775-jsq4j" 
event={"ID":"9b3555bd-2455-4844-a0a7-16ddfca85ce5","Type":"ContainerStarted","Data":"7c6b90fd52d798473ce413151f9f24084f50e6dad4ef3ebd6e208948a82fcacf"} Oct 08 18:27:16 crc kubenswrapper[4750]: I1008 18:27:16.943741 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-56c698c775-jsq4j" Oct 08 18:27:16 crc kubenswrapper[4750]: I1008 18:27:16.945139 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-tcnvk" event={"ID":"bc8e10c4-5871-4909-af47-4b2c1c78d3be","Type":"ContainerStarted","Data":"b2c3165f76e913f38662f567587ca097f1d5c740b791676c414d2119915cfb21"} Oct 08 18:27:16 crc kubenswrapper[4750]: I1008 18:27:16.947216 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76796d4c6b-vrwkf" event={"ID":"2239d744-45d5-40ac-b30a-929c4edd8489","Type":"ContainerStarted","Data":"3d522fd6cf60550ce952c528cf7575a561c59d85026eb737041f6aed256c9b24"} Oct 08 18:27:16 crc kubenswrapper[4750]: I1008 18:27:16.947430 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-76796d4c6b-vrwkf" Oct 08 18:27:16 crc kubenswrapper[4750]: I1008 18:27:16.949978 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-k8v2z" event={"ID":"07dc6c54-47cf-48e9-aed2-3016f69594de","Type":"ContainerStarted","Data":"87cd7a9cc1bb19c322e1dc91af43867cd64f1be575bef7afd810d5f9b5acbf13"} Oct 08 18:27:16 crc kubenswrapper[4750]: I1008 18:27:16.950579 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-k8v2z" Oct 08 18:27:16 crc kubenswrapper[4750]: I1008 18:27:16.954892 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/watcher-operator-controller-manager-7794bc6bd-7876q" podStartSLOduration=3.247117223 podStartE2EDuration="25.954873305s" podCreationTimestamp="2025-10-08 18:26:51 +0000 UTC" firstStartedPulling="2025-10-08 18:26:53.478353588 +0000 UTC m=+969.391324591" lastFinishedPulling="2025-10-08 18:27:16.18610964 +0000 UTC m=+992.099080673" observedRunningTime="2025-10-08 18:27:16.953680626 +0000 UTC m=+992.866651659" watchObservedRunningTime="2025-10-08 18:27:16.954873305 +0000 UTC m=+992.867844328" Oct 08 18:27:16 crc kubenswrapper[4750]: I1008 18:27:16.973380 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-76796d4c6b-vrwkf" podStartSLOduration=3.2746841079999998 podStartE2EDuration="25.973359988s" podCreationTimestamp="2025-10-08 18:26:51 +0000 UTC" firstStartedPulling="2025-10-08 18:26:53.461384961 +0000 UTC m=+969.374355974" lastFinishedPulling="2025-10-08 18:27:16.160060841 +0000 UTC m=+992.073031854" observedRunningTime="2025-10-08 18:27:16.970541339 +0000 UTC m=+992.883512352" watchObservedRunningTime="2025-10-08 18:27:16.973359988 +0000 UTC m=+992.886331001" Oct 08 18:27:16 crc kubenswrapper[4750]: I1008 18:27:16.988751 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-56c698c775-jsq4j" podStartSLOduration=3.32306254 podStartE2EDuration="25.988732865s" podCreationTimestamp="2025-10-08 18:26:51 +0000 UTC" firstStartedPulling="2025-10-08 18:26:53.490248044 +0000 UTC m=+969.403219057" lastFinishedPulling="2025-10-08 18:27:16.155918369 +0000 UTC m=+992.068889382" observedRunningTime="2025-10-08 18:27:16.986775928 +0000 UTC m=+992.899746951" watchObservedRunningTime="2025-10-08 18:27:16.988732865 +0000 UTC m=+992.901703878" Oct 08 18:27:17 crc kubenswrapper[4750]: I1008 18:27:17.003514 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-k8v2z" podStartSLOduration=3.337704878 podStartE2EDuration="26.003491027s" podCreationTimestamp="2025-10-08 18:26:51 +0000 UTC" firstStartedPulling="2025-10-08 18:26:53.490192492 +0000 UTC m=+969.403163495" lastFinishedPulling="2025-10-08 18:27:16.155978631 +0000 UTC m=+992.068949644" observedRunningTime="2025-10-08 18:27:16.999710115 +0000 UTC m=+992.912681128" watchObservedRunningTime="2025-10-08 18:27:17.003491027 +0000 UTC m=+992.916462060" Oct 08 18:27:17 crc kubenswrapper[4750]: I1008 18:27:17.027877 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-tcnvk" podStartSLOduration=2.349618158 podStartE2EDuration="25.027859125s" podCreationTimestamp="2025-10-08 18:26:52 +0000 UTC" firstStartedPulling="2025-10-08 18:26:53.477892977 +0000 UTC m=+969.390863990" lastFinishedPulling="2025-10-08 18:27:16.156133944 +0000 UTC m=+992.069104957" observedRunningTime="2025-10-08 18:27:17.020567356 +0000 UTC m=+992.933538379" watchObservedRunningTime="2025-10-08 18:27:17.027859125 +0000 UTC m=+992.940830138" Oct 08 18:27:22 crc kubenswrapper[4750]: I1008 18:27:22.217917 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-k8v2z" Oct 08 18:27:22 crc kubenswrapper[4750]: I1008 18:27:22.251362 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-76796d4c6b-vrwkf" Oct 08 18:27:22 crc kubenswrapper[4750]: I1008 18:27:22.286952 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-56c698c775-jsq4j" Oct 08 18:27:22 crc kubenswrapper[4750]: I1008 18:27:22.300282 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/watcher-operator-controller-manager-7794bc6bd-7876q" Oct 08 18:27:37 crc kubenswrapper[4750]: I1008 18:27:37.192167 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7bfcb9d745-6c4pl"] Oct 08 18:27:37 crc kubenswrapper[4750]: I1008 18:27:37.194972 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bfcb9d745-6c4pl" Oct 08 18:27:37 crc kubenswrapper[4750]: I1008 18:27:37.198167 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 08 18:27:37 crc kubenswrapper[4750]: I1008 18:27:37.198324 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-5wv5d" Oct 08 18:27:37 crc kubenswrapper[4750]: I1008 18:27:37.198347 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 08 18:27:37 crc kubenswrapper[4750]: I1008 18:27:37.200346 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 08 18:27:37 crc kubenswrapper[4750]: I1008 18:27:37.212332 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bfcb9d745-6c4pl"] Oct 08 18:27:37 crc kubenswrapper[4750]: I1008 18:27:37.242254 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-758b79db4c-5s7sn"] Oct 08 18:27:37 crc kubenswrapper[4750]: I1008 18:27:37.243582 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-758b79db4c-5s7sn" Oct 08 18:27:37 crc kubenswrapper[4750]: I1008 18:27:37.247426 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 08 18:27:37 crc kubenswrapper[4750]: I1008 18:27:37.272775 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-758b79db4c-5s7sn"] Oct 08 18:27:37 crc kubenswrapper[4750]: I1008 18:27:37.357851 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs2rv\" (UniqueName: \"kubernetes.io/projected/7402f445-a72e-4d45-bad2-d27f14188109-kube-api-access-rs2rv\") pod \"dnsmasq-dns-758b79db4c-5s7sn\" (UID: \"7402f445-a72e-4d45-bad2-d27f14188109\") " pod="openstack/dnsmasq-dns-758b79db4c-5s7sn" Oct 08 18:27:37 crc kubenswrapper[4750]: I1008 18:27:37.357903 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b6f5de9-2c2f-4669-8449-e60e7e750821-config\") pod \"dnsmasq-dns-7bfcb9d745-6c4pl\" (UID: \"2b6f5de9-2c2f-4669-8449-e60e7e750821\") " pod="openstack/dnsmasq-dns-7bfcb9d745-6c4pl" Oct 08 18:27:37 crc kubenswrapper[4750]: I1008 18:27:37.357954 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lll4q\" (UniqueName: \"kubernetes.io/projected/2b6f5de9-2c2f-4669-8449-e60e7e750821-kube-api-access-lll4q\") pod \"dnsmasq-dns-7bfcb9d745-6c4pl\" (UID: \"2b6f5de9-2c2f-4669-8449-e60e7e750821\") " pod="openstack/dnsmasq-dns-7bfcb9d745-6c4pl" Oct 08 18:27:37 crc kubenswrapper[4750]: I1008 18:27:37.357973 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7402f445-a72e-4d45-bad2-d27f14188109-config\") pod \"dnsmasq-dns-758b79db4c-5s7sn\" (UID: \"7402f445-a72e-4d45-bad2-d27f14188109\") " 
pod="openstack/dnsmasq-dns-758b79db4c-5s7sn" Oct 08 18:27:37 crc kubenswrapper[4750]: I1008 18:27:37.358011 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7402f445-a72e-4d45-bad2-d27f14188109-dns-svc\") pod \"dnsmasq-dns-758b79db4c-5s7sn\" (UID: \"7402f445-a72e-4d45-bad2-d27f14188109\") " pod="openstack/dnsmasq-dns-758b79db4c-5s7sn" Oct 08 18:27:37 crc kubenswrapper[4750]: I1008 18:27:37.459414 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rs2rv\" (UniqueName: \"kubernetes.io/projected/7402f445-a72e-4d45-bad2-d27f14188109-kube-api-access-rs2rv\") pod \"dnsmasq-dns-758b79db4c-5s7sn\" (UID: \"7402f445-a72e-4d45-bad2-d27f14188109\") " pod="openstack/dnsmasq-dns-758b79db4c-5s7sn" Oct 08 18:27:37 crc kubenswrapper[4750]: I1008 18:27:37.459459 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b6f5de9-2c2f-4669-8449-e60e7e750821-config\") pod \"dnsmasq-dns-7bfcb9d745-6c4pl\" (UID: \"2b6f5de9-2c2f-4669-8449-e60e7e750821\") " pod="openstack/dnsmasq-dns-7bfcb9d745-6c4pl" Oct 08 18:27:37 crc kubenswrapper[4750]: I1008 18:27:37.459496 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7402f445-a72e-4d45-bad2-d27f14188109-config\") pod \"dnsmasq-dns-758b79db4c-5s7sn\" (UID: \"7402f445-a72e-4d45-bad2-d27f14188109\") " pod="openstack/dnsmasq-dns-758b79db4c-5s7sn" Oct 08 18:27:37 crc kubenswrapper[4750]: I1008 18:27:37.459512 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lll4q\" (UniqueName: \"kubernetes.io/projected/2b6f5de9-2c2f-4669-8449-e60e7e750821-kube-api-access-lll4q\") pod \"dnsmasq-dns-7bfcb9d745-6c4pl\" (UID: \"2b6f5de9-2c2f-4669-8449-e60e7e750821\") " pod="openstack/dnsmasq-dns-7bfcb9d745-6c4pl" Oct 08 
18:27:37 crc kubenswrapper[4750]: I1008 18:27:37.459542 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7402f445-a72e-4d45-bad2-d27f14188109-dns-svc\") pod \"dnsmasq-dns-758b79db4c-5s7sn\" (UID: \"7402f445-a72e-4d45-bad2-d27f14188109\") " pod="openstack/dnsmasq-dns-758b79db4c-5s7sn" Oct 08 18:27:37 crc kubenswrapper[4750]: I1008 18:27:37.460363 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b6f5de9-2c2f-4669-8449-e60e7e750821-config\") pod \"dnsmasq-dns-7bfcb9d745-6c4pl\" (UID: \"2b6f5de9-2c2f-4669-8449-e60e7e750821\") " pod="openstack/dnsmasq-dns-7bfcb9d745-6c4pl" Oct 08 18:27:37 crc kubenswrapper[4750]: I1008 18:27:37.460662 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7402f445-a72e-4d45-bad2-d27f14188109-config\") pod \"dnsmasq-dns-758b79db4c-5s7sn\" (UID: \"7402f445-a72e-4d45-bad2-d27f14188109\") " pod="openstack/dnsmasq-dns-758b79db4c-5s7sn" Oct 08 18:27:37 crc kubenswrapper[4750]: I1008 18:27:37.461591 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7402f445-a72e-4d45-bad2-d27f14188109-dns-svc\") pod \"dnsmasq-dns-758b79db4c-5s7sn\" (UID: \"7402f445-a72e-4d45-bad2-d27f14188109\") " pod="openstack/dnsmasq-dns-758b79db4c-5s7sn" Oct 08 18:27:37 crc kubenswrapper[4750]: I1008 18:27:37.477585 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs2rv\" (UniqueName: \"kubernetes.io/projected/7402f445-a72e-4d45-bad2-d27f14188109-kube-api-access-rs2rv\") pod \"dnsmasq-dns-758b79db4c-5s7sn\" (UID: \"7402f445-a72e-4d45-bad2-d27f14188109\") " pod="openstack/dnsmasq-dns-758b79db4c-5s7sn" Oct 08 18:27:37 crc kubenswrapper[4750]: I1008 18:27:37.478752 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-lll4q\" (UniqueName: \"kubernetes.io/projected/2b6f5de9-2c2f-4669-8449-e60e7e750821-kube-api-access-lll4q\") pod \"dnsmasq-dns-7bfcb9d745-6c4pl\" (UID: \"2b6f5de9-2c2f-4669-8449-e60e7e750821\") " pod="openstack/dnsmasq-dns-7bfcb9d745-6c4pl" Oct 08 18:27:37 crc kubenswrapper[4750]: I1008 18:27:37.536317 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bfcb9d745-6c4pl" Oct 08 18:27:37 crc kubenswrapper[4750]: I1008 18:27:37.568362 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-758b79db4c-5s7sn" Oct 08 18:27:37 crc kubenswrapper[4750]: W1008 18:27:37.979873 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b6f5de9_2c2f_4669_8449_e60e7e750821.slice/crio-cd6850e9ae80ddabb6842fb8f08f31d87e67e4f52ae04d7f75aaa78bc0385454 WatchSource:0}: Error finding container cd6850e9ae80ddabb6842fb8f08f31d87e67e4f52ae04d7f75aaa78bc0385454: Status 404 returned error can't find the container with id cd6850e9ae80ddabb6842fb8f08f31d87e67e4f52ae04d7f75aaa78bc0385454 Oct 08 18:27:37 crc kubenswrapper[4750]: I1008 18:27:37.980044 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bfcb9d745-6c4pl"] Oct 08 18:27:37 crc kubenswrapper[4750]: I1008 18:27:37.982116 4750 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 18:27:38 crc kubenswrapper[4750]: I1008 18:27:38.025489 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-758b79db4c-5s7sn"] Oct 08 18:27:38 crc kubenswrapper[4750]: I1008 18:27:38.094506 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bfcb9d745-6c4pl" event={"ID":"2b6f5de9-2c2f-4669-8449-e60e7e750821","Type":"ContainerStarted","Data":"cd6850e9ae80ddabb6842fb8f08f31d87e67e4f52ae04d7f75aaa78bc0385454"} Oct 08 18:27:38 crc 
kubenswrapper[4750]: I1008 18:27:38.096009 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-758b79db4c-5s7sn" event={"ID":"7402f445-a72e-4d45-bad2-d27f14188109","Type":"ContainerStarted","Data":"0a1ff67b2191fc1bd8210a8a2806885acc439ccb6be5cc6b50351120a604e870"} Oct 08 18:27:38 crc kubenswrapper[4750]: I1008 18:27:38.459990 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-758b79db4c-5s7sn"] Oct 08 18:27:38 crc kubenswrapper[4750]: I1008 18:27:38.487148 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-644597f84c-58wp9"] Oct 08 18:27:38 crc kubenswrapper[4750]: I1008 18:27:38.488279 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-644597f84c-58wp9" Oct 08 18:27:38 crc kubenswrapper[4750]: I1008 18:27:38.509379 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-644597f84c-58wp9"] Oct 08 18:27:38 crc kubenswrapper[4750]: I1008 18:27:38.675350 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zbwj\" (UniqueName: \"kubernetes.io/projected/fdfa2238-e6b5-40ce-9945-ce12691e2ef6-kube-api-access-4zbwj\") pod \"dnsmasq-dns-644597f84c-58wp9\" (UID: \"fdfa2238-e6b5-40ce-9945-ce12691e2ef6\") " pod="openstack/dnsmasq-dns-644597f84c-58wp9" Oct 08 18:27:38 crc kubenswrapper[4750]: I1008 18:27:38.675587 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdfa2238-e6b5-40ce-9945-ce12691e2ef6-config\") pod \"dnsmasq-dns-644597f84c-58wp9\" (UID: \"fdfa2238-e6b5-40ce-9945-ce12691e2ef6\") " pod="openstack/dnsmasq-dns-644597f84c-58wp9" Oct 08 18:27:38 crc kubenswrapper[4750]: I1008 18:27:38.675662 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/fdfa2238-e6b5-40ce-9945-ce12691e2ef6-dns-svc\") pod \"dnsmasq-dns-644597f84c-58wp9\" (UID: \"fdfa2238-e6b5-40ce-9945-ce12691e2ef6\") " pod="openstack/dnsmasq-dns-644597f84c-58wp9" Oct 08 18:27:38 crc kubenswrapper[4750]: I1008 18:27:38.777268 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zbwj\" (UniqueName: \"kubernetes.io/projected/fdfa2238-e6b5-40ce-9945-ce12691e2ef6-kube-api-access-4zbwj\") pod \"dnsmasq-dns-644597f84c-58wp9\" (UID: \"fdfa2238-e6b5-40ce-9945-ce12691e2ef6\") " pod="openstack/dnsmasq-dns-644597f84c-58wp9" Oct 08 18:27:38 crc kubenswrapper[4750]: I1008 18:27:38.777341 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdfa2238-e6b5-40ce-9945-ce12691e2ef6-config\") pod \"dnsmasq-dns-644597f84c-58wp9\" (UID: \"fdfa2238-e6b5-40ce-9945-ce12691e2ef6\") " pod="openstack/dnsmasq-dns-644597f84c-58wp9" Oct 08 18:27:38 crc kubenswrapper[4750]: I1008 18:27:38.777360 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fdfa2238-e6b5-40ce-9945-ce12691e2ef6-dns-svc\") pod \"dnsmasq-dns-644597f84c-58wp9\" (UID: \"fdfa2238-e6b5-40ce-9945-ce12691e2ef6\") " pod="openstack/dnsmasq-dns-644597f84c-58wp9" Oct 08 18:27:38 crc kubenswrapper[4750]: I1008 18:27:38.778357 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fdfa2238-e6b5-40ce-9945-ce12691e2ef6-dns-svc\") pod \"dnsmasq-dns-644597f84c-58wp9\" (UID: \"fdfa2238-e6b5-40ce-9945-ce12691e2ef6\") " pod="openstack/dnsmasq-dns-644597f84c-58wp9" Oct 08 18:27:38 crc kubenswrapper[4750]: I1008 18:27:38.778455 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdfa2238-e6b5-40ce-9945-ce12691e2ef6-config\") pod \"dnsmasq-dns-644597f84c-58wp9\" (UID: 
\"fdfa2238-e6b5-40ce-9945-ce12691e2ef6\") " pod="openstack/dnsmasq-dns-644597f84c-58wp9" Oct 08 18:27:38 crc kubenswrapper[4750]: I1008 18:27:38.800352 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zbwj\" (UniqueName: \"kubernetes.io/projected/fdfa2238-e6b5-40ce-9945-ce12691e2ef6-kube-api-access-4zbwj\") pod \"dnsmasq-dns-644597f84c-58wp9\" (UID: \"fdfa2238-e6b5-40ce-9945-ce12691e2ef6\") " pod="openstack/dnsmasq-dns-644597f84c-58wp9" Oct 08 18:27:38 crc kubenswrapper[4750]: I1008 18:27:38.826392 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-644597f84c-58wp9" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.314132 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-644597f84c-58wp9"] Oct 08 18:27:39 crc kubenswrapper[4750]: W1008 18:27:39.354632 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfdfa2238_e6b5_40ce_9945_ce12691e2ef6.slice/crio-c251e924ff9ae38776bcc49467ef425a8b1c0255a1e3cdd0a4e6a62ac643cc9c WatchSource:0}: Error finding container c251e924ff9ae38776bcc49467ef425a8b1c0255a1e3cdd0a4e6a62ac643cc9c: Status 404 returned error can't find the container with id c251e924ff9ae38776bcc49467ef425a8b1c0255a1e3cdd0a4e6a62ac643cc9c Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.405626 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bfcb9d745-6c4pl"] Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.423880 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77597f887-jncg2"] Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.432435 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77597f887-jncg2"] Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.432565 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77597f887-jncg2" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.594504 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/982c6d04-b346-4001-b6e5-2772413a1172-config\") pod \"dnsmasq-dns-77597f887-jncg2\" (UID: \"982c6d04-b346-4001-b6e5-2772413a1172\") " pod="openstack/dnsmasq-dns-77597f887-jncg2" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.594633 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/982c6d04-b346-4001-b6e5-2772413a1172-dns-svc\") pod \"dnsmasq-dns-77597f887-jncg2\" (UID: \"982c6d04-b346-4001-b6e5-2772413a1172\") " pod="openstack/dnsmasq-dns-77597f887-jncg2" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.594813 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xs8q\" (UniqueName: \"kubernetes.io/projected/982c6d04-b346-4001-b6e5-2772413a1172-kube-api-access-9xs8q\") pod \"dnsmasq-dns-77597f887-jncg2\" (UID: \"982c6d04-b346-4001-b6e5-2772413a1172\") " pod="openstack/dnsmasq-dns-77597f887-jncg2" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.651358 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.652522 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.657795 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.657925 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.658020 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.663275 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.663719 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.670944 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.671158 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-84knr" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.672981 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.697462 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/982c6d04-b346-4001-b6e5-2772413a1172-config\") pod \"dnsmasq-dns-77597f887-jncg2\" (UID: \"982c6d04-b346-4001-b6e5-2772413a1172\") " pod="openstack/dnsmasq-dns-77597f887-jncg2" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.697523 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/982c6d04-b346-4001-b6e5-2772413a1172-dns-svc\") pod \"dnsmasq-dns-77597f887-jncg2\" (UID: \"982c6d04-b346-4001-b6e5-2772413a1172\") " pod="openstack/dnsmasq-dns-77597f887-jncg2" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.697582 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xs8q\" (UniqueName: \"kubernetes.io/projected/982c6d04-b346-4001-b6e5-2772413a1172-kube-api-access-9xs8q\") pod \"dnsmasq-dns-77597f887-jncg2\" (UID: \"982c6d04-b346-4001-b6e5-2772413a1172\") " pod="openstack/dnsmasq-dns-77597f887-jncg2" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.700214 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/982c6d04-b346-4001-b6e5-2772413a1172-config\") pod \"dnsmasq-dns-77597f887-jncg2\" (UID: \"982c6d04-b346-4001-b6e5-2772413a1172\") " pod="openstack/dnsmasq-dns-77597f887-jncg2" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.700943 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/982c6d04-b346-4001-b6e5-2772413a1172-dns-svc\") pod \"dnsmasq-dns-77597f887-jncg2\" (UID: \"982c6d04-b346-4001-b6e5-2772413a1172\") " pod="openstack/dnsmasq-dns-77597f887-jncg2" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.720733 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xs8q\" (UniqueName: \"kubernetes.io/projected/982c6d04-b346-4001-b6e5-2772413a1172-kube-api-access-9xs8q\") pod \"dnsmasq-dns-77597f887-jncg2\" (UID: \"982c6d04-b346-4001-b6e5-2772413a1172\") " pod="openstack/dnsmasq-dns-77597f887-jncg2" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.758677 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77597f887-jncg2" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.799056 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5b8108eb-834c-44bd-9f39-70c348388ab6-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5b8108eb-834c-44bd-9f39-70c348388ab6\") " pod="openstack/rabbitmq-server-0" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.799142 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5b8108eb-834c-44bd-9f39-70c348388ab6-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5b8108eb-834c-44bd-9f39-70c348388ab6\") " pod="openstack/rabbitmq-server-0" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.799174 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"5b8108eb-834c-44bd-9f39-70c348388ab6\") " pod="openstack/rabbitmq-server-0" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.799198 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5b8108eb-834c-44bd-9f39-70c348388ab6-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5b8108eb-834c-44bd-9f39-70c348388ab6\") " pod="openstack/rabbitmq-server-0" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.799225 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5b8108eb-834c-44bd-9f39-70c348388ab6-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5b8108eb-834c-44bd-9f39-70c348388ab6\") " pod="openstack/rabbitmq-server-0" Oct 08 
18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.799249 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5b8108eb-834c-44bd-9f39-70c348388ab6-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5b8108eb-834c-44bd-9f39-70c348388ab6\") " pod="openstack/rabbitmq-server-0" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.799265 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5b8108eb-834c-44bd-9f39-70c348388ab6-config-data\") pod \"rabbitmq-server-0\" (UID: \"5b8108eb-834c-44bd-9f39-70c348388ab6\") " pod="openstack/rabbitmq-server-0" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.799289 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twzdz\" (UniqueName: \"kubernetes.io/projected/5b8108eb-834c-44bd-9f39-70c348388ab6-kube-api-access-twzdz\") pod \"rabbitmq-server-0\" (UID: \"5b8108eb-834c-44bd-9f39-70c348388ab6\") " pod="openstack/rabbitmq-server-0" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.799322 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5b8108eb-834c-44bd-9f39-70c348388ab6-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5b8108eb-834c-44bd-9f39-70c348388ab6\") " pod="openstack/rabbitmq-server-0" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.799338 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5b8108eb-834c-44bd-9f39-70c348388ab6-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5b8108eb-834c-44bd-9f39-70c348388ab6\") " pod="openstack/rabbitmq-server-0" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.799352 4750 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5b8108eb-834c-44bd-9f39-70c348388ab6-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5b8108eb-834c-44bd-9f39-70c348388ab6\") " pod="openstack/rabbitmq-server-0" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.900677 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5b8108eb-834c-44bd-9f39-70c348388ab6-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5b8108eb-834c-44bd-9f39-70c348388ab6\") " pod="openstack/rabbitmq-server-0" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.901034 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"5b8108eb-834c-44bd-9f39-70c348388ab6\") " pod="openstack/rabbitmq-server-0" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.901069 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5b8108eb-834c-44bd-9f39-70c348388ab6-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5b8108eb-834c-44bd-9f39-70c348388ab6\") " pod="openstack/rabbitmq-server-0" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.901097 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5b8108eb-834c-44bd-9f39-70c348388ab6-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5b8108eb-834c-44bd-9f39-70c348388ab6\") " pod="openstack/rabbitmq-server-0" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.901134 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/5b8108eb-834c-44bd-9f39-70c348388ab6-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5b8108eb-834c-44bd-9f39-70c348388ab6\") " pod="openstack/rabbitmq-server-0" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.901152 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5b8108eb-834c-44bd-9f39-70c348388ab6-config-data\") pod \"rabbitmq-server-0\" (UID: \"5b8108eb-834c-44bd-9f39-70c348388ab6\") " pod="openstack/rabbitmq-server-0" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.901178 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twzdz\" (UniqueName: \"kubernetes.io/projected/5b8108eb-834c-44bd-9f39-70c348388ab6-kube-api-access-twzdz\") pod \"rabbitmq-server-0\" (UID: \"5b8108eb-834c-44bd-9f39-70c348388ab6\") " pod="openstack/rabbitmq-server-0" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.901208 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5b8108eb-834c-44bd-9f39-70c348388ab6-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5b8108eb-834c-44bd-9f39-70c348388ab6\") " pod="openstack/rabbitmq-server-0" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.901223 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5b8108eb-834c-44bd-9f39-70c348388ab6-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5b8108eb-834c-44bd-9f39-70c348388ab6\") " pod="openstack/rabbitmq-server-0" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.901238 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5b8108eb-834c-44bd-9f39-70c348388ab6-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5b8108eb-834c-44bd-9f39-70c348388ab6\") " 
pod="openstack/rabbitmq-server-0" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.901310 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5b8108eb-834c-44bd-9f39-70c348388ab6-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5b8108eb-834c-44bd-9f39-70c348388ab6\") " pod="openstack/rabbitmq-server-0" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.902139 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5b8108eb-834c-44bd-9f39-70c348388ab6-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5b8108eb-834c-44bd-9f39-70c348388ab6\") " pod="openstack/rabbitmq-server-0" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.903067 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5b8108eb-834c-44bd-9f39-70c348388ab6-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5b8108eb-834c-44bd-9f39-70c348388ab6\") " pod="openstack/rabbitmq-server-0" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.903404 4750 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"5b8108eb-834c-44bd-9f39-70c348388ab6\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.903510 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5b8108eb-834c-44bd-9f39-70c348388ab6-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5b8108eb-834c-44bd-9f39-70c348388ab6\") " pod="openstack/rabbitmq-server-0" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.904084 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5b8108eb-834c-44bd-9f39-70c348388ab6-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5b8108eb-834c-44bd-9f39-70c348388ab6\") " pod="openstack/rabbitmq-server-0" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.906442 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5b8108eb-834c-44bd-9f39-70c348388ab6-config-data\") pod \"rabbitmq-server-0\" (UID: \"5b8108eb-834c-44bd-9f39-70c348388ab6\") " pod="openstack/rabbitmq-server-0" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.908476 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5b8108eb-834c-44bd-9f39-70c348388ab6-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5b8108eb-834c-44bd-9f39-70c348388ab6\") " pod="openstack/rabbitmq-server-0" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.927686 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5b8108eb-834c-44bd-9f39-70c348388ab6-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5b8108eb-834c-44bd-9f39-70c348388ab6\") " pod="openstack/rabbitmq-server-0" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.928261 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5b8108eb-834c-44bd-9f39-70c348388ab6-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5b8108eb-834c-44bd-9f39-70c348388ab6\") " pod="openstack/rabbitmq-server-0" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.938263 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twzdz\" (UniqueName: \"kubernetes.io/projected/5b8108eb-834c-44bd-9f39-70c348388ab6-kube-api-access-twzdz\") pod \"rabbitmq-server-0\" (UID: \"5b8108eb-834c-44bd-9f39-70c348388ab6\") " 
pod="openstack/rabbitmq-server-0" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.938905 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5b8108eb-834c-44bd-9f39-70c348388ab6-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5b8108eb-834c-44bd-9f39-70c348388ab6\") " pod="openstack/rabbitmq-server-0" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.958491 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"5b8108eb-834c-44bd-9f39-70c348388ab6\") " pod="openstack/rabbitmq-server-0" Oct 08 18:27:39 crc kubenswrapper[4750]: I1008 18:27:39.980152 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 08 18:27:40 crc kubenswrapper[4750]: I1008 18:27:40.122286 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-644597f84c-58wp9" event={"ID":"fdfa2238-e6b5-40ce-9945-ce12691e2ef6","Type":"ContainerStarted","Data":"c251e924ff9ae38776bcc49467ef425a8b1c0255a1e3cdd0a4e6a62ac643cc9c"} Oct 08 18:27:40 crc kubenswrapper[4750]: I1008 18:27:40.216228 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77597f887-jncg2"] Oct 08 18:27:40 crc kubenswrapper[4750]: W1008 18:27:40.223853 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod982c6d04_b346_4001_b6e5_2772413a1172.slice/crio-44c61f97ff03820185af3b1e8275235dd51895bec84254bd2bf912a5331af6ce WatchSource:0}: Error finding container 44c61f97ff03820185af3b1e8275235dd51895bec84254bd2bf912a5331af6ce: Status 404 returned error can't find the container with id 44c61f97ff03820185af3b1e8275235dd51895bec84254bd2bf912a5331af6ce Oct 08 18:27:40 crc kubenswrapper[4750]: I1008 18:27:40.522065 4750 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 18:27:40 crc kubenswrapper[4750]: I1008 18:27:40.585893 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 18:27:40 crc kubenswrapper[4750]: I1008 18:27:40.587018 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 08 18:27:40 crc kubenswrapper[4750]: I1008 18:27:40.592728 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 08 18:27:40 crc kubenswrapper[4750]: I1008 18:27:40.599937 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 18:27:40 crc kubenswrapper[4750]: I1008 18:27:40.601854 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 08 18:27:40 crc kubenswrapper[4750]: I1008 18:27:40.602043 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-7lftn" Oct 08 18:27:40 crc kubenswrapper[4750]: I1008 18:27:40.602140 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 08 18:27:40 crc kubenswrapper[4750]: I1008 18:27:40.602246 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 08 18:27:40 crc kubenswrapper[4750]: I1008 18:27:40.609306 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 08 18:27:40 crc kubenswrapper[4750]: I1008 18:27:40.609541 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 08 18:27:40 crc kubenswrapper[4750]: I1008 18:27:40.739804 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/43a52313-747b-40a7-a7e0-9e18f3c97c42-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"43a52313-747b-40a7-a7e0-9e18f3c97c42\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 18:27:40 crc kubenswrapper[4750]: I1008 18:27:40.739861 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/43a52313-747b-40a7-a7e0-9e18f3c97c42-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"43a52313-747b-40a7-a7e0-9e18f3c97c42\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 18:27:40 crc kubenswrapper[4750]: I1008 18:27:40.739890 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/43a52313-747b-40a7-a7e0-9e18f3c97c42-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"43a52313-747b-40a7-a7e0-9e18f3c97c42\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 18:27:40 crc kubenswrapper[4750]: I1008 18:27:40.739963 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/43a52313-747b-40a7-a7e0-9e18f3c97c42-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"43a52313-747b-40a7-a7e0-9e18f3c97c42\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 18:27:40 crc kubenswrapper[4750]: I1008 18:27:40.740011 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/43a52313-747b-40a7-a7e0-9e18f3c97c42-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"43a52313-747b-40a7-a7e0-9e18f3c97c42\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 18:27:40 crc kubenswrapper[4750]: I1008 18:27:40.740164 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npbvh\" (UniqueName: 
\"kubernetes.io/projected/43a52313-747b-40a7-a7e0-9e18f3c97c42-kube-api-access-npbvh\") pod \"rabbitmq-cell1-server-0\" (UID: \"43a52313-747b-40a7-a7e0-9e18f3c97c42\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 18:27:40 crc kubenswrapper[4750]: I1008 18:27:40.740323 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/43a52313-747b-40a7-a7e0-9e18f3c97c42-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"43a52313-747b-40a7-a7e0-9e18f3c97c42\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 18:27:40 crc kubenswrapper[4750]: I1008 18:27:40.740412 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"43a52313-747b-40a7-a7e0-9e18f3c97c42\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 18:27:40 crc kubenswrapper[4750]: I1008 18:27:40.740445 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/43a52313-747b-40a7-a7e0-9e18f3c97c42-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"43a52313-747b-40a7-a7e0-9e18f3c97c42\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 18:27:40 crc kubenswrapper[4750]: I1008 18:27:40.740463 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/43a52313-747b-40a7-a7e0-9e18f3c97c42-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"43a52313-747b-40a7-a7e0-9e18f3c97c42\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 18:27:40 crc kubenswrapper[4750]: I1008 18:27:40.740502 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/43a52313-747b-40a7-a7e0-9e18f3c97c42-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"43a52313-747b-40a7-a7e0-9e18f3c97c42\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 18:27:40 crc kubenswrapper[4750]: I1008 18:27:40.842326 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/43a52313-747b-40a7-a7e0-9e18f3c97c42-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"43a52313-747b-40a7-a7e0-9e18f3c97c42\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 18:27:40 crc kubenswrapper[4750]: I1008 18:27:40.842393 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"43a52313-747b-40a7-a7e0-9e18f3c97c42\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 18:27:40 crc kubenswrapper[4750]: I1008 18:27:40.842410 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/43a52313-747b-40a7-a7e0-9e18f3c97c42-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"43a52313-747b-40a7-a7e0-9e18f3c97c42\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 18:27:40 crc kubenswrapper[4750]: I1008 18:27:40.842433 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/43a52313-747b-40a7-a7e0-9e18f3c97c42-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"43a52313-747b-40a7-a7e0-9e18f3c97c42\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 18:27:40 crc kubenswrapper[4750]: I1008 18:27:40.842454 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/43a52313-747b-40a7-a7e0-9e18f3c97c42-plugins-conf\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"43a52313-747b-40a7-a7e0-9e18f3c97c42\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 18:27:40 crc kubenswrapper[4750]: I1008 18:27:40.842484 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/43a52313-747b-40a7-a7e0-9e18f3c97c42-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"43a52313-747b-40a7-a7e0-9e18f3c97c42\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 18:27:40 crc kubenswrapper[4750]: I1008 18:27:40.842505 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/43a52313-747b-40a7-a7e0-9e18f3c97c42-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"43a52313-747b-40a7-a7e0-9e18f3c97c42\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 18:27:40 crc kubenswrapper[4750]: I1008 18:27:40.842518 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/43a52313-747b-40a7-a7e0-9e18f3c97c42-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"43a52313-747b-40a7-a7e0-9e18f3c97c42\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 18:27:40 crc kubenswrapper[4750]: I1008 18:27:40.842537 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/43a52313-747b-40a7-a7e0-9e18f3c97c42-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"43a52313-747b-40a7-a7e0-9e18f3c97c42\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 18:27:40 crc kubenswrapper[4750]: I1008 18:27:40.842584 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/43a52313-747b-40a7-a7e0-9e18f3c97c42-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"43a52313-747b-40a7-a7e0-9e18f3c97c42\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 
18:27:40 crc kubenswrapper[4750]: I1008 18:27:40.842612 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npbvh\" (UniqueName: \"kubernetes.io/projected/43a52313-747b-40a7-a7e0-9e18f3c97c42-kube-api-access-npbvh\") pod \"rabbitmq-cell1-server-0\" (UID: \"43a52313-747b-40a7-a7e0-9e18f3c97c42\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 18:27:40 crc kubenswrapper[4750]: I1008 18:27:40.843928 4750 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"43a52313-747b-40a7-a7e0-9e18f3c97c42\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0" Oct 08 18:27:40 crc kubenswrapper[4750]: I1008 18:27:40.845825 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/43a52313-747b-40a7-a7e0-9e18f3c97c42-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"43a52313-747b-40a7-a7e0-9e18f3c97c42\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 18:27:40 crc kubenswrapper[4750]: I1008 18:27:40.846083 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/43a52313-747b-40a7-a7e0-9e18f3c97c42-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"43a52313-747b-40a7-a7e0-9e18f3c97c42\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 18:27:40 crc kubenswrapper[4750]: I1008 18:27:40.846225 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/43a52313-747b-40a7-a7e0-9e18f3c97c42-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"43a52313-747b-40a7-a7e0-9e18f3c97c42\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 18:27:40 crc kubenswrapper[4750]: I1008 18:27:40.846761 4750 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/43a52313-747b-40a7-a7e0-9e18f3c97c42-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"43a52313-747b-40a7-a7e0-9e18f3c97c42\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 18:27:40 crc kubenswrapper[4750]: I1008 18:27:40.847126 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/43a52313-747b-40a7-a7e0-9e18f3c97c42-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"43a52313-747b-40a7-a7e0-9e18f3c97c42\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 18:27:40 crc kubenswrapper[4750]: I1008 18:27:40.859348 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/43a52313-747b-40a7-a7e0-9e18f3c97c42-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"43a52313-747b-40a7-a7e0-9e18f3c97c42\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 18:27:40 crc kubenswrapper[4750]: I1008 18:27:40.859470 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/43a52313-747b-40a7-a7e0-9e18f3c97c42-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"43a52313-747b-40a7-a7e0-9e18f3c97c42\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 18:27:40 crc kubenswrapper[4750]: I1008 18:27:40.859697 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/43a52313-747b-40a7-a7e0-9e18f3c97c42-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"43a52313-747b-40a7-a7e0-9e18f3c97c42\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 18:27:40 crc kubenswrapper[4750]: I1008 18:27:40.859899 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/43a52313-747b-40a7-a7e0-9e18f3c97c42-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"43a52313-747b-40a7-a7e0-9e18f3c97c42\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 18:27:40 crc kubenswrapper[4750]: I1008 18:27:40.863236 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npbvh\" (UniqueName: \"kubernetes.io/projected/43a52313-747b-40a7-a7e0-9e18f3c97c42-kube-api-access-npbvh\") pod \"rabbitmq-cell1-server-0\" (UID: \"43a52313-747b-40a7-a7e0-9e18f3c97c42\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 18:27:40 crc kubenswrapper[4750]: I1008 18:27:40.877259 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"43a52313-747b-40a7-a7e0-9e18f3c97c42\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 18:27:40 crc kubenswrapper[4750]: I1008 18:27:40.933079 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 08 18:27:41 crc kubenswrapper[4750]: I1008 18:27:41.139360 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5b8108eb-834c-44bd-9f39-70c348388ab6","Type":"ContainerStarted","Data":"c6f0fed50a39031c63702b524b07cd549afac88921eb41415d16e3fcb1af8b96"} Oct 08 18:27:41 crc kubenswrapper[4750]: I1008 18:27:41.141872 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77597f887-jncg2" event={"ID":"982c6d04-b346-4001-b6e5-2772413a1172","Type":"ContainerStarted","Data":"44c61f97ff03820185af3b1e8275235dd51895bec84254bd2bf912a5331af6ce"} Oct 08 18:27:41 crc kubenswrapper[4750]: I1008 18:27:41.512573 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 18:27:41 crc kubenswrapper[4750]: W1008 18:27:41.516981 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43a52313_747b_40a7_a7e0_9e18f3c97c42.slice/crio-4d0c1296afba7cd7122b6bd1c17e5e3380f8de83260c07af67c846218575f097 WatchSource:0}: Error finding container 4d0c1296afba7cd7122b6bd1c17e5e3380f8de83260c07af67c846218575f097: Status 404 returned error can't find the container with id 4d0c1296afba7cd7122b6bd1c17e5e3380f8de83260c07af67c846218575f097 Oct 08 18:27:42 crc kubenswrapper[4750]: I1008 18:27:42.155964 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 08 18:27:42 crc kubenswrapper[4750]: I1008 18:27:42.157229 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 08 18:27:42 crc kubenswrapper[4750]: I1008 18:27:42.161520 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-sh4ds" Oct 08 18:27:42 crc kubenswrapper[4750]: I1008 18:27:42.166676 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 08 18:27:42 crc kubenswrapper[4750]: I1008 18:27:42.166896 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 08 18:27:42 crc kubenswrapper[4750]: I1008 18:27:42.166999 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 08 18:27:42 crc kubenswrapper[4750]: I1008 18:27:42.169203 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 08 18:27:42 crc kubenswrapper[4750]: I1008 18:27:42.170290 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 08 18:27:42 crc kubenswrapper[4750]: I1008 18:27:42.176745 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"43a52313-747b-40a7-a7e0-9e18f3c97c42","Type":"ContainerStarted","Data":"4d0c1296afba7cd7122b6bd1c17e5e3380f8de83260c07af67c846218575f097"} Oct 08 18:27:42 crc kubenswrapper[4750]: I1008 18:27:42.192877 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 08 18:27:42 crc kubenswrapper[4750]: I1008 18:27:42.271690 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b\") " pod="openstack/openstack-galera-0" Oct 08 18:27:42 crc kubenswrapper[4750]: I1008 18:27:42.271749 4750 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssp5w\" (UniqueName: \"kubernetes.io/projected/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b-kube-api-access-ssp5w\") pod \"openstack-galera-0\" (UID: \"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b\") " pod="openstack/openstack-galera-0" Oct 08 18:27:42 crc kubenswrapper[4750]: I1008 18:27:42.271793 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b\") " pod="openstack/openstack-galera-0" Oct 08 18:27:42 crc kubenswrapper[4750]: I1008 18:27:42.271870 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b-kolla-config\") pod \"openstack-galera-0\" (UID: \"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b\") " pod="openstack/openstack-galera-0" Oct 08 18:27:42 crc kubenswrapper[4750]: I1008 18:27:42.271898 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b\") " pod="openstack/openstack-galera-0" Oct 08 18:27:42 crc kubenswrapper[4750]: I1008 18:27:42.271941 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b-config-data-default\") pod \"openstack-galera-0\" (UID: \"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b\") " pod="openstack/openstack-galera-0" Oct 08 18:27:42 crc kubenswrapper[4750]: I1008 18:27:42.271971 4750 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b\") " pod="openstack/openstack-galera-0" Oct 08 18:27:42 crc kubenswrapper[4750]: I1008 18:27:42.272221 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b\") " pod="openstack/openstack-galera-0" Oct 08 18:27:42 crc kubenswrapper[4750]: I1008 18:27:42.272383 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b-secrets\") pod \"openstack-galera-0\" (UID: \"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b\") " pod="openstack/openstack-galera-0" Oct 08 18:27:42 crc kubenswrapper[4750]: I1008 18:27:42.376386 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b\") " pod="openstack/openstack-galera-0" Oct 08 18:27:42 crc kubenswrapper[4750]: I1008 18:27:42.376449 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b-config-data-default\") pod \"openstack-galera-0\" (UID: \"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b\") " pod="openstack/openstack-galera-0" Oct 08 18:27:42 crc kubenswrapper[4750]: I1008 18:27:42.376472 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b\") " pod="openstack/openstack-galera-0" Oct 08 18:27:42 crc kubenswrapper[4750]: I1008 18:27:42.376496 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b\") " pod="openstack/openstack-galera-0" Oct 08 18:27:42 crc kubenswrapper[4750]: I1008 18:27:42.376515 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b-secrets\") pod \"openstack-galera-0\" (UID: \"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b\") " pod="openstack/openstack-galera-0" Oct 08 18:27:42 crc kubenswrapper[4750]: I1008 18:27:42.376579 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b\") " pod="openstack/openstack-galera-0" Oct 08 18:27:42 crc kubenswrapper[4750]: I1008 18:27:42.376622 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssp5w\" (UniqueName: \"kubernetes.io/projected/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b-kube-api-access-ssp5w\") pod \"openstack-galera-0\" (UID: \"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b\") " pod="openstack/openstack-galera-0" Oct 08 18:27:42 crc kubenswrapper[4750]: I1008 18:27:42.376645 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b\") " 
pod="openstack/openstack-galera-0" Oct 08 18:27:42 crc kubenswrapper[4750]: I1008 18:27:42.376682 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b-kolla-config\") pod \"openstack-galera-0\" (UID: \"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b\") " pod="openstack/openstack-galera-0" Oct 08 18:27:42 crc kubenswrapper[4750]: I1008 18:27:42.377426 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b-kolla-config\") pod \"openstack-galera-0\" (UID: \"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b\") " pod="openstack/openstack-galera-0" Oct 08 18:27:42 crc kubenswrapper[4750]: I1008 18:27:42.377707 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b\") " pod="openstack/openstack-galera-0" Oct 08 18:27:42 crc kubenswrapper[4750]: I1008 18:27:42.377856 4750 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/openstack-galera-0" Oct 08 18:27:42 crc kubenswrapper[4750]: I1008 18:27:42.378018 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b-config-data-default\") pod \"openstack-galera-0\" (UID: \"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b\") " pod="openstack/openstack-galera-0" Oct 08 18:27:42 crc kubenswrapper[4750]: I1008 18:27:42.379108 4750 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b\") " pod="openstack/openstack-galera-0" Oct 08 18:27:42 crc kubenswrapper[4750]: I1008 18:27:42.383183 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b-secrets\") pod \"openstack-galera-0\" (UID: \"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b\") " pod="openstack/openstack-galera-0" Oct 08 18:27:42 crc kubenswrapper[4750]: I1008 18:27:42.386402 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b\") " pod="openstack/openstack-galera-0" Oct 08 18:27:42 crc kubenswrapper[4750]: I1008 18:27:42.387133 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b\") " pod="openstack/openstack-galera-0" Oct 08 18:27:42 crc kubenswrapper[4750]: I1008 18:27:42.404488 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssp5w\" (UniqueName: \"kubernetes.io/projected/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b-kube-api-access-ssp5w\") pod \"openstack-galera-0\" (UID: \"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b\") " pod="openstack/openstack-galera-0" Oct 08 18:27:42 crc kubenswrapper[4750]: I1008 18:27:42.430453 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: 
\"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b\") " pod="openstack/openstack-galera-0" Oct 08 18:27:42 crc kubenswrapper[4750]: I1008 18:27:42.501182 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 08 18:27:42 crc kubenswrapper[4750]: I1008 18:27:42.925215 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 08 18:27:42 crc kubenswrapper[4750]: W1008 18:27:42.987107 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcedba103_c6cd_4e9a_9c7c_80d90aaedb3b.slice/crio-04bcda7d410b5c881619148884f3f448e73cc4d0d377e86bde6b8a44712dd0c0 WatchSource:0}: Error finding container 04bcda7d410b5c881619148884f3f448e73cc4d0d377e86bde6b8a44712dd0c0: Status 404 returned error can't find the container with id 04bcda7d410b5c881619148884f3f448e73cc4d0d377e86bde6b8a44712dd0c0 Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.189781 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b","Type":"ContainerStarted","Data":"04bcda7d410b5c881619148884f3f448e73cc4d0d377e86bde6b8a44712dd0c0"} Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.196532 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.201914 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.206232 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.206460 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-dtmkn" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.207368 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.207529 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.211236 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.291957 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2feb2439-d911-4585-a5e1-671abcfa357d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"2feb2439-d911-4585-a5e1-671abcfa357d\") " pod="openstack/openstack-cell1-galera-0" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.291999 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2feb2439-d911-4585-a5e1-671abcfa357d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"2feb2439-d911-4585-a5e1-671abcfa357d\") " pod="openstack/openstack-cell1-galera-0" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.292057 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/2feb2439-d911-4585-a5e1-671abcfa357d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"2feb2439-d911-4585-a5e1-671abcfa357d\") " pod="openstack/openstack-cell1-galera-0" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.292115 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2feb2439-d911-4585-a5e1-671abcfa357d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"2feb2439-d911-4585-a5e1-671abcfa357d\") " pod="openstack/openstack-cell1-galera-0" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.292268 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"2feb2439-d911-4585-a5e1-671abcfa357d\") " pod="openstack/openstack-cell1-galera-0" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.292403 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2feb2439-d911-4585-a5e1-671abcfa357d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"2feb2439-d911-4585-a5e1-671abcfa357d\") " pod="openstack/openstack-cell1-galera-0" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.292492 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsnsq\" (UniqueName: \"kubernetes.io/projected/2feb2439-d911-4585-a5e1-671abcfa357d-kube-api-access-vsnsq\") pod \"openstack-cell1-galera-0\" (UID: \"2feb2439-d911-4585-a5e1-671abcfa357d\") " pod="openstack/openstack-cell1-galera-0" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.292519 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/2feb2439-d911-4585-a5e1-671abcfa357d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"2feb2439-d911-4585-a5e1-671abcfa357d\") " pod="openstack/openstack-cell1-galera-0" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.292595 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/2feb2439-d911-4585-a5e1-671abcfa357d-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"2feb2439-d911-4585-a5e1-671abcfa357d\") " pod="openstack/openstack-cell1-galera-0" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.393937 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2feb2439-d911-4585-a5e1-671abcfa357d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"2feb2439-d911-4585-a5e1-671abcfa357d\") " pod="openstack/openstack-cell1-galera-0" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.393998 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"2feb2439-d911-4585-a5e1-671abcfa357d\") " pod="openstack/openstack-cell1-galera-0" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.394071 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2feb2439-d911-4585-a5e1-671abcfa357d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"2feb2439-d911-4585-a5e1-671abcfa357d\") " pod="openstack/openstack-cell1-galera-0" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.394101 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsnsq\" (UniqueName: \"kubernetes.io/projected/2feb2439-d911-4585-a5e1-671abcfa357d-kube-api-access-vsnsq\") pod 
\"openstack-cell1-galera-0\" (UID: \"2feb2439-d911-4585-a5e1-671abcfa357d\") " pod="openstack/openstack-cell1-galera-0" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.394122 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2feb2439-d911-4585-a5e1-671abcfa357d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"2feb2439-d911-4585-a5e1-671abcfa357d\") " pod="openstack/openstack-cell1-galera-0" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.394157 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/2feb2439-d911-4585-a5e1-671abcfa357d-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"2feb2439-d911-4585-a5e1-671abcfa357d\") " pod="openstack/openstack-cell1-galera-0" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.394185 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2feb2439-d911-4585-a5e1-671abcfa357d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"2feb2439-d911-4585-a5e1-671abcfa357d\") " pod="openstack/openstack-cell1-galera-0" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.394202 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2feb2439-d911-4585-a5e1-671abcfa357d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"2feb2439-d911-4585-a5e1-671abcfa357d\") " pod="openstack/openstack-cell1-galera-0" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.394222 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2feb2439-d911-4585-a5e1-671abcfa357d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"2feb2439-d911-4585-a5e1-671abcfa357d\") " 
pod="openstack/openstack-cell1-galera-0" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.394292 4750 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"2feb2439-d911-4585-a5e1-671abcfa357d\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-cell1-galera-0" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.394789 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2feb2439-d911-4585-a5e1-671abcfa357d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"2feb2439-d911-4585-a5e1-671abcfa357d\") " pod="openstack/openstack-cell1-galera-0" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.397422 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2feb2439-d911-4585-a5e1-671abcfa357d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"2feb2439-d911-4585-a5e1-671abcfa357d\") " pod="openstack/openstack-cell1-galera-0" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.399146 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2feb2439-d911-4585-a5e1-671abcfa357d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"2feb2439-d911-4585-a5e1-671abcfa357d\") " pod="openstack/openstack-cell1-galera-0" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.400012 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2feb2439-d911-4585-a5e1-671abcfa357d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"2feb2439-d911-4585-a5e1-671abcfa357d\") " pod="openstack/openstack-cell1-galera-0" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 
18:27:43.406263 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2feb2439-d911-4585-a5e1-671abcfa357d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"2feb2439-d911-4585-a5e1-671abcfa357d\") " pod="openstack/openstack-cell1-galera-0" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.414282 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/2feb2439-d911-4585-a5e1-671abcfa357d-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"2feb2439-d911-4585-a5e1-671abcfa357d\") " pod="openstack/openstack-cell1-galera-0" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.414632 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsnsq\" (UniqueName: \"kubernetes.io/projected/2feb2439-d911-4585-a5e1-671abcfa357d-kube-api-access-vsnsq\") pod \"openstack-cell1-galera-0\" (UID: \"2feb2439-d911-4585-a5e1-671abcfa357d\") " pod="openstack/openstack-cell1-galera-0" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.423998 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"2feb2439-d911-4585-a5e1-671abcfa357d\") " pod="openstack/openstack-cell1-galera-0" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.441228 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2feb2439-d911-4585-a5e1-671abcfa357d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"2feb2439-d911-4585-a5e1-671abcfa357d\") " pod="openstack/openstack-cell1-galera-0" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.561689 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.658140 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.670007 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.673454 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.673760 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.673901 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-sc25z" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.679181 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.702206 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed7b2661-13bd-4ab4-a92d-7bc382cd257e-memcached-tls-certs\") pod \"memcached-0\" (UID: \"ed7b2661-13bd-4ab4-a92d-7bc382cd257e\") " pod="openstack/memcached-0" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.702370 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ed7b2661-13bd-4ab4-a92d-7bc382cd257e-kolla-config\") pod \"memcached-0\" (UID: \"ed7b2661-13bd-4ab4-a92d-7bc382cd257e\") " pod="openstack/memcached-0" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.702410 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/ed7b2661-13bd-4ab4-a92d-7bc382cd257e-config-data\") pod \"memcached-0\" (UID: \"ed7b2661-13bd-4ab4-a92d-7bc382cd257e\") " pod="openstack/memcached-0" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.702433 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mfw4\" (UniqueName: \"kubernetes.io/projected/ed7b2661-13bd-4ab4-a92d-7bc382cd257e-kube-api-access-2mfw4\") pod \"memcached-0\" (UID: \"ed7b2661-13bd-4ab4-a92d-7bc382cd257e\") " pod="openstack/memcached-0" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.702468 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed7b2661-13bd-4ab4-a92d-7bc382cd257e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"ed7b2661-13bd-4ab4-a92d-7bc382cd257e\") " pod="openstack/memcached-0" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.803594 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ed7b2661-13bd-4ab4-a92d-7bc382cd257e-kolla-config\") pod \"memcached-0\" (UID: \"ed7b2661-13bd-4ab4-a92d-7bc382cd257e\") " pod="openstack/memcached-0" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.803884 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ed7b2661-13bd-4ab4-a92d-7bc382cd257e-config-data\") pod \"memcached-0\" (UID: \"ed7b2661-13bd-4ab4-a92d-7bc382cd257e\") " pod="openstack/memcached-0" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.803902 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mfw4\" (UniqueName: \"kubernetes.io/projected/ed7b2661-13bd-4ab4-a92d-7bc382cd257e-kube-api-access-2mfw4\") pod \"memcached-0\" (UID: \"ed7b2661-13bd-4ab4-a92d-7bc382cd257e\") " 
pod="openstack/memcached-0" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.803924 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed7b2661-13bd-4ab4-a92d-7bc382cd257e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"ed7b2661-13bd-4ab4-a92d-7bc382cd257e\") " pod="openstack/memcached-0" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.803946 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed7b2661-13bd-4ab4-a92d-7bc382cd257e-memcached-tls-certs\") pod \"memcached-0\" (UID: \"ed7b2661-13bd-4ab4-a92d-7bc382cd257e\") " pod="openstack/memcached-0" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.804482 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ed7b2661-13bd-4ab4-a92d-7bc382cd257e-kolla-config\") pod \"memcached-0\" (UID: \"ed7b2661-13bd-4ab4-a92d-7bc382cd257e\") " pod="openstack/memcached-0" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.805389 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ed7b2661-13bd-4ab4-a92d-7bc382cd257e-config-data\") pod \"memcached-0\" (UID: \"ed7b2661-13bd-4ab4-a92d-7bc382cd257e\") " pod="openstack/memcached-0" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.814507 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed7b2661-13bd-4ab4-a92d-7bc382cd257e-memcached-tls-certs\") pod \"memcached-0\" (UID: \"ed7b2661-13bd-4ab4-a92d-7bc382cd257e\") " pod="openstack/memcached-0" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.815313 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ed7b2661-13bd-4ab4-a92d-7bc382cd257e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"ed7b2661-13bd-4ab4-a92d-7bc382cd257e\") " pod="openstack/memcached-0" Oct 08 18:27:43 crc kubenswrapper[4750]: I1008 18:27:43.838249 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mfw4\" (UniqueName: \"kubernetes.io/projected/ed7b2661-13bd-4ab4-a92d-7bc382cd257e-kube-api-access-2mfw4\") pod \"memcached-0\" (UID: \"ed7b2661-13bd-4ab4-a92d-7bc382cd257e\") " pod="openstack/memcached-0" Oct 08 18:27:44 crc kubenswrapper[4750]: I1008 18:27:44.008102 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 08 18:27:45 crc kubenswrapper[4750]: I1008 18:27:45.383980 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 18:27:45 crc kubenswrapper[4750]: I1008 18:27:45.385321 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 08 18:27:45 crc kubenswrapper[4750]: I1008 18:27:45.388734 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-lxfz2" Oct 08 18:27:45 crc kubenswrapper[4750]: I1008 18:27:45.396247 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 18:27:45 crc kubenswrapper[4750]: I1008 18:27:45.430761 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz2tm\" (UniqueName: \"kubernetes.io/projected/9f1aa3e0-67d1-4e4d-b162-b21c9f185d3e-kube-api-access-wz2tm\") pod \"kube-state-metrics-0\" (UID: \"9f1aa3e0-67d1-4e4d-b162-b21c9f185d3e\") " pod="openstack/kube-state-metrics-0" Oct 08 18:27:45 crc kubenswrapper[4750]: I1008 18:27:45.531587 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz2tm\" (UniqueName: 
\"kubernetes.io/projected/9f1aa3e0-67d1-4e4d-b162-b21c9f185d3e-kube-api-access-wz2tm\") pod \"kube-state-metrics-0\" (UID: \"9f1aa3e0-67d1-4e4d-b162-b21c9f185d3e\") " pod="openstack/kube-state-metrics-0" Oct 08 18:27:45 crc kubenswrapper[4750]: I1008 18:27:45.554747 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz2tm\" (UniqueName: \"kubernetes.io/projected/9f1aa3e0-67d1-4e4d-b162-b21c9f185d3e-kube-api-access-wz2tm\") pod \"kube-state-metrics-0\" (UID: \"9f1aa3e0-67d1-4e4d-b162-b21c9f185d3e\") " pod="openstack/kube-state-metrics-0" Oct 08 18:27:45 crc kubenswrapper[4750]: I1008 18:27:45.710117 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.100467 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-mkxdr"] Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.102936 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-mkxdr" Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.105780 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.106180 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.107010 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-wrf76" Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.109948 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mkxdr"] Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.158047 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-57vgx"] Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.159969 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-57vgx" Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.172112 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-57vgx"] Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.208292 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvpf8\" (UniqueName: \"kubernetes.io/projected/e6709646-0141-474b-b73f-6f451e77f602-kube-api-access-zvpf8\") pod \"ovn-controller-ovs-57vgx\" (UID: \"e6709646-0141-474b-b73f-6f451e77f602\") " pod="openstack/ovn-controller-ovs-57vgx" Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.208359 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cc808a1a-9703-4009-8d81-e555a8e25929-scripts\") pod \"ovn-controller-mkxdr\" (UID: \"cc808a1a-9703-4009-8d81-e555a8e25929\") " pod="openstack/ovn-controller-mkxdr" Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.208398 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/cc808a1a-9703-4009-8d81-e555a8e25929-var-log-ovn\") pod \"ovn-controller-mkxdr\" (UID: \"cc808a1a-9703-4009-8d81-e555a8e25929\") " pod="openstack/ovn-controller-mkxdr" Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.208495 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfqlx\" (UniqueName: \"kubernetes.io/projected/cc808a1a-9703-4009-8d81-e555a8e25929-kube-api-access-pfqlx\") pod \"ovn-controller-mkxdr\" (UID: \"cc808a1a-9703-4009-8d81-e555a8e25929\") " pod="openstack/ovn-controller-mkxdr" Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.208569 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc808a1a-9703-4009-8d81-e555a8e25929-ovn-controller-tls-certs\") pod \"ovn-controller-mkxdr\" (UID: \"cc808a1a-9703-4009-8d81-e555a8e25929\") " pod="openstack/ovn-controller-mkxdr" Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.208644 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e6709646-0141-474b-b73f-6f451e77f602-scripts\") pod \"ovn-controller-ovs-57vgx\" (UID: \"e6709646-0141-474b-b73f-6f451e77f602\") " pod="openstack/ovn-controller-ovs-57vgx" Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.208675 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/e6709646-0141-474b-b73f-6f451e77f602-var-log\") pod \"ovn-controller-ovs-57vgx\" (UID: \"e6709646-0141-474b-b73f-6f451e77f602\") " pod="openstack/ovn-controller-ovs-57vgx" Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.208737 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cc808a1a-9703-4009-8d81-e555a8e25929-var-run\") pod \"ovn-controller-mkxdr\" (UID: \"cc808a1a-9703-4009-8d81-e555a8e25929\") " pod="openstack/ovn-controller-mkxdr" Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.208838 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e6709646-0141-474b-b73f-6f451e77f602-var-run\") pod \"ovn-controller-ovs-57vgx\" (UID: \"e6709646-0141-474b-b73f-6f451e77f602\") " pod="openstack/ovn-controller-ovs-57vgx" Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.208880 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: 
\"kubernetes.io/host-path/e6709646-0141-474b-b73f-6f451e77f602-var-lib\") pod \"ovn-controller-ovs-57vgx\" (UID: \"e6709646-0141-474b-b73f-6f451e77f602\") " pod="openstack/ovn-controller-ovs-57vgx" Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.208999 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/cc808a1a-9703-4009-8d81-e555a8e25929-var-run-ovn\") pod \"ovn-controller-mkxdr\" (UID: \"cc808a1a-9703-4009-8d81-e555a8e25929\") " pod="openstack/ovn-controller-mkxdr" Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.209030 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc808a1a-9703-4009-8d81-e555a8e25929-combined-ca-bundle\") pod \"ovn-controller-mkxdr\" (UID: \"cc808a1a-9703-4009-8d81-e555a8e25929\") " pod="openstack/ovn-controller-mkxdr" Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.209091 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/e6709646-0141-474b-b73f-6f451e77f602-etc-ovs\") pod \"ovn-controller-ovs-57vgx\" (UID: \"e6709646-0141-474b-b73f-6f451e77f602\") " pod="openstack/ovn-controller-ovs-57vgx" Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.310803 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/e6709646-0141-474b-b73f-6f451e77f602-var-lib\") pod \"ovn-controller-ovs-57vgx\" (UID: \"e6709646-0141-474b-b73f-6f451e77f602\") " pod="openstack/ovn-controller-ovs-57vgx" Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.310891 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/cc808a1a-9703-4009-8d81-e555a8e25929-var-run-ovn\") pod 
\"ovn-controller-mkxdr\" (UID: \"cc808a1a-9703-4009-8d81-e555a8e25929\") " pod="openstack/ovn-controller-mkxdr" Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.310914 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc808a1a-9703-4009-8d81-e555a8e25929-combined-ca-bundle\") pod \"ovn-controller-mkxdr\" (UID: \"cc808a1a-9703-4009-8d81-e555a8e25929\") " pod="openstack/ovn-controller-mkxdr" Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.310944 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/e6709646-0141-474b-b73f-6f451e77f602-etc-ovs\") pod \"ovn-controller-ovs-57vgx\" (UID: \"e6709646-0141-474b-b73f-6f451e77f602\") " pod="openstack/ovn-controller-ovs-57vgx" Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.310984 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvpf8\" (UniqueName: \"kubernetes.io/projected/e6709646-0141-474b-b73f-6f451e77f602-kube-api-access-zvpf8\") pod \"ovn-controller-ovs-57vgx\" (UID: \"e6709646-0141-474b-b73f-6f451e77f602\") " pod="openstack/ovn-controller-ovs-57vgx" Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.311020 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cc808a1a-9703-4009-8d81-e555a8e25929-scripts\") pod \"ovn-controller-mkxdr\" (UID: \"cc808a1a-9703-4009-8d81-e555a8e25929\") " pod="openstack/ovn-controller-mkxdr" Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.311044 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/cc808a1a-9703-4009-8d81-e555a8e25929-var-log-ovn\") pod \"ovn-controller-mkxdr\" (UID: \"cc808a1a-9703-4009-8d81-e555a8e25929\") " pod="openstack/ovn-controller-mkxdr" Oct 08 18:27:50 crc 
kubenswrapper[4750]: I1008 18:27:50.311066 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfqlx\" (UniqueName: \"kubernetes.io/projected/cc808a1a-9703-4009-8d81-e555a8e25929-kube-api-access-pfqlx\") pod \"ovn-controller-mkxdr\" (UID: \"cc808a1a-9703-4009-8d81-e555a8e25929\") " pod="openstack/ovn-controller-mkxdr" Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.311094 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc808a1a-9703-4009-8d81-e555a8e25929-ovn-controller-tls-certs\") pod \"ovn-controller-mkxdr\" (UID: \"cc808a1a-9703-4009-8d81-e555a8e25929\") " pod="openstack/ovn-controller-mkxdr" Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.311130 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/e6709646-0141-474b-b73f-6f451e77f602-var-log\") pod \"ovn-controller-ovs-57vgx\" (UID: \"e6709646-0141-474b-b73f-6f451e77f602\") " pod="openstack/ovn-controller-ovs-57vgx" Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.311149 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e6709646-0141-474b-b73f-6f451e77f602-scripts\") pod \"ovn-controller-ovs-57vgx\" (UID: \"e6709646-0141-474b-b73f-6f451e77f602\") " pod="openstack/ovn-controller-ovs-57vgx" Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.311185 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cc808a1a-9703-4009-8d81-e555a8e25929-var-run\") pod \"ovn-controller-mkxdr\" (UID: \"cc808a1a-9703-4009-8d81-e555a8e25929\") " pod="openstack/ovn-controller-mkxdr" Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.311222 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-run\" (UniqueName: \"kubernetes.io/host-path/e6709646-0141-474b-b73f-6f451e77f602-var-run\") pod \"ovn-controller-ovs-57vgx\" (UID: \"e6709646-0141-474b-b73f-6f451e77f602\") " pod="openstack/ovn-controller-ovs-57vgx" Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.311577 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/e6709646-0141-474b-b73f-6f451e77f602-etc-ovs\") pod \"ovn-controller-ovs-57vgx\" (UID: \"e6709646-0141-474b-b73f-6f451e77f602\") " pod="openstack/ovn-controller-ovs-57vgx" Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.311938 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/e6709646-0141-474b-b73f-6f451e77f602-var-log\") pod \"ovn-controller-ovs-57vgx\" (UID: \"e6709646-0141-474b-b73f-6f451e77f602\") " pod="openstack/ovn-controller-ovs-57vgx" Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.312030 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/cc808a1a-9703-4009-8d81-e555a8e25929-var-log-ovn\") pod \"ovn-controller-mkxdr\" (UID: \"cc808a1a-9703-4009-8d81-e555a8e25929\") " pod="openstack/ovn-controller-mkxdr" Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.312027 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/cc808a1a-9703-4009-8d81-e555a8e25929-var-run-ovn\") pod \"ovn-controller-mkxdr\" (UID: \"cc808a1a-9703-4009-8d81-e555a8e25929\") " pod="openstack/ovn-controller-mkxdr" Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.312173 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cc808a1a-9703-4009-8d81-e555a8e25929-var-run\") pod \"ovn-controller-mkxdr\" (UID: \"cc808a1a-9703-4009-8d81-e555a8e25929\") " 
pod="openstack/ovn-controller-mkxdr" Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.312229 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e6709646-0141-474b-b73f-6f451e77f602-var-run\") pod \"ovn-controller-ovs-57vgx\" (UID: \"e6709646-0141-474b-b73f-6f451e77f602\") " pod="openstack/ovn-controller-ovs-57vgx" Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.314539 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e6709646-0141-474b-b73f-6f451e77f602-scripts\") pod \"ovn-controller-ovs-57vgx\" (UID: \"e6709646-0141-474b-b73f-6f451e77f602\") " pod="openstack/ovn-controller-ovs-57vgx" Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.314611 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/e6709646-0141-474b-b73f-6f451e77f602-var-lib\") pod \"ovn-controller-ovs-57vgx\" (UID: \"e6709646-0141-474b-b73f-6f451e77f602\") " pod="openstack/ovn-controller-ovs-57vgx" Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.315166 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cc808a1a-9703-4009-8d81-e555a8e25929-scripts\") pod \"ovn-controller-mkxdr\" (UID: \"cc808a1a-9703-4009-8d81-e555a8e25929\") " pod="openstack/ovn-controller-mkxdr" Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.319202 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc808a1a-9703-4009-8d81-e555a8e25929-combined-ca-bundle\") pod \"ovn-controller-mkxdr\" (UID: \"cc808a1a-9703-4009-8d81-e555a8e25929\") " pod="openstack/ovn-controller-mkxdr" Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.320954 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/cc808a1a-9703-4009-8d81-e555a8e25929-ovn-controller-tls-certs\") pod \"ovn-controller-mkxdr\" (UID: \"cc808a1a-9703-4009-8d81-e555a8e25929\") " pod="openstack/ovn-controller-mkxdr" Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.331147 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvpf8\" (UniqueName: \"kubernetes.io/projected/e6709646-0141-474b-b73f-6f451e77f602-kube-api-access-zvpf8\") pod \"ovn-controller-ovs-57vgx\" (UID: \"e6709646-0141-474b-b73f-6f451e77f602\") " pod="openstack/ovn-controller-ovs-57vgx" Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.337401 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfqlx\" (UniqueName: \"kubernetes.io/projected/cc808a1a-9703-4009-8d81-e555a8e25929-kube-api-access-pfqlx\") pod \"ovn-controller-mkxdr\" (UID: \"cc808a1a-9703-4009-8d81-e555a8e25929\") " pod="openstack/ovn-controller-mkxdr" Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.426660 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mkxdr" Oct 08 18:27:50 crc kubenswrapper[4750]: I1008 18:27:50.479957 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-57vgx" Oct 08 18:27:52 crc kubenswrapper[4750]: I1008 18:27:52.478735 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 08 18:27:52 crc kubenswrapper[4750]: I1008 18:27:52.480324 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 08 18:27:52 crc kubenswrapper[4750]: I1008 18:27:52.483287 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 08 18:27:52 crc kubenswrapper[4750]: I1008 18:27:52.483644 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 08 18:27:52 crc kubenswrapper[4750]: I1008 18:27:52.484526 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 08 18:27:52 crc kubenswrapper[4750]: I1008 18:27:52.484603 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 08 18:27:52 crc kubenswrapper[4750]: I1008 18:27:52.484603 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-gl267" Oct 08 18:27:52 crc kubenswrapper[4750]: I1008 18:27:52.490376 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 08 18:27:52 crc kubenswrapper[4750]: I1008 18:27:52.551737 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/590d851e-4648-48db-b385-aaa732f5c787-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"590d851e-4648-48db-b385-aaa732f5c787\") " pod="openstack/ovsdbserver-sb-0" Oct 08 18:27:52 crc kubenswrapper[4750]: I1008 18:27:52.551792 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/590d851e-4648-48db-b385-aaa732f5c787-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"590d851e-4648-48db-b385-aaa732f5c787\") " pod="openstack/ovsdbserver-sb-0" Oct 08 18:27:52 crc kubenswrapper[4750]: I1008 18:27:52.551828 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"590d851e-4648-48db-b385-aaa732f5c787\") " pod="openstack/ovsdbserver-sb-0" Oct 08 18:27:52 crc kubenswrapper[4750]: I1008 18:27:52.551940 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/590d851e-4648-48db-b385-aaa732f5c787-config\") pod \"ovsdbserver-sb-0\" (UID: \"590d851e-4648-48db-b385-aaa732f5c787\") " pod="openstack/ovsdbserver-sb-0" Oct 08 18:27:52 crc kubenswrapper[4750]: I1008 18:27:52.552056 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/590d851e-4648-48db-b385-aaa732f5c787-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"590d851e-4648-48db-b385-aaa732f5c787\") " pod="openstack/ovsdbserver-sb-0" Oct 08 18:27:52 crc kubenswrapper[4750]: I1008 18:27:52.552105 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/590d851e-4648-48db-b385-aaa732f5c787-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"590d851e-4648-48db-b385-aaa732f5c787\") " pod="openstack/ovsdbserver-sb-0" Oct 08 18:27:52 crc kubenswrapper[4750]: I1008 18:27:52.552133 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/590d851e-4648-48db-b385-aaa732f5c787-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"590d851e-4648-48db-b385-aaa732f5c787\") " pod="openstack/ovsdbserver-sb-0" Oct 08 18:27:52 crc kubenswrapper[4750]: I1008 18:27:52.552157 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hkkr\" (UniqueName: 
\"kubernetes.io/projected/590d851e-4648-48db-b385-aaa732f5c787-kube-api-access-6hkkr\") pod \"ovsdbserver-sb-0\" (UID: \"590d851e-4648-48db-b385-aaa732f5c787\") " pod="openstack/ovsdbserver-sb-0" Oct 08 18:27:52 crc kubenswrapper[4750]: I1008 18:27:52.654061 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/590d851e-4648-48db-b385-aaa732f5c787-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"590d851e-4648-48db-b385-aaa732f5c787\") " pod="openstack/ovsdbserver-sb-0" Oct 08 18:27:52 crc kubenswrapper[4750]: I1008 18:27:52.654362 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"590d851e-4648-48db-b385-aaa732f5c787\") " pod="openstack/ovsdbserver-sb-0" Oct 08 18:27:52 crc kubenswrapper[4750]: I1008 18:27:52.654418 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/590d851e-4648-48db-b385-aaa732f5c787-config\") pod \"ovsdbserver-sb-0\" (UID: \"590d851e-4648-48db-b385-aaa732f5c787\") " pod="openstack/ovsdbserver-sb-0" Oct 08 18:27:52 crc kubenswrapper[4750]: I1008 18:27:52.654482 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/590d851e-4648-48db-b385-aaa732f5c787-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"590d851e-4648-48db-b385-aaa732f5c787\") " pod="openstack/ovsdbserver-sb-0" Oct 08 18:27:52 crc kubenswrapper[4750]: I1008 18:27:52.654807 4750 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"590d851e-4648-48db-b385-aaa732f5c787\") device mount path \"/mnt/openstack/pv08\"" 
pod="openstack/ovsdbserver-sb-0" Oct 08 18:27:52 crc kubenswrapper[4750]: I1008 18:27:52.655583 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/590d851e-4648-48db-b385-aaa732f5c787-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"590d851e-4648-48db-b385-aaa732f5c787\") " pod="openstack/ovsdbserver-sb-0" Oct 08 18:27:52 crc kubenswrapper[4750]: I1008 18:27:52.655651 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/590d851e-4648-48db-b385-aaa732f5c787-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"590d851e-4648-48db-b385-aaa732f5c787\") " pod="openstack/ovsdbserver-sb-0" Oct 08 18:27:52 crc kubenswrapper[4750]: I1008 18:27:52.655584 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/590d851e-4648-48db-b385-aaa732f5c787-config\") pod \"ovsdbserver-sb-0\" (UID: \"590d851e-4648-48db-b385-aaa732f5c787\") " pod="openstack/ovsdbserver-sb-0" Oct 08 18:27:52 crc kubenswrapper[4750]: I1008 18:27:52.655705 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hkkr\" (UniqueName: \"kubernetes.io/projected/590d851e-4648-48db-b385-aaa732f5c787-kube-api-access-6hkkr\") pod \"ovsdbserver-sb-0\" (UID: \"590d851e-4648-48db-b385-aaa732f5c787\") " pod="openstack/ovsdbserver-sb-0" Oct 08 18:27:52 crc kubenswrapper[4750]: I1008 18:27:52.655996 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/590d851e-4648-48db-b385-aaa732f5c787-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"590d851e-4648-48db-b385-aaa732f5c787\") " pod="openstack/ovsdbserver-sb-0" Oct 08 18:27:52 crc kubenswrapper[4750]: I1008 18:27:52.656287 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/590d851e-4648-48db-b385-aaa732f5c787-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"590d851e-4648-48db-b385-aaa732f5c787\") " pod="openstack/ovsdbserver-sb-0" Oct 08 18:27:52 crc kubenswrapper[4750]: I1008 18:27:52.657318 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/590d851e-4648-48db-b385-aaa732f5c787-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"590d851e-4648-48db-b385-aaa732f5c787\") " pod="openstack/ovsdbserver-sb-0" Oct 08 18:27:52 crc kubenswrapper[4750]: I1008 18:27:52.658344 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/590d851e-4648-48db-b385-aaa732f5c787-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"590d851e-4648-48db-b385-aaa732f5c787\") " pod="openstack/ovsdbserver-sb-0" Oct 08 18:27:52 crc kubenswrapper[4750]: I1008 18:27:52.658678 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/590d851e-4648-48db-b385-aaa732f5c787-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"590d851e-4648-48db-b385-aaa732f5c787\") " pod="openstack/ovsdbserver-sb-0" Oct 08 18:27:52 crc kubenswrapper[4750]: I1008 18:27:52.658685 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/590d851e-4648-48db-b385-aaa732f5c787-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"590d851e-4648-48db-b385-aaa732f5c787\") " pod="openstack/ovsdbserver-sb-0" Oct 08 18:27:52 crc kubenswrapper[4750]: I1008 18:27:52.682592 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"590d851e-4648-48db-b385-aaa732f5c787\") " pod="openstack/ovsdbserver-sb-0" Oct 08 18:27:52 crc 
kubenswrapper[4750]: I1008 18:27:52.686763 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hkkr\" (UniqueName: \"kubernetes.io/projected/590d851e-4648-48db-b385-aaa732f5c787-kube-api-access-6hkkr\") pod \"ovsdbserver-sb-0\" (UID: \"590d851e-4648-48db-b385-aaa732f5c787\") " pod="openstack/ovsdbserver-sb-0" Oct 08 18:27:52 crc kubenswrapper[4750]: I1008 18:27:52.799164 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 08 18:27:53 crc kubenswrapper[4750]: I1008 18:27:53.254331 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 08 18:27:53 crc kubenswrapper[4750]: I1008 18:27:53.255915 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 08 18:27:53 crc kubenswrapper[4750]: I1008 18:27:53.263391 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 08 18:27:53 crc kubenswrapper[4750]: I1008 18:27:53.263512 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 08 18:27:53 crc kubenswrapper[4750]: I1008 18:27:53.263877 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 08 18:27:53 crc kubenswrapper[4750]: I1008 18:27:53.264000 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-jv754" Oct 08 18:27:53 crc kubenswrapper[4750]: I1008 18:27:53.266279 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 08 18:27:53 crc kubenswrapper[4750]: I1008 18:27:53.368891 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/235930b1-672c-4fc6-bbb4-78204c591aee-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: 
\"235930b1-672c-4fc6-bbb4-78204c591aee\") " pod="openstack/ovsdbserver-nb-0" Oct 08 18:27:53 crc kubenswrapper[4750]: I1008 18:27:53.368942 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/235930b1-672c-4fc6-bbb4-78204c591aee-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"235930b1-672c-4fc6-bbb4-78204c591aee\") " pod="openstack/ovsdbserver-nb-0" Oct 08 18:27:53 crc kubenswrapper[4750]: I1008 18:27:53.368975 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/235930b1-672c-4fc6-bbb4-78204c591aee-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"235930b1-672c-4fc6-bbb4-78204c591aee\") " pod="openstack/ovsdbserver-nb-0" Oct 08 18:27:53 crc kubenswrapper[4750]: I1008 18:27:53.369180 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/235930b1-672c-4fc6-bbb4-78204c591aee-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"235930b1-672c-4fc6-bbb4-78204c591aee\") " pod="openstack/ovsdbserver-nb-0" Oct 08 18:27:53 crc kubenswrapper[4750]: I1008 18:27:53.369300 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/235930b1-672c-4fc6-bbb4-78204c591aee-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"235930b1-672c-4fc6-bbb4-78204c591aee\") " pod="openstack/ovsdbserver-nb-0" Oct 08 18:27:53 crc kubenswrapper[4750]: I1008 18:27:53.369351 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"235930b1-672c-4fc6-bbb4-78204c591aee\") " pod="openstack/ovsdbserver-nb-0" Oct 
08 18:27:53 crc kubenswrapper[4750]: I1008 18:27:53.369382 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/235930b1-672c-4fc6-bbb4-78204c591aee-config\") pod \"ovsdbserver-nb-0\" (UID: \"235930b1-672c-4fc6-bbb4-78204c591aee\") " pod="openstack/ovsdbserver-nb-0" Oct 08 18:27:53 crc kubenswrapper[4750]: I1008 18:27:53.369419 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn85c\" (UniqueName: \"kubernetes.io/projected/235930b1-672c-4fc6-bbb4-78204c591aee-kube-api-access-jn85c\") pod \"ovsdbserver-nb-0\" (UID: \"235930b1-672c-4fc6-bbb4-78204c591aee\") " pod="openstack/ovsdbserver-nb-0" Oct 08 18:27:53 crc kubenswrapper[4750]: I1008 18:27:53.470453 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/235930b1-672c-4fc6-bbb4-78204c591aee-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"235930b1-672c-4fc6-bbb4-78204c591aee\") " pod="openstack/ovsdbserver-nb-0" Oct 08 18:27:53 crc kubenswrapper[4750]: I1008 18:27:53.470585 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/235930b1-672c-4fc6-bbb4-78204c591aee-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"235930b1-672c-4fc6-bbb4-78204c591aee\") " pod="openstack/ovsdbserver-nb-0" Oct 08 18:27:53 crc kubenswrapper[4750]: I1008 18:27:53.470640 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/235930b1-672c-4fc6-bbb4-78204c591aee-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"235930b1-672c-4fc6-bbb4-78204c591aee\") " pod="openstack/ovsdbserver-nb-0" Oct 08 18:27:53 crc kubenswrapper[4750]: I1008 18:27:53.470663 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"235930b1-672c-4fc6-bbb4-78204c591aee\") " pod="openstack/ovsdbserver-nb-0" Oct 08 18:27:53 crc kubenswrapper[4750]: I1008 18:27:53.470685 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/235930b1-672c-4fc6-bbb4-78204c591aee-config\") pod \"ovsdbserver-nb-0\" (UID: \"235930b1-672c-4fc6-bbb4-78204c591aee\") " pod="openstack/ovsdbserver-nb-0" Oct 08 18:27:53 crc kubenswrapper[4750]: I1008 18:27:53.470715 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jn85c\" (UniqueName: \"kubernetes.io/projected/235930b1-672c-4fc6-bbb4-78204c591aee-kube-api-access-jn85c\") pod \"ovsdbserver-nb-0\" (UID: \"235930b1-672c-4fc6-bbb4-78204c591aee\") " pod="openstack/ovsdbserver-nb-0" Oct 08 18:27:53 crc kubenswrapper[4750]: I1008 18:27:53.470755 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/235930b1-672c-4fc6-bbb4-78204c591aee-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"235930b1-672c-4fc6-bbb4-78204c591aee\") " pod="openstack/ovsdbserver-nb-0" Oct 08 18:27:53 crc kubenswrapper[4750]: I1008 18:27:53.470780 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/235930b1-672c-4fc6-bbb4-78204c591aee-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"235930b1-672c-4fc6-bbb4-78204c591aee\") " pod="openstack/ovsdbserver-nb-0" Oct 08 18:27:53 crc kubenswrapper[4750]: I1008 18:27:53.470986 4750 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"235930b1-672c-4fc6-bbb4-78204c591aee\") device mount path 
\"/mnt/openstack/pv10\"" pod="openstack/ovsdbserver-nb-0" Oct 08 18:27:53 crc kubenswrapper[4750]: I1008 18:27:53.472403 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/235930b1-672c-4fc6-bbb4-78204c591aee-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"235930b1-672c-4fc6-bbb4-78204c591aee\") " pod="openstack/ovsdbserver-nb-0" Oct 08 18:27:53 crc kubenswrapper[4750]: I1008 18:27:53.472714 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/235930b1-672c-4fc6-bbb4-78204c591aee-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"235930b1-672c-4fc6-bbb4-78204c591aee\") " pod="openstack/ovsdbserver-nb-0" Oct 08 18:27:53 crc kubenswrapper[4750]: I1008 18:27:53.473398 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/235930b1-672c-4fc6-bbb4-78204c591aee-config\") pod \"ovsdbserver-nb-0\" (UID: \"235930b1-672c-4fc6-bbb4-78204c591aee\") " pod="openstack/ovsdbserver-nb-0" Oct 08 18:27:53 crc kubenswrapper[4750]: I1008 18:27:53.476792 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/235930b1-672c-4fc6-bbb4-78204c591aee-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"235930b1-672c-4fc6-bbb4-78204c591aee\") " pod="openstack/ovsdbserver-nb-0" Oct 08 18:27:53 crc kubenswrapper[4750]: I1008 18:27:53.478216 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/235930b1-672c-4fc6-bbb4-78204c591aee-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"235930b1-672c-4fc6-bbb4-78204c591aee\") " pod="openstack/ovsdbserver-nb-0" Oct 08 18:27:53 crc kubenswrapper[4750]: I1008 18:27:53.491280 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/235930b1-672c-4fc6-bbb4-78204c591aee-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"235930b1-672c-4fc6-bbb4-78204c591aee\") " pod="openstack/ovsdbserver-nb-0" Oct 08 18:27:53 crc kubenswrapper[4750]: I1008 18:27:53.493244 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn85c\" (UniqueName: \"kubernetes.io/projected/235930b1-672c-4fc6-bbb4-78204c591aee-kube-api-access-jn85c\") pod \"ovsdbserver-nb-0\" (UID: \"235930b1-672c-4fc6-bbb4-78204c591aee\") " pod="openstack/ovsdbserver-nb-0" Oct 08 18:27:53 crc kubenswrapper[4750]: I1008 18:27:53.504781 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"235930b1-672c-4fc6-bbb4-78204c591aee\") " pod="openstack/ovsdbserver-nb-0" Oct 08 18:27:53 crc kubenswrapper[4750]: I1008 18:27:53.582384 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 08 18:28:03 crc kubenswrapper[4750]: E1008 18:28:03.464297 4750 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb@sha256:56b75d97f4a48c8cf58b3a7c18c43618efb308bf0188124f6301142e61299b0c" Oct 08 18:28:03 crc kubenswrapper[4750]: E1008 18:28:03.464866 4750 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:56b75d97f4a48c8cf58b3a7c18c43618efb308bf0188124f6301142e61299b0c,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:DB_ROOT_PASSWORD,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:DbRootPassword,Optional:nil,},},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:secrets,ReadOnly:true,MountPath:/var/lib/secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,Su
bPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ssp5w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(cedba103-c6cd-4e9a-9c7c-80d90aaedb3b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 08 18:28:03 crc kubenswrapper[4750]: E1008 18:28:03.466107 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="cedba103-c6cd-4e9a-9c7c-80d90aaedb3b" Oct 08 18:28:04 crc kubenswrapper[4750]: E1008 18:28:04.378600 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb@sha256:56b75d97f4a48c8cf58b3a7c18c43618efb308bf0188124f6301142e61299b0c\\\"\"" pod="openstack/openstack-galera-0" podUID="cedba103-c6cd-4e9a-9c7c-80d90aaedb3b" Oct 08 18:28:14 crc 
kubenswrapper[4750]: I1008 18:28:14.148327 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 18:28:14 crc kubenswrapper[4750]: I1008 18:28:14.221350 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 08 18:28:14 crc kubenswrapper[4750]: I1008 18:28:14.458992 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"ed7b2661-13bd-4ab4-a92d-7bc382cd257e","Type":"ContainerStarted","Data":"126b5ab36bd9d3388c9838ad02869649f2369955b2465b5437361ef61c300edb"} Oct 08 18:28:14 crc kubenswrapper[4750]: I1008 18:28:14.461187 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9f1aa3e0-67d1-4e4d-b162-b21c9f185d3e","Type":"ContainerStarted","Data":"92c19aa0fc4019d8868e8a01e9e8729c234e58647563aa88315e7b75a63d334c"} Oct 08 18:28:14 crc kubenswrapper[4750]: I1008 18:28:14.504780 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 08 18:28:14 crc kubenswrapper[4750]: W1008 18:28:14.511197 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2feb2439_d911_4585_a5e1_671abcfa357d.slice/crio-7c86a532bcb761c12dc472052ed8995adaafbe37a87f507cc02f082ef77bb416 WatchSource:0}: Error finding container 7c86a532bcb761c12dc472052ed8995adaafbe37a87f507cc02f082ef77bb416: Status 404 returned error can't find the container with id 7c86a532bcb761c12dc472052ed8995adaafbe37a87f507cc02f082ef77bb416 Oct 08 18:28:14 crc kubenswrapper[4750]: E1008 18:28:14.529234 4750 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df" Oct 08 18:28:14 crc kubenswrapper[4750]: E1008 18:28:14.529414 4750 
kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rs2rv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,}
,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-758b79db4c-5s7sn_openstack(7402f445-a72e-4d45-bad2-d27f14188109): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 08 18:28:14 crc kubenswrapper[4750]: E1008 18:28:14.530581 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-758b79db4c-5s7sn" podUID="7402f445-a72e-4d45-bad2-d27f14188109" Oct 08 18:28:14 crc kubenswrapper[4750]: I1008 18:28:14.610037 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mkxdr"] Oct 08 18:28:14 crc kubenswrapper[4750]: W1008 18:28:14.646993 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc808a1a_9703_4009_8d81_e555a8e25929.slice/crio-4a772474d6df97cd4f6099df8b8304bbd40360378e40889137f79e4742eb9c2d WatchSource:0}: Error finding container 4a772474d6df97cd4f6099df8b8304bbd40360378e40889137f79e4742eb9c2d: Status 404 returned error can't find the container with id 4a772474d6df97cd4f6099df8b8304bbd40360378e40889137f79e4742eb9c2d Oct 08 18:28:14 crc kubenswrapper[4750]: I1008 18:28:14.724164 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-57vgx"] Oct 08 18:28:15 crc kubenswrapper[4750]: E1008 18:28:15.074728 4750 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df" Oct 08 18:28:15 crc kubenswrapper[4750]: E1008 
18:28:15.074900 4750 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lll4q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupPro
be:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-7bfcb9d745-6c4pl_openstack(2b6f5de9-2c2f-4669-8449-e60e7e750821): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 08 18:28:15 crc kubenswrapper[4750]: E1008 18:28:15.076134 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-7bfcb9d745-6c4pl" podUID="2b6f5de9-2c2f-4669-8449-e60e7e750821" Oct 08 18:28:15 crc kubenswrapper[4750]: I1008 18:28:15.241277 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 08 18:28:15 crc kubenswrapper[4750]: E1008 18:28:15.365419 4750 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df" Oct 08 18:28:15 crc kubenswrapper[4750]: E1008 18:28:15.365604 4750 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4zbwj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-644597f84c-58wp9_openstack(fdfa2238-e6b5-40ce-9945-ce12691e2ef6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 08 18:28:15 crc kubenswrapper[4750]: E1008 18:28:15.367092 4750 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-644597f84c-58wp9" podUID="fdfa2238-e6b5-40ce-9945-ce12691e2ef6" Oct 08 18:28:15 crc kubenswrapper[4750]: E1008 18:28:15.407135 4750 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df" Oct 08 18:28:15 crc kubenswrapper[4750]: E1008 18:28:15.407341 4750 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9xs8q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-77597f887-jncg2_openstack(982c6d04-b346-4001-b6e5-2772413a1172): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 08 18:28:15 crc kubenswrapper[4750]: E1008 18:28:15.408652 4750 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-77597f887-jncg2" podUID="982c6d04-b346-4001-b6e5-2772413a1172" Oct 08 18:28:15 crc kubenswrapper[4750]: I1008 18:28:15.470449 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2feb2439-d911-4585-a5e1-671abcfa357d","Type":"ContainerStarted","Data":"7c86a532bcb761c12dc472052ed8995adaafbe37a87f507cc02f082ef77bb416"} Oct 08 18:28:15 crc kubenswrapper[4750]: I1008 18:28:15.472883 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"590d851e-4648-48db-b385-aaa732f5c787","Type":"ContainerStarted","Data":"b2f0fc66c91c7f6856410a2b65f196da6d53142b0da13deab6cd17b27c2b1da2"} Oct 08 18:28:15 crc kubenswrapper[4750]: I1008 18:28:15.474476 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mkxdr" event={"ID":"cc808a1a-9703-4009-8d81-e555a8e25929","Type":"ContainerStarted","Data":"4a772474d6df97cd4f6099df8b8304bbd40360378e40889137f79e4742eb9c2d"} Oct 08 18:28:15 crc kubenswrapper[4750]: I1008 18:28:15.475433 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-57vgx" event={"ID":"e6709646-0141-474b-b73f-6f451e77f602","Type":"ContainerStarted","Data":"f34e50a0f4606db14bbae944ca5a2cd7ca2937bb8ebda711d0061b299aca8ac6"} Oct 08 18:28:15 crc kubenswrapper[4750]: I1008 18:28:15.479813 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"43a52313-747b-40a7-a7e0-9e18f3c97c42","Type":"ContainerStarted","Data":"fb11e627f4eaf60a3b819ad8d9b435832ac804f82f9736be30305c3e47ac4565"} Oct 08 18:28:15 crc kubenswrapper[4750]: I1008 18:28:15.482486 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"5b8108eb-834c-44bd-9f39-70c348388ab6","Type":"ContainerStarted","Data":"40ab5b6bde85f8c7024b6c550a72f006a95cf43289623b823b0fcdb2413a3ef2"} Oct 08 18:28:15 crc kubenswrapper[4750]: E1008 18:28:15.483172 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df\\\"\"" pod="openstack/dnsmasq-dns-77597f887-jncg2" podUID="982c6d04-b346-4001-b6e5-2772413a1172" Oct 08 18:28:15 crc kubenswrapper[4750]: E1008 18:28:15.483846 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df\\\"\"" pod="openstack/dnsmasq-dns-644597f84c-58wp9" podUID="fdfa2238-e6b5-40ce-9945-ce12691e2ef6" Oct 08 18:28:15 crc kubenswrapper[4750]: I1008 18:28:15.612418 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 08 18:28:15 crc kubenswrapper[4750]: I1008 18:28:15.869573 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-758b79db4c-5s7sn" Oct 08 18:28:15 crc kubenswrapper[4750]: I1008 18:28:15.922665 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bfcb9d745-6c4pl" Oct 08 18:28:15 crc kubenswrapper[4750]: I1008 18:28:15.937444 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7402f445-a72e-4d45-bad2-d27f14188109-dns-svc\") pod \"7402f445-a72e-4d45-bad2-d27f14188109\" (UID: \"7402f445-a72e-4d45-bad2-d27f14188109\") " Oct 08 18:28:15 crc kubenswrapper[4750]: I1008 18:28:15.937529 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rs2rv\" (UniqueName: \"kubernetes.io/projected/7402f445-a72e-4d45-bad2-d27f14188109-kube-api-access-rs2rv\") pod \"7402f445-a72e-4d45-bad2-d27f14188109\" (UID: \"7402f445-a72e-4d45-bad2-d27f14188109\") " Oct 08 18:28:15 crc kubenswrapper[4750]: I1008 18:28:15.937580 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b6f5de9-2c2f-4669-8449-e60e7e750821-config\") pod \"2b6f5de9-2c2f-4669-8449-e60e7e750821\" (UID: \"2b6f5de9-2c2f-4669-8449-e60e7e750821\") " Oct 08 18:28:15 crc kubenswrapper[4750]: I1008 18:28:15.937613 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lll4q\" (UniqueName: \"kubernetes.io/projected/2b6f5de9-2c2f-4669-8449-e60e7e750821-kube-api-access-lll4q\") pod \"2b6f5de9-2c2f-4669-8449-e60e7e750821\" (UID: \"2b6f5de9-2c2f-4669-8449-e60e7e750821\") " Oct 08 18:28:15 crc kubenswrapper[4750]: I1008 18:28:15.937649 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7402f445-a72e-4d45-bad2-d27f14188109-config\") pod \"7402f445-a72e-4d45-bad2-d27f14188109\" (UID: \"7402f445-a72e-4d45-bad2-d27f14188109\") " Oct 08 18:28:15 crc kubenswrapper[4750]: I1008 18:28:15.938204 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/7402f445-a72e-4d45-bad2-d27f14188109-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7402f445-a72e-4d45-bad2-d27f14188109" (UID: "7402f445-a72e-4d45-bad2-d27f14188109"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:28:15 crc kubenswrapper[4750]: I1008 18:28:15.938285 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b6f5de9-2c2f-4669-8449-e60e7e750821-config" (OuterVolumeSpecName: "config") pod "2b6f5de9-2c2f-4669-8449-e60e7e750821" (UID: "2b6f5de9-2c2f-4669-8449-e60e7e750821"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:28:15 crc kubenswrapper[4750]: I1008 18:28:15.938453 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7402f445-a72e-4d45-bad2-d27f14188109-config" (OuterVolumeSpecName: "config") pod "7402f445-a72e-4d45-bad2-d27f14188109" (UID: "7402f445-a72e-4d45-bad2-d27f14188109"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:28:15 crc kubenswrapper[4750]: I1008 18:28:15.949743 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7402f445-a72e-4d45-bad2-d27f14188109-kube-api-access-rs2rv" (OuterVolumeSpecName: "kube-api-access-rs2rv") pod "7402f445-a72e-4d45-bad2-d27f14188109" (UID: "7402f445-a72e-4d45-bad2-d27f14188109"). InnerVolumeSpecName "kube-api-access-rs2rv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:28:15 crc kubenswrapper[4750]: I1008 18:28:15.951363 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b6f5de9-2c2f-4669-8449-e60e7e750821-kube-api-access-lll4q" (OuterVolumeSpecName: "kube-api-access-lll4q") pod "2b6f5de9-2c2f-4669-8449-e60e7e750821" (UID: "2b6f5de9-2c2f-4669-8449-e60e7e750821"). InnerVolumeSpecName "kube-api-access-lll4q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:28:16 crc kubenswrapper[4750]: I1008 18:28:16.039434 4750 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7402f445-a72e-4d45-bad2-d27f14188109-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 18:28:16 crc kubenswrapper[4750]: I1008 18:28:16.039470 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rs2rv\" (UniqueName: \"kubernetes.io/projected/7402f445-a72e-4d45-bad2-d27f14188109-kube-api-access-rs2rv\") on node \"crc\" DevicePath \"\"" Oct 08 18:28:16 crc kubenswrapper[4750]: I1008 18:28:16.039481 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b6f5de9-2c2f-4669-8449-e60e7e750821-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:28:16 crc kubenswrapper[4750]: I1008 18:28:16.039491 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lll4q\" (UniqueName: \"kubernetes.io/projected/2b6f5de9-2c2f-4669-8449-e60e7e750821-kube-api-access-lll4q\") on node \"crc\" DevicePath \"\"" Oct 08 18:28:16 crc kubenswrapper[4750]: I1008 18:28:16.039503 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7402f445-a72e-4d45-bad2-d27f14188109-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:28:16 crc kubenswrapper[4750]: I1008 18:28:16.489290 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"235930b1-672c-4fc6-bbb4-78204c591aee","Type":"ContainerStarted","Data":"e3299d677ec95752fdeedd1f2933d796db4410437bbff6b75df7cd49984a5ba6"} Oct 08 18:28:16 crc kubenswrapper[4750]: I1008 18:28:16.490215 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-758b79db4c-5s7sn" event={"ID":"7402f445-a72e-4d45-bad2-d27f14188109","Type":"ContainerDied","Data":"0a1ff67b2191fc1bd8210a8a2806885acc439ccb6be5cc6b50351120a604e870"} Oct 08 
18:28:16 crc kubenswrapper[4750]: I1008 18:28:16.490270 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-758b79db4c-5s7sn" Oct 08 18:28:16 crc kubenswrapper[4750]: I1008 18:28:16.491968 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bfcb9d745-6c4pl" event={"ID":"2b6f5de9-2c2f-4669-8449-e60e7e750821","Type":"ContainerDied","Data":"cd6850e9ae80ddabb6842fb8f08f31d87e67e4f52ae04d7f75aaa78bc0385454"} Oct 08 18:28:16 crc kubenswrapper[4750]: I1008 18:28:16.491998 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bfcb9d745-6c4pl" Oct 08 18:28:16 crc kubenswrapper[4750]: I1008 18:28:16.611716 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-758b79db4c-5s7sn"] Oct 08 18:28:16 crc kubenswrapper[4750]: I1008 18:28:16.617927 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-758b79db4c-5s7sn"] Oct 08 18:28:16 crc kubenswrapper[4750]: I1008 18:28:16.668669 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bfcb9d745-6c4pl"] Oct 08 18:28:16 crc kubenswrapper[4750]: I1008 18:28:16.698174 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7bfcb9d745-6c4pl"] Oct 08 18:28:16 crc kubenswrapper[4750]: I1008 18:28:16.751257 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b6f5de9-2c2f-4669-8449-e60e7e750821" path="/var/lib/kubelet/pods/2b6f5de9-2c2f-4669-8449-e60e7e750821/volumes" Oct 08 18:28:16 crc kubenswrapper[4750]: I1008 18:28:16.752071 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7402f445-a72e-4d45-bad2-d27f14188109" path="/var/lib/kubelet/pods/7402f445-a72e-4d45-bad2-d27f14188109/volumes" Oct 08 18:28:17 crc kubenswrapper[4750]: I1008 18:28:17.505563 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"2feb2439-d911-4585-a5e1-671abcfa357d","Type":"ContainerStarted","Data":"076a806e0646bdccd9c8e5d177a4fb67634db132f346e97d11108bf680223bae"} Oct 08 18:28:21 crc kubenswrapper[4750]: I1008 18:28:21.536270 4750 generic.go:334] "Generic (PLEG): container finished" podID="2feb2439-d911-4585-a5e1-671abcfa357d" containerID="076a806e0646bdccd9c8e5d177a4fb67634db132f346e97d11108bf680223bae" exitCode=0 Oct 08 18:28:21 crc kubenswrapper[4750]: I1008 18:28:21.536389 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2feb2439-d911-4585-a5e1-671abcfa357d","Type":"ContainerDied","Data":"076a806e0646bdccd9c8e5d177a4fb67634db132f346e97d11108bf680223bae"} Oct 08 18:28:21 crc kubenswrapper[4750]: I1008 18:28:21.544418 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b","Type":"ContainerStarted","Data":"7486e6d02d87823ff44073ee25ac3d9ce6156c4080b1bc61d97efc65116f101a"} Oct 08 18:28:22 crc kubenswrapper[4750]: I1008 18:28:22.555720 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9f1aa3e0-67d1-4e4d-b162-b21c9f185d3e","Type":"ContainerStarted","Data":"aff78b03ad9b7af1c0ed4fcc7ecb7e13bec75ee5635ed6c1396b62f06bda2e49"} Oct 08 18:28:22 crc kubenswrapper[4750]: I1008 18:28:22.556406 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 08 18:28:22 crc kubenswrapper[4750]: I1008 18:28:22.559030 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"ed7b2661-13bd-4ab4-a92d-7bc382cd257e","Type":"ContainerStarted","Data":"c099682680f99d7d836a4647a357e3b0557699173c713817a786fb3c6c7a1d59"} Oct 08 18:28:22 crc kubenswrapper[4750]: I1008 18:28:22.559121 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 08 18:28:22 crc kubenswrapper[4750]: 
I1008 18:28:22.561328 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"235930b1-672c-4fc6-bbb4-78204c591aee","Type":"ContainerStarted","Data":"e8ed3cdde57c4534decef125e57e81c8e9ddb189aeb9195d0d45761ded957615"} Oct 08 18:28:22 crc kubenswrapper[4750]: I1008 18:28:22.564216 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2feb2439-d911-4585-a5e1-671abcfa357d","Type":"ContainerStarted","Data":"131e51f9cac5dc8a9c0d425746e3b9ee7a9d53950bcb5870bac43328508a3398"} Oct 08 18:28:22 crc kubenswrapper[4750]: I1008 18:28:22.566416 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"590d851e-4648-48db-b385-aaa732f5c787","Type":"ContainerStarted","Data":"74257ca800c53f0c036fde4ead85c592a66b5ea7f444603ad7cbb66458ec4330"} Oct 08 18:28:22 crc kubenswrapper[4750]: I1008 18:28:22.571114 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mkxdr" event={"ID":"cc808a1a-9703-4009-8d81-e555a8e25929","Type":"ContainerStarted","Data":"207cabb8cb0acf23e4bf4e62948733f47bedb3c82673a966db5aa8e4ed65d14b"} Oct 08 18:28:22 crc kubenswrapper[4750]: I1008 18:28:22.571277 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-mkxdr" Oct 08 18:28:22 crc kubenswrapper[4750]: I1008 18:28:22.577085 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=30.025244054 podStartE2EDuration="37.577065556s" podCreationTimestamp="2025-10-08 18:27:45 +0000 UTC" firstStartedPulling="2025-10-08 18:28:14.189935027 +0000 UTC m=+1050.102906040" lastFinishedPulling="2025-10-08 18:28:21.741756529 +0000 UTC m=+1057.654727542" observedRunningTime="2025-10-08 18:28:22.576498423 +0000 UTC m=+1058.489469456" watchObservedRunningTime="2025-10-08 18:28:22.577065556 +0000 UTC m=+1058.490036579" Oct 08 18:28:22 crc 
kubenswrapper[4750]: I1008 18:28:22.581340 4750 generic.go:334] "Generic (PLEG): container finished" podID="e6709646-0141-474b-b73f-6f451e77f602" containerID="3fb6163d60b7b44978ac474d176e064c6f7849a97dbf8bf4f4109430670dad8a" exitCode=0 Oct 08 18:28:22 crc kubenswrapper[4750]: I1008 18:28:22.581398 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-57vgx" event={"ID":"e6709646-0141-474b-b73f-6f451e77f602","Type":"ContainerDied","Data":"3fb6163d60b7b44978ac474d176e064c6f7849a97dbf8bf4f4109430670dad8a"} Oct 08 18:28:22 crc kubenswrapper[4750]: I1008 18:28:22.601852 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=38.552940094 podStartE2EDuration="40.601835414s" podCreationTimestamp="2025-10-08 18:27:42 +0000 UTC" firstStartedPulling="2025-10-08 18:28:14.513580825 +0000 UTC m=+1050.426551838" lastFinishedPulling="2025-10-08 18:28:16.562476145 +0000 UTC m=+1052.475447158" observedRunningTime="2025-10-08 18:28:22.596109513 +0000 UTC m=+1058.509080526" watchObservedRunningTime="2025-10-08 18:28:22.601835414 +0000 UTC m=+1058.514806417" Oct 08 18:28:22 crc kubenswrapper[4750]: I1008 18:28:22.614925 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=33.618899069 podStartE2EDuration="39.61474661s" podCreationTimestamp="2025-10-08 18:27:43 +0000 UTC" firstStartedPulling="2025-10-08 18:28:14.257202327 +0000 UTC m=+1050.170173350" lastFinishedPulling="2025-10-08 18:28:20.253049878 +0000 UTC m=+1056.166020891" observedRunningTime="2025-10-08 18:28:22.614017362 +0000 UTC m=+1058.526988385" watchObservedRunningTime="2025-10-08 18:28:22.61474661 +0000 UTC m=+1058.527717633" Oct 08 18:28:22 crc kubenswrapper[4750]: I1008 18:28:22.636603 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-mkxdr" podStartSLOduration=26.705809821 
podStartE2EDuration="32.636582696s" podCreationTimestamp="2025-10-08 18:27:50 +0000 UTC" firstStartedPulling="2025-10-08 18:28:14.649906788 +0000 UTC m=+1050.562877801" lastFinishedPulling="2025-10-08 18:28:20.580679663 +0000 UTC m=+1056.493650676" observedRunningTime="2025-10-08 18:28:22.635901309 +0000 UTC m=+1058.548872322" watchObservedRunningTime="2025-10-08 18:28:22.636582696 +0000 UTC m=+1058.549553719" Oct 08 18:28:23 crc kubenswrapper[4750]: I1008 18:28:23.562057 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 08 18:28:23 crc kubenswrapper[4750]: I1008 18:28:23.562582 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 08 18:28:23 crc kubenswrapper[4750]: I1008 18:28:23.590786 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-57vgx" event={"ID":"e6709646-0141-474b-b73f-6f451e77f602","Type":"ContainerStarted","Data":"6320fe2c14d57de0be701a1d75ea1e50bc53072f92cd6178aeb5a15ddadb3e94"} Oct 08 18:28:23 crc kubenswrapper[4750]: I1008 18:28:23.590838 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-57vgx" event={"ID":"e6709646-0141-474b-b73f-6f451e77f602","Type":"ContainerStarted","Data":"afb689133a45aeb0be19ee70c244153e70840e64d2759842b263ebef6d8b33b2"} Oct 08 18:28:23 crc kubenswrapper[4750]: I1008 18:28:23.615455 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-57vgx" podStartSLOduration=27.76505036 podStartE2EDuration="33.615424303s" podCreationTimestamp="2025-10-08 18:27:50 +0000 UTC" firstStartedPulling="2025-10-08 18:28:14.73193273 +0000 UTC m=+1050.644903743" lastFinishedPulling="2025-10-08 18:28:20.582306673 +0000 UTC m=+1056.495277686" observedRunningTime="2025-10-08 18:28:23.609471126 +0000 UTC m=+1059.522442159" watchObservedRunningTime="2025-10-08 18:28:23.615424303 +0000 UTC 
m=+1059.528395316" Oct 08 18:28:24 crc kubenswrapper[4750]: I1008 18:28:24.600518 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-57vgx" Oct 08 18:28:24 crc kubenswrapper[4750]: I1008 18:28:24.600937 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-57vgx" Oct 08 18:28:25 crc kubenswrapper[4750]: I1008 18:28:25.609269 4750 generic.go:334] "Generic (PLEG): container finished" podID="cedba103-c6cd-4e9a-9c7c-80d90aaedb3b" containerID="7486e6d02d87823ff44073ee25ac3d9ce6156c4080b1bc61d97efc65116f101a" exitCode=0 Oct 08 18:28:25 crc kubenswrapper[4750]: I1008 18:28:25.609373 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b","Type":"ContainerDied","Data":"7486e6d02d87823ff44073ee25ac3d9ce6156c4080b1bc61d97efc65116f101a"} Oct 08 18:28:26 crc kubenswrapper[4750]: I1008 18:28:26.618224 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"235930b1-672c-4fc6-bbb4-78204c591aee","Type":"ContainerStarted","Data":"d8bac502dbc380032817c12bba89e9f6cc302b44cbc72b0455a5c8be03b7c626"} Oct 08 18:28:26 crc kubenswrapper[4750]: I1008 18:28:26.620697 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b","Type":"ContainerStarted","Data":"238e771ab2028f72d92800deb7a9dbb1d35bb3e07296801734338f1c8ed278bd"} Oct 08 18:28:26 crc kubenswrapper[4750]: I1008 18:28:26.622989 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"590d851e-4648-48db-b385-aaa732f5c787","Type":"ContainerStarted","Data":"e646c1e4d2aa9c18afad260eee7d15c40c5f3fc3f863380210aa6106334f6c32"} Oct 08 18:28:26 crc kubenswrapper[4750]: I1008 18:28:26.647834 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ovsdbserver-nb-0" podStartSLOduration=24.305246606 podStartE2EDuration="34.647818463s" podCreationTimestamp="2025-10-08 18:27:52 +0000 UTC" firstStartedPulling="2025-10-08 18:28:15.626340826 +0000 UTC m=+1051.539311839" lastFinishedPulling="2025-10-08 18:28:25.968912683 +0000 UTC m=+1061.881883696" observedRunningTime="2025-10-08 18:28:26.644491132 +0000 UTC m=+1062.557462155" watchObservedRunningTime="2025-10-08 18:28:26.647818463 +0000 UTC m=+1062.560789486" Oct 08 18:28:26 crc kubenswrapper[4750]: I1008 18:28:26.674222 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=24.938580813 podStartE2EDuration="35.67420038s" podCreationTimestamp="2025-10-08 18:27:51 +0000 UTC" firstStartedPulling="2025-10-08 18:28:15.246040119 +0000 UTC m=+1051.159011132" lastFinishedPulling="2025-10-08 18:28:25.981659656 +0000 UTC m=+1061.894630699" observedRunningTime="2025-10-08 18:28:26.668841199 +0000 UTC m=+1062.581812222" watchObservedRunningTime="2025-10-08 18:28:26.67420038 +0000 UTC m=+1062.587171403" Oct 08 18:28:26 crc kubenswrapper[4750]: I1008 18:28:26.697689 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=-9223371991.157114 podStartE2EDuration="45.697661326s" podCreationTimestamp="2025-10-08 18:27:41 +0000 UTC" firstStartedPulling="2025-10-08 18:27:43.013723847 +0000 UTC m=+1018.926694860" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:28:26.691403363 +0000 UTC m=+1062.604374386" watchObservedRunningTime="2025-10-08 18:28:26.697661326 +0000 UTC m=+1062.610632359" Oct 08 18:28:27 crc kubenswrapper[4750]: I1008 18:28:27.799715 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 08 18:28:28 crc kubenswrapper[4750]: I1008 18:28:28.582845 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ovsdbserver-nb-0" Oct 08 18:28:28 crc kubenswrapper[4750]: I1008 18:28:28.655622 4750 generic.go:334] "Generic (PLEG): container finished" podID="982c6d04-b346-4001-b6e5-2772413a1172" containerID="e0c976187cfd4586be20027444ce18bd67bd5505f9ef5d3065d49ceb58dddd51" exitCode=0 Oct 08 18:28:28 crc kubenswrapper[4750]: I1008 18:28:28.655708 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77597f887-jncg2" event={"ID":"982c6d04-b346-4001-b6e5-2772413a1172","Type":"ContainerDied","Data":"e0c976187cfd4586be20027444ce18bd67bd5505f9ef5d3065d49ceb58dddd51"} Oct 08 18:28:28 crc kubenswrapper[4750]: I1008 18:28:28.657353 4750 generic.go:334] "Generic (PLEG): container finished" podID="fdfa2238-e6b5-40ce-9945-ce12691e2ef6" containerID="e546b39fe688cdced3980b827f6351a12a4ada1384eb69bdb13bcec1ec8a5a25" exitCode=0 Oct 08 18:28:28 crc kubenswrapper[4750]: I1008 18:28:28.658184 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-644597f84c-58wp9" event={"ID":"fdfa2238-e6b5-40ce-9945-ce12691e2ef6","Type":"ContainerDied","Data":"e546b39fe688cdced3980b827f6351a12a4ada1384eb69bdb13bcec1ec8a5a25"} Oct 08 18:28:28 crc kubenswrapper[4750]: I1008 18:28:28.799275 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 08 18:28:28 crc kubenswrapper[4750]: I1008 18:28:28.863493 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 08 18:28:29 crc kubenswrapper[4750]: I1008 18:28:29.009692 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 08 18:28:29 crc kubenswrapper[4750]: I1008 18:28:29.583141 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 08 18:28:29 crc kubenswrapper[4750]: I1008 18:28:29.629084 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/ovsdbserver-nb-0" Oct 08 18:28:29 crc kubenswrapper[4750]: I1008 18:28:29.647494 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 08 18:28:29 crc kubenswrapper[4750]: I1008 18:28:29.667923 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77597f887-jncg2" event={"ID":"982c6d04-b346-4001-b6e5-2772413a1172","Type":"ContainerStarted","Data":"862f2ae1313c2a369876cff84c6c818eb9a7937d95d28315d05cf56ef2bf69b0"} Oct 08 18:28:29 crc kubenswrapper[4750]: I1008 18:28:29.668574 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77597f887-jncg2" Oct 08 18:28:29 crc kubenswrapper[4750]: I1008 18:28:29.670697 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-644597f84c-58wp9" event={"ID":"fdfa2238-e6b5-40ce-9945-ce12691e2ef6","Type":"ContainerStarted","Data":"955dbceda4337a0b3969ade627b61b40ada5b3d27c0047027d9e9c4f1b629dd4"} Oct 08 18:28:29 crc kubenswrapper[4750]: I1008 18:28:29.671611 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-644597f84c-58wp9" Oct 08 18:28:29 crc kubenswrapper[4750]: I1008 18:28:29.699204 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-644597f84c-58wp9" podStartSLOduration=3.542452694 podStartE2EDuration="51.69918638s" podCreationTimestamp="2025-10-08 18:27:38 +0000 UTC" firstStartedPulling="2025-10-08 18:27:39.367523132 +0000 UTC m=+1015.280494135" lastFinishedPulling="2025-10-08 18:28:27.524256808 +0000 UTC m=+1063.437227821" observedRunningTime="2025-10-08 18:28:29.69101313 +0000 UTC m=+1065.603984153" watchObservedRunningTime="2025-10-08 18:28:29.69918638 +0000 UTC m=+1065.612157393" Oct 08 18:28:29 crc kubenswrapper[4750]: I1008 18:28:29.703645 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 
08 18:28:29 crc kubenswrapper[4750]: I1008 18:28:29.712472 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77597f887-jncg2" podStartSLOduration=3.315622226 podStartE2EDuration="50.712457845s" podCreationTimestamp="2025-10-08 18:27:39 +0000 UTC" firstStartedPulling="2025-10-08 18:27:40.226863388 +0000 UTC m=+1016.139834391" lastFinishedPulling="2025-10-08 18:28:27.623698997 +0000 UTC m=+1063.536670010" observedRunningTime="2025-10-08 18:28:29.706986961 +0000 UTC m=+1065.619957974" watchObservedRunningTime="2025-10-08 18:28:29.712457845 +0000 UTC m=+1065.625428858" Oct 08 18:28:29 crc kubenswrapper[4750]: I1008 18:28:29.712851 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 08 18:28:29 crc kubenswrapper[4750]: I1008 18:28:29.714715 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 08 18:28:29 crc kubenswrapper[4750]: I1008 18:28:29.963719 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-644597f84c-58wp9"] Oct 08 18:28:29 crc kubenswrapper[4750]: I1008 18:28:29.996632 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f6b595d95-ds2hc"] Oct 08 18:28:29 crc kubenswrapper[4750]: I1008 18:28:29.997978 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f6b595d95-ds2hc" Oct 08 18:28:29 crc kubenswrapper[4750]: I1008 18:28:29.999708 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.009934 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f6b595d95-ds2hc"] Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.087922 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5qkq\" (UniqueName: \"kubernetes.io/projected/6c6270bb-d80a-45b1-bc51-44f1c920863e-kube-api-access-d5qkq\") pod \"dnsmasq-dns-f6b595d95-ds2hc\" (UID: \"6c6270bb-d80a-45b1-bc51-44f1c920863e\") " pod="openstack/dnsmasq-dns-f6b595d95-ds2hc" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.088000 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c6270bb-d80a-45b1-bc51-44f1c920863e-config\") pod \"dnsmasq-dns-f6b595d95-ds2hc\" (UID: \"6c6270bb-d80a-45b1-bc51-44f1c920863e\") " pod="openstack/dnsmasq-dns-f6b595d95-ds2hc" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.088256 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c6270bb-d80a-45b1-bc51-44f1c920863e-ovsdbserver-nb\") pod \"dnsmasq-dns-f6b595d95-ds2hc\" (UID: \"6c6270bb-d80a-45b1-bc51-44f1c920863e\") " pod="openstack/dnsmasq-dns-f6b595d95-ds2hc" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.088321 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c6270bb-d80a-45b1-bc51-44f1c920863e-dns-svc\") pod \"dnsmasq-dns-f6b595d95-ds2hc\" (UID: \"6c6270bb-d80a-45b1-bc51-44f1c920863e\") " 
pod="openstack/dnsmasq-dns-f6b595d95-ds2hc" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.116025 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-6ghsk"] Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.116957 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-6ghsk" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.119251 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.125155 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-6ghsk"] Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.186685 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77597f887-jncg2"] Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.189699 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c6270bb-d80a-45b1-bc51-44f1c920863e-ovsdbserver-nb\") pod \"dnsmasq-dns-f6b595d95-ds2hc\" (UID: \"6c6270bb-d80a-45b1-bc51-44f1c920863e\") " pod="openstack/dnsmasq-dns-f6b595d95-ds2hc" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.189759 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c6270bb-d80a-45b1-bc51-44f1c920863e-dns-svc\") pod \"dnsmasq-dns-f6b595d95-ds2hc\" (UID: \"6c6270bb-d80a-45b1-bc51-44f1c920863e\") " pod="openstack/dnsmasq-dns-f6b595d95-ds2hc" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.189808 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbx7d\" (UniqueName: \"kubernetes.io/projected/7c3207de-78eb-41b2-a2be-163c9a3532af-kube-api-access-mbx7d\") pod \"ovn-controller-metrics-6ghsk\" (UID: 
\"7c3207de-78eb-41b2-a2be-163c9a3532af\") " pod="openstack/ovn-controller-metrics-6ghsk" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.189843 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5qkq\" (UniqueName: \"kubernetes.io/projected/6c6270bb-d80a-45b1-bc51-44f1c920863e-kube-api-access-d5qkq\") pod \"dnsmasq-dns-f6b595d95-ds2hc\" (UID: \"6c6270bb-d80a-45b1-bc51-44f1c920863e\") " pod="openstack/dnsmasq-dns-f6b595d95-ds2hc" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.189871 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/7c3207de-78eb-41b2-a2be-163c9a3532af-ovs-rundir\") pod \"ovn-controller-metrics-6ghsk\" (UID: \"7c3207de-78eb-41b2-a2be-163c9a3532af\") " pod="openstack/ovn-controller-metrics-6ghsk" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.189901 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c3207de-78eb-41b2-a2be-163c9a3532af-config\") pod \"ovn-controller-metrics-6ghsk\" (UID: \"7c3207de-78eb-41b2-a2be-163c9a3532af\") " pod="openstack/ovn-controller-metrics-6ghsk" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.189930 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c3207de-78eb-41b2-a2be-163c9a3532af-combined-ca-bundle\") pod \"ovn-controller-metrics-6ghsk\" (UID: \"7c3207de-78eb-41b2-a2be-163c9a3532af\") " pod="openstack/ovn-controller-metrics-6ghsk" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.189946 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c6270bb-d80a-45b1-bc51-44f1c920863e-config\") pod \"dnsmasq-dns-f6b595d95-ds2hc\" (UID: 
\"6c6270bb-d80a-45b1-bc51-44f1c920863e\") " pod="openstack/dnsmasq-dns-f6b595d95-ds2hc" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.189963 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c3207de-78eb-41b2-a2be-163c9a3532af-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6ghsk\" (UID: \"7c3207de-78eb-41b2-a2be-163c9a3532af\") " pod="openstack/ovn-controller-metrics-6ghsk" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.189979 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/7c3207de-78eb-41b2-a2be-163c9a3532af-ovn-rundir\") pod \"ovn-controller-metrics-6ghsk\" (UID: \"7c3207de-78eb-41b2-a2be-163c9a3532af\") " pod="openstack/ovn-controller-metrics-6ghsk" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.192191 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c6270bb-d80a-45b1-bc51-44f1c920863e-ovsdbserver-nb\") pod \"dnsmasq-dns-f6b595d95-ds2hc\" (UID: \"6c6270bb-d80a-45b1-bc51-44f1c920863e\") " pod="openstack/dnsmasq-dns-f6b595d95-ds2hc" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.192865 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c6270bb-d80a-45b1-bc51-44f1c920863e-config\") pod \"dnsmasq-dns-f6b595d95-ds2hc\" (UID: \"6c6270bb-d80a-45b1-bc51-44f1c920863e\") " pod="openstack/dnsmasq-dns-f6b595d95-ds2hc" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.193036 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c6270bb-d80a-45b1-bc51-44f1c920863e-dns-svc\") pod \"dnsmasq-dns-f6b595d95-ds2hc\" (UID: \"6c6270bb-d80a-45b1-bc51-44f1c920863e\") " 
pod="openstack/dnsmasq-dns-f6b595d95-ds2hc" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.215920 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-dc9d58d7-5h7nt"] Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.220226 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dc9d58d7-5h7nt" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.220395 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5qkq\" (UniqueName: \"kubernetes.io/projected/6c6270bb-d80a-45b1-bc51-44f1c920863e-kube-api-access-d5qkq\") pod \"dnsmasq-dns-f6b595d95-ds2hc\" (UID: \"6c6270bb-d80a-45b1-bc51-44f1c920863e\") " pod="openstack/dnsmasq-dns-f6b595d95-ds2hc" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.228846 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dc9d58d7-5h7nt"] Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.230576 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.251035 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.252267 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.261244 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.263662 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.263912 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-h58vj" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.264072 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.291101 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/230c02f8-af60-40d6-af19-adf730eec43f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"230c02f8-af60-40d6-af19-adf730eec43f\") " pod="openstack/ovn-northd-0" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.291155 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd-dns-svc\") pod \"dnsmasq-dns-dc9d58d7-5h7nt\" (UID: \"cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd\") " pod="openstack/dnsmasq-dns-dc9d58d7-5h7nt" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.291179 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd-config\") pod \"dnsmasq-dns-dc9d58d7-5h7nt\" (UID: \"cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd\") " pod="openstack/dnsmasq-dns-dc9d58d7-5h7nt" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.291222 4750 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k445j\" (UniqueName: \"kubernetes.io/projected/cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd-kube-api-access-k445j\") pod \"dnsmasq-dns-dc9d58d7-5h7nt\" (UID: \"cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd\") " pod="openstack/dnsmasq-dns-dc9d58d7-5h7nt" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.291246 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/230c02f8-af60-40d6-af19-adf730eec43f-config\") pod \"ovn-northd-0\" (UID: \"230c02f8-af60-40d6-af19-adf730eec43f\") " pod="openstack/ovn-northd-0" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.291261 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlfz8\" (UniqueName: \"kubernetes.io/projected/230c02f8-af60-40d6-af19-adf730eec43f-kube-api-access-xlfz8\") pod \"ovn-northd-0\" (UID: \"230c02f8-af60-40d6-af19-adf730eec43f\") " pod="openstack/ovn-northd-0" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.291287 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbx7d\" (UniqueName: \"kubernetes.io/projected/7c3207de-78eb-41b2-a2be-163c9a3532af-kube-api-access-mbx7d\") pod \"ovn-controller-metrics-6ghsk\" (UID: \"7c3207de-78eb-41b2-a2be-163c9a3532af\") " pod="openstack/ovn-controller-metrics-6ghsk" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.291309 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/230c02f8-af60-40d6-af19-adf730eec43f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"230c02f8-af60-40d6-af19-adf730eec43f\") " pod="openstack/ovn-northd-0" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.291348 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/7c3207de-78eb-41b2-a2be-163c9a3532af-ovs-rundir\") pod \"ovn-controller-metrics-6ghsk\" (UID: \"7c3207de-78eb-41b2-a2be-163c9a3532af\") " pod="openstack/ovn-controller-metrics-6ghsk" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.291366 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/230c02f8-af60-40d6-af19-adf730eec43f-scripts\") pod \"ovn-northd-0\" (UID: \"230c02f8-af60-40d6-af19-adf730eec43f\") " pod="openstack/ovn-northd-0" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.291384 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd-ovsdbserver-nb\") pod \"dnsmasq-dns-dc9d58d7-5h7nt\" (UID: \"cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd\") " pod="openstack/dnsmasq-dns-dc9d58d7-5h7nt" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.291397 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd-ovsdbserver-sb\") pod \"dnsmasq-dns-dc9d58d7-5h7nt\" (UID: \"cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd\") " pod="openstack/dnsmasq-dns-dc9d58d7-5h7nt" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.291418 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/230c02f8-af60-40d6-af19-adf730eec43f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"230c02f8-af60-40d6-af19-adf730eec43f\") " pod="openstack/ovn-northd-0" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.291438 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7c3207de-78eb-41b2-a2be-163c9a3532af-config\") pod \"ovn-controller-metrics-6ghsk\" (UID: \"7c3207de-78eb-41b2-a2be-163c9a3532af\") " pod="openstack/ovn-controller-metrics-6ghsk" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.291452 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c3207de-78eb-41b2-a2be-163c9a3532af-combined-ca-bundle\") pod \"ovn-controller-metrics-6ghsk\" (UID: \"7c3207de-78eb-41b2-a2be-163c9a3532af\") " pod="openstack/ovn-controller-metrics-6ghsk" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.291468 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/230c02f8-af60-40d6-af19-adf730eec43f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"230c02f8-af60-40d6-af19-adf730eec43f\") " pod="openstack/ovn-northd-0" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.291492 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c3207de-78eb-41b2-a2be-163c9a3532af-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6ghsk\" (UID: \"7c3207de-78eb-41b2-a2be-163c9a3532af\") " pod="openstack/ovn-controller-metrics-6ghsk" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.291513 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/7c3207de-78eb-41b2-a2be-163c9a3532af-ovn-rundir\") pod \"ovn-controller-metrics-6ghsk\" (UID: \"7c3207de-78eb-41b2-a2be-163c9a3532af\") " pod="openstack/ovn-controller-metrics-6ghsk" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.291836 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/host-path/7c3207de-78eb-41b2-a2be-163c9a3532af-ovn-rundir\") pod \"ovn-controller-metrics-6ghsk\" (UID: \"7c3207de-78eb-41b2-a2be-163c9a3532af\") " pod="openstack/ovn-controller-metrics-6ghsk" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.291840 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/7c3207de-78eb-41b2-a2be-163c9a3532af-ovs-rundir\") pod \"ovn-controller-metrics-6ghsk\" (UID: \"7c3207de-78eb-41b2-a2be-163c9a3532af\") " pod="openstack/ovn-controller-metrics-6ghsk" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.292399 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c3207de-78eb-41b2-a2be-163c9a3532af-config\") pod \"ovn-controller-metrics-6ghsk\" (UID: \"7c3207de-78eb-41b2-a2be-163c9a3532af\") " pod="openstack/ovn-controller-metrics-6ghsk" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.295206 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c3207de-78eb-41b2-a2be-163c9a3532af-combined-ca-bundle\") pod \"ovn-controller-metrics-6ghsk\" (UID: \"7c3207de-78eb-41b2-a2be-163c9a3532af\") " pod="openstack/ovn-controller-metrics-6ghsk" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.311345 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c3207de-78eb-41b2-a2be-163c9a3532af-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6ghsk\" (UID: \"7c3207de-78eb-41b2-a2be-163c9a3532af\") " pod="openstack/ovn-controller-metrics-6ghsk" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.315975 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbx7d\" (UniqueName: \"kubernetes.io/projected/7c3207de-78eb-41b2-a2be-163c9a3532af-kube-api-access-mbx7d\") pod 
\"ovn-controller-metrics-6ghsk\" (UID: \"7c3207de-78eb-41b2-a2be-163c9a3532af\") " pod="openstack/ovn-controller-metrics-6ghsk" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.326354 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.355384 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f6b595d95-ds2hc" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.392656 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/230c02f8-af60-40d6-af19-adf730eec43f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"230c02f8-af60-40d6-af19-adf730eec43f\") " pod="openstack/ovn-northd-0" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.392709 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd-dns-svc\") pod \"dnsmasq-dns-dc9d58d7-5h7nt\" (UID: \"cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd\") " pod="openstack/dnsmasq-dns-dc9d58d7-5h7nt" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.392736 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd-config\") pod \"dnsmasq-dns-dc9d58d7-5h7nt\" (UID: \"cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd\") " pod="openstack/dnsmasq-dns-dc9d58d7-5h7nt" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.392765 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k445j\" (UniqueName: \"kubernetes.io/projected/cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd-kube-api-access-k445j\") pod \"dnsmasq-dns-dc9d58d7-5h7nt\" (UID: \"cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd\") " pod="openstack/dnsmasq-dns-dc9d58d7-5h7nt" Oct 08 18:28:30 crc 
kubenswrapper[4750]: I1008 18:28:30.392788 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/230c02f8-af60-40d6-af19-adf730eec43f-config\") pod \"ovn-northd-0\" (UID: \"230c02f8-af60-40d6-af19-adf730eec43f\") " pod="openstack/ovn-northd-0" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.392803 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlfz8\" (UniqueName: \"kubernetes.io/projected/230c02f8-af60-40d6-af19-adf730eec43f-kube-api-access-xlfz8\") pod \"ovn-northd-0\" (UID: \"230c02f8-af60-40d6-af19-adf730eec43f\") " pod="openstack/ovn-northd-0" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.392828 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/230c02f8-af60-40d6-af19-adf730eec43f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"230c02f8-af60-40d6-af19-adf730eec43f\") " pod="openstack/ovn-northd-0" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.392861 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/230c02f8-af60-40d6-af19-adf730eec43f-scripts\") pod \"ovn-northd-0\" (UID: \"230c02f8-af60-40d6-af19-adf730eec43f\") " pod="openstack/ovn-northd-0" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.392881 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd-ovsdbserver-nb\") pod \"dnsmasq-dns-dc9d58d7-5h7nt\" (UID: \"cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd\") " pod="openstack/dnsmasq-dns-dc9d58d7-5h7nt" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.392898 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd-ovsdbserver-sb\") pod \"dnsmasq-dns-dc9d58d7-5h7nt\" (UID: \"cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd\") " pod="openstack/dnsmasq-dns-dc9d58d7-5h7nt" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.392917 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/230c02f8-af60-40d6-af19-adf730eec43f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"230c02f8-af60-40d6-af19-adf730eec43f\") " pod="openstack/ovn-northd-0" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.392937 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/230c02f8-af60-40d6-af19-adf730eec43f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"230c02f8-af60-40d6-af19-adf730eec43f\") " pod="openstack/ovn-northd-0" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.393428 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/230c02f8-af60-40d6-af19-adf730eec43f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"230c02f8-af60-40d6-af19-adf730eec43f\") " pod="openstack/ovn-northd-0" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.394175 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd-ovsdbserver-sb\") pod \"dnsmasq-dns-dc9d58d7-5h7nt\" (UID: \"cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd\") " pod="openstack/dnsmasq-dns-dc9d58d7-5h7nt" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.394379 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd-config\") pod \"dnsmasq-dns-dc9d58d7-5h7nt\" (UID: \"cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd\") " 
pod="openstack/dnsmasq-dns-dc9d58d7-5h7nt" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.394646 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/230c02f8-af60-40d6-af19-adf730eec43f-scripts\") pod \"ovn-northd-0\" (UID: \"230c02f8-af60-40d6-af19-adf730eec43f\") " pod="openstack/ovn-northd-0" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.395828 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd-ovsdbserver-nb\") pod \"dnsmasq-dns-dc9d58d7-5h7nt\" (UID: \"cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd\") " pod="openstack/dnsmasq-dns-dc9d58d7-5h7nt" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.396488 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd-dns-svc\") pod \"dnsmasq-dns-dc9d58d7-5h7nt\" (UID: \"cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd\") " pod="openstack/dnsmasq-dns-dc9d58d7-5h7nt" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.397256 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/230c02f8-af60-40d6-af19-adf730eec43f-config\") pod \"ovn-northd-0\" (UID: \"230c02f8-af60-40d6-af19-adf730eec43f\") " pod="openstack/ovn-northd-0" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.397377 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/230c02f8-af60-40d6-af19-adf730eec43f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"230c02f8-af60-40d6-af19-adf730eec43f\") " pod="openstack/ovn-northd-0" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.398506 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/230c02f8-af60-40d6-af19-adf730eec43f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"230c02f8-af60-40d6-af19-adf730eec43f\") " pod="openstack/ovn-northd-0" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.406070 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/230c02f8-af60-40d6-af19-adf730eec43f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"230c02f8-af60-40d6-af19-adf730eec43f\") " pod="openstack/ovn-northd-0" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.408824 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k445j\" (UniqueName: \"kubernetes.io/projected/cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd-kube-api-access-k445j\") pod \"dnsmasq-dns-dc9d58d7-5h7nt\" (UID: \"cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd\") " pod="openstack/dnsmasq-dns-dc9d58d7-5h7nt" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.409228 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlfz8\" (UniqueName: \"kubernetes.io/projected/230c02f8-af60-40d6-af19-adf730eec43f-kube-api-access-xlfz8\") pod \"ovn-northd-0\" (UID: \"230c02f8-af60-40d6-af19-adf730eec43f\") " pod="openstack/ovn-northd-0" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.437690 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-6ghsk" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.574185 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dc9d58d7-5h7nt" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.583974 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.789730 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f6b595d95-ds2hc"] Oct 08 18:28:30 crc kubenswrapper[4750]: W1008 18:28:30.795191 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c6270bb_d80a_45b1_bc51_44f1c920863e.slice/crio-ce0cde6782310a58dcb913b81a036cf7996e3f38221be3d762221681c99021fd WatchSource:0}: Error finding container ce0cde6782310a58dcb913b81a036cf7996e3f38221be3d762221681c99021fd: Status 404 returned error can't find the container with id ce0cde6782310a58dcb913b81a036cf7996e3f38221be3d762221681c99021fd Oct 08 18:28:30 crc kubenswrapper[4750]: I1008 18:28:30.900201 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-6ghsk"] Oct 08 18:28:31 crc kubenswrapper[4750]: I1008 18:28:31.037377 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dc9d58d7-5h7nt"] Oct 08 18:28:31 crc kubenswrapper[4750]: W1008 18:28:31.079751 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd42d4ae_ae85_4f2b_bcc3_57df8300c5bd.slice/crio-05c7a8628fa174dfb5c12954721761f57d82a6f37af2c9ca2f2c3b10b11c8ffe WatchSource:0}: Error finding container 05c7a8628fa174dfb5c12954721761f57d82a6f37af2c9ca2f2c3b10b11c8ffe: Status 404 returned error can't find the container with id 05c7a8628fa174dfb5c12954721761f57d82a6f37af2c9ca2f2c3b10b11c8ffe Oct 08 18:28:31 crc kubenswrapper[4750]: I1008 18:28:31.091661 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 08 18:28:31 crc kubenswrapper[4750]: I1008 18:28:31.685047 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"230c02f8-af60-40d6-af19-adf730eec43f","Type":"ContainerStarted","Data":"99f52a6ea71a16bfea8afdd0ee3f47f3836723b2c976614726111d0b57bb101f"} Oct 08 18:28:31 crc kubenswrapper[4750]: I1008 18:28:31.687276 4750 generic.go:334] "Generic (PLEG): container finished" podID="cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd" containerID="e6abeee43df4d68657ce1169ed98a35dc66805a2c36a59daad5eaa8caaac57cb" exitCode=0 Oct 08 18:28:31 crc kubenswrapper[4750]: I1008 18:28:31.687319 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dc9d58d7-5h7nt" event={"ID":"cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd","Type":"ContainerDied","Data":"e6abeee43df4d68657ce1169ed98a35dc66805a2c36a59daad5eaa8caaac57cb"} Oct 08 18:28:31 crc kubenswrapper[4750]: I1008 18:28:31.687335 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dc9d58d7-5h7nt" event={"ID":"cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd","Type":"ContainerStarted","Data":"05c7a8628fa174dfb5c12954721761f57d82a6f37af2c9ca2f2c3b10b11c8ffe"} Oct 08 18:28:31 crc kubenswrapper[4750]: I1008 18:28:31.690218 4750 generic.go:334] "Generic (PLEG): container finished" podID="6c6270bb-d80a-45b1-bc51-44f1c920863e" containerID="3ab5a7fa33878a0fe750c66dbfd675bdb4e428f4b249383ff705b0ac29414e3b" exitCode=0 Oct 08 18:28:31 crc kubenswrapper[4750]: I1008 18:28:31.690258 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f6b595d95-ds2hc" event={"ID":"6c6270bb-d80a-45b1-bc51-44f1c920863e","Type":"ContainerDied","Data":"3ab5a7fa33878a0fe750c66dbfd675bdb4e428f4b249383ff705b0ac29414e3b"} Oct 08 18:28:31 crc kubenswrapper[4750]: I1008 18:28:31.690279 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f6b595d95-ds2hc" event={"ID":"6c6270bb-d80a-45b1-bc51-44f1c920863e","Type":"ContainerStarted","Data":"ce0cde6782310a58dcb913b81a036cf7996e3f38221be3d762221681c99021fd"} Oct 08 18:28:31 crc kubenswrapper[4750]: I1008 18:28:31.694595 4750 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6ghsk" event={"ID":"7c3207de-78eb-41b2-a2be-163c9a3532af","Type":"ContainerStarted","Data":"08ee56a39f8f46204120acce7a7bf8c28c87e2b709294c817fd0c13ee4d4a7a9"} Oct 08 18:28:31 crc kubenswrapper[4750]: I1008 18:28:31.694632 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6ghsk" event={"ID":"7c3207de-78eb-41b2-a2be-163c9a3532af","Type":"ContainerStarted","Data":"705442df5471d8d9123ad6eb4f460952fe7c2f7c1d2b7f92093a2813227350d6"} Oct 08 18:28:31 crc kubenswrapper[4750]: I1008 18:28:31.694865 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-644597f84c-58wp9" podUID="fdfa2238-e6b5-40ce-9945-ce12691e2ef6" containerName="dnsmasq-dns" containerID="cri-o://955dbceda4337a0b3969ade627b61b40ada5b3d27c0047027d9e9c4f1b629dd4" gracePeriod=10 Oct 08 18:28:31 crc kubenswrapper[4750]: I1008 18:28:31.695707 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77597f887-jncg2" podUID="982c6d04-b346-4001-b6e5-2772413a1172" containerName="dnsmasq-dns" containerID="cri-o://862f2ae1313c2a369876cff84c6c818eb9a7937d95d28315d05cf56ef2bf69b0" gracePeriod=10 Oct 08 18:28:31 crc kubenswrapper[4750]: I1008 18:28:31.781610 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-6ghsk" podStartSLOduration=1.781593112 podStartE2EDuration="1.781593112s" podCreationTimestamp="2025-10-08 18:28:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:28:31.739696534 +0000 UTC m=+1067.652667557" watchObservedRunningTime="2025-10-08 18:28:31.781593112 +0000 UTC m=+1067.694564125" Oct 08 18:28:32 crc kubenswrapper[4750]: I1008 18:28:32.501390 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/openstack-galera-0" Oct 08 18:28:32 crc kubenswrapper[4750]: I1008 18:28:32.501696 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 08 18:28:32 crc kubenswrapper[4750]: I1008 18:28:32.568149 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 08 18:28:32 crc kubenswrapper[4750]: I1008 18:28:32.689848 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77597f887-jncg2" Oct 08 18:28:32 crc kubenswrapper[4750]: I1008 18:28:32.714459 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dc9d58d7-5h7nt" event={"ID":"cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd","Type":"ContainerStarted","Data":"dcce81343d71fda6f550b7284c38e3270c1494635cb276a8ed8f9fc53d7536e1"} Oct 08 18:28:32 crc kubenswrapper[4750]: I1008 18:28:32.714987 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-dc9d58d7-5h7nt" Oct 08 18:28:32 crc kubenswrapper[4750]: I1008 18:28:32.726566 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f6b595d95-ds2hc" event={"ID":"6c6270bb-d80a-45b1-bc51-44f1c920863e","Type":"ContainerStarted","Data":"ef2d43b0a1e08b5ce925b8ca370d098ac4420274e14debff61bbb8eb0b883e2b"} Oct 08 18:28:32 crc kubenswrapper[4750]: I1008 18:28:32.726850 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-f6b595d95-ds2hc" Oct 08 18:28:32 crc kubenswrapper[4750]: I1008 18:28:32.731795 4750 generic.go:334] "Generic (PLEG): container finished" podID="982c6d04-b346-4001-b6e5-2772413a1172" containerID="862f2ae1313c2a369876cff84c6c818eb9a7937d95d28315d05cf56ef2bf69b0" exitCode=0 Oct 08 18:28:32 crc kubenswrapper[4750]: I1008 18:28:32.731884 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77597f887-jncg2" 
event={"ID":"982c6d04-b346-4001-b6e5-2772413a1172","Type":"ContainerDied","Data":"862f2ae1313c2a369876cff84c6c818eb9a7937d95d28315d05cf56ef2bf69b0"} Oct 08 18:28:32 crc kubenswrapper[4750]: I1008 18:28:32.731903 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77597f887-jncg2" event={"ID":"982c6d04-b346-4001-b6e5-2772413a1172","Type":"ContainerDied","Data":"44c61f97ff03820185af3b1e8275235dd51895bec84254bd2bf912a5331af6ce"} Oct 08 18:28:32 crc kubenswrapper[4750]: I1008 18:28:32.731923 4750 scope.go:117] "RemoveContainer" containerID="862f2ae1313c2a369876cff84c6c818eb9a7937d95d28315d05cf56ef2bf69b0" Oct 08 18:28:32 crc kubenswrapper[4750]: I1008 18:28:32.732011 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77597f887-jncg2" Oct 08 18:28:32 crc kubenswrapper[4750]: I1008 18:28:32.735501 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-dc9d58d7-5h7nt" podStartSLOduration=2.735490707 podStartE2EDuration="2.735490707s" podCreationTimestamp="2025-10-08 18:28:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:28:32.733475168 +0000 UTC m=+1068.646446191" watchObservedRunningTime="2025-10-08 18:28:32.735490707 +0000 UTC m=+1068.648461720" Oct 08 18:28:32 crc kubenswrapper[4750]: I1008 18:28:32.740764 4750 generic.go:334] "Generic (PLEG): container finished" podID="fdfa2238-e6b5-40ce-9945-ce12691e2ef6" containerID="955dbceda4337a0b3969ade627b61b40ada5b3d27c0047027d9e9c4f1b629dd4" exitCode=0 Oct 08 18:28:32 crc kubenswrapper[4750]: I1008 18:28:32.749421 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-644597f84c-58wp9" event={"ID":"fdfa2238-e6b5-40ce-9945-ce12691e2ef6","Type":"ContainerDied","Data":"955dbceda4337a0b3969ade627b61b40ada5b3d27c0047027d9e9c4f1b629dd4"} Oct 08 18:28:32 crc 
kubenswrapper[4750]: I1008 18:28:32.758438 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-f6b595d95-ds2hc" podStartSLOduration=3.7584195400000002 podStartE2EDuration="3.75841954s" podCreationTimestamp="2025-10-08 18:28:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:28:32.754085433 +0000 UTC m=+1068.667056476" watchObservedRunningTime="2025-10-08 18:28:32.75841954 +0000 UTC m=+1068.671390553" Oct 08 18:28:32 crc kubenswrapper[4750]: I1008 18:28:32.772514 4750 scope.go:117] "RemoveContainer" containerID="e0c976187cfd4586be20027444ce18bd67bd5505f9ef5d3065d49ceb58dddd51" Oct 08 18:28:32 crc kubenswrapper[4750]: I1008 18:28:32.785383 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 08 18:28:32 crc kubenswrapper[4750]: I1008 18:28:32.833142 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xs8q\" (UniqueName: \"kubernetes.io/projected/982c6d04-b346-4001-b6e5-2772413a1172-kube-api-access-9xs8q\") pod \"982c6d04-b346-4001-b6e5-2772413a1172\" (UID: \"982c6d04-b346-4001-b6e5-2772413a1172\") " Oct 08 18:28:32 crc kubenswrapper[4750]: I1008 18:28:32.833275 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/982c6d04-b346-4001-b6e5-2772413a1172-config\") pod \"982c6d04-b346-4001-b6e5-2772413a1172\" (UID: \"982c6d04-b346-4001-b6e5-2772413a1172\") " Oct 08 18:28:32 crc kubenswrapper[4750]: I1008 18:28:32.833334 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/982c6d04-b346-4001-b6e5-2772413a1172-dns-svc\") pod \"982c6d04-b346-4001-b6e5-2772413a1172\" (UID: \"982c6d04-b346-4001-b6e5-2772413a1172\") " Oct 08 18:28:32 crc kubenswrapper[4750]: I1008 
18:28:32.840160 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/982c6d04-b346-4001-b6e5-2772413a1172-kube-api-access-9xs8q" (OuterVolumeSpecName: "kube-api-access-9xs8q") pod "982c6d04-b346-4001-b6e5-2772413a1172" (UID: "982c6d04-b346-4001-b6e5-2772413a1172"). InnerVolumeSpecName "kube-api-access-9xs8q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:28:32 crc kubenswrapper[4750]: I1008 18:28:32.879764 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/982c6d04-b346-4001-b6e5-2772413a1172-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "982c6d04-b346-4001-b6e5-2772413a1172" (UID: "982c6d04-b346-4001-b6e5-2772413a1172"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:28:32 crc kubenswrapper[4750]: I1008 18:28:32.880771 4750 scope.go:117] "RemoveContainer" containerID="862f2ae1313c2a369876cff84c6c818eb9a7937d95d28315d05cf56ef2bf69b0" Oct 08 18:28:32 crc kubenswrapper[4750]: E1008 18:28:32.881177 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"862f2ae1313c2a369876cff84c6c818eb9a7937d95d28315d05cf56ef2bf69b0\": container with ID starting with 862f2ae1313c2a369876cff84c6c818eb9a7937d95d28315d05cf56ef2bf69b0 not found: ID does not exist" containerID="862f2ae1313c2a369876cff84c6c818eb9a7937d95d28315d05cf56ef2bf69b0" Oct 08 18:28:32 crc kubenswrapper[4750]: I1008 18:28:32.881213 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"862f2ae1313c2a369876cff84c6c818eb9a7937d95d28315d05cf56ef2bf69b0"} err="failed to get container status \"862f2ae1313c2a369876cff84c6c818eb9a7937d95d28315d05cf56ef2bf69b0\": rpc error: code = NotFound desc = could not find container \"862f2ae1313c2a369876cff84c6c818eb9a7937d95d28315d05cf56ef2bf69b0\": container with ID starting with 
862f2ae1313c2a369876cff84c6c818eb9a7937d95d28315d05cf56ef2bf69b0 not found: ID does not exist" Oct 08 18:28:32 crc kubenswrapper[4750]: I1008 18:28:32.881232 4750 scope.go:117] "RemoveContainer" containerID="e0c976187cfd4586be20027444ce18bd67bd5505f9ef5d3065d49ceb58dddd51" Oct 08 18:28:32 crc kubenswrapper[4750]: E1008 18:28:32.882913 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0c976187cfd4586be20027444ce18bd67bd5505f9ef5d3065d49ceb58dddd51\": container with ID starting with e0c976187cfd4586be20027444ce18bd67bd5505f9ef5d3065d49ceb58dddd51 not found: ID does not exist" containerID="e0c976187cfd4586be20027444ce18bd67bd5505f9ef5d3065d49ceb58dddd51" Oct 08 18:28:32 crc kubenswrapper[4750]: I1008 18:28:32.882933 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0c976187cfd4586be20027444ce18bd67bd5505f9ef5d3065d49ceb58dddd51"} err="failed to get container status \"e0c976187cfd4586be20027444ce18bd67bd5505f9ef5d3065d49ceb58dddd51\": rpc error: code = NotFound desc = could not find container \"e0c976187cfd4586be20027444ce18bd67bd5505f9ef5d3065d49ceb58dddd51\": container with ID starting with e0c976187cfd4586be20027444ce18bd67bd5505f9ef5d3065d49ceb58dddd51 not found: ID does not exist" Oct 08 18:28:32 crc kubenswrapper[4750]: I1008 18:28:32.883600 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-644597f84c-58wp9" Oct 08 18:28:32 crc kubenswrapper[4750]: I1008 18:28:32.885913 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/982c6d04-b346-4001-b6e5-2772413a1172-config" (OuterVolumeSpecName: "config") pod "982c6d04-b346-4001-b6e5-2772413a1172" (UID: "982c6d04-b346-4001-b6e5-2772413a1172"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:28:32 crc kubenswrapper[4750]: I1008 18:28:32.935081 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xs8q\" (UniqueName: \"kubernetes.io/projected/982c6d04-b346-4001-b6e5-2772413a1172-kube-api-access-9xs8q\") on node \"crc\" DevicePath \"\"" Oct 08 18:28:32 crc kubenswrapper[4750]: I1008 18:28:32.935118 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/982c6d04-b346-4001-b6e5-2772413a1172-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:28:32 crc kubenswrapper[4750]: I1008 18:28:32.935127 4750 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/982c6d04-b346-4001-b6e5-2772413a1172-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 18:28:33 crc kubenswrapper[4750]: I1008 18:28:33.035819 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zbwj\" (UniqueName: \"kubernetes.io/projected/fdfa2238-e6b5-40ce-9945-ce12691e2ef6-kube-api-access-4zbwj\") pod \"fdfa2238-e6b5-40ce-9945-ce12691e2ef6\" (UID: \"fdfa2238-e6b5-40ce-9945-ce12691e2ef6\") " Oct 08 18:28:33 crc kubenswrapper[4750]: I1008 18:28:33.035994 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdfa2238-e6b5-40ce-9945-ce12691e2ef6-config\") pod \"fdfa2238-e6b5-40ce-9945-ce12691e2ef6\" (UID: \"fdfa2238-e6b5-40ce-9945-ce12691e2ef6\") " Oct 08 18:28:33 crc kubenswrapper[4750]: I1008 18:28:33.036138 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fdfa2238-e6b5-40ce-9945-ce12691e2ef6-dns-svc\") pod \"fdfa2238-e6b5-40ce-9945-ce12691e2ef6\" (UID: \"fdfa2238-e6b5-40ce-9945-ce12691e2ef6\") " Oct 08 18:28:33 crc kubenswrapper[4750]: I1008 18:28:33.040310 4750 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdfa2238-e6b5-40ce-9945-ce12691e2ef6-kube-api-access-4zbwj" (OuterVolumeSpecName: "kube-api-access-4zbwj") pod "fdfa2238-e6b5-40ce-9945-ce12691e2ef6" (UID: "fdfa2238-e6b5-40ce-9945-ce12691e2ef6"). InnerVolumeSpecName "kube-api-access-4zbwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:28:33 crc kubenswrapper[4750]: I1008 18:28:33.065537 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77597f887-jncg2"] Oct 08 18:28:33 crc kubenswrapper[4750]: I1008 18:28:33.070482 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77597f887-jncg2"] Oct 08 18:28:33 crc kubenswrapper[4750]: I1008 18:28:33.072514 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdfa2238-e6b5-40ce-9945-ce12691e2ef6-config" (OuterVolumeSpecName: "config") pod "fdfa2238-e6b5-40ce-9945-ce12691e2ef6" (UID: "fdfa2238-e6b5-40ce-9945-ce12691e2ef6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:28:33 crc kubenswrapper[4750]: I1008 18:28:33.079498 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdfa2238-e6b5-40ce-9945-ce12691e2ef6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fdfa2238-e6b5-40ce-9945-ce12691e2ef6" (UID: "fdfa2238-e6b5-40ce-9945-ce12691e2ef6"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:28:33 crc kubenswrapper[4750]: I1008 18:28:33.137700 4750 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fdfa2238-e6b5-40ce-9945-ce12691e2ef6-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 18:28:33 crc kubenswrapper[4750]: I1008 18:28:33.137735 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zbwj\" (UniqueName: \"kubernetes.io/projected/fdfa2238-e6b5-40ce-9945-ce12691e2ef6-kube-api-access-4zbwj\") on node \"crc\" DevicePath \"\"" Oct 08 18:28:33 crc kubenswrapper[4750]: I1008 18:28:33.137749 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdfa2238-e6b5-40ce-9945-ce12691e2ef6-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:28:33 crc kubenswrapper[4750]: I1008 18:28:33.761277 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-644597f84c-58wp9" event={"ID":"fdfa2238-e6b5-40ce-9945-ce12691e2ef6","Type":"ContainerDied","Data":"c251e924ff9ae38776bcc49467ef425a8b1c0255a1e3cdd0a4e6a62ac643cc9c"} Oct 08 18:28:33 crc kubenswrapper[4750]: I1008 18:28:33.761322 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-644597f84c-58wp9" Oct 08 18:28:33 crc kubenswrapper[4750]: I1008 18:28:33.761612 4750 scope.go:117] "RemoveContainer" containerID="955dbceda4337a0b3969ade627b61b40ada5b3d27c0047027d9e9c4f1b629dd4" Oct 08 18:28:33 crc kubenswrapper[4750]: I1008 18:28:33.763579 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"230c02f8-af60-40d6-af19-adf730eec43f","Type":"ContainerStarted","Data":"36265d856955e91441b681c8365511d7edd515654902aa5545ca450e749f6e36"} Oct 08 18:28:33 crc kubenswrapper[4750]: I1008 18:28:33.763607 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"230c02f8-af60-40d6-af19-adf730eec43f","Type":"ContainerStarted","Data":"c92dbcfbb2d09ff0c337921ca5e5f0d269b51741b6e9f5407d3bb68858e0b7bc"} Oct 08 18:28:33 crc kubenswrapper[4750]: I1008 18:28:33.768641 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-xrdld"] Oct 08 18:28:33 crc kubenswrapper[4750]: E1008 18:28:33.768909 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdfa2238-e6b5-40ce-9945-ce12691e2ef6" containerName="dnsmasq-dns" Oct 08 18:28:33 crc kubenswrapper[4750]: I1008 18:28:33.768928 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdfa2238-e6b5-40ce-9945-ce12691e2ef6" containerName="dnsmasq-dns" Oct 08 18:28:33 crc kubenswrapper[4750]: E1008 18:28:33.769011 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="982c6d04-b346-4001-b6e5-2772413a1172" containerName="init" Oct 08 18:28:33 crc kubenswrapper[4750]: I1008 18:28:33.769022 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="982c6d04-b346-4001-b6e5-2772413a1172" containerName="init" Oct 08 18:28:33 crc kubenswrapper[4750]: E1008 18:28:33.769034 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="982c6d04-b346-4001-b6e5-2772413a1172" containerName="dnsmasq-dns" Oct 08 18:28:33 crc 
kubenswrapper[4750]: I1008 18:28:33.769043 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="982c6d04-b346-4001-b6e5-2772413a1172" containerName="dnsmasq-dns" Oct 08 18:28:33 crc kubenswrapper[4750]: E1008 18:28:33.769057 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdfa2238-e6b5-40ce-9945-ce12691e2ef6" containerName="init" Oct 08 18:28:33 crc kubenswrapper[4750]: I1008 18:28:33.769064 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdfa2238-e6b5-40ce-9945-ce12691e2ef6" containerName="init" Oct 08 18:28:33 crc kubenswrapper[4750]: I1008 18:28:33.769283 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="982c6d04-b346-4001-b6e5-2772413a1172" containerName="dnsmasq-dns" Oct 08 18:28:33 crc kubenswrapper[4750]: I1008 18:28:33.769315 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdfa2238-e6b5-40ce-9945-ce12691e2ef6" containerName="dnsmasq-dns" Oct 08 18:28:33 crc kubenswrapper[4750]: I1008 18:28:33.769856 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-xrdld" Oct 08 18:28:33 crc kubenswrapper[4750]: I1008 18:28:33.782498 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-xrdld"] Oct 08 18:28:33 crc kubenswrapper[4750]: I1008 18:28:33.805872 4750 scope.go:117] "RemoveContainer" containerID="e546b39fe688cdced3980b827f6351a12a4ada1384eb69bdb13bcec1ec8a5a25" Oct 08 18:28:33 crc kubenswrapper[4750]: I1008 18:28:33.816382 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.7610577269999999 podStartE2EDuration="3.816358175s" podCreationTimestamp="2025-10-08 18:28:30 +0000 UTC" firstStartedPulling="2025-10-08 18:28:31.10020774 +0000 UTC m=+1067.013178753" lastFinishedPulling="2025-10-08 18:28:33.155508188 +0000 UTC m=+1069.068479201" observedRunningTime="2025-10-08 18:28:33.788346288 +0000 UTC m=+1069.701317311" watchObservedRunningTime="2025-10-08 18:28:33.816358175 +0000 UTC m=+1069.729329188" Oct 08 18:28:33 crc kubenswrapper[4750]: I1008 18:28:33.832681 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-644597f84c-58wp9"] Oct 08 18:28:33 crc kubenswrapper[4750]: I1008 18:28:33.838563 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-644597f84c-58wp9"] Oct 08 18:28:33 crc kubenswrapper[4750]: I1008 18:28:33.893322 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-b49zx"] Oct 08 18:28:33 crc kubenswrapper[4750]: I1008 18:28:33.896294 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-b49zx" Oct 08 18:28:33 crc kubenswrapper[4750]: I1008 18:28:33.905538 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-b49zx"] Oct 08 18:28:33 crc kubenswrapper[4750]: I1008 18:28:33.950781 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nb66\" (UniqueName: \"kubernetes.io/projected/e09de111-101a-4976-b327-2281bbc6b573-kube-api-access-9nb66\") pod \"keystone-db-create-xrdld\" (UID: \"e09de111-101a-4976-b327-2281bbc6b573\") " pod="openstack/keystone-db-create-xrdld" Oct 08 18:28:34 crc kubenswrapper[4750]: I1008 18:28:34.053492 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nb66\" (UniqueName: \"kubernetes.io/projected/e09de111-101a-4976-b327-2281bbc6b573-kube-api-access-9nb66\") pod \"keystone-db-create-xrdld\" (UID: \"e09de111-101a-4976-b327-2281bbc6b573\") " pod="openstack/keystone-db-create-xrdld" Oct 08 18:28:34 crc kubenswrapper[4750]: I1008 18:28:34.053600 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wts84\" (UniqueName: \"kubernetes.io/projected/fc8a957d-821b-438f-9b31-aabc6a3672e0-kube-api-access-wts84\") pod \"placement-db-create-b49zx\" (UID: \"fc8a957d-821b-438f-9b31-aabc6a3672e0\") " pod="openstack/placement-db-create-b49zx" Oct 08 18:28:34 crc kubenswrapper[4750]: I1008 18:28:34.073022 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nb66\" (UniqueName: \"kubernetes.io/projected/e09de111-101a-4976-b327-2281bbc6b573-kube-api-access-9nb66\") pod \"keystone-db-create-xrdld\" (UID: \"e09de111-101a-4976-b327-2281bbc6b573\") " pod="openstack/keystone-db-create-xrdld" Oct 08 18:28:34 crc kubenswrapper[4750]: I1008 18:28:34.087065 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-xrdld" Oct 08 18:28:34 crc kubenswrapper[4750]: I1008 18:28:34.156591 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wts84\" (UniqueName: \"kubernetes.io/projected/fc8a957d-821b-438f-9b31-aabc6a3672e0-kube-api-access-wts84\") pod \"placement-db-create-b49zx\" (UID: \"fc8a957d-821b-438f-9b31-aabc6a3672e0\") " pod="openstack/placement-db-create-b49zx" Oct 08 18:28:34 crc kubenswrapper[4750]: I1008 18:28:34.177619 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wts84\" (UniqueName: \"kubernetes.io/projected/fc8a957d-821b-438f-9b31-aabc6a3672e0-kube-api-access-wts84\") pod \"placement-db-create-b49zx\" (UID: \"fc8a957d-821b-438f-9b31-aabc6a3672e0\") " pod="openstack/placement-db-create-b49zx" Oct 08 18:28:34 crc kubenswrapper[4750]: I1008 18:28:34.213422 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-b49zx" Oct 08 18:28:34 crc kubenswrapper[4750]: I1008 18:28:34.487901 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-xrdld"] Oct 08 18:28:34 crc kubenswrapper[4750]: W1008 18:28:34.500682 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode09de111_101a_4976_b327_2281bbc6b573.slice/crio-ef19a8e19861288282807621bf322a6cf2ab2d57a07814a8d01fffbfabc9c78d WatchSource:0}: Error finding container ef19a8e19861288282807621bf322a6cf2ab2d57a07814a8d01fffbfabc9c78d: Status 404 returned error can't find the container with id ef19a8e19861288282807621bf322a6cf2ab2d57a07814a8d01fffbfabc9c78d Oct 08 18:28:34 crc kubenswrapper[4750]: W1008 18:28:34.630429 4750 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc8a957d_821b_438f_9b31_aabc6a3672e0.slice/crio-1d1b88874aeb79ef497c98ff1db78519947867670f2c348aa6e929f4a5ce35a3 WatchSource:0}: Error finding container 1d1b88874aeb79ef497c98ff1db78519947867670f2c348aa6e929f4a5ce35a3: Status 404 returned error can't find the container with id 1d1b88874aeb79ef497c98ff1db78519947867670f2c348aa6e929f4a5ce35a3 Oct 08 18:28:34 crc kubenswrapper[4750]: I1008 18:28:34.635933 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-b49zx"] Oct 08 18:28:34 crc kubenswrapper[4750]: I1008 18:28:34.750666 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="982c6d04-b346-4001-b6e5-2772413a1172" path="/var/lib/kubelet/pods/982c6d04-b346-4001-b6e5-2772413a1172/volumes" Oct 08 18:28:34 crc kubenswrapper[4750]: I1008 18:28:34.752671 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdfa2238-e6b5-40ce-9945-ce12691e2ef6" path="/var/lib/kubelet/pods/fdfa2238-e6b5-40ce-9945-ce12691e2ef6/volumes" Oct 08 18:28:34 crc kubenswrapper[4750]: I1008 18:28:34.780300 4750 generic.go:334] "Generic (PLEG): container finished" podID="e09de111-101a-4976-b327-2281bbc6b573" containerID="1498b7259b258a7e9a9da15f304cde6936a66b35578b17491b3ab09c9237c4c5" exitCode=0 Oct 08 18:28:34 crc kubenswrapper[4750]: I1008 18:28:34.780525 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-xrdld" event={"ID":"e09de111-101a-4976-b327-2281bbc6b573","Type":"ContainerDied","Data":"1498b7259b258a7e9a9da15f304cde6936a66b35578b17491b3ab09c9237c4c5"} Oct 08 18:28:34 crc kubenswrapper[4750]: I1008 18:28:34.780594 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-xrdld" event={"ID":"e09de111-101a-4976-b327-2281bbc6b573","Type":"ContainerStarted","Data":"ef19a8e19861288282807621bf322a6cf2ab2d57a07814a8d01fffbfabc9c78d"} Oct 08 18:28:34 crc kubenswrapper[4750]: I1008 
18:28:34.788833 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-b49zx" event={"ID":"fc8a957d-821b-438f-9b31-aabc6a3672e0","Type":"ContainerStarted","Data":"1d1b88874aeb79ef497c98ff1db78519947867670f2c348aa6e929f4a5ce35a3"} Oct 08 18:28:34 crc kubenswrapper[4750]: I1008 18:28:34.789016 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 08 18:28:35 crc kubenswrapper[4750]: I1008 18:28:35.666266 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f6b595d95-ds2hc"] Oct 08 18:28:35 crc kubenswrapper[4750]: I1008 18:28:35.666917 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-f6b595d95-ds2hc" podUID="6c6270bb-d80a-45b1-bc51-44f1c920863e" containerName="dnsmasq-dns" containerID="cri-o://ef2d43b0a1e08b5ce925b8ca370d098ac4420274e14debff61bbb8eb0b883e2b" gracePeriod=10 Oct 08 18:28:35 crc kubenswrapper[4750]: I1008 18:28:35.723202 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b587f8db7-btv56"] Oct 08 18:28:35 crc kubenswrapper[4750]: I1008 18:28:35.725909 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b587f8db7-btv56" Oct 08 18:28:35 crc kubenswrapper[4750]: I1008 18:28:35.753279 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b587f8db7-btv56"] Oct 08 18:28:35 crc kubenswrapper[4750]: I1008 18:28:35.758097 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 08 18:28:35 crc kubenswrapper[4750]: I1008 18:28:35.826439 4750 generic.go:334] "Generic (PLEG): container finished" podID="fc8a957d-821b-438f-9b31-aabc6a3672e0" containerID="b1122e7597d4ce226d03941ae4d26bd429a007b8def572c79b06e0ed78ba9d3d" exitCode=0 Oct 08 18:28:35 crc kubenswrapper[4750]: I1008 18:28:35.826595 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-b49zx" event={"ID":"fc8a957d-821b-438f-9b31-aabc6a3672e0","Type":"ContainerDied","Data":"b1122e7597d4ce226d03941ae4d26bd429a007b8def572c79b06e0ed78ba9d3d"} Oct 08 18:28:35 crc kubenswrapper[4750]: I1008 18:28:35.828787 4750 generic.go:334] "Generic (PLEG): container finished" podID="6c6270bb-d80a-45b1-bc51-44f1c920863e" containerID="ef2d43b0a1e08b5ce925b8ca370d098ac4420274e14debff61bbb8eb0b883e2b" exitCode=0 Oct 08 18:28:35 crc kubenswrapper[4750]: I1008 18:28:35.829751 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f6b595d95-ds2hc" event={"ID":"6c6270bb-d80a-45b1-bc51-44f1c920863e","Type":"ContainerDied","Data":"ef2d43b0a1e08b5ce925b8ca370d098ac4420274e14debff61bbb8eb0b883e2b"} Oct 08 18:28:35 crc kubenswrapper[4750]: I1008 18:28:35.884809 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2d6j\" (UniqueName: \"kubernetes.io/projected/5e0538d0-2633-437d-b5e3-3a397000a601-kube-api-access-h2d6j\") pod \"dnsmasq-dns-7b587f8db7-btv56\" (UID: \"5e0538d0-2633-437d-b5e3-3a397000a601\") " pod="openstack/dnsmasq-dns-7b587f8db7-btv56" Oct 08 18:28:35 crc 
kubenswrapper[4750]: I1008 18:28:35.884883 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e0538d0-2633-437d-b5e3-3a397000a601-ovsdbserver-sb\") pod \"dnsmasq-dns-7b587f8db7-btv56\" (UID: \"5e0538d0-2633-437d-b5e3-3a397000a601\") " pod="openstack/dnsmasq-dns-7b587f8db7-btv56" Oct 08 18:28:35 crc kubenswrapper[4750]: I1008 18:28:35.884956 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e0538d0-2633-437d-b5e3-3a397000a601-ovsdbserver-nb\") pod \"dnsmasq-dns-7b587f8db7-btv56\" (UID: \"5e0538d0-2633-437d-b5e3-3a397000a601\") " pod="openstack/dnsmasq-dns-7b587f8db7-btv56" Oct 08 18:28:35 crc kubenswrapper[4750]: I1008 18:28:35.885323 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e0538d0-2633-437d-b5e3-3a397000a601-dns-svc\") pod \"dnsmasq-dns-7b587f8db7-btv56\" (UID: \"5e0538d0-2633-437d-b5e3-3a397000a601\") " pod="openstack/dnsmasq-dns-7b587f8db7-btv56" Oct 08 18:28:35 crc kubenswrapper[4750]: I1008 18:28:35.885384 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e0538d0-2633-437d-b5e3-3a397000a601-config\") pod \"dnsmasq-dns-7b587f8db7-btv56\" (UID: \"5e0538d0-2633-437d-b5e3-3a397000a601\") " pod="openstack/dnsmasq-dns-7b587f8db7-btv56" Oct 08 18:28:35 crc kubenswrapper[4750]: I1008 18:28:35.991422 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e0538d0-2633-437d-b5e3-3a397000a601-dns-svc\") pod \"dnsmasq-dns-7b587f8db7-btv56\" (UID: \"5e0538d0-2633-437d-b5e3-3a397000a601\") " pod="openstack/dnsmasq-dns-7b587f8db7-btv56" Oct 08 18:28:35 crc kubenswrapper[4750]: 
I1008 18:28:35.991543 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e0538d0-2633-437d-b5e3-3a397000a601-config\") pod \"dnsmasq-dns-7b587f8db7-btv56\" (UID: \"5e0538d0-2633-437d-b5e3-3a397000a601\") " pod="openstack/dnsmasq-dns-7b587f8db7-btv56" Oct 08 18:28:35 crc kubenswrapper[4750]: I1008 18:28:35.991623 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2d6j\" (UniqueName: \"kubernetes.io/projected/5e0538d0-2633-437d-b5e3-3a397000a601-kube-api-access-h2d6j\") pod \"dnsmasq-dns-7b587f8db7-btv56\" (UID: \"5e0538d0-2633-437d-b5e3-3a397000a601\") " pod="openstack/dnsmasq-dns-7b587f8db7-btv56" Oct 08 18:28:35 crc kubenswrapper[4750]: I1008 18:28:35.991693 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e0538d0-2633-437d-b5e3-3a397000a601-ovsdbserver-sb\") pod \"dnsmasq-dns-7b587f8db7-btv56\" (UID: \"5e0538d0-2633-437d-b5e3-3a397000a601\") " pod="openstack/dnsmasq-dns-7b587f8db7-btv56" Oct 08 18:28:35 crc kubenswrapper[4750]: I1008 18:28:35.991785 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e0538d0-2633-437d-b5e3-3a397000a601-ovsdbserver-nb\") pod \"dnsmasq-dns-7b587f8db7-btv56\" (UID: \"5e0538d0-2633-437d-b5e3-3a397000a601\") " pod="openstack/dnsmasq-dns-7b587f8db7-btv56" Oct 08 18:28:35 crc kubenswrapper[4750]: I1008 18:28:35.993124 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e0538d0-2633-437d-b5e3-3a397000a601-dns-svc\") pod \"dnsmasq-dns-7b587f8db7-btv56\" (UID: \"5e0538d0-2633-437d-b5e3-3a397000a601\") " pod="openstack/dnsmasq-dns-7b587f8db7-btv56" Oct 08 18:28:35 crc kubenswrapper[4750]: I1008 18:28:35.994108 4750 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e0538d0-2633-437d-b5e3-3a397000a601-config\") pod \"dnsmasq-dns-7b587f8db7-btv56\" (UID: \"5e0538d0-2633-437d-b5e3-3a397000a601\") " pod="openstack/dnsmasq-dns-7b587f8db7-btv56" Oct 08 18:28:35 crc kubenswrapper[4750]: I1008 18:28:35.995397 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e0538d0-2633-437d-b5e3-3a397000a601-ovsdbserver-nb\") pod \"dnsmasq-dns-7b587f8db7-btv56\" (UID: \"5e0538d0-2633-437d-b5e3-3a397000a601\") " pod="openstack/dnsmasq-dns-7b587f8db7-btv56" Oct 08 18:28:36 crc kubenswrapper[4750]: I1008 18:28:36.004037 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e0538d0-2633-437d-b5e3-3a397000a601-ovsdbserver-sb\") pod \"dnsmasq-dns-7b587f8db7-btv56\" (UID: \"5e0538d0-2633-437d-b5e3-3a397000a601\") " pod="openstack/dnsmasq-dns-7b587f8db7-btv56" Oct 08 18:28:36 crc kubenswrapper[4750]: I1008 18:28:36.018814 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2d6j\" (UniqueName: \"kubernetes.io/projected/5e0538d0-2633-437d-b5e3-3a397000a601-kube-api-access-h2d6j\") pod \"dnsmasq-dns-7b587f8db7-btv56\" (UID: \"5e0538d0-2633-437d-b5e3-3a397000a601\") " pod="openstack/dnsmasq-dns-7b587f8db7-btv56" Oct 08 18:28:36 crc kubenswrapper[4750]: I1008 18:28:36.069130 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b587f8db7-btv56" Oct 08 18:28:36 crc kubenswrapper[4750]: I1008 18:28:36.356875 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f6b595d95-ds2hc" Oct 08 18:28:36 crc kubenswrapper[4750]: I1008 18:28:36.366827 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-xrdld" Oct 08 18:28:36 crc kubenswrapper[4750]: I1008 18:28:36.501765 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5qkq\" (UniqueName: \"kubernetes.io/projected/6c6270bb-d80a-45b1-bc51-44f1c920863e-kube-api-access-d5qkq\") pod \"6c6270bb-d80a-45b1-bc51-44f1c920863e\" (UID: \"6c6270bb-d80a-45b1-bc51-44f1c920863e\") " Oct 08 18:28:36 crc kubenswrapper[4750]: I1008 18:28:36.501813 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c6270bb-d80a-45b1-bc51-44f1c920863e-dns-svc\") pod \"6c6270bb-d80a-45b1-bc51-44f1c920863e\" (UID: \"6c6270bb-d80a-45b1-bc51-44f1c920863e\") " Oct 08 18:28:36 crc kubenswrapper[4750]: I1008 18:28:36.501843 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c6270bb-d80a-45b1-bc51-44f1c920863e-config\") pod \"6c6270bb-d80a-45b1-bc51-44f1c920863e\" (UID: \"6c6270bb-d80a-45b1-bc51-44f1c920863e\") " Oct 08 18:28:36 crc kubenswrapper[4750]: I1008 18:28:36.501877 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nb66\" (UniqueName: \"kubernetes.io/projected/e09de111-101a-4976-b327-2281bbc6b573-kube-api-access-9nb66\") pod \"e09de111-101a-4976-b327-2281bbc6b573\" (UID: \"e09de111-101a-4976-b327-2281bbc6b573\") " Oct 08 18:28:36 crc kubenswrapper[4750]: I1008 18:28:36.501965 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c6270bb-d80a-45b1-bc51-44f1c920863e-ovsdbserver-nb\") pod \"6c6270bb-d80a-45b1-bc51-44f1c920863e\" (UID: \"6c6270bb-d80a-45b1-bc51-44f1c920863e\") " Oct 08 18:28:36 crc kubenswrapper[4750]: I1008 18:28:36.509710 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/6c6270bb-d80a-45b1-bc51-44f1c920863e-kube-api-access-d5qkq" (OuterVolumeSpecName: "kube-api-access-d5qkq") pod "6c6270bb-d80a-45b1-bc51-44f1c920863e" (UID: "6c6270bb-d80a-45b1-bc51-44f1c920863e"). InnerVolumeSpecName "kube-api-access-d5qkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:28:36 crc kubenswrapper[4750]: I1008 18:28:36.511761 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e09de111-101a-4976-b327-2281bbc6b573-kube-api-access-9nb66" (OuterVolumeSpecName: "kube-api-access-9nb66") pod "e09de111-101a-4976-b327-2281bbc6b573" (UID: "e09de111-101a-4976-b327-2281bbc6b573"). InnerVolumeSpecName "kube-api-access-9nb66". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:28:36 crc kubenswrapper[4750]: E1008 18:28:36.583993 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6c6270bb-d80a-45b1-bc51-44f1c920863e-config podName:6c6270bb-d80a-45b1-bc51-44f1c920863e nodeName:}" failed. No retries permitted until 2025-10-08 18:28:37.083961373 +0000 UTC m=+1072.996932386 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config" (UniqueName: "kubernetes.io/configmap/6c6270bb-d80a-45b1-bc51-44f1c920863e-config") pod "6c6270bb-d80a-45b1-bc51-44f1c920863e" (UID: "6c6270bb-d80a-45b1-bc51-44f1c920863e") : error deleting /var/lib/kubelet/pods/6c6270bb-d80a-45b1-bc51-44f1c920863e/volume-subpaths: remove /var/lib/kubelet/pods/6c6270bb-d80a-45b1-bc51-44f1c920863e/volume-subpaths: no such file or directory Oct 08 18:28:36 crc kubenswrapper[4750]: I1008 18:28:36.584196 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c6270bb-d80a-45b1-bc51-44f1c920863e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6c6270bb-d80a-45b1-bc51-44f1c920863e" (UID: "6c6270bb-d80a-45b1-bc51-44f1c920863e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:28:36 crc kubenswrapper[4750]: I1008 18:28:36.584225 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c6270bb-d80a-45b1-bc51-44f1c920863e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6c6270bb-d80a-45b1-bc51-44f1c920863e" (UID: "6c6270bb-d80a-45b1-bc51-44f1c920863e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:28:36 crc kubenswrapper[4750]: I1008 18:28:36.603508 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5qkq\" (UniqueName: \"kubernetes.io/projected/6c6270bb-d80a-45b1-bc51-44f1c920863e-kube-api-access-d5qkq\") on node \"crc\" DevicePath \"\"" Oct 08 18:28:36 crc kubenswrapper[4750]: I1008 18:28:36.603565 4750 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c6270bb-d80a-45b1-bc51-44f1c920863e-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 18:28:36 crc kubenswrapper[4750]: I1008 18:28:36.603577 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nb66\" (UniqueName: \"kubernetes.io/projected/e09de111-101a-4976-b327-2281bbc6b573-kube-api-access-9nb66\") on node \"crc\" DevicePath \"\"" Oct 08 18:28:36 crc kubenswrapper[4750]: I1008 18:28:36.603587 4750 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c6270bb-d80a-45b1-bc51-44f1c920863e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 18:28:36 crc kubenswrapper[4750]: I1008 18:28:36.660309 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b587f8db7-btv56"] Oct 08 18:28:36 crc kubenswrapper[4750]: W1008 18:28:36.669472 4750 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e0538d0_2633_437d_b5e3_3a397000a601.slice/crio-2fe25efd24d87a2a3273adb23444542dd2cbc48a800642866621f8a3ce99e616 WatchSource:0}: Error finding container 2fe25efd24d87a2a3273adb23444542dd2cbc48a800642866621f8a3ce99e616: Status 404 returned error can't find the container with id 2fe25efd24d87a2a3273adb23444542dd2cbc48a800642866621f8a3ce99e616 Oct 08 18:28:36 crc kubenswrapper[4750]: I1008 18:28:36.806005 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 08 18:28:36 crc kubenswrapper[4750]: E1008 18:28:36.806511 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e09de111-101a-4976-b327-2281bbc6b573" containerName="mariadb-database-create" Oct 08 18:28:36 crc kubenswrapper[4750]: I1008 18:28:36.806531 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="e09de111-101a-4976-b327-2281bbc6b573" containerName="mariadb-database-create" Oct 08 18:28:36 crc kubenswrapper[4750]: E1008 18:28:36.806539 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c6270bb-d80a-45b1-bc51-44f1c920863e" containerName="dnsmasq-dns" Oct 08 18:28:36 crc kubenswrapper[4750]: I1008 18:28:36.806559 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c6270bb-d80a-45b1-bc51-44f1c920863e" containerName="dnsmasq-dns" Oct 08 18:28:36 crc kubenswrapper[4750]: E1008 18:28:36.806593 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c6270bb-d80a-45b1-bc51-44f1c920863e" containerName="init" Oct 08 18:28:36 crc kubenswrapper[4750]: I1008 18:28:36.806630 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c6270bb-d80a-45b1-bc51-44f1c920863e" containerName="init" Oct 08 18:28:36 crc kubenswrapper[4750]: I1008 18:28:36.806978 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c6270bb-d80a-45b1-bc51-44f1c920863e" containerName="dnsmasq-dns" Oct 08 18:28:36 crc kubenswrapper[4750]: I1008 18:28:36.807006 
4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="e09de111-101a-4976-b327-2281bbc6b573" containerName="mariadb-database-create" Oct 08 18:28:36 crc kubenswrapper[4750]: I1008 18:28:36.817755 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 08 18:28:36 crc kubenswrapper[4750]: I1008 18:28:36.822920 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 08 18:28:36 crc kubenswrapper[4750]: I1008 18:28:36.823271 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 08 18:28:36 crc kubenswrapper[4750]: I1008 18:28:36.823544 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-5zxp2" Oct 08 18:28:36 crc kubenswrapper[4750]: I1008 18:28:36.823746 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 08 18:28:36 crc kubenswrapper[4750]: I1008 18:28:36.828995 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 08 18:28:36 crc kubenswrapper[4750]: I1008 18:28:36.840121 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-xrdld" event={"ID":"e09de111-101a-4976-b327-2281bbc6b573","Type":"ContainerDied","Data":"ef19a8e19861288282807621bf322a6cf2ab2d57a07814a8d01fffbfabc9c78d"} Oct 08 18:28:36 crc kubenswrapper[4750]: I1008 18:28:36.840156 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef19a8e19861288282807621bf322a6cf2ab2d57a07814a8d01fffbfabc9c78d" Oct 08 18:28:36 crc kubenswrapper[4750]: I1008 18:28:36.840207 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-xrdld" Oct 08 18:28:36 crc kubenswrapper[4750]: I1008 18:28:36.841649 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b587f8db7-btv56" event={"ID":"5e0538d0-2633-437d-b5e3-3a397000a601","Type":"ContainerStarted","Data":"2fe25efd24d87a2a3273adb23444542dd2cbc48a800642866621f8a3ce99e616"} Oct 08 18:28:36 crc kubenswrapper[4750]: I1008 18:28:36.843002 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f6b595d95-ds2hc" Oct 08 18:28:36 crc kubenswrapper[4750]: I1008 18:28:36.843219 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f6b595d95-ds2hc" event={"ID":"6c6270bb-d80a-45b1-bc51-44f1c920863e","Type":"ContainerDied","Data":"ce0cde6782310a58dcb913b81a036cf7996e3f38221be3d762221681c99021fd"} Oct 08 18:28:36 crc kubenswrapper[4750]: I1008 18:28:36.843242 4750 scope.go:117] "RemoveContainer" containerID="ef2d43b0a1e08b5ce925b8ca370d098ac4420274e14debff61bbb8eb0b883e2b" Oct 08 18:28:36 crc kubenswrapper[4750]: I1008 18:28:36.865135 4750 scope.go:117] "RemoveContainer" containerID="3ab5a7fa33878a0fe750c66dbfd675bdb4e428f4b249383ff705b0ac29414e3b" Oct 08 18:28:36 crc kubenswrapper[4750]: I1008 18:28:36.907661 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/be62333a-e650-4131-b0f1-c8c484539c7e-lock\") pod \"swift-storage-0\" (UID: \"be62333a-e650-4131-b0f1-c8c484539c7e\") " pod="openstack/swift-storage-0" Oct 08 18:28:36 crc kubenswrapper[4750]: I1008 18:28:36.907730 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/be62333a-e650-4131-b0f1-c8c484539c7e-cache\") pod \"swift-storage-0\" (UID: \"be62333a-e650-4131-b0f1-c8c484539c7e\") " pod="openstack/swift-storage-0" Oct 08 18:28:36 crc 
kubenswrapper[4750]: I1008 18:28:36.907780 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"be62333a-e650-4131-b0f1-c8c484539c7e\") " pod="openstack/swift-storage-0" Oct 08 18:28:36 crc kubenswrapper[4750]: I1008 18:28:36.907804 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkl8g\" (UniqueName: \"kubernetes.io/projected/be62333a-e650-4131-b0f1-c8c484539c7e-kube-api-access-kkl8g\") pod \"swift-storage-0\" (UID: \"be62333a-e650-4131-b0f1-c8c484539c7e\") " pod="openstack/swift-storage-0" Oct 08 18:28:36 crc kubenswrapper[4750]: I1008 18:28:36.907888 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/be62333a-e650-4131-b0f1-c8c484539c7e-etc-swift\") pod \"swift-storage-0\" (UID: \"be62333a-e650-4131-b0f1-c8c484539c7e\") " pod="openstack/swift-storage-0" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.010001 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"be62333a-e650-4131-b0f1-c8c484539c7e\") " pod="openstack/swift-storage-0" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.010041 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkl8g\" (UniqueName: \"kubernetes.io/projected/be62333a-e650-4131-b0f1-c8c484539c7e-kube-api-access-kkl8g\") pod \"swift-storage-0\" (UID: \"be62333a-e650-4131-b0f1-c8c484539c7e\") " pod="openstack/swift-storage-0" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.010113 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/be62333a-e650-4131-b0f1-c8c484539c7e-etc-swift\") pod \"swift-storage-0\" (UID: \"be62333a-e650-4131-b0f1-c8c484539c7e\") " pod="openstack/swift-storage-0" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.010650 4750 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"be62333a-e650-4131-b0f1-c8c484539c7e\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/swift-storage-0" Oct 08 18:28:37 crc kubenswrapper[4750]: E1008 18:28:37.010746 4750 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 08 18:28:37 crc kubenswrapper[4750]: E1008 18:28:37.010761 4750 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 08 18:28:37 crc kubenswrapper[4750]: E1008 18:28:37.010795 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/be62333a-e650-4131-b0f1-c8c484539c7e-etc-swift podName:be62333a-e650-4131-b0f1-c8c484539c7e nodeName:}" failed. No retries permitted until 2025-10-08 18:28:37.510781281 +0000 UTC m=+1073.423752294 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/be62333a-e650-4131-b0f1-c8c484539c7e-etc-swift") pod "swift-storage-0" (UID: "be62333a-e650-4131-b0f1-c8c484539c7e") : configmap "swift-ring-files" not found Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.010918 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/be62333a-e650-4131-b0f1-c8c484539c7e-lock\") pod \"swift-storage-0\" (UID: \"be62333a-e650-4131-b0f1-c8c484539c7e\") " pod="openstack/swift-storage-0" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.010949 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/be62333a-e650-4131-b0f1-c8c484539c7e-cache\") pod \"swift-storage-0\" (UID: \"be62333a-e650-4131-b0f1-c8c484539c7e\") " pod="openstack/swift-storage-0" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.011401 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/be62333a-e650-4131-b0f1-c8c484539c7e-lock\") pod \"swift-storage-0\" (UID: \"be62333a-e650-4131-b0f1-c8c484539c7e\") " pod="openstack/swift-storage-0" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.011509 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/be62333a-e650-4131-b0f1-c8c484539c7e-cache\") pod \"swift-storage-0\" (UID: \"be62333a-e650-4131-b0f1-c8c484539c7e\") " pod="openstack/swift-storage-0" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.034895 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkl8g\" (UniqueName: \"kubernetes.io/projected/be62333a-e650-4131-b0f1-c8c484539c7e-kube-api-access-kkl8g\") pod \"swift-storage-0\" (UID: \"be62333a-e650-4131-b0f1-c8c484539c7e\") " pod="openstack/swift-storage-0" Oct 08 18:28:37 crc 
kubenswrapper[4750]: I1008 18:28:37.039754 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"be62333a-e650-4131-b0f1-c8c484539c7e\") " pod="openstack/swift-storage-0" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.104843 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-b49zx" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.113274 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c6270bb-d80a-45b1-bc51-44f1c920863e-config\") pod \"6c6270bb-d80a-45b1-bc51-44f1c920863e\" (UID: \"6c6270bb-d80a-45b1-bc51-44f1c920863e\") " Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.113699 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c6270bb-d80a-45b1-bc51-44f1c920863e-config" (OuterVolumeSpecName: "config") pod "6c6270bb-d80a-45b1-bc51-44f1c920863e" (UID: "6c6270bb-d80a-45b1-bc51-44f1c920863e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.114134 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c6270bb-d80a-45b1-bc51-44f1c920863e-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.183301 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f6b595d95-ds2hc"] Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.190959 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f6b595d95-ds2hc"] Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.215422 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wts84\" (UniqueName: \"kubernetes.io/projected/fc8a957d-821b-438f-9b31-aabc6a3672e0-kube-api-access-wts84\") pod \"fc8a957d-821b-438f-9b31-aabc6a3672e0\" (UID: \"fc8a957d-821b-438f-9b31-aabc6a3672e0\") " Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.236004 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc8a957d-821b-438f-9b31-aabc6a3672e0-kube-api-access-wts84" (OuterVolumeSpecName: "kube-api-access-wts84") pod "fc8a957d-821b-438f-9b31-aabc6a3672e0" (UID: "fc8a957d-821b-438f-9b31-aabc6a3672e0"). InnerVolumeSpecName "kube-api-access-wts84". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.317382 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wts84\" (UniqueName: \"kubernetes.io/projected/fc8a957d-821b-438f-9b31-aabc6a3672e0-kube-api-access-wts84\") on node \"crc\" DevicePath \"\"" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.366211 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-n2gsb"] Oct 08 18:28:37 crc kubenswrapper[4750]: E1008 18:28:37.366618 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc8a957d-821b-438f-9b31-aabc6a3672e0" containerName="mariadb-database-create" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.366634 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc8a957d-821b-438f-9b31-aabc6a3672e0" containerName="mariadb-database-create" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.366826 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc8a957d-821b-438f-9b31-aabc6a3672e0" containerName="mariadb-database-create" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.367497 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-n2gsb" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.374750 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.375633 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.377264 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.394571 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-n2gsb"] Oct 08 18:28:37 crc kubenswrapper[4750]: E1008 18:28:37.397182 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-vxdq2 ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-vxdq2 ring-data-devices scripts swiftconf]: context canceled" pod="openstack/swift-ring-rebalance-n2gsb" podUID="344145bf-4ecf-4389-af5a-a48d4ecf13c2" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.400686 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-fnr22"] Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.401706 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-fnr22" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.422152 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-fnr22"] Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.432653 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-n2gsb"] Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.520301 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b7ca8f71-a72e-4a6d-b839-5524cfb3d70b-dispersionconf\") pod \"swift-ring-rebalance-fnr22\" (UID: \"b7ca8f71-a72e-4a6d-b839-5524cfb3d70b\") " pod="openstack/swift-ring-rebalance-fnr22" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.520337 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7ca8f71-a72e-4a6d-b839-5524cfb3d70b-scripts\") pod \"swift-ring-rebalance-fnr22\" (UID: \"b7ca8f71-a72e-4a6d-b839-5524cfb3d70b\") " pod="openstack/swift-ring-rebalance-fnr22" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.520376 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxdq2\" (UniqueName: \"kubernetes.io/projected/344145bf-4ecf-4389-af5a-a48d4ecf13c2-kube-api-access-vxdq2\") pod \"swift-ring-rebalance-n2gsb\" (UID: \"344145bf-4ecf-4389-af5a-a48d4ecf13c2\") " pod="openstack/swift-ring-rebalance-n2gsb" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.520393 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/344145bf-4ecf-4389-af5a-a48d4ecf13c2-etc-swift\") pod \"swift-ring-rebalance-n2gsb\" (UID: \"344145bf-4ecf-4389-af5a-a48d4ecf13c2\") " pod="openstack/swift-ring-rebalance-n2gsb" Oct 
08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.520414 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pm6b\" (UniqueName: \"kubernetes.io/projected/b7ca8f71-a72e-4a6d-b839-5524cfb3d70b-kube-api-access-5pm6b\") pod \"swift-ring-rebalance-fnr22\" (UID: \"b7ca8f71-a72e-4a6d-b839-5524cfb3d70b\") " pod="openstack/swift-ring-rebalance-fnr22" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.520432 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b7ca8f71-a72e-4a6d-b839-5524cfb3d70b-swiftconf\") pod \"swift-ring-rebalance-fnr22\" (UID: \"b7ca8f71-a72e-4a6d-b839-5524cfb3d70b\") " pod="openstack/swift-ring-rebalance-fnr22" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.520449 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/344145bf-4ecf-4389-af5a-a48d4ecf13c2-dispersionconf\") pod \"swift-ring-rebalance-n2gsb\" (UID: \"344145bf-4ecf-4389-af5a-a48d4ecf13c2\") " pod="openstack/swift-ring-rebalance-n2gsb" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.520474 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/be62333a-e650-4131-b0f1-c8c484539c7e-etc-swift\") pod \"swift-storage-0\" (UID: \"be62333a-e650-4131-b0f1-c8c484539c7e\") " pod="openstack/swift-storage-0" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.520502 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b7ca8f71-a72e-4a6d-b839-5524cfb3d70b-etc-swift\") pod \"swift-ring-rebalance-fnr22\" (UID: \"b7ca8f71-a72e-4a6d-b839-5524cfb3d70b\") " pod="openstack/swift-ring-rebalance-fnr22" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 
18:28:37.520518 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/344145bf-4ecf-4389-af5a-a48d4ecf13c2-ring-data-devices\") pod \"swift-ring-rebalance-n2gsb\" (UID: \"344145bf-4ecf-4389-af5a-a48d4ecf13c2\") " pod="openstack/swift-ring-rebalance-n2gsb" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.520532 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b7ca8f71-a72e-4a6d-b839-5524cfb3d70b-ring-data-devices\") pod \"swift-ring-rebalance-fnr22\" (UID: \"b7ca8f71-a72e-4a6d-b839-5524cfb3d70b\") " pod="openstack/swift-ring-rebalance-fnr22" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.520579 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/344145bf-4ecf-4389-af5a-a48d4ecf13c2-scripts\") pod \"swift-ring-rebalance-n2gsb\" (UID: \"344145bf-4ecf-4389-af5a-a48d4ecf13c2\") " pod="openstack/swift-ring-rebalance-n2gsb" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.520596 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7ca8f71-a72e-4a6d-b839-5524cfb3d70b-combined-ca-bundle\") pod \"swift-ring-rebalance-fnr22\" (UID: \"b7ca8f71-a72e-4a6d-b839-5524cfb3d70b\") " pod="openstack/swift-ring-rebalance-fnr22" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.520613 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/344145bf-4ecf-4389-af5a-a48d4ecf13c2-combined-ca-bundle\") pod \"swift-ring-rebalance-n2gsb\" (UID: \"344145bf-4ecf-4389-af5a-a48d4ecf13c2\") " pod="openstack/swift-ring-rebalance-n2gsb" Oct 08 18:28:37 
crc kubenswrapper[4750]: I1008 18:28:37.520632 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/344145bf-4ecf-4389-af5a-a48d4ecf13c2-swiftconf\") pod \"swift-ring-rebalance-n2gsb\" (UID: \"344145bf-4ecf-4389-af5a-a48d4ecf13c2\") " pod="openstack/swift-ring-rebalance-n2gsb" Oct 08 18:28:37 crc kubenswrapper[4750]: E1008 18:28:37.520824 4750 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 08 18:28:37 crc kubenswrapper[4750]: E1008 18:28:37.520837 4750 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 08 18:28:37 crc kubenswrapper[4750]: E1008 18:28:37.520870 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/be62333a-e650-4131-b0f1-c8c484539c7e-etc-swift podName:be62333a-e650-4131-b0f1-c8c484539c7e nodeName:}" failed. No retries permitted until 2025-10-08 18:28:38.52085784 +0000 UTC m=+1074.433828853 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/be62333a-e650-4131-b0f1-c8c484539c7e-etc-swift") pod "swift-storage-0" (UID: "be62333a-e650-4131-b0f1-c8c484539c7e") : configmap "swift-ring-files" not found Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.622191 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/344145bf-4ecf-4389-af5a-a48d4ecf13c2-combined-ca-bundle\") pod \"swift-ring-rebalance-n2gsb\" (UID: \"344145bf-4ecf-4389-af5a-a48d4ecf13c2\") " pod="openstack/swift-ring-rebalance-n2gsb" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.622231 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/344145bf-4ecf-4389-af5a-a48d4ecf13c2-swiftconf\") pod \"swift-ring-rebalance-n2gsb\" (UID: \"344145bf-4ecf-4389-af5a-a48d4ecf13c2\") " pod="openstack/swift-ring-rebalance-n2gsb" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.622300 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b7ca8f71-a72e-4a6d-b839-5524cfb3d70b-dispersionconf\") pod \"swift-ring-rebalance-fnr22\" (UID: \"b7ca8f71-a72e-4a6d-b839-5524cfb3d70b\") " pod="openstack/swift-ring-rebalance-fnr22" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.622321 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7ca8f71-a72e-4a6d-b839-5524cfb3d70b-scripts\") pod \"swift-ring-rebalance-fnr22\" (UID: \"b7ca8f71-a72e-4a6d-b839-5524cfb3d70b\") " pod="openstack/swift-ring-rebalance-fnr22" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.622360 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxdq2\" (UniqueName: 
\"kubernetes.io/projected/344145bf-4ecf-4389-af5a-a48d4ecf13c2-kube-api-access-vxdq2\") pod \"swift-ring-rebalance-n2gsb\" (UID: \"344145bf-4ecf-4389-af5a-a48d4ecf13c2\") " pod="openstack/swift-ring-rebalance-n2gsb" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.622378 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/344145bf-4ecf-4389-af5a-a48d4ecf13c2-etc-swift\") pod \"swift-ring-rebalance-n2gsb\" (UID: \"344145bf-4ecf-4389-af5a-a48d4ecf13c2\") " pod="openstack/swift-ring-rebalance-n2gsb" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.622398 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pm6b\" (UniqueName: \"kubernetes.io/projected/b7ca8f71-a72e-4a6d-b839-5524cfb3d70b-kube-api-access-5pm6b\") pod \"swift-ring-rebalance-fnr22\" (UID: \"b7ca8f71-a72e-4a6d-b839-5524cfb3d70b\") " pod="openstack/swift-ring-rebalance-fnr22" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.622417 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b7ca8f71-a72e-4a6d-b839-5524cfb3d70b-swiftconf\") pod \"swift-ring-rebalance-fnr22\" (UID: \"b7ca8f71-a72e-4a6d-b839-5524cfb3d70b\") " pod="openstack/swift-ring-rebalance-fnr22" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.622440 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/344145bf-4ecf-4389-af5a-a48d4ecf13c2-dispersionconf\") pod \"swift-ring-rebalance-n2gsb\" (UID: \"344145bf-4ecf-4389-af5a-a48d4ecf13c2\") " pod="openstack/swift-ring-rebalance-n2gsb" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.622477 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b7ca8f71-a72e-4a6d-b839-5524cfb3d70b-etc-swift\") pod 
\"swift-ring-rebalance-fnr22\" (UID: \"b7ca8f71-a72e-4a6d-b839-5524cfb3d70b\") " pod="openstack/swift-ring-rebalance-fnr22" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.622503 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/344145bf-4ecf-4389-af5a-a48d4ecf13c2-ring-data-devices\") pod \"swift-ring-rebalance-n2gsb\" (UID: \"344145bf-4ecf-4389-af5a-a48d4ecf13c2\") " pod="openstack/swift-ring-rebalance-n2gsb" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.622519 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b7ca8f71-a72e-4a6d-b839-5524cfb3d70b-ring-data-devices\") pod \"swift-ring-rebalance-fnr22\" (UID: \"b7ca8f71-a72e-4a6d-b839-5524cfb3d70b\") " pod="openstack/swift-ring-rebalance-fnr22" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.622542 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/344145bf-4ecf-4389-af5a-a48d4ecf13c2-scripts\") pod \"swift-ring-rebalance-n2gsb\" (UID: \"344145bf-4ecf-4389-af5a-a48d4ecf13c2\") " pod="openstack/swift-ring-rebalance-n2gsb" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.622573 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7ca8f71-a72e-4a6d-b839-5524cfb3d70b-combined-ca-bundle\") pod \"swift-ring-rebalance-fnr22\" (UID: \"b7ca8f71-a72e-4a6d-b839-5524cfb3d70b\") " pod="openstack/swift-ring-rebalance-fnr22" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.623935 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/344145bf-4ecf-4389-af5a-a48d4ecf13c2-ring-data-devices\") pod \"swift-ring-rebalance-n2gsb\" (UID: 
\"344145bf-4ecf-4389-af5a-a48d4ecf13c2\") " pod="openstack/swift-ring-rebalance-n2gsb" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.624075 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7ca8f71-a72e-4a6d-b839-5524cfb3d70b-scripts\") pod \"swift-ring-rebalance-fnr22\" (UID: \"b7ca8f71-a72e-4a6d-b839-5524cfb3d70b\") " pod="openstack/swift-ring-rebalance-fnr22" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.624129 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b7ca8f71-a72e-4a6d-b839-5524cfb3d70b-etc-swift\") pod \"swift-ring-rebalance-fnr22\" (UID: \"b7ca8f71-a72e-4a6d-b839-5524cfb3d70b\") " pod="openstack/swift-ring-rebalance-fnr22" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.624066 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/344145bf-4ecf-4389-af5a-a48d4ecf13c2-etc-swift\") pod \"swift-ring-rebalance-n2gsb\" (UID: \"344145bf-4ecf-4389-af5a-a48d4ecf13c2\") " pod="openstack/swift-ring-rebalance-n2gsb" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.624397 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/344145bf-4ecf-4389-af5a-a48d4ecf13c2-scripts\") pod \"swift-ring-rebalance-n2gsb\" (UID: \"344145bf-4ecf-4389-af5a-a48d4ecf13c2\") " pod="openstack/swift-ring-rebalance-n2gsb" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.626516 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b7ca8f71-a72e-4a6d-b839-5524cfb3d70b-ring-data-devices\") pod \"swift-ring-rebalance-fnr22\" (UID: \"b7ca8f71-a72e-4a6d-b839-5524cfb3d70b\") " pod="openstack/swift-ring-rebalance-fnr22" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.627042 4750 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/344145bf-4ecf-4389-af5a-a48d4ecf13c2-swiftconf\") pod \"swift-ring-rebalance-n2gsb\" (UID: \"344145bf-4ecf-4389-af5a-a48d4ecf13c2\") " pod="openstack/swift-ring-rebalance-n2gsb" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.627368 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/344145bf-4ecf-4389-af5a-a48d4ecf13c2-combined-ca-bundle\") pod \"swift-ring-rebalance-n2gsb\" (UID: \"344145bf-4ecf-4389-af5a-a48d4ecf13c2\") " pod="openstack/swift-ring-rebalance-n2gsb" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.627734 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/344145bf-4ecf-4389-af5a-a48d4ecf13c2-dispersionconf\") pod \"swift-ring-rebalance-n2gsb\" (UID: \"344145bf-4ecf-4389-af5a-a48d4ecf13c2\") " pod="openstack/swift-ring-rebalance-n2gsb" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.627926 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b7ca8f71-a72e-4a6d-b839-5524cfb3d70b-swiftconf\") pod \"swift-ring-rebalance-fnr22\" (UID: \"b7ca8f71-a72e-4a6d-b839-5524cfb3d70b\") " pod="openstack/swift-ring-rebalance-fnr22" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.628438 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b7ca8f71-a72e-4a6d-b839-5524cfb3d70b-dispersionconf\") pod \"swift-ring-rebalance-fnr22\" (UID: \"b7ca8f71-a72e-4a6d-b839-5524cfb3d70b\") " pod="openstack/swift-ring-rebalance-fnr22" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.630235 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b7ca8f71-a72e-4a6d-b839-5524cfb3d70b-combined-ca-bundle\") pod \"swift-ring-rebalance-fnr22\" (UID: \"b7ca8f71-a72e-4a6d-b839-5524cfb3d70b\") " pod="openstack/swift-ring-rebalance-fnr22" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.644247 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pm6b\" (UniqueName: \"kubernetes.io/projected/b7ca8f71-a72e-4a6d-b839-5524cfb3d70b-kube-api-access-5pm6b\") pod \"swift-ring-rebalance-fnr22\" (UID: \"b7ca8f71-a72e-4a6d-b839-5524cfb3d70b\") " pod="openstack/swift-ring-rebalance-fnr22" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.646068 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxdq2\" (UniqueName: \"kubernetes.io/projected/344145bf-4ecf-4389-af5a-a48d4ecf13c2-kube-api-access-vxdq2\") pod \"swift-ring-rebalance-n2gsb\" (UID: \"344145bf-4ecf-4389-af5a-a48d4ecf13c2\") " pod="openstack/swift-ring-rebalance-n2gsb" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.718530 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-fnr22" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.850947 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-b49zx" event={"ID":"fc8a957d-821b-438f-9b31-aabc6a3672e0","Type":"ContainerDied","Data":"1d1b88874aeb79ef497c98ff1db78519947867670f2c348aa6e929f4a5ce35a3"} Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.851015 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d1b88874aeb79ef497c98ff1db78519947867670f2c348aa6e929f4a5ce35a3" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.850980 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-b49zx" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.854085 4750 generic.go:334] "Generic (PLEG): container finished" podID="5e0538d0-2633-437d-b5e3-3a397000a601" containerID="c3a7105f7b8bd3c9e326bdba4e4939b8c39174000473360506f5ab0a1cabd421" exitCode=0 Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.854146 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-n2gsb" Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.854700 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b587f8db7-btv56" event={"ID":"5e0538d0-2633-437d-b5e3-3a397000a601","Type":"ContainerDied","Data":"c3a7105f7b8bd3c9e326bdba4e4939b8c39174000473360506f5ab0a1cabd421"} Oct 08 18:28:37 crc kubenswrapper[4750]: I1008 18:28:37.882163 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-n2gsb" Oct 08 18:28:38 crc kubenswrapper[4750]: I1008 18:28:38.029767 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/344145bf-4ecf-4389-af5a-a48d4ecf13c2-etc-swift\") pod \"344145bf-4ecf-4389-af5a-a48d4ecf13c2\" (UID: \"344145bf-4ecf-4389-af5a-a48d4ecf13c2\") " Oct 08 18:28:38 crc kubenswrapper[4750]: I1008 18:28:38.029820 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/344145bf-4ecf-4389-af5a-a48d4ecf13c2-scripts\") pod \"344145bf-4ecf-4389-af5a-a48d4ecf13c2\" (UID: \"344145bf-4ecf-4389-af5a-a48d4ecf13c2\") " Oct 08 18:28:38 crc kubenswrapper[4750]: I1008 18:28:38.029860 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/344145bf-4ecf-4389-af5a-a48d4ecf13c2-combined-ca-bundle\") pod \"344145bf-4ecf-4389-af5a-a48d4ecf13c2\" (UID: 
\"344145bf-4ecf-4389-af5a-a48d4ecf13c2\") " Oct 08 18:28:38 crc kubenswrapper[4750]: I1008 18:28:38.029965 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/344145bf-4ecf-4389-af5a-a48d4ecf13c2-dispersionconf\") pod \"344145bf-4ecf-4389-af5a-a48d4ecf13c2\" (UID: \"344145bf-4ecf-4389-af5a-a48d4ecf13c2\") " Oct 08 18:28:38 crc kubenswrapper[4750]: I1008 18:28:38.029991 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxdq2\" (UniqueName: \"kubernetes.io/projected/344145bf-4ecf-4389-af5a-a48d4ecf13c2-kube-api-access-vxdq2\") pod \"344145bf-4ecf-4389-af5a-a48d4ecf13c2\" (UID: \"344145bf-4ecf-4389-af5a-a48d4ecf13c2\") " Oct 08 18:28:38 crc kubenswrapper[4750]: I1008 18:28:38.030020 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/344145bf-4ecf-4389-af5a-a48d4ecf13c2-swiftconf\") pod \"344145bf-4ecf-4389-af5a-a48d4ecf13c2\" (UID: \"344145bf-4ecf-4389-af5a-a48d4ecf13c2\") " Oct 08 18:28:38 crc kubenswrapper[4750]: I1008 18:28:38.030072 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/344145bf-4ecf-4389-af5a-a48d4ecf13c2-ring-data-devices\") pod \"344145bf-4ecf-4389-af5a-a48d4ecf13c2\" (UID: \"344145bf-4ecf-4389-af5a-a48d4ecf13c2\") " Oct 08 18:28:38 crc kubenswrapper[4750]: I1008 18:28:38.030409 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/344145bf-4ecf-4389-af5a-a48d4ecf13c2-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "344145bf-4ecf-4389-af5a-a48d4ecf13c2" (UID: "344145bf-4ecf-4389-af5a-a48d4ecf13c2"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:28:38 crc kubenswrapper[4750]: I1008 18:28:38.030873 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/344145bf-4ecf-4389-af5a-a48d4ecf13c2-scripts" (OuterVolumeSpecName: "scripts") pod "344145bf-4ecf-4389-af5a-a48d4ecf13c2" (UID: "344145bf-4ecf-4389-af5a-a48d4ecf13c2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:28:38 crc kubenswrapper[4750]: I1008 18:28:38.033356 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/344145bf-4ecf-4389-af5a-a48d4ecf13c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "344145bf-4ecf-4389-af5a-a48d4ecf13c2" (UID: "344145bf-4ecf-4389-af5a-a48d4ecf13c2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:28:38 crc kubenswrapper[4750]: I1008 18:28:38.033449 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/344145bf-4ecf-4389-af5a-a48d4ecf13c2-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "344145bf-4ecf-4389-af5a-a48d4ecf13c2" (UID: "344145bf-4ecf-4389-af5a-a48d4ecf13c2"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:28:38 crc kubenswrapper[4750]: I1008 18:28:38.034246 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/344145bf-4ecf-4389-af5a-a48d4ecf13c2-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "344145bf-4ecf-4389-af5a-a48d4ecf13c2" (UID: "344145bf-4ecf-4389-af5a-a48d4ecf13c2"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:28:38 crc kubenswrapper[4750]: I1008 18:28:38.035505 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/344145bf-4ecf-4389-af5a-a48d4ecf13c2-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "344145bf-4ecf-4389-af5a-a48d4ecf13c2" (UID: "344145bf-4ecf-4389-af5a-a48d4ecf13c2"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:28:38 crc kubenswrapper[4750]: I1008 18:28:38.035570 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/344145bf-4ecf-4389-af5a-a48d4ecf13c2-kube-api-access-vxdq2" (OuterVolumeSpecName: "kube-api-access-vxdq2") pod "344145bf-4ecf-4389-af5a-a48d4ecf13c2" (UID: "344145bf-4ecf-4389-af5a-a48d4ecf13c2"). InnerVolumeSpecName "kube-api-access-vxdq2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:28:38 crc kubenswrapper[4750]: I1008 18:28:38.133420 4750 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/344145bf-4ecf-4389-af5a-a48d4ecf13c2-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 08 18:28:38 crc kubenswrapper[4750]: I1008 18:28:38.133463 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/344145bf-4ecf-4389-af5a-a48d4ecf13c2-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 18:28:38 crc kubenswrapper[4750]: I1008 18:28:38.133481 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/344145bf-4ecf-4389-af5a-a48d4ecf13c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:28:38 crc kubenswrapper[4750]: I1008 18:28:38.133495 4750 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/344145bf-4ecf-4389-af5a-a48d4ecf13c2-dispersionconf\") on node \"crc\" DevicePath 
\"\"" Oct 08 18:28:38 crc kubenswrapper[4750]: I1008 18:28:38.133508 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxdq2\" (UniqueName: \"kubernetes.io/projected/344145bf-4ecf-4389-af5a-a48d4ecf13c2-kube-api-access-vxdq2\") on node \"crc\" DevicePath \"\"" Oct 08 18:28:38 crc kubenswrapper[4750]: I1008 18:28:38.133520 4750 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/344145bf-4ecf-4389-af5a-a48d4ecf13c2-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 08 18:28:38 crc kubenswrapper[4750]: I1008 18:28:38.133531 4750 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/344145bf-4ecf-4389-af5a-a48d4ecf13c2-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 08 18:28:38 crc kubenswrapper[4750]: I1008 18:28:38.152085 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-fnr22"] Oct 08 18:28:38 crc kubenswrapper[4750]: W1008 18:28:38.158771 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7ca8f71_a72e_4a6d_b839_5524cfb3d70b.slice/crio-28eb8d186c889b4179086a8294ceca8a974c84739f0ce52b0abb7bc9aafebca0 WatchSource:0}: Error finding container 28eb8d186c889b4179086a8294ceca8a974c84739f0ce52b0abb7bc9aafebca0: Status 404 returned error can't find the container with id 28eb8d186c889b4179086a8294ceca8a974c84739f0ce52b0abb7bc9aafebca0 Oct 08 18:28:38 crc kubenswrapper[4750]: I1008 18:28:38.538972 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/be62333a-e650-4131-b0f1-c8c484539c7e-etc-swift\") pod \"swift-storage-0\" (UID: \"be62333a-e650-4131-b0f1-c8c484539c7e\") " pod="openstack/swift-storage-0" Oct 08 18:28:38 crc kubenswrapper[4750]: E1008 18:28:38.539147 4750 projected.go:288] Couldn't get configMap 
openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 08 18:28:38 crc kubenswrapper[4750]: E1008 18:28:38.539703 4750 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 08 18:28:38 crc kubenswrapper[4750]: E1008 18:28:38.539865 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/be62333a-e650-4131-b0f1-c8c484539c7e-etc-swift podName:be62333a-e650-4131-b0f1-c8c484539c7e nodeName:}" failed. No retries permitted until 2025-10-08 18:28:40.539842522 +0000 UTC m=+1076.452813545 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/be62333a-e650-4131-b0f1-c8c484539c7e-etc-swift") pod "swift-storage-0" (UID: "be62333a-e650-4131-b0f1-c8c484539c7e") : configmap "swift-ring-files" not found Oct 08 18:28:38 crc kubenswrapper[4750]: I1008 18:28:38.743133 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c6270bb-d80a-45b1-bc51-44f1c920863e" path="/var/lib/kubelet/pods/6c6270bb-d80a-45b1-bc51-44f1c920863e/volumes" Oct 08 18:28:38 crc kubenswrapper[4750]: I1008 18:28:38.861579 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-fnr22" event={"ID":"b7ca8f71-a72e-4a6d-b839-5524cfb3d70b","Type":"ContainerStarted","Data":"28eb8d186c889b4179086a8294ceca8a974c84739f0ce52b0abb7bc9aafebca0"} Oct 08 18:28:38 crc kubenswrapper[4750]: I1008 18:28:38.864302 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-n2gsb" Oct 08 18:28:38 crc kubenswrapper[4750]: I1008 18:28:38.864303 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b587f8db7-btv56" event={"ID":"5e0538d0-2633-437d-b5e3-3a397000a601","Type":"ContainerStarted","Data":"ef619e03de0bd8aa5ccbf31154f9e5ff769e7e2aabe7c00ba20d3fc73e445cc8"} Oct 08 18:28:38 crc kubenswrapper[4750]: I1008 18:28:38.864417 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b587f8db7-btv56" Oct 08 18:28:38 crc kubenswrapper[4750]: I1008 18:28:38.894063 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b587f8db7-btv56" podStartSLOduration=3.894028739 podStartE2EDuration="3.894028739s" podCreationTimestamp="2025-10-08 18:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:28:38.891171119 +0000 UTC m=+1074.804142132" watchObservedRunningTime="2025-10-08 18:28:38.894028739 +0000 UTC m=+1074.806999762" Oct 08 18:28:38 crc kubenswrapper[4750]: I1008 18:28:38.924271 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-n2gsb"] Oct 08 18:28:38 crc kubenswrapper[4750]: I1008 18:28:38.928870 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-n2gsb"] Oct 08 18:28:39 crc kubenswrapper[4750]: I1008 18:28:39.149597 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-pbmmr"] Oct 08 18:28:39 crc kubenswrapper[4750]: I1008 18:28:39.150760 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-pbmmr" Oct 08 18:28:39 crc kubenswrapper[4750]: I1008 18:28:39.157735 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-pbmmr"] Oct 08 18:28:39 crc kubenswrapper[4750]: I1008 18:28:39.252717 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gmdf\" (UniqueName: \"kubernetes.io/projected/65af30d1-7ae5-485a-85d4-271a4642c2cf-kube-api-access-6gmdf\") pod \"glance-db-create-pbmmr\" (UID: \"65af30d1-7ae5-485a-85d4-271a4642c2cf\") " pod="openstack/glance-db-create-pbmmr" Oct 08 18:28:39 crc kubenswrapper[4750]: I1008 18:28:39.354098 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gmdf\" (UniqueName: \"kubernetes.io/projected/65af30d1-7ae5-485a-85d4-271a4642c2cf-kube-api-access-6gmdf\") pod \"glance-db-create-pbmmr\" (UID: \"65af30d1-7ae5-485a-85d4-271a4642c2cf\") " pod="openstack/glance-db-create-pbmmr" Oct 08 18:28:39 crc kubenswrapper[4750]: I1008 18:28:39.373373 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gmdf\" (UniqueName: \"kubernetes.io/projected/65af30d1-7ae5-485a-85d4-271a4642c2cf-kube-api-access-6gmdf\") pod \"glance-db-create-pbmmr\" (UID: \"65af30d1-7ae5-485a-85d4-271a4642c2cf\") " pod="openstack/glance-db-create-pbmmr" Oct 08 18:28:39 crc kubenswrapper[4750]: I1008 18:28:39.467481 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-pbmmr" Oct 08 18:28:39 crc kubenswrapper[4750]: I1008 18:28:39.871976 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-pbmmr"] Oct 08 18:28:39 crc kubenswrapper[4750]: W1008 18:28:39.879142 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65af30d1_7ae5_485a_85d4_271a4642c2cf.slice/crio-c71a1906bab5f8473f11caa024b81e3a2b053aa917214a8f557591c69f298181 WatchSource:0}: Error finding container c71a1906bab5f8473f11caa024b81e3a2b053aa917214a8f557591c69f298181: Status 404 returned error can't find the container with id c71a1906bab5f8473f11caa024b81e3a2b053aa917214a8f557591c69f298181 Oct 08 18:28:40 crc kubenswrapper[4750]: I1008 18:28:40.576095 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/be62333a-e650-4131-b0f1-c8c484539c7e-etc-swift\") pod \"swift-storage-0\" (UID: \"be62333a-e650-4131-b0f1-c8c484539c7e\") " pod="openstack/swift-storage-0" Oct 08 18:28:40 crc kubenswrapper[4750]: E1008 18:28:40.576242 4750 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 08 18:28:40 crc kubenswrapper[4750]: E1008 18:28:40.576257 4750 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 08 18:28:40 crc kubenswrapper[4750]: E1008 18:28:40.576299 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/be62333a-e650-4131-b0f1-c8c484539c7e-etc-swift podName:be62333a-e650-4131-b0f1-c8c484539c7e nodeName:}" failed. No retries permitted until 2025-10-08 18:28:44.576286674 +0000 UTC m=+1080.489257687 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/be62333a-e650-4131-b0f1-c8c484539c7e-etc-swift") pod "swift-storage-0" (UID: "be62333a-e650-4131-b0f1-c8c484539c7e") : configmap "swift-ring-files" not found Oct 08 18:28:40 crc kubenswrapper[4750]: I1008 18:28:40.576605 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-dc9d58d7-5h7nt" Oct 08 18:28:40 crc kubenswrapper[4750]: I1008 18:28:40.743077 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="344145bf-4ecf-4389-af5a-a48d4ecf13c2" path="/var/lib/kubelet/pods/344145bf-4ecf-4389-af5a-a48d4ecf13c2/volumes" Oct 08 18:28:40 crc kubenswrapper[4750]: I1008 18:28:40.885808 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-pbmmr" event={"ID":"65af30d1-7ae5-485a-85d4-271a4642c2cf","Type":"ContainerStarted","Data":"c71a1906bab5f8473f11caa024b81e3a2b053aa917214a8f557591c69f298181"} Oct 08 18:28:41 crc kubenswrapper[4750]: I1008 18:28:41.895956 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-pbmmr" event={"ID":"65af30d1-7ae5-485a-85d4-271a4642c2cf","Type":"ContainerStarted","Data":"44eb121f5a68d0a97c42240fc68cce4713a870335e4ddf0f8679f9aa32654ec7"} Oct 08 18:28:41 crc kubenswrapper[4750]: I1008 18:28:41.913692 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-pbmmr" podStartSLOduration=2.913674453 podStartE2EDuration="2.913674453s" podCreationTimestamp="2025-10-08 18:28:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:28:41.911813928 +0000 UTC m=+1077.824784941" watchObservedRunningTime="2025-10-08 18:28:41.913674453 +0000 UTC m=+1077.826645466" Oct 08 18:28:42 crc kubenswrapper[4750]: I1008 18:28:42.906996 4750 generic.go:334] "Generic (PLEG): container finished" 
podID="65af30d1-7ae5-485a-85d4-271a4642c2cf" containerID="44eb121f5a68d0a97c42240fc68cce4713a870335e4ddf0f8679f9aa32654ec7" exitCode=0 Oct 08 18:28:42 crc kubenswrapper[4750]: I1008 18:28:42.907086 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-pbmmr" event={"ID":"65af30d1-7ae5-485a-85d4-271a4642c2cf","Type":"ContainerDied","Data":"44eb121f5a68d0a97c42240fc68cce4713a870335e4ddf0f8679f9aa32654ec7"} Oct 08 18:28:43 crc kubenswrapper[4750]: I1008 18:28:43.716278 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-2d53-account-create-n4mq6"] Oct 08 18:28:43 crc kubenswrapper[4750]: I1008 18:28:43.718168 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2d53-account-create-n4mq6" Oct 08 18:28:43 crc kubenswrapper[4750]: I1008 18:28:43.720153 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 08 18:28:43 crc kubenswrapper[4750]: I1008 18:28:43.725654 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-2d53-account-create-n4mq6"] Oct 08 18:28:43 crc kubenswrapper[4750]: I1008 18:28:43.827794 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5bjz\" (UniqueName: \"kubernetes.io/projected/c121e821-4fcc-4ea5-9844-3fb345b952e9-kube-api-access-n5bjz\") pod \"keystone-2d53-account-create-n4mq6\" (UID: \"c121e821-4fcc-4ea5-9844-3fb345b952e9\") " pod="openstack/keystone-2d53-account-create-n4mq6" Oct 08 18:28:43 crc kubenswrapper[4750]: I1008 18:28:43.916413 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-9f1c-account-create-6n79f"] Oct 08 18:28:43 crc kubenswrapper[4750]: I1008 18:28:43.917931 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-9f1c-account-create-6n79f" Oct 08 18:28:43 crc kubenswrapper[4750]: I1008 18:28:43.920785 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 08 18:28:43 crc kubenswrapper[4750]: I1008 18:28:43.928820 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5bjz\" (UniqueName: \"kubernetes.io/projected/c121e821-4fcc-4ea5-9844-3fb345b952e9-kube-api-access-n5bjz\") pod \"keystone-2d53-account-create-n4mq6\" (UID: \"c121e821-4fcc-4ea5-9844-3fb345b952e9\") " pod="openstack/keystone-2d53-account-create-n4mq6" Oct 08 18:28:43 crc kubenswrapper[4750]: I1008 18:28:43.953737 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-9f1c-account-create-6n79f"] Oct 08 18:28:43 crc kubenswrapper[4750]: I1008 18:28:43.969775 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5bjz\" (UniqueName: \"kubernetes.io/projected/c121e821-4fcc-4ea5-9844-3fb345b952e9-kube-api-access-n5bjz\") pod \"keystone-2d53-account-create-n4mq6\" (UID: \"c121e821-4fcc-4ea5-9844-3fb345b952e9\") " pod="openstack/keystone-2d53-account-create-n4mq6" Oct 08 18:28:44 crc kubenswrapper[4750]: I1008 18:28:44.030882 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9grf\" (UniqueName: \"kubernetes.io/projected/d3678610-ef6c-49b3-a2e5-73d5431c1c4d-kube-api-access-m9grf\") pod \"placement-9f1c-account-create-6n79f\" (UID: \"d3678610-ef6c-49b3-a2e5-73d5431c1c4d\") " pod="openstack/placement-9f1c-account-create-6n79f" Oct 08 18:28:44 crc kubenswrapper[4750]: I1008 18:28:44.036398 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-2d53-account-create-n4mq6" Oct 08 18:28:44 crc kubenswrapper[4750]: I1008 18:28:44.132957 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9grf\" (UniqueName: \"kubernetes.io/projected/d3678610-ef6c-49b3-a2e5-73d5431c1c4d-kube-api-access-m9grf\") pod \"placement-9f1c-account-create-6n79f\" (UID: \"d3678610-ef6c-49b3-a2e5-73d5431c1c4d\") " pod="openstack/placement-9f1c-account-create-6n79f" Oct 08 18:28:44 crc kubenswrapper[4750]: I1008 18:28:44.156041 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9grf\" (UniqueName: \"kubernetes.io/projected/d3678610-ef6c-49b3-a2e5-73d5431c1c4d-kube-api-access-m9grf\") pod \"placement-9f1c-account-create-6n79f\" (UID: \"d3678610-ef6c-49b3-a2e5-73d5431c1c4d\") " pod="openstack/placement-9f1c-account-create-6n79f" Oct 08 18:28:44 crc kubenswrapper[4750]: I1008 18:28:44.269847 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-9f1c-account-create-6n79f" Oct 08 18:28:44 crc kubenswrapper[4750]: I1008 18:28:44.506454 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-pbmmr" Oct 08 18:28:44 crc kubenswrapper[4750]: I1008 18:28:44.640735 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gmdf\" (UniqueName: \"kubernetes.io/projected/65af30d1-7ae5-485a-85d4-271a4642c2cf-kube-api-access-6gmdf\") pod \"65af30d1-7ae5-485a-85d4-271a4642c2cf\" (UID: \"65af30d1-7ae5-485a-85d4-271a4642c2cf\") " Oct 08 18:28:44 crc kubenswrapper[4750]: I1008 18:28:44.641200 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/be62333a-e650-4131-b0f1-c8c484539c7e-etc-swift\") pod \"swift-storage-0\" (UID: \"be62333a-e650-4131-b0f1-c8c484539c7e\") " pod="openstack/swift-storage-0" Oct 08 18:28:44 crc kubenswrapper[4750]: E1008 18:28:44.641377 4750 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 08 18:28:44 crc kubenswrapper[4750]: E1008 18:28:44.641401 4750 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 08 18:28:44 crc kubenswrapper[4750]: E1008 18:28:44.641450 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/be62333a-e650-4131-b0f1-c8c484539c7e-etc-swift podName:be62333a-e650-4131-b0f1-c8c484539c7e nodeName:}" failed. No retries permitted until 2025-10-08 18:28:52.641433293 +0000 UTC m=+1088.554404306 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/be62333a-e650-4131-b0f1-c8c484539c7e-etc-swift") pod "swift-storage-0" (UID: "be62333a-e650-4131-b0f1-c8c484539c7e") : configmap "swift-ring-files" not found Oct 08 18:28:44 crc kubenswrapper[4750]: I1008 18:28:44.647080 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65af30d1-7ae5-485a-85d4-271a4642c2cf-kube-api-access-6gmdf" (OuterVolumeSpecName: "kube-api-access-6gmdf") pod "65af30d1-7ae5-485a-85d4-271a4642c2cf" (UID: "65af30d1-7ae5-485a-85d4-271a4642c2cf"). InnerVolumeSpecName "kube-api-access-6gmdf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:28:44 crc kubenswrapper[4750]: I1008 18:28:44.745521 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gmdf\" (UniqueName: \"kubernetes.io/projected/65af30d1-7ae5-485a-85d4-271a4642c2cf-kube-api-access-6gmdf\") on node \"crc\" DevicePath \"\"" Oct 08 18:28:44 crc kubenswrapper[4750]: I1008 18:28:44.867408 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-9f1c-account-create-6n79f"] Oct 08 18:28:44 crc kubenswrapper[4750]: W1008 18:28:44.873690 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3678610_ef6c_49b3_a2e5_73d5431c1c4d.slice/crio-9e8a1c23207134a14c29aa841ed7ce682a91db7739f6fd6f8ca5407f3abe9af7 WatchSource:0}: Error finding container 9e8a1c23207134a14c29aa841ed7ce682a91db7739f6fd6f8ca5407f3abe9af7: Status 404 returned error can't find the container with id 9e8a1c23207134a14c29aa841ed7ce682a91db7739f6fd6f8ca5407f3abe9af7 Oct 08 18:28:44 crc kubenswrapper[4750]: I1008 18:28:44.875135 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-2d53-account-create-n4mq6"] Oct 08 18:28:44 crc kubenswrapper[4750]: W1008 18:28:44.877237 4750 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc121e821_4fcc_4ea5_9844_3fb345b952e9.slice/crio-d27277089d4583debe0695c1ffa4a4ea1d180e7353e29015f1eef40e4c7a12c7 WatchSource:0}: Error finding container d27277089d4583debe0695c1ffa4a4ea1d180e7353e29015f1eef40e4c7a12c7: Status 404 returned error can't find the container with id d27277089d4583debe0695c1ffa4a4ea1d180e7353e29015f1eef40e4c7a12c7 Oct 08 18:28:44 crc kubenswrapper[4750]: I1008 18:28:44.878303 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 08 18:28:44 crc kubenswrapper[4750]: I1008 18:28:44.882362 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 08 18:28:44 crc kubenswrapper[4750]: I1008 18:28:44.962108 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-fnr22" event={"ID":"b7ca8f71-a72e-4a6d-b839-5524cfb3d70b","Type":"ContainerStarted","Data":"a7d155d2c3ef20e11a19f3b07465d37d9143db5dbca744ebcde32b355f9f1e8b"} Oct 08 18:28:44 crc kubenswrapper[4750]: I1008 18:28:44.968540 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-pbmmr" event={"ID":"65af30d1-7ae5-485a-85d4-271a4642c2cf","Type":"ContainerDied","Data":"c71a1906bab5f8473f11caa024b81e3a2b053aa917214a8f557591c69f298181"} Oct 08 18:28:44 crc kubenswrapper[4750]: I1008 18:28:44.968578 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-pbmmr" Oct 08 18:28:44 crc kubenswrapper[4750]: I1008 18:28:44.968605 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c71a1906bab5f8473f11caa024b81e3a2b053aa917214a8f557591c69f298181" Oct 08 18:28:44 crc kubenswrapper[4750]: I1008 18:28:44.971352 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2d53-account-create-n4mq6" event={"ID":"c121e821-4fcc-4ea5-9844-3fb345b952e9","Type":"ContainerStarted","Data":"d27277089d4583debe0695c1ffa4a4ea1d180e7353e29015f1eef40e4c7a12c7"} Oct 08 18:28:44 crc kubenswrapper[4750]: I1008 18:28:44.972456 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9f1c-account-create-6n79f" event={"ID":"d3678610-ef6c-49b3-a2e5-73d5431c1c4d","Type":"ContainerStarted","Data":"9e8a1c23207134a14c29aa841ed7ce682a91db7739f6fd6f8ca5407f3abe9af7"} Oct 08 18:28:44 crc kubenswrapper[4750]: I1008 18:28:44.987881 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-fnr22" podStartSLOduration=1.708436307 podStartE2EDuration="7.987858649s" podCreationTimestamp="2025-10-08 18:28:37 +0000 UTC" firstStartedPulling="2025-10-08 18:28:38.161608886 +0000 UTC m=+1074.074579909" lastFinishedPulling="2025-10-08 18:28:44.441031238 +0000 UTC m=+1080.354002251" observedRunningTime="2025-10-08 18:28:44.983357289 +0000 UTC m=+1080.896328322" watchObservedRunningTime="2025-10-08 18:28:44.987858649 +0000 UTC m=+1080.900829652" Oct 08 18:28:45 crc kubenswrapper[4750]: I1008 18:28:45.668571 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 08 18:28:45 crc kubenswrapper[4750]: I1008 18:28:45.981107 4750 generic.go:334] "Generic (PLEG): container finished" podID="c121e821-4fcc-4ea5-9844-3fb345b952e9" containerID="48a37bdd5977b748b1a035850954139e3743138dd89b2a624758f9b8e7aaa270" exitCode=0 Oct 08 18:28:45 crc 
kubenswrapper[4750]: I1008 18:28:45.981462 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2d53-account-create-n4mq6" event={"ID":"c121e821-4fcc-4ea5-9844-3fb345b952e9","Type":"ContainerDied","Data":"48a37bdd5977b748b1a035850954139e3743138dd89b2a624758f9b8e7aaa270"} Oct 08 18:28:45 crc kubenswrapper[4750]: I1008 18:28:45.983434 4750 generic.go:334] "Generic (PLEG): container finished" podID="d3678610-ef6c-49b3-a2e5-73d5431c1c4d" containerID="76705f4c8d0d963f55e04b2fb8a57f860f108ccd02fe957f52712f5bcd141748" exitCode=0 Oct 08 18:28:45 crc kubenswrapper[4750]: I1008 18:28:45.984171 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9f1c-account-create-6n79f" event={"ID":"d3678610-ef6c-49b3-a2e5-73d5431c1c4d","Type":"ContainerDied","Data":"76705f4c8d0d963f55e04b2fb8a57f860f108ccd02fe957f52712f5bcd141748"} Oct 08 18:28:46 crc kubenswrapper[4750]: I1008 18:28:46.071318 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b587f8db7-btv56" Oct 08 18:28:46 crc kubenswrapper[4750]: I1008 18:28:46.137884 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dc9d58d7-5h7nt"] Oct 08 18:28:46 crc kubenswrapper[4750]: I1008 18:28:46.138157 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-dc9d58d7-5h7nt" podUID="cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd" containerName="dnsmasq-dns" containerID="cri-o://dcce81343d71fda6f550b7284c38e3270c1494635cb276a8ed8f9fc53d7536e1" gracePeriod=10 Oct 08 18:28:46 crc kubenswrapper[4750]: I1008 18:28:46.591945 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-dc9d58d7-5h7nt" Oct 08 18:28:46 crc kubenswrapper[4750]: I1008 18:28:46.676778 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd-dns-svc\") pod \"cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd\" (UID: \"cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd\") " Oct 08 18:28:46 crc kubenswrapper[4750]: I1008 18:28:46.677000 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd-ovsdbserver-nb\") pod \"cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd\" (UID: \"cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd\") " Oct 08 18:28:46 crc kubenswrapper[4750]: I1008 18:28:46.677043 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k445j\" (UniqueName: \"kubernetes.io/projected/cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd-kube-api-access-k445j\") pod \"cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd\" (UID: \"cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd\") " Oct 08 18:28:46 crc kubenswrapper[4750]: I1008 18:28:46.677165 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd-ovsdbserver-sb\") pod \"cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd\" (UID: \"cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd\") " Oct 08 18:28:46 crc kubenswrapper[4750]: I1008 18:28:46.677187 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd-config\") pod \"cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd\" (UID: \"cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd\") " Oct 08 18:28:46 crc kubenswrapper[4750]: I1008 18:28:46.685695 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd-kube-api-access-k445j" (OuterVolumeSpecName: "kube-api-access-k445j") pod "cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd" (UID: "cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd"). InnerVolumeSpecName "kube-api-access-k445j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:28:46 crc kubenswrapper[4750]: I1008 18:28:46.713628 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd" (UID: "cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:28:46 crc kubenswrapper[4750]: I1008 18:28:46.716485 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd-config" (OuterVolumeSpecName: "config") pod "cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd" (UID: "cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:28:46 crc kubenswrapper[4750]: I1008 18:28:46.720989 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd" (UID: "cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:28:46 crc kubenswrapper[4750]: I1008 18:28:46.727312 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd" (UID: "cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:28:46 crc kubenswrapper[4750]: I1008 18:28:46.778715 4750 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 18:28:46 crc kubenswrapper[4750]: I1008 18:28:46.778749 4750 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 18:28:46 crc kubenswrapper[4750]: I1008 18:28:46.778760 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k445j\" (UniqueName: \"kubernetes.io/projected/cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd-kube-api-access-k445j\") on node \"crc\" DevicePath \"\"" Oct 08 18:28:46 crc kubenswrapper[4750]: I1008 18:28:46.778770 4750 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 18:28:46 crc kubenswrapper[4750]: I1008 18:28:46.778778 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:28:46 crc kubenswrapper[4750]: I1008 18:28:46.991611 4750 generic.go:334] "Generic (PLEG): container finished" podID="cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd" containerID="dcce81343d71fda6f550b7284c38e3270c1494635cb276a8ed8f9fc53d7536e1" exitCode=0 Oct 08 18:28:46 crc kubenswrapper[4750]: I1008 18:28:46.991790 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-dc9d58d7-5h7nt" Oct 08 18:28:46 crc kubenswrapper[4750]: I1008 18:28:46.992380 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dc9d58d7-5h7nt" event={"ID":"cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd","Type":"ContainerDied","Data":"dcce81343d71fda6f550b7284c38e3270c1494635cb276a8ed8f9fc53d7536e1"} Oct 08 18:28:46 crc kubenswrapper[4750]: I1008 18:28:46.992405 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dc9d58d7-5h7nt" event={"ID":"cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd","Type":"ContainerDied","Data":"05c7a8628fa174dfb5c12954721761f57d82a6f37af2c9ca2f2c3b10b11c8ffe"} Oct 08 18:28:46 crc kubenswrapper[4750]: I1008 18:28:46.992420 4750 scope.go:117] "RemoveContainer" containerID="dcce81343d71fda6f550b7284c38e3270c1494635cb276a8ed8f9fc53d7536e1" Oct 08 18:28:47 crc kubenswrapper[4750]: I1008 18:28:47.014476 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dc9d58d7-5h7nt"] Oct 08 18:28:47 crc kubenswrapper[4750]: I1008 18:28:47.018681 4750 scope.go:117] "RemoveContainer" containerID="e6abeee43df4d68657ce1169ed98a35dc66805a2c36a59daad5eaa8caaac57cb" Oct 08 18:28:47 crc kubenswrapper[4750]: I1008 18:28:47.022378 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-dc9d58d7-5h7nt"] Oct 08 18:28:47 crc kubenswrapper[4750]: I1008 18:28:47.037821 4750 scope.go:117] "RemoveContainer" containerID="dcce81343d71fda6f550b7284c38e3270c1494635cb276a8ed8f9fc53d7536e1" Oct 08 18:28:47 crc kubenswrapper[4750]: E1008 18:28:47.038378 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcce81343d71fda6f550b7284c38e3270c1494635cb276a8ed8f9fc53d7536e1\": container with ID starting with dcce81343d71fda6f550b7284c38e3270c1494635cb276a8ed8f9fc53d7536e1 not found: ID does not exist" 
containerID="dcce81343d71fda6f550b7284c38e3270c1494635cb276a8ed8f9fc53d7536e1" Oct 08 18:28:47 crc kubenswrapper[4750]: I1008 18:28:47.038427 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcce81343d71fda6f550b7284c38e3270c1494635cb276a8ed8f9fc53d7536e1"} err="failed to get container status \"dcce81343d71fda6f550b7284c38e3270c1494635cb276a8ed8f9fc53d7536e1\": rpc error: code = NotFound desc = could not find container \"dcce81343d71fda6f550b7284c38e3270c1494635cb276a8ed8f9fc53d7536e1\": container with ID starting with dcce81343d71fda6f550b7284c38e3270c1494635cb276a8ed8f9fc53d7536e1 not found: ID does not exist" Oct 08 18:28:47 crc kubenswrapper[4750]: I1008 18:28:47.038456 4750 scope.go:117] "RemoveContainer" containerID="e6abeee43df4d68657ce1169ed98a35dc66805a2c36a59daad5eaa8caaac57cb" Oct 08 18:28:47 crc kubenswrapper[4750]: E1008 18:28:47.040142 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6abeee43df4d68657ce1169ed98a35dc66805a2c36a59daad5eaa8caaac57cb\": container with ID starting with e6abeee43df4d68657ce1169ed98a35dc66805a2c36a59daad5eaa8caaac57cb not found: ID does not exist" containerID="e6abeee43df4d68657ce1169ed98a35dc66805a2c36a59daad5eaa8caaac57cb" Oct 08 18:28:47 crc kubenswrapper[4750]: I1008 18:28:47.040180 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6abeee43df4d68657ce1169ed98a35dc66805a2c36a59daad5eaa8caaac57cb"} err="failed to get container status \"e6abeee43df4d68657ce1169ed98a35dc66805a2c36a59daad5eaa8caaac57cb\": rpc error: code = NotFound desc = could not find container \"e6abeee43df4d68657ce1169ed98a35dc66805a2c36a59daad5eaa8caaac57cb\": container with ID starting with e6abeee43df4d68657ce1169ed98a35dc66805a2c36a59daad5eaa8caaac57cb not found: ID does not exist" Oct 08 18:28:47 crc kubenswrapper[4750]: I1008 18:28:47.384484 4750 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2d53-account-create-n4mq6" Oct 08 18:28:47 crc kubenswrapper[4750]: I1008 18:28:47.391829 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-9f1c-account-create-6n79f" Oct 08 18:28:47 crc kubenswrapper[4750]: I1008 18:28:47.490187 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5bjz\" (UniqueName: \"kubernetes.io/projected/c121e821-4fcc-4ea5-9844-3fb345b952e9-kube-api-access-n5bjz\") pod \"c121e821-4fcc-4ea5-9844-3fb345b952e9\" (UID: \"c121e821-4fcc-4ea5-9844-3fb345b952e9\") " Oct 08 18:28:47 crc kubenswrapper[4750]: I1008 18:28:47.490354 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9grf\" (UniqueName: \"kubernetes.io/projected/d3678610-ef6c-49b3-a2e5-73d5431c1c4d-kube-api-access-m9grf\") pod \"d3678610-ef6c-49b3-a2e5-73d5431c1c4d\" (UID: \"d3678610-ef6c-49b3-a2e5-73d5431c1c4d\") " Oct 08 18:28:47 crc kubenswrapper[4750]: I1008 18:28:47.497753 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3678610-ef6c-49b3-a2e5-73d5431c1c4d-kube-api-access-m9grf" (OuterVolumeSpecName: "kube-api-access-m9grf") pod "d3678610-ef6c-49b3-a2e5-73d5431c1c4d" (UID: "d3678610-ef6c-49b3-a2e5-73d5431c1c4d"). InnerVolumeSpecName "kube-api-access-m9grf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:28:47 crc kubenswrapper[4750]: I1008 18:28:47.498490 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c121e821-4fcc-4ea5-9844-3fb345b952e9-kube-api-access-n5bjz" (OuterVolumeSpecName: "kube-api-access-n5bjz") pod "c121e821-4fcc-4ea5-9844-3fb345b952e9" (UID: "c121e821-4fcc-4ea5-9844-3fb345b952e9"). InnerVolumeSpecName "kube-api-access-n5bjz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:28:47 crc kubenswrapper[4750]: I1008 18:28:47.592135 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9grf\" (UniqueName: \"kubernetes.io/projected/d3678610-ef6c-49b3-a2e5-73d5431c1c4d-kube-api-access-m9grf\") on node \"crc\" DevicePath \"\"" Oct 08 18:28:47 crc kubenswrapper[4750]: I1008 18:28:47.592171 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5bjz\" (UniqueName: \"kubernetes.io/projected/c121e821-4fcc-4ea5-9844-3fb345b952e9-kube-api-access-n5bjz\") on node \"crc\" DevicePath \"\"" Oct 08 18:28:48 crc kubenswrapper[4750]: I1008 18:28:48.000877 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-9f1c-account-create-6n79f" Oct 08 18:28:48 crc kubenswrapper[4750]: I1008 18:28:48.007636 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9f1c-account-create-6n79f" event={"ID":"d3678610-ef6c-49b3-a2e5-73d5431c1c4d","Type":"ContainerDied","Data":"9e8a1c23207134a14c29aa841ed7ce682a91db7739f6fd6f8ca5407f3abe9af7"} Oct 08 18:28:48 crc kubenswrapper[4750]: I1008 18:28:48.007719 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e8a1c23207134a14c29aa841ed7ce682a91db7739f6fd6f8ca5407f3abe9af7" Oct 08 18:28:48 crc kubenswrapper[4750]: I1008 18:28:48.009730 4750 generic.go:334] "Generic (PLEG): container finished" podID="43a52313-747b-40a7-a7e0-9e18f3c97c42" containerID="fb11e627f4eaf60a3b819ad8d9b435832ac804f82f9736be30305c3e47ac4565" exitCode=0 Oct 08 18:28:48 crc kubenswrapper[4750]: I1008 18:28:48.009829 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"43a52313-747b-40a7-a7e0-9e18f3c97c42","Type":"ContainerDied","Data":"fb11e627f4eaf60a3b819ad8d9b435832ac804f82f9736be30305c3e47ac4565"} Oct 08 18:28:48 crc kubenswrapper[4750]: I1008 18:28:48.021611 4750 
generic.go:334] "Generic (PLEG): container finished" podID="5b8108eb-834c-44bd-9f39-70c348388ab6" containerID="40ab5b6bde85f8c7024b6c550a72f006a95cf43289623b823b0fcdb2413a3ef2" exitCode=0 Oct 08 18:28:48 crc kubenswrapper[4750]: I1008 18:28:48.021675 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5b8108eb-834c-44bd-9f39-70c348388ab6","Type":"ContainerDied","Data":"40ab5b6bde85f8c7024b6c550a72f006a95cf43289623b823b0fcdb2413a3ef2"} Oct 08 18:28:48 crc kubenswrapper[4750]: I1008 18:28:48.041460 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2d53-account-create-n4mq6" event={"ID":"c121e821-4fcc-4ea5-9844-3fb345b952e9","Type":"ContainerDied","Data":"d27277089d4583debe0695c1ffa4a4ea1d180e7353e29015f1eef40e4c7a12c7"} Oct 08 18:28:48 crc kubenswrapper[4750]: I1008 18:28:48.041508 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d27277089d4583debe0695c1ffa4a4ea1d180e7353e29015f1eef40e4c7a12c7" Oct 08 18:28:48 crc kubenswrapper[4750]: I1008 18:28:48.041596 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-2d53-account-create-n4mq6" Oct 08 18:28:48 crc kubenswrapper[4750]: I1008 18:28:48.745258 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd" path="/var/lib/kubelet/pods/cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd/volumes" Oct 08 18:28:48 crc kubenswrapper[4750]: E1008 18:28:48.792871 4750 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65af30d1_7ae5_485a_85d4_271a4642c2cf.slice\": RecentStats: unable to find data in memory cache]" Oct 08 18:28:49 crc kubenswrapper[4750]: I1008 18:28:49.049868 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"43a52313-747b-40a7-a7e0-9e18f3c97c42","Type":"ContainerStarted","Data":"0d4d831252f413ac3fb7facf9f1b774f0cfb6c7ac211742c0a01794b3473bbbd"} Oct 08 18:28:49 crc kubenswrapper[4750]: I1008 18:28:49.050044 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 08 18:28:49 crc kubenswrapper[4750]: I1008 18:28:49.052013 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5b8108eb-834c-44bd-9f39-70c348388ab6","Type":"ContainerStarted","Data":"521a7eb18f85bcac1d99fd7991c03287aa5ae554a583b99a06a05b681ce4be21"} Oct 08 18:28:49 crc kubenswrapper[4750]: I1008 18:28:49.052195 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 08 18:28:49 crc kubenswrapper[4750]: I1008 18:28:49.072197 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.520879304 podStartE2EDuration="1m10.072181779s" podCreationTimestamp="2025-10-08 18:27:39 +0000 UTC" firstStartedPulling="2025-10-08 18:27:41.520510285 +0000 UTC m=+1017.433481298" 
lastFinishedPulling="2025-10-08 18:28:14.07181276 +0000 UTC m=+1049.984783773" observedRunningTime="2025-10-08 18:28:49.067991066 +0000 UTC m=+1084.980962119" watchObservedRunningTime="2025-10-08 18:28:49.072181779 +0000 UTC m=+1084.985152792" Oct 08 18:28:49 crc kubenswrapper[4750]: I1008 18:28:49.094313 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.94335349 podStartE2EDuration="1m11.094289681s" podCreationTimestamp="2025-10-08 18:27:38 +0000 UTC" firstStartedPulling="2025-10-08 18:27:40.538807589 +0000 UTC m=+1016.451778602" lastFinishedPulling="2025-10-08 18:28:13.68974377 +0000 UTC m=+1049.602714793" observedRunningTime="2025-10-08 18:28:49.089940015 +0000 UTC m=+1085.002911038" watchObservedRunningTime="2025-10-08 18:28:49.094289681 +0000 UTC m=+1085.007260694" Oct 08 18:28:52 crc kubenswrapper[4750]: I1008 18:28:52.676248 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/be62333a-e650-4131-b0f1-c8c484539c7e-etc-swift\") pod \"swift-storage-0\" (UID: \"be62333a-e650-4131-b0f1-c8c484539c7e\") " pod="openstack/swift-storage-0" Oct 08 18:28:52 crc kubenswrapper[4750]: I1008 18:28:52.687158 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/be62333a-e650-4131-b0f1-c8c484539c7e-etc-swift\") pod \"swift-storage-0\" (UID: \"be62333a-e650-4131-b0f1-c8c484539c7e\") " pod="openstack/swift-storage-0" Oct 08 18:28:52 crc kubenswrapper[4750]: I1008 18:28:52.740532 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 08 18:28:53 crc kubenswrapper[4750]: I1008 18:28:53.080965 4750 generic.go:334] "Generic (PLEG): container finished" podID="b7ca8f71-a72e-4a6d-b839-5524cfb3d70b" containerID="a7d155d2c3ef20e11a19f3b07465d37d9143db5dbca744ebcde32b355f9f1e8b" exitCode=0 Oct 08 18:28:53 crc kubenswrapper[4750]: I1008 18:28:53.081081 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-fnr22" event={"ID":"b7ca8f71-a72e-4a6d-b839-5524cfb3d70b","Type":"ContainerDied","Data":"a7d155d2c3ef20e11a19f3b07465d37d9143db5dbca744ebcde32b355f9f1e8b"} Oct 08 18:28:53 crc kubenswrapper[4750]: I1008 18:28:53.325898 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 08 18:28:53 crc kubenswrapper[4750]: W1008 18:28:53.337234 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe62333a_e650_4131_b0f1_c8c484539c7e.slice/crio-7c788d277c81526723ebafd7ec3acb36bcb8389596e6fee1747442ab8452d06a WatchSource:0}: Error finding container 7c788d277c81526723ebafd7ec3acb36bcb8389596e6fee1747442ab8452d06a: Status 404 returned error can't find the container with id 7c788d277c81526723ebafd7ec3acb36bcb8389596e6fee1747442ab8452d06a Oct 08 18:28:54 crc kubenswrapper[4750]: I1008 18:28:54.088971 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be62333a-e650-4131-b0f1-c8c484539c7e","Type":"ContainerStarted","Data":"7c788d277c81526723ebafd7ec3acb36bcb8389596e6fee1747442ab8452d06a"} Oct 08 18:28:54 crc kubenswrapper[4750]: I1008 18:28:54.412412 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-fnr22" Oct 08 18:28:54 crc kubenswrapper[4750]: I1008 18:28:54.517817 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b7ca8f71-a72e-4a6d-b839-5524cfb3d70b-ring-data-devices\") pod \"b7ca8f71-a72e-4a6d-b839-5524cfb3d70b\" (UID: \"b7ca8f71-a72e-4a6d-b839-5524cfb3d70b\") " Oct 08 18:28:54 crc kubenswrapper[4750]: I1008 18:28:54.517894 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b7ca8f71-a72e-4a6d-b839-5524cfb3d70b-etc-swift\") pod \"b7ca8f71-a72e-4a6d-b839-5524cfb3d70b\" (UID: \"b7ca8f71-a72e-4a6d-b839-5524cfb3d70b\") " Oct 08 18:28:54 crc kubenswrapper[4750]: I1008 18:28:54.517992 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pm6b\" (UniqueName: \"kubernetes.io/projected/b7ca8f71-a72e-4a6d-b839-5524cfb3d70b-kube-api-access-5pm6b\") pod \"b7ca8f71-a72e-4a6d-b839-5524cfb3d70b\" (UID: \"b7ca8f71-a72e-4a6d-b839-5524cfb3d70b\") " Oct 08 18:28:54 crc kubenswrapper[4750]: I1008 18:28:54.518023 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b7ca8f71-a72e-4a6d-b839-5524cfb3d70b-swiftconf\") pod \"b7ca8f71-a72e-4a6d-b839-5524cfb3d70b\" (UID: \"b7ca8f71-a72e-4a6d-b839-5524cfb3d70b\") " Oct 08 18:28:54 crc kubenswrapper[4750]: I1008 18:28:54.518054 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7ca8f71-a72e-4a6d-b839-5524cfb3d70b-combined-ca-bundle\") pod \"b7ca8f71-a72e-4a6d-b839-5524cfb3d70b\" (UID: \"b7ca8f71-a72e-4a6d-b839-5524cfb3d70b\") " Oct 08 18:28:54 crc kubenswrapper[4750]: I1008 18:28:54.518137 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b7ca8f71-a72e-4a6d-b839-5524cfb3d70b-dispersionconf\") pod \"b7ca8f71-a72e-4a6d-b839-5524cfb3d70b\" (UID: \"b7ca8f71-a72e-4a6d-b839-5524cfb3d70b\") " Oct 08 18:28:54 crc kubenswrapper[4750]: I1008 18:28:54.518209 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7ca8f71-a72e-4a6d-b839-5524cfb3d70b-scripts\") pod \"b7ca8f71-a72e-4a6d-b839-5524cfb3d70b\" (UID: \"b7ca8f71-a72e-4a6d-b839-5524cfb3d70b\") " Oct 08 18:28:54 crc kubenswrapper[4750]: I1008 18:28:54.519912 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7ca8f71-a72e-4a6d-b839-5524cfb3d70b-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "b7ca8f71-a72e-4a6d-b839-5524cfb3d70b" (UID: "b7ca8f71-a72e-4a6d-b839-5524cfb3d70b"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:28:54 crc kubenswrapper[4750]: I1008 18:28:54.520610 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7ca8f71-a72e-4a6d-b839-5524cfb3d70b-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b7ca8f71-a72e-4a6d-b839-5524cfb3d70b" (UID: "b7ca8f71-a72e-4a6d-b839-5524cfb3d70b"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:28:54 crc kubenswrapper[4750]: I1008 18:28:54.528373 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7ca8f71-a72e-4a6d-b839-5524cfb3d70b-kube-api-access-5pm6b" (OuterVolumeSpecName: "kube-api-access-5pm6b") pod "b7ca8f71-a72e-4a6d-b839-5524cfb3d70b" (UID: "b7ca8f71-a72e-4a6d-b839-5524cfb3d70b"). InnerVolumeSpecName "kube-api-access-5pm6b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:28:54 crc kubenswrapper[4750]: I1008 18:28:54.531396 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7ca8f71-a72e-4a6d-b839-5524cfb3d70b-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "b7ca8f71-a72e-4a6d-b839-5524cfb3d70b" (UID: "b7ca8f71-a72e-4a6d-b839-5524cfb3d70b"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:28:54 crc kubenswrapper[4750]: I1008 18:28:54.563040 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7ca8f71-a72e-4a6d-b839-5524cfb3d70b-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "b7ca8f71-a72e-4a6d-b839-5524cfb3d70b" (UID: "b7ca8f71-a72e-4a6d-b839-5524cfb3d70b"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:28:54 crc kubenswrapper[4750]: I1008 18:28:54.564487 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7ca8f71-a72e-4a6d-b839-5524cfb3d70b-scripts" (OuterVolumeSpecName: "scripts") pod "b7ca8f71-a72e-4a6d-b839-5524cfb3d70b" (UID: "b7ca8f71-a72e-4a6d-b839-5524cfb3d70b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:28:54 crc kubenswrapper[4750]: I1008 18:28:54.576766 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7ca8f71-a72e-4a6d-b839-5524cfb3d70b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7ca8f71-a72e-4a6d-b839-5524cfb3d70b" (UID: "b7ca8f71-a72e-4a6d-b839-5524cfb3d70b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:28:54 crc kubenswrapper[4750]: I1008 18:28:54.620384 4750 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b7ca8f71-a72e-4a6d-b839-5524cfb3d70b-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 08 18:28:54 crc kubenswrapper[4750]: I1008 18:28:54.620449 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7ca8f71-a72e-4a6d-b839-5524cfb3d70b-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 18:28:54 crc kubenswrapper[4750]: I1008 18:28:54.620479 4750 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b7ca8f71-a72e-4a6d-b839-5524cfb3d70b-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 08 18:28:54 crc kubenswrapper[4750]: I1008 18:28:54.620494 4750 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b7ca8f71-a72e-4a6d-b839-5524cfb3d70b-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 08 18:28:54 crc kubenswrapper[4750]: I1008 18:28:54.620504 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pm6b\" (UniqueName: \"kubernetes.io/projected/b7ca8f71-a72e-4a6d-b839-5524cfb3d70b-kube-api-access-5pm6b\") on node \"crc\" DevicePath \"\"" Oct 08 18:28:54 crc kubenswrapper[4750]: I1008 18:28:54.620517 4750 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b7ca8f71-a72e-4a6d-b839-5524cfb3d70b-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 08 18:28:54 crc kubenswrapper[4750]: I1008 18:28:54.620525 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7ca8f71-a72e-4a6d-b839-5524cfb3d70b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:28:55 crc kubenswrapper[4750]: I1008 18:28:55.096444 4750 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-fnr22" event={"ID":"b7ca8f71-a72e-4a6d-b839-5524cfb3d70b","Type":"ContainerDied","Data":"28eb8d186c889b4179086a8294ceca8a974c84739f0ce52b0abb7bc9aafebca0"} Oct 08 18:28:55 crc kubenswrapper[4750]: I1008 18:28:55.096486 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28eb8d186c889b4179086a8294ceca8a974c84739f0ce52b0abb7bc9aafebca0" Oct 08 18:28:55 crc kubenswrapper[4750]: I1008 18:28:55.096488 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-fnr22" Oct 08 18:28:55 crc kubenswrapper[4750]: I1008 18:28:55.098668 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be62333a-e650-4131-b0f1-c8c484539c7e","Type":"ContainerStarted","Data":"52f4f1b7fcee564a61f7c59907f9f37ce855f6b9ccb07d9b5d4dea7ac7e8c754"} Oct 08 18:28:55 crc kubenswrapper[4750]: I1008 18:28:55.098695 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be62333a-e650-4131-b0f1-c8c484539c7e","Type":"ContainerStarted","Data":"db381222476cb51a6b70b126857ee5e5eff04de66df2645f6737aa6584b15305"} Oct 08 18:28:55 crc kubenswrapper[4750]: I1008 18:28:55.098706 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be62333a-e650-4131-b0f1-c8c484539c7e","Type":"ContainerStarted","Data":"d6cdce1fc3c750a32568c75f5f41fd036fc0c6d012614ac48dd3b89a28bddb19"} Oct 08 18:28:55 crc kubenswrapper[4750]: I1008 18:28:55.463747 4750 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-mkxdr" podUID="cc808a1a-9703-4009-8d81-e555a8e25929" containerName="ovn-controller" probeResult="failure" output=< Oct 08 18:28:55 crc kubenswrapper[4750]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 08 18:28:55 crc kubenswrapper[4750]: > Oct 08 18:28:55 crc 
kubenswrapper[4750]: I1008 18:28:55.521949 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-57vgx" Oct 08 18:28:55 crc kubenswrapper[4750]: I1008 18:28:55.530196 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-57vgx" Oct 08 18:28:55 crc kubenswrapper[4750]: I1008 18:28:55.758782 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-mkxdr-config-5gfdq"] Oct 08 18:28:55 crc kubenswrapper[4750]: E1008 18:28:55.759162 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7ca8f71-a72e-4a6d-b839-5524cfb3d70b" containerName="swift-ring-rebalance" Oct 08 18:28:55 crc kubenswrapper[4750]: I1008 18:28:55.759183 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7ca8f71-a72e-4a6d-b839-5524cfb3d70b" containerName="swift-ring-rebalance" Oct 08 18:28:55 crc kubenswrapper[4750]: E1008 18:28:55.759195 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65af30d1-7ae5-485a-85d4-271a4642c2cf" containerName="mariadb-database-create" Oct 08 18:28:55 crc kubenswrapper[4750]: I1008 18:28:55.759203 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="65af30d1-7ae5-485a-85d4-271a4642c2cf" containerName="mariadb-database-create" Oct 08 18:28:55 crc kubenswrapper[4750]: E1008 18:28:55.759220 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd" containerName="dnsmasq-dns" Oct 08 18:28:55 crc kubenswrapper[4750]: I1008 18:28:55.759227 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd" containerName="dnsmasq-dns" Oct 08 18:28:55 crc kubenswrapper[4750]: E1008 18:28:55.759244 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd" containerName="init" Oct 08 18:28:55 crc kubenswrapper[4750]: I1008 18:28:55.759249 4750 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd" containerName="init" Oct 08 18:28:55 crc kubenswrapper[4750]: E1008 18:28:55.759268 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3678610-ef6c-49b3-a2e5-73d5431c1c4d" containerName="mariadb-account-create" Oct 08 18:28:55 crc kubenswrapper[4750]: I1008 18:28:55.759274 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3678610-ef6c-49b3-a2e5-73d5431c1c4d" containerName="mariadb-account-create" Oct 08 18:28:55 crc kubenswrapper[4750]: E1008 18:28:55.759289 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c121e821-4fcc-4ea5-9844-3fb345b952e9" containerName="mariadb-account-create" Oct 08 18:28:55 crc kubenswrapper[4750]: I1008 18:28:55.759297 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="c121e821-4fcc-4ea5-9844-3fb345b952e9" containerName="mariadb-account-create" Oct 08 18:28:55 crc kubenswrapper[4750]: I1008 18:28:55.759462 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="65af30d1-7ae5-485a-85d4-271a4642c2cf" containerName="mariadb-database-create" Oct 08 18:28:55 crc kubenswrapper[4750]: I1008 18:28:55.759534 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3678610-ef6c-49b3-a2e5-73d5431c1c4d" containerName="mariadb-account-create" Oct 08 18:28:55 crc kubenswrapper[4750]: I1008 18:28:55.759571 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd42d4ae-ae85-4f2b-bcc3-57df8300c5bd" containerName="dnsmasq-dns" Oct 08 18:28:55 crc kubenswrapper[4750]: I1008 18:28:55.759593 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="c121e821-4fcc-4ea5-9844-3fb345b952e9" containerName="mariadb-account-create" Oct 08 18:28:55 crc kubenswrapper[4750]: I1008 18:28:55.759607 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7ca8f71-a72e-4a6d-b839-5524cfb3d70b" containerName="swift-ring-rebalance" Oct 08 18:28:55 crc kubenswrapper[4750]: I1008 
18:28:55.760109 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mkxdr-config-5gfdq" Oct 08 18:28:55 crc kubenswrapper[4750]: I1008 18:28:55.762423 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 08 18:28:55 crc kubenswrapper[4750]: I1008 18:28:55.776064 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mkxdr-config-5gfdq"] Oct 08 18:28:55 crc kubenswrapper[4750]: I1008 18:28:55.845892 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0a236218-03eb-44be-8bcd-41838e2f2fa1-var-log-ovn\") pod \"ovn-controller-mkxdr-config-5gfdq\" (UID: \"0a236218-03eb-44be-8bcd-41838e2f2fa1\") " pod="openstack/ovn-controller-mkxdr-config-5gfdq" Oct 08 18:28:55 crc kubenswrapper[4750]: I1008 18:28:55.845932 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0a236218-03eb-44be-8bcd-41838e2f2fa1-additional-scripts\") pod \"ovn-controller-mkxdr-config-5gfdq\" (UID: \"0a236218-03eb-44be-8bcd-41838e2f2fa1\") " pod="openstack/ovn-controller-mkxdr-config-5gfdq" Oct 08 18:28:55 crc kubenswrapper[4750]: I1008 18:28:55.845976 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzzxv\" (UniqueName: \"kubernetes.io/projected/0a236218-03eb-44be-8bcd-41838e2f2fa1-kube-api-access-kzzxv\") pod \"ovn-controller-mkxdr-config-5gfdq\" (UID: \"0a236218-03eb-44be-8bcd-41838e2f2fa1\") " pod="openstack/ovn-controller-mkxdr-config-5gfdq" Oct 08 18:28:55 crc kubenswrapper[4750]: I1008 18:28:55.845997 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/0a236218-03eb-44be-8bcd-41838e2f2fa1-var-run\") pod \"ovn-controller-mkxdr-config-5gfdq\" (UID: \"0a236218-03eb-44be-8bcd-41838e2f2fa1\") " pod="openstack/ovn-controller-mkxdr-config-5gfdq" Oct 08 18:28:55 crc kubenswrapper[4750]: I1008 18:28:55.846019 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0a236218-03eb-44be-8bcd-41838e2f2fa1-var-run-ovn\") pod \"ovn-controller-mkxdr-config-5gfdq\" (UID: \"0a236218-03eb-44be-8bcd-41838e2f2fa1\") " pod="openstack/ovn-controller-mkxdr-config-5gfdq" Oct 08 18:28:55 crc kubenswrapper[4750]: I1008 18:28:55.846126 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0a236218-03eb-44be-8bcd-41838e2f2fa1-scripts\") pod \"ovn-controller-mkxdr-config-5gfdq\" (UID: \"0a236218-03eb-44be-8bcd-41838e2f2fa1\") " pod="openstack/ovn-controller-mkxdr-config-5gfdq" Oct 08 18:28:55 crc kubenswrapper[4750]: I1008 18:28:55.947661 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0a236218-03eb-44be-8bcd-41838e2f2fa1-var-log-ovn\") pod \"ovn-controller-mkxdr-config-5gfdq\" (UID: \"0a236218-03eb-44be-8bcd-41838e2f2fa1\") " pod="openstack/ovn-controller-mkxdr-config-5gfdq" Oct 08 18:28:55 crc kubenswrapper[4750]: I1008 18:28:55.947720 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0a236218-03eb-44be-8bcd-41838e2f2fa1-additional-scripts\") pod \"ovn-controller-mkxdr-config-5gfdq\" (UID: \"0a236218-03eb-44be-8bcd-41838e2f2fa1\") " pod="openstack/ovn-controller-mkxdr-config-5gfdq" Oct 08 18:28:55 crc kubenswrapper[4750]: I1008 18:28:55.947764 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-kzzxv\" (UniqueName: \"kubernetes.io/projected/0a236218-03eb-44be-8bcd-41838e2f2fa1-kube-api-access-kzzxv\") pod \"ovn-controller-mkxdr-config-5gfdq\" (UID: \"0a236218-03eb-44be-8bcd-41838e2f2fa1\") " pod="openstack/ovn-controller-mkxdr-config-5gfdq" Oct 08 18:28:55 crc kubenswrapper[4750]: I1008 18:28:55.947784 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0a236218-03eb-44be-8bcd-41838e2f2fa1-var-run\") pod \"ovn-controller-mkxdr-config-5gfdq\" (UID: \"0a236218-03eb-44be-8bcd-41838e2f2fa1\") " pod="openstack/ovn-controller-mkxdr-config-5gfdq" Oct 08 18:28:55 crc kubenswrapper[4750]: I1008 18:28:55.947803 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0a236218-03eb-44be-8bcd-41838e2f2fa1-var-run-ovn\") pod \"ovn-controller-mkxdr-config-5gfdq\" (UID: \"0a236218-03eb-44be-8bcd-41838e2f2fa1\") " pod="openstack/ovn-controller-mkxdr-config-5gfdq" Oct 08 18:28:55 crc kubenswrapper[4750]: I1008 18:28:55.947876 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0a236218-03eb-44be-8bcd-41838e2f2fa1-scripts\") pod \"ovn-controller-mkxdr-config-5gfdq\" (UID: \"0a236218-03eb-44be-8bcd-41838e2f2fa1\") " pod="openstack/ovn-controller-mkxdr-config-5gfdq" Oct 08 18:28:55 crc kubenswrapper[4750]: I1008 18:28:55.948008 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0a236218-03eb-44be-8bcd-41838e2f2fa1-var-log-ovn\") pod \"ovn-controller-mkxdr-config-5gfdq\" (UID: \"0a236218-03eb-44be-8bcd-41838e2f2fa1\") " pod="openstack/ovn-controller-mkxdr-config-5gfdq" Oct 08 18:28:55 crc kubenswrapper[4750]: I1008 18:28:55.948035 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/0a236218-03eb-44be-8bcd-41838e2f2fa1-var-run\") pod \"ovn-controller-mkxdr-config-5gfdq\" (UID: \"0a236218-03eb-44be-8bcd-41838e2f2fa1\") " pod="openstack/ovn-controller-mkxdr-config-5gfdq" Oct 08 18:28:55 crc kubenswrapper[4750]: I1008 18:28:55.948037 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0a236218-03eb-44be-8bcd-41838e2f2fa1-var-run-ovn\") pod \"ovn-controller-mkxdr-config-5gfdq\" (UID: \"0a236218-03eb-44be-8bcd-41838e2f2fa1\") " pod="openstack/ovn-controller-mkxdr-config-5gfdq" Oct 08 18:28:55 crc kubenswrapper[4750]: I1008 18:28:55.948572 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0a236218-03eb-44be-8bcd-41838e2f2fa1-additional-scripts\") pod \"ovn-controller-mkxdr-config-5gfdq\" (UID: \"0a236218-03eb-44be-8bcd-41838e2f2fa1\") " pod="openstack/ovn-controller-mkxdr-config-5gfdq" Oct 08 18:28:55 crc kubenswrapper[4750]: I1008 18:28:55.949763 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0a236218-03eb-44be-8bcd-41838e2f2fa1-scripts\") pod \"ovn-controller-mkxdr-config-5gfdq\" (UID: \"0a236218-03eb-44be-8bcd-41838e2f2fa1\") " pod="openstack/ovn-controller-mkxdr-config-5gfdq" Oct 08 18:28:55 crc kubenswrapper[4750]: I1008 18:28:55.964765 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzzxv\" (UniqueName: \"kubernetes.io/projected/0a236218-03eb-44be-8bcd-41838e2f2fa1-kube-api-access-kzzxv\") pod \"ovn-controller-mkxdr-config-5gfdq\" (UID: \"0a236218-03eb-44be-8bcd-41838e2f2fa1\") " pod="openstack/ovn-controller-mkxdr-config-5gfdq" Oct 08 18:28:56 crc kubenswrapper[4750]: I1008 18:28:56.079480 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-mkxdr-config-5gfdq" Oct 08 18:28:56 crc kubenswrapper[4750]: I1008 18:28:56.112671 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be62333a-e650-4131-b0f1-c8c484539c7e","Type":"ContainerStarted","Data":"f20129167bf9bc7eee86dd0ea9d1df44c416086aba87c4cfefb4ac0bb12dca9c"} Oct 08 18:28:56 crc kubenswrapper[4750]: I1008 18:28:56.504314 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mkxdr-config-5gfdq"] Oct 08 18:28:57 crc kubenswrapper[4750]: I1008 18:28:57.119884 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mkxdr-config-5gfdq" event={"ID":"0a236218-03eb-44be-8bcd-41838e2f2fa1","Type":"ContainerStarted","Data":"ebe89d7a9d130a41836498220be6762035e9a4c48a36eb99ea9cbd41a1b29372"} Oct 08 18:28:58 crc kubenswrapper[4750]: I1008 18:28:58.131980 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be62333a-e650-4131-b0f1-c8c484539c7e","Type":"ContainerStarted","Data":"0daa177c260aa7c024a5646c1d578e111a41bb913177c6089099d8076dcc1b13"} Oct 08 18:28:58 crc kubenswrapper[4750]: I1008 18:28:58.134321 4750 generic.go:334] "Generic (PLEG): container finished" podID="0a236218-03eb-44be-8bcd-41838e2f2fa1" containerID="ded8dda6ca3cc7f7f9717d2c3a2bb3364d14652de92ec33f826a73f6b5aead1d" exitCode=0 Oct 08 18:28:58 crc kubenswrapper[4750]: I1008 18:28:58.134358 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mkxdr-config-5gfdq" event={"ID":"0a236218-03eb-44be-8bcd-41838e2f2fa1","Type":"ContainerDied","Data":"ded8dda6ca3cc7f7f9717d2c3a2bb3364d14652de92ec33f826a73f6b5aead1d"} Oct 08 18:28:59 crc kubenswrapper[4750]: E1008 18:28:59.002409 4750 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65af30d1_7ae5_485a_85d4_271a4642c2cf.slice\": RecentStats: unable to find data in memory cache]" Oct 08 18:28:59 crc kubenswrapper[4750]: I1008 18:28:59.156709 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be62333a-e650-4131-b0f1-c8c484539c7e","Type":"ContainerStarted","Data":"1a4ea5b2811f4515b0049bf0da3ae03b49c47b62fb6d2cddab227c8e93827963"} Oct 08 18:28:59 crc kubenswrapper[4750]: I1008 18:28:59.156770 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be62333a-e650-4131-b0f1-c8c484539c7e","Type":"ContainerStarted","Data":"c068b85bb1ced1a9426fda1897141d98518bc162a709c23d2054cd4d7cce8209"} Oct 08 18:28:59 crc kubenswrapper[4750]: I1008 18:28:59.156780 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be62333a-e650-4131-b0f1-c8c484539c7e","Type":"ContainerStarted","Data":"ec560a5ed96d75041eddd3e8cea8a27ed5d0b038772f918091c1f3453402ac4f"} Oct 08 18:28:59 crc kubenswrapper[4750]: I1008 18:28:59.348916 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-4c02-account-create-ht8qf"] Oct 08 18:28:59 crc kubenswrapper[4750]: I1008 18:28:59.351802 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-4c02-account-create-ht8qf" Oct 08 18:28:59 crc kubenswrapper[4750]: I1008 18:28:59.354999 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 08 18:28:59 crc kubenswrapper[4750]: I1008 18:28:59.365891 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-4c02-account-create-ht8qf"] Oct 08 18:28:59 crc kubenswrapper[4750]: I1008 18:28:59.402422 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5bmn\" (UniqueName: \"kubernetes.io/projected/8bb0941b-83e9-4afb-87ed-a26326dfd400-kube-api-access-p5bmn\") pod \"glance-4c02-account-create-ht8qf\" (UID: \"8bb0941b-83e9-4afb-87ed-a26326dfd400\") " pod="openstack/glance-4c02-account-create-ht8qf" Oct 08 18:28:59 crc kubenswrapper[4750]: I1008 18:28:59.506324 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5bmn\" (UniqueName: \"kubernetes.io/projected/8bb0941b-83e9-4afb-87ed-a26326dfd400-kube-api-access-p5bmn\") pod \"glance-4c02-account-create-ht8qf\" (UID: \"8bb0941b-83e9-4afb-87ed-a26326dfd400\") " pod="openstack/glance-4c02-account-create-ht8qf" Oct 08 18:28:59 crc kubenswrapper[4750]: I1008 18:28:59.541877 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5bmn\" (UniqueName: \"kubernetes.io/projected/8bb0941b-83e9-4afb-87ed-a26326dfd400-kube-api-access-p5bmn\") pod \"glance-4c02-account-create-ht8qf\" (UID: \"8bb0941b-83e9-4afb-87ed-a26326dfd400\") " pod="openstack/glance-4c02-account-create-ht8qf" Oct 08 18:28:59 crc kubenswrapper[4750]: I1008 18:28:59.679434 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-4c02-account-create-ht8qf" Oct 08 18:28:59 crc kubenswrapper[4750]: I1008 18:28:59.707587 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 18:28:59 crc kubenswrapper[4750]: I1008 18:28:59.707661 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 18:28:59 crc kubenswrapper[4750]: I1008 18:28:59.847991 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mkxdr-config-5gfdq" Oct 08 18:28:59 crc kubenswrapper[4750]: I1008 18:28:59.912812 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0a236218-03eb-44be-8bcd-41838e2f2fa1-var-log-ovn\") pod \"0a236218-03eb-44be-8bcd-41838e2f2fa1\" (UID: \"0a236218-03eb-44be-8bcd-41838e2f2fa1\") " Oct 08 18:28:59 crc kubenswrapper[4750]: I1008 18:28:59.912872 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0a236218-03eb-44be-8bcd-41838e2f2fa1-var-run-ovn\") pod \"0a236218-03eb-44be-8bcd-41838e2f2fa1\" (UID: \"0a236218-03eb-44be-8bcd-41838e2f2fa1\") " Oct 08 18:28:59 crc kubenswrapper[4750]: I1008 18:28:59.912904 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzzxv\" (UniqueName: \"kubernetes.io/projected/0a236218-03eb-44be-8bcd-41838e2f2fa1-kube-api-access-kzzxv\") pod 
\"0a236218-03eb-44be-8bcd-41838e2f2fa1\" (UID: \"0a236218-03eb-44be-8bcd-41838e2f2fa1\") " Oct 08 18:28:59 crc kubenswrapper[4750]: I1008 18:28:59.912918 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0a236218-03eb-44be-8bcd-41838e2f2fa1-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "0a236218-03eb-44be-8bcd-41838e2f2fa1" (UID: "0a236218-03eb-44be-8bcd-41838e2f2fa1"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 18:28:59 crc kubenswrapper[4750]: I1008 18:28:59.912968 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0a236218-03eb-44be-8bcd-41838e2f2fa1-scripts\") pod \"0a236218-03eb-44be-8bcd-41838e2f2fa1\" (UID: \"0a236218-03eb-44be-8bcd-41838e2f2fa1\") " Oct 08 18:28:59 crc kubenswrapper[4750]: I1008 18:28:59.913042 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0a236218-03eb-44be-8bcd-41838e2f2fa1-var-run\") pod \"0a236218-03eb-44be-8bcd-41838e2f2fa1\" (UID: \"0a236218-03eb-44be-8bcd-41838e2f2fa1\") " Oct 08 18:28:59 crc kubenswrapper[4750]: I1008 18:28:59.913042 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0a236218-03eb-44be-8bcd-41838e2f2fa1-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "0a236218-03eb-44be-8bcd-41838e2f2fa1" (UID: "0a236218-03eb-44be-8bcd-41838e2f2fa1"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 18:28:59 crc kubenswrapper[4750]: I1008 18:28:59.913153 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0a236218-03eb-44be-8bcd-41838e2f2fa1-additional-scripts\") pod \"0a236218-03eb-44be-8bcd-41838e2f2fa1\" (UID: \"0a236218-03eb-44be-8bcd-41838e2f2fa1\") " Oct 08 18:28:59 crc kubenswrapper[4750]: I1008 18:28:59.913559 4750 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0a236218-03eb-44be-8bcd-41838e2f2fa1-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 08 18:28:59 crc kubenswrapper[4750]: I1008 18:28:59.913601 4750 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0a236218-03eb-44be-8bcd-41838e2f2fa1-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 08 18:28:59 crc kubenswrapper[4750]: I1008 18:28:59.913758 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0a236218-03eb-44be-8bcd-41838e2f2fa1-var-run" (OuterVolumeSpecName: "var-run") pod "0a236218-03eb-44be-8bcd-41838e2f2fa1" (UID: "0a236218-03eb-44be-8bcd-41838e2f2fa1"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 18:28:59 crc kubenswrapper[4750]: I1008 18:28:59.914242 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a236218-03eb-44be-8bcd-41838e2f2fa1-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "0a236218-03eb-44be-8bcd-41838e2f2fa1" (UID: "0a236218-03eb-44be-8bcd-41838e2f2fa1"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:28:59 crc kubenswrapper[4750]: I1008 18:28:59.914599 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a236218-03eb-44be-8bcd-41838e2f2fa1-scripts" (OuterVolumeSpecName: "scripts") pod "0a236218-03eb-44be-8bcd-41838e2f2fa1" (UID: "0a236218-03eb-44be-8bcd-41838e2f2fa1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:28:59 crc kubenswrapper[4750]: I1008 18:28:59.919279 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a236218-03eb-44be-8bcd-41838e2f2fa1-kube-api-access-kzzxv" (OuterVolumeSpecName: "kube-api-access-kzzxv") pod "0a236218-03eb-44be-8bcd-41838e2f2fa1" (UID: "0a236218-03eb-44be-8bcd-41838e2f2fa1"). InnerVolumeSpecName "kube-api-access-kzzxv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:28:59 crc kubenswrapper[4750]: I1008 18:28:59.985310 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.025523 4750 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0a236218-03eb-44be-8bcd-41838e2f2fa1-var-run\") on node \"crc\" DevicePath \"\"" Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.025558 4750 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0a236218-03eb-44be-8bcd-41838e2f2fa1-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.025591 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzzxv\" (UniqueName: \"kubernetes.io/projected/0a236218-03eb-44be-8bcd-41838e2f2fa1-kube-api-access-kzzxv\") on node \"crc\" DevicePath \"\"" Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.025601 4750 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0a236218-03eb-44be-8bcd-41838e2f2fa1-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.192724 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be62333a-e650-4131-b0f1-c8c484539c7e","Type":"ContainerStarted","Data":"b8997ccfc894343c6e131268824155df3c9f2e6d480b351d3f97c9aa1cd88c3f"} Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.193078 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be62333a-e650-4131-b0f1-c8c484539c7e","Type":"ContainerStarted","Data":"26c529adb950717fbb00bdeba5f0a5ebfb00cbe7017189d59c51fb3fb5a903e5"} Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.200158 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mkxdr-config-5gfdq" event={"ID":"0a236218-03eb-44be-8bcd-41838e2f2fa1","Type":"ContainerDied","Data":"ebe89d7a9d130a41836498220be6762035e9a4c48a36eb99ea9cbd41a1b29372"} Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.200207 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebe89d7a9d130a41836498220be6762035e9a4c48a36eb99ea9cbd41a1b29372" Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.200320 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-mkxdr-config-5gfdq" Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.209386 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-4c02-account-create-ht8qf"] Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.391654 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-llnpr"] Oct 08 18:29:00 crc kubenswrapper[4750]: E1008 18:29:00.392045 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a236218-03eb-44be-8bcd-41838e2f2fa1" containerName="ovn-config" Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.392071 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a236218-03eb-44be-8bcd-41838e2f2fa1" containerName="ovn-config" Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.392260 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a236218-03eb-44be-8bcd-41838e2f2fa1" containerName="ovn-config" Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.392897 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-llnpr" Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.403083 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-llnpr"] Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.485392 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-cjhvk"] Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.487777 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-cjhvk" Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.506898 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-cjhvk"] Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.527279 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-mkxdr" Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.533432 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drkps\" (UniqueName: \"kubernetes.io/projected/c3a80f86-2304-4b69-9cc8-8ef39c25c999-kube-api-access-drkps\") pod \"cinder-db-create-llnpr\" (UID: \"c3a80f86-2304-4b69-9cc8-8ef39c25c999\") " pod="openstack/cinder-db-create-llnpr" Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.634697 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfg56\" (UniqueName: \"kubernetes.io/projected/8da1090b-4086-4e84-b252-982a0f876031-kube-api-access-zfg56\") pod \"barbican-db-create-cjhvk\" (UID: \"8da1090b-4086-4e84-b252-982a0f876031\") " pod="openstack/barbican-db-create-cjhvk" Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.634769 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drkps\" (UniqueName: \"kubernetes.io/projected/c3a80f86-2304-4b69-9cc8-8ef39c25c999-kube-api-access-drkps\") pod \"cinder-db-create-llnpr\" (UID: \"c3a80f86-2304-4b69-9cc8-8ef39c25c999\") " pod="openstack/cinder-db-create-llnpr" Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.683856 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drkps\" (UniqueName: \"kubernetes.io/projected/c3a80f86-2304-4b69-9cc8-8ef39c25c999-kube-api-access-drkps\") pod \"cinder-db-create-llnpr\" (UID: \"c3a80f86-2304-4b69-9cc8-8ef39c25c999\") " pod="openstack/cinder-db-create-llnpr" 
Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.696881 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-kp6xp"] Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.697886 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-kp6xp" Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.705660 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-kp6xp"] Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.722865 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-llnpr" Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.738880 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfg56\" (UniqueName: \"kubernetes.io/projected/8da1090b-4086-4e84-b252-982a0f876031-kube-api-access-zfg56\") pod \"barbican-db-create-cjhvk\" (UID: \"8da1090b-4086-4e84-b252-982a0f876031\") " pod="openstack/barbican-db-create-cjhvk" Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.794353 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfg56\" (UniqueName: \"kubernetes.io/projected/8da1090b-4086-4e84-b252-982a0f876031-kube-api-access-zfg56\") pod \"barbican-db-create-cjhvk\" (UID: \"8da1090b-4086-4e84-b252-982a0f876031\") " pod="openstack/barbican-db-create-cjhvk" Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.799343 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-d6q8c"] Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.800401 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-d6q8c" Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.810284 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.810726 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rx2f7" Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.810885 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.811085 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.813621 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-cjhvk" Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.826849 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-d6q8c"] Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.841325 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m829x\" (UniqueName: \"kubernetes.io/projected/3fff367e-d784-48f0-ad74-571e1587abbb-kube-api-access-m829x\") pod \"neutron-db-create-kp6xp\" (UID: \"3fff367e-d784-48f0-ad74-571e1587abbb\") " pod="openstack/neutron-db-create-kp6xp" Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.841375 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swtlg\" (UniqueName: \"kubernetes.io/projected/72b55b3b-50b6-4ef2-81d2-801582731466-kube-api-access-swtlg\") pod \"keystone-db-sync-d6q8c\" (UID: \"72b55b3b-50b6-4ef2-81d2-801582731466\") " pod="openstack/keystone-db-sync-d6q8c" Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.841462 4750 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b55b3b-50b6-4ef2-81d2-801582731466-combined-ca-bundle\") pod \"keystone-db-sync-d6q8c\" (UID: \"72b55b3b-50b6-4ef2-81d2-801582731466\") " pod="openstack/keystone-db-sync-d6q8c" Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.841488 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72b55b3b-50b6-4ef2-81d2-801582731466-config-data\") pod \"keystone-db-sync-d6q8c\" (UID: \"72b55b3b-50b6-4ef2-81d2-801582731466\") " pod="openstack/keystone-db-sync-d6q8c" Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.942860 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b55b3b-50b6-4ef2-81d2-801582731466-combined-ca-bundle\") pod \"keystone-db-sync-d6q8c\" (UID: \"72b55b3b-50b6-4ef2-81d2-801582731466\") " pod="openstack/keystone-db-sync-d6q8c" Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.942971 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72b55b3b-50b6-4ef2-81d2-801582731466-config-data\") pod \"keystone-db-sync-d6q8c\" (UID: \"72b55b3b-50b6-4ef2-81d2-801582731466\") " pod="openstack/keystone-db-sync-d6q8c" Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.943037 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m829x\" (UniqueName: \"kubernetes.io/projected/3fff367e-d784-48f0-ad74-571e1587abbb-kube-api-access-m829x\") pod \"neutron-db-create-kp6xp\" (UID: \"3fff367e-d784-48f0-ad74-571e1587abbb\") " pod="openstack/neutron-db-create-kp6xp" Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.943060 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-swtlg\" (UniqueName: \"kubernetes.io/projected/72b55b3b-50b6-4ef2-81d2-801582731466-kube-api-access-swtlg\") pod \"keystone-db-sync-d6q8c\" (UID: \"72b55b3b-50b6-4ef2-81d2-801582731466\") " pod="openstack/keystone-db-sync-d6q8c" Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.955777 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72b55b3b-50b6-4ef2-81d2-801582731466-config-data\") pod \"keystone-db-sync-d6q8c\" (UID: \"72b55b3b-50b6-4ef2-81d2-801582731466\") " pod="openstack/keystone-db-sync-d6q8c" Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.955841 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b55b3b-50b6-4ef2-81d2-801582731466-combined-ca-bundle\") pod \"keystone-db-sync-d6q8c\" (UID: \"72b55b3b-50b6-4ef2-81d2-801582731466\") " pod="openstack/keystone-db-sync-d6q8c" Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.968878 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.995560 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swtlg\" (UniqueName: \"kubernetes.io/projected/72b55b3b-50b6-4ef2-81d2-801582731466-kube-api-access-swtlg\") pod \"keystone-db-sync-d6q8c\" (UID: \"72b55b3b-50b6-4ef2-81d2-801582731466\") " pod="openstack/keystone-db-sync-d6q8c" Oct 08 18:29:00 crc kubenswrapper[4750]: I1008 18:29:00.995634 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-mkxdr-config-5gfdq"] Oct 08 18:29:01 crc kubenswrapper[4750]: I1008 18:29:01.008174 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-mkxdr-config-5gfdq"] Oct 08 18:29:01 crc kubenswrapper[4750]: I1008 18:29:01.013312 4750 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-m829x\" (UniqueName: \"kubernetes.io/projected/3fff367e-d784-48f0-ad74-571e1587abbb-kube-api-access-m829x\") pod \"neutron-db-create-kp6xp\" (UID: \"3fff367e-d784-48f0-ad74-571e1587abbb\") " pod="openstack/neutron-db-create-kp6xp" Oct 08 18:29:01 crc kubenswrapper[4750]: I1008 18:29:01.179972 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-kp6xp" Oct 08 18:29:01 crc kubenswrapper[4750]: I1008 18:29:01.185935 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-d6q8c" Oct 08 18:29:01 crc kubenswrapper[4750]: I1008 18:29:01.261828 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be62333a-e650-4131-b0f1-c8c484539c7e","Type":"ContainerStarted","Data":"3094942c3642c799d4db13b77a9964692bca3d841f52ece0573854c311f0f401"} Oct 08 18:29:01 crc kubenswrapper[4750]: I1008 18:29:01.261888 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be62333a-e650-4131-b0f1-c8c484539c7e","Type":"ContainerStarted","Data":"3013b58e1d5ba309156a64f8003ce218400fb7a18be0cb766461f307776c2a76"} Oct 08 18:29:01 crc kubenswrapper[4750]: I1008 18:29:01.261897 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be62333a-e650-4131-b0f1-c8c484539c7e","Type":"ContainerStarted","Data":"1d96354096952d01c21e9fe7d3f0bcc4a2f331e8ac84cd83d2b48f90d58cc965"} Oct 08 18:29:01 crc kubenswrapper[4750]: I1008 18:29:01.263843 4750 generic.go:334] "Generic (PLEG): container finished" podID="8bb0941b-83e9-4afb-87ed-a26326dfd400" containerID="bc3334f37fde9a4e7df22df7f3d4e9f1274959c985f56c2a8e8a61845e708099" exitCode=0 Oct 08 18:29:01 crc kubenswrapper[4750]: I1008 18:29:01.263883 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4c02-account-create-ht8qf" 
event={"ID":"8bb0941b-83e9-4afb-87ed-a26326dfd400","Type":"ContainerDied","Data":"bc3334f37fde9a4e7df22df7f3d4e9f1274959c985f56c2a8e8a61845e708099"} Oct 08 18:29:01 crc kubenswrapper[4750]: I1008 18:29:01.263909 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4c02-account-create-ht8qf" event={"ID":"8bb0941b-83e9-4afb-87ed-a26326dfd400","Type":"ContainerStarted","Data":"708e2805a0d11569b75a5e393dde23ba1076ba02975f6cae7d9c5cea4780f3f2"} Oct 08 18:29:01 crc kubenswrapper[4750]: I1008 18:29:01.530845 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-cjhvk"] Oct 08 18:29:01 crc kubenswrapper[4750]: W1008 18:29:01.568697 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8da1090b_4086_4e84_b252_982a0f876031.slice/crio-189a68f32afd8e562047cf2b0d8075299a46e4b0de693602cc5d4cc7f0366a18 WatchSource:0}: Error finding container 189a68f32afd8e562047cf2b0d8075299a46e4b0de693602cc5d4cc7f0366a18: Status 404 returned error can't find the container with id 189a68f32afd8e562047cf2b0d8075299a46e4b0de693602cc5d4cc7f0366a18 Oct 08 18:29:01 crc kubenswrapper[4750]: I1008 18:29:01.661625 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-llnpr"] Oct 08 18:29:01 crc kubenswrapper[4750]: W1008 18:29:01.669787 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3a80f86_2304_4b69_9cc8_8ef39c25c999.slice/crio-389bcea46c3f5cd7fb5c67637dba62340868de0c15202240ec5dd00ddba83ed6 WatchSource:0}: Error finding container 389bcea46c3f5cd7fb5c67637dba62340868de0c15202240ec5dd00ddba83ed6: Status 404 returned error can't find the container with id 389bcea46c3f5cd7fb5c67637dba62340868de0c15202240ec5dd00ddba83ed6 Oct 08 18:29:01 crc kubenswrapper[4750]: I1008 18:29:01.848129 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/neutron-db-create-kp6xp"] Oct 08 18:29:01 crc kubenswrapper[4750]: I1008 18:29:01.960306 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-d6q8c"] Oct 08 18:29:02 crc kubenswrapper[4750]: I1008 18:29:02.272703 4750 generic.go:334] "Generic (PLEG): container finished" podID="3fff367e-d784-48f0-ad74-571e1587abbb" containerID="1a4f284766309de7c3571dfcdbf004e019a0d6c456fedfed07930a33157eef86" exitCode=0 Oct 08 18:29:02 crc kubenswrapper[4750]: I1008 18:29:02.272798 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-kp6xp" event={"ID":"3fff367e-d784-48f0-ad74-571e1587abbb","Type":"ContainerDied","Data":"1a4f284766309de7c3571dfcdbf004e019a0d6c456fedfed07930a33157eef86"} Oct 08 18:29:02 crc kubenswrapper[4750]: I1008 18:29:02.272834 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-kp6xp" event={"ID":"3fff367e-d784-48f0-ad74-571e1587abbb","Type":"ContainerStarted","Data":"fa5ef960d1ca438a02fb5358f352d9e1e685250c1d47d3a1b952769e24566ce5"} Oct 08 18:29:02 crc kubenswrapper[4750]: I1008 18:29:02.273970 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-d6q8c" event={"ID":"72b55b3b-50b6-4ef2-81d2-801582731466","Type":"ContainerStarted","Data":"34291f871415d458a029852083c78f8c1ae21e8a0663fa0158685a2e6340475a"} Oct 08 18:29:02 crc kubenswrapper[4750]: I1008 18:29:02.275917 4750 generic.go:334] "Generic (PLEG): container finished" podID="8da1090b-4086-4e84-b252-982a0f876031" containerID="6ba31a99af2f3e04fa9f91b5c6229c6f4e9c35bdad3860c3aa4d9c28c11045a0" exitCode=0 Oct 08 18:29:02 crc kubenswrapper[4750]: I1008 18:29:02.275992 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-cjhvk" event={"ID":"8da1090b-4086-4e84-b252-982a0f876031","Type":"ContainerDied","Data":"6ba31a99af2f3e04fa9f91b5c6229c6f4e9c35bdad3860c3aa4d9c28c11045a0"} Oct 08 18:29:02 crc kubenswrapper[4750]: I1008 
18:29:02.276018 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-cjhvk" event={"ID":"8da1090b-4086-4e84-b252-982a0f876031","Type":"ContainerStarted","Data":"189a68f32afd8e562047cf2b0d8075299a46e4b0de693602cc5d4cc7f0366a18"} Oct 08 18:29:02 crc kubenswrapper[4750]: I1008 18:29:02.287182 4750 generic.go:334] "Generic (PLEG): container finished" podID="c3a80f86-2304-4b69-9cc8-8ef39c25c999" containerID="ff70ec89fcd284727c94a8b206245c57250c99089e493a89641fa9ae7e6a9a5a" exitCode=0 Oct 08 18:29:02 crc kubenswrapper[4750]: I1008 18:29:02.287285 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-llnpr" event={"ID":"c3a80f86-2304-4b69-9cc8-8ef39c25c999","Type":"ContainerDied","Data":"ff70ec89fcd284727c94a8b206245c57250c99089e493a89641fa9ae7e6a9a5a"} Oct 08 18:29:02 crc kubenswrapper[4750]: I1008 18:29:02.287343 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-llnpr" event={"ID":"c3a80f86-2304-4b69-9cc8-8ef39c25c999","Type":"ContainerStarted","Data":"389bcea46c3f5cd7fb5c67637dba62340868de0c15202240ec5dd00ddba83ed6"} Oct 08 18:29:02 crc kubenswrapper[4750]: I1008 18:29:02.303232 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be62333a-e650-4131-b0f1-c8c484539c7e","Type":"ContainerStarted","Data":"427cf714364cd21ce5dde409e29dd3aa65f33832204dfbf8ce289255b5e834c0"} Oct 08 18:29:02 crc kubenswrapper[4750]: I1008 18:29:02.303271 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be62333a-e650-4131-b0f1-c8c484539c7e","Type":"ContainerStarted","Data":"d2840db45a1d2b1e671a91bd14ec16767e2beb441096549c8e394b831dc54350"} Oct 08 18:29:02 crc kubenswrapper[4750]: I1008 18:29:02.366044 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.950255038 podStartE2EDuration="27.366026588s" 
podCreationTimestamp="2025-10-08 18:28:35 +0000 UTC" firstStartedPulling="2025-10-08 18:28:53.339666491 +0000 UTC m=+1089.252637504" lastFinishedPulling="2025-10-08 18:28:59.755438031 +0000 UTC m=+1095.668409054" observedRunningTime="2025-10-08 18:29:02.362718406 +0000 UTC m=+1098.275689439" watchObservedRunningTime="2025-10-08 18:29:02.366026588 +0000 UTC m=+1098.278997601" Oct 08 18:29:02 crc kubenswrapper[4750]: I1008 18:29:02.616166 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-4c02-account-create-ht8qf" Oct 08 18:29:02 crc kubenswrapper[4750]: I1008 18:29:02.661429 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-564965cbfc-9pvkl"] Oct 08 18:29:02 crc kubenswrapper[4750]: E1008 18:29:02.661920 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bb0941b-83e9-4afb-87ed-a26326dfd400" containerName="mariadb-account-create" Oct 08 18:29:02 crc kubenswrapper[4750]: I1008 18:29:02.661946 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bb0941b-83e9-4afb-87ed-a26326dfd400" containerName="mariadb-account-create" Oct 08 18:29:02 crc kubenswrapper[4750]: I1008 18:29:02.662191 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bb0941b-83e9-4afb-87ed-a26326dfd400" containerName="mariadb-account-create" Oct 08 18:29:02 crc kubenswrapper[4750]: I1008 18:29:02.675328 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5bmn\" (UniqueName: \"kubernetes.io/projected/8bb0941b-83e9-4afb-87ed-a26326dfd400-kube-api-access-p5bmn\") pod \"8bb0941b-83e9-4afb-87ed-a26326dfd400\" (UID: \"8bb0941b-83e9-4afb-87ed-a26326dfd400\") " Oct 08 18:29:02 crc kubenswrapper[4750]: I1008 18:29:02.684062 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-564965cbfc-9pvkl" Oct 08 18:29:02 crc kubenswrapper[4750]: I1008 18:29:02.685057 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-564965cbfc-9pvkl"] Oct 08 18:29:02 crc kubenswrapper[4750]: I1008 18:29:02.687339 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 08 18:29:02 crc kubenswrapper[4750]: I1008 18:29:02.699874 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bb0941b-83e9-4afb-87ed-a26326dfd400-kube-api-access-p5bmn" (OuterVolumeSpecName: "kube-api-access-p5bmn") pod "8bb0941b-83e9-4afb-87ed-a26326dfd400" (UID: "8bb0941b-83e9-4afb-87ed-a26326dfd400"). InnerVolumeSpecName "kube-api-access-p5bmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:29:02 crc kubenswrapper[4750]: I1008 18:29:02.750479 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a236218-03eb-44be-8bcd-41838e2f2fa1" path="/var/lib/kubelet/pods/0a236218-03eb-44be-8bcd-41838e2f2fa1/volumes" Oct 08 18:29:02 crc kubenswrapper[4750]: I1008 18:29:02.782788 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/60e57b91-0d24-4264-ae0e-7e86b6737533-dns-swift-storage-0\") pod \"dnsmasq-dns-564965cbfc-9pvkl\" (UID: \"60e57b91-0d24-4264-ae0e-7e86b6737533\") " pod="openstack/dnsmasq-dns-564965cbfc-9pvkl" Oct 08 18:29:02 crc kubenswrapper[4750]: I1008 18:29:02.782953 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60e57b91-0d24-4264-ae0e-7e86b6737533-dns-svc\") pod \"dnsmasq-dns-564965cbfc-9pvkl\" (UID: \"60e57b91-0d24-4264-ae0e-7e86b6737533\") " pod="openstack/dnsmasq-dns-564965cbfc-9pvkl" Oct 08 18:29:02 crc kubenswrapper[4750]: I1008 18:29:02.783009 4750 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60e57b91-0d24-4264-ae0e-7e86b6737533-ovsdbserver-nb\") pod \"dnsmasq-dns-564965cbfc-9pvkl\" (UID: \"60e57b91-0d24-4264-ae0e-7e86b6737533\") " pod="openstack/dnsmasq-dns-564965cbfc-9pvkl" Oct 08 18:29:02 crc kubenswrapper[4750]: I1008 18:29:02.783080 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60e57b91-0d24-4264-ae0e-7e86b6737533-ovsdbserver-sb\") pod \"dnsmasq-dns-564965cbfc-9pvkl\" (UID: \"60e57b91-0d24-4264-ae0e-7e86b6737533\") " pod="openstack/dnsmasq-dns-564965cbfc-9pvkl" Oct 08 18:29:02 crc kubenswrapper[4750]: I1008 18:29:02.783185 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60e57b91-0d24-4264-ae0e-7e86b6737533-config\") pod \"dnsmasq-dns-564965cbfc-9pvkl\" (UID: \"60e57b91-0d24-4264-ae0e-7e86b6737533\") " pod="openstack/dnsmasq-dns-564965cbfc-9pvkl" Oct 08 18:29:02 crc kubenswrapper[4750]: I1008 18:29:02.783359 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wvc4\" (UniqueName: \"kubernetes.io/projected/60e57b91-0d24-4264-ae0e-7e86b6737533-kube-api-access-5wvc4\") pod \"dnsmasq-dns-564965cbfc-9pvkl\" (UID: \"60e57b91-0d24-4264-ae0e-7e86b6737533\") " pod="openstack/dnsmasq-dns-564965cbfc-9pvkl" Oct 08 18:29:02 crc kubenswrapper[4750]: I1008 18:29:02.784010 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5bmn\" (UniqueName: \"kubernetes.io/projected/8bb0941b-83e9-4afb-87ed-a26326dfd400-kube-api-access-p5bmn\") on node \"crc\" DevicePath \"\"" Oct 08 18:29:02 crc kubenswrapper[4750]: I1008 18:29:02.885627 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-5wvc4\" (UniqueName: \"kubernetes.io/projected/60e57b91-0d24-4264-ae0e-7e86b6737533-kube-api-access-5wvc4\") pod \"dnsmasq-dns-564965cbfc-9pvkl\" (UID: \"60e57b91-0d24-4264-ae0e-7e86b6737533\") " pod="openstack/dnsmasq-dns-564965cbfc-9pvkl" Oct 08 18:29:02 crc kubenswrapper[4750]: I1008 18:29:02.885735 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/60e57b91-0d24-4264-ae0e-7e86b6737533-dns-swift-storage-0\") pod \"dnsmasq-dns-564965cbfc-9pvkl\" (UID: \"60e57b91-0d24-4264-ae0e-7e86b6737533\") " pod="openstack/dnsmasq-dns-564965cbfc-9pvkl" Oct 08 18:29:02 crc kubenswrapper[4750]: I1008 18:29:02.885770 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60e57b91-0d24-4264-ae0e-7e86b6737533-dns-svc\") pod \"dnsmasq-dns-564965cbfc-9pvkl\" (UID: \"60e57b91-0d24-4264-ae0e-7e86b6737533\") " pod="openstack/dnsmasq-dns-564965cbfc-9pvkl" Oct 08 18:29:02 crc kubenswrapper[4750]: I1008 18:29:02.885789 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60e57b91-0d24-4264-ae0e-7e86b6737533-ovsdbserver-nb\") pod \"dnsmasq-dns-564965cbfc-9pvkl\" (UID: \"60e57b91-0d24-4264-ae0e-7e86b6737533\") " pod="openstack/dnsmasq-dns-564965cbfc-9pvkl" Oct 08 18:29:02 crc kubenswrapper[4750]: I1008 18:29:02.885818 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60e57b91-0d24-4264-ae0e-7e86b6737533-ovsdbserver-sb\") pod \"dnsmasq-dns-564965cbfc-9pvkl\" (UID: \"60e57b91-0d24-4264-ae0e-7e86b6737533\") " pod="openstack/dnsmasq-dns-564965cbfc-9pvkl" Oct 08 18:29:02 crc kubenswrapper[4750]: I1008 18:29:02.885859 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/60e57b91-0d24-4264-ae0e-7e86b6737533-config\") pod \"dnsmasq-dns-564965cbfc-9pvkl\" (UID: \"60e57b91-0d24-4264-ae0e-7e86b6737533\") " pod="openstack/dnsmasq-dns-564965cbfc-9pvkl" Oct 08 18:29:02 crc kubenswrapper[4750]: I1008 18:29:02.886809 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60e57b91-0d24-4264-ae0e-7e86b6737533-config\") pod \"dnsmasq-dns-564965cbfc-9pvkl\" (UID: \"60e57b91-0d24-4264-ae0e-7e86b6737533\") " pod="openstack/dnsmasq-dns-564965cbfc-9pvkl" Oct 08 18:29:02 crc kubenswrapper[4750]: I1008 18:29:02.888538 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/60e57b91-0d24-4264-ae0e-7e86b6737533-dns-swift-storage-0\") pod \"dnsmasq-dns-564965cbfc-9pvkl\" (UID: \"60e57b91-0d24-4264-ae0e-7e86b6737533\") " pod="openstack/dnsmasq-dns-564965cbfc-9pvkl" Oct 08 18:29:02 crc kubenswrapper[4750]: I1008 18:29:02.889068 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60e57b91-0d24-4264-ae0e-7e86b6737533-dns-svc\") pod \"dnsmasq-dns-564965cbfc-9pvkl\" (UID: \"60e57b91-0d24-4264-ae0e-7e86b6737533\") " pod="openstack/dnsmasq-dns-564965cbfc-9pvkl" Oct 08 18:29:02 crc kubenswrapper[4750]: I1008 18:29:02.889609 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60e57b91-0d24-4264-ae0e-7e86b6737533-ovsdbserver-nb\") pod \"dnsmasq-dns-564965cbfc-9pvkl\" (UID: \"60e57b91-0d24-4264-ae0e-7e86b6737533\") " pod="openstack/dnsmasq-dns-564965cbfc-9pvkl" Oct 08 18:29:02 crc kubenswrapper[4750]: I1008 18:29:02.890498 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60e57b91-0d24-4264-ae0e-7e86b6737533-ovsdbserver-sb\") pod \"dnsmasq-dns-564965cbfc-9pvkl\" 
(UID: \"60e57b91-0d24-4264-ae0e-7e86b6737533\") " pod="openstack/dnsmasq-dns-564965cbfc-9pvkl" Oct 08 18:29:02 crc kubenswrapper[4750]: I1008 18:29:02.901759 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wvc4\" (UniqueName: \"kubernetes.io/projected/60e57b91-0d24-4264-ae0e-7e86b6737533-kube-api-access-5wvc4\") pod \"dnsmasq-dns-564965cbfc-9pvkl\" (UID: \"60e57b91-0d24-4264-ae0e-7e86b6737533\") " pod="openstack/dnsmasq-dns-564965cbfc-9pvkl" Oct 08 18:29:03 crc kubenswrapper[4750]: I1008 18:29:03.064150 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-564965cbfc-9pvkl" Oct 08 18:29:04 crc kubenswrapper[4750]: I1008 18:29:03.314662 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4c02-account-create-ht8qf" event={"ID":"8bb0941b-83e9-4afb-87ed-a26326dfd400","Type":"ContainerDied","Data":"708e2805a0d11569b75a5e393dde23ba1076ba02975f6cae7d9c5cea4780f3f2"} Oct 08 18:29:04 crc kubenswrapper[4750]: I1008 18:29:03.315004 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="708e2805a0d11569b75a5e393dde23ba1076ba02975f6cae7d9c5cea4780f3f2" Oct 08 18:29:04 crc kubenswrapper[4750]: I1008 18:29:03.314883 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-4c02-account-create-ht8qf" Oct 08 18:29:04 crc kubenswrapper[4750]: I1008 18:29:03.523150 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-564965cbfc-9pvkl"] Oct 08 18:29:04 crc kubenswrapper[4750]: W1008 18:29:03.524794 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60e57b91_0d24_4264_ae0e_7e86b6737533.slice/crio-7b4f273c25803364ee3686cdc31ad9f35d421bfe8e7f57a8c328f0292eb2e27e WatchSource:0}: Error finding container 7b4f273c25803364ee3686cdc31ad9f35d421bfe8e7f57a8c328f0292eb2e27e: Status 404 returned error can't find the container with id 7b4f273c25803364ee3686cdc31ad9f35d421bfe8e7f57a8c328f0292eb2e27e Oct 08 18:29:04 crc kubenswrapper[4750]: I1008 18:29:04.330735 4750 generic.go:334] "Generic (PLEG): container finished" podID="60e57b91-0d24-4264-ae0e-7e86b6737533" containerID="0e06d68c35a9b53210ad6f3b4f182efb67d2daaa8f174499773c0d6d4ffaa3fe" exitCode=0 Oct 08 18:29:04 crc kubenswrapper[4750]: I1008 18:29:04.330843 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-564965cbfc-9pvkl" event={"ID":"60e57b91-0d24-4264-ae0e-7e86b6737533","Type":"ContainerDied","Data":"0e06d68c35a9b53210ad6f3b4f182efb67d2daaa8f174499773c0d6d4ffaa3fe"} Oct 08 18:29:04 crc kubenswrapper[4750]: I1008 18:29:04.331061 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-564965cbfc-9pvkl" event={"ID":"60e57b91-0d24-4264-ae0e-7e86b6737533","Type":"ContainerStarted","Data":"7b4f273c25803364ee3686cdc31ad9f35d421bfe8e7f57a8c328f0292eb2e27e"} Oct 08 18:29:04 crc kubenswrapper[4750]: I1008 18:29:04.569051 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-6khnd"] Oct 08 18:29:04 crc kubenswrapper[4750]: I1008 18:29:04.570392 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-6khnd" Oct 08 18:29:04 crc kubenswrapper[4750]: I1008 18:29:04.573646 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-s64q7" Oct 08 18:29:04 crc kubenswrapper[4750]: I1008 18:29:04.574781 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 08 18:29:04 crc kubenswrapper[4750]: I1008 18:29:04.583177 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-6khnd"] Oct 08 18:29:04 crc kubenswrapper[4750]: I1008 18:29:04.717438 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghc88\" (UniqueName: \"kubernetes.io/projected/7d635eca-619c-4f52-a9d9-73b42d845fbf-kube-api-access-ghc88\") pod \"glance-db-sync-6khnd\" (UID: \"7d635eca-619c-4f52-a9d9-73b42d845fbf\") " pod="openstack/glance-db-sync-6khnd" Oct 08 18:29:04 crc kubenswrapper[4750]: I1008 18:29:04.717664 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d635eca-619c-4f52-a9d9-73b42d845fbf-combined-ca-bundle\") pod \"glance-db-sync-6khnd\" (UID: \"7d635eca-619c-4f52-a9d9-73b42d845fbf\") " pod="openstack/glance-db-sync-6khnd" Oct 08 18:29:04 crc kubenswrapper[4750]: I1008 18:29:04.717857 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7d635eca-619c-4f52-a9d9-73b42d845fbf-db-sync-config-data\") pod \"glance-db-sync-6khnd\" (UID: \"7d635eca-619c-4f52-a9d9-73b42d845fbf\") " pod="openstack/glance-db-sync-6khnd" Oct 08 18:29:04 crc kubenswrapper[4750]: I1008 18:29:04.717989 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7d635eca-619c-4f52-a9d9-73b42d845fbf-config-data\") pod \"glance-db-sync-6khnd\" (UID: \"7d635eca-619c-4f52-a9d9-73b42d845fbf\") " pod="openstack/glance-db-sync-6khnd" Oct 08 18:29:04 crc kubenswrapper[4750]: I1008 18:29:04.819195 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7d635eca-619c-4f52-a9d9-73b42d845fbf-db-sync-config-data\") pod \"glance-db-sync-6khnd\" (UID: \"7d635eca-619c-4f52-a9d9-73b42d845fbf\") " pod="openstack/glance-db-sync-6khnd" Oct 08 18:29:04 crc kubenswrapper[4750]: I1008 18:29:04.819262 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d635eca-619c-4f52-a9d9-73b42d845fbf-config-data\") pod \"glance-db-sync-6khnd\" (UID: \"7d635eca-619c-4f52-a9d9-73b42d845fbf\") " pod="openstack/glance-db-sync-6khnd" Oct 08 18:29:04 crc kubenswrapper[4750]: I1008 18:29:04.819312 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghc88\" (UniqueName: \"kubernetes.io/projected/7d635eca-619c-4f52-a9d9-73b42d845fbf-kube-api-access-ghc88\") pod \"glance-db-sync-6khnd\" (UID: \"7d635eca-619c-4f52-a9d9-73b42d845fbf\") " pod="openstack/glance-db-sync-6khnd" Oct 08 18:29:04 crc kubenswrapper[4750]: I1008 18:29:04.819376 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d635eca-619c-4f52-a9d9-73b42d845fbf-combined-ca-bundle\") pod \"glance-db-sync-6khnd\" (UID: \"7d635eca-619c-4f52-a9d9-73b42d845fbf\") " pod="openstack/glance-db-sync-6khnd" Oct 08 18:29:04 crc kubenswrapper[4750]: I1008 18:29:04.825221 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d635eca-619c-4f52-a9d9-73b42d845fbf-config-data\") pod \"glance-db-sync-6khnd\" (UID: 
\"7d635eca-619c-4f52-a9d9-73b42d845fbf\") " pod="openstack/glance-db-sync-6khnd" Oct 08 18:29:04 crc kubenswrapper[4750]: I1008 18:29:04.827411 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7d635eca-619c-4f52-a9d9-73b42d845fbf-db-sync-config-data\") pod \"glance-db-sync-6khnd\" (UID: \"7d635eca-619c-4f52-a9d9-73b42d845fbf\") " pod="openstack/glance-db-sync-6khnd" Oct 08 18:29:04 crc kubenswrapper[4750]: I1008 18:29:04.835745 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d635eca-619c-4f52-a9d9-73b42d845fbf-combined-ca-bundle\") pod \"glance-db-sync-6khnd\" (UID: \"7d635eca-619c-4f52-a9d9-73b42d845fbf\") " pod="openstack/glance-db-sync-6khnd" Oct 08 18:29:04 crc kubenswrapper[4750]: I1008 18:29:04.841416 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghc88\" (UniqueName: \"kubernetes.io/projected/7d635eca-619c-4f52-a9d9-73b42d845fbf-kube-api-access-ghc88\") pod \"glance-db-sync-6khnd\" (UID: \"7d635eca-619c-4f52-a9d9-73b42d845fbf\") " pod="openstack/glance-db-sync-6khnd" Oct 08 18:29:04 crc kubenswrapper[4750]: I1008 18:29:04.896335 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-6khnd" Oct 08 18:29:07 crc kubenswrapper[4750]: I1008 18:29:07.266105 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-6khnd"] Oct 08 18:29:07 crc kubenswrapper[4750]: I1008 18:29:07.318767 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-llnpr" Oct 08 18:29:07 crc kubenswrapper[4750]: I1008 18:29:07.347323 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-kp6xp" Oct 08 18:29:07 crc kubenswrapper[4750]: I1008 18:29:07.374513 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-cjhvk" Oct 08 18:29:07 crc kubenswrapper[4750]: I1008 18:29:07.416028 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-kp6xp" event={"ID":"3fff367e-d784-48f0-ad74-571e1587abbb","Type":"ContainerDied","Data":"fa5ef960d1ca438a02fb5358f352d9e1e685250c1d47d3a1b952769e24566ce5"} Oct 08 18:29:07 crc kubenswrapper[4750]: I1008 18:29:07.416062 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa5ef960d1ca438a02fb5358f352d9e1e685250c1d47d3a1b952769e24566ce5" Oct 08 18:29:07 crc kubenswrapper[4750]: I1008 18:29:07.416116 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-kp6xp" Oct 08 18:29:07 crc kubenswrapper[4750]: I1008 18:29:07.422928 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-cjhvk" event={"ID":"8da1090b-4086-4e84-b252-982a0f876031","Type":"ContainerDied","Data":"189a68f32afd8e562047cf2b0d8075299a46e4b0de693602cc5d4cc7f0366a18"} Oct 08 18:29:07 crc kubenswrapper[4750]: I1008 18:29:07.422967 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="189a68f32afd8e562047cf2b0d8075299a46e4b0de693602cc5d4cc7f0366a18" Oct 08 18:29:07 crc kubenswrapper[4750]: I1008 18:29:07.422965 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-cjhvk" Oct 08 18:29:07 crc kubenswrapper[4750]: I1008 18:29:07.427726 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-llnpr" event={"ID":"c3a80f86-2304-4b69-9cc8-8ef39c25c999","Type":"ContainerDied","Data":"389bcea46c3f5cd7fb5c67637dba62340868de0c15202240ec5dd00ddba83ed6"} Oct 08 18:29:07 crc kubenswrapper[4750]: I1008 18:29:07.427764 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="389bcea46c3f5cd7fb5c67637dba62340868de0c15202240ec5dd00ddba83ed6" Oct 08 18:29:07 crc kubenswrapper[4750]: I1008 18:29:07.427817 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-llnpr" Oct 08 18:29:07 crc kubenswrapper[4750]: I1008 18:29:07.429963 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-6khnd" event={"ID":"7d635eca-619c-4f52-a9d9-73b42d845fbf","Type":"ContainerStarted","Data":"0946de31366b62a20d1b790224db8593e76861fae75500292ee8bed2ad56c806"} Oct 08 18:29:07 crc kubenswrapper[4750]: I1008 18:29:07.507444 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfg56\" (UniqueName: \"kubernetes.io/projected/8da1090b-4086-4e84-b252-982a0f876031-kube-api-access-zfg56\") pod \"8da1090b-4086-4e84-b252-982a0f876031\" (UID: \"8da1090b-4086-4e84-b252-982a0f876031\") " Oct 08 18:29:07 crc kubenswrapper[4750]: I1008 18:29:07.507707 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drkps\" (UniqueName: \"kubernetes.io/projected/c3a80f86-2304-4b69-9cc8-8ef39c25c999-kube-api-access-drkps\") pod \"c3a80f86-2304-4b69-9cc8-8ef39c25c999\" (UID: \"c3a80f86-2304-4b69-9cc8-8ef39c25c999\") " Oct 08 18:29:07 crc kubenswrapper[4750]: I1008 18:29:07.507774 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m829x\" 
(UniqueName: \"kubernetes.io/projected/3fff367e-d784-48f0-ad74-571e1587abbb-kube-api-access-m829x\") pod \"3fff367e-d784-48f0-ad74-571e1587abbb\" (UID: \"3fff367e-d784-48f0-ad74-571e1587abbb\") " Oct 08 18:29:07 crc kubenswrapper[4750]: I1008 18:29:07.514024 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3a80f86-2304-4b69-9cc8-8ef39c25c999-kube-api-access-drkps" (OuterVolumeSpecName: "kube-api-access-drkps") pod "c3a80f86-2304-4b69-9cc8-8ef39c25c999" (UID: "c3a80f86-2304-4b69-9cc8-8ef39c25c999"). InnerVolumeSpecName "kube-api-access-drkps". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:29:07 crc kubenswrapper[4750]: I1008 18:29:07.514071 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fff367e-d784-48f0-ad74-571e1587abbb-kube-api-access-m829x" (OuterVolumeSpecName: "kube-api-access-m829x") pod "3fff367e-d784-48f0-ad74-571e1587abbb" (UID: "3fff367e-d784-48f0-ad74-571e1587abbb"). InnerVolumeSpecName "kube-api-access-m829x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:29:07 crc kubenswrapper[4750]: I1008 18:29:07.514114 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8da1090b-4086-4e84-b252-982a0f876031-kube-api-access-zfg56" (OuterVolumeSpecName: "kube-api-access-zfg56") pod "8da1090b-4086-4e84-b252-982a0f876031" (UID: "8da1090b-4086-4e84-b252-982a0f876031"). InnerVolumeSpecName "kube-api-access-zfg56". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:29:07 crc kubenswrapper[4750]: I1008 18:29:07.609205 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drkps\" (UniqueName: \"kubernetes.io/projected/c3a80f86-2304-4b69-9cc8-8ef39c25c999-kube-api-access-drkps\") on node \"crc\" DevicePath \"\"" Oct 08 18:29:07 crc kubenswrapper[4750]: I1008 18:29:07.609239 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m829x\" (UniqueName: \"kubernetes.io/projected/3fff367e-d784-48f0-ad74-571e1587abbb-kube-api-access-m829x\") on node \"crc\" DevicePath \"\"" Oct 08 18:29:07 crc kubenswrapper[4750]: I1008 18:29:07.609249 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfg56\" (UniqueName: \"kubernetes.io/projected/8da1090b-4086-4e84-b252-982a0f876031-kube-api-access-zfg56\") on node \"crc\" DevicePath \"\"" Oct 08 18:29:08 crc kubenswrapper[4750]: I1008 18:29:08.446329 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-d6q8c" event={"ID":"72b55b3b-50b6-4ef2-81d2-801582731466","Type":"ContainerStarted","Data":"f096f491816489e7c0702cbcc047978c74c8d6f41e3e5ccd8ce038ae33450c80"} Oct 08 18:29:08 crc kubenswrapper[4750]: I1008 18:29:08.448738 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-564965cbfc-9pvkl" event={"ID":"60e57b91-0d24-4264-ae0e-7e86b6737533","Type":"ContainerStarted","Data":"1664403d475f5963b613c71bcf3dcf6101ef0fc025883199eede3357d755838e"} Oct 08 18:29:08 crc kubenswrapper[4750]: I1008 18:29:08.448865 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-564965cbfc-9pvkl" Oct 08 18:29:08 crc kubenswrapper[4750]: I1008 18:29:08.468150 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-d6q8c" podStartSLOduration=3.420076365 podStartE2EDuration="8.468132985s" podCreationTimestamp="2025-10-08 18:29:00 +0000 UTC" 
firstStartedPulling="2025-10-08 18:29:02.018806062 +0000 UTC m=+1097.931777075" lastFinishedPulling="2025-10-08 18:29:07.066862682 +0000 UTC m=+1102.979833695" observedRunningTime="2025-10-08 18:29:08.461514722 +0000 UTC m=+1104.374485735" watchObservedRunningTime="2025-10-08 18:29:08.468132985 +0000 UTC m=+1104.381103988" Oct 08 18:29:08 crc kubenswrapper[4750]: I1008 18:29:08.482800 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-564965cbfc-9pvkl" podStartSLOduration=6.482783204 podStartE2EDuration="6.482783204s" podCreationTimestamp="2025-10-08 18:29:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:29:08.480290553 +0000 UTC m=+1104.393261586" watchObservedRunningTime="2025-10-08 18:29:08.482783204 +0000 UTC m=+1104.395754217" Oct 08 18:29:09 crc kubenswrapper[4750]: E1008 18:29:09.203034 4750 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65af30d1_7ae5_485a_85d4_271a4642c2cf.slice\": RecentStats: unable to find data in memory cache]" Oct 08 18:29:11 crc kubenswrapper[4750]: I1008 18:29:11.477203 4750 generic.go:334] "Generic (PLEG): container finished" podID="72b55b3b-50b6-4ef2-81d2-801582731466" containerID="f096f491816489e7c0702cbcc047978c74c8d6f41e3e5ccd8ce038ae33450c80" exitCode=0 Oct 08 18:29:11 crc kubenswrapper[4750]: I1008 18:29:11.477284 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-d6q8c" event={"ID":"72b55b3b-50b6-4ef2-81d2-801582731466","Type":"ContainerDied","Data":"f096f491816489e7c0702cbcc047978c74c8d6f41e3e5ccd8ce038ae33450c80"} Oct 08 18:29:13 crc kubenswrapper[4750]: I1008 18:29:13.065687 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-564965cbfc-9pvkl" Oct 08 18:29:13 crc 
kubenswrapper[4750]: I1008 18:29:13.129227 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b587f8db7-btv56"] Oct 08 18:29:13 crc kubenswrapper[4750]: I1008 18:29:13.129489 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b587f8db7-btv56" podUID="5e0538d0-2633-437d-b5e3-3a397000a601" containerName="dnsmasq-dns" containerID="cri-o://ef619e03de0bd8aa5ccbf31154f9e5ff769e7e2aabe7c00ba20d3fc73e445cc8" gracePeriod=10 Oct 08 18:29:13 crc kubenswrapper[4750]: I1008 18:29:13.510743 4750 generic.go:334] "Generic (PLEG): container finished" podID="5e0538d0-2633-437d-b5e3-3a397000a601" containerID="ef619e03de0bd8aa5ccbf31154f9e5ff769e7e2aabe7c00ba20d3fc73e445cc8" exitCode=0 Oct 08 18:29:13 crc kubenswrapper[4750]: I1008 18:29:13.511066 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b587f8db7-btv56" event={"ID":"5e0538d0-2633-437d-b5e3-3a397000a601","Type":"ContainerDied","Data":"ef619e03de0bd8aa5ccbf31154f9e5ff769e7e2aabe7c00ba20d3fc73e445cc8"} Oct 08 18:29:16 crc kubenswrapper[4750]: I1008 18:29:16.070599 4750 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7b587f8db7-btv56" podUID="5e0538d0-2633-437d-b5e3-3a397000a601" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.119:5353: connect: connection refused" Oct 08 18:29:18 crc kubenswrapper[4750]: I1008 18:29:18.454082 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-d6q8c" Oct 08 18:29:18 crc kubenswrapper[4750]: I1008 18:29:18.547793 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-d6q8c" event={"ID":"72b55b3b-50b6-4ef2-81d2-801582731466","Type":"ContainerDied","Data":"34291f871415d458a029852083c78f8c1ae21e8a0663fa0158685a2e6340475a"} Oct 08 18:29:18 crc kubenswrapper[4750]: I1008 18:29:18.547827 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34291f871415d458a029852083c78f8c1ae21e8a0663fa0158685a2e6340475a" Oct 08 18:29:18 crc kubenswrapper[4750]: I1008 18:29:18.547842 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-d6q8c" Oct 08 18:29:18 crc kubenswrapper[4750]: I1008 18:29:18.604928 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b55b3b-50b6-4ef2-81d2-801582731466-combined-ca-bundle\") pod \"72b55b3b-50b6-4ef2-81d2-801582731466\" (UID: \"72b55b3b-50b6-4ef2-81d2-801582731466\") " Oct 08 18:29:18 crc kubenswrapper[4750]: I1008 18:29:18.604999 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72b55b3b-50b6-4ef2-81d2-801582731466-config-data\") pod \"72b55b3b-50b6-4ef2-81d2-801582731466\" (UID: \"72b55b3b-50b6-4ef2-81d2-801582731466\") " Oct 08 18:29:18 crc kubenswrapper[4750]: I1008 18:29:18.605091 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swtlg\" (UniqueName: \"kubernetes.io/projected/72b55b3b-50b6-4ef2-81d2-801582731466-kube-api-access-swtlg\") pod \"72b55b3b-50b6-4ef2-81d2-801582731466\" (UID: \"72b55b3b-50b6-4ef2-81d2-801582731466\") " Oct 08 18:29:18 crc kubenswrapper[4750]: I1008 18:29:18.624604 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/72b55b3b-50b6-4ef2-81d2-801582731466-kube-api-access-swtlg" (OuterVolumeSpecName: "kube-api-access-swtlg") pod "72b55b3b-50b6-4ef2-81d2-801582731466" (UID: "72b55b3b-50b6-4ef2-81d2-801582731466"). InnerVolumeSpecName "kube-api-access-swtlg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:29:18 crc kubenswrapper[4750]: I1008 18:29:18.631424 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72b55b3b-50b6-4ef2-81d2-801582731466-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72b55b3b-50b6-4ef2-81d2-801582731466" (UID: "72b55b3b-50b6-4ef2-81d2-801582731466"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:29:18 crc kubenswrapper[4750]: I1008 18:29:18.656564 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72b55b3b-50b6-4ef2-81d2-801582731466-config-data" (OuterVolumeSpecName: "config-data") pod "72b55b3b-50b6-4ef2-81d2-801582731466" (UID: "72b55b3b-50b6-4ef2-81d2-801582731466"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:29:18 crc kubenswrapper[4750]: I1008 18:29:18.706925 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b55b3b-50b6-4ef2-81d2-801582731466-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:29:18 crc kubenswrapper[4750]: I1008 18:29:18.706961 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72b55b3b-50b6-4ef2-81d2-801582731466-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:29:18 crc kubenswrapper[4750]: I1008 18:29:18.706971 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swtlg\" (UniqueName: \"kubernetes.io/projected/72b55b3b-50b6-4ef2-81d2-801582731466-kube-api-access-swtlg\") on node \"crc\" DevicePath \"\"" Oct 08 18:29:18 crc kubenswrapper[4750]: I1008 18:29:18.724448 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b587f8db7-btv56" Oct 08 18:29:18 crc kubenswrapper[4750]: I1008 18:29:18.909370 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e0538d0-2633-437d-b5e3-3a397000a601-ovsdbserver-sb\") pod \"5e0538d0-2633-437d-b5e3-3a397000a601\" (UID: \"5e0538d0-2633-437d-b5e3-3a397000a601\") " Oct 08 18:29:18 crc kubenswrapper[4750]: I1008 18:29:18.909757 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e0538d0-2633-437d-b5e3-3a397000a601-dns-svc\") pod \"5e0538d0-2633-437d-b5e3-3a397000a601\" (UID: \"5e0538d0-2633-437d-b5e3-3a397000a601\") " Oct 08 18:29:18 crc kubenswrapper[4750]: I1008 18:29:18.909830 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e0538d0-2633-437d-b5e3-3a397000a601-config\") pod 
\"5e0538d0-2633-437d-b5e3-3a397000a601\" (UID: \"5e0538d0-2633-437d-b5e3-3a397000a601\") " Oct 08 18:29:18 crc kubenswrapper[4750]: I1008 18:29:18.909877 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2d6j\" (UniqueName: \"kubernetes.io/projected/5e0538d0-2633-437d-b5e3-3a397000a601-kube-api-access-h2d6j\") pod \"5e0538d0-2633-437d-b5e3-3a397000a601\" (UID: \"5e0538d0-2633-437d-b5e3-3a397000a601\") " Oct 08 18:29:18 crc kubenswrapper[4750]: I1008 18:29:18.909932 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e0538d0-2633-437d-b5e3-3a397000a601-ovsdbserver-nb\") pod \"5e0538d0-2633-437d-b5e3-3a397000a601\" (UID: \"5e0538d0-2633-437d-b5e3-3a397000a601\") " Oct 08 18:29:18 crc kubenswrapper[4750]: I1008 18:29:18.914171 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e0538d0-2633-437d-b5e3-3a397000a601-kube-api-access-h2d6j" (OuterVolumeSpecName: "kube-api-access-h2d6j") pod "5e0538d0-2633-437d-b5e3-3a397000a601" (UID: "5e0538d0-2633-437d-b5e3-3a397000a601"). InnerVolumeSpecName "kube-api-access-h2d6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:29:18 crc kubenswrapper[4750]: I1008 18:29:18.948612 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e0538d0-2633-437d-b5e3-3a397000a601-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5e0538d0-2633-437d-b5e3-3a397000a601" (UID: "5e0538d0-2633-437d-b5e3-3a397000a601"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:29:18 crc kubenswrapper[4750]: I1008 18:29:18.948632 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e0538d0-2633-437d-b5e3-3a397000a601-config" (OuterVolumeSpecName: "config") pod "5e0538d0-2633-437d-b5e3-3a397000a601" (UID: "5e0538d0-2633-437d-b5e3-3a397000a601"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:29:18 crc kubenswrapper[4750]: I1008 18:29:18.949145 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e0538d0-2633-437d-b5e3-3a397000a601-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5e0538d0-2633-437d-b5e3-3a397000a601" (UID: "5e0538d0-2633-437d-b5e3-3a397000a601"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:29:18 crc kubenswrapper[4750]: I1008 18:29:18.953212 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e0538d0-2633-437d-b5e3-3a397000a601-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5e0538d0-2633-437d-b5e3-3a397000a601" (UID: "5e0538d0-2633-437d-b5e3-3a397000a601"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.011869 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e0538d0-2633-437d-b5e3-3a397000a601-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.011894 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2d6j\" (UniqueName: \"kubernetes.io/projected/5e0538d0-2633-437d-b5e3-3a397000a601-kube-api-access-h2d6j\") on node \"crc\" DevicePath \"\"" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.011905 4750 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e0538d0-2633-437d-b5e3-3a397000a601-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.011914 4750 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e0538d0-2633-437d-b5e3-3a397000a601-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.011922 4750 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e0538d0-2633-437d-b5e3-3a397000a601-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 18:29:19 crc kubenswrapper[4750]: E1008 18:29:19.436271 4750 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65af30d1_7ae5_485a_85d4_271a4642c2cf.slice\": RecentStats: unable to find data in memory cache]" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.557920 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b587f8db7-btv56" 
event={"ID":"5e0538d0-2633-437d-b5e3-3a397000a601","Type":"ContainerDied","Data":"2fe25efd24d87a2a3273adb23444542dd2cbc48a800642866621f8a3ce99e616"} Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.557969 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b587f8db7-btv56" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.557982 4750 scope.go:117] "RemoveContainer" containerID="ef619e03de0bd8aa5ccbf31154f9e5ff769e7e2aabe7c00ba20d3fc73e445cc8" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.586960 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b587f8db7-btv56"] Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.597432 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b587f8db7-btv56"] Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.700245 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6877b6c9cc-7jnsb"] Oct 08 18:29:19 crc kubenswrapper[4750]: E1008 18:29:19.704820 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3a80f86-2304-4b69-9cc8-8ef39c25c999" containerName="mariadb-database-create" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.704867 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3a80f86-2304-4b69-9cc8-8ef39c25c999" containerName="mariadb-database-create" Oct 08 18:29:19 crc kubenswrapper[4750]: E1008 18:29:19.704888 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72b55b3b-50b6-4ef2-81d2-801582731466" containerName="keystone-db-sync" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.704897 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="72b55b3b-50b6-4ef2-81d2-801582731466" containerName="keystone-db-sync" Oct 08 18:29:19 crc kubenswrapper[4750]: E1008 18:29:19.704910 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fff367e-d784-48f0-ad74-571e1587abbb" 
containerName="mariadb-database-create" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.704918 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fff367e-d784-48f0-ad74-571e1587abbb" containerName="mariadb-database-create" Oct 08 18:29:19 crc kubenswrapper[4750]: E1008 18:29:19.704963 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e0538d0-2633-437d-b5e3-3a397000a601" containerName="init" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.704971 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e0538d0-2633-437d-b5e3-3a397000a601" containerName="init" Oct 08 18:29:19 crc kubenswrapper[4750]: E1008 18:29:19.704979 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8da1090b-4086-4e84-b252-982a0f876031" containerName="mariadb-database-create" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.704987 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="8da1090b-4086-4e84-b252-982a0f876031" containerName="mariadb-database-create" Oct 08 18:29:19 crc kubenswrapper[4750]: E1008 18:29:19.704998 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e0538d0-2633-437d-b5e3-3a397000a601" containerName="dnsmasq-dns" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.705007 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e0538d0-2633-437d-b5e3-3a397000a601" containerName="dnsmasq-dns" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.705302 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="72b55b3b-50b6-4ef2-81d2-801582731466" containerName="keystone-db-sync" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.705318 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="8da1090b-4086-4e84-b252-982a0f876031" containerName="mariadb-database-create" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.705331 4750 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c3a80f86-2304-4b69-9cc8-8ef39c25c999" containerName="mariadb-database-create" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.705345 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fff367e-d784-48f0-ad74-571e1587abbb" containerName="mariadb-database-create" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.705358 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e0538d0-2633-437d-b5e3-3a397000a601" containerName="dnsmasq-dns" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.706448 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6877b6c9cc-7jnsb" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.708460 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6877b6c9cc-7jnsb"] Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.764979 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-c7w47"] Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.766125 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-c7w47" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.771778 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.771822 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rx2f7" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.771890 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.771787 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.785928 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-c7w47"] Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.824240 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4507e93f-fbeb-439e-9306-ecad48b6363f-ovsdbserver-sb\") pod \"dnsmasq-dns-6877b6c9cc-7jnsb\" (UID: \"4507e93f-fbeb-439e-9306-ecad48b6363f\") " pod="openstack/dnsmasq-dns-6877b6c9cc-7jnsb" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.826817 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4507e93f-fbeb-439e-9306-ecad48b6363f-config\") pod \"dnsmasq-dns-6877b6c9cc-7jnsb\" (UID: \"4507e93f-fbeb-439e-9306-ecad48b6363f\") " pod="openstack/dnsmasq-dns-6877b6c9cc-7jnsb" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.829905 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4507e93f-fbeb-439e-9306-ecad48b6363f-dns-swift-storage-0\") pod 
\"dnsmasq-dns-6877b6c9cc-7jnsb\" (UID: \"4507e93f-fbeb-439e-9306-ecad48b6363f\") " pod="openstack/dnsmasq-dns-6877b6c9cc-7jnsb" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.830016 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4507e93f-fbeb-439e-9306-ecad48b6363f-dns-svc\") pod \"dnsmasq-dns-6877b6c9cc-7jnsb\" (UID: \"4507e93f-fbeb-439e-9306-ecad48b6363f\") " pod="openstack/dnsmasq-dns-6877b6c9cc-7jnsb" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.830091 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mhmn\" (UniqueName: \"kubernetes.io/projected/4507e93f-fbeb-439e-9306-ecad48b6363f-kube-api-access-6mhmn\") pod \"dnsmasq-dns-6877b6c9cc-7jnsb\" (UID: \"4507e93f-fbeb-439e-9306-ecad48b6363f\") " pod="openstack/dnsmasq-dns-6877b6c9cc-7jnsb" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.830168 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4507e93f-fbeb-439e-9306-ecad48b6363f-ovsdbserver-nb\") pod \"dnsmasq-dns-6877b6c9cc-7jnsb\" (UID: \"4507e93f-fbeb-439e-9306-ecad48b6363f\") " pod="openstack/dnsmasq-dns-6877b6c9cc-7jnsb" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.883654 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.886263 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.889590 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.889742 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.912378 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.933638 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a\") " pod="openstack/ceilometer-0" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.933717 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4507e93f-fbeb-439e-9306-ecad48b6363f-dns-swift-storage-0\") pod \"dnsmasq-dns-6877b6c9cc-7jnsb\" (UID: \"4507e93f-fbeb-439e-9306-ecad48b6363f\") " pod="openstack/dnsmasq-dns-6877b6c9cc-7jnsb" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.933738 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7v6k\" (UniqueName: \"kubernetes.io/projected/9aae9be2-a135-41c7-84ba-998d959afd39-kube-api-access-j7v6k\") pod \"keystone-bootstrap-c7w47\" (UID: \"9aae9be2-a135-41c7-84ba-998d959afd39\") " pod="openstack/keystone-bootstrap-c7w47" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.933760 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9aae9be2-a135-41c7-84ba-998d959afd39-combined-ca-bundle\") pod \"keystone-bootstrap-c7w47\" (UID: \"9aae9be2-a135-41c7-84ba-998d959afd39\") " pod="openstack/keystone-bootstrap-c7w47" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.933779 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a\") " pod="openstack/ceilometer-0" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.933817 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a-config-data\") pod \"ceilometer-0\" (UID: \"5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a\") " pod="openstack/ceilometer-0" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.933843 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4507e93f-fbeb-439e-9306-ecad48b6363f-dns-svc\") pod \"dnsmasq-dns-6877b6c9cc-7jnsb\" (UID: \"4507e93f-fbeb-439e-9306-ecad48b6363f\") " pod="openstack/dnsmasq-dns-6877b6c9cc-7jnsb" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.933873 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a-run-httpd\") pod \"ceilometer-0\" (UID: \"5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a\") " pod="openstack/ceilometer-0" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.933893 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a-log-httpd\") pod \"ceilometer-0\" (UID: 
\"5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a\") " pod="openstack/ceilometer-0" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.933913 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mhmn\" (UniqueName: \"kubernetes.io/projected/4507e93f-fbeb-439e-9306-ecad48b6363f-kube-api-access-6mhmn\") pod \"dnsmasq-dns-6877b6c9cc-7jnsb\" (UID: \"4507e93f-fbeb-439e-9306-ecad48b6363f\") " pod="openstack/dnsmasq-dns-6877b6c9cc-7jnsb" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.933929 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9aae9be2-a135-41c7-84ba-998d959afd39-config-data\") pod \"keystone-bootstrap-c7w47\" (UID: \"9aae9be2-a135-41c7-84ba-998d959afd39\") " pod="openstack/keystone-bootstrap-c7w47" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.933962 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9aae9be2-a135-41c7-84ba-998d959afd39-scripts\") pod \"keystone-bootstrap-c7w47\" (UID: \"9aae9be2-a135-41c7-84ba-998d959afd39\") " pod="openstack/keystone-bootstrap-c7w47" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.933976 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m6wx\" (UniqueName: \"kubernetes.io/projected/5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a-kube-api-access-5m6wx\") pod \"ceilometer-0\" (UID: \"5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a\") " pod="openstack/ceilometer-0" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.933996 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4507e93f-fbeb-439e-9306-ecad48b6363f-ovsdbserver-nb\") pod \"dnsmasq-dns-6877b6c9cc-7jnsb\" (UID: \"4507e93f-fbeb-439e-9306-ecad48b6363f\") " 
pod="openstack/dnsmasq-dns-6877b6c9cc-7jnsb" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.934021 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9aae9be2-a135-41c7-84ba-998d959afd39-credential-keys\") pod \"keystone-bootstrap-c7w47\" (UID: \"9aae9be2-a135-41c7-84ba-998d959afd39\") " pod="openstack/keystone-bootstrap-c7w47" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.934044 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4507e93f-fbeb-439e-9306-ecad48b6363f-ovsdbserver-sb\") pod \"dnsmasq-dns-6877b6c9cc-7jnsb\" (UID: \"4507e93f-fbeb-439e-9306-ecad48b6363f\") " pod="openstack/dnsmasq-dns-6877b6c9cc-7jnsb" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.934066 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a-scripts\") pod \"ceilometer-0\" (UID: \"5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a\") " pod="openstack/ceilometer-0" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.934116 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4507e93f-fbeb-439e-9306-ecad48b6363f-config\") pod \"dnsmasq-dns-6877b6c9cc-7jnsb\" (UID: \"4507e93f-fbeb-439e-9306-ecad48b6363f\") " pod="openstack/dnsmasq-dns-6877b6c9cc-7jnsb" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.934134 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9aae9be2-a135-41c7-84ba-998d959afd39-fernet-keys\") pod \"keystone-bootstrap-c7w47\" (UID: \"9aae9be2-a135-41c7-84ba-998d959afd39\") " pod="openstack/keystone-bootstrap-c7w47" Oct 08 18:29:19 crc kubenswrapper[4750]: 
I1008 18:29:19.935605 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4507e93f-fbeb-439e-9306-ecad48b6363f-dns-swift-storage-0\") pod \"dnsmasq-dns-6877b6c9cc-7jnsb\" (UID: \"4507e93f-fbeb-439e-9306-ecad48b6363f\") " pod="openstack/dnsmasq-dns-6877b6c9cc-7jnsb" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.936198 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4507e93f-fbeb-439e-9306-ecad48b6363f-ovsdbserver-nb\") pod \"dnsmasq-dns-6877b6c9cc-7jnsb\" (UID: \"4507e93f-fbeb-439e-9306-ecad48b6363f\") " pod="openstack/dnsmasq-dns-6877b6c9cc-7jnsb" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.936353 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4507e93f-fbeb-439e-9306-ecad48b6363f-dns-svc\") pod \"dnsmasq-dns-6877b6c9cc-7jnsb\" (UID: \"4507e93f-fbeb-439e-9306-ecad48b6363f\") " pod="openstack/dnsmasq-dns-6877b6c9cc-7jnsb" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.936910 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4507e93f-fbeb-439e-9306-ecad48b6363f-ovsdbserver-sb\") pod \"dnsmasq-dns-6877b6c9cc-7jnsb\" (UID: \"4507e93f-fbeb-439e-9306-ecad48b6363f\") " pod="openstack/dnsmasq-dns-6877b6c9cc-7jnsb" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.937152 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4507e93f-fbeb-439e-9306-ecad48b6363f-config\") pod \"dnsmasq-dns-6877b6c9cc-7jnsb\" (UID: \"4507e93f-fbeb-439e-9306-ecad48b6363f\") " pod="openstack/dnsmasq-dns-6877b6c9cc-7jnsb" Oct 08 18:29:19 crc kubenswrapper[4750]: I1008 18:29:19.966940 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-6mhmn\" (UniqueName: \"kubernetes.io/projected/4507e93f-fbeb-439e-9306-ecad48b6363f-kube-api-access-6mhmn\") pod \"dnsmasq-dns-6877b6c9cc-7jnsb\" (UID: \"4507e93f-fbeb-439e-9306-ecad48b6363f\") " pod="openstack/dnsmasq-dns-6877b6c9cc-7jnsb" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.014348 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6877b6c9cc-7jnsb"] Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.015158 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6877b6c9cc-7jnsb" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.034917 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a-run-httpd\") pod \"ceilometer-0\" (UID: \"5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a\") " pod="openstack/ceilometer-0" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.034958 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a-log-httpd\") pod \"ceilometer-0\" (UID: \"5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a\") " pod="openstack/ceilometer-0" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.034988 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9aae9be2-a135-41c7-84ba-998d959afd39-config-data\") pod \"keystone-bootstrap-c7w47\" (UID: \"9aae9be2-a135-41c7-84ba-998d959afd39\") " pod="openstack/keystone-bootstrap-c7w47" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.035024 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9aae9be2-a135-41c7-84ba-998d959afd39-scripts\") pod \"keystone-bootstrap-c7w47\" (UID: \"9aae9be2-a135-41c7-84ba-998d959afd39\") " 
pod="openstack/keystone-bootstrap-c7w47" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.035044 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m6wx\" (UniqueName: \"kubernetes.io/projected/5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a-kube-api-access-5m6wx\") pod \"ceilometer-0\" (UID: \"5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a\") " pod="openstack/ceilometer-0" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.035077 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9aae9be2-a135-41c7-84ba-998d959afd39-credential-keys\") pod \"keystone-bootstrap-c7w47\" (UID: \"9aae9be2-a135-41c7-84ba-998d959afd39\") " pod="openstack/keystone-bootstrap-c7w47" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.035109 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a-scripts\") pod \"ceilometer-0\" (UID: \"5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a\") " pod="openstack/ceilometer-0" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.035147 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9aae9be2-a135-41c7-84ba-998d959afd39-fernet-keys\") pod \"keystone-bootstrap-c7w47\" (UID: \"9aae9be2-a135-41c7-84ba-998d959afd39\") " pod="openstack/keystone-bootstrap-c7w47" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.035167 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a\") " pod="openstack/ceilometer-0" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.035198 4750 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-j7v6k\" (UniqueName: \"kubernetes.io/projected/9aae9be2-a135-41c7-84ba-998d959afd39-kube-api-access-j7v6k\") pod \"keystone-bootstrap-c7w47\" (UID: \"9aae9be2-a135-41c7-84ba-998d959afd39\") " pod="openstack/keystone-bootstrap-c7w47" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.035220 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9aae9be2-a135-41c7-84ba-998d959afd39-combined-ca-bundle\") pod \"keystone-bootstrap-c7w47\" (UID: \"9aae9be2-a135-41c7-84ba-998d959afd39\") " pod="openstack/keystone-bootstrap-c7w47" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.035235 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a\") " pod="openstack/ceilometer-0" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.035256 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a-config-data\") pod \"ceilometer-0\" (UID: \"5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a\") " pod="openstack/ceilometer-0" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.038086 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a-run-httpd\") pod \"ceilometer-0\" (UID: \"5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a\") " pod="openstack/ceilometer-0" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.039517 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a-log-httpd\") pod \"ceilometer-0\" (UID: 
\"5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a\") " pod="openstack/ceilometer-0" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.045719 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9aae9be2-a135-41c7-84ba-998d959afd39-combined-ca-bundle\") pod \"keystone-bootstrap-c7w47\" (UID: \"9aae9be2-a135-41c7-84ba-998d959afd39\") " pod="openstack/keystone-bootstrap-c7w47" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.047244 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d96c67b5-4vw9x"] Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.056413 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9aae9be2-a135-41c7-84ba-998d959afd39-scripts\") pod \"keystone-bootstrap-c7w47\" (UID: \"9aae9be2-a135-41c7-84ba-998d959afd39\") " pod="openstack/keystone-bootstrap-c7w47" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.056786 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9aae9be2-a135-41c7-84ba-998d959afd39-credential-keys\") pod \"keystone-bootstrap-c7w47\" (UID: \"9aae9be2-a135-41c7-84ba-998d959afd39\") " pod="openstack/keystone-bootstrap-c7w47" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.057481 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9aae9be2-a135-41c7-84ba-998d959afd39-config-data\") pod \"keystone-bootstrap-c7w47\" (UID: \"9aae9be2-a135-41c7-84ba-998d959afd39\") " pod="openstack/keystone-bootstrap-c7w47" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.062272 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a-scripts\") pod \"ceilometer-0\" (UID: 
\"5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a\") " pod="openstack/ceilometer-0" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.067270 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a-config-data\") pod \"ceilometer-0\" (UID: \"5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a\") " pod="openstack/ceilometer-0" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.070993 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a\") " pod="openstack/ceilometer-0" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.077789 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9aae9be2-a135-41c7-84ba-998d959afd39-fernet-keys\") pod \"keystone-bootstrap-c7w47\" (UID: \"9aae9be2-a135-41c7-84ba-998d959afd39\") " pod="openstack/keystone-bootstrap-c7w47" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.078848 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d96c67b5-4vw9x" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.080450 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7v6k\" (UniqueName: \"kubernetes.io/projected/9aae9be2-a135-41c7-84ba-998d959afd39-kube-api-access-j7v6k\") pod \"keystone-bootstrap-c7w47\" (UID: \"9aae9be2-a135-41c7-84ba-998d959afd39\") " pod="openstack/keystone-bootstrap-c7w47" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.080911 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a\") " pod="openstack/ceilometer-0" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.087049 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m6wx\" (UniqueName: \"kubernetes.io/projected/5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a-kube-api-access-5m6wx\") pod \"ceilometer-0\" (UID: \"5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a\") " pod="openstack/ceilometer-0" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.098351 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-c7w47" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.106383 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d96c67b5-4vw9x"] Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.136801 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6964604e-3015-4e12-b64e-83b733af5806-dns-swift-storage-0\") pod \"dnsmasq-dns-7d96c67b5-4vw9x\" (UID: \"6964604e-3015-4e12-b64e-83b733af5806\") " pod="openstack/dnsmasq-dns-7d96c67b5-4vw9x" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.136864 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6964604e-3015-4e12-b64e-83b733af5806-dns-svc\") pod \"dnsmasq-dns-7d96c67b5-4vw9x\" (UID: \"6964604e-3015-4e12-b64e-83b733af5806\") " pod="openstack/dnsmasq-dns-7d96c67b5-4vw9x" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.136884 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6964604e-3015-4e12-b64e-83b733af5806-ovsdbserver-sb\") pod \"dnsmasq-dns-7d96c67b5-4vw9x\" (UID: \"6964604e-3015-4e12-b64e-83b733af5806\") " pod="openstack/dnsmasq-dns-7d96c67b5-4vw9x" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.136902 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6964604e-3015-4e12-b64e-83b733af5806-config\") pod \"dnsmasq-dns-7d96c67b5-4vw9x\" (UID: \"6964604e-3015-4e12-b64e-83b733af5806\") " pod="openstack/dnsmasq-dns-7d96c67b5-4vw9x" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.136935 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6964604e-3015-4e12-b64e-83b733af5806-ovsdbserver-nb\") pod \"dnsmasq-dns-7d96c67b5-4vw9x\" (UID: \"6964604e-3015-4e12-b64e-83b733af5806\") " pod="openstack/dnsmasq-dns-7d96c67b5-4vw9x" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.137238 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d524l\" (UniqueName: \"kubernetes.io/projected/6964604e-3015-4e12-b64e-83b733af5806-kube-api-access-d524l\") pod \"dnsmasq-dns-7d96c67b5-4vw9x\" (UID: \"6964604e-3015-4e12-b64e-83b733af5806\") " pod="openstack/dnsmasq-dns-7d96c67b5-4vw9x" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.146338 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-rlsgw"] Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.147847 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-rlsgw" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.150216 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.150420 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.151638 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-8swdj" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.157094 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-rlsgw"] Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.213082 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.223268 4750 scope.go:117] "RemoveContainer" containerID="c3a7105f7b8bd3c9e326bdba4e4939b8c39174000473360506f5ab0a1cabd421" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.239353 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdfe0b7d-f014-4453-b94c-3842ebdd4052-combined-ca-bundle\") pod \"placement-db-sync-rlsgw\" (UID: \"bdfe0b7d-f014-4453-b94c-3842ebdd4052\") " pod="openstack/placement-db-sync-rlsgw" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.239412 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6964604e-3015-4e12-b64e-83b733af5806-dns-svc\") pod \"dnsmasq-dns-7d96c67b5-4vw9x\" (UID: \"6964604e-3015-4e12-b64e-83b733af5806\") " pod="openstack/dnsmasq-dns-7d96c67b5-4vw9x" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.239438 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6964604e-3015-4e12-b64e-83b733af5806-ovsdbserver-sb\") pod \"dnsmasq-dns-7d96c67b5-4vw9x\" (UID: \"6964604e-3015-4e12-b64e-83b733af5806\") " pod="openstack/dnsmasq-dns-7d96c67b5-4vw9x" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.239457 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6964604e-3015-4e12-b64e-83b733af5806-config\") pod \"dnsmasq-dns-7d96c67b5-4vw9x\" (UID: \"6964604e-3015-4e12-b64e-83b733af5806\") " pod="openstack/dnsmasq-dns-7d96c67b5-4vw9x" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.239497 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/6964604e-3015-4e12-b64e-83b733af5806-ovsdbserver-nb\") pod \"dnsmasq-dns-7d96c67b5-4vw9x\" (UID: \"6964604e-3015-4e12-b64e-83b733af5806\") " pod="openstack/dnsmasq-dns-7d96c67b5-4vw9x" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.239517 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdfe0b7d-f014-4453-b94c-3842ebdd4052-config-data\") pod \"placement-db-sync-rlsgw\" (UID: \"bdfe0b7d-f014-4453-b94c-3842ebdd4052\") " pod="openstack/placement-db-sync-rlsgw" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.239560 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdfe0b7d-f014-4453-b94c-3842ebdd4052-logs\") pod \"placement-db-sync-rlsgw\" (UID: \"bdfe0b7d-f014-4453-b94c-3842ebdd4052\") " pod="openstack/placement-db-sync-rlsgw" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.239598 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d524l\" (UniqueName: \"kubernetes.io/projected/6964604e-3015-4e12-b64e-83b733af5806-kube-api-access-d524l\") pod \"dnsmasq-dns-7d96c67b5-4vw9x\" (UID: \"6964604e-3015-4e12-b64e-83b733af5806\") " pod="openstack/dnsmasq-dns-7d96c67b5-4vw9x" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.239710 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdfe0b7d-f014-4453-b94c-3842ebdd4052-scripts\") pod \"placement-db-sync-rlsgw\" (UID: \"bdfe0b7d-f014-4453-b94c-3842ebdd4052\") " pod="openstack/placement-db-sync-rlsgw" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.239754 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cnv7\" (UniqueName: 
\"kubernetes.io/projected/bdfe0b7d-f014-4453-b94c-3842ebdd4052-kube-api-access-4cnv7\") pod \"placement-db-sync-rlsgw\" (UID: \"bdfe0b7d-f014-4453-b94c-3842ebdd4052\") " pod="openstack/placement-db-sync-rlsgw" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.239794 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6964604e-3015-4e12-b64e-83b733af5806-dns-swift-storage-0\") pod \"dnsmasq-dns-7d96c67b5-4vw9x\" (UID: \"6964604e-3015-4e12-b64e-83b733af5806\") " pod="openstack/dnsmasq-dns-7d96c67b5-4vw9x" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.240855 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6964604e-3015-4e12-b64e-83b733af5806-ovsdbserver-sb\") pod \"dnsmasq-dns-7d96c67b5-4vw9x\" (UID: \"6964604e-3015-4e12-b64e-83b733af5806\") " pod="openstack/dnsmasq-dns-7d96c67b5-4vw9x" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.241041 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6964604e-3015-4e12-b64e-83b733af5806-ovsdbserver-nb\") pod \"dnsmasq-dns-7d96c67b5-4vw9x\" (UID: \"6964604e-3015-4e12-b64e-83b733af5806\") " pod="openstack/dnsmasq-dns-7d96c67b5-4vw9x" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.241269 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6964604e-3015-4e12-b64e-83b733af5806-dns-swift-storage-0\") pod \"dnsmasq-dns-7d96c67b5-4vw9x\" (UID: \"6964604e-3015-4e12-b64e-83b733af5806\") " pod="openstack/dnsmasq-dns-7d96c67b5-4vw9x" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.241441 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6964604e-3015-4e12-b64e-83b733af5806-dns-svc\") pod 
\"dnsmasq-dns-7d96c67b5-4vw9x\" (UID: \"6964604e-3015-4e12-b64e-83b733af5806\") " pod="openstack/dnsmasq-dns-7d96c67b5-4vw9x" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.241734 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6964604e-3015-4e12-b64e-83b733af5806-config\") pod \"dnsmasq-dns-7d96c67b5-4vw9x\" (UID: \"6964604e-3015-4e12-b64e-83b733af5806\") " pod="openstack/dnsmasq-dns-7d96c67b5-4vw9x" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.262017 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d524l\" (UniqueName: \"kubernetes.io/projected/6964604e-3015-4e12-b64e-83b733af5806-kube-api-access-d524l\") pod \"dnsmasq-dns-7d96c67b5-4vw9x\" (UID: \"6964604e-3015-4e12-b64e-83b733af5806\") " pod="openstack/dnsmasq-dns-7d96c67b5-4vw9x" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.340790 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdfe0b7d-f014-4453-b94c-3842ebdd4052-scripts\") pod \"placement-db-sync-rlsgw\" (UID: \"bdfe0b7d-f014-4453-b94c-3842ebdd4052\") " pod="openstack/placement-db-sync-rlsgw" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.341958 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cnv7\" (UniqueName: \"kubernetes.io/projected/bdfe0b7d-f014-4453-b94c-3842ebdd4052-kube-api-access-4cnv7\") pod \"placement-db-sync-rlsgw\" (UID: \"bdfe0b7d-f014-4453-b94c-3842ebdd4052\") " pod="openstack/placement-db-sync-rlsgw" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.342004 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdfe0b7d-f014-4453-b94c-3842ebdd4052-combined-ca-bundle\") pod \"placement-db-sync-rlsgw\" (UID: \"bdfe0b7d-f014-4453-b94c-3842ebdd4052\") " 
pod="openstack/placement-db-sync-rlsgw" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.342060 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdfe0b7d-f014-4453-b94c-3842ebdd4052-config-data\") pod \"placement-db-sync-rlsgw\" (UID: \"bdfe0b7d-f014-4453-b94c-3842ebdd4052\") " pod="openstack/placement-db-sync-rlsgw" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.342092 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdfe0b7d-f014-4453-b94c-3842ebdd4052-logs\") pod \"placement-db-sync-rlsgw\" (UID: \"bdfe0b7d-f014-4453-b94c-3842ebdd4052\") " pod="openstack/placement-db-sync-rlsgw" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.342440 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdfe0b7d-f014-4453-b94c-3842ebdd4052-logs\") pod \"placement-db-sync-rlsgw\" (UID: \"bdfe0b7d-f014-4453-b94c-3842ebdd4052\") " pod="openstack/placement-db-sync-rlsgw" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.346114 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdfe0b7d-f014-4453-b94c-3842ebdd4052-combined-ca-bundle\") pod \"placement-db-sync-rlsgw\" (UID: \"bdfe0b7d-f014-4453-b94c-3842ebdd4052\") " pod="openstack/placement-db-sync-rlsgw" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.346681 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdfe0b7d-f014-4453-b94c-3842ebdd4052-scripts\") pod \"placement-db-sync-rlsgw\" (UID: \"bdfe0b7d-f014-4453-b94c-3842ebdd4052\") " pod="openstack/placement-db-sync-rlsgw" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.347332 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/bdfe0b7d-f014-4453-b94c-3842ebdd4052-config-data\") pod \"placement-db-sync-rlsgw\" (UID: \"bdfe0b7d-f014-4453-b94c-3842ebdd4052\") " pod="openstack/placement-db-sync-rlsgw" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.358728 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cnv7\" (UniqueName: \"kubernetes.io/projected/bdfe0b7d-f014-4453-b94c-3842ebdd4052-kube-api-access-4cnv7\") pod \"placement-db-sync-rlsgw\" (UID: \"bdfe0b7d-f014-4453-b94c-3842ebdd4052\") " pod="openstack/placement-db-sync-rlsgw" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.426004 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-7a82-account-create-cwfdm"] Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.427287 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7a82-account-create-cwfdm" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.429332 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.447780 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klw6h\" (UniqueName: \"kubernetes.io/projected/ce3df91c-8c7d-4adc-be58-34b388a95c93-kube-api-access-klw6h\") pod \"cinder-7a82-account-create-cwfdm\" (UID: \"ce3df91c-8c7d-4adc-be58-34b388a95c93\") " pod="openstack/cinder-7a82-account-create-cwfdm" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.448047 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7a82-account-create-cwfdm"] Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.462009 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d96c67b5-4vw9x" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.468785 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-rlsgw" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.549179 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klw6h\" (UniqueName: \"kubernetes.io/projected/ce3df91c-8c7d-4adc-be58-34b388a95c93-kube-api-access-klw6h\") pod \"cinder-7a82-account-create-cwfdm\" (UID: \"ce3df91c-8c7d-4adc-be58-34b388a95c93\") " pod="openstack/cinder-7a82-account-create-cwfdm" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.573818 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klw6h\" (UniqueName: \"kubernetes.io/projected/ce3df91c-8c7d-4adc-be58-34b388a95c93-kube-api-access-klw6h\") pod \"cinder-7a82-account-create-cwfdm\" (UID: \"ce3df91c-8c7d-4adc-be58-34b388a95c93\") " pod="openstack/cinder-7a82-account-create-cwfdm" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.615655 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-5ee7-account-create-6qh5z"] Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.617836 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-5ee7-account-create-6qh5z" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.619648 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.625006 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-5ee7-account-create-6qh5z"] Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.652263 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgrkh\" (UniqueName: \"kubernetes.io/projected/341402ad-8d92-4e75-825c-1849b5f99f4d-kube-api-access-sgrkh\") pod \"barbican-5ee7-account-create-6qh5z\" (UID: \"341402ad-8d92-4e75-825c-1849b5f99f4d\") " pod="openstack/barbican-5ee7-account-create-6qh5z" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.684728 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7a82-account-create-cwfdm" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.745262 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e0538d0-2633-437d-b5e3-3a397000a601" path="/var/lib/kubelet/pods/5e0538d0-2633-437d-b5e3-3a397000a601/volumes" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.755708 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgrkh\" (UniqueName: \"kubernetes.io/projected/341402ad-8d92-4e75-825c-1849b5f99f4d-kube-api-access-sgrkh\") pod \"barbican-5ee7-account-create-6qh5z\" (UID: \"341402ad-8d92-4e75-825c-1849b5f99f4d\") " pod="openstack/barbican-5ee7-account-create-6qh5z" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.779458 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgrkh\" (UniqueName: \"kubernetes.io/projected/341402ad-8d92-4e75-825c-1849b5f99f4d-kube-api-access-sgrkh\") pod 
\"barbican-5ee7-account-create-6qh5z\" (UID: \"341402ad-8d92-4e75-825c-1849b5f99f4d\") " pod="openstack/barbican-5ee7-account-create-6qh5z" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.819284 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-0c01-account-create-wk5nl"] Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.822352 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-0c01-account-create-wk5nl" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.825860 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.832748 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-0c01-account-create-wk5nl"] Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.856504 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7hs2\" (UniqueName: \"kubernetes.io/projected/e094d330-b345-4b12-bf02-7e8d55307fce-kube-api-access-b7hs2\") pod \"neutron-0c01-account-create-wk5nl\" (UID: \"e094d330-b345-4b12-bf02-7e8d55307fce\") " pod="openstack/neutron-0c01-account-create-wk5nl" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.899892 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-c7w47"] Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.958162 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7hs2\" (UniqueName: \"kubernetes.io/projected/e094d330-b345-4b12-bf02-7e8d55307fce-kube-api-access-b7hs2\") pod \"neutron-0c01-account-create-wk5nl\" (UID: \"e094d330-b345-4b12-bf02-7e8d55307fce\") " pod="openstack/neutron-0c01-account-create-wk5nl" Oct 08 18:29:20 crc kubenswrapper[4750]: I1008 18:29:20.975762 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7hs2\" 
(UniqueName: \"kubernetes.io/projected/e094d330-b345-4b12-bf02-7e8d55307fce-kube-api-access-b7hs2\") pod \"neutron-0c01-account-create-wk5nl\" (UID: \"e094d330-b345-4b12-bf02-7e8d55307fce\") " pod="openstack/neutron-0c01-account-create-wk5nl" Oct 08 18:29:21 crc kubenswrapper[4750]: I1008 18:29:21.000737 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-5ee7-account-create-6qh5z" Oct 08 18:29:21 crc kubenswrapper[4750]: I1008 18:29:21.046804 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d96c67b5-4vw9x"] Oct 08 18:29:21 crc kubenswrapper[4750]: W1008 18:29:21.059526 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ddbbe8f_20e9_4f77_a3da_ecec1f700a5a.slice/crio-3988604e7459185f046c114054933e191f3654a1e3fd4ba326cda80726ed88d7 WatchSource:0}: Error finding container 3988604e7459185f046c114054933e191f3654a1e3fd4ba326cda80726ed88d7: Status 404 returned error can't find the container with id 3988604e7459185f046c114054933e191f3654a1e3fd4ba326cda80726ed88d7 Oct 08 18:29:21 crc kubenswrapper[4750]: W1008 18:29:21.060813 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6964604e_3015_4e12_b64e_83b733af5806.slice/crio-2da2c22e424740b35f06edb3ecb57422e1e1fbb04927860662c02ad8a0f3003b WatchSource:0}: Error finding container 2da2c22e424740b35f06edb3ecb57422e1e1fbb04927860662c02ad8a0f3003b: Status 404 returned error can't find the container with id 2da2c22e424740b35f06edb3ecb57422e1e1fbb04927860662c02ad8a0f3003b Oct 08 18:29:21 crc kubenswrapper[4750]: I1008 18:29:21.063812 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 18:29:21 crc kubenswrapper[4750]: I1008 18:29:21.075913 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6877b6c9cc-7jnsb"] Oct 08 
18:29:21 crc kubenswrapper[4750]: W1008 18:29:21.082149 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4507e93f_fbeb_439e_9306_ecad48b6363f.slice/crio-f61a7dd93ac450ca37f98da63629590373213b9ad32e9a2a04e6ca9e743ce64b WatchSource:0}: Error finding container f61a7dd93ac450ca37f98da63629590373213b9ad32e9a2a04e6ca9e743ce64b: Status 404 returned error can't find the container with id f61a7dd93ac450ca37f98da63629590373213b9ad32e9a2a04e6ca9e743ce64b Oct 08 18:29:21 crc kubenswrapper[4750]: I1008 18:29:21.158040 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-0c01-account-create-wk5nl" Oct 08 18:29:21 crc kubenswrapper[4750]: I1008 18:29:21.192661 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-rlsgw"] Oct 08 18:29:21 crc kubenswrapper[4750]: I1008 18:29:21.290684 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7a82-account-create-cwfdm"] Oct 08 18:29:21 crc kubenswrapper[4750]: W1008 18:29:21.298634 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce3df91c_8c7d_4adc_be58_34b388a95c93.slice/crio-38afa049c8c9fc032d6a28b877a09a3199a04034f3ff629459d8a6cabe25c636 WatchSource:0}: Error finding container 38afa049c8c9fc032d6a28b877a09a3199a04034f3ff629459d8a6cabe25c636: Status 404 returned error can't find the container with id 38afa049c8c9fc032d6a28b877a09a3199a04034f3ff629459d8a6cabe25c636 Oct 08 18:29:21 crc kubenswrapper[4750]: I1008 18:29:21.530074 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-5ee7-account-create-6qh5z"] Oct 08 18:29:21 crc kubenswrapper[4750]: W1008 18:29:21.559793 4750 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod341402ad_8d92_4e75_825c_1849b5f99f4d.slice/crio-93a1aadc8691155ab700b95abb59e27be3172c97da4e3bda85ac589fa345fad1 WatchSource:0}: Error finding container 93a1aadc8691155ab700b95abb59e27be3172c97da4e3bda85ac589fa345fad1: Status 404 returned error can't find the container with id 93a1aadc8691155ab700b95abb59e27be3172c97da4e3bda85ac589fa345fad1 Oct 08 18:29:21 crc kubenswrapper[4750]: I1008 18:29:21.586145 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-c7w47" event={"ID":"9aae9be2-a135-41c7-84ba-998d959afd39","Type":"ContainerStarted","Data":"ca1124b65f9b238ccebf542a0599c167610983e694ed8e3367e1a9718e1ed196"} Oct 08 18:29:21 crc kubenswrapper[4750]: I1008 18:29:21.586187 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-c7w47" event={"ID":"9aae9be2-a135-41c7-84ba-998d959afd39","Type":"ContainerStarted","Data":"d2d97e0c8dab98a820e7b55db42e22a8e4a4f29814d8e58d774eb1aa61a32a39"} Oct 08 18:29:21 crc kubenswrapper[4750]: I1008 18:29:21.587819 4750 generic.go:334] "Generic (PLEG): container finished" podID="4507e93f-fbeb-439e-9306-ecad48b6363f" containerID="bd0e881d47e50aae5a0a496d5e9e36dadcf5b80aa19f123c6c826d6a03feb779" exitCode=0 Oct 08 18:29:21 crc kubenswrapper[4750]: I1008 18:29:21.587896 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6877b6c9cc-7jnsb" event={"ID":"4507e93f-fbeb-439e-9306-ecad48b6363f","Type":"ContainerDied","Data":"bd0e881d47e50aae5a0a496d5e9e36dadcf5b80aa19f123c6c826d6a03feb779"} Oct 08 18:29:21 crc kubenswrapper[4750]: I1008 18:29:21.587925 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6877b6c9cc-7jnsb" event={"ID":"4507e93f-fbeb-439e-9306-ecad48b6363f","Type":"ContainerStarted","Data":"f61a7dd93ac450ca37f98da63629590373213b9ad32e9a2a04e6ca9e743ce64b"} Oct 08 18:29:21 crc kubenswrapper[4750]: I1008 18:29:21.592738 4750 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-5ee7-account-create-6qh5z" event={"ID":"341402ad-8d92-4e75-825c-1849b5f99f4d","Type":"ContainerStarted","Data":"93a1aadc8691155ab700b95abb59e27be3172c97da4e3bda85ac589fa345fad1"} Oct 08 18:29:21 crc kubenswrapper[4750]: I1008 18:29:21.594313 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a","Type":"ContainerStarted","Data":"3988604e7459185f046c114054933e191f3654a1e3fd4ba326cda80726ed88d7"} Oct 08 18:29:21 crc kubenswrapper[4750]: I1008 18:29:21.596792 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-6khnd" event={"ID":"7d635eca-619c-4f52-a9d9-73b42d845fbf","Type":"ContainerStarted","Data":"c8bcb751be31666c14a48d5c24330a20b0c43fc84ee99554962325b0311ad93b"} Oct 08 18:29:21 crc kubenswrapper[4750]: I1008 18:29:21.598676 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7a82-account-create-cwfdm" event={"ID":"ce3df91c-8c7d-4adc-be58-34b388a95c93","Type":"ContainerStarted","Data":"38afa049c8c9fc032d6a28b877a09a3199a04034f3ff629459d8a6cabe25c636"} Oct 08 18:29:21 crc kubenswrapper[4750]: I1008 18:29:21.600489 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-rlsgw" event={"ID":"bdfe0b7d-f014-4453-b94c-3842ebdd4052","Type":"ContainerStarted","Data":"ca9acb8a2135375602fc4f60a280f3534a9e91d88f140b9f0f01249cf88e0178"} Oct 08 18:29:21 crc kubenswrapper[4750]: I1008 18:29:21.604765 4750 generic.go:334] "Generic (PLEG): container finished" podID="6964604e-3015-4e12-b64e-83b733af5806" containerID="211393a548b77a9fa2cc9169548d12ee0b3ad4a628304edd0df0cbecc8987904" exitCode=0 Oct 08 18:29:21 crc kubenswrapper[4750]: I1008 18:29:21.604804 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d96c67b5-4vw9x" 
event={"ID":"6964604e-3015-4e12-b64e-83b733af5806","Type":"ContainerDied","Data":"211393a548b77a9fa2cc9169548d12ee0b3ad4a628304edd0df0cbecc8987904"} Oct 08 18:29:21 crc kubenswrapper[4750]: I1008 18:29:21.604824 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d96c67b5-4vw9x" event={"ID":"6964604e-3015-4e12-b64e-83b733af5806","Type":"ContainerStarted","Data":"2da2c22e424740b35f06edb3ecb57422e1e1fbb04927860662c02ad8a0f3003b"} Oct 08 18:29:21 crc kubenswrapper[4750]: I1008 18:29:21.607696 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-c7w47" podStartSLOduration=2.607681387 podStartE2EDuration="2.607681387s" podCreationTimestamp="2025-10-08 18:29:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:29:21.605936454 +0000 UTC m=+1117.518907467" watchObservedRunningTime="2025-10-08 18:29:21.607681387 +0000 UTC m=+1117.520652410" Oct 08 18:29:21 crc kubenswrapper[4750]: I1008 18:29:21.677766 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-6khnd" podStartSLOduration=4.65978113 podStartE2EDuration="17.677716733s" podCreationTimestamp="2025-10-08 18:29:04 +0000 UTC" firstStartedPulling="2025-10-08 18:29:07.316263212 +0000 UTC m=+1103.229234225" lastFinishedPulling="2025-10-08 18:29:20.334198815 +0000 UTC m=+1116.247169828" observedRunningTime="2025-10-08 18:29:21.670387723 +0000 UTC m=+1117.583358756" watchObservedRunningTime="2025-10-08 18:29:21.677716733 +0000 UTC m=+1117.590687746" Oct 08 18:29:21 crc kubenswrapper[4750]: I1008 18:29:21.730900 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-0c01-account-create-wk5nl"] Oct 08 18:29:21 crc kubenswrapper[4750]: W1008 18:29:21.730989 4750 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode094d330_b345_4b12_bf02_7e8d55307fce.slice/crio-0e6a3d50cfe0fd0104284fe8bfd253d12a8705105da5d9a46709ae21b40e1f85 WatchSource:0}: Error finding container 0e6a3d50cfe0fd0104284fe8bfd253d12a8705105da5d9a46709ae21b40e1f85: Status 404 returned error can't find the container with id 0e6a3d50cfe0fd0104284fe8bfd253d12a8705105da5d9a46709ae21b40e1f85 Oct 08 18:29:22 crc kubenswrapper[4750]: I1008 18:29:22.062057 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6877b6c9cc-7jnsb" Oct 08 18:29:22 crc kubenswrapper[4750]: I1008 18:29:22.178005 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4507e93f-fbeb-439e-9306-ecad48b6363f-ovsdbserver-nb\") pod \"4507e93f-fbeb-439e-9306-ecad48b6363f\" (UID: \"4507e93f-fbeb-439e-9306-ecad48b6363f\") " Oct 08 18:29:22 crc kubenswrapper[4750]: I1008 18:29:22.178048 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4507e93f-fbeb-439e-9306-ecad48b6363f-dns-svc\") pod \"4507e93f-fbeb-439e-9306-ecad48b6363f\" (UID: \"4507e93f-fbeb-439e-9306-ecad48b6363f\") " Oct 08 18:29:22 crc kubenswrapper[4750]: I1008 18:29:22.178127 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4507e93f-fbeb-439e-9306-ecad48b6363f-config\") pod \"4507e93f-fbeb-439e-9306-ecad48b6363f\" (UID: \"4507e93f-fbeb-439e-9306-ecad48b6363f\") " Oct 08 18:29:22 crc kubenswrapper[4750]: I1008 18:29:22.178182 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mhmn\" (UniqueName: \"kubernetes.io/projected/4507e93f-fbeb-439e-9306-ecad48b6363f-kube-api-access-6mhmn\") pod \"4507e93f-fbeb-439e-9306-ecad48b6363f\" (UID: 
\"4507e93f-fbeb-439e-9306-ecad48b6363f\") " Oct 08 18:29:22 crc kubenswrapper[4750]: I1008 18:29:22.178248 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4507e93f-fbeb-439e-9306-ecad48b6363f-ovsdbserver-sb\") pod \"4507e93f-fbeb-439e-9306-ecad48b6363f\" (UID: \"4507e93f-fbeb-439e-9306-ecad48b6363f\") " Oct 08 18:29:22 crc kubenswrapper[4750]: I1008 18:29:22.178264 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4507e93f-fbeb-439e-9306-ecad48b6363f-dns-swift-storage-0\") pod \"4507e93f-fbeb-439e-9306-ecad48b6363f\" (UID: \"4507e93f-fbeb-439e-9306-ecad48b6363f\") " Oct 08 18:29:22 crc kubenswrapper[4750]: I1008 18:29:22.183524 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4507e93f-fbeb-439e-9306-ecad48b6363f-kube-api-access-6mhmn" (OuterVolumeSpecName: "kube-api-access-6mhmn") pod "4507e93f-fbeb-439e-9306-ecad48b6363f" (UID: "4507e93f-fbeb-439e-9306-ecad48b6363f"). InnerVolumeSpecName "kube-api-access-6mhmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:29:22 crc kubenswrapper[4750]: I1008 18:29:22.203421 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4507e93f-fbeb-439e-9306-ecad48b6363f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4507e93f-fbeb-439e-9306-ecad48b6363f" (UID: "4507e93f-fbeb-439e-9306-ecad48b6363f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:29:22 crc kubenswrapper[4750]: I1008 18:29:22.203450 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4507e93f-fbeb-439e-9306-ecad48b6363f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4507e93f-fbeb-439e-9306-ecad48b6363f" (UID: "4507e93f-fbeb-439e-9306-ecad48b6363f"). 
InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:29:22 crc kubenswrapper[4750]: I1008 18:29:22.204468 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4507e93f-fbeb-439e-9306-ecad48b6363f-config" (OuterVolumeSpecName: "config") pod "4507e93f-fbeb-439e-9306-ecad48b6363f" (UID: "4507e93f-fbeb-439e-9306-ecad48b6363f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:29:22 crc kubenswrapper[4750]: I1008 18:29:22.207609 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4507e93f-fbeb-439e-9306-ecad48b6363f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4507e93f-fbeb-439e-9306-ecad48b6363f" (UID: "4507e93f-fbeb-439e-9306-ecad48b6363f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:29:22 crc kubenswrapper[4750]: I1008 18:29:22.213050 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4507e93f-fbeb-439e-9306-ecad48b6363f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4507e93f-fbeb-439e-9306-ecad48b6363f" (UID: "4507e93f-fbeb-439e-9306-ecad48b6363f"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:29:22 crc kubenswrapper[4750]: I1008 18:29:22.282836 4750 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4507e93f-fbeb-439e-9306-ecad48b6363f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 18:29:22 crc kubenswrapper[4750]: I1008 18:29:22.282877 4750 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4507e93f-fbeb-439e-9306-ecad48b6363f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 18:29:22 crc kubenswrapper[4750]: I1008 18:29:22.282886 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4507e93f-fbeb-439e-9306-ecad48b6363f-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:29:22 crc kubenswrapper[4750]: I1008 18:29:22.282895 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mhmn\" (UniqueName: \"kubernetes.io/projected/4507e93f-fbeb-439e-9306-ecad48b6363f-kube-api-access-6mhmn\") on node \"crc\" DevicePath \"\"" Oct 08 18:29:22 crc kubenswrapper[4750]: I1008 18:29:22.282906 4750 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4507e93f-fbeb-439e-9306-ecad48b6363f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 18:29:22 crc kubenswrapper[4750]: I1008 18:29:22.282914 4750 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4507e93f-fbeb-439e-9306-ecad48b6363f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 08 18:29:22 crc kubenswrapper[4750]: I1008 18:29:22.369675 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 18:29:22 crc kubenswrapper[4750]: I1008 18:29:22.614022 4750 generic.go:334] "Generic (PLEG): container finished" podID="341402ad-8d92-4e75-825c-1849b5f99f4d" 
containerID="a053f897eeff89050ab907acb6fc0ae927f0d046b3f849e913ee782c384e7d81" exitCode=0 Oct 08 18:29:22 crc kubenswrapper[4750]: I1008 18:29:22.614078 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-5ee7-account-create-6qh5z" event={"ID":"341402ad-8d92-4e75-825c-1849b5f99f4d","Type":"ContainerDied","Data":"a053f897eeff89050ab907acb6fc0ae927f0d046b3f849e913ee782c384e7d81"} Oct 08 18:29:22 crc kubenswrapper[4750]: I1008 18:29:22.615864 4750 generic.go:334] "Generic (PLEG): container finished" podID="e094d330-b345-4b12-bf02-7e8d55307fce" containerID="cf4c2892fadc655bf219082514f3fe07d61bd9425e6e92254e3b9651f1597404" exitCode=0 Oct 08 18:29:22 crc kubenswrapper[4750]: I1008 18:29:22.615946 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0c01-account-create-wk5nl" event={"ID":"e094d330-b345-4b12-bf02-7e8d55307fce","Type":"ContainerDied","Data":"cf4c2892fadc655bf219082514f3fe07d61bd9425e6e92254e3b9651f1597404"} Oct 08 18:29:22 crc kubenswrapper[4750]: I1008 18:29:22.615976 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0c01-account-create-wk5nl" event={"ID":"e094d330-b345-4b12-bf02-7e8d55307fce","Type":"ContainerStarted","Data":"0e6a3d50cfe0fd0104284fe8bfd253d12a8705105da5d9a46709ae21b40e1f85"} Oct 08 18:29:22 crc kubenswrapper[4750]: I1008 18:29:22.625149 4750 generic.go:334] "Generic (PLEG): container finished" podID="ce3df91c-8c7d-4adc-be58-34b388a95c93" containerID="9b81f0da23f03118e4fbf3e11325fa3d29005ce8f6200d3a78e18a1d7bbc7cf8" exitCode=0 Oct 08 18:29:22 crc kubenswrapper[4750]: I1008 18:29:22.625222 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7a82-account-create-cwfdm" event={"ID":"ce3df91c-8c7d-4adc-be58-34b388a95c93","Type":"ContainerDied","Data":"9b81f0da23f03118e4fbf3e11325fa3d29005ce8f6200d3a78e18a1d7bbc7cf8"} Oct 08 18:29:22 crc kubenswrapper[4750]: I1008 18:29:22.637262 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-7d96c67b5-4vw9x" event={"ID":"6964604e-3015-4e12-b64e-83b733af5806","Type":"ContainerStarted","Data":"279cc2214cdfe5a50ef21d37a8076ed4f7cba2124d492e32458479978f538a7a"} Oct 08 18:29:22 crc kubenswrapper[4750]: I1008 18:29:22.637405 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d96c67b5-4vw9x" Oct 08 18:29:22 crc kubenswrapper[4750]: I1008 18:29:22.644624 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6877b6c9cc-7jnsb" Oct 08 18:29:22 crc kubenswrapper[4750]: I1008 18:29:22.645778 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6877b6c9cc-7jnsb" event={"ID":"4507e93f-fbeb-439e-9306-ecad48b6363f","Type":"ContainerDied","Data":"f61a7dd93ac450ca37f98da63629590373213b9ad32e9a2a04e6ca9e743ce64b"} Oct 08 18:29:22 crc kubenswrapper[4750]: I1008 18:29:22.645926 4750 scope.go:117] "RemoveContainer" containerID="bd0e881d47e50aae5a0a496d5e9e36dadcf5b80aa19f123c6c826d6a03feb779" Oct 08 18:29:22 crc kubenswrapper[4750]: I1008 18:29:22.689324 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d96c67b5-4vw9x" podStartSLOduration=2.6893055280000002 podStartE2EDuration="2.689305528s" podCreationTimestamp="2025-10-08 18:29:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:29:22.684213062 +0000 UTC m=+1118.597184095" watchObservedRunningTime="2025-10-08 18:29:22.689305528 +0000 UTC m=+1118.602276541" Oct 08 18:29:22 crc kubenswrapper[4750]: I1008 18:29:22.748264 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6877b6c9cc-7jnsb"] Oct 08 18:29:22 crc kubenswrapper[4750]: I1008 18:29:22.759279 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6877b6c9cc-7jnsb"] Oct 08 18:29:24 crc kubenswrapper[4750]: 
I1008 18:29:24.745498 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4507e93f-fbeb-439e-9306-ecad48b6363f" path="/var/lib/kubelet/pods/4507e93f-fbeb-439e-9306-ecad48b6363f/volumes" Oct 08 18:29:29 crc kubenswrapper[4750]: E1008 18:29:29.668172 4750 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65af30d1_7ae5_485a_85d4_271a4642c2cf.slice\": RecentStats: unable to find data in memory cache]" Oct 08 18:29:29 crc kubenswrapper[4750]: I1008 18:29:29.707713 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 18:29:29 crc kubenswrapper[4750]: I1008 18:29:29.708085 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 18:29:30 crc kubenswrapper[4750]: I1008 18:29:30.465277 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d96c67b5-4vw9x" Oct 08 18:29:30 crc kubenswrapper[4750]: I1008 18:29:30.521419 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-564965cbfc-9pvkl"] Oct 08 18:29:30 crc kubenswrapper[4750]: I1008 18:29:30.521874 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-564965cbfc-9pvkl" podUID="60e57b91-0d24-4264-ae0e-7e86b6737533" containerName="dnsmasq-dns" containerID="cri-o://1664403d475f5963b613c71bcf3dcf6101ef0fc025883199eede3357d755838e" gracePeriod=10 
Oct 08 18:29:31 crc kubenswrapper[4750]: I1008 18:29:31.727040 4750 generic.go:334] "Generic (PLEG): container finished" podID="60e57b91-0d24-4264-ae0e-7e86b6737533" containerID="1664403d475f5963b613c71bcf3dcf6101ef0fc025883199eede3357d755838e" exitCode=0 Oct 08 18:29:31 crc kubenswrapper[4750]: I1008 18:29:31.727135 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-564965cbfc-9pvkl" event={"ID":"60e57b91-0d24-4264-ae0e-7e86b6737533","Type":"ContainerDied","Data":"1664403d475f5963b613c71bcf3dcf6101ef0fc025883199eede3357d755838e"} Oct 08 18:29:32 crc kubenswrapper[4750]: I1008 18:29:32.162518 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-5ee7-account-create-6qh5z" Oct 08 18:29:32 crc kubenswrapper[4750]: I1008 18:29:32.172908 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7a82-account-create-cwfdm" Oct 08 18:29:32 crc kubenswrapper[4750]: I1008 18:29:32.351076 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klw6h\" (UniqueName: \"kubernetes.io/projected/ce3df91c-8c7d-4adc-be58-34b388a95c93-kube-api-access-klw6h\") pod \"ce3df91c-8c7d-4adc-be58-34b388a95c93\" (UID: \"ce3df91c-8c7d-4adc-be58-34b388a95c93\") " Oct 08 18:29:32 crc kubenswrapper[4750]: I1008 18:29:32.351299 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgrkh\" (UniqueName: \"kubernetes.io/projected/341402ad-8d92-4e75-825c-1849b5f99f4d-kube-api-access-sgrkh\") pod \"341402ad-8d92-4e75-825c-1849b5f99f4d\" (UID: \"341402ad-8d92-4e75-825c-1849b5f99f4d\") " Oct 08 18:29:32 crc kubenswrapper[4750]: I1008 18:29:32.358183 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce3df91c-8c7d-4adc-be58-34b388a95c93-kube-api-access-klw6h" (OuterVolumeSpecName: "kube-api-access-klw6h") pod 
"ce3df91c-8c7d-4adc-be58-34b388a95c93" (UID: "ce3df91c-8c7d-4adc-be58-34b388a95c93"). InnerVolumeSpecName "kube-api-access-klw6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:29:32 crc kubenswrapper[4750]: I1008 18:29:32.358244 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/341402ad-8d92-4e75-825c-1849b5f99f4d-kube-api-access-sgrkh" (OuterVolumeSpecName: "kube-api-access-sgrkh") pod "341402ad-8d92-4e75-825c-1849b5f99f4d" (UID: "341402ad-8d92-4e75-825c-1849b5f99f4d"). InnerVolumeSpecName "kube-api-access-sgrkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:29:32 crc kubenswrapper[4750]: I1008 18:29:32.455892 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klw6h\" (UniqueName: \"kubernetes.io/projected/ce3df91c-8c7d-4adc-be58-34b388a95c93-kube-api-access-klw6h\") on node \"crc\" DevicePath \"\"" Oct 08 18:29:32 crc kubenswrapper[4750]: I1008 18:29:32.455931 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgrkh\" (UniqueName: \"kubernetes.io/projected/341402ad-8d92-4e75-825c-1849b5f99f4d-kube-api-access-sgrkh\") on node \"crc\" DevicePath \"\"" Oct 08 18:29:32 crc kubenswrapper[4750]: I1008 18:29:32.737392 4750 generic.go:334] "Generic (PLEG): container finished" podID="9aae9be2-a135-41c7-84ba-998d959afd39" containerID="ca1124b65f9b238ccebf542a0599c167610983e694ed8e3367e1a9718e1ed196" exitCode=0 Oct 08 18:29:32 crc kubenswrapper[4750]: I1008 18:29:32.739258 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-5ee7-account-create-6qh5z" Oct 08 18:29:32 crc kubenswrapper[4750]: I1008 18:29:32.740898 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-7a82-account-create-cwfdm" Oct 08 18:29:32 crc kubenswrapper[4750]: I1008 18:29:32.746145 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-c7w47" event={"ID":"9aae9be2-a135-41c7-84ba-998d959afd39","Type":"ContainerDied","Data":"ca1124b65f9b238ccebf542a0599c167610983e694ed8e3367e1a9718e1ed196"} Oct 08 18:29:32 crc kubenswrapper[4750]: I1008 18:29:32.746184 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-5ee7-account-create-6qh5z" event={"ID":"341402ad-8d92-4e75-825c-1849b5f99f4d","Type":"ContainerDied","Data":"93a1aadc8691155ab700b95abb59e27be3172c97da4e3bda85ac589fa345fad1"} Oct 08 18:29:32 crc kubenswrapper[4750]: I1008 18:29:32.746202 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93a1aadc8691155ab700b95abb59e27be3172c97da4e3bda85ac589fa345fad1" Oct 08 18:29:32 crc kubenswrapper[4750]: I1008 18:29:32.746213 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7a82-account-create-cwfdm" event={"ID":"ce3df91c-8c7d-4adc-be58-34b388a95c93","Type":"ContainerDied","Data":"38afa049c8c9fc032d6a28b877a09a3199a04034f3ff629459d8a6cabe25c636"} Oct 08 18:29:32 crc kubenswrapper[4750]: I1008 18:29:32.746225 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38afa049c8c9fc032d6a28b877a09a3199a04034f3ff629459d8a6cabe25c636" Oct 08 18:29:33 crc kubenswrapper[4750]: E1008 18:29:33.497343 4750 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api@sha256:59448516174fc3bab679b9a8dd62cb9a9d16b5734aadbeb98e960e3b7c79bd22" Oct 08 18:29:33 crc kubenswrapper[4750]: E1008 18:29:33.497547 4750 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api@sha256:59448516174fc3bab679b9a8dd62cb9a9d16b5734aadbeb98e960e3b7c79bd22,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4cnv7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSou
rce{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-rlsgw_openstack(bdfe0b7d-f014-4453-b94c-3842ebdd4052): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 08 18:29:33 crc kubenswrapper[4750]: E1008 18:29:33.498828 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-rlsgw" podUID="bdfe0b7d-f014-4453-b94c-3842ebdd4052" Oct 08 18:29:33 crc kubenswrapper[4750]: I1008 18:29:33.701184 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-0c01-account-create-wk5nl" Oct 08 18:29:33 crc kubenswrapper[4750]: I1008 18:29:33.709190 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-564965cbfc-9pvkl" Oct 08 18:29:33 crc kubenswrapper[4750]: I1008 18:29:33.756134 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-0c01-account-create-wk5nl" Oct 08 18:29:33 crc kubenswrapper[4750]: I1008 18:29:33.756164 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0c01-account-create-wk5nl" event={"ID":"e094d330-b345-4b12-bf02-7e8d55307fce","Type":"ContainerDied","Data":"0e6a3d50cfe0fd0104284fe8bfd253d12a8705105da5d9a46709ae21b40e1f85"} Oct 08 18:29:33 crc kubenswrapper[4750]: I1008 18:29:33.756221 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e6a3d50cfe0fd0104284fe8bfd253d12a8705105da5d9a46709ae21b40e1f85" Oct 08 18:29:33 crc kubenswrapper[4750]: I1008 18:29:33.758747 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-564965cbfc-9pvkl" Oct 08 18:29:33 crc kubenswrapper[4750]: I1008 18:29:33.758895 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-564965cbfc-9pvkl" event={"ID":"60e57b91-0d24-4264-ae0e-7e86b6737533","Type":"ContainerDied","Data":"7b4f273c25803364ee3686cdc31ad9f35d421bfe8e7f57a8c328f0292eb2e27e"} Oct 08 18:29:33 crc kubenswrapper[4750]: I1008 18:29:33.758951 4750 scope.go:117] "RemoveContainer" containerID="1664403d475f5963b613c71bcf3dcf6101ef0fc025883199eede3357d755838e" Oct 08 18:29:33 crc kubenswrapper[4750]: E1008 18:29:33.759409 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api@sha256:59448516174fc3bab679b9a8dd62cb9a9d16b5734aadbeb98e960e3b7c79bd22\\\"\"" pod="openstack/placement-db-sync-rlsgw" podUID="bdfe0b7d-f014-4453-b94c-3842ebdd4052" Oct 08 18:29:33 crc kubenswrapper[4750]: I1008 18:29:33.775608 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60e57b91-0d24-4264-ae0e-7e86b6737533-ovsdbserver-nb\") pod \"60e57b91-0d24-4264-ae0e-7e86b6737533\" (UID: \"60e57b91-0d24-4264-ae0e-7e86b6737533\") " Oct 08 18:29:33 crc kubenswrapper[4750]: I1008 18:29:33.775681 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wvc4\" (UniqueName: \"kubernetes.io/projected/60e57b91-0d24-4264-ae0e-7e86b6737533-kube-api-access-5wvc4\") pod \"60e57b91-0d24-4264-ae0e-7e86b6737533\" (UID: \"60e57b91-0d24-4264-ae0e-7e86b6737533\") " Oct 08 18:29:33 crc kubenswrapper[4750]: I1008 18:29:33.775708 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60e57b91-0d24-4264-ae0e-7e86b6737533-dns-svc\") pod 
\"60e57b91-0d24-4264-ae0e-7e86b6737533\" (UID: \"60e57b91-0d24-4264-ae0e-7e86b6737533\") " Oct 08 18:29:33 crc kubenswrapper[4750]: I1008 18:29:33.790039 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60e57b91-0d24-4264-ae0e-7e86b6737533-kube-api-access-5wvc4" (OuterVolumeSpecName: "kube-api-access-5wvc4") pod "60e57b91-0d24-4264-ae0e-7e86b6737533" (UID: "60e57b91-0d24-4264-ae0e-7e86b6737533"). InnerVolumeSpecName "kube-api-access-5wvc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:29:33 crc kubenswrapper[4750]: I1008 18:29:33.793748 4750 scope.go:117] "RemoveContainer" containerID="0e06d68c35a9b53210ad6f3b4f182efb67d2daaa8f174499773c0d6d4ffaa3fe" Oct 08 18:29:33 crc kubenswrapper[4750]: I1008 18:29:33.821879 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60e57b91-0d24-4264-ae0e-7e86b6737533-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "60e57b91-0d24-4264-ae0e-7e86b6737533" (UID: "60e57b91-0d24-4264-ae0e-7e86b6737533"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:29:33 crc kubenswrapper[4750]: I1008 18:29:33.828561 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60e57b91-0d24-4264-ae0e-7e86b6737533-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "60e57b91-0d24-4264-ae0e-7e86b6737533" (UID: "60e57b91-0d24-4264-ae0e-7e86b6737533"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:29:33 crc kubenswrapper[4750]: I1008 18:29:33.877049 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/60e57b91-0d24-4264-ae0e-7e86b6737533-dns-swift-storage-0\") pod \"60e57b91-0d24-4264-ae0e-7e86b6737533\" (UID: \"60e57b91-0d24-4264-ae0e-7e86b6737533\") " Oct 08 18:29:33 crc kubenswrapper[4750]: I1008 18:29:33.877433 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60e57b91-0d24-4264-ae0e-7e86b6737533-config\") pod \"60e57b91-0d24-4264-ae0e-7e86b6737533\" (UID: \"60e57b91-0d24-4264-ae0e-7e86b6737533\") " Oct 08 18:29:33 crc kubenswrapper[4750]: I1008 18:29:33.877457 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60e57b91-0d24-4264-ae0e-7e86b6737533-ovsdbserver-sb\") pod \"60e57b91-0d24-4264-ae0e-7e86b6737533\" (UID: \"60e57b91-0d24-4264-ae0e-7e86b6737533\") " Oct 08 18:29:33 crc kubenswrapper[4750]: I1008 18:29:33.877501 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7hs2\" (UniqueName: \"kubernetes.io/projected/e094d330-b345-4b12-bf02-7e8d55307fce-kube-api-access-b7hs2\") pod \"e094d330-b345-4b12-bf02-7e8d55307fce\" (UID: \"e094d330-b345-4b12-bf02-7e8d55307fce\") " Oct 08 18:29:33 crc kubenswrapper[4750]: I1008 18:29:33.878296 4750 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60e57b91-0d24-4264-ae0e-7e86b6737533-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 18:29:33 crc kubenswrapper[4750]: I1008 18:29:33.878311 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wvc4\" (UniqueName: \"kubernetes.io/projected/60e57b91-0d24-4264-ae0e-7e86b6737533-kube-api-access-5wvc4\") on 
node \"crc\" DevicePath \"\"" Oct 08 18:29:33 crc kubenswrapper[4750]: I1008 18:29:33.878321 4750 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60e57b91-0d24-4264-ae0e-7e86b6737533-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 18:29:33 crc kubenswrapper[4750]: I1008 18:29:33.885755 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e094d330-b345-4b12-bf02-7e8d55307fce-kube-api-access-b7hs2" (OuterVolumeSpecName: "kube-api-access-b7hs2") pod "e094d330-b345-4b12-bf02-7e8d55307fce" (UID: "e094d330-b345-4b12-bf02-7e8d55307fce"). InnerVolumeSpecName "kube-api-access-b7hs2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:29:33 crc kubenswrapper[4750]: I1008 18:29:33.920643 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60e57b91-0d24-4264-ae0e-7e86b6737533-config" (OuterVolumeSpecName: "config") pod "60e57b91-0d24-4264-ae0e-7e86b6737533" (UID: "60e57b91-0d24-4264-ae0e-7e86b6737533"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:29:33 crc kubenswrapper[4750]: I1008 18:29:33.920763 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60e57b91-0d24-4264-ae0e-7e86b6737533-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "60e57b91-0d24-4264-ae0e-7e86b6737533" (UID: "60e57b91-0d24-4264-ae0e-7e86b6737533"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:29:33 crc kubenswrapper[4750]: I1008 18:29:33.952387 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60e57b91-0d24-4264-ae0e-7e86b6737533-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "60e57b91-0d24-4264-ae0e-7e86b6737533" (UID: "60e57b91-0d24-4264-ae0e-7e86b6737533"). 
InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:29:33 crc kubenswrapper[4750]: I1008 18:29:33.980946 4750 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60e57b91-0d24-4264-ae0e-7e86b6737533-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 18:29:33 crc kubenswrapper[4750]: I1008 18:29:33.981007 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7hs2\" (UniqueName: \"kubernetes.io/projected/e094d330-b345-4b12-bf02-7e8d55307fce-kube-api-access-b7hs2\") on node \"crc\" DevicePath \"\"" Oct 08 18:29:33 crc kubenswrapper[4750]: I1008 18:29:33.981022 4750 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/60e57b91-0d24-4264-ae0e-7e86b6737533-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 08 18:29:33 crc kubenswrapper[4750]: I1008 18:29:33.981035 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60e57b91-0d24-4264-ae0e-7e86b6737533-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.019467 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-c7w47" Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.081462 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9aae9be2-a135-41c7-84ba-998d959afd39-fernet-keys\") pod \"9aae9be2-a135-41c7-84ba-998d959afd39\" (UID: \"9aae9be2-a135-41c7-84ba-998d959afd39\") " Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.081509 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9aae9be2-a135-41c7-84ba-998d959afd39-combined-ca-bundle\") pod \"9aae9be2-a135-41c7-84ba-998d959afd39\" (UID: \"9aae9be2-a135-41c7-84ba-998d959afd39\") " Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.081533 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9aae9be2-a135-41c7-84ba-998d959afd39-credential-keys\") pod \"9aae9be2-a135-41c7-84ba-998d959afd39\" (UID: \"9aae9be2-a135-41c7-84ba-998d959afd39\") " Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.081577 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7v6k\" (UniqueName: \"kubernetes.io/projected/9aae9be2-a135-41c7-84ba-998d959afd39-kube-api-access-j7v6k\") pod \"9aae9be2-a135-41c7-84ba-998d959afd39\" (UID: \"9aae9be2-a135-41c7-84ba-998d959afd39\") " Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.081610 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9aae9be2-a135-41c7-84ba-998d959afd39-config-data\") pod \"9aae9be2-a135-41c7-84ba-998d959afd39\" (UID: \"9aae9be2-a135-41c7-84ba-998d959afd39\") " Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.081652 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/9aae9be2-a135-41c7-84ba-998d959afd39-scripts\") pod \"9aae9be2-a135-41c7-84ba-998d959afd39\" (UID: \"9aae9be2-a135-41c7-84ba-998d959afd39\") " Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.091120 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9aae9be2-a135-41c7-84ba-998d959afd39-scripts" (OuterVolumeSpecName: "scripts") pod "9aae9be2-a135-41c7-84ba-998d959afd39" (UID: "9aae9be2-a135-41c7-84ba-998d959afd39"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.093840 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9aae9be2-a135-41c7-84ba-998d959afd39-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "9aae9be2-a135-41c7-84ba-998d959afd39" (UID: "9aae9be2-a135-41c7-84ba-998d959afd39"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.096542 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-564965cbfc-9pvkl"] Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.102934 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-564965cbfc-9pvkl"] Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.107295 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9aae9be2-a135-41c7-84ba-998d959afd39-kube-api-access-j7v6k" (OuterVolumeSpecName: "kube-api-access-j7v6k") pod "9aae9be2-a135-41c7-84ba-998d959afd39" (UID: "9aae9be2-a135-41c7-84ba-998d959afd39"). InnerVolumeSpecName "kube-api-access-j7v6k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.107444 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9aae9be2-a135-41c7-84ba-998d959afd39-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "9aae9be2-a135-41c7-84ba-998d959afd39" (UID: "9aae9be2-a135-41c7-84ba-998d959afd39"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.110531 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9aae9be2-a135-41c7-84ba-998d959afd39-config-data" (OuterVolumeSpecName: "config-data") pod "9aae9be2-a135-41c7-84ba-998d959afd39" (UID: "9aae9be2-a135-41c7-84ba-998d959afd39"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.114806 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9aae9be2-a135-41c7-84ba-998d959afd39-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9aae9be2-a135-41c7-84ba-998d959afd39" (UID: "9aae9be2-a135-41c7-84ba-998d959afd39"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.183073 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9aae9be2-a135-41c7-84ba-998d959afd39-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.183107 4750 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9aae9be2-a135-41c7-84ba-998d959afd39-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.183116 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9aae9be2-a135-41c7-84ba-998d959afd39-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.183129 4750 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9aae9be2-a135-41c7-84ba-998d959afd39-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.183137 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7v6k\" (UniqueName: \"kubernetes.io/projected/9aae9be2-a135-41c7-84ba-998d959afd39-kube-api-access-j7v6k\") on node \"crc\" DevicePath \"\"" Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.183145 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9aae9be2-a135-41c7-84ba-998d959afd39-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.753118 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60e57b91-0d24-4264-ae0e-7e86b6737533" path="/var/lib/kubelet/pods/60e57b91-0d24-4264-ae0e-7e86b6737533/volumes" Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.770507 4750 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/keystone-bootstrap-c7w47" event={"ID":"9aae9be2-a135-41c7-84ba-998d959afd39","Type":"ContainerDied","Data":"d2d97e0c8dab98a820e7b55db42e22a8e4a4f29814d8e58d774eb1aa61a32a39"} Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.770574 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2d97e0c8dab98a820e7b55db42e22a8e4a4f29814d8e58d774eb1aa61a32a39" Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.770588 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-c7w47" Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.773623 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a","Type":"ContainerStarted","Data":"c1307d892a277faff4013cf95858ca39bd78853297e2048184e8f1c187e0201b"} Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.826721 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-c7w47"] Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.833905 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-c7w47"] Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.949866 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-w4fvs"] Oct 08 18:29:34 crc kubenswrapper[4750]: E1008 18:29:34.950179 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60e57b91-0d24-4264-ae0e-7e86b6737533" containerName="dnsmasq-dns" Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.950196 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="60e57b91-0d24-4264-ae0e-7e86b6737533" containerName="dnsmasq-dns" Oct 08 18:29:34 crc kubenswrapper[4750]: E1008 18:29:34.950206 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="341402ad-8d92-4e75-825c-1849b5f99f4d" containerName="mariadb-account-create" Oct 08 
18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.950212 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="341402ad-8d92-4e75-825c-1849b5f99f4d" containerName="mariadb-account-create" Oct 08 18:29:34 crc kubenswrapper[4750]: E1008 18:29:34.950233 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aae9be2-a135-41c7-84ba-998d959afd39" containerName="keystone-bootstrap" Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.950240 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aae9be2-a135-41c7-84ba-998d959afd39" containerName="keystone-bootstrap" Oct 08 18:29:34 crc kubenswrapper[4750]: E1008 18:29:34.950250 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60e57b91-0d24-4264-ae0e-7e86b6737533" containerName="init" Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.950256 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="60e57b91-0d24-4264-ae0e-7e86b6737533" containerName="init" Oct 08 18:29:34 crc kubenswrapper[4750]: E1008 18:29:34.950268 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce3df91c-8c7d-4adc-be58-34b388a95c93" containerName="mariadb-account-create" Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.950273 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce3df91c-8c7d-4adc-be58-34b388a95c93" containerName="mariadb-account-create" Oct 08 18:29:34 crc kubenswrapper[4750]: E1008 18:29:34.950288 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e094d330-b345-4b12-bf02-7e8d55307fce" containerName="mariadb-account-create" Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.950294 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="e094d330-b345-4b12-bf02-7e8d55307fce" containerName="mariadb-account-create" Oct 08 18:29:34 crc kubenswrapper[4750]: E1008 18:29:34.950303 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4507e93f-fbeb-439e-9306-ecad48b6363f" containerName="init" Oct 08 
18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.950308 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="4507e93f-fbeb-439e-9306-ecad48b6363f" containerName="init" Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.950454 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="60e57b91-0d24-4264-ae0e-7e86b6737533" containerName="dnsmasq-dns" Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.950467 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce3df91c-8c7d-4adc-be58-34b388a95c93" containerName="mariadb-account-create" Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.950480 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="341402ad-8d92-4e75-825c-1849b5f99f4d" containerName="mariadb-account-create" Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.950489 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="e094d330-b345-4b12-bf02-7e8d55307fce" containerName="mariadb-account-create" Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.950495 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="4507e93f-fbeb-439e-9306-ecad48b6363f" containerName="init" Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.950502 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="9aae9be2-a135-41c7-84ba-998d959afd39" containerName="keystone-bootstrap" Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.951033 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-w4fvs" Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.953900 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.954186 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.954314 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rx2f7" Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.959885 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.964403 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-w4fvs"] Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.995661 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ad0cb6f-78d3-48ea-943f-ef07e0b52886-combined-ca-bundle\") pod \"keystone-bootstrap-w4fvs\" (UID: \"3ad0cb6f-78d3-48ea-943f-ef07e0b52886\") " pod="openstack/keystone-bootstrap-w4fvs" Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.995718 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ad0cb6f-78d3-48ea-943f-ef07e0b52886-config-data\") pod \"keystone-bootstrap-w4fvs\" (UID: \"3ad0cb6f-78d3-48ea-943f-ef07e0b52886\") " pod="openstack/keystone-bootstrap-w4fvs" Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.995849 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3ad0cb6f-78d3-48ea-943f-ef07e0b52886-credential-keys\") pod \"keystone-bootstrap-w4fvs\" (UID: 
\"3ad0cb6f-78d3-48ea-943f-ef07e0b52886\") " pod="openstack/keystone-bootstrap-w4fvs" Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.995985 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzcfk\" (UniqueName: \"kubernetes.io/projected/3ad0cb6f-78d3-48ea-943f-ef07e0b52886-kube-api-access-hzcfk\") pod \"keystone-bootstrap-w4fvs\" (UID: \"3ad0cb6f-78d3-48ea-943f-ef07e0b52886\") " pod="openstack/keystone-bootstrap-w4fvs" Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.996072 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ad0cb6f-78d3-48ea-943f-ef07e0b52886-scripts\") pod \"keystone-bootstrap-w4fvs\" (UID: \"3ad0cb6f-78d3-48ea-943f-ef07e0b52886\") " pod="openstack/keystone-bootstrap-w4fvs" Oct 08 18:29:34 crc kubenswrapper[4750]: I1008 18:29:34.996114 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3ad0cb6f-78d3-48ea-943f-ef07e0b52886-fernet-keys\") pod \"keystone-bootstrap-w4fvs\" (UID: \"3ad0cb6f-78d3-48ea-943f-ef07e0b52886\") " pod="openstack/keystone-bootstrap-w4fvs" Oct 08 18:29:35 crc kubenswrapper[4750]: I1008 18:29:35.097029 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ad0cb6f-78d3-48ea-943f-ef07e0b52886-scripts\") pod \"keystone-bootstrap-w4fvs\" (UID: \"3ad0cb6f-78d3-48ea-943f-ef07e0b52886\") " pod="openstack/keystone-bootstrap-w4fvs" Oct 08 18:29:35 crc kubenswrapper[4750]: I1008 18:29:35.097088 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3ad0cb6f-78d3-48ea-943f-ef07e0b52886-fernet-keys\") pod \"keystone-bootstrap-w4fvs\" (UID: \"3ad0cb6f-78d3-48ea-943f-ef07e0b52886\") " 
pod="openstack/keystone-bootstrap-w4fvs" Oct 08 18:29:35 crc kubenswrapper[4750]: I1008 18:29:35.097135 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ad0cb6f-78d3-48ea-943f-ef07e0b52886-combined-ca-bundle\") pod \"keystone-bootstrap-w4fvs\" (UID: \"3ad0cb6f-78d3-48ea-943f-ef07e0b52886\") " pod="openstack/keystone-bootstrap-w4fvs" Oct 08 18:29:35 crc kubenswrapper[4750]: I1008 18:29:35.097213 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ad0cb6f-78d3-48ea-943f-ef07e0b52886-config-data\") pod \"keystone-bootstrap-w4fvs\" (UID: \"3ad0cb6f-78d3-48ea-943f-ef07e0b52886\") " pod="openstack/keystone-bootstrap-w4fvs" Oct 08 18:29:35 crc kubenswrapper[4750]: I1008 18:29:35.097259 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3ad0cb6f-78d3-48ea-943f-ef07e0b52886-credential-keys\") pod \"keystone-bootstrap-w4fvs\" (UID: \"3ad0cb6f-78d3-48ea-943f-ef07e0b52886\") " pod="openstack/keystone-bootstrap-w4fvs" Oct 08 18:29:35 crc kubenswrapper[4750]: I1008 18:29:35.097326 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzcfk\" (UniqueName: \"kubernetes.io/projected/3ad0cb6f-78d3-48ea-943f-ef07e0b52886-kube-api-access-hzcfk\") pod \"keystone-bootstrap-w4fvs\" (UID: \"3ad0cb6f-78d3-48ea-943f-ef07e0b52886\") " pod="openstack/keystone-bootstrap-w4fvs" Oct 08 18:29:35 crc kubenswrapper[4750]: I1008 18:29:35.101249 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ad0cb6f-78d3-48ea-943f-ef07e0b52886-combined-ca-bundle\") pod \"keystone-bootstrap-w4fvs\" (UID: \"3ad0cb6f-78d3-48ea-943f-ef07e0b52886\") " pod="openstack/keystone-bootstrap-w4fvs" Oct 08 18:29:35 crc kubenswrapper[4750]: I1008 
18:29:35.101875 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ad0cb6f-78d3-48ea-943f-ef07e0b52886-config-data\") pod \"keystone-bootstrap-w4fvs\" (UID: \"3ad0cb6f-78d3-48ea-943f-ef07e0b52886\") " pod="openstack/keystone-bootstrap-w4fvs" Oct 08 18:29:35 crc kubenswrapper[4750]: I1008 18:29:35.102068 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ad0cb6f-78d3-48ea-943f-ef07e0b52886-scripts\") pod \"keystone-bootstrap-w4fvs\" (UID: \"3ad0cb6f-78d3-48ea-943f-ef07e0b52886\") " pod="openstack/keystone-bootstrap-w4fvs" Oct 08 18:29:35 crc kubenswrapper[4750]: I1008 18:29:35.102861 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3ad0cb6f-78d3-48ea-943f-ef07e0b52886-credential-keys\") pod \"keystone-bootstrap-w4fvs\" (UID: \"3ad0cb6f-78d3-48ea-943f-ef07e0b52886\") " pod="openstack/keystone-bootstrap-w4fvs" Oct 08 18:29:35 crc kubenswrapper[4750]: I1008 18:29:35.107187 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3ad0cb6f-78d3-48ea-943f-ef07e0b52886-fernet-keys\") pod \"keystone-bootstrap-w4fvs\" (UID: \"3ad0cb6f-78d3-48ea-943f-ef07e0b52886\") " pod="openstack/keystone-bootstrap-w4fvs" Oct 08 18:29:35 crc kubenswrapper[4750]: I1008 18:29:35.114036 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzcfk\" (UniqueName: \"kubernetes.io/projected/3ad0cb6f-78d3-48ea-943f-ef07e0b52886-kube-api-access-hzcfk\") pod \"keystone-bootstrap-w4fvs\" (UID: \"3ad0cb6f-78d3-48ea-943f-ef07e0b52886\") " pod="openstack/keystone-bootstrap-w4fvs" Oct 08 18:29:35 crc kubenswrapper[4750]: I1008 18:29:35.271627 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-w4fvs" Oct 08 18:29:35 crc kubenswrapper[4750]: I1008 18:29:35.657603 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-scvbb"] Oct 08 18:29:35 crc kubenswrapper[4750]: I1008 18:29:35.658981 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-scvbb" Oct 08 18:29:35 crc kubenswrapper[4750]: I1008 18:29:35.661702 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-s54km" Oct 08 18:29:35 crc kubenswrapper[4750]: I1008 18:29:35.661807 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 08 18:29:35 crc kubenswrapper[4750]: I1008 18:29:35.662119 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 08 18:29:35 crc kubenswrapper[4750]: I1008 18:29:35.675472 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-scvbb"] Oct 08 18:29:35 crc kubenswrapper[4750]: I1008 18:29:35.732794 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-w4fvs"] Oct 08 18:29:35 crc kubenswrapper[4750]: I1008 18:29:35.786875 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-w4fvs" event={"ID":"3ad0cb6f-78d3-48ea-943f-ef07e0b52886","Type":"ContainerStarted","Data":"83aca028d9c525e961f00219cf6d2d332229d1b90532db8eced1f51adcbb5867"} Oct 08 18:29:35 crc kubenswrapper[4750]: I1008 18:29:35.788956 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a","Type":"ContainerStarted","Data":"32e882a4c502a439a479c8ea6d79da722ef53840fad55da94fdcab4bd196ec16"} Oct 08 18:29:35 crc kubenswrapper[4750]: I1008 18:29:35.807834 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c9e2d08e-75b7-445b-b563-affabf6d8af6-db-sync-config-data\") pod \"cinder-db-sync-scvbb\" (UID: \"c9e2d08e-75b7-445b-b563-affabf6d8af6\") " pod="openstack/cinder-db-sync-scvbb"
Oct 08 18:29:35 crc kubenswrapper[4750]: I1008 18:29:35.807926 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn9hq\" (UniqueName: \"kubernetes.io/projected/c9e2d08e-75b7-445b-b563-affabf6d8af6-kube-api-access-dn9hq\") pod \"cinder-db-sync-scvbb\" (UID: \"c9e2d08e-75b7-445b-b563-affabf6d8af6\") " pod="openstack/cinder-db-sync-scvbb"
Oct 08 18:29:35 crc kubenswrapper[4750]: I1008 18:29:35.807946 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e2d08e-75b7-445b-b563-affabf6d8af6-combined-ca-bundle\") pod \"cinder-db-sync-scvbb\" (UID: \"c9e2d08e-75b7-445b-b563-affabf6d8af6\") " pod="openstack/cinder-db-sync-scvbb"
Oct 08 18:29:35 crc kubenswrapper[4750]: I1008 18:29:35.807990 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c9e2d08e-75b7-445b-b563-affabf6d8af6-etc-machine-id\") pod \"cinder-db-sync-scvbb\" (UID: \"c9e2d08e-75b7-445b-b563-affabf6d8af6\") " pod="openstack/cinder-db-sync-scvbb"
Oct 08 18:29:35 crc kubenswrapper[4750]: I1008 18:29:35.808026 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9e2d08e-75b7-445b-b563-affabf6d8af6-scripts\") pod \"cinder-db-sync-scvbb\" (UID: \"c9e2d08e-75b7-445b-b563-affabf6d8af6\") " pod="openstack/cinder-db-sync-scvbb"
Oct 08 18:29:35 crc kubenswrapper[4750]: I1008 18:29:35.808059 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e2d08e-75b7-445b-b563-affabf6d8af6-config-data\") pod \"cinder-db-sync-scvbb\" (UID: \"c9e2d08e-75b7-445b-b563-affabf6d8af6\") " pod="openstack/cinder-db-sync-scvbb"
Oct 08 18:29:35 crc kubenswrapper[4750]: I1008 18:29:35.909212 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c9e2d08e-75b7-445b-b563-affabf6d8af6-db-sync-config-data\") pod \"cinder-db-sync-scvbb\" (UID: \"c9e2d08e-75b7-445b-b563-affabf6d8af6\") " pod="openstack/cinder-db-sync-scvbb"
Oct 08 18:29:35 crc kubenswrapper[4750]: I1008 18:29:35.909337 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dn9hq\" (UniqueName: \"kubernetes.io/projected/c9e2d08e-75b7-445b-b563-affabf6d8af6-kube-api-access-dn9hq\") pod \"cinder-db-sync-scvbb\" (UID: \"c9e2d08e-75b7-445b-b563-affabf6d8af6\") " pod="openstack/cinder-db-sync-scvbb"
Oct 08 18:29:35 crc kubenswrapper[4750]: I1008 18:29:35.909364 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e2d08e-75b7-445b-b563-affabf6d8af6-combined-ca-bundle\") pod \"cinder-db-sync-scvbb\" (UID: \"c9e2d08e-75b7-445b-b563-affabf6d8af6\") " pod="openstack/cinder-db-sync-scvbb"
Oct 08 18:29:35 crc kubenswrapper[4750]: I1008 18:29:35.909414 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c9e2d08e-75b7-445b-b563-affabf6d8af6-etc-machine-id\") pod \"cinder-db-sync-scvbb\" (UID: \"c9e2d08e-75b7-445b-b563-affabf6d8af6\") " pod="openstack/cinder-db-sync-scvbb"
Oct 08 18:29:35 crc kubenswrapper[4750]: I1008 18:29:35.909441 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9e2d08e-75b7-445b-b563-affabf6d8af6-scripts\") pod \"cinder-db-sync-scvbb\" (UID: \"c9e2d08e-75b7-445b-b563-affabf6d8af6\") " pod="openstack/cinder-db-sync-scvbb"
Oct 08 18:29:35 crc kubenswrapper[4750]: I1008 18:29:35.909487 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e2d08e-75b7-445b-b563-affabf6d8af6-config-data\") pod \"cinder-db-sync-scvbb\" (UID: \"c9e2d08e-75b7-445b-b563-affabf6d8af6\") " pod="openstack/cinder-db-sync-scvbb"
Oct 08 18:29:35 crc kubenswrapper[4750]: I1008 18:29:35.909751 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-ndlps"]
Oct 08 18:29:35 crc kubenswrapper[4750]: I1008 18:29:35.910009 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c9e2d08e-75b7-445b-b563-affabf6d8af6-etc-machine-id\") pod \"cinder-db-sync-scvbb\" (UID: \"c9e2d08e-75b7-445b-b563-affabf6d8af6\") " pod="openstack/cinder-db-sync-scvbb"
Oct 08 18:29:35 crc kubenswrapper[4750]: I1008 18:29:35.911419 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-ndlps"
Oct 08 18:29:35 crc kubenswrapper[4750]: I1008 18:29:35.913532 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-n4lv8"
Oct 08 18:29:35 crc kubenswrapper[4750]: I1008 18:29:35.913711 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9e2d08e-75b7-445b-b563-affabf6d8af6-scripts\") pod \"cinder-db-sync-scvbb\" (UID: \"c9e2d08e-75b7-445b-b563-affabf6d8af6\") " pod="openstack/cinder-db-sync-scvbb"
Oct 08 18:29:35 crc kubenswrapper[4750]: I1008 18:29:35.913813 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Oct 08 18:29:35 crc kubenswrapper[4750]: I1008 18:29:35.914220 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e2d08e-75b7-445b-b563-affabf6d8af6-combined-ca-bundle\") pod \"cinder-db-sync-scvbb\" (UID: \"c9e2d08e-75b7-445b-b563-affabf6d8af6\") " pod="openstack/cinder-db-sync-scvbb"
Oct 08 18:29:35 crc kubenswrapper[4750]: I1008 18:29:35.915577 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e2d08e-75b7-445b-b563-affabf6d8af6-config-data\") pod \"cinder-db-sync-scvbb\" (UID: \"c9e2d08e-75b7-445b-b563-affabf6d8af6\") " pod="openstack/cinder-db-sync-scvbb"
Oct 08 18:29:35 crc kubenswrapper[4750]: I1008 18:29:35.918071 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c9e2d08e-75b7-445b-b563-affabf6d8af6-db-sync-config-data\") pod \"cinder-db-sync-scvbb\" (UID: \"c9e2d08e-75b7-445b-b563-affabf6d8af6\") " pod="openstack/cinder-db-sync-scvbb"
Oct 08 18:29:35 crc kubenswrapper[4750]: I1008 18:29:35.925723 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-ndlps"]
Oct 08 18:29:35 crc kubenswrapper[4750]: I1008 18:29:35.944443 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn9hq\" (UniqueName: \"kubernetes.io/projected/c9e2d08e-75b7-445b-b563-affabf6d8af6-kube-api-access-dn9hq\") pod \"cinder-db-sync-scvbb\" (UID: \"c9e2d08e-75b7-445b-b563-affabf6d8af6\") " pod="openstack/cinder-db-sync-scvbb"
Oct 08 18:29:35 crc kubenswrapper[4750]: I1008 18:29:35.983006 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-scvbb"
Oct 08 18:29:36 crc kubenswrapper[4750]: I1008 18:29:36.011029 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aae9b192-27dc-4bad-a9ae-6e03824c59f0-db-sync-config-data\") pod \"barbican-db-sync-ndlps\" (UID: \"aae9b192-27dc-4bad-a9ae-6e03824c59f0\") " pod="openstack/barbican-db-sync-ndlps"
Oct 08 18:29:36 crc kubenswrapper[4750]: I1008 18:29:36.011085 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xfcg\" (UniqueName: \"kubernetes.io/projected/aae9b192-27dc-4bad-a9ae-6e03824c59f0-kube-api-access-5xfcg\") pod \"barbican-db-sync-ndlps\" (UID: \"aae9b192-27dc-4bad-a9ae-6e03824c59f0\") " pod="openstack/barbican-db-sync-ndlps"
Oct 08 18:29:36 crc kubenswrapper[4750]: I1008 18:29:36.011192 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aae9b192-27dc-4bad-a9ae-6e03824c59f0-combined-ca-bundle\") pod \"barbican-db-sync-ndlps\" (UID: \"aae9b192-27dc-4bad-a9ae-6e03824c59f0\") " pod="openstack/barbican-db-sync-ndlps"
Oct 08 18:29:36 crc kubenswrapper[4750]: I1008 18:29:36.051039 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-dd8dn"]
Oct 08 18:29:36 crc kubenswrapper[4750]: I1008 18:29:36.052319 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-dd8dn"
Oct 08 18:29:36 crc kubenswrapper[4750]: I1008 18:29:36.054304 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Oct 08 18:29:36 crc kubenswrapper[4750]: I1008 18:29:36.054506 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Oct 08 18:29:36 crc kubenswrapper[4750]: I1008 18:29:36.055846 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-j2bkb"
Oct 08 18:29:36 crc kubenswrapper[4750]: I1008 18:29:36.062898 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-dd8dn"]
Oct 08 18:29:36 crc kubenswrapper[4750]: I1008 18:29:36.113415 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xfcg\" (UniqueName: \"kubernetes.io/projected/aae9b192-27dc-4bad-a9ae-6e03824c59f0-kube-api-access-5xfcg\") pod \"barbican-db-sync-ndlps\" (UID: \"aae9b192-27dc-4bad-a9ae-6e03824c59f0\") " pod="openstack/barbican-db-sync-ndlps"
Oct 08 18:29:36 crc kubenswrapper[4750]: I1008 18:29:36.113545 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aae9b192-27dc-4bad-a9ae-6e03824c59f0-combined-ca-bundle\") pod \"barbican-db-sync-ndlps\" (UID: \"aae9b192-27dc-4bad-a9ae-6e03824c59f0\") " pod="openstack/barbican-db-sync-ndlps"
Oct 08 18:29:36 crc kubenswrapper[4750]: I1008 18:29:36.113652 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aae9b192-27dc-4bad-a9ae-6e03824c59f0-db-sync-config-data\") pod \"barbican-db-sync-ndlps\" (UID: \"aae9b192-27dc-4bad-a9ae-6e03824c59f0\") " pod="openstack/barbican-db-sync-ndlps"
Oct 08 18:29:36 crc kubenswrapper[4750]: I1008 18:29:36.117803 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aae9b192-27dc-4bad-a9ae-6e03824c59f0-db-sync-config-data\") pod \"barbican-db-sync-ndlps\" (UID: \"aae9b192-27dc-4bad-a9ae-6e03824c59f0\") " pod="openstack/barbican-db-sync-ndlps"
Oct 08 18:29:36 crc kubenswrapper[4750]: I1008 18:29:36.118514 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aae9b192-27dc-4bad-a9ae-6e03824c59f0-combined-ca-bundle\") pod \"barbican-db-sync-ndlps\" (UID: \"aae9b192-27dc-4bad-a9ae-6e03824c59f0\") " pod="openstack/barbican-db-sync-ndlps"
Oct 08 18:29:36 crc kubenswrapper[4750]: I1008 18:29:36.130315 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xfcg\" (UniqueName: \"kubernetes.io/projected/aae9b192-27dc-4bad-a9ae-6e03824c59f0-kube-api-access-5xfcg\") pod \"barbican-db-sync-ndlps\" (UID: \"aae9b192-27dc-4bad-a9ae-6e03824c59f0\") " pod="openstack/barbican-db-sync-ndlps"
Oct 08 18:29:36 crc kubenswrapper[4750]: I1008 18:29:36.214721 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x72l\" (UniqueName: \"kubernetes.io/projected/41b4bf15-850b-4082-8053-aee61b75dc58-kube-api-access-6x72l\") pod \"neutron-db-sync-dd8dn\" (UID: \"41b4bf15-850b-4082-8053-aee61b75dc58\") " pod="openstack/neutron-db-sync-dd8dn"
Oct 08 18:29:36 crc kubenswrapper[4750]: I1008 18:29:36.214769 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41b4bf15-850b-4082-8053-aee61b75dc58-combined-ca-bundle\") pod \"neutron-db-sync-dd8dn\" (UID: \"41b4bf15-850b-4082-8053-aee61b75dc58\") " pod="openstack/neutron-db-sync-dd8dn"
Oct 08 18:29:36 crc kubenswrapper[4750]: I1008 18:29:36.214994 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/41b4bf15-850b-4082-8053-aee61b75dc58-config\") pod \"neutron-db-sync-dd8dn\" (UID: \"41b4bf15-850b-4082-8053-aee61b75dc58\") " pod="openstack/neutron-db-sync-dd8dn"
Oct 08 18:29:36 crc kubenswrapper[4750]: I1008 18:29:36.267907 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-ndlps"
Oct 08 18:29:36 crc kubenswrapper[4750]: I1008 18:29:36.316705 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x72l\" (UniqueName: \"kubernetes.io/projected/41b4bf15-850b-4082-8053-aee61b75dc58-kube-api-access-6x72l\") pod \"neutron-db-sync-dd8dn\" (UID: \"41b4bf15-850b-4082-8053-aee61b75dc58\") " pod="openstack/neutron-db-sync-dd8dn"
Oct 08 18:29:36 crc kubenswrapper[4750]: I1008 18:29:36.316760 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41b4bf15-850b-4082-8053-aee61b75dc58-combined-ca-bundle\") pod \"neutron-db-sync-dd8dn\" (UID: \"41b4bf15-850b-4082-8053-aee61b75dc58\") " pod="openstack/neutron-db-sync-dd8dn"
Oct 08 18:29:36 crc kubenswrapper[4750]: I1008 18:29:36.316818 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/41b4bf15-850b-4082-8053-aee61b75dc58-config\") pod \"neutron-db-sync-dd8dn\" (UID: \"41b4bf15-850b-4082-8053-aee61b75dc58\") " pod="openstack/neutron-db-sync-dd8dn"
Oct 08 18:29:36 crc kubenswrapper[4750]: I1008 18:29:36.324492 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41b4bf15-850b-4082-8053-aee61b75dc58-combined-ca-bundle\") pod \"neutron-db-sync-dd8dn\" (UID: \"41b4bf15-850b-4082-8053-aee61b75dc58\") " pod="openstack/neutron-db-sync-dd8dn"
Oct 08 18:29:36 crc kubenswrapper[4750]: I1008 18:29:36.327862 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/41b4bf15-850b-4082-8053-aee61b75dc58-config\") pod \"neutron-db-sync-dd8dn\" (UID: \"41b4bf15-850b-4082-8053-aee61b75dc58\") " pod="openstack/neutron-db-sync-dd8dn"
Oct 08 18:29:36 crc kubenswrapper[4750]: I1008 18:29:36.332867 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x72l\" (UniqueName: \"kubernetes.io/projected/41b4bf15-850b-4082-8053-aee61b75dc58-kube-api-access-6x72l\") pod \"neutron-db-sync-dd8dn\" (UID: \"41b4bf15-850b-4082-8053-aee61b75dc58\") " pod="openstack/neutron-db-sync-dd8dn"
Oct 08 18:29:36 crc kubenswrapper[4750]: I1008 18:29:36.374072 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-dd8dn"
Oct 08 18:29:36 crc kubenswrapper[4750]: I1008 18:29:36.471833 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-scvbb"]
Oct 08 18:29:36 crc kubenswrapper[4750]: W1008 18:29:36.483310 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9e2d08e_75b7_445b_b563_affabf6d8af6.slice/crio-452a29d9d307706cab1d8c8b5a9e21cfabaa134cb4287bdd0aa50bc007ca14fd WatchSource:0}: Error finding container 452a29d9d307706cab1d8c8b5a9e21cfabaa134cb4287bdd0aa50bc007ca14fd: Status 404 returned error can't find the container with id 452a29d9d307706cab1d8c8b5a9e21cfabaa134cb4287bdd0aa50bc007ca14fd
Oct 08 18:29:36 crc kubenswrapper[4750]: I1008 18:29:36.708541 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-ndlps"]
Oct 08 18:29:36 crc kubenswrapper[4750]: I1008 18:29:36.781748 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9aae9be2-a135-41c7-84ba-998d959afd39" path="/var/lib/kubelet/pods/9aae9be2-a135-41c7-84ba-998d959afd39/volumes"
Oct 08 18:29:36 crc kubenswrapper[4750]: I1008 18:29:36.823168 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-scvbb" event={"ID":"c9e2d08e-75b7-445b-b563-affabf6d8af6","Type":"ContainerStarted","Data":"452a29d9d307706cab1d8c8b5a9e21cfabaa134cb4287bdd0aa50bc007ca14fd"}
Oct 08 18:29:36 crc kubenswrapper[4750]: I1008 18:29:36.854690 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ndlps" event={"ID":"aae9b192-27dc-4bad-a9ae-6e03824c59f0","Type":"ContainerStarted","Data":"5798dfb85a096f205a57a70bb1a79ca1dad98561cc11adcb3425f5efb115bbed"}
Oct 08 18:29:36 crc kubenswrapper[4750]: I1008 18:29:36.867604 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-w4fvs" event={"ID":"3ad0cb6f-78d3-48ea-943f-ef07e0b52886","Type":"ContainerStarted","Data":"30b8df0973d91b6cf5ac25e5502db9dbe44d6285631679c52bbd4ed5d43e219d"}
Oct 08 18:29:36 crc kubenswrapper[4750]: I1008 18:29:36.908219 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-w4fvs" podStartSLOduration=2.908196584 podStartE2EDuration="2.908196584s" podCreationTimestamp="2025-10-08 18:29:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:29:36.902251178 +0000 UTC m=+1132.815222191" watchObservedRunningTime="2025-10-08 18:29:36.908196584 +0000 UTC m=+1132.821167617"
Oct 08 18:29:36 crc kubenswrapper[4750]: I1008 18:29:36.911716 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-dd8dn"]
Oct 08 18:29:37 crc kubenswrapper[4750]: I1008 18:29:37.883582 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-dd8dn" event={"ID":"41b4bf15-850b-4082-8053-aee61b75dc58","Type":"ContainerStarted","Data":"e0eb306813cd83380792baf93d28642815dbbaa7ce0d625844aac67c665b8844"}
Oct 08 18:29:37 crc kubenswrapper[4750]: I1008 18:29:37.883918 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-dd8dn" event={"ID":"41b4bf15-850b-4082-8053-aee61b75dc58","Type":"ContainerStarted","Data":"beacea0cf8e43c114ef7cdb955f49ed9c5dd283d74693b64b725bf9f77656ab2"}
Oct 08 18:29:37 crc kubenswrapper[4750]: I1008 18:29:37.885851 4750 generic.go:334] "Generic (PLEG): container finished" podID="7d635eca-619c-4f52-a9d9-73b42d845fbf" containerID="c8bcb751be31666c14a48d5c24330a20b0c43fc84ee99554962325b0311ad93b" exitCode=0
Oct 08 18:29:37 crc kubenswrapper[4750]: I1008 18:29:37.885942 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-6khnd" event={"ID":"7d635eca-619c-4f52-a9d9-73b42d845fbf","Type":"ContainerDied","Data":"c8bcb751be31666c14a48d5c24330a20b0c43fc84ee99554962325b0311ad93b"}
Oct 08 18:29:37 crc kubenswrapper[4750]: I1008 18:29:37.904361 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-dd8dn" podStartSLOduration=1.904345011 podStartE2EDuration="1.904345011s" podCreationTimestamp="2025-10-08 18:29:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:29:37.901137662 +0000 UTC m=+1133.814108675" watchObservedRunningTime="2025-10-08 18:29:37.904345011 +0000 UTC m=+1133.817316024"
Oct 08 18:29:38 crc kubenswrapper[4750]: I1008 18:29:38.065388 4750 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-564965cbfc-9pvkl" podUID="60e57b91-0d24-4264-ae0e-7e86b6737533" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.133:5353: i/o timeout"
Oct 08 18:29:39 crc kubenswrapper[4750]: E1008 18:29:39.928845 4750 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65af30d1_7ae5_485a_85d4_271a4642c2cf.slice\": RecentStats: unable to find data in memory cache]"
Oct 08 18:29:40 crc kubenswrapper[4750]: I1008 18:29:40.851455 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-6khnd"
Oct 08 18:29:40 crc kubenswrapper[4750]: I1008 18:29:40.918744 4750 generic.go:334] "Generic (PLEG): container finished" podID="3ad0cb6f-78d3-48ea-943f-ef07e0b52886" containerID="30b8df0973d91b6cf5ac25e5502db9dbe44d6285631679c52bbd4ed5d43e219d" exitCode=0
Oct 08 18:29:40 crc kubenswrapper[4750]: I1008 18:29:40.918812 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-w4fvs" event={"ID":"3ad0cb6f-78d3-48ea-943f-ef07e0b52886","Type":"ContainerDied","Data":"30b8df0973d91b6cf5ac25e5502db9dbe44d6285631679c52bbd4ed5d43e219d"}
Oct 08 18:29:40 crc kubenswrapper[4750]: I1008 18:29:40.920205 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-6khnd" event={"ID":"7d635eca-619c-4f52-a9d9-73b42d845fbf","Type":"ContainerDied","Data":"0946de31366b62a20d1b790224db8593e76861fae75500292ee8bed2ad56c806"}
Oct 08 18:29:40 crc kubenswrapper[4750]: I1008 18:29:40.920227 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0946de31366b62a20d1b790224db8593e76861fae75500292ee8bed2ad56c806"
Oct 08 18:29:40 crc kubenswrapper[4750]: I1008 18:29:40.920264 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-6khnd"
Oct 08 18:29:41 crc kubenswrapper[4750]: I1008 18:29:41.019602 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghc88\" (UniqueName: \"kubernetes.io/projected/7d635eca-619c-4f52-a9d9-73b42d845fbf-kube-api-access-ghc88\") pod \"7d635eca-619c-4f52-a9d9-73b42d845fbf\" (UID: \"7d635eca-619c-4f52-a9d9-73b42d845fbf\") "
Oct 08 18:29:41 crc kubenswrapper[4750]: I1008 18:29:41.019707 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d635eca-619c-4f52-a9d9-73b42d845fbf-combined-ca-bundle\") pod \"7d635eca-619c-4f52-a9d9-73b42d845fbf\" (UID: \"7d635eca-619c-4f52-a9d9-73b42d845fbf\") "
Oct 08 18:29:41 crc kubenswrapper[4750]: I1008 18:29:41.019776 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d635eca-619c-4f52-a9d9-73b42d845fbf-config-data\") pod \"7d635eca-619c-4f52-a9d9-73b42d845fbf\" (UID: \"7d635eca-619c-4f52-a9d9-73b42d845fbf\") "
Oct 08 18:29:41 crc kubenswrapper[4750]: I1008 18:29:41.019813 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7d635eca-619c-4f52-a9d9-73b42d845fbf-db-sync-config-data\") pod \"7d635eca-619c-4f52-a9d9-73b42d845fbf\" (UID: \"7d635eca-619c-4f52-a9d9-73b42d845fbf\") "
Oct 08 18:29:41 crc kubenswrapper[4750]: I1008 18:29:41.026774 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d635eca-619c-4f52-a9d9-73b42d845fbf-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "7d635eca-619c-4f52-a9d9-73b42d845fbf" (UID: "7d635eca-619c-4f52-a9d9-73b42d845fbf"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 18:29:41 crc kubenswrapper[4750]: I1008 18:29:41.034099 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d635eca-619c-4f52-a9d9-73b42d845fbf-kube-api-access-ghc88" (OuterVolumeSpecName: "kube-api-access-ghc88") pod "7d635eca-619c-4f52-a9d9-73b42d845fbf" (UID: "7d635eca-619c-4f52-a9d9-73b42d845fbf"). InnerVolumeSpecName "kube-api-access-ghc88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 18:29:41 crc kubenswrapper[4750]: I1008 18:29:41.062062 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d635eca-619c-4f52-a9d9-73b42d845fbf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d635eca-619c-4f52-a9d9-73b42d845fbf" (UID: "7d635eca-619c-4f52-a9d9-73b42d845fbf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 18:29:41 crc kubenswrapper[4750]: I1008 18:29:41.076250 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d635eca-619c-4f52-a9d9-73b42d845fbf-config-data" (OuterVolumeSpecName: "config-data") pod "7d635eca-619c-4f52-a9d9-73b42d845fbf" (UID: "7d635eca-619c-4f52-a9d9-73b42d845fbf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 18:29:41 crc kubenswrapper[4750]: I1008 18:29:41.121702 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d635eca-619c-4f52-a9d9-73b42d845fbf-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 08 18:29:41 crc kubenswrapper[4750]: I1008 18:29:41.121740 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d635eca-619c-4f52-a9d9-73b42d845fbf-config-data\") on node \"crc\" DevicePath \"\""
Oct 08 18:29:41 crc kubenswrapper[4750]: I1008 18:29:41.121755 4750 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7d635eca-619c-4f52-a9d9-73b42d845fbf-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Oct 08 18:29:41 crc kubenswrapper[4750]: I1008 18:29:41.121764 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghc88\" (UniqueName: \"kubernetes.io/projected/7d635eca-619c-4f52-a9d9-73b42d845fbf-kube-api-access-ghc88\") on node \"crc\" DevicePath \"\""
Oct 08 18:29:42 crc kubenswrapper[4750]: I1008 18:29:42.248026 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5dc68bd5-hzlg6"]
Oct 08 18:29:42 crc kubenswrapper[4750]: E1008 18:29:42.249095 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d635eca-619c-4f52-a9d9-73b42d845fbf" containerName="glance-db-sync"
Oct 08 18:29:42 crc kubenswrapper[4750]: I1008 18:29:42.249113 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d635eca-619c-4f52-a9d9-73b42d845fbf" containerName="glance-db-sync"
Oct 08 18:29:42 crc kubenswrapper[4750]: I1008 18:29:42.249324 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d635eca-619c-4f52-a9d9-73b42d845fbf" containerName="glance-db-sync"
Oct 08 18:29:42 crc kubenswrapper[4750]: I1008 18:29:42.256516 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dc68bd5-hzlg6"
Oct 08 18:29:42 crc kubenswrapper[4750]: I1008 18:29:42.263692 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dc68bd5-hzlg6"]
Oct 08 18:29:42 crc kubenswrapper[4750]: I1008 18:29:42.447087 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/47630cbc-a12d-4112-8f89-4f203bdc9649-ovsdbserver-sb\") pod \"dnsmasq-dns-5dc68bd5-hzlg6\" (UID: \"47630cbc-a12d-4112-8f89-4f203bdc9649\") " pod="openstack/dnsmasq-dns-5dc68bd5-hzlg6"
Oct 08 18:29:42 crc kubenswrapper[4750]: I1008 18:29:42.447600 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47630cbc-a12d-4112-8f89-4f203bdc9649-config\") pod \"dnsmasq-dns-5dc68bd5-hzlg6\" (UID: \"47630cbc-a12d-4112-8f89-4f203bdc9649\") " pod="openstack/dnsmasq-dns-5dc68bd5-hzlg6"
Oct 08 18:29:42 crc kubenswrapper[4750]: I1008 18:29:42.447652 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/47630cbc-a12d-4112-8f89-4f203bdc9649-dns-svc\") pod \"dnsmasq-dns-5dc68bd5-hzlg6\" (UID: \"47630cbc-a12d-4112-8f89-4f203bdc9649\") " pod="openstack/dnsmasq-dns-5dc68bd5-hzlg6"
Oct 08 18:29:42 crc kubenswrapper[4750]: I1008 18:29:42.447674 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/47630cbc-a12d-4112-8f89-4f203bdc9649-ovsdbserver-nb\") pod \"dnsmasq-dns-5dc68bd5-hzlg6\" (UID: \"47630cbc-a12d-4112-8f89-4f203bdc9649\") " pod="openstack/dnsmasq-dns-5dc68bd5-hzlg6"
Oct 08 18:29:42 crc kubenswrapper[4750]: I1008 18:29:42.447761 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97sxm\" (UniqueName: \"kubernetes.io/projected/47630cbc-a12d-4112-8f89-4f203bdc9649-kube-api-access-97sxm\") pod \"dnsmasq-dns-5dc68bd5-hzlg6\" (UID: \"47630cbc-a12d-4112-8f89-4f203bdc9649\") " pod="openstack/dnsmasq-dns-5dc68bd5-hzlg6"
Oct 08 18:29:42 crc kubenswrapper[4750]: I1008 18:29:42.447832 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/47630cbc-a12d-4112-8f89-4f203bdc9649-dns-swift-storage-0\") pod \"dnsmasq-dns-5dc68bd5-hzlg6\" (UID: \"47630cbc-a12d-4112-8f89-4f203bdc9649\") " pod="openstack/dnsmasq-dns-5dc68bd5-hzlg6"
Oct 08 18:29:42 crc kubenswrapper[4750]: I1008 18:29:42.549097 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/47630cbc-a12d-4112-8f89-4f203bdc9649-ovsdbserver-sb\") pod \"dnsmasq-dns-5dc68bd5-hzlg6\" (UID: \"47630cbc-a12d-4112-8f89-4f203bdc9649\") " pod="openstack/dnsmasq-dns-5dc68bd5-hzlg6"
Oct 08 18:29:42 crc kubenswrapper[4750]: I1008 18:29:42.549203 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47630cbc-a12d-4112-8f89-4f203bdc9649-config\") pod \"dnsmasq-dns-5dc68bd5-hzlg6\" (UID: \"47630cbc-a12d-4112-8f89-4f203bdc9649\") " pod="openstack/dnsmasq-dns-5dc68bd5-hzlg6"
Oct 08 18:29:42 crc kubenswrapper[4750]: I1008 18:29:42.549245 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/47630cbc-a12d-4112-8f89-4f203bdc9649-dns-svc\") pod \"dnsmasq-dns-5dc68bd5-hzlg6\" (UID: \"47630cbc-a12d-4112-8f89-4f203bdc9649\") " pod="openstack/dnsmasq-dns-5dc68bd5-hzlg6"
Oct 08 18:29:42 crc kubenswrapper[4750]: I1008 18:29:42.549264 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/47630cbc-a12d-4112-8f89-4f203bdc9649-ovsdbserver-nb\") pod \"dnsmasq-dns-5dc68bd5-hzlg6\" (UID: \"47630cbc-a12d-4112-8f89-4f203bdc9649\") " pod="openstack/dnsmasq-dns-5dc68bd5-hzlg6"
Oct 08 18:29:42 crc kubenswrapper[4750]: I1008 18:29:42.549279 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97sxm\" (UniqueName: \"kubernetes.io/projected/47630cbc-a12d-4112-8f89-4f203bdc9649-kube-api-access-97sxm\") pod \"dnsmasq-dns-5dc68bd5-hzlg6\" (UID: \"47630cbc-a12d-4112-8f89-4f203bdc9649\") " pod="openstack/dnsmasq-dns-5dc68bd5-hzlg6"
Oct 08 18:29:42 crc kubenswrapper[4750]: I1008 18:29:42.549323 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/47630cbc-a12d-4112-8f89-4f203bdc9649-dns-swift-storage-0\") pod \"dnsmasq-dns-5dc68bd5-hzlg6\" (UID: \"47630cbc-a12d-4112-8f89-4f203bdc9649\") " pod="openstack/dnsmasq-dns-5dc68bd5-hzlg6"
Oct 08 18:29:42 crc kubenswrapper[4750]: I1008 18:29:42.550362 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/47630cbc-a12d-4112-8f89-4f203bdc9649-dns-swift-storage-0\") pod \"dnsmasq-dns-5dc68bd5-hzlg6\" (UID: \"47630cbc-a12d-4112-8f89-4f203bdc9649\") " pod="openstack/dnsmasq-dns-5dc68bd5-hzlg6"
Oct 08 18:29:42 crc kubenswrapper[4750]: I1008 18:29:42.551045 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/47630cbc-a12d-4112-8f89-4f203bdc9649-ovsdbserver-sb\") pod \"dnsmasq-dns-5dc68bd5-hzlg6\" (UID: \"47630cbc-a12d-4112-8f89-4f203bdc9649\") " pod="openstack/dnsmasq-dns-5dc68bd5-hzlg6"
Oct 08 18:29:42 crc kubenswrapper[4750]: I1008 18:29:42.551651 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47630cbc-a12d-4112-8f89-4f203bdc9649-config\") pod \"dnsmasq-dns-5dc68bd5-hzlg6\" (UID: \"47630cbc-a12d-4112-8f89-4f203bdc9649\") " pod="openstack/dnsmasq-dns-5dc68bd5-hzlg6"
Oct 08 18:29:42 crc kubenswrapper[4750]: I1008 18:29:42.552182 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/47630cbc-a12d-4112-8f89-4f203bdc9649-dns-svc\") pod \"dnsmasq-dns-5dc68bd5-hzlg6\" (UID: \"47630cbc-a12d-4112-8f89-4f203bdc9649\") " pod="openstack/dnsmasq-dns-5dc68bd5-hzlg6"
Oct 08 18:29:42 crc kubenswrapper[4750]: I1008 18:29:42.552732 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/47630cbc-a12d-4112-8f89-4f203bdc9649-ovsdbserver-nb\") pod \"dnsmasq-dns-5dc68bd5-hzlg6\" (UID: \"47630cbc-a12d-4112-8f89-4f203bdc9649\") " pod="openstack/dnsmasq-dns-5dc68bd5-hzlg6"
Oct 08 18:29:42 crc kubenswrapper[4750]: I1008 18:29:42.572951 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97sxm\" (UniqueName: \"kubernetes.io/projected/47630cbc-a12d-4112-8f89-4f203bdc9649-kube-api-access-97sxm\") pod \"dnsmasq-dns-5dc68bd5-hzlg6\" (UID: \"47630cbc-a12d-4112-8f89-4f203bdc9649\") " pod="openstack/dnsmasq-dns-5dc68bd5-hzlg6"
Oct 08 18:29:42 crc kubenswrapper[4750]: I1008 18:29:42.588832 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dc68bd5-hzlg6"
Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.255514 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.258877 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.262276 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.262699 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.263031 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-s64q7"
Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.267046 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.338210 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.340492 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.353198 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.358976 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.363022 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59640e5b-309c-4d8e-9a22-d55a95d25326-logs\") pod \"glance-default-external-api-0\" (UID: \"59640e5b-309c-4d8e-9a22-d55a95d25326\") " pod="openstack/glance-default-external-api-0"
Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.363160 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59640e5b-309c-4d8e-9a22-d55a95d25326-scripts\") pod \"glance-default-external-api-0\" (UID: \"59640e5b-309c-4d8e-9a22-d55a95d25326\") " pod="openstack/glance-default-external-api-0"
Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.363233 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4plq2\" (UniqueName: \"kubernetes.io/projected/59640e5b-309c-4d8e-9a22-d55a95d25326-kube-api-access-4plq2\") pod \"glance-default-external-api-0\" (UID: \"59640e5b-309c-4d8e-9a22-d55a95d25326\") " pod="openstack/glance-default-external-api-0"
Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.363274 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59640e5b-309c-4d8e-9a22-d55a95d25326-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"59640e5b-309c-4d8e-9a22-d55a95d25326\") " pod="openstack/glance-default-external-api-0"
Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.363309 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/59640e5b-309c-4d8e-9a22-d55a95d25326-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"59640e5b-309c-4d8e-9a22-d55a95d25326\") " pod="openstack/glance-default-external-api-0"
Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.363334 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59640e5b-309c-4d8e-9a22-d55a95d25326-config-data\") pod \"glance-default-external-api-0\" (UID: \"59640e5b-309c-4d8e-9a22-d55a95d25326\") " pod="openstack/glance-default-external-api-0"
Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.363418 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"59640e5b-309c-4d8e-9a22-d55a95d25326\") " pod="openstack/glance-default-external-api-0"
Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.465164 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59640e5b-309c-4d8e-9a22-d55a95d25326-scripts\") pod \"glance-default-external-api-0\" (UID: \"59640e5b-309c-4d8e-9a22-d55a95d25326\") " pod="openstack/glance-default-external-api-0"
Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.465224 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6dbed40-0ce9-44de-9bfc-1c83c449f6c1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c6dbed40-0ce9-44de-9bfc-1c83c449f6c1\") "
pod="openstack/glance-default-internal-api-0" Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.465254 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4plq2\" (UniqueName: \"kubernetes.io/projected/59640e5b-309c-4d8e-9a22-d55a95d25326-kube-api-access-4plq2\") pod \"glance-default-external-api-0\" (UID: \"59640e5b-309c-4d8e-9a22-d55a95d25326\") " pod="openstack/glance-default-external-api-0" Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.465268 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6dbed40-0ce9-44de-9bfc-1c83c449f6c1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c6dbed40-0ce9-44de-9bfc-1c83c449f6c1\") " pod="openstack/glance-default-internal-api-0" Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.465292 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59640e5b-309c-4d8e-9a22-d55a95d25326-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"59640e5b-309c-4d8e-9a22-d55a95d25326\") " pod="openstack/glance-default-external-api-0" Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.465313 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/59640e5b-309c-4d8e-9a22-d55a95d25326-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"59640e5b-309c-4d8e-9a22-d55a95d25326\") " pod="openstack/glance-default-external-api-0" Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.465329 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59640e5b-309c-4d8e-9a22-d55a95d25326-config-data\") pod \"glance-default-external-api-0\" (UID: \"59640e5b-309c-4d8e-9a22-d55a95d25326\") " 
pod="openstack/glance-default-external-api-0" Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.465345 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqgnw\" (UniqueName: \"kubernetes.io/projected/c6dbed40-0ce9-44de-9bfc-1c83c449f6c1-kube-api-access-pqgnw\") pod \"glance-default-internal-api-0\" (UID: \"c6dbed40-0ce9-44de-9bfc-1c83c449f6c1\") " pod="openstack/glance-default-internal-api-0" Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.465523 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"59640e5b-309c-4d8e-9a22-d55a95d25326\") " pod="openstack/glance-default-external-api-0" Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.465571 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6dbed40-0ce9-44de-9bfc-1c83c449f6c1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c6dbed40-0ce9-44de-9bfc-1c83c449f6c1\") " pod="openstack/glance-default-internal-api-0" Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.465609 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"c6dbed40-0ce9-44de-9bfc-1c83c449f6c1\") " pod="openstack/glance-default-internal-api-0" Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.465630 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6dbed40-0ce9-44de-9bfc-1c83c449f6c1-logs\") pod \"glance-default-internal-api-0\" (UID: \"c6dbed40-0ce9-44de-9bfc-1c83c449f6c1\") " 
pod="openstack/glance-default-internal-api-0" Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.465663 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c6dbed40-0ce9-44de-9bfc-1c83c449f6c1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c6dbed40-0ce9-44de-9bfc-1c83c449f6c1\") " pod="openstack/glance-default-internal-api-0" Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.465697 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59640e5b-309c-4d8e-9a22-d55a95d25326-logs\") pod \"glance-default-external-api-0\" (UID: \"59640e5b-309c-4d8e-9a22-d55a95d25326\") " pod="openstack/glance-default-external-api-0" Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.465772 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/59640e5b-309c-4d8e-9a22-d55a95d25326-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"59640e5b-309c-4d8e-9a22-d55a95d25326\") " pod="openstack/glance-default-external-api-0" Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.465892 4750 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"59640e5b-309c-4d8e-9a22-d55a95d25326\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.466028 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59640e5b-309c-4d8e-9a22-d55a95d25326-logs\") pod \"glance-default-external-api-0\" (UID: \"59640e5b-309c-4d8e-9a22-d55a95d25326\") " pod="openstack/glance-default-external-api-0" Oct 08 18:29:43 crc kubenswrapper[4750]: 
I1008 18:29:43.469101 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59640e5b-309c-4d8e-9a22-d55a95d25326-scripts\") pod \"glance-default-external-api-0\" (UID: \"59640e5b-309c-4d8e-9a22-d55a95d25326\") " pod="openstack/glance-default-external-api-0" Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.469934 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59640e5b-309c-4d8e-9a22-d55a95d25326-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"59640e5b-309c-4d8e-9a22-d55a95d25326\") " pod="openstack/glance-default-external-api-0" Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.471413 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59640e5b-309c-4d8e-9a22-d55a95d25326-config-data\") pod \"glance-default-external-api-0\" (UID: \"59640e5b-309c-4d8e-9a22-d55a95d25326\") " pod="openstack/glance-default-external-api-0" Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.484206 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4plq2\" (UniqueName: \"kubernetes.io/projected/59640e5b-309c-4d8e-9a22-d55a95d25326-kube-api-access-4plq2\") pod \"glance-default-external-api-0\" (UID: \"59640e5b-309c-4d8e-9a22-d55a95d25326\") " pod="openstack/glance-default-external-api-0" Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.497652 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"59640e5b-309c-4d8e-9a22-d55a95d25326\") " pod="openstack/glance-default-external-api-0" Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.567121 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6dbed40-0ce9-44de-9bfc-1c83c449f6c1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c6dbed40-0ce9-44de-9bfc-1c83c449f6c1\") " pod="openstack/glance-default-internal-api-0" Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.567358 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqgnw\" (UniqueName: \"kubernetes.io/projected/c6dbed40-0ce9-44de-9bfc-1c83c449f6c1-kube-api-access-pqgnw\") pod \"glance-default-internal-api-0\" (UID: \"c6dbed40-0ce9-44de-9bfc-1c83c449f6c1\") " pod="openstack/glance-default-internal-api-0" Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.567407 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6dbed40-0ce9-44de-9bfc-1c83c449f6c1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c6dbed40-0ce9-44de-9bfc-1c83c449f6c1\") " pod="openstack/glance-default-internal-api-0" Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.567437 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"c6dbed40-0ce9-44de-9bfc-1c83c449f6c1\") " pod="openstack/glance-default-internal-api-0" Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.567456 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6dbed40-0ce9-44de-9bfc-1c83c449f6c1-logs\") pod \"glance-default-internal-api-0\" (UID: \"c6dbed40-0ce9-44de-9bfc-1c83c449f6c1\") " pod="openstack/glance-default-internal-api-0" Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.567479 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/c6dbed40-0ce9-44de-9bfc-1c83c449f6c1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c6dbed40-0ce9-44de-9bfc-1c83c449f6c1\") " pod="openstack/glance-default-internal-api-0" Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.567537 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6dbed40-0ce9-44de-9bfc-1c83c449f6c1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c6dbed40-0ce9-44de-9bfc-1c83c449f6c1\") " pod="openstack/glance-default-internal-api-0" Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.567891 4750 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"c6dbed40-0ce9-44de-9bfc-1c83c449f6c1\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.568672 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c6dbed40-0ce9-44de-9bfc-1c83c449f6c1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c6dbed40-0ce9-44de-9bfc-1c83c449f6c1\") " pod="openstack/glance-default-internal-api-0" Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.568803 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6dbed40-0ce9-44de-9bfc-1c83c449f6c1-logs\") pod \"glance-default-internal-api-0\" (UID: \"c6dbed40-0ce9-44de-9bfc-1c83c449f6c1\") " pod="openstack/glance-default-internal-api-0" Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.572706 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6dbed40-0ce9-44de-9bfc-1c83c449f6c1-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"c6dbed40-0ce9-44de-9bfc-1c83c449f6c1\") " pod="openstack/glance-default-internal-api-0" Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.583512 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6dbed40-0ce9-44de-9bfc-1c83c449f6c1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c6dbed40-0ce9-44de-9bfc-1c83c449f6c1\") " pod="openstack/glance-default-internal-api-0" Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.584831 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6dbed40-0ce9-44de-9bfc-1c83c449f6c1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c6dbed40-0ce9-44de-9bfc-1c83c449f6c1\") " pod="openstack/glance-default-internal-api-0" Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.588121 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.589821 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqgnw\" (UniqueName: \"kubernetes.io/projected/c6dbed40-0ce9-44de-9bfc-1c83c449f6c1-kube-api-access-pqgnw\") pod \"glance-default-internal-api-0\" (UID: \"c6dbed40-0ce9-44de-9bfc-1c83c449f6c1\") " pod="openstack/glance-default-internal-api-0" Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.610917 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"c6dbed40-0ce9-44de-9bfc-1c83c449f6c1\") " pod="openstack/glance-default-internal-api-0" Oct 08 18:29:43 crc kubenswrapper[4750]: I1008 18:29:43.666399 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 18:29:44 crc kubenswrapper[4750]: I1008 18:29:44.140931 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-w4fvs" Oct 08 18:29:44 crc kubenswrapper[4750]: I1008 18:29:44.279136 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ad0cb6f-78d3-48ea-943f-ef07e0b52886-scripts\") pod \"3ad0cb6f-78d3-48ea-943f-ef07e0b52886\" (UID: \"3ad0cb6f-78d3-48ea-943f-ef07e0b52886\") " Oct 08 18:29:44 crc kubenswrapper[4750]: I1008 18:29:44.279307 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3ad0cb6f-78d3-48ea-943f-ef07e0b52886-fernet-keys\") pod \"3ad0cb6f-78d3-48ea-943f-ef07e0b52886\" (UID: \"3ad0cb6f-78d3-48ea-943f-ef07e0b52886\") " Oct 08 18:29:44 crc kubenswrapper[4750]: I1008 18:29:44.279391 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzcfk\" (UniqueName: \"kubernetes.io/projected/3ad0cb6f-78d3-48ea-943f-ef07e0b52886-kube-api-access-hzcfk\") pod \"3ad0cb6f-78d3-48ea-943f-ef07e0b52886\" (UID: \"3ad0cb6f-78d3-48ea-943f-ef07e0b52886\") " Oct 08 18:29:44 crc kubenswrapper[4750]: I1008 18:29:44.279423 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ad0cb6f-78d3-48ea-943f-ef07e0b52886-combined-ca-bundle\") pod \"3ad0cb6f-78d3-48ea-943f-ef07e0b52886\" (UID: \"3ad0cb6f-78d3-48ea-943f-ef07e0b52886\") " Oct 08 18:29:44 crc kubenswrapper[4750]: I1008 18:29:44.279473 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ad0cb6f-78d3-48ea-943f-ef07e0b52886-config-data\") pod \"3ad0cb6f-78d3-48ea-943f-ef07e0b52886\" (UID: 
\"3ad0cb6f-78d3-48ea-943f-ef07e0b52886\") " Oct 08 18:29:44 crc kubenswrapper[4750]: I1008 18:29:44.279490 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3ad0cb6f-78d3-48ea-943f-ef07e0b52886-credential-keys\") pod \"3ad0cb6f-78d3-48ea-943f-ef07e0b52886\" (UID: \"3ad0cb6f-78d3-48ea-943f-ef07e0b52886\") " Oct 08 18:29:44 crc kubenswrapper[4750]: I1008 18:29:44.284559 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ad0cb6f-78d3-48ea-943f-ef07e0b52886-scripts" (OuterVolumeSpecName: "scripts") pod "3ad0cb6f-78d3-48ea-943f-ef07e0b52886" (UID: "3ad0cb6f-78d3-48ea-943f-ef07e0b52886"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:29:44 crc kubenswrapper[4750]: I1008 18:29:44.284974 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ad0cb6f-78d3-48ea-943f-ef07e0b52886-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3ad0cb6f-78d3-48ea-943f-ef07e0b52886" (UID: "3ad0cb6f-78d3-48ea-943f-ef07e0b52886"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:29:44 crc kubenswrapper[4750]: I1008 18:29:44.285292 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ad0cb6f-78d3-48ea-943f-ef07e0b52886-kube-api-access-hzcfk" (OuterVolumeSpecName: "kube-api-access-hzcfk") pod "3ad0cb6f-78d3-48ea-943f-ef07e0b52886" (UID: "3ad0cb6f-78d3-48ea-943f-ef07e0b52886"). InnerVolumeSpecName "kube-api-access-hzcfk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:29:44 crc kubenswrapper[4750]: I1008 18:29:44.285533 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ad0cb6f-78d3-48ea-943f-ef07e0b52886-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "3ad0cb6f-78d3-48ea-943f-ef07e0b52886" (UID: "3ad0cb6f-78d3-48ea-943f-ef07e0b52886"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:29:44 crc kubenswrapper[4750]: I1008 18:29:44.309703 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ad0cb6f-78d3-48ea-943f-ef07e0b52886-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ad0cb6f-78d3-48ea-943f-ef07e0b52886" (UID: "3ad0cb6f-78d3-48ea-943f-ef07e0b52886"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:29:44 crc kubenswrapper[4750]: I1008 18:29:44.319334 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ad0cb6f-78d3-48ea-943f-ef07e0b52886-config-data" (OuterVolumeSpecName: "config-data") pod "3ad0cb6f-78d3-48ea-943f-ef07e0b52886" (UID: "3ad0cb6f-78d3-48ea-943f-ef07e0b52886"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:29:44 crc kubenswrapper[4750]: I1008 18:29:44.381030 4750 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3ad0cb6f-78d3-48ea-943f-ef07e0b52886-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 08 18:29:44 crc kubenswrapper[4750]: I1008 18:29:44.381068 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzcfk\" (UniqueName: \"kubernetes.io/projected/3ad0cb6f-78d3-48ea-943f-ef07e0b52886-kube-api-access-hzcfk\") on node \"crc\" DevicePath \"\"" Oct 08 18:29:44 crc kubenswrapper[4750]: I1008 18:29:44.381083 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ad0cb6f-78d3-48ea-943f-ef07e0b52886-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:29:44 crc kubenswrapper[4750]: I1008 18:29:44.381092 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ad0cb6f-78d3-48ea-943f-ef07e0b52886-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:29:44 crc kubenswrapper[4750]: I1008 18:29:44.381101 4750 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3ad0cb6f-78d3-48ea-943f-ef07e0b52886-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 08 18:29:44 crc kubenswrapper[4750]: I1008 18:29:44.381111 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ad0cb6f-78d3-48ea-943f-ef07e0b52886-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 18:29:44 crc kubenswrapper[4750]: I1008 18:29:44.959386 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-w4fvs" event={"ID":"3ad0cb6f-78d3-48ea-943f-ef07e0b52886","Type":"ContainerDied","Data":"83aca028d9c525e961f00219cf6d2d332229d1b90532db8eced1f51adcbb5867"} Oct 08 18:29:44 crc kubenswrapper[4750]: I1008 
18:29:44.959718 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83aca028d9c525e961f00219cf6d2d332229d1b90532db8eced1f51adcbb5867" Oct 08 18:29:44 crc kubenswrapper[4750]: I1008 18:29:44.959790 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-w4fvs" Oct 08 18:29:45 crc kubenswrapper[4750]: I1008 18:29:45.313701 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-548c7c66b4-b72bl"] Oct 08 18:29:45 crc kubenswrapper[4750]: E1008 18:29:45.314206 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ad0cb6f-78d3-48ea-943f-ef07e0b52886" containerName="keystone-bootstrap" Oct 08 18:29:45 crc kubenswrapper[4750]: I1008 18:29:45.314218 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ad0cb6f-78d3-48ea-943f-ef07e0b52886" containerName="keystone-bootstrap" Oct 08 18:29:45 crc kubenswrapper[4750]: I1008 18:29:45.314391 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ad0cb6f-78d3-48ea-943f-ef07e0b52886" containerName="keystone-bootstrap" Oct 08 18:29:45 crc kubenswrapper[4750]: I1008 18:29:45.320526 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-548c7c66b4-b72bl" Oct 08 18:29:45 crc kubenswrapper[4750]: I1008 18:29:45.323400 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rx2f7" Oct 08 18:29:45 crc kubenswrapper[4750]: I1008 18:29:45.324084 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 08 18:29:45 crc kubenswrapper[4750]: I1008 18:29:45.324272 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 08 18:29:45 crc kubenswrapper[4750]: I1008 18:29:45.324436 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 08 18:29:45 crc kubenswrapper[4750]: I1008 18:29:45.324630 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 08 18:29:45 crc kubenswrapper[4750]: I1008 18:29:45.324834 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 08 18:29:45 crc kubenswrapper[4750]: I1008 18:29:45.326746 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-548c7c66b4-b72bl"] Oct 08 18:29:45 crc kubenswrapper[4750]: I1008 18:29:45.398918 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f22ab58-6189-4321-b660-ed992f6fb70f-combined-ca-bundle\") pod \"keystone-548c7c66b4-b72bl\" (UID: \"2f22ab58-6189-4321-b660-ed992f6fb70f\") " pod="openstack/keystone-548c7c66b4-b72bl" Oct 08 18:29:45 crc kubenswrapper[4750]: I1008 18:29:45.398968 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f22ab58-6189-4321-b660-ed992f6fb70f-internal-tls-certs\") pod \"keystone-548c7c66b4-b72bl\" (UID: \"2f22ab58-6189-4321-b660-ed992f6fb70f\") " 
pod="openstack/keystone-548c7c66b4-b72bl" Oct 08 18:29:45 crc kubenswrapper[4750]: I1008 18:29:45.398995 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f22ab58-6189-4321-b660-ed992f6fb70f-scripts\") pod \"keystone-548c7c66b4-b72bl\" (UID: \"2f22ab58-6189-4321-b660-ed992f6fb70f\") " pod="openstack/keystone-548c7c66b4-b72bl" Oct 08 18:29:45 crc kubenswrapper[4750]: I1008 18:29:45.399020 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f22ab58-6189-4321-b660-ed992f6fb70f-config-data\") pod \"keystone-548c7c66b4-b72bl\" (UID: \"2f22ab58-6189-4321-b660-ed992f6fb70f\") " pod="openstack/keystone-548c7c66b4-b72bl" Oct 08 18:29:45 crc kubenswrapper[4750]: I1008 18:29:45.399053 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v95ns\" (UniqueName: \"kubernetes.io/projected/2f22ab58-6189-4321-b660-ed992f6fb70f-kube-api-access-v95ns\") pod \"keystone-548c7c66b4-b72bl\" (UID: \"2f22ab58-6189-4321-b660-ed992f6fb70f\") " pod="openstack/keystone-548c7c66b4-b72bl" Oct 08 18:29:45 crc kubenswrapper[4750]: I1008 18:29:45.399082 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2f22ab58-6189-4321-b660-ed992f6fb70f-credential-keys\") pod \"keystone-548c7c66b4-b72bl\" (UID: \"2f22ab58-6189-4321-b660-ed992f6fb70f\") " pod="openstack/keystone-548c7c66b4-b72bl" Oct 08 18:29:45 crc kubenswrapper[4750]: I1008 18:29:45.399109 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f22ab58-6189-4321-b660-ed992f6fb70f-public-tls-certs\") pod \"keystone-548c7c66b4-b72bl\" (UID: \"2f22ab58-6189-4321-b660-ed992f6fb70f\") " 
pod="openstack/keystone-548c7c66b4-b72bl" Oct 08 18:29:45 crc kubenswrapper[4750]: I1008 18:29:45.399137 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2f22ab58-6189-4321-b660-ed992f6fb70f-fernet-keys\") pod \"keystone-548c7c66b4-b72bl\" (UID: \"2f22ab58-6189-4321-b660-ed992f6fb70f\") " pod="openstack/keystone-548c7c66b4-b72bl" Oct 08 18:29:45 crc kubenswrapper[4750]: I1008 18:29:45.440792 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 18:29:45 crc kubenswrapper[4750]: I1008 18:29:45.500525 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f22ab58-6189-4321-b660-ed992f6fb70f-internal-tls-certs\") pod \"keystone-548c7c66b4-b72bl\" (UID: \"2f22ab58-6189-4321-b660-ed992f6fb70f\") " pod="openstack/keystone-548c7c66b4-b72bl" Oct 08 18:29:45 crc kubenswrapper[4750]: I1008 18:29:45.500602 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f22ab58-6189-4321-b660-ed992f6fb70f-scripts\") pod \"keystone-548c7c66b4-b72bl\" (UID: \"2f22ab58-6189-4321-b660-ed992f6fb70f\") " pod="openstack/keystone-548c7c66b4-b72bl" Oct 08 18:29:45 crc kubenswrapper[4750]: I1008 18:29:45.500640 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f22ab58-6189-4321-b660-ed992f6fb70f-config-data\") pod \"keystone-548c7c66b4-b72bl\" (UID: \"2f22ab58-6189-4321-b660-ed992f6fb70f\") " pod="openstack/keystone-548c7c66b4-b72bl" Oct 08 18:29:45 crc kubenswrapper[4750]: I1008 18:29:45.500672 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v95ns\" (UniqueName: \"kubernetes.io/projected/2f22ab58-6189-4321-b660-ed992f6fb70f-kube-api-access-v95ns\") 
pod \"keystone-548c7c66b4-b72bl\" (UID: \"2f22ab58-6189-4321-b660-ed992f6fb70f\") " pod="openstack/keystone-548c7c66b4-b72bl" Oct 08 18:29:45 crc kubenswrapper[4750]: I1008 18:29:45.500715 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2f22ab58-6189-4321-b660-ed992f6fb70f-credential-keys\") pod \"keystone-548c7c66b4-b72bl\" (UID: \"2f22ab58-6189-4321-b660-ed992f6fb70f\") " pod="openstack/keystone-548c7c66b4-b72bl" Oct 08 18:29:45 crc kubenswrapper[4750]: I1008 18:29:45.500737 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f22ab58-6189-4321-b660-ed992f6fb70f-public-tls-certs\") pod \"keystone-548c7c66b4-b72bl\" (UID: \"2f22ab58-6189-4321-b660-ed992f6fb70f\") " pod="openstack/keystone-548c7c66b4-b72bl" Oct 08 18:29:45 crc kubenswrapper[4750]: I1008 18:29:45.500780 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2f22ab58-6189-4321-b660-ed992f6fb70f-fernet-keys\") pod \"keystone-548c7c66b4-b72bl\" (UID: \"2f22ab58-6189-4321-b660-ed992f6fb70f\") " pod="openstack/keystone-548c7c66b4-b72bl" Oct 08 18:29:45 crc kubenswrapper[4750]: I1008 18:29:45.500886 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f22ab58-6189-4321-b660-ed992f6fb70f-combined-ca-bundle\") pod \"keystone-548c7c66b4-b72bl\" (UID: \"2f22ab58-6189-4321-b660-ed992f6fb70f\") " pod="openstack/keystone-548c7c66b4-b72bl" Oct 08 18:29:45 crc kubenswrapper[4750]: I1008 18:29:45.506241 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f22ab58-6189-4321-b660-ed992f6fb70f-scripts\") pod \"keystone-548c7c66b4-b72bl\" (UID: \"2f22ab58-6189-4321-b660-ed992f6fb70f\") " 
pod="openstack/keystone-548c7c66b4-b72bl" Oct 08 18:29:45 crc kubenswrapper[4750]: I1008 18:29:45.506945 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f22ab58-6189-4321-b660-ed992f6fb70f-combined-ca-bundle\") pod \"keystone-548c7c66b4-b72bl\" (UID: \"2f22ab58-6189-4321-b660-ed992f6fb70f\") " pod="openstack/keystone-548c7c66b4-b72bl" Oct 08 18:29:45 crc kubenswrapper[4750]: I1008 18:29:45.507586 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2f22ab58-6189-4321-b660-ed992f6fb70f-fernet-keys\") pod \"keystone-548c7c66b4-b72bl\" (UID: \"2f22ab58-6189-4321-b660-ed992f6fb70f\") " pod="openstack/keystone-548c7c66b4-b72bl" Oct 08 18:29:45 crc kubenswrapper[4750]: I1008 18:29:45.507794 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f22ab58-6189-4321-b660-ed992f6fb70f-config-data\") pod \"keystone-548c7c66b4-b72bl\" (UID: \"2f22ab58-6189-4321-b660-ed992f6fb70f\") " pod="openstack/keystone-548c7c66b4-b72bl" Oct 08 18:29:45 crc kubenswrapper[4750]: I1008 18:29:45.507975 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f22ab58-6189-4321-b660-ed992f6fb70f-internal-tls-certs\") pod \"keystone-548c7c66b4-b72bl\" (UID: \"2f22ab58-6189-4321-b660-ed992f6fb70f\") " pod="openstack/keystone-548c7c66b4-b72bl" Oct 08 18:29:45 crc kubenswrapper[4750]: I1008 18:29:45.511269 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2f22ab58-6189-4321-b660-ed992f6fb70f-credential-keys\") pod \"keystone-548c7c66b4-b72bl\" (UID: \"2f22ab58-6189-4321-b660-ed992f6fb70f\") " pod="openstack/keystone-548c7c66b4-b72bl" Oct 08 18:29:45 crc kubenswrapper[4750]: I1008 18:29:45.514350 4750 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 18:29:45 crc kubenswrapper[4750]: I1008 18:29:45.522704 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f22ab58-6189-4321-b660-ed992f6fb70f-public-tls-certs\") pod \"keystone-548c7c66b4-b72bl\" (UID: \"2f22ab58-6189-4321-b660-ed992f6fb70f\") " pod="openstack/keystone-548c7c66b4-b72bl" Oct 08 18:29:45 crc kubenswrapper[4750]: I1008 18:29:45.523543 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v95ns\" (UniqueName: \"kubernetes.io/projected/2f22ab58-6189-4321-b660-ed992f6fb70f-kube-api-access-v95ns\") pod \"keystone-548c7c66b4-b72bl\" (UID: \"2f22ab58-6189-4321-b660-ed992f6fb70f\") " pod="openstack/keystone-548c7c66b4-b72bl" Oct 08 18:29:45 crc kubenswrapper[4750]: I1008 18:29:45.642030 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-548c7c66b4-b72bl" Oct 08 18:29:57 crc kubenswrapper[4750]: E1008 18:29:57.819673 4750 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:85c75d60e1bd2f8a9ea0a2bb21a8df64c0a6f7b504cc1a05a355981d4b90e92f" Oct 08 18:29:57 crc kubenswrapper[4750]: E1008 18:29:57.820296 4750 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:85c75d60e1bd2f8a9ea0a2bb21a8df64c0a6f7b504cc1a05a355981d4b90e92f,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dn9hq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-scvbb_openstack(c9e2d08e-75b7-445b-b563-affabf6d8af6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 08 18:29:57 crc kubenswrapper[4750]: E1008 18:29:57.821462 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-scvbb" podUID="c9e2d08e-75b7-445b-b563-affabf6d8af6" Oct 08 18:29:58 crc kubenswrapper[4750]: E1008 18:29:58.067899 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:85c75d60e1bd2f8a9ea0a2bb21a8df64c0a6f7b504cc1a05a355981d4b90e92f\\\"\"" pod="openstack/cinder-db-sync-scvbb" podUID="c9e2d08e-75b7-445b-b563-affabf6d8af6" Oct 08 18:29:58 crc kubenswrapper[4750]: I1008 18:29:58.309099 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dc68bd5-hzlg6"] Oct 08 18:29:58 crc kubenswrapper[4750]: I1008 18:29:58.322597 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-548c7c66b4-b72bl"] Oct 08 18:29:58 crc kubenswrapper[4750]: W1008 18:29:58.323411 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f22ab58_6189_4321_b660_ed992f6fb70f.slice/crio-d16b394fd7bb0c54355f4e32ff41cd24ce068c245ebaa22449979758dc656a7f WatchSource:0}: Error finding container d16b394fd7bb0c54355f4e32ff41cd24ce068c245ebaa22449979758dc656a7f: Status 404 returned error can't find the container with id 
d16b394fd7bb0c54355f4e32ff41cd24ce068c245ebaa22449979758dc656a7f Oct 08 18:29:58 crc kubenswrapper[4750]: I1008 18:29:58.512434 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 18:29:58 crc kubenswrapper[4750]: W1008 18:29:58.586380 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59640e5b_309c_4d8e_9a22_d55a95d25326.slice/crio-b3dca44889f986465b261ad5894936ef55e8180b9d9183dd1e7d1a731fac161a WatchSource:0}: Error finding container b3dca44889f986465b261ad5894936ef55e8180b9d9183dd1e7d1a731fac161a: Status 404 returned error can't find the container with id b3dca44889f986465b261ad5894936ef55e8180b9d9183dd1e7d1a731fac161a Oct 08 18:29:59 crc kubenswrapper[4750]: I1008 18:29:59.076685 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ndlps" event={"ID":"aae9b192-27dc-4bad-a9ae-6e03824c59f0","Type":"ContainerStarted","Data":"0a43919de404773932dbdd0d928ccfb9e008e4049e2135f7c1de0c5a8fb787e1"} Oct 08 18:29:59 crc kubenswrapper[4750]: I1008 18:29:59.081111 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a","Type":"ContainerStarted","Data":"cb4fd7025dd8ba93695bc91efa2612277cfde608b5b1db67db82a71242e82d1e"} Oct 08 18:29:59 crc kubenswrapper[4750]: I1008 18:29:59.093024 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-548c7c66b4-b72bl" event={"ID":"2f22ab58-6189-4321-b660-ed992f6fb70f","Type":"ContainerStarted","Data":"e4b287edf9036a666587e6c4e08dfb587495ff24b4fe7daeaedc5be9ebb8a5b2"} Oct 08 18:29:59 crc kubenswrapper[4750]: I1008 18:29:59.093054 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-548c7c66b4-b72bl" 
event={"ID":"2f22ab58-6189-4321-b660-ed992f6fb70f","Type":"ContainerStarted","Data":"d16b394fd7bb0c54355f4e32ff41cd24ce068c245ebaa22449979758dc656a7f"} Oct 08 18:29:59 crc kubenswrapper[4750]: I1008 18:29:59.093149 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-548c7c66b4-b72bl" Oct 08 18:29:59 crc kubenswrapper[4750]: I1008 18:29:59.095611 4750 generic.go:334] "Generic (PLEG): container finished" podID="47630cbc-a12d-4112-8f89-4f203bdc9649" containerID="6d2d81f330918bc137f967d02746373a6aaad1b372fab751954c626da8c09f77" exitCode=0 Oct 08 18:29:59 crc kubenswrapper[4750]: I1008 18:29:59.095663 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc68bd5-hzlg6" event={"ID":"47630cbc-a12d-4112-8f89-4f203bdc9649","Type":"ContainerDied","Data":"6d2d81f330918bc137f967d02746373a6aaad1b372fab751954c626da8c09f77"} Oct 08 18:29:59 crc kubenswrapper[4750]: I1008 18:29:59.095679 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc68bd5-hzlg6" event={"ID":"47630cbc-a12d-4112-8f89-4f203bdc9649","Type":"ContainerStarted","Data":"5e932747dfe270e08f17fe18c54d358f15690fd2f2dad4cc4df48ce8041b471b"} Oct 08 18:29:59 crc kubenswrapper[4750]: I1008 18:29:59.102356 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-ndlps" podStartSLOduration=3.091517736 podStartE2EDuration="24.102340593s" podCreationTimestamp="2025-10-08 18:29:35 +0000 UTC" firstStartedPulling="2025-10-08 18:29:36.783706153 +0000 UTC m=+1132.696677166" lastFinishedPulling="2025-10-08 18:29:57.79452901 +0000 UTC m=+1153.707500023" observedRunningTime="2025-10-08 18:29:59.09817078 +0000 UTC m=+1155.011141783" watchObservedRunningTime="2025-10-08 18:29:59.102340593 +0000 UTC m=+1155.015311606" Oct 08 18:29:59 crc kubenswrapper[4750]: I1008 18:29:59.109259 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-rlsgw" 
event={"ID":"bdfe0b7d-f014-4453-b94c-3842ebdd4052","Type":"ContainerStarted","Data":"69359e54f0c54f8a5c8865533bf2f4afa5a118f2e82303c1851cef48e5f58c91"} Oct 08 18:29:59 crc kubenswrapper[4750]: I1008 18:29:59.115681 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"59640e5b-309c-4d8e-9a22-d55a95d25326","Type":"ContainerStarted","Data":"b3dca44889f986465b261ad5894936ef55e8180b9d9183dd1e7d1a731fac161a"} Oct 08 18:29:59 crc kubenswrapper[4750]: I1008 18:29:59.128253 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-548c7c66b4-b72bl" podStartSLOduration=14.128238457 podStartE2EDuration="14.128238457s" podCreationTimestamp="2025-10-08 18:29:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:29:59.12021193 +0000 UTC m=+1155.033182953" watchObservedRunningTime="2025-10-08 18:29:59.128238457 +0000 UTC m=+1155.041209470" Oct 08 18:29:59 crc kubenswrapper[4750]: I1008 18:29:59.162275 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-rlsgw" podStartSLOduration=2.520164323 podStartE2EDuration="39.16225687s" podCreationTimestamp="2025-10-08 18:29:20 +0000 UTC" firstStartedPulling="2025-10-08 18:29:21.248756472 +0000 UTC m=+1117.161727485" lastFinishedPulling="2025-10-08 18:29:57.890849019 +0000 UTC m=+1153.803820032" observedRunningTime="2025-10-08 18:29:59.159926173 +0000 UTC m=+1155.072897186" watchObservedRunningTime="2025-10-08 18:29:59.16225687 +0000 UTC m=+1155.075227883" Oct 08 18:29:59 crc kubenswrapper[4750]: I1008 18:29:59.276195 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 18:29:59 crc kubenswrapper[4750]: W1008 18:29:59.280762 4750 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6dbed40_0ce9_44de_9bfc_1c83c449f6c1.slice/crio-7f4b23cc1cf7bb3b5c5afac8e848b395c4ab68838d2b3ce6fb6767a0dc5f15fa WatchSource:0}: Error finding container 7f4b23cc1cf7bb3b5c5afac8e848b395c4ab68838d2b3ce6fb6767a0dc5f15fa: Status 404 returned error can't find the container with id 7f4b23cc1cf7bb3b5c5afac8e848b395c4ab68838d2b3ce6fb6767a0dc5f15fa Oct 08 18:29:59 crc kubenswrapper[4750]: I1008 18:29:59.707243 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 18:29:59 crc kubenswrapper[4750]: I1008 18:29:59.707336 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 18:29:59 crc kubenswrapper[4750]: I1008 18:29:59.707386 4750 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-grddb" Oct 08 18:29:59 crc kubenswrapper[4750]: I1008 18:29:59.708363 4750 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a27f8518311deef574465704a4c93c21d7cb4e76fec24d95f01a5d7c9febd08d"} pod="openshift-machine-config-operator/machine-config-daemon-grddb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 18:29:59 crc kubenswrapper[4750]: I1008 18:29:59.708451 4750 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" containerID="cri-o://a27f8518311deef574465704a4c93c21d7cb4e76fec24d95f01a5d7c9febd08d" gracePeriod=600 Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.132303 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"59640e5b-309c-4d8e-9a22-d55a95d25326","Type":"ContainerStarted","Data":"acb8a2ed1f530db794637cfa607fe438e6439183a9214bc6c8fa622548ba0140"} Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.132663 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"59640e5b-309c-4d8e-9a22-d55a95d25326","Type":"ContainerStarted","Data":"2650d5c896e03d67599c7a033e59b3087d01fd5dfc9d7ddcc272b926b4b6aea6"} Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.132458 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="59640e5b-309c-4d8e-9a22-d55a95d25326" containerName="glance-httpd" containerID="cri-o://2650d5c896e03d67599c7a033e59b3087d01fd5dfc9d7ddcc272b926b4b6aea6" gracePeriod=30 Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.132371 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="59640e5b-309c-4d8e-9a22-d55a95d25326" containerName="glance-log" containerID="cri-o://acb8a2ed1f530db794637cfa607fe438e6439183a9214bc6c8fa622548ba0140" gracePeriod=30 Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.155812 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c6dbed40-0ce9-44de-9bfc-1c83c449f6c1","Type":"ContainerStarted","Data":"63566589d72c9635e0836a0ad15cee4e9587d1b1cd0ae81824f64803e8b595f8"} Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.156081 4750 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c6dbed40-0ce9-44de-9bfc-1c83c449f6c1","Type":"ContainerStarted","Data":"7f4b23cc1cf7bb3b5c5afac8e848b395c4ab68838d2b3ce6fb6767a0dc5f15fa"} Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.158075 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332470-q9tpl"] Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.159409 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332470-q9tpl" Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.162717 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.162967 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.175536 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332470-q9tpl"] Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.177642 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=18.177620688 podStartE2EDuration="18.177620688s" podCreationTimestamp="2025-10-08 18:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:30:00.160532638 +0000 UTC m=+1156.073503651" watchObservedRunningTime="2025-10-08 18:30:00.177620688 +0000 UTC m=+1156.090591701" Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.180335 4750 generic.go:334] "Generic (PLEG): container finished" podID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" 
containerID="a27f8518311deef574465704a4c93c21d7cb4e76fec24d95f01a5d7c9febd08d" exitCode=0 Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.180410 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerDied","Data":"a27f8518311deef574465704a4c93c21d7cb4e76fec24d95f01a5d7c9febd08d"} Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.180438 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerStarted","Data":"dabcc140df6267bd6bfe6e96a507eb6f8fc953553e99e9c93846e177880aa4e8"} Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.180454 4750 scope.go:117] "RemoveContainer" containerID="dee2d35a9b3eba166b103d6a720e7cbf72b0876a67fbdc37629a8900d4d09d57" Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.186532 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc68bd5-hzlg6" event={"ID":"47630cbc-a12d-4112-8f89-4f203bdc9649","Type":"ContainerStarted","Data":"ca4029d61ca8c3a69a52b031c6cf5be834790b73d98c1a84ffbda4221e421b3e"} Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.186988 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5dc68bd5-hzlg6" Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.227716 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5dc68bd5-hzlg6" podStartSLOduration=18.227693984 podStartE2EDuration="18.227693984s" podCreationTimestamp="2025-10-08 18:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:30:00.213580668 +0000 UTC m=+1156.126551691" watchObservedRunningTime="2025-10-08 18:30:00.227693984 +0000 UTC m=+1156.140665017" 
Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.252156 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a4c9a37b-ca76-4dad-bbb1-adb67557d216-secret-volume\") pod \"collect-profiles-29332470-q9tpl\" (UID: \"a4c9a37b-ca76-4dad-bbb1-adb67557d216\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332470-q9tpl" Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.252464 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctwl5\" (UniqueName: \"kubernetes.io/projected/a4c9a37b-ca76-4dad-bbb1-adb67557d216-kube-api-access-ctwl5\") pod \"collect-profiles-29332470-q9tpl\" (UID: \"a4c9a37b-ca76-4dad-bbb1-adb67557d216\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332470-q9tpl" Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.252665 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a4c9a37b-ca76-4dad-bbb1-adb67557d216-config-volume\") pod \"collect-profiles-29332470-q9tpl\" (UID: \"a4c9a37b-ca76-4dad-bbb1-adb67557d216\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332470-q9tpl" Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.353989 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctwl5\" (UniqueName: \"kubernetes.io/projected/a4c9a37b-ca76-4dad-bbb1-adb67557d216-kube-api-access-ctwl5\") pod \"collect-profiles-29332470-q9tpl\" (UID: \"a4c9a37b-ca76-4dad-bbb1-adb67557d216\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332470-q9tpl" Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.354091 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/a4c9a37b-ca76-4dad-bbb1-adb67557d216-config-volume\") pod \"collect-profiles-29332470-q9tpl\" (UID: \"a4c9a37b-ca76-4dad-bbb1-adb67557d216\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332470-q9tpl" Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.354203 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a4c9a37b-ca76-4dad-bbb1-adb67557d216-secret-volume\") pod \"collect-profiles-29332470-q9tpl\" (UID: \"a4c9a37b-ca76-4dad-bbb1-adb67557d216\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332470-q9tpl" Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.355346 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a4c9a37b-ca76-4dad-bbb1-adb67557d216-config-volume\") pod \"collect-profiles-29332470-q9tpl\" (UID: \"a4c9a37b-ca76-4dad-bbb1-adb67557d216\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332470-q9tpl" Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.364380 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a4c9a37b-ca76-4dad-bbb1-adb67557d216-secret-volume\") pod \"collect-profiles-29332470-q9tpl\" (UID: \"a4c9a37b-ca76-4dad-bbb1-adb67557d216\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332470-q9tpl" Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.380667 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctwl5\" (UniqueName: \"kubernetes.io/projected/a4c9a37b-ca76-4dad-bbb1-adb67557d216-kube-api-access-ctwl5\") pod \"collect-profiles-29332470-q9tpl\" (UID: \"a4c9a37b-ca76-4dad-bbb1-adb67557d216\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332470-q9tpl" Oct 08 18:30:00 crc kubenswrapper[4750]: E1008 18:30:00.385705 4750 
cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59640e5b_309c_4d8e_9a22_d55a95d25326.slice/crio-conmon-2650d5c896e03d67599c7a033e59b3087d01fd5dfc9d7ddcc272b926b4b6aea6.scope\": RecentStats: unable to find data in memory cache]" Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.487941 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332470-q9tpl" Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.712359 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.762780 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59640e5b-309c-4d8e-9a22-d55a95d25326-scripts\") pod \"59640e5b-309c-4d8e-9a22-d55a95d25326\" (UID: \"59640e5b-309c-4d8e-9a22-d55a95d25326\") " Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.762855 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/59640e5b-309c-4d8e-9a22-d55a95d25326-httpd-run\") pod \"59640e5b-309c-4d8e-9a22-d55a95d25326\" (UID: \"59640e5b-309c-4d8e-9a22-d55a95d25326\") " Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.762927 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59640e5b-309c-4d8e-9a22-d55a95d25326-logs\") pod \"59640e5b-309c-4d8e-9a22-d55a95d25326\" (UID: \"59640e5b-309c-4d8e-9a22-d55a95d25326\") " Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.763070 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod 
\"59640e5b-309c-4d8e-9a22-d55a95d25326\" (UID: \"59640e5b-309c-4d8e-9a22-d55a95d25326\") " Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.763149 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59640e5b-309c-4d8e-9a22-d55a95d25326-combined-ca-bundle\") pod \"59640e5b-309c-4d8e-9a22-d55a95d25326\" (UID: \"59640e5b-309c-4d8e-9a22-d55a95d25326\") " Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.763258 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4plq2\" (UniqueName: \"kubernetes.io/projected/59640e5b-309c-4d8e-9a22-d55a95d25326-kube-api-access-4plq2\") pod \"59640e5b-309c-4d8e-9a22-d55a95d25326\" (UID: \"59640e5b-309c-4d8e-9a22-d55a95d25326\") " Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.763386 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59640e5b-309c-4d8e-9a22-d55a95d25326-config-data\") pod \"59640e5b-309c-4d8e-9a22-d55a95d25326\" (UID: \"59640e5b-309c-4d8e-9a22-d55a95d25326\") " Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.763815 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59640e5b-309c-4d8e-9a22-d55a95d25326-logs" (OuterVolumeSpecName: "logs") pod "59640e5b-309c-4d8e-9a22-d55a95d25326" (UID: "59640e5b-309c-4d8e-9a22-d55a95d25326"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.764045 4750 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59640e5b-309c-4d8e-9a22-d55a95d25326-logs\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.765496 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59640e5b-309c-4d8e-9a22-d55a95d25326-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "59640e5b-309c-4d8e-9a22-d55a95d25326" (UID: "59640e5b-309c-4d8e-9a22-d55a95d25326"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.772747 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59640e5b-309c-4d8e-9a22-d55a95d25326-scripts" (OuterVolumeSpecName: "scripts") pod "59640e5b-309c-4d8e-9a22-d55a95d25326" (UID: "59640e5b-309c-4d8e-9a22-d55a95d25326"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.772766 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "59640e5b-309c-4d8e-9a22-d55a95d25326" (UID: "59640e5b-309c-4d8e-9a22-d55a95d25326"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.773972 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59640e5b-309c-4d8e-9a22-d55a95d25326-kube-api-access-4plq2" (OuterVolumeSpecName: "kube-api-access-4plq2") pod "59640e5b-309c-4d8e-9a22-d55a95d25326" (UID: "59640e5b-309c-4d8e-9a22-d55a95d25326"). InnerVolumeSpecName "kube-api-access-4plq2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.807047 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59640e5b-309c-4d8e-9a22-d55a95d25326-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "59640e5b-309c-4d8e-9a22-d55a95d25326" (UID: "59640e5b-309c-4d8e-9a22-d55a95d25326"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.827274 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59640e5b-309c-4d8e-9a22-d55a95d25326-config-data" (OuterVolumeSpecName: "config-data") pod "59640e5b-309c-4d8e-9a22-d55a95d25326" (UID: "59640e5b-309c-4d8e-9a22-d55a95d25326"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.865515 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59640e5b-309c-4d8e-9a22-d55a95d25326-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.865563 4750 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/59640e5b-309c-4d8e-9a22-d55a95d25326-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.865588 4750 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.865601 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59640e5b-309c-4d8e-9a22-d55a95d25326-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:00 crc 
kubenswrapper[4750]: I1008 18:30:00.865616 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4plq2\" (UniqueName: \"kubernetes.io/projected/59640e5b-309c-4d8e-9a22-d55a95d25326-kube-api-access-4plq2\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.865628 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59640e5b-309c-4d8e-9a22-d55a95d25326-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.886928 4750 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.939973 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332470-q9tpl"] Oct 08 18:30:00 crc kubenswrapper[4750]: W1008 18:30:00.945133 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4c9a37b_ca76_4dad_bbb1_adb67557d216.slice/crio-86827525047e417124d926bf9361c4f6ce16a7770dc6715a1c5a93e64122e677 WatchSource:0}: Error finding container 86827525047e417124d926bf9361c4f6ce16a7770dc6715a1c5a93e64122e677: Status 404 returned error can't find the container with id 86827525047e417124d926bf9361c4f6ce16a7770dc6715a1c5a93e64122e677 Oct 08 18:30:00 crc kubenswrapper[4750]: I1008 18:30:00.967593 4750 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.200925 4750 generic.go:334] "Generic (PLEG): container finished" podID="59640e5b-309c-4d8e-9a22-d55a95d25326" containerID="2650d5c896e03d67599c7a033e59b3087d01fd5dfc9d7ddcc272b926b4b6aea6" exitCode=0 Oct 08 18:30:01 
crc kubenswrapper[4750]: I1008 18:30:01.201249 4750 generic.go:334] "Generic (PLEG): container finished" podID="59640e5b-309c-4d8e-9a22-d55a95d25326" containerID="acb8a2ed1f530db794637cfa607fe438e6439183a9214bc6c8fa622548ba0140" exitCode=143 Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.200961 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.200978 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"59640e5b-309c-4d8e-9a22-d55a95d25326","Type":"ContainerDied","Data":"2650d5c896e03d67599c7a033e59b3087d01fd5dfc9d7ddcc272b926b4b6aea6"} Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.201314 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"59640e5b-309c-4d8e-9a22-d55a95d25326","Type":"ContainerDied","Data":"acb8a2ed1f530db794637cfa607fe438e6439183a9214bc6c8fa622548ba0140"} Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.201328 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"59640e5b-309c-4d8e-9a22-d55a95d25326","Type":"ContainerDied","Data":"b3dca44889f986465b261ad5894936ef55e8180b9d9183dd1e7d1a731fac161a"} Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.201350 4750 scope.go:117] "RemoveContainer" containerID="2650d5c896e03d67599c7a033e59b3087d01fd5dfc9d7ddcc272b926b4b6aea6" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.203746 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c6dbed40-0ce9-44de-9bfc-1c83c449f6c1","Type":"ContainerStarted","Data":"8e6287d7d2f3f6c939b56ccdb779c3a600534b4433ac0f5c9ca8961e2a55fdd0"} Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.203877 4750 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/glance-default-internal-api-0" podUID="c6dbed40-0ce9-44de-9bfc-1c83c449f6c1" containerName="glance-log" containerID="cri-o://63566589d72c9635e0836a0ad15cee4e9587d1b1cd0ae81824f64803e8b595f8" gracePeriod=30 Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.203991 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c6dbed40-0ce9-44de-9bfc-1c83c449f6c1" containerName="glance-httpd" containerID="cri-o://8e6287d7d2f3f6c939b56ccdb779c3a600534b4433ac0f5c9ca8961e2a55fdd0" gracePeriod=30 Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.211930 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332470-q9tpl" event={"ID":"a4c9a37b-ca76-4dad-bbb1-adb67557d216","Type":"ContainerStarted","Data":"1d894b37c5a500471f396c1cbbe6a415d7623a544a977aa562061f0f862047d7"} Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.212000 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332470-q9tpl" event={"ID":"a4c9a37b-ca76-4dad-bbb1-adb67557d216","Type":"ContainerStarted","Data":"86827525047e417124d926bf9361c4f6ce16a7770dc6715a1c5a93e64122e677"} Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.233530 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=19.233508598 podStartE2EDuration="19.233508598s" podCreationTimestamp="2025-10-08 18:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:30:01.225224795 +0000 UTC m=+1157.138195818" watchObservedRunningTime="2025-10-08 18:30:01.233508598 +0000 UTC m=+1157.146479611" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.238992 4750 scope.go:117] "RemoveContainer" 
containerID="acb8a2ed1f530db794637cfa607fe438e6439183a9214bc6c8fa622548ba0140" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.259893 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29332470-q9tpl" podStartSLOduration=1.259872724 podStartE2EDuration="1.259872724s" podCreationTimestamp="2025-10-08 18:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:30:01.240988561 +0000 UTC m=+1157.153959584" watchObservedRunningTime="2025-10-08 18:30:01.259872724 +0000 UTC m=+1157.172843737" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.270593 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.272825 4750 scope.go:117] "RemoveContainer" containerID="2650d5c896e03d67599c7a033e59b3087d01fd5dfc9d7ddcc272b926b4b6aea6" Oct 08 18:30:01 crc kubenswrapper[4750]: E1008 18:30:01.274623 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2650d5c896e03d67599c7a033e59b3087d01fd5dfc9d7ddcc272b926b4b6aea6\": container with ID starting with 2650d5c896e03d67599c7a033e59b3087d01fd5dfc9d7ddcc272b926b4b6aea6 not found: ID does not exist" containerID="2650d5c896e03d67599c7a033e59b3087d01fd5dfc9d7ddcc272b926b4b6aea6" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.274654 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2650d5c896e03d67599c7a033e59b3087d01fd5dfc9d7ddcc272b926b4b6aea6"} err="failed to get container status \"2650d5c896e03d67599c7a033e59b3087d01fd5dfc9d7ddcc272b926b4b6aea6\": rpc error: code = NotFound desc = could not find container \"2650d5c896e03d67599c7a033e59b3087d01fd5dfc9d7ddcc272b926b4b6aea6\": container with ID starting with 
2650d5c896e03d67599c7a033e59b3087d01fd5dfc9d7ddcc272b926b4b6aea6 not found: ID does not exist" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.274673 4750 scope.go:117] "RemoveContainer" containerID="acb8a2ed1f530db794637cfa607fe438e6439183a9214bc6c8fa622548ba0140" Oct 08 18:30:01 crc kubenswrapper[4750]: E1008 18:30:01.274893 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acb8a2ed1f530db794637cfa607fe438e6439183a9214bc6c8fa622548ba0140\": container with ID starting with acb8a2ed1f530db794637cfa607fe438e6439183a9214bc6c8fa622548ba0140 not found: ID does not exist" containerID="acb8a2ed1f530db794637cfa607fe438e6439183a9214bc6c8fa622548ba0140" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.274917 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acb8a2ed1f530db794637cfa607fe438e6439183a9214bc6c8fa622548ba0140"} err="failed to get container status \"acb8a2ed1f530db794637cfa607fe438e6439183a9214bc6c8fa622548ba0140\": rpc error: code = NotFound desc = could not find container \"acb8a2ed1f530db794637cfa607fe438e6439183a9214bc6c8fa622548ba0140\": container with ID starting with acb8a2ed1f530db794637cfa607fe438e6439183a9214bc6c8fa622548ba0140 not found: ID does not exist" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.274931 4750 scope.go:117] "RemoveContainer" containerID="2650d5c896e03d67599c7a033e59b3087d01fd5dfc9d7ddcc272b926b4b6aea6" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.275091 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2650d5c896e03d67599c7a033e59b3087d01fd5dfc9d7ddcc272b926b4b6aea6"} err="failed to get container status \"2650d5c896e03d67599c7a033e59b3087d01fd5dfc9d7ddcc272b926b4b6aea6\": rpc error: code = NotFound desc = could not find container \"2650d5c896e03d67599c7a033e59b3087d01fd5dfc9d7ddcc272b926b4b6aea6\": container with ID 
starting with 2650d5c896e03d67599c7a033e59b3087d01fd5dfc9d7ddcc272b926b4b6aea6 not found: ID does not exist" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.275104 4750 scope.go:117] "RemoveContainer" containerID="acb8a2ed1f530db794637cfa607fe438e6439183a9214bc6c8fa622548ba0140" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.275248 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acb8a2ed1f530db794637cfa607fe438e6439183a9214bc6c8fa622548ba0140"} err="failed to get container status \"acb8a2ed1f530db794637cfa607fe438e6439183a9214bc6c8fa622548ba0140\": rpc error: code = NotFound desc = could not find container \"acb8a2ed1f530db794637cfa607fe438e6439183a9214bc6c8fa622548ba0140\": container with ID starting with acb8a2ed1f530db794637cfa607fe438e6439183a9214bc6c8fa622548ba0140 not found: ID does not exist" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.283903 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.295892 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 18:30:01 crc kubenswrapper[4750]: E1008 18:30:01.296292 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59640e5b-309c-4d8e-9a22-d55a95d25326" containerName="glance-log" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.296311 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="59640e5b-309c-4d8e-9a22-d55a95d25326" containerName="glance-log" Oct 08 18:30:01 crc kubenswrapper[4750]: E1008 18:30:01.296359 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59640e5b-309c-4d8e-9a22-d55a95d25326" containerName="glance-httpd" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.296369 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="59640e5b-309c-4d8e-9a22-d55a95d25326" containerName="glance-httpd" Oct 08 
18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.296542 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="59640e5b-309c-4d8e-9a22-d55a95d25326" containerName="glance-log" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.296582 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="59640e5b-309c-4d8e-9a22-d55a95d25326" containerName="glance-httpd" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.297446 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.301045 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.301127 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.307430 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.374242 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4562f82c-cf9e-4b63-bc8a-079a35401232-scripts\") pod \"glance-default-external-api-0\" (UID: \"4562f82c-cf9e-4b63-bc8a-079a35401232\") " pod="openstack/glance-default-external-api-0" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.374348 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"4562f82c-cf9e-4b63-bc8a-079a35401232\") " pod="openstack/glance-default-external-api-0" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.374401 4750 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4562f82c-cf9e-4b63-bc8a-079a35401232-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4562f82c-cf9e-4b63-bc8a-079a35401232\") " pod="openstack/glance-default-external-api-0" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.374452 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4562f82c-cf9e-4b63-bc8a-079a35401232-logs\") pod \"glance-default-external-api-0\" (UID: \"4562f82c-cf9e-4b63-bc8a-079a35401232\") " pod="openstack/glance-default-external-api-0" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.374569 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4562f82c-cf9e-4b63-bc8a-079a35401232-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4562f82c-cf9e-4b63-bc8a-079a35401232\") " pod="openstack/glance-default-external-api-0" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.374609 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4562f82c-cf9e-4b63-bc8a-079a35401232-config-data\") pod \"glance-default-external-api-0\" (UID: \"4562f82c-cf9e-4b63-bc8a-079a35401232\") " pod="openstack/glance-default-external-api-0" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.374729 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcbv6\" (UniqueName: \"kubernetes.io/projected/4562f82c-cf9e-4b63-bc8a-079a35401232-kube-api-access-mcbv6\") pod \"glance-default-external-api-0\" (UID: \"4562f82c-cf9e-4b63-bc8a-079a35401232\") " pod="openstack/glance-default-external-api-0" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.374860 4750 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4562f82c-cf9e-4b63-bc8a-079a35401232-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4562f82c-cf9e-4b63-bc8a-079a35401232\") " pod="openstack/glance-default-external-api-0" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.476776 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4562f82c-cf9e-4b63-bc8a-079a35401232-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4562f82c-cf9e-4b63-bc8a-079a35401232\") " pod="openstack/glance-default-external-api-0" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.477103 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4562f82c-cf9e-4b63-bc8a-079a35401232-config-data\") pod \"glance-default-external-api-0\" (UID: \"4562f82c-cf9e-4b63-bc8a-079a35401232\") " pod="openstack/glance-default-external-api-0" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.477173 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcbv6\" (UniqueName: \"kubernetes.io/projected/4562f82c-cf9e-4b63-bc8a-079a35401232-kube-api-access-mcbv6\") pod \"glance-default-external-api-0\" (UID: \"4562f82c-cf9e-4b63-bc8a-079a35401232\") " pod="openstack/glance-default-external-api-0" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.477206 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4562f82c-cf9e-4b63-bc8a-079a35401232-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4562f82c-cf9e-4b63-bc8a-079a35401232\") " pod="openstack/glance-default-external-api-0" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.477245 4750 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4562f82c-cf9e-4b63-bc8a-079a35401232-scripts\") pod \"glance-default-external-api-0\" (UID: \"4562f82c-cf9e-4b63-bc8a-079a35401232\") " pod="openstack/glance-default-external-api-0" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.477270 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"4562f82c-cf9e-4b63-bc8a-079a35401232\") " pod="openstack/glance-default-external-api-0" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.477295 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4562f82c-cf9e-4b63-bc8a-079a35401232-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4562f82c-cf9e-4b63-bc8a-079a35401232\") " pod="openstack/glance-default-external-api-0" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.477318 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4562f82c-cf9e-4b63-bc8a-079a35401232-logs\") pod \"glance-default-external-api-0\" (UID: \"4562f82c-cf9e-4b63-bc8a-079a35401232\") " pod="openstack/glance-default-external-api-0" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.477324 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4562f82c-cf9e-4b63-bc8a-079a35401232-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4562f82c-cf9e-4b63-bc8a-079a35401232\") " pod="openstack/glance-default-external-api-0" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.477780 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/4562f82c-cf9e-4b63-bc8a-079a35401232-logs\") pod \"glance-default-external-api-0\" (UID: \"4562f82c-cf9e-4b63-bc8a-079a35401232\") " pod="openstack/glance-default-external-api-0" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.478301 4750 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"4562f82c-cf9e-4b63-bc8a-079a35401232\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.482633 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4562f82c-cf9e-4b63-bc8a-079a35401232-scripts\") pod \"glance-default-external-api-0\" (UID: \"4562f82c-cf9e-4b63-bc8a-079a35401232\") " pod="openstack/glance-default-external-api-0" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.482945 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4562f82c-cf9e-4b63-bc8a-079a35401232-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4562f82c-cf9e-4b63-bc8a-079a35401232\") " pod="openstack/glance-default-external-api-0" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.483831 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4562f82c-cf9e-4b63-bc8a-079a35401232-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4562f82c-cf9e-4b63-bc8a-079a35401232\") " pod="openstack/glance-default-external-api-0" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.484858 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4562f82c-cf9e-4b63-bc8a-079a35401232-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"4562f82c-cf9e-4b63-bc8a-079a35401232\") " pod="openstack/glance-default-external-api-0" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.493743 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcbv6\" (UniqueName: \"kubernetes.io/projected/4562f82c-cf9e-4b63-bc8a-079a35401232-kube-api-access-mcbv6\") pod \"glance-default-external-api-0\" (UID: \"4562f82c-cf9e-4b63-bc8a-079a35401232\") " pod="openstack/glance-default-external-api-0" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.512737 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"4562f82c-cf9e-4b63-bc8a-079a35401232\") " pod="openstack/glance-default-external-api-0" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.663532 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.724078 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.782014 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6dbed40-0ce9-44de-9bfc-1c83c449f6c1-config-data\") pod \"c6dbed40-0ce9-44de-9bfc-1c83c449f6c1\" (UID: \"c6dbed40-0ce9-44de-9bfc-1c83c449f6c1\") " Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.782493 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"c6dbed40-0ce9-44de-9bfc-1c83c449f6c1\" (UID: \"c6dbed40-0ce9-44de-9bfc-1c83c449f6c1\") " Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.782531 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6dbed40-0ce9-44de-9bfc-1c83c449f6c1-combined-ca-bundle\") pod \"c6dbed40-0ce9-44de-9bfc-1c83c449f6c1\" (UID: \"c6dbed40-0ce9-44de-9bfc-1c83c449f6c1\") " Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.782623 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqgnw\" (UniqueName: \"kubernetes.io/projected/c6dbed40-0ce9-44de-9bfc-1c83c449f6c1-kube-api-access-pqgnw\") pod \"c6dbed40-0ce9-44de-9bfc-1c83c449f6c1\" (UID: \"c6dbed40-0ce9-44de-9bfc-1c83c449f6c1\") " Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.782665 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6dbed40-0ce9-44de-9bfc-1c83c449f6c1-logs\") pod \"c6dbed40-0ce9-44de-9bfc-1c83c449f6c1\" (UID: \"c6dbed40-0ce9-44de-9bfc-1c83c449f6c1\") " Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.782838 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c6dbed40-0ce9-44de-9bfc-1c83c449f6c1-scripts\") pod \"c6dbed40-0ce9-44de-9bfc-1c83c449f6c1\" (UID: \"c6dbed40-0ce9-44de-9bfc-1c83c449f6c1\") " Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.782901 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c6dbed40-0ce9-44de-9bfc-1c83c449f6c1-httpd-run\") pod \"c6dbed40-0ce9-44de-9bfc-1c83c449f6c1\" (UID: \"c6dbed40-0ce9-44de-9bfc-1c83c449f6c1\") " Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.786451 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6dbed40-0ce9-44de-9bfc-1c83c449f6c1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c6dbed40-0ce9-44de-9bfc-1c83c449f6c1" (UID: "c6dbed40-0ce9-44de-9bfc-1c83c449f6c1"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.786663 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6dbed40-0ce9-44de-9bfc-1c83c449f6c1-logs" (OuterVolumeSpecName: "logs") pod "c6dbed40-0ce9-44de-9bfc-1c83c449f6c1" (UID: "c6dbed40-0ce9-44de-9bfc-1c83c449f6c1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.789330 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6dbed40-0ce9-44de-9bfc-1c83c449f6c1-scripts" (OuterVolumeSpecName: "scripts") pod "c6dbed40-0ce9-44de-9bfc-1c83c449f6c1" (UID: "c6dbed40-0ce9-44de-9bfc-1c83c449f6c1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.796317 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "c6dbed40-0ce9-44de-9bfc-1c83c449f6c1" (UID: "c6dbed40-0ce9-44de-9bfc-1c83c449f6c1"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.814096 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6dbed40-0ce9-44de-9bfc-1c83c449f6c1-kube-api-access-pqgnw" (OuterVolumeSpecName: "kube-api-access-pqgnw") pod "c6dbed40-0ce9-44de-9bfc-1c83c449f6c1" (UID: "c6dbed40-0ce9-44de-9bfc-1c83c449f6c1"). InnerVolumeSpecName "kube-api-access-pqgnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.823703 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6dbed40-0ce9-44de-9bfc-1c83c449f6c1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c6dbed40-0ce9-44de-9bfc-1c83c449f6c1" (UID: "c6dbed40-0ce9-44de-9bfc-1c83c449f6c1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.864040 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6dbed40-0ce9-44de-9bfc-1c83c449f6c1-config-data" (OuterVolumeSpecName: "config-data") pod "c6dbed40-0ce9-44de-9bfc-1c83c449f6c1" (UID: "c6dbed40-0ce9-44de-9bfc-1c83c449f6c1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.885239 4750 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c6dbed40-0ce9-44de-9bfc-1c83c449f6c1-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.885276 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6dbed40-0ce9-44de-9bfc-1c83c449f6c1-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.885314 4750 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.885325 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6dbed40-0ce9-44de-9bfc-1c83c449f6c1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.885341 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqgnw\" (UniqueName: \"kubernetes.io/projected/c6dbed40-0ce9-44de-9bfc-1c83c449f6c1-kube-api-access-pqgnw\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.885352 4750 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6dbed40-0ce9-44de-9bfc-1c83c449f6c1-logs\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.885361 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6dbed40-0ce9-44de-9bfc-1c83c449f6c1-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.912703 4750 operation_generator.go:917] UnmountDevice succeeded 
for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 08 18:30:01 crc kubenswrapper[4750]: I1008 18:30:01.986953 4750 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.207225 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.224514 4750 generic.go:334] "Generic (PLEG): container finished" podID="bdfe0b7d-f014-4453-b94c-3842ebdd4052" containerID="69359e54f0c54f8a5c8865533bf2f4afa5a118f2e82303c1851cef48e5f58c91" exitCode=0 Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.224595 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-rlsgw" event={"ID":"bdfe0b7d-f014-4453-b94c-3842ebdd4052","Type":"ContainerDied","Data":"69359e54f0c54f8a5c8865533bf2f4afa5a118f2e82303c1851cef48e5f58c91"} Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.229987 4750 generic.go:334] "Generic (PLEG): container finished" podID="c6dbed40-0ce9-44de-9bfc-1c83c449f6c1" containerID="8e6287d7d2f3f6c939b56ccdb779c3a600534b4433ac0f5c9ca8961e2a55fdd0" exitCode=0 Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.230012 4750 generic.go:334] "Generic (PLEG): container finished" podID="c6dbed40-0ce9-44de-9bfc-1c83c449f6c1" containerID="63566589d72c9635e0836a0ad15cee4e9587d1b1cd0ae81824f64803e8b595f8" exitCode=143 Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.230195 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.230268 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c6dbed40-0ce9-44de-9bfc-1c83c449f6c1","Type":"ContainerDied","Data":"8e6287d7d2f3f6c939b56ccdb779c3a600534b4433ac0f5c9ca8961e2a55fdd0"} Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.230300 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c6dbed40-0ce9-44de-9bfc-1c83c449f6c1","Type":"ContainerDied","Data":"63566589d72c9635e0836a0ad15cee4e9587d1b1cd0ae81824f64803e8b595f8"} Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.230313 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c6dbed40-0ce9-44de-9bfc-1c83c449f6c1","Type":"ContainerDied","Data":"7f4b23cc1cf7bb3b5c5afac8e848b395c4ab68838d2b3ce6fb6767a0dc5f15fa"} Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.230327 4750 scope.go:117] "RemoveContainer" containerID="8e6287d7d2f3f6c939b56ccdb779c3a600534b4433ac0f5c9ca8961e2a55fdd0" Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.238222 4750 generic.go:334] "Generic (PLEG): container finished" podID="a4c9a37b-ca76-4dad-bbb1-adb67557d216" containerID="1d894b37c5a500471f396c1cbbe6a415d7623a544a977aa562061f0f862047d7" exitCode=0 Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.238270 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332470-q9tpl" event={"ID":"a4c9a37b-ca76-4dad-bbb1-adb67557d216","Type":"ContainerDied","Data":"1d894b37c5a500471f396c1cbbe6a415d7623a544a977aa562061f0f862047d7"} Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.304369 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 
18:30:02.312226 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.322047 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 18:30:02 crc kubenswrapper[4750]: E1008 18:30:02.322427 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6dbed40-0ce9-44de-9bfc-1c83c449f6c1" containerName="glance-httpd" Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.322444 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6dbed40-0ce9-44de-9bfc-1c83c449f6c1" containerName="glance-httpd" Oct 08 18:30:02 crc kubenswrapper[4750]: E1008 18:30:02.322462 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6dbed40-0ce9-44de-9bfc-1c83c449f6c1" containerName="glance-log" Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.322468 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6dbed40-0ce9-44de-9bfc-1c83c449f6c1" containerName="glance-log" Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.322657 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6dbed40-0ce9-44de-9bfc-1c83c449f6c1" containerName="glance-log" Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.322672 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6dbed40-0ce9-44de-9bfc-1c83c449f6c1" containerName="glance-httpd" Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.323573 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.329991 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.342195 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.342374 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.399776 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/014b5ab1-7e11-42b6-b4ef-c3dba510298f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"014b5ab1-7e11-42b6-b4ef-c3dba510298f\") " pod="openstack/glance-default-internal-api-0" Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.399818 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/014b5ab1-7e11-42b6-b4ef-c3dba510298f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"014b5ab1-7e11-42b6-b4ef-c3dba510298f\") " pod="openstack/glance-default-internal-api-0" Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.399855 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksg6k\" (UniqueName: \"kubernetes.io/projected/014b5ab1-7e11-42b6-b4ef-c3dba510298f-kube-api-access-ksg6k\") pod \"glance-default-internal-api-0\" (UID: \"014b5ab1-7e11-42b6-b4ef-c3dba510298f\") " pod="openstack/glance-default-internal-api-0" Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.399879 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"014b5ab1-7e11-42b6-b4ef-c3dba510298f\") " pod="openstack/glance-default-internal-api-0" Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.399902 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/014b5ab1-7e11-42b6-b4ef-c3dba510298f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"014b5ab1-7e11-42b6-b4ef-c3dba510298f\") " pod="openstack/glance-default-internal-api-0" Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.399917 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/014b5ab1-7e11-42b6-b4ef-c3dba510298f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"014b5ab1-7e11-42b6-b4ef-c3dba510298f\") " pod="openstack/glance-default-internal-api-0" Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.399969 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/014b5ab1-7e11-42b6-b4ef-c3dba510298f-logs\") pod \"glance-default-internal-api-0\" (UID: \"014b5ab1-7e11-42b6-b4ef-c3dba510298f\") " pod="openstack/glance-default-internal-api-0" Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.399999 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/014b5ab1-7e11-42b6-b4ef-c3dba510298f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"014b5ab1-7e11-42b6-b4ef-c3dba510298f\") " pod="openstack/glance-default-internal-api-0" Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.501297 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"014b5ab1-7e11-42b6-b4ef-c3dba510298f\") " pod="openstack/glance-default-internal-api-0" Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.501348 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/014b5ab1-7e11-42b6-b4ef-c3dba510298f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"014b5ab1-7e11-42b6-b4ef-c3dba510298f\") " pod="openstack/glance-default-internal-api-0" Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.501379 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/014b5ab1-7e11-42b6-b4ef-c3dba510298f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"014b5ab1-7e11-42b6-b4ef-c3dba510298f\") " pod="openstack/glance-default-internal-api-0" Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.501455 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/014b5ab1-7e11-42b6-b4ef-c3dba510298f-logs\") pod \"glance-default-internal-api-0\" (UID: \"014b5ab1-7e11-42b6-b4ef-c3dba510298f\") " pod="openstack/glance-default-internal-api-0" Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.501489 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/014b5ab1-7e11-42b6-b4ef-c3dba510298f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"014b5ab1-7e11-42b6-b4ef-c3dba510298f\") " pod="openstack/glance-default-internal-api-0" Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.501536 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/014b5ab1-7e11-42b6-b4ef-c3dba510298f-scripts\") pod 
\"glance-default-internal-api-0\" (UID: \"014b5ab1-7e11-42b6-b4ef-c3dba510298f\") " pod="openstack/glance-default-internal-api-0" Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.501557 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/014b5ab1-7e11-42b6-b4ef-c3dba510298f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"014b5ab1-7e11-42b6-b4ef-c3dba510298f\") " pod="openstack/glance-default-internal-api-0" Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.501604 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksg6k\" (UniqueName: \"kubernetes.io/projected/014b5ab1-7e11-42b6-b4ef-c3dba510298f-kube-api-access-ksg6k\") pod \"glance-default-internal-api-0\" (UID: \"014b5ab1-7e11-42b6-b4ef-c3dba510298f\") " pod="openstack/glance-default-internal-api-0" Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.501676 4750 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"014b5ab1-7e11-42b6-b4ef-c3dba510298f\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.502181 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/014b5ab1-7e11-42b6-b4ef-c3dba510298f-logs\") pod \"glance-default-internal-api-0\" (UID: \"014b5ab1-7e11-42b6-b4ef-c3dba510298f\") " pod="openstack/glance-default-internal-api-0" Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.502373 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/014b5ab1-7e11-42b6-b4ef-c3dba510298f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"014b5ab1-7e11-42b6-b4ef-c3dba510298f\") " pod="openstack/glance-default-internal-api-0" Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.508265 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/014b5ab1-7e11-42b6-b4ef-c3dba510298f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"014b5ab1-7e11-42b6-b4ef-c3dba510298f\") " pod="openstack/glance-default-internal-api-0" Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.508302 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/014b5ab1-7e11-42b6-b4ef-c3dba510298f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"014b5ab1-7e11-42b6-b4ef-c3dba510298f\") " pod="openstack/glance-default-internal-api-0" Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.510604 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/014b5ab1-7e11-42b6-b4ef-c3dba510298f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"014b5ab1-7e11-42b6-b4ef-c3dba510298f\") " pod="openstack/glance-default-internal-api-0" Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.510754 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/014b5ab1-7e11-42b6-b4ef-c3dba510298f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"014b5ab1-7e11-42b6-b4ef-c3dba510298f\") " pod="openstack/glance-default-internal-api-0" Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.521921 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksg6k\" (UniqueName: \"kubernetes.io/projected/014b5ab1-7e11-42b6-b4ef-c3dba510298f-kube-api-access-ksg6k\") pod \"glance-default-internal-api-0\" (UID: \"014b5ab1-7e11-42b6-b4ef-c3dba510298f\") " 
pod="openstack/glance-default-internal-api-0" Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.542717 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"014b5ab1-7e11-42b6-b4ef-c3dba510298f\") " pod="openstack/glance-default-internal-api-0" Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.655737 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.746373 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59640e5b-309c-4d8e-9a22-d55a95d25326" path="/var/lib/kubelet/pods/59640e5b-309c-4d8e-9a22-d55a95d25326/volumes" Oct 08 18:30:02 crc kubenswrapper[4750]: I1008 18:30:02.747321 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6dbed40-0ce9-44de-9bfc-1c83c449f6c1" path="/var/lib/kubelet/pods/c6dbed40-0ce9-44de-9bfc-1c83c449f6c1/volumes" Oct 08 18:30:03 crc kubenswrapper[4750]: I1008 18:30:03.254615 4750 generic.go:334] "Generic (PLEG): container finished" podID="aae9b192-27dc-4bad-a9ae-6e03824c59f0" containerID="0a43919de404773932dbdd0d928ccfb9e008e4049e2135f7c1de0c5a8fb787e1" exitCode=0 Oct 08 18:30:03 crc kubenswrapper[4750]: I1008 18:30:03.254924 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ndlps" event={"ID":"aae9b192-27dc-4bad-a9ae-6e03824c59f0","Type":"ContainerDied","Data":"0a43919de404773932dbdd0d928ccfb9e008e4049e2135f7c1de0c5a8fb787e1"} Oct 08 18:30:04 crc kubenswrapper[4750]: W1008 18:30:04.804423 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4562f82c_cf9e_4b63_bc8a_079a35401232.slice/crio-ffa5cfecc1d3ff393e180d97588f990caaf57a734c75baf7e08d86acd23002d7 WatchSource:0}: Error finding 
container ffa5cfecc1d3ff393e180d97588f990caaf57a734c75baf7e08d86acd23002d7: Status 404 returned error can't find the container with id ffa5cfecc1d3ff393e180d97588f990caaf57a734c75baf7e08d86acd23002d7 Oct 08 18:30:04 crc kubenswrapper[4750]: I1008 18:30:04.890143 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332470-q9tpl" Oct 08 18:30:04 crc kubenswrapper[4750]: I1008 18:30:04.900008 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-rlsgw" Oct 08 18:30:05 crc kubenswrapper[4750]: I1008 18:30:05.049936 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdfe0b7d-f014-4453-b94c-3842ebdd4052-scripts\") pod \"bdfe0b7d-f014-4453-b94c-3842ebdd4052\" (UID: \"bdfe0b7d-f014-4453-b94c-3842ebdd4052\") " Oct 08 18:30:05 crc kubenswrapper[4750]: I1008 18:30:05.050003 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a4c9a37b-ca76-4dad-bbb1-adb67557d216-config-volume\") pod \"a4c9a37b-ca76-4dad-bbb1-adb67557d216\" (UID: \"a4c9a37b-ca76-4dad-bbb1-adb67557d216\") " Oct 08 18:30:05 crc kubenswrapper[4750]: I1008 18:30:05.050076 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdfe0b7d-f014-4453-b94c-3842ebdd4052-config-data\") pod \"bdfe0b7d-f014-4453-b94c-3842ebdd4052\" (UID: \"bdfe0b7d-f014-4453-b94c-3842ebdd4052\") " Oct 08 18:30:05 crc kubenswrapper[4750]: I1008 18:30:05.050125 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdfe0b7d-f014-4453-b94c-3842ebdd4052-logs\") pod \"bdfe0b7d-f014-4453-b94c-3842ebdd4052\" (UID: \"bdfe0b7d-f014-4453-b94c-3842ebdd4052\") " Oct 08 18:30:05 crc 
kubenswrapper[4750]: I1008 18:30:05.050226 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a4c9a37b-ca76-4dad-bbb1-adb67557d216-secret-volume\") pod \"a4c9a37b-ca76-4dad-bbb1-adb67557d216\" (UID: \"a4c9a37b-ca76-4dad-bbb1-adb67557d216\") " Oct 08 18:30:05 crc kubenswrapper[4750]: I1008 18:30:05.050256 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdfe0b7d-f014-4453-b94c-3842ebdd4052-combined-ca-bundle\") pod \"bdfe0b7d-f014-4453-b94c-3842ebdd4052\" (UID: \"bdfe0b7d-f014-4453-b94c-3842ebdd4052\") " Oct 08 18:30:05 crc kubenswrapper[4750]: I1008 18:30:05.050307 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cnv7\" (UniqueName: \"kubernetes.io/projected/bdfe0b7d-f014-4453-b94c-3842ebdd4052-kube-api-access-4cnv7\") pod \"bdfe0b7d-f014-4453-b94c-3842ebdd4052\" (UID: \"bdfe0b7d-f014-4453-b94c-3842ebdd4052\") " Oct 08 18:30:05 crc kubenswrapper[4750]: I1008 18:30:05.050357 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctwl5\" (UniqueName: \"kubernetes.io/projected/a4c9a37b-ca76-4dad-bbb1-adb67557d216-kube-api-access-ctwl5\") pod \"a4c9a37b-ca76-4dad-bbb1-adb67557d216\" (UID: \"a4c9a37b-ca76-4dad-bbb1-adb67557d216\") " Oct 08 18:30:05 crc kubenswrapper[4750]: I1008 18:30:05.051883 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdfe0b7d-f014-4453-b94c-3842ebdd4052-logs" (OuterVolumeSpecName: "logs") pod "bdfe0b7d-f014-4453-b94c-3842ebdd4052" (UID: "bdfe0b7d-f014-4453-b94c-3842ebdd4052"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:30:05 crc kubenswrapper[4750]: I1008 18:30:05.052325 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4c9a37b-ca76-4dad-bbb1-adb67557d216-config-volume" (OuterVolumeSpecName: "config-volume") pod "a4c9a37b-ca76-4dad-bbb1-adb67557d216" (UID: "a4c9a37b-ca76-4dad-bbb1-adb67557d216"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:30:05 crc kubenswrapper[4750]: I1008 18:30:05.056871 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdfe0b7d-f014-4453-b94c-3842ebdd4052-kube-api-access-4cnv7" (OuterVolumeSpecName: "kube-api-access-4cnv7") pod "bdfe0b7d-f014-4453-b94c-3842ebdd4052" (UID: "bdfe0b7d-f014-4453-b94c-3842ebdd4052"). InnerVolumeSpecName "kube-api-access-4cnv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:30:05 crc kubenswrapper[4750]: I1008 18:30:05.058015 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4c9a37b-ca76-4dad-bbb1-adb67557d216-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a4c9a37b-ca76-4dad-bbb1-adb67557d216" (UID: "a4c9a37b-ca76-4dad-bbb1-adb67557d216"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:05 crc kubenswrapper[4750]: I1008 18:30:05.058076 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4c9a37b-ca76-4dad-bbb1-adb67557d216-kube-api-access-ctwl5" (OuterVolumeSpecName: "kube-api-access-ctwl5") pod "a4c9a37b-ca76-4dad-bbb1-adb67557d216" (UID: "a4c9a37b-ca76-4dad-bbb1-adb67557d216"). InnerVolumeSpecName "kube-api-access-ctwl5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:30:05 crc kubenswrapper[4750]: I1008 18:30:05.059108 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdfe0b7d-f014-4453-b94c-3842ebdd4052-scripts" (OuterVolumeSpecName: "scripts") pod "bdfe0b7d-f014-4453-b94c-3842ebdd4052" (UID: "bdfe0b7d-f014-4453-b94c-3842ebdd4052"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:05 crc kubenswrapper[4750]: I1008 18:30:05.082653 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdfe0b7d-f014-4453-b94c-3842ebdd4052-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bdfe0b7d-f014-4453-b94c-3842ebdd4052" (UID: "bdfe0b7d-f014-4453-b94c-3842ebdd4052"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:05 crc kubenswrapper[4750]: I1008 18:30:05.084803 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdfe0b7d-f014-4453-b94c-3842ebdd4052-config-data" (OuterVolumeSpecName: "config-data") pod "bdfe0b7d-f014-4453-b94c-3842ebdd4052" (UID: "bdfe0b7d-f014-4453-b94c-3842ebdd4052"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:05 crc kubenswrapper[4750]: I1008 18:30:05.152708 4750 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a4c9a37b-ca76-4dad-bbb1-adb67557d216-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:05 crc kubenswrapper[4750]: I1008 18:30:05.152743 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdfe0b7d-f014-4453-b94c-3842ebdd4052-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:05 crc kubenswrapper[4750]: I1008 18:30:05.152754 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cnv7\" (UniqueName: \"kubernetes.io/projected/bdfe0b7d-f014-4453-b94c-3842ebdd4052-kube-api-access-4cnv7\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:05 crc kubenswrapper[4750]: I1008 18:30:05.152763 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctwl5\" (UniqueName: \"kubernetes.io/projected/a4c9a37b-ca76-4dad-bbb1-adb67557d216-kube-api-access-ctwl5\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:05 crc kubenswrapper[4750]: I1008 18:30:05.152771 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdfe0b7d-f014-4453-b94c-3842ebdd4052-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:05 crc kubenswrapper[4750]: I1008 18:30:05.152779 4750 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a4c9a37b-ca76-4dad-bbb1-adb67557d216-config-volume\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:05 crc kubenswrapper[4750]: I1008 18:30:05.152788 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdfe0b7d-f014-4453-b94c-3842ebdd4052-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:05 crc kubenswrapper[4750]: I1008 18:30:05.152797 4750 
reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdfe0b7d-f014-4453-b94c-3842ebdd4052-logs\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:05 crc kubenswrapper[4750]: I1008 18:30:05.276629 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332470-q9tpl" event={"ID":"a4c9a37b-ca76-4dad-bbb1-adb67557d216","Type":"ContainerDied","Data":"86827525047e417124d926bf9361c4f6ce16a7770dc6715a1c5a93e64122e677"} Oct 08 18:30:05 crc kubenswrapper[4750]: I1008 18:30:05.276635 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332470-q9tpl" Oct 08 18:30:05 crc kubenswrapper[4750]: I1008 18:30:05.276761 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86827525047e417124d926bf9361c4f6ce16a7770dc6715a1c5a93e64122e677" Oct 08 18:30:05 crc kubenswrapper[4750]: I1008 18:30:05.292135 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4562f82c-cf9e-4b63-bc8a-079a35401232","Type":"ContainerStarted","Data":"ffa5cfecc1d3ff393e180d97588f990caaf57a734c75baf7e08d86acd23002d7"} Oct 08 18:30:05 crc kubenswrapper[4750]: I1008 18:30:05.298035 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-rlsgw" event={"ID":"bdfe0b7d-f014-4453-b94c-3842ebdd4052","Type":"ContainerDied","Data":"ca9acb8a2135375602fc4f60a280f3534a9e91d88f140b9f0f01249cf88e0178"} Oct 08 18:30:05 crc kubenswrapper[4750]: I1008 18:30:05.298079 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca9acb8a2135375602fc4f60a280f3534a9e91d88f140b9f0f01249cf88e0178" Oct 08 18:30:05 crc kubenswrapper[4750]: I1008 18:30:05.298082 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-rlsgw" Oct 08 18:30:06 crc kubenswrapper[4750]: I1008 18:30:06.020021 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-788b97745d-6snpn"] Oct 08 18:30:06 crc kubenswrapper[4750]: E1008 18:30:06.020721 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdfe0b7d-f014-4453-b94c-3842ebdd4052" containerName="placement-db-sync" Oct 08 18:30:06 crc kubenswrapper[4750]: I1008 18:30:06.020738 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdfe0b7d-f014-4453-b94c-3842ebdd4052" containerName="placement-db-sync" Oct 08 18:30:06 crc kubenswrapper[4750]: E1008 18:30:06.020751 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4c9a37b-ca76-4dad-bbb1-adb67557d216" containerName="collect-profiles" Oct 08 18:30:06 crc kubenswrapper[4750]: I1008 18:30:06.020759 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4c9a37b-ca76-4dad-bbb1-adb67557d216" containerName="collect-profiles" Oct 08 18:30:06 crc kubenswrapper[4750]: I1008 18:30:06.021009 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdfe0b7d-f014-4453-b94c-3842ebdd4052" containerName="placement-db-sync" Oct 08 18:30:06 crc kubenswrapper[4750]: I1008 18:30:06.021029 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4c9a37b-ca76-4dad-bbb1-adb67557d216" containerName="collect-profiles" Oct 08 18:30:06 crc kubenswrapper[4750]: I1008 18:30:06.022150 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-788b97745d-6snpn" Oct 08 18:30:06 crc kubenswrapper[4750]: I1008 18:30:06.025662 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 08 18:30:06 crc kubenswrapper[4750]: I1008 18:30:06.025864 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-8swdj" Oct 08 18:30:06 crc kubenswrapper[4750]: I1008 18:30:06.025999 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 08 18:30:06 crc kubenswrapper[4750]: I1008 18:30:06.026111 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 08 18:30:06 crc kubenswrapper[4750]: I1008 18:30:06.031807 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-788b97745d-6snpn"] Oct 08 18:30:06 crc kubenswrapper[4750]: I1008 18:30:06.032491 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 08 18:30:06 crc kubenswrapper[4750]: I1008 18:30:06.172001 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec1950dc-6caf-45f7-9b18-8c12db1b3f25-logs\") pod \"placement-788b97745d-6snpn\" (UID: \"ec1950dc-6caf-45f7-9b18-8c12db1b3f25\") " pod="openstack/placement-788b97745d-6snpn" Oct 08 18:30:06 crc kubenswrapper[4750]: I1008 18:30:06.172060 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec1950dc-6caf-45f7-9b18-8c12db1b3f25-internal-tls-certs\") pod \"placement-788b97745d-6snpn\" (UID: \"ec1950dc-6caf-45f7-9b18-8c12db1b3f25\") " pod="openstack/placement-788b97745d-6snpn" Oct 08 18:30:06 crc kubenswrapper[4750]: I1008 18:30:06.172081 4750 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec1950dc-6caf-45f7-9b18-8c12db1b3f25-public-tls-certs\") pod \"placement-788b97745d-6snpn\" (UID: \"ec1950dc-6caf-45f7-9b18-8c12db1b3f25\") " pod="openstack/placement-788b97745d-6snpn" Oct 08 18:30:06 crc kubenswrapper[4750]: I1008 18:30:06.172097 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec1950dc-6caf-45f7-9b18-8c12db1b3f25-scripts\") pod \"placement-788b97745d-6snpn\" (UID: \"ec1950dc-6caf-45f7-9b18-8c12db1b3f25\") " pod="openstack/placement-788b97745d-6snpn" Oct 08 18:30:06 crc kubenswrapper[4750]: I1008 18:30:06.172499 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5r2g\" (UniqueName: \"kubernetes.io/projected/ec1950dc-6caf-45f7-9b18-8c12db1b3f25-kube-api-access-c5r2g\") pod \"placement-788b97745d-6snpn\" (UID: \"ec1950dc-6caf-45f7-9b18-8c12db1b3f25\") " pod="openstack/placement-788b97745d-6snpn" Oct 08 18:30:06 crc kubenswrapper[4750]: I1008 18:30:06.172602 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec1950dc-6caf-45f7-9b18-8c12db1b3f25-config-data\") pod \"placement-788b97745d-6snpn\" (UID: \"ec1950dc-6caf-45f7-9b18-8c12db1b3f25\") " pod="openstack/placement-788b97745d-6snpn" Oct 08 18:30:06 crc kubenswrapper[4750]: I1008 18:30:06.172710 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec1950dc-6caf-45f7-9b18-8c12db1b3f25-combined-ca-bundle\") pod \"placement-788b97745d-6snpn\" (UID: \"ec1950dc-6caf-45f7-9b18-8c12db1b3f25\") " pod="openstack/placement-788b97745d-6snpn" Oct 08 18:30:06 crc kubenswrapper[4750]: I1008 18:30:06.274634 4750 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5r2g\" (UniqueName: \"kubernetes.io/projected/ec1950dc-6caf-45f7-9b18-8c12db1b3f25-kube-api-access-c5r2g\") pod \"placement-788b97745d-6snpn\" (UID: \"ec1950dc-6caf-45f7-9b18-8c12db1b3f25\") " pod="openstack/placement-788b97745d-6snpn" Oct 08 18:30:06 crc kubenswrapper[4750]: I1008 18:30:06.274680 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec1950dc-6caf-45f7-9b18-8c12db1b3f25-config-data\") pod \"placement-788b97745d-6snpn\" (UID: \"ec1950dc-6caf-45f7-9b18-8c12db1b3f25\") " pod="openstack/placement-788b97745d-6snpn" Oct 08 18:30:06 crc kubenswrapper[4750]: I1008 18:30:06.274719 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec1950dc-6caf-45f7-9b18-8c12db1b3f25-combined-ca-bundle\") pod \"placement-788b97745d-6snpn\" (UID: \"ec1950dc-6caf-45f7-9b18-8c12db1b3f25\") " pod="openstack/placement-788b97745d-6snpn" Oct 08 18:30:06 crc kubenswrapper[4750]: I1008 18:30:06.274771 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec1950dc-6caf-45f7-9b18-8c12db1b3f25-logs\") pod \"placement-788b97745d-6snpn\" (UID: \"ec1950dc-6caf-45f7-9b18-8c12db1b3f25\") " pod="openstack/placement-788b97745d-6snpn" Oct 08 18:30:06 crc kubenswrapper[4750]: I1008 18:30:06.274805 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec1950dc-6caf-45f7-9b18-8c12db1b3f25-internal-tls-certs\") pod \"placement-788b97745d-6snpn\" (UID: \"ec1950dc-6caf-45f7-9b18-8c12db1b3f25\") " pod="openstack/placement-788b97745d-6snpn" Oct 08 18:30:06 crc kubenswrapper[4750]: I1008 18:30:06.275238 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/ec1950dc-6caf-45f7-9b18-8c12db1b3f25-logs\") pod \"placement-788b97745d-6snpn\" (UID: \"ec1950dc-6caf-45f7-9b18-8c12db1b3f25\") " pod="openstack/placement-788b97745d-6snpn" Oct 08 18:30:06 crc kubenswrapper[4750]: I1008 18:30:06.274825 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec1950dc-6caf-45f7-9b18-8c12db1b3f25-public-tls-certs\") pod \"placement-788b97745d-6snpn\" (UID: \"ec1950dc-6caf-45f7-9b18-8c12db1b3f25\") " pod="openstack/placement-788b97745d-6snpn" Oct 08 18:30:06 crc kubenswrapper[4750]: I1008 18:30:06.275759 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec1950dc-6caf-45f7-9b18-8c12db1b3f25-scripts\") pod \"placement-788b97745d-6snpn\" (UID: \"ec1950dc-6caf-45f7-9b18-8c12db1b3f25\") " pod="openstack/placement-788b97745d-6snpn" Oct 08 18:30:06 crc kubenswrapper[4750]: I1008 18:30:06.279108 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec1950dc-6caf-45f7-9b18-8c12db1b3f25-combined-ca-bundle\") pod \"placement-788b97745d-6snpn\" (UID: \"ec1950dc-6caf-45f7-9b18-8c12db1b3f25\") " pod="openstack/placement-788b97745d-6snpn" Oct 08 18:30:06 crc kubenswrapper[4750]: I1008 18:30:06.279388 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec1950dc-6caf-45f7-9b18-8c12db1b3f25-internal-tls-certs\") pod \"placement-788b97745d-6snpn\" (UID: \"ec1950dc-6caf-45f7-9b18-8c12db1b3f25\") " pod="openstack/placement-788b97745d-6snpn" Oct 08 18:30:06 crc kubenswrapper[4750]: I1008 18:30:06.281504 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec1950dc-6caf-45f7-9b18-8c12db1b3f25-config-data\") pod 
\"placement-788b97745d-6snpn\" (UID: \"ec1950dc-6caf-45f7-9b18-8c12db1b3f25\") " pod="openstack/placement-788b97745d-6snpn" Oct 08 18:30:06 crc kubenswrapper[4750]: I1008 18:30:06.286995 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec1950dc-6caf-45f7-9b18-8c12db1b3f25-scripts\") pod \"placement-788b97745d-6snpn\" (UID: \"ec1950dc-6caf-45f7-9b18-8c12db1b3f25\") " pod="openstack/placement-788b97745d-6snpn" Oct 08 18:30:06 crc kubenswrapper[4750]: I1008 18:30:06.290322 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec1950dc-6caf-45f7-9b18-8c12db1b3f25-public-tls-certs\") pod \"placement-788b97745d-6snpn\" (UID: \"ec1950dc-6caf-45f7-9b18-8c12db1b3f25\") " pod="openstack/placement-788b97745d-6snpn" Oct 08 18:30:06 crc kubenswrapper[4750]: I1008 18:30:06.294195 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5r2g\" (UniqueName: \"kubernetes.io/projected/ec1950dc-6caf-45f7-9b18-8c12db1b3f25-kube-api-access-c5r2g\") pod \"placement-788b97745d-6snpn\" (UID: \"ec1950dc-6caf-45f7-9b18-8c12db1b3f25\") " pod="openstack/placement-788b97745d-6snpn" Oct 08 18:30:06 crc kubenswrapper[4750]: I1008 18:30:06.362974 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-788b97745d-6snpn" Oct 08 18:30:06 crc kubenswrapper[4750]: I1008 18:30:06.974732 4750 scope.go:117] "RemoveContainer" containerID="63566589d72c9635e0836a0ad15cee4e9587d1b1cd0ae81824f64803e8b595f8" Oct 08 18:30:07 crc kubenswrapper[4750]: I1008 18:30:07.069267 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-ndlps" Oct 08 18:30:07 crc kubenswrapper[4750]: I1008 18:30:07.188427 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aae9b192-27dc-4bad-a9ae-6e03824c59f0-db-sync-config-data\") pod \"aae9b192-27dc-4bad-a9ae-6e03824c59f0\" (UID: \"aae9b192-27dc-4bad-a9ae-6e03824c59f0\") " Oct 08 18:30:07 crc kubenswrapper[4750]: I1008 18:30:07.188500 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xfcg\" (UniqueName: \"kubernetes.io/projected/aae9b192-27dc-4bad-a9ae-6e03824c59f0-kube-api-access-5xfcg\") pod \"aae9b192-27dc-4bad-a9ae-6e03824c59f0\" (UID: \"aae9b192-27dc-4bad-a9ae-6e03824c59f0\") " Oct 08 18:30:07 crc kubenswrapper[4750]: I1008 18:30:07.188681 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aae9b192-27dc-4bad-a9ae-6e03824c59f0-combined-ca-bundle\") pod \"aae9b192-27dc-4bad-a9ae-6e03824c59f0\" (UID: \"aae9b192-27dc-4bad-a9ae-6e03824c59f0\") " Oct 08 18:30:07 crc kubenswrapper[4750]: I1008 18:30:07.194883 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aae9b192-27dc-4bad-a9ae-6e03824c59f0-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "aae9b192-27dc-4bad-a9ae-6e03824c59f0" (UID: "aae9b192-27dc-4bad-a9ae-6e03824c59f0"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:07 crc kubenswrapper[4750]: I1008 18:30:07.196625 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aae9b192-27dc-4bad-a9ae-6e03824c59f0-kube-api-access-5xfcg" (OuterVolumeSpecName: "kube-api-access-5xfcg") pod "aae9b192-27dc-4bad-a9ae-6e03824c59f0" (UID: "aae9b192-27dc-4bad-a9ae-6e03824c59f0"). 
InnerVolumeSpecName "kube-api-access-5xfcg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:30:07 crc kubenswrapper[4750]: I1008 18:30:07.216847 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aae9b192-27dc-4bad-a9ae-6e03824c59f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aae9b192-27dc-4bad-a9ae-6e03824c59f0" (UID: "aae9b192-27dc-4bad-a9ae-6e03824c59f0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:07 crc kubenswrapper[4750]: I1008 18:30:07.291387 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aae9b192-27dc-4bad-a9ae-6e03824c59f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:07 crc kubenswrapper[4750]: I1008 18:30:07.291431 4750 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aae9b192-27dc-4bad-a9ae-6e03824c59f0-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:07 crc kubenswrapper[4750]: I1008 18:30:07.291445 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xfcg\" (UniqueName: \"kubernetes.io/projected/aae9b192-27dc-4bad-a9ae-6e03824c59f0-kube-api-access-5xfcg\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:07 crc kubenswrapper[4750]: I1008 18:30:07.329633 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ndlps" event={"ID":"aae9b192-27dc-4bad-a9ae-6e03824c59f0","Type":"ContainerDied","Data":"5798dfb85a096f205a57a70bb1a79ca1dad98561cc11adcb3425f5efb115bbed"} Oct 08 18:30:07 crc kubenswrapper[4750]: I1008 18:30:07.329668 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5798dfb85a096f205a57a70bb1a79ca1dad98561cc11adcb3425f5efb115bbed" Oct 08 18:30:07 crc kubenswrapper[4750]: I1008 18:30:07.330743 4750 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-ndlps" Oct 08 18:30:07 crc kubenswrapper[4750]: I1008 18:30:07.505187 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 18:30:07 crc kubenswrapper[4750]: W1008 18:30:07.547762 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod014b5ab1_7e11_42b6_b4ef_c3dba510298f.slice/crio-5ae3cd11d3ed6737715a8fd72e49a3fbcc66817aa8d3bfbb2acdc67dfea4dfd2 WatchSource:0}: Error finding container 5ae3cd11d3ed6737715a8fd72e49a3fbcc66817aa8d3bfbb2acdc67dfea4dfd2: Status 404 returned error can't find the container with id 5ae3cd11d3ed6737715a8fd72e49a3fbcc66817aa8d3bfbb2acdc67dfea4dfd2 Oct 08 18:30:07 crc kubenswrapper[4750]: I1008 18:30:07.590147 4750 scope.go:117] "RemoveContainer" containerID="8e6287d7d2f3f6c939b56ccdb779c3a600534b4433ac0f5c9ca8961e2a55fdd0" Oct 08 18:30:07 crc kubenswrapper[4750]: E1008 18:30:07.590986 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e6287d7d2f3f6c939b56ccdb779c3a600534b4433ac0f5c9ca8961e2a55fdd0\": container with ID starting with 8e6287d7d2f3f6c939b56ccdb779c3a600534b4433ac0f5c9ca8961e2a55fdd0 not found: ID does not exist" containerID="8e6287d7d2f3f6c939b56ccdb779c3a600534b4433ac0f5c9ca8961e2a55fdd0" Oct 08 18:30:07 crc kubenswrapper[4750]: I1008 18:30:07.591020 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e6287d7d2f3f6c939b56ccdb779c3a600534b4433ac0f5c9ca8961e2a55fdd0"} err="failed to get container status \"8e6287d7d2f3f6c939b56ccdb779c3a600534b4433ac0f5c9ca8961e2a55fdd0\": rpc error: code = NotFound desc = could not find container \"8e6287d7d2f3f6c939b56ccdb779c3a600534b4433ac0f5c9ca8961e2a55fdd0\": container with ID starting with 
8e6287d7d2f3f6c939b56ccdb779c3a600534b4433ac0f5c9ca8961e2a55fdd0 not found: ID does not exist" Oct 08 18:30:07 crc kubenswrapper[4750]: I1008 18:30:07.591044 4750 scope.go:117] "RemoveContainer" containerID="63566589d72c9635e0836a0ad15cee4e9587d1b1cd0ae81824f64803e8b595f8" Oct 08 18:30:07 crc kubenswrapper[4750]: I1008 18:30:07.591114 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5dc68bd5-hzlg6" Oct 08 18:30:07 crc kubenswrapper[4750]: E1008 18:30:07.592369 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63566589d72c9635e0836a0ad15cee4e9587d1b1cd0ae81824f64803e8b595f8\": container with ID starting with 63566589d72c9635e0836a0ad15cee4e9587d1b1cd0ae81824f64803e8b595f8 not found: ID does not exist" containerID="63566589d72c9635e0836a0ad15cee4e9587d1b1cd0ae81824f64803e8b595f8" Oct 08 18:30:07 crc kubenswrapper[4750]: I1008 18:30:07.592396 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63566589d72c9635e0836a0ad15cee4e9587d1b1cd0ae81824f64803e8b595f8"} err="failed to get container status \"63566589d72c9635e0836a0ad15cee4e9587d1b1cd0ae81824f64803e8b595f8\": rpc error: code = NotFound desc = could not find container \"63566589d72c9635e0836a0ad15cee4e9587d1b1cd0ae81824f64803e8b595f8\": container with ID starting with 63566589d72c9635e0836a0ad15cee4e9587d1b1cd0ae81824f64803e8b595f8 not found: ID does not exist" Oct 08 18:30:07 crc kubenswrapper[4750]: I1008 18:30:07.592418 4750 scope.go:117] "RemoveContainer" containerID="8e6287d7d2f3f6c939b56ccdb779c3a600534b4433ac0f5c9ca8961e2a55fdd0" Oct 08 18:30:07 crc kubenswrapper[4750]: I1008 18:30:07.593542 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e6287d7d2f3f6c939b56ccdb779c3a600534b4433ac0f5c9ca8961e2a55fdd0"} err="failed to get container status 
\"8e6287d7d2f3f6c939b56ccdb779c3a600534b4433ac0f5c9ca8961e2a55fdd0\": rpc error: code = NotFound desc = could not find container \"8e6287d7d2f3f6c939b56ccdb779c3a600534b4433ac0f5c9ca8961e2a55fdd0\": container with ID starting with 8e6287d7d2f3f6c939b56ccdb779c3a600534b4433ac0f5c9ca8961e2a55fdd0 not found: ID does not exist" Oct 08 18:30:07 crc kubenswrapper[4750]: I1008 18:30:07.593644 4750 scope.go:117] "RemoveContainer" containerID="63566589d72c9635e0836a0ad15cee4e9587d1b1cd0ae81824f64803e8b595f8" Oct 08 18:30:07 crc kubenswrapper[4750]: I1008 18:30:07.595281 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63566589d72c9635e0836a0ad15cee4e9587d1b1cd0ae81824f64803e8b595f8"} err="failed to get container status \"63566589d72c9635e0836a0ad15cee4e9587d1b1cd0ae81824f64803e8b595f8\": rpc error: code = NotFound desc = could not find container \"63566589d72c9635e0836a0ad15cee4e9587d1b1cd0ae81824f64803e8b595f8\": container with ID starting with 63566589d72c9635e0836a0ad15cee4e9587d1b1cd0ae81824f64803e8b595f8 not found: ID does not exist" Oct 08 18:30:07 crc kubenswrapper[4750]: I1008 18:30:07.656788 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d96c67b5-4vw9x"] Oct 08 18:30:07 crc kubenswrapper[4750]: I1008 18:30:07.657287 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d96c67b5-4vw9x" podUID="6964604e-3015-4e12-b64e-83b733af5806" containerName="dnsmasq-dns" containerID="cri-o://279cc2214cdfe5a50ef21d37a8076ed4f7cba2124d492e32458479978f538a7a" gracePeriod=10 Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.135694 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-788b97745d-6snpn"] Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.340281 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"014b5ab1-7e11-42b6-b4ef-c3dba510298f","Type":"ContainerStarted","Data":"5ae3cd11d3ed6737715a8fd72e49a3fbcc66817aa8d3bfbb2acdc67dfea4dfd2"} Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.341013 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-788b97745d-6snpn" event={"ID":"ec1950dc-6caf-45f7-9b18-8c12db1b3f25","Type":"ContainerStarted","Data":"5d1d700aa26055d00df0ee9d9c502e69727e064e2128f09053507cb709ba3f90"} Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.385359 4750 generic.go:334] "Generic (PLEG): container finished" podID="6964604e-3015-4e12-b64e-83b733af5806" containerID="279cc2214cdfe5a50ef21d37a8076ed4f7cba2124d492e32458479978f538a7a" exitCode=0 Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.385420 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d96c67b5-4vw9x" event={"ID":"6964604e-3015-4e12-b64e-83b733af5806","Type":"ContainerDied","Data":"279cc2214cdfe5a50ef21d37a8076ed4f7cba2124d492e32458479978f538a7a"} Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.509094 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-689cf77786-nkzv6"] Oct 08 18:30:08 crc kubenswrapper[4750]: E1008 18:30:08.509588 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aae9b192-27dc-4bad-a9ae-6e03824c59f0" containerName="barbican-db-sync" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.509613 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="aae9b192-27dc-4bad-a9ae-6e03824c59f0" containerName="barbican-db-sync" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.509841 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="aae9b192-27dc-4bad-a9ae-6e03824c59f0" containerName="barbican-db-sync" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.510942 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-689cf77786-nkzv6" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.526495 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-689cf77786-nkzv6"] Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.527025 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.527075 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-n4lv8" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.527027 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.544787 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-587d5f7b59-ws4tc"] Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.546419 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-587d5f7b59-ws4tc" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.555984 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.629689 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-587d5f7b59-ws4tc"] Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.630063 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28550569-4c3c-48cf-a621-eddec0919b51-combined-ca-bundle\") pod \"barbican-worker-587d5f7b59-ws4tc\" (UID: \"28550569-4c3c-48cf-a621-eddec0919b51\") " pod="openstack/barbican-worker-587d5f7b59-ws4tc" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.630093 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvgx9\" (UniqueName: \"kubernetes.io/projected/9ecf0d73-0ca5-4124-93fb-348f8769c2e2-kube-api-access-tvgx9\") pod \"barbican-keystone-listener-689cf77786-nkzv6\" (UID: \"9ecf0d73-0ca5-4124-93fb-348f8769c2e2\") " pod="openstack/barbican-keystone-listener-689cf77786-nkzv6" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.630113 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28550569-4c3c-48cf-a621-eddec0919b51-config-data\") pod \"barbican-worker-587d5f7b59-ws4tc\" (UID: \"28550569-4c3c-48cf-a621-eddec0919b51\") " pod="openstack/barbican-worker-587d5f7b59-ws4tc" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.630143 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9ecf0d73-0ca5-4124-93fb-348f8769c2e2-config-data-custom\") pod 
\"barbican-keystone-listener-689cf77786-nkzv6\" (UID: \"9ecf0d73-0ca5-4124-93fb-348f8769c2e2\") " pod="openstack/barbican-keystone-listener-689cf77786-nkzv6" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.630175 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8b4s\" (UniqueName: \"kubernetes.io/projected/28550569-4c3c-48cf-a621-eddec0919b51-kube-api-access-l8b4s\") pod \"barbican-worker-587d5f7b59-ws4tc\" (UID: \"28550569-4c3c-48cf-a621-eddec0919b51\") " pod="openstack/barbican-worker-587d5f7b59-ws4tc" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.630211 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ecf0d73-0ca5-4124-93fb-348f8769c2e2-combined-ca-bundle\") pod \"barbican-keystone-listener-689cf77786-nkzv6\" (UID: \"9ecf0d73-0ca5-4124-93fb-348f8769c2e2\") " pod="openstack/barbican-keystone-listener-689cf77786-nkzv6" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.630363 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ecf0d73-0ca5-4124-93fb-348f8769c2e2-config-data\") pod \"barbican-keystone-listener-689cf77786-nkzv6\" (UID: \"9ecf0d73-0ca5-4124-93fb-348f8769c2e2\") " pod="openstack/barbican-keystone-listener-689cf77786-nkzv6" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.630381 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28550569-4c3c-48cf-a621-eddec0919b51-logs\") pod \"barbican-worker-587d5f7b59-ws4tc\" (UID: \"28550569-4c3c-48cf-a621-eddec0919b51\") " pod="openstack/barbican-worker-587d5f7b59-ws4tc" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.630427 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ecf0d73-0ca5-4124-93fb-348f8769c2e2-logs\") pod \"barbican-keystone-listener-689cf77786-nkzv6\" (UID: \"9ecf0d73-0ca5-4124-93fb-348f8769c2e2\") " pod="openstack/barbican-keystone-listener-689cf77786-nkzv6" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.630446 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28550569-4c3c-48cf-a621-eddec0919b51-config-data-custom\") pod \"barbican-worker-587d5f7b59-ws4tc\" (UID: \"28550569-4c3c-48cf-a621-eddec0919b51\") " pod="openstack/barbican-worker-587d5f7b59-ws4tc" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.686662 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54f9cb888f-9b4vd"] Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.688178 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f9cb888f-9b4vd" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.703876 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54f9cb888f-9b4vd"] Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.731295 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ecf0d73-0ca5-4124-93fb-348f8769c2e2-combined-ca-bundle\") pod \"barbican-keystone-listener-689cf77786-nkzv6\" (UID: \"9ecf0d73-0ca5-4124-93fb-348f8769c2e2\") " pod="openstack/barbican-keystone-listener-689cf77786-nkzv6" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.731340 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krj7h\" (UniqueName: \"kubernetes.io/projected/1b533003-3aca-4347-b38f-ee183857019f-kube-api-access-krj7h\") pod \"dnsmasq-dns-54f9cb888f-9b4vd\" (UID: 
\"1b533003-3aca-4347-b38f-ee183857019f\") " pod="openstack/dnsmasq-dns-54f9cb888f-9b4vd" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.731371 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b533003-3aca-4347-b38f-ee183857019f-ovsdbserver-sb\") pod \"dnsmasq-dns-54f9cb888f-9b4vd\" (UID: \"1b533003-3aca-4347-b38f-ee183857019f\") " pod="openstack/dnsmasq-dns-54f9cb888f-9b4vd" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.731391 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ecf0d73-0ca5-4124-93fb-348f8769c2e2-config-data\") pod \"barbican-keystone-listener-689cf77786-nkzv6\" (UID: \"9ecf0d73-0ca5-4124-93fb-348f8769c2e2\") " pod="openstack/barbican-keystone-listener-689cf77786-nkzv6" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.731408 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1b533003-3aca-4347-b38f-ee183857019f-dns-swift-storage-0\") pod \"dnsmasq-dns-54f9cb888f-9b4vd\" (UID: \"1b533003-3aca-4347-b38f-ee183857019f\") " pod="openstack/dnsmasq-dns-54f9cb888f-9b4vd" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.731423 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28550569-4c3c-48cf-a621-eddec0919b51-logs\") pod \"barbican-worker-587d5f7b59-ws4tc\" (UID: \"28550569-4c3c-48cf-a621-eddec0919b51\") " pod="openstack/barbican-worker-587d5f7b59-ws4tc" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.731440 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b533003-3aca-4347-b38f-ee183857019f-dns-svc\") pod 
\"dnsmasq-dns-54f9cb888f-9b4vd\" (UID: \"1b533003-3aca-4347-b38f-ee183857019f\") " pod="openstack/dnsmasq-dns-54f9cb888f-9b4vd" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.731458 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ecf0d73-0ca5-4124-93fb-348f8769c2e2-logs\") pod \"barbican-keystone-listener-689cf77786-nkzv6\" (UID: \"9ecf0d73-0ca5-4124-93fb-348f8769c2e2\") " pod="openstack/barbican-keystone-listener-689cf77786-nkzv6" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.731475 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28550569-4c3c-48cf-a621-eddec0919b51-config-data-custom\") pod \"barbican-worker-587d5f7b59-ws4tc\" (UID: \"28550569-4c3c-48cf-a621-eddec0919b51\") " pod="openstack/barbican-worker-587d5f7b59-ws4tc" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.731518 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b533003-3aca-4347-b38f-ee183857019f-ovsdbserver-nb\") pod \"dnsmasq-dns-54f9cb888f-9b4vd\" (UID: \"1b533003-3aca-4347-b38f-ee183857019f\") " pod="openstack/dnsmasq-dns-54f9cb888f-9b4vd" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.731634 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28550569-4c3c-48cf-a621-eddec0919b51-combined-ca-bundle\") pod \"barbican-worker-587d5f7b59-ws4tc\" (UID: \"28550569-4c3c-48cf-a621-eddec0919b51\") " pod="openstack/barbican-worker-587d5f7b59-ws4tc" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.731659 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b533003-3aca-4347-b38f-ee183857019f-config\") 
pod \"dnsmasq-dns-54f9cb888f-9b4vd\" (UID: \"1b533003-3aca-4347-b38f-ee183857019f\") " pod="openstack/dnsmasq-dns-54f9cb888f-9b4vd" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.731716 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvgx9\" (UniqueName: \"kubernetes.io/projected/9ecf0d73-0ca5-4124-93fb-348f8769c2e2-kube-api-access-tvgx9\") pod \"barbican-keystone-listener-689cf77786-nkzv6\" (UID: \"9ecf0d73-0ca5-4124-93fb-348f8769c2e2\") " pod="openstack/barbican-keystone-listener-689cf77786-nkzv6" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.731734 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28550569-4c3c-48cf-a621-eddec0919b51-config-data\") pod \"barbican-worker-587d5f7b59-ws4tc\" (UID: \"28550569-4c3c-48cf-a621-eddec0919b51\") " pod="openstack/barbican-worker-587d5f7b59-ws4tc" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.731784 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9ecf0d73-0ca5-4124-93fb-348f8769c2e2-config-data-custom\") pod \"barbican-keystone-listener-689cf77786-nkzv6\" (UID: \"9ecf0d73-0ca5-4124-93fb-348f8769c2e2\") " pod="openstack/barbican-keystone-listener-689cf77786-nkzv6" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.731816 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8b4s\" (UniqueName: \"kubernetes.io/projected/28550569-4c3c-48cf-a621-eddec0919b51-kube-api-access-l8b4s\") pod \"barbican-worker-587d5f7b59-ws4tc\" (UID: \"28550569-4c3c-48cf-a621-eddec0919b51\") " pod="openstack/barbican-worker-587d5f7b59-ws4tc" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.732992 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/28550569-4c3c-48cf-a621-eddec0919b51-logs\") pod \"barbican-worker-587d5f7b59-ws4tc\" (UID: \"28550569-4c3c-48cf-a621-eddec0919b51\") " pod="openstack/barbican-worker-587d5f7b59-ws4tc" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.733321 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ecf0d73-0ca5-4124-93fb-348f8769c2e2-logs\") pod \"barbican-keystone-listener-689cf77786-nkzv6\" (UID: \"9ecf0d73-0ca5-4124-93fb-348f8769c2e2\") " pod="openstack/barbican-keystone-listener-689cf77786-nkzv6" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.740153 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28550569-4c3c-48cf-a621-eddec0919b51-combined-ca-bundle\") pod \"barbican-worker-587d5f7b59-ws4tc\" (UID: \"28550569-4c3c-48cf-a621-eddec0919b51\") " pod="openstack/barbican-worker-587d5f7b59-ws4tc" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.740203 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28550569-4c3c-48cf-a621-eddec0919b51-config-data-custom\") pod \"barbican-worker-587d5f7b59-ws4tc\" (UID: \"28550569-4c3c-48cf-a621-eddec0919b51\") " pod="openstack/barbican-worker-587d5f7b59-ws4tc" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.742681 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ecf0d73-0ca5-4124-93fb-348f8769c2e2-config-data\") pod \"barbican-keystone-listener-689cf77786-nkzv6\" (UID: \"9ecf0d73-0ca5-4124-93fb-348f8769c2e2\") " pod="openstack/barbican-keystone-listener-689cf77786-nkzv6" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.745562 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/28550569-4c3c-48cf-a621-eddec0919b51-config-data\") pod \"barbican-worker-587d5f7b59-ws4tc\" (UID: \"28550569-4c3c-48cf-a621-eddec0919b51\") " pod="openstack/barbican-worker-587d5f7b59-ws4tc" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.754737 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ecf0d73-0ca5-4124-93fb-348f8769c2e2-combined-ca-bundle\") pod \"barbican-keystone-listener-689cf77786-nkzv6\" (UID: \"9ecf0d73-0ca5-4124-93fb-348f8769c2e2\") " pod="openstack/barbican-keystone-listener-689cf77786-nkzv6" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.764016 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9ecf0d73-0ca5-4124-93fb-348f8769c2e2-config-data-custom\") pod \"barbican-keystone-listener-689cf77786-nkzv6\" (UID: \"9ecf0d73-0ca5-4124-93fb-348f8769c2e2\") " pod="openstack/barbican-keystone-listener-689cf77786-nkzv6" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.776370 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvgx9\" (UniqueName: \"kubernetes.io/projected/9ecf0d73-0ca5-4124-93fb-348f8769c2e2-kube-api-access-tvgx9\") pod \"barbican-keystone-listener-689cf77786-nkzv6\" (UID: \"9ecf0d73-0ca5-4124-93fb-348f8769c2e2\") " pod="openstack/barbican-keystone-listener-689cf77786-nkzv6" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.800439 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8b4s\" (UniqueName: \"kubernetes.io/projected/28550569-4c3c-48cf-a621-eddec0919b51-kube-api-access-l8b4s\") pod \"barbican-worker-587d5f7b59-ws4tc\" (UID: \"28550569-4c3c-48cf-a621-eddec0919b51\") " pod="openstack/barbican-worker-587d5f7b59-ws4tc" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.826337 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-689cf77786-nkzv6" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.832862 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b533003-3aca-4347-b38f-ee183857019f-ovsdbserver-nb\") pod \"dnsmasq-dns-54f9cb888f-9b4vd\" (UID: \"1b533003-3aca-4347-b38f-ee183857019f\") " pod="openstack/dnsmasq-dns-54f9cb888f-9b4vd" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.832963 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b533003-3aca-4347-b38f-ee183857019f-config\") pod \"dnsmasq-dns-54f9cb888f-9b4vd\" (UID: \"1b533003-3aca-4347-b38f-ee183857019f\") " pod="openstack/dnsmasq-dns-54f9cb888f-9b4vd" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.833087 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krj7h\" (UniqueName: \"kubernetes.io/projected/1b533003-3aca-4347-b38f-ee183857019f-kube-api-access-krj7h\") pod \"dnsmasq-dns-54f9cb888f-9b4vd\" (UID: \"1b533003-3aca-4347-b38f-ee183857019f\") " pod="openstack/dnsmasq-dns-54f9cb888f-9b4vd" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.833114 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b533003-3aca-4347-b38f-ee183857019f-ovsdbserver-sb\") pod \"dnsmasq-dns-54f9cb888f-9b4vd\" (UID: \"1b533003-3aca-4347-b38f-ee183857019f\") " pod="openstack/dnsmasq-dns-54f9cb888f-9b4vd" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.833157 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1b533003-3aca-4347-b38f-ee183857019f-dns-swift-storage-0\") pod \"dnsmasq-dns-54f9cb888f-9b4vd\" (UID: \"1b533003-3aca-4347-b38f-ee183857019f\") " 
pod="openstack/dnsmasq-dns-54f9cb888f-9b4vd" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.833184 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b533003-3aca-4347-b38f-ee183857019f-dns-svc\") pod \"dnsmasq-dns-54f9cb888f-9b4vd\" (UID: \"1b533003-3aca-4347-b38f-ee183857019f\") " pod="openstack/dnsmasq-dns-54f9cb888f-9b4vd" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.834333 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b533003-3aca-4347-b38f-ee183857019f-dns-svc\") pod \"dnsmasq-dns-54f9cb888f-9b4vd\" (UID: \"1b533003-3aca-4347-b38f-ee183857019f\") " pod="openstack/dnsmasq-dns-54f9cb888f-9b4vd" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.834911 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b533003-3aca-4347-b38f-ee183857019f-ovsdbserver-nb\") pod \"dnsmasq-dns-54f9cb888f-9b4vd\" (UID: \"1b533003-3aca-4347-b38f-ee183857019f\") " pod="openstack/dnsmasq-dns-54f9cb888f-9b4vd" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.835499 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b533003-3aca-4347-b38f-ee183857019f-config\") pod \"dnsmasq-dns-54f9cb888f-9b4vd\" (UID: \"1b533003-3aca-4347-b38f-ee183857019f\") " pod="openstack/dnsmasq-dns-54f9cb888f-9b4vd" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.836319 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b533003-3aca-4347-b38f-ee183857019f-ovsdbserver-sb\") pod \"dnsmasq-dns-54f9cb888f-9b4vd\" (UID: \"1b533003-3aca-4347-b38f-ee183857019f\") " pod="openstack/dnsmasq-dns-54f9cb888f-9b4vd" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.836394 4750 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1b533003-3aca-4347-b38f-ee183857019f-dns-swift-storage-0\") pod \"dnsmasq-dns-54f9cb888f-9b4vd\" (UID: \"1b533003-3aca-4347-b38f-ee183857019f\") " pod="openstack/dnsmasq-dns-54f9cb888f-9b4vd" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.859907 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krj7h\" (UniqueName: \"kubernetes.io/projected/1b533003-3aca-4347-b38f-ee183857019f-kube-api-access-krj7h\") pod \"dnsmasq-dns-54f9cb888f-9b4vd\" (UID: \"1b533003-3aca-4347-b38f-ee183857019f\") " pod="openstack/dnsmasq-dns-54f9cb888f-9b4vd" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.896378 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-587d5f7b59-ws4tc" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.923017 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-84b5b4bdc4-tgxnj"] Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.924655 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-84b5b4bdc4-tgxnj" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.928759 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.936037 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-84b5b4bdc4-tgxnj"] Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.938756 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b2sv\" (UniqueName: \"kubernetes.io/projected/2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15-kube-api-access-8b2sv\") pod \"barbican-api-84b5b4bdc4-tgxnj\" (UID: \"2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15\") " pod="openstack/barbican-api-84b5b4bdc4-tgxnj" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.938792 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15-combined-ca-bundle\") pod \"barbican-api-84b5b4bdc4-tgxnj\" (UID: \"2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15\") " pod="openstack/barbican-api-84b5b4bdc4-tgxnj" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.938884 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15-config-data-custom\") pod \"barbican-api-84b5b4bdc4-tgxnj\" (UID: \"2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15\") " pod="openstack/barbican-api-84b5b4bdc4-tgxnj" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.938908 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15-config-data\") pod \"barbican-api-84b5b4bdc4-tgxnj\" (UID: 
\"2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15\") " pod="openstack/barbican-api-84b5b4bdc4-tgxnj" Oct 08 18:30:08 crc kubenswrapper[4750]: I1008 18:30:08.938928 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15-logs\") pod \"barbican-api-84b5b4bdc4-tgxnj\" (UID: \"2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15\") " pod="openstack/barbican-api-84b5b4bdc4-tgxnj" Oct 08 18:30:09 crc kubenswrapper[4750]: I1008 18:30:09.030438 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f9cb888f-9b4vd" Oct 08 18:30:09 crc kubenswrapper[4750]: I1008 18:30:09.040797 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15-config-data-custom\") pod \"barbican-api-84b5b4bdc4-tgxnj\" (UID: \"2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15\") " pod="openstack/barbican-api-84b5b4bdc4-tgxnj" Oct 08 18:30:09 crc kubenswrapper[4750]: I1008 18:30:09.041207 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15-config-data\") pod \"barbican-api-84b5b4bdc4-tgxnj\" (UID: \"2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15\") " pod="openstack/barbican-api-84b5b4bdc4-tgxnj" Oct 08 18:30:09 crc kubenswrapper[4750]: I1008 18:30:09.041245 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15-logs\") pod \"barbican-api-84b5b4bdc4-tgxnj\" (UID: \"2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15\") " pod="openstack/barbican-api-84b5b4bdc4-tgxnj" Oct 08 18:30:09 crc kubenswrapper[4750]: I1008 18:30:09.041330 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b2sv\" (UniqueName: 
\"kubernetes.io/projected/2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15-kube-api-access-8b2sv\") pod \"barbican-api-84b5b4bdc4-tgxnj\" (UID: \"2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15\") " pod="openstack/barbican-api-84b5b4bdc4-tgxnj" Oct 08 18:30:09 crc kubenswrapper[4750]: I1008 18:30:09.041351 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15-combined-ca-bundle\") pod \"barbican-api-84b5b4bdc4-tgxnj\" (UID: \"2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15\") " pod="openstack/barbican-api-84b5b4bdc4-tgxnj" Oct 08 18:30:09 crc kubenswrapper[4750]: I1008 18:30:09.043836 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15-logs\") pod \"barbican-api-84b5b4bdc4-tgxnj\" (UID: \"2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15\") " pod="openstack/barbican-api-84b5b4bdc4-tgxnj" Oct 08 18:30:09 crc kubenswrapper[4750]: I1008 18:30:09.047596 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15-combined-ca-bundle\") pod \"barbican-api-84b5b4bdc4-tgxnj\" (UID: \"2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15\") " pod="openstack/barbican-api-84b5b4bdc4-tgxnj" Oct 08 18:30:09 crc kubenswrapper[4750]: I1008 18:30:09.052315 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15-config-data\") pod \"barbican-api-84b5b4bdc4-tgxnj\" (UID: \"2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15\") " pod="openstack/barbican-api-84b5b4bdc4-tgxnj" Oct 08 18:30:09 crc kubenswrapper[4750]: I1008 18:30:09.058209 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15-config-data-custom\") 
pod \"barbican-api-84b5b4bdc4-tgxnj\" (UID: \"2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15\") " pod="openstack/barbican-api-84b5b4bdc4-tgxnj" Oct 08 18:30:09 crc kubenswrapper[4750]: I1008 18:30:09.059530 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b2sv\" (UniqueName: \"kubernetes.io/projected/2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15-kube-api-access-8b2sv\") pod \"barbican-api-84b5b4bdc4-tgxnj\" (UID: \"2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15\") " pod="openstack/barbican-api-84b5b4bdc4-tgxnj" Oct 08 18:30:09 crc kubenswrapper[4750]: I1008 18:30:09.242616 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d96c67b5-4vw9x" Oct 08 18:30:09 crc kubenswrapper[4750]: I1008 18:30:09.247342 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6964604e-3015-4e12-b64e-83b733af5806-config\") pod \"6964604e-3015-4e12-b64e-83b733af5806\" (UID: \"6964604e-3015-4e12-b64e-83b733af5806\") " Oct 08 18:30:09 crc kubenswrapper[4750]: I1008 18:30:09.247413 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6964604e-3015-4e12-b64e-83b733af5806-dns-swift-storage-0\") pod \"6964604e-3015-4e12-b64e-83b733af5806\" (UID: \"6964604e-3015-4e12-b64e-83b733af5806\") " Oct 08 18:30:09 crc kubenswrapper[4750]: I1008 18:30:09.247469 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6964604e-3015-4e12-b64e-83b733af5806-ovsdbserver-nb\") pod \"6964604e-3015-4e12-b64e-83b733af5806\" (UID: \"6964604e-3015-4e12-b64e-83b733af5806\") " Oct 08 18:30:09 crc kubenswrapper[4750]: I1008 18:30:09.247495 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d524l\" (UniqueName: 
\"kubernetes.io/projected/6964604e-3015-4e12-b64e-83b733af5806-kube-api-access-d524l\") pod \"6964604e-3015-4e12-b64e-83b733af5806\" (UID: \"6964604e-3015-4e12-b64e-83b733af5806\") " Oct 08 18:30:09 crc kubenswrapper[4750]: I1008 18:30:09.247550 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6964604e-3015-4e12-b64e-83b733af5806-ovsdbserver-sb\") pod \"6964604e-3015-4e12-b64e-83b733af5806\" (UID: \"6964604e-3015-4e12-b64e-83b733af5806\") " Oct 08 18:30:09 crc kubenswrapper[4750]: I1008 18:30:09.247609 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6964604e-3015-4e12-b64e-83b733af5806-dns-svc\") pod \"6964604e-3015-4e12-b64e-83b733af5806\" (UID: \"6964604e-3015-4e12-b64e-83b733af5806\") " Oct 08 18:30:09 crc kubenswrapper[4750]: I1008 18:30:09.266837 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-84b5b4bdc4-tgxnj" Oct 08 18:30:09 crc kubenswrapper[4750]: I1008 18:30:09.332755 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6964604e-3015-4e12-b64e-83b733af5806-kube-api-access-d524l" (OuterVolumeSpecName: "kube-api-access-d524l") pod "6964604e-3015-4e12-b64e-83b733af5806" (UID: "6964604e-3015-4e12-b64e-83b733af5806"). InnerVolumeSpecName "kube-api-access-d524l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:30:09 crc kubenswrapper[4750]: I1008 18:30:09.349999 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d524l\" (UniqueName: \"kubernetes.io/projected/6964604e-3015-4e12-b64e-83b733af5806-kube-api-access-d524l\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:09 crc kubenswrapper[4750]: I1008 18:30:09.417881 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-788b97745d-6snpn" event={"ID":"ec1950dc-6caf-45f7-9b18-8c12db1b3f25","Type":"ContainerStarted","Data":"945094b64faf05a196a97d662a4c3e8b64a2ebe7a2311be55ef9549fcca90849"} Oct 08 18:30:09 crc kubenswrapper[4750]: I1008 18:30:09.421739 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d96c67b5-4vw9x" event={"ID":"6964604e-3015-4e12-b64e-83b733af5806","Type":"ContainerDied","Data":"2da2c22e424740b35f06edb3ecb57422e1e1fbb04927860662c02ad8a0f3003b"} Oct 08 18:30:09 crc kubenswrapper[4750]: I1008 18:30:09.421776 4750 scope.go:117] "RemoveContainer" containerID="279cc2214cdfe5a50ef21d37a8076ed4f7cba2124d492e32458479978f538a7a" Oct 08 18:30:09 crc kubenswrapper[4750]: I1008 18:30:09.421889 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d96c67b5-4vw9x" Oct 08 18:30:09 crc kubenswrapper[4750]: I1008 18:30:09.429148 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4562f82c-cf9e-4b63-bc8a-079a35401232","Type":"ContainerStarted","Data":"07f77d60fd9c23dd677f1ff78fefb6c3cf307b23e20e0edb4ff4bf60ed6b2d0c"} Oct 08 18:30:09 crc kubenswrapper[4750]: I1008 18:30:09.462212 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"014b5ab1-7e11-42b6-b4ef-c3dba510298f","Type":"ContainerStarted","Data":"bfd9719615835450521b7511f861f62cb2bb85abf40f24ff9d817e7a2d193db4"} Oct 08 18:30:09 crc kubenswrapper[4750]: I1008 18:30:09.474674 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-689cf77786-nkzv6"] Oct 08 18:30:09 crc kubenswrapper[4750]: W1008 18:30:09.511120 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ecf0d73_0ca5_4124_93fb_348f8769c2e2.slice/crio-1b2e6a6e7dab77f2b8e9865e8ccf080c1670a2e60175da1776664031b1acd2b1 WatchSource:0}: Error finding container 1b2e6a6e7dab77f2b8e9865e8ccf080c1670a2e60175da1776664031b1acd2b1: Status 404 returned error can't find the container with id 1b2e6a6e7dab77f2b8e9865e8ccf080c1670a2e60175da1776664031b1acd2b1 Oct 08 18:30:09 crc kubenswrapper[4750]: I1008 18:30:09.623987 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-587d5f7b59-ws4tc"] Oct 08 18:30:09 crc kubenswrapper[4750]: W1008 18:30:09.632962 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28550569_4c3c_48cf_a621_eddec0919b51.slice/crio-308514aa8ce1ab4cdccf60c938d0902546a0771305aa9481bc5a2695b056de2f WatchSource:0}: Error finding container 
308514aa8ce1ab4cdccf60c938d0902546a0771305aa9481bc5a2695b056de2f: Status 404 returned error can't find the container with id 308514aa8ce1ab4cdccf60c938d0902546a0771305aa9481bc5a2695b056de2f Oct 08 18:30:09 crc kubenswrapper[4750]: I1008 18:30:09.649621 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6964604e-3015-4e12-b64e-83b733af5806-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6964604e-3015-4e12-b64e-83b733af5806" (UID: "6964604e-3015-4e12-b64e-83b733af5806"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:30:09 crc kubenswrapper[4750]: I1008 18:30:09.664022 4750 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6964604e-3015-4e12-b64e-83b733af5806-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:09 crc kubenswrapper[4750]: I1008 18:30:09.671973 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6964604e-3015-4e12-b64e-83b733af5806-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6964604e-3015-4e12-b64e-83b733af5806" (UID: "6964604e-3015-4e12-b64e-83b733af5806"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:30:09 crc kubenswrapper[4750]: I1008 18:30:09.681145 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6964604e-3015-4e12-b64e-83b733af5806-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6964604e-3015-4e12-b64e-83b733af5806" (UID: "6964604e-3015-4e12-b64e-83b733af5806"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:30:09 crc kubenswrapper[4750]: I1008 18:30:09.681805 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6964604e-3015-4e12-b64e-83b733af5806-config" (OuterVolumeSpecName: "config") pod "6964604e-3015-4e12-b64e-83b733af5806" (UID: "6964604e-3015-4e12-b64e-83b733af5806"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:30:09 crc kubenswrapper[4750]: I1008 18:30:09.691269 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6964604e-3015-4e12-b64e-83b733af5806-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6964604e-3015-4e12-b64e-83b733af5806" (UID: "6964604e-3015-4e12-b64e-83b733af5806"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:30:09 crc kubenswrapper[4750]: I1008 18:30:09.721011 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54f9cb888f-9b4vd"] Oct 08 18:30:09 crc kubenswrapper[4750]: I1008 18:30:09.766388 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6964604e-3015-4e12-b64e-83b733af5806-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:09 crc kubenswrapper[4750]: I1008 18:30:09.766418 4750 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6964604e-3015-4e12-b64e-83b733af5806-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:09 crc kubenswrapper[4750]: I1008 18:30:09.766429 4750 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6964604e-3015-4e12-b64e-83b733af5806-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:09 crc kubenswrapper[4750]: I1008 18:30:09.766439 4750 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/6964604e-3015-4e12-b64e-83b733af5806-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:09 crc kubenswrapper[4750]: I1008 18:30:09.793666 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-84b5b4bdc4-tgxnj"] Oct 08 18:30:09 crc kubenswrapper[4750]: I1008 18:30:09.836957 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d96c67b5-4vw9x"] Oct 08 18:30:09 crc kubenswrapper[4750]: I1008 18:30:09.844417 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d96c67b5-4vw9x"] Oct 08 18:30:09 crc kubenswrapper[4750]: I1008 18:30:09.848948 4750 scope.go:117] "RemoveContainer" containerID="211393a548b77a9fa2cc9169548d12ee0b3ad4a628304edd0df0cbecc8987904" Oct 08 18:30:10 crc kubenswrapper[4750]: I1008 18:30:10.495308 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84b5b4bdc4-tgxnj" event={"ID":"2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15","Type":"ContainerStarted","Data":"84581a0cc525dfa4e10d17d674b20e53250a48975865466afd060d5ffe3f0edf"} Oct 08 18:30:10 crc kubenswrapper[4750]: I1008 18:30:10.495875 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84b5b4bdc4-tgxnj" event={"ID":"2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15","Type":"ContainerStarted","Data":"4332f3ddc51ef6473a87ed30ca245bc270e0920c831c2122d5168973241da467"} Oct 08 18:30:10 crc kubenswrapper[4750]: I1008 18:30:10.495892 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84b5b4bdc4-tgxnj" event={"ID":"2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15","Type":"ContainerStarted","Data":"0e0b72be0fa7bcd6c46bf3aca3ede9f21bd6a744612403cb0ad79ecf20b53e42"} Oct 08 18:30:10 crc kubenswrapper[4750]: I1008 18:30:10.497163 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-84b5b4bdc4-tgxnj" Oct 08 18:30:10 crc kubenswrapper[4750]: I1008 18:30:10.497198 4750 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/barbican-api-84b5b4bdc4-tgxnj" Oct 08 18:30:10 crc kubenswrapper[4750]: I1008 18:30:10.501082 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4562f82c-cf9e-4b63-bc8a-079a35401232","Type":"ContainerStarted","Data":"38270156d6423a8c5864c7bc1267940a2d7348e6f8289504cdf3ae28b7a1663d"} Oct 08 18:30:10 crc kubenswrapper[4750]: I1008 18:30:10.504070 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a","Type":"ContainerStarted","Data":"d42c941b59852316ae7c9b15a016afa287735faf08d424f634bb90d6cfe6273d"} Oct 08 18:30:10 crc kubenswrapper[4750]: I1008 18:30:10.504234 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a" containerName="ceilometer-central-agent" containerID="cri-o://c1307d892a277faff4013cf95858ca39bd78853297e2048184e8f1c187e0201b" gracePeriod=30 Oct 08 18:30:10 crc kubenswrapper[4750]: I1008 18:30:10.504794 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a" containerName="sg-core" containerID="cri-o://cb4fd7025dd8ba93695bc91efa2612277cfde608b5b1db67db82a71242e82d1e" gracePeriod=30 Oct 08 18:30:10 crc kubenswrapper[4750]: I1008 18:30:10.504828 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a" containerName="ceilometer-notification-agent" containerID="cri-o://32e882a4c502a439a479c8ea6d79da722ef53840fad55da94fdcab4bd196ec16" gracePeriod=30 Oct 08 18:30:10 crc kubenswrapper[4750]: I1008 18:30:10.504920 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a" containerName="proxy-httpd" 
containerID="cri-o://d42c941b59852316ae7c9b15a016afa287735faf08d424f634bb90d6cfe6273d" gracePeriod=30 Oct 08 18:30:10 crc kubenswrapper[4750]: I1008 18:30:10.505072 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 08 18:30:10 crc kubenswrapper[4750]: I1008 18:30:10.509305 4750 generic.go:334] "Generic (PLEG): container finished" podID="41b4bf15-850b-4082-8053-aee61b75dc58" containerID="e0eb306813cd83380792baf93d28642815dbbaa7ce0d625844aac67c665b8844" exitCode=0 Oct 08 18:30:10 crc kubenswrapper[4750]: I1008 18:30:10.509374 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-dd8dn" event={"ID":"41b4bf15-850b-4082-8053-aee61b75dc58","Type":"ContainerDied","Data":"e0eb306813cd83380792baf93d28642815dbbaa7ce0d625844aac67c665b8844"} Oct 08 18:30:10 crc kubenswrapper[4750]: I1008 18:30:10.525632 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"014b5ab1-7e11-42b6-b4ef-c3dba510298f","Type":"ContainerStarted","Data":"2bebcd680b17c03745fe629ed9bbe0c732bfc09be9c86b9e75d9c81f409a018a"} Oct 08 18:30:10 crc kubenswrapper[4750]: I1008 18:30:10.528939 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-84b5b4bdc4-tgxnj" podStartSLOduration=2.528917873 podStartE2EDuration="2.528917873s" podCreationTimestamp="2025-10-08 18:30:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:30:10.512455779 +0000 UTC m=+1166.425426802" watchObservedRunningTime="2025-10-08 18:30:10.528917873 +0000 UTC m=+1166.441888906" Oct 08 18:30:10 crc kubenswrapper[4750]: I1008 18:30:10.538141 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-689cf77786-nkzv6" 
event={"ID":"9ecf0d73-0ca5-4124-93fb-348f8769c2e2","Type":"ContainerStarted","Data":"1b2e6a6e7dab77f2b8e9865e8ccf080c1670a2e60175da1776664031b1acd2b1"} Oct 08 18:30:10 crc kubenswrapper[4750]: I1008 18:30:10.542707 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-587d5f7b59-ws4tc" event={"ID":"28550569-4c3c-48cf-a621-eddec0919b51","Type":"ContainerStarted","Data":"308514aa8ce1ab4cdccf60c938d0902546a0771305aa9481bc5a2695b056de2f"} Oct 08 18:30:10 crc kubenswrapper[4750]: I1008 18:30:10.557792 4750 generic.go:334] "Generic (PLEG): container finished" podID="1b533003-3aca-4347-b38f-ee183857019f" containerID="bda12012b9b4bed602522bfff191ca33bb01854c2d43e3f749b8f43cf6ebcddc" exitCode=0 Oct 08 18:30:10 crc kubenswrapper[4750]: I1008 18:30:10.557852 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9cb888f-9b4vd" event={"ID":"1b533003-3aca-4347-b38f-ee183857019f","Type":"ContainerDied","Data":"bda12012b9b4bed602522bfff191ca33bb01854c2d43e3f749b8f43cf6ebcddc"} Oct 08 18:30:10 crc kubenswrapper[4750]: I1008 18:30:10.557878 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9cb888f-9b4vd" event={"ID":"1b533003-3aca-4347-b38f-ee183857019f","Type":"ContainerStarted","Data":"c12dcea495964304c3935fce76e1957fef6e4b0fa893c78ed7d5c88d10bb609d"} Oct 08 18:30:10 crc kubenswrapper[4750]: I1008 18:30:10.562647 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-788b97745d-6snpn" event={"ID":"ec1950dc-6caf-45f7-9b18-8c12db1b3f25","Type":"ContainerStarted","Data":"f46e6c1f5784d1ac90a72cce013512954360b873208916edc44bc5574486392e"} Oct 08 18:30:10 crc kubenswrapper[4750]: I1008 18:30:10.562904 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-788b97745d-6snpn" Oct 08 18:30:10 crc kubenswrapper[4750]: I1008 18:30:10.563030 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/placement-788b97745d-6snpn" Oct 08 18:30:10 crc kubenswrapper[4750]: I1008 18:30:10.635124 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.5944974849999998 podStartE2EDuration="51.635105945s" podCreationTimestamp="2025-10-08 18:29:19 +0000 UTC" firstStartedPulling="2025-10-08 18:29:21.062093459 +0000 UTC m=+1116.975064472" lastFinishedPulling="2025-10-08 18:30:09.102701919 +0000 UTC m=+1165.015672932" observedRunningTime="2025-10-08 18:30:10.60143415 +0000 UTC m=+1166.514405153" watchObservedRunningTime="2025-10-08 18:30:10.635105945 +0000 UTC m=+1166.548076958" Oct 08 18:30:10 crc kubenswrapper[4750]: I1008 18:30:10.643041 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=9.643018838 podStartE2EDuration="9.643018838s" podCreationTimestamp="2025-10-08 18:30:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:30:10.636826657 +0000 UTC m=+1166.549797690" watchObservedRunningTime="2025-10-08 18:30:10.643018838 +0000 UTC m=+1166.555989851" Oct 08 18:30:10 crc kubenswrapper[4750]: I1008 18:30:10.677114 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=8.677095704 podStartE2EDuration="8.677095704s" podCreationTimestamp="2025-10-08 18:30:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:30:10.667773175 +0000 UTC m=+1166.580744198" watchObservedRunningTime="2025-10-08 18:30:10.677095704 +0000 UTC m=+1166.590066717" Oct 08 18:30:10 crc kubenswrapper[4750]: I1008 18:30:10.707594 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-788b97745d-6snpn" 
podStartSLOduration=5.70757853 podStartE2EDuration="5.70757853s" podCreationTimestamp="2025-10-08 18:30:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:30:10.704676869 +0000 UTC m=+1166.617647892" watchObservedRunningTime="2025-10-08 18:30:10.70757853 +0000 UTC m=+1166.620549543" Oct 08 18:30:10 crc kubenswrapper[4750]: I1008 18:30:10.748628 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6964604e-3015-4e12-b64e-83b733af5806" path="/var/lib/kubelet/pods/6964604e-3015-4e12-b64e-83b733af5806/volumes" Oct 08 18:30:11 crc kubenswrapper[4750]: I1008 18:30:11.574690 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9cb888f-9b4vd" event={"ID":"1b533003-3aca-4347-b38f-ee183857019f","Type":"ContainerStarted","Data":"9b9a79cc163461c513b6babd7c8c7f2494ff1089febcdf1d14fb0472e402a175"} Oct 08 18:30:11 crc kubenswrapper[4750]: I1008 18:30:11.574981 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54f9cb888f-9b4vd" Oct 08 18:30:11 crc kubenswrapper[4750]: I1008 18:30:11.578465 4750 generic.go:334] "Generic (PLEG): container finished" podID="5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a" containerID="d42c941b59852316ae7c9b15a016afa287735faf08d424f634bb90d6cfe6273d" exitCode=0 Oct 08 18:30:11 crc kubenswrapper[4750]: I1008 18:30:11.578501 4750 generic.go:334] "Generic (PLEG): container finished" podID="5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a" containerID="cb4fd7025dd8ba93695bc91efa2612277cfde608b5b1db67db82a71242e82d1e" exitCode=2 Oct 08 18:30:11 crc kubenswrapper[4750]: I1008 18:30:11.578516 4750 generic.go:334] "Generic (PLEG): container finished" podID="5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a" containerID="32e882a4c502a439a479c8ea6d79da722ef53840fad55da94fdcab4bd196ec16" exitCode=0 Oct 08 18:30:11 crc kubenswrapper[4750]: I1008 18:30:11.578527 4750 generic.go:334] "Generic (PLEG): 
container finished" podID="5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a" containerID="c1307d892a277faff4013cf95858ca39bd78853297e2048184e8f1c187e0201b" exitCode=0 Oct 08 18:30:11 crc kubenswrapper[4750]: I1008 18:30:11.578528 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a","Type":"ContainerDied","Data":"d42c941b59852316ae7c9b15a016afa287735faf08d424f634bb90d6cfe6273d"} Oct 08 18:30:11 crc kubenswrapper[4750]: I1008 18:30:11.578722 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a","Type":"ContainerDied","Data":"cb4fd7025dd8ba93695bc91efa2612277cfde608b5b1db67db82a71242e82d1e"} Oct 08 18:30:11 crc kubenswrapper[4750]: I1008 18:30:11.578747 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a","Type":"ContainerDied","Data":"32e882a4c502a439a479c8ea6d79da722ef53840fad55da94fdcab4bd196ec16"} Oct 08 18:30:11 crc kubenswrapper[4750]: I1008 18:30:11.578760 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a","Type":"ContainerDied","Data":"c1307d892a277faff4013cf95858ca39bd78853297e2048184e8f1c187e0201b"} Oct 08 18:30:11 crc kubenswrapper[4750]: I1008 18:30:11.617535 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54f9cb888f-9b4vd" podStartSLOduration=3.617517764 podStartE2EDuration="3.617517764s" podCreationTimestamp="2025-10-08 18:30:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:30:11.612455081 +0000 UTC m=+1167.525426094" watchObservedRunningTime="2025-10-08 18:30:11.617517764 +0000 UTC m=+1167.530488767" Oct 08 18:30:11 crc kubenswrapper[4750]: I1008 18:30:11.664756 4750 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 08 18:30:11 crc kubenswrapper[4750]: I1008 18:30:11.664797 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 08 18:30:11 crc kubenswrapper[4750]: I1008 18:30:11.708441 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 08 18:30:11 crc kubenswrapper[4750]: I1008 18:30:11.748106 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 08 18:30:11 crc kubenswrapper[4750]: I1008 18:30:11.807081 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-594d9fc688-28msd"] Oct 08 18:30:11 crc kubenswrapper[4750]: E1008 18:30:11.807471 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6964604e-3015-4e12-b64e-83b733af5806" containerName="init" Oct 08 18:30:11 crc kubenswrapper[4750]: I1008 18:30:11.807489 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="6964604e-3015-4e12-b64e-83b733af5806" containerName="init" Oct 08 18:30:11 crc kubenswrapper[4750]: E1008 18:30:11.807529 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6964604e-3015-4e12-b64e-83b733af5806" containerName="dnsmasq-dns" Oct 08 18:30:11 crc kubenswrapper[4750]: I1008 18:30:11.807537 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="6964604e-3015-4e12-b64e-83b733af5806" containerName="dnsmasq-dns" Oct 08 18:30:11 crc kubenswrapper[4750]: I1008 18:30:11.807711 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="6964604e-3015-4e12-b64e-83b733af5806" containerName="dnsmasq-dns" Oct 08 18:30:11 crc kubenswrapper[4750]: I1008 18:30:11.808791 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-594d9fc688-28msd" Oct 08 18:30:11 crc kubenswrapper[4750]: I1008 18:30:11.812951 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 08 18:30:11 crc kubenswrapper[4750]: I1008 18:30:11.813164 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 08 18:30:11 crc kubenswrapper[4750]: I1008 18:30:11.821203 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-594d9fc688-28msd"] Oct 08 18:30:11 crc kubenswrapper[4750]: I1008 18:30:11.921499 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e0647ee-68b0-4b8b-9bf0-066dc7a274d0-public-tls-certs\") pod \"barbican-api-594d9fc688-28msd\" (UID: \"6e0647ee-68b0-4b8b-9bf0-066dc7a274d0\") " pod="openstack/barbican-api-594d9fc688-28msd" Oct 08 18:30:11 crc kubenswrapper[4750]: I1008 18:30:11.921563 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxv7n\" (UniqueName: \"kubernetes.io/projected/6e0647ee-68b0-4b8b-9bf0-066dc7a274d0-kube-api-access-hxv7n\") pod \"barbican-api-594d9fc688-28msd\" (UID: \"6e0647ee-68b0-4b8b-9bf0-066dc7a274d0\") " pod="openstack/barbican-api-594d9fc688-28msd" Oct 08 18:30:11 crc kubenswrapper[4750]: I1008 18:30:11.921586 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e0647ee-68b0-4b8b-9bf0-066dc7a274d0-config-data\") pod \"barbican-api-594d9fc688-28msd\" (UID: \"6e0647ee-68b0-4b8b-9bf0-066dc7a274d0\") " pod="openstack/barbican-api-594d9fc688-28msd" Oct 08 18:30:11 crc kubenswrapper[4750]: I1008 18:30:11.921631 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/6e0647ee-68b0-4b8b-9bf0-066dc7a274d0-combined-ca-bundle\") pod \"barbican-api-594d9fc688-28msd\" (UID: \"6e0647ee-68b0-4b8b-9bf0-066dc7a274d0\") " pod="openstack/barbican-api-594d9fc688-28msd" Oct 08 18:30:11 crc kubenswrapper[4750]: I1008 18:30:11.921647 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e0647ee-68b0-4b8b-9bf0-066dc7a274d0-internal-tls-certs\") pod \"barbican-api-594d9fc688-28msd\" (UID: \"6e0647ee-68b0-4b8b-9bf0-066dc7a274d0\") " pod="openstack/barbican-api-594d9fc688-28msd" Oct 08 18:30:11 crc kubenswrapper[4750]: I1008 18:30:11.921675 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e0647ee-68b0-4b8b-9bf0-066dc7a274d0-logs\") pod \"barbican-api-594d9fc688-28msd\" (UID: \"6e0647ee-68b0-4b8b-9bf0-066dc7a274d0\") " pod="openstack/barbican-api-594d9fc688-28msd" Oct 08 18:30:11 crc kubenswrapper[4750]: I1008 18:30:11.926775 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e0647ee-68b0-4b8b-9bf0-066dc7a274d0-config-data-custom\") pod \"barbican-api-594d9fc688-28msd\" (UID: \"6e0647ee-68b0-4b8b-9bf0-066dc7a274d0\") " pod="openstack/barbican-api-594d9fc688-28msd" Oct 08 18:30:11 crc kubenswrapper[4750]: I1008 18:30:11.976411 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-dd8dn" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.028992 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxv7n\" (UniqueName: \"kubernetes.io/projected/6e0647ee-68b0-4b8b-9bf0-066dc7a274d0-kube-api-access-hxv7n\") pod \"barbican-api-594d9fc688-28msd\" (UID: \"6e0647ee-68b0-4b8b-9bf0-066dc7a274d0\") " pod="openstack/barbican-api-594d9fc688-28msd" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.029042 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e0647ee-68b0-4b8b-9bf0-066dc7a274d0-config-data\") pod \"barbican-api-594d9fc688-28msd\" (UID: \"6e0647ee-68b0-4b8b-9bf0-066dc7a274d0\") " pod="openstack/barbican-api-594d9fc688-28msd" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.029109 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e0647ee-68b0-4b8b-9bf0-066dc7a274d0-internal-tls-certs\") pod \"barbican-api-594d9fc688-28msd\" (UID: \"6e0647ee-68b0-4b8b-9bf0-066dc7a274d0\") " pod="openstack/barbican-api-594d9fc688-28msd" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.029132 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e0647ee-68b0-4b8b-9bf0-066dc7a274d0-combined-ca-bundle\") pod \"barbican-api-594d9fc688-28msd\" (UID: \"6e0647ee-68b0-4b8b-9bf0-066dc7a274d0\") " pod="openstack/barbican-api-594d9fc688-28msd" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.029176 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e0647ee-68b0-4b8b-9bf0-066dc7a274d0-logs\") pod \"barbican-api-594d9fc688-28msd\" (UID: \"6e0647ee-68b0-4b8b-9bf0-066dc7a274d0\") " 
pod="openstack/barbican-api-594d9fc688-28msd" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.029262 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e0647ee-68b0-4b8b-9bf0-066dc7a274d0-config-data-custom\") pod \"barbican-api-594d9fc688-28msd\" (UID: \"6e0647ee-68b0-4b8b-9bf0-066dc7a274d0\") " pod="openstack/barbican-api-594d9fc688-28msd" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.029344 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e0647ee-68b0-4b8b-9bf0-066dc7a274d0-public-tls-certs\") pod \"barbican-api-594d9fc688-28msd\" (UID: \"6e0647ee-68b0-4b8b-9bf0-066dc7a274d0\") " pod="openstack/barbican-api-594d9fc688-28msd" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.030297 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e0647ee-68b0-4b8b-9bf0-066dc7a274d0-logs\") pod \"barbican-api-594d9fc688-28msd\" (UID: \"6e0647ee-68b0-4b8b-9bf0-066dc7a274d0\") " pod="openstack/barbican-api-594d9fc688-28msd" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.033678 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e0647ee-68b0-4b8b-9bf0-066dc7a274d0-public-tls-certs\") pod \"barbican-api-594d9fc688-28msd\" (UID: \"6e0647ee-68b0-4b8b-9bf0-066dc7a274d0\") " pod="openstack/barbican-api-594d9fc688-28msd" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.036018 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e0647ee-68b0-4b8b-9bf0-066dc7a274d0-config-data-custom\") pod \"barbican-api-594d9fc688-28msd\" (UID: \"6e0647ee-68b0-4b8b-9bf0-066dc7a274d0\") " pod="openstack/barbican-api-594d9fc688-28msd" Oct 08 18:30:12 crc 
kubenswrapper[4750]: I1008 18:30:12.036982 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e0647ee-68b0-4b8b-9bf0-066dc7a274d0-config-data\") pod \"barbican-api-594d9fc688-28msd\" (UID: \"6e0647ee-68b0-4b8b-9bf0-066dc7a274d0\") " pod="openstack/barbican-api-594d9fc688-28msd" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.038204 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e0647ee-68b0-4b8b-9bf0-066dc7a274d0-internal-tls-certs\") pod \"barbican-api-594d9fc688-28msd\" (UID: \"6e0647ee-68b0-4b8b-9bf0-066dc7a274d0\") " pod="openstack/barbican-api-594d9fc688-28msd" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.049519 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxv7n\" (UniqueName: \"kubernetes.io/projected/6e0647ee-68b0-4b8b-9bf0-066dc7a274d0-kube-api-access-hxv7n\") pod \"barbican-api-594d9fc688-28msd\" (UID: \"6e0647ee-68b0-4b8b-9bf0-066dc7a274d0\") " pod="openstack/barbican-api-594d9fc688-28msd" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.050117 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.051994 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e0647ee-68b0-4b8b-9bf0-066dc7a274d0-combined-ca-bundle\") pod \"barbican-api-594d9fc688-28msd\" (UID: \"6e0647ee-68b0-4b8b-9bf0-066dc7a274d0\") " pod="openstack/barbican-api-594d9fc688-28msd" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.130540 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a-sg-core-conf-yaml\") pod \"5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a\" (UID: \"5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a\") " Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.130683 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41b4bf15-850b-4082-8053-aee61b75dc58-combined-ca-bundle\") pod \"41b4bf15-850b-4082-8053-aee61b75dc58\" (UID: \"41b4bf15-850b-4082-8053-aee61b75dc58\") " Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.130765 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a-config-data\") pod \"5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a\" (UID: \"5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a\") " Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.130896 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5m6wx\" (UniqueName: \"kubernetes.io/projected/5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a-kube-api-access-5m6wx\") pod \"5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a\" (UID: \"5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a\") " Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.130961 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a-run-httpd\") pod \"5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a\" (UID: \"5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a\") " Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.130989 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/41b4bf15-850b-4082-8053-aee61b75dc58-config\") pod \"41b4bf15-850b-4082-8053-aee61b75dc58\" (UID: \"41b4bf15-850b-4082-8053-aee61b75dc58\") " Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.131028 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6x72l\" (UniqueName: \"kubernetes.io/projected/41b4bf15-850b-4082-8053-aee61b75dc58-kube-api-access-6x72l\") pod \"41b4bf15-850b-4082-8053-aee61b75dc58\" (UID: \"41b4bf15-850b-4082-8053-aee61b75dc58\") " Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.131066 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a-scripts\") pod \"5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a\" (UID: \"5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a\") " Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.131085 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a-combined-ca-bundle\") pod \"5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a\" (UID: \"5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a\") " Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.131112 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a-log-httpd\") pod \"5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a\" (UID: \"5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a\") " Oct 08 18:30:12 crc 
kubenswrapper[4750]: I1008 18:30:12.131454 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a" (UID: "5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.131648 4750 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.131918 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a" (UID: "5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.140665 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a-kube-api-access-5m6wx" (OuterVolumeSpecName: "kube-api-access-5m6wx") pod "5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a" (UID: "5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a"). InnerVolumeSpecName "kube-api-access-5m6wx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.152307 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a-scripts" (OuterVolumeSpecName: "scripts") pod "5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a" (UID: "5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.153312 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41b4bf15-850b-4082-8053-aee61b75dc58-kube-api-access-6x72l" (OuterVolumeSpecName: "kube-api-access-6x72l") pod "41b4bf15-850b-4082-8053-aee61b75dc58" (UID: "41b4bf15-850b-4082-8053-aee61b75dc58"). InnerVolumeSpecName "kube-api-access-6x72l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.180056 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-594d9fc688-28msd" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.200177 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41b4bf15-850b-4082-8053-aee61b75dc58-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "41b4bf15-850b-4082-8053-aee61b75dc58" (UID: "41b4bf15-850b-4082-8053-aee61b75dc58"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.215131 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a" (UID: "5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.230092 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41b4bf15-850b-4082-8053-aee61b75dc58-config" (OuterVolumeSpecName: "config") pod "41b4bf15-850b-4082-8053-aee61b75dc58" (UID: "41b4bf15-850b-4082-8053-aee61b75dc58"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.234380 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5m6wx\" (UniqueName: \"kubernetes.io/projected/5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a-kube-api-access-5m6wx\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.234423 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/41b4bf15-850b-4082-8053-aee61b75dc58-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.234435 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6x72l\" (UniqueName: \"kubernetes.io/projected/41b4bf15-850b-4082-8053-aee61b75dc58-kube-api-access-6x72l\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.234445 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.234457 4750 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.234468 4750 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.234479 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41b4bf15-850b-4082-8053-aee61b75dc58-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.281463 4750 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a" (UID: "5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.297150 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a-config-data" (OuterVolumeSpecName: "config-data") pod "5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a" (UID: "5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.336307 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.336349 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.588451 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-689cf77786-nkzv6" event={"ID":"9ecf0d73-0ca5-4124-93fb-348f8769c2e2","Type":"ContainerStarted","Data":"b88fc77d21b840724590332b8c3c0a46a81b3b88285cff5ea88ef4c54f55690c"} Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.588771 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-689cf77786-nkzv6" event={"ID":"9ecf0d73-0ca5-4124-93fb-348f8769c2e2","Type":"ContainerStarted","Data":"f8c62ba9b4feac7a06f75a6ce1e507df508740e668c07ed3f788844322a15de7"} 
Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.592039 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-587d5f7b59-ws4tc" event={"ID":"28550569-4c3c-48cf-a621-eddec0919b51","Type":"ContainerStarted","Data":"1a151318fef849fd5d62d1d54d979f4d4b38f1111627d51da48a87b928387362"} Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.592183 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-587d5f7b59-ws4tc" event={"ID":"28550569-4c3c-48cf-a621-eddec0919b51","Type":"ContainerStarted","Data":"3dac308349d8e3040066022cf2273a5c40cf9dfc274269fe30212fe125c4aad8"} Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.594274 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a","Type":"ContainerDied","Data":"3988604e7459185f046c114054933e191f3654a1e3fd4ba326cda80726ed88d7"} Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.594371 4750 scope.go:117] "RemoveContainer" containerID="d42c941b59852316ae7c9b15a016afa287735faf08d424f634bb90d6cfe6273d" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.594279 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.595808 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-dd8dn" event={"ID":"41b4bf15-850b-4082-8053-aee61b75dc58","Type":"ContainerDied","Data":"beacea0cf8e43c114ef7cdb955f49ed9c5dd283d74693b64b725bf9f77656ab2"} Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.595945 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="beacea0cf8e43c114ef7cdb955f49ed9c5dd283d74693b64b725bf9f77656ab2" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.596361 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.596462 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-dd8dn" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.596523 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.614087 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-689cf77786-nkzv6" podStartSLOduration=2.493086625 podStartE2EDuration="4.614068801s" podCreationTimestamp="2025-10-08 18:30:08 +0000 UTC" firstStartedPulling="2025-10-08 18:30:09.529837034 +0000 UTC m=+1165.442808047" lastFinishedPulling="2025-10-08 18:30:11.65081921 +0000 UTC m=+1167.563790223" observedRunningTime="2025-10-08 18:30:12.607932251 +0000 UTC m=+1168.520903264" watchObservedRunningTime="2025-10-08 18:30:12.614068801 +0000 UTC m=+1168.527039804" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.623608 4750 scope.go:117] "RemoveContainer" containerID="cb4fd7025dd8ba93695bc91efa2612277cfde608b5b1db67db82a71242e82d1e" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.648849 
4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-587d5f7b59-ws4tc" podStartSLOduration=2.651398234 podStartE2EDuration="4.648824233s" podCreationTimestamp="2025-10-08 18:30:08 +0000 UTC" firstStartedPulling="2025-10-08 18:30:09.659933652 +0000 UTC m=+1165.572904665" lastFinishedPulling="2025-10-08 18:30:11.657359651 +0000 UTC m=+1167.570330664" observedRunningTime="2025-10-08 18:30:12.630931444 +0000 UTC m=+1168.543902477" watchObservedRunningTime="2025-10-08 18:30:12.648824233 +0000 UTC m=+1168.561795266" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.650288 4750 scope.go:117] "RemoveContainer" containerID="32e882a4c502a439a479c8ea6d79da722ef53840fad55da94fdcab4bd196ec16" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.657667 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.658577 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.670136 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.685642 4750 scope.go:117] "RemoveContainer" containerID="c1307d892a277faff4013cf95858ca39bd78853297e2048184e8f1c187e0201b" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.686252 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.720745 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 08 18:30:12 crc kubenswrapper[4750]: E1008 18:30:12.721127 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a" containerName="ceilometer-central-agent" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 
18:30:12.721138 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a" containerName="ceilometer-central-agent" Oct 08 18:30:12 crc kubenswrapper[4750]: E1008 18:30:12.721157 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41b4bf15-850b-4082-8053-aee61b75dc58" containerName="neutron-db-sync" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.721164 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="41b4bf15-850b-4082-8053-aee61b75dc58" containerName="neutron-db-sync" Oct 08 18:30:12 crc kubenswrapper[4750]: E1008 18:30:12.721177 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a" containerName="proxy-httpd" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.721183 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a" containerName="proxy-httpd" Oct 08 18:30:12 crc kubenswrapper[4750]: E1008 18:30:12.721195 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a" containerName="sg-core" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.721201 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a" containerName="sg-core" Oct 08 18:30:12 crc kubenswrapper[4750]: E1008 18:30:12.721222 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a" containerName="ceilometer-notification-agent" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.721228 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a" containerName="ceilometer-notification-agent" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.721399 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a" containerName="proxy-httpd" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 
18:30:12.721411 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a" containerName="sg-core" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.721422 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="41b4bf15-850b-4082-8053-aee61b75dc58" containerName="neutron-db-sync" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.721433 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a" containerName="ceilometer-notification-agent" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.721444 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a" containerName="ceilometer-central-agent" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.723214 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.732010 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.732225 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.757444 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a" path="/var/lib/kubelet/pods/5ddbbe8f-20e9-4f77-a3da-ecec1f700a5a/volumes" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.758185 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.758217 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-594d9fc688-28msd"] Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.789049 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-internal-api-0" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.799870 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54f9cb888f-9b4vd"] Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.803536 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 08 18:30:12 crc kubenswrapper[4750]: W1008 18:30:12.808693 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e0647ee_68b0_4b8b_9bf0_066dc7a274d0.slice/crio-117039dd02728f4fc08e14b3a5a43fb419d8337ae8cb024f43567ad225a8798b WatchSource:0}: Error finding container 117039dd02728f4fc08e14b3a5a43fb419d8337ae8cb024f43567ad225a8798b: Status 404 returned error can't find the container with id 117039dd02728f4fc08e14b3a5a43fb419d8337ae8cb024f43567ad225a8798b Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.844534 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b09ec256-29a4-497c-a23c-eef4c9cd47c8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b09ec256-29a4-497c-a23c-eef4c9cd47c8\") " pod="openstack/ceilometer-0" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.844655 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b09ec256-29a4-497c-a23c-eef4c9cd47c8-scripts\") pod \"ceilometer-0\" (UID: \"b09ec256-29a4-497c-a23c-eef4c9cd47c8\") " pod="openstack/ceilometer-0" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.844726 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b09ec256-29a4-497c-a23c-eef4c9cd47c8-log-httpd\") pod \"ceilometer-0\" (UID: 
\"b09ec256-29a4-497c-a23c-eef4c9cd47c8\") " pod="openstack/ceilometer-0" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.844742 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b09ec256-29a4-497c-a23c-eef4c9cd47c8-config-data\") pod \"ceilometer-0\" (UID: \"b09ec256-29a4-497c-a23c-eef4c9cd47c8\") " pod="openstack/ceilometer-0" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.844757 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddgbr\" (UniqueName: \"kubernetes.io/projected/b09ec256-29a4-497c-a23c-eef4c9cd47c8-kube-api-access-ddgbr\") pod \"ceilometer-0\" (UID: \"b09ec256-29a4-497c-a23c-eef4c9cd47c8\") " pod="openstack/ceilometer-0" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.844774 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b09ec256-29a4-497c-a23c-eef4c9cd47c8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b09ec256-29a4-497c-a23c-eef4c9cd47c8\") " pod="openstack/ceilometer-0" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.844800 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b09ec256-29a4-497c-a23c-eef4c9cd47c8-run-httpd\") pod \"ceilometer-0\" (UID: \"b09ec256-29a4-497c-a23c-eef4c9cd47c8\") " pod="openstack/ceilometer-0" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.847671 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c78787df7-xttcf"] Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.849054 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c78787df7-xttcf" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.875947 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c78787df7-xttcf"] Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.946989 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b09ec256-29a4-497c-a23c-eef4c9cd47c8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b09ec256-29a4-497c-a23c-eef4c9cd47c8\") " pod="openstack/ceilometer-0" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.947071 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5a44a1f2-be71-4ae0-b115-9ca979c931be-ovsdbserver-sb\") pod \"dnsmasq-dns-5c78787df7-xttcf\" (UID: \"5a44a1f2-be71-4ae0-b115-9ca979c931be\") " pod="openstack/dnsmasq-dns-5c78787df7-xttcf" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.947112 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a44a1f2-be71-4ae0-b115-9ca979c931be-dns-svc\") pod \"dnsmasq-dns-5c78787df7-xttcf\" (UID: \"5a44a1f2-be71-4ae0-b115-9ca979c931be\") " pod="openstack/dnsmasq-dns-5c78787df7-xttcf" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.947161 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b09ec256-29a4-497c-a23c-eef4c9cd47c8-scripts\") pod \"ceilometer-0\" (UID: \"b09ec256-29a4-497c-a23c-eef4c9cd47c8\") " pod="openstack/ceilometer-0" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.947187 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/5a44a1f2-be71-4ae0-b115-9ca979c931be-dns-swift-storage-0\") pod \"dnsmasq-dns-5c78787df7-xttcf\" (UID: \"5a44a1f2-be71-4ae0-b115-9ca979c931be\") " pod="openstack/dnsmasq-dns-5c78787df7-xttcf" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.947265 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a44a1f2-be71-4ae0-b115-9ca979c931be-config\") pod \"dnsmasq-dns-5c78787df7-xttcf\" (UID: \"5a44a1f2-be71-4ae0-b115-9ca979c931be\") " pod="openstack/dnsmasq-dns-5c78787df7-xttcf" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.947304 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b09ec256-29a4-497c-a23c-eef4c9cd47c8-log-httpd\") pod \"ceilometer-0\" (UID: \"b09ec256-29a4-497c-a23c-eef4c9cd47c8\") " pod="openstack/ceilometer-0" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.947319 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b09ec256-29a4-497c-a23c-eef4c9cd47c8-config-data\") pod \"ceilometer-0\" (UID: \"b09ec256-29a4-497c-a23c-eef4c9cd47c8\") " pod="openstack/ceilometer-0" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.947336 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddgbr\" (UniqueName: \"kubernetes.io/projected/b09ec256-29a4-497c-a23c-eef4c9cd47c8-kube-api-access-ddgbr\") pod \"ceilometer-0\" (UID: \"b09ec256-29a4-497c-a23c-eef4c9cd47c8\") " pod="openstack/ceilometer-0" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.947380 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b09ec256-29a4-497c-a23c-eef4c9cd47c8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"b09ec256-29a4-497c-a23c-eef4c9cd47c8\") " pod="openstack/ceilometer-0" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.947404 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b09ec256-29a4-497c-a23c-eef4c9cd47c8-run-httpd\") pod \"ceilometer-0\" (UID: \"b09ec256-29a4-497c-a23c-eef4c9cd47c8\") " pod="openstack/ceilometer-0" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.947422 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9qgn\" (UniqueName: \"kubernetes.io/projected/5a44a1f2-be71-4ae0-b115-9ca979c931be-kube-api-access-g9qgn\") pod \"dnsmasq-dns-5c78787df7-xttcf\" (UID: \"5a44a1f2-be71-4ae0-b115-9ca979c931be\") " pod="openstack/dnsmasq-dns-5c78787df7-xttcf" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.947445 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5a44a1f2-be71-4ae0-b115-9ca979c931be-ovsdbserver-nb\") pod \"dnsmasq-dns-5c78787df7-xttcf\" (UID: \"5a44a1f2-be71-4ae0-b115-9ca979c931be\") " pod="openstack/dnsmasq-dns-5c78787df7-xttcf" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.947934 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b09ec256-29a4-497c-a23c-eef4c9cd47c8-log-httpd\") pod \"ceilometer-0\" (UID: \"b09ec256-29a4-497c-a23c-eef4c9cd47c8\") " pod="openstack/ceilometer-0" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.948428 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b09ec256-29a4-497c-a23c-eef4c9cd47c8-run-httpd\") pod \"ceilometer-0\" (UID: \"b09ec256-29a4-497c-a23c-eef4c9cd47c8\") " pod="openstack/ceilometer-0" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.955348 4750 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b09ec256-29a4-497c-a23c-eef4c9cd47c8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b09ec256-29a4-497c-a23c-eef4c9cd47c8\") " pod="openstack/ceilometer-0" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.956027 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b09ec256-29a4-497c-a23c-eef4c9cd47c8-config-data\") pod \"ceilometer-0\" (UID: \"b09ec256-29a4-497c-a23c-eef4c9cd47c8\") " pod="openstack/ceilometer-0" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.956678 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b09ec256-29a4-497c-a23c-eef4c9cd47c8-scripts\") pod \"ceilometer-0\" (UID: \"b09ec256-29a4-497c-a23c-eef4c9cd47c8\") " pod="openstack/ceilometer-0" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.957527 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b09ec256-29a4-497c-a23c-eef4c9cd47c8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b09ec256-29a4-497c-a23c-eef4c9cd47c8\") " pod="openstack/ceilometer-0" Oct 08 18:30:12 crc kubenswrapper[4750]: I1008 18:30:12.975297 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddgbr\" (UniqueName: \"kubernetes.io/projected/b09ec256-29a4-497c-a23c-eef4c9cd47c8-kube-api-access-ddgbr\") pod \"ceilometer-0\" (UID: \"b09ec256-29a4-497c-a23c-eef4c9cd47c8\") " pod="openstack/ceilometer-0" Oct 08 18:30:13 crc kubenswrapper[4750]: I1008 18:30:13.024025 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5c5ddbcd5b-g9rd8"] Oct 08 18:30:13 crc kubenswrapper[4750]: I1008 18:30:13.025646 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5c5ddbcd5b-g9rd8" Oct 08 18:30:13 crc kubenswrapper[4750]: I1008 18:30:13.029440 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-j2bkb" Oct 08 18:30:13 crc kubenswrapper[4750]: I1008 18:30:13.029456 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 08 18:30:13 crc kubenswrapper[4750]: I1008 18:30:13.029721 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 08 18:30:13 crc kubenswrapper[4750]: I1008 18:30:13.029831 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 08 18:30:13 crc kubenswrapper[4750]: I1008 18:30:13.040490 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5c5ddbcd5b-g9rd8"] Oct 08 18:30:13 crc kubenswrapper[4750]: I1008 18:30:13.049381 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5a44a1f2-be71-4ae0-b115-9ca979c931be-ovsdbserver-sb\") pod \"dnsmasq-dns-5c78787df7-xttcf\" (UID: \"5a44a1f2-be71-4ae0-b115-9ca979c931be\") " pod="openstack/dnsmasq-dns-5c78787df7-xttcf" Oct 08 18:30:13 crc kubenswrapper[4750]: I1008 18:30:13.049467 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a44a1f2-be71-4ae0-b115-9ca979c931be-dns-svc\") pod \"dnsmasq-dns-5c78787df7-xttcf\" (UID: \"5a44a1f2-be71-4ae0-b115-9ca979c931be\") " pod="openstack/dnsmasq-dns-5c78787df7-xttcf" Oct 08 18:30:13 crc kubenswrapper[4750]: I1008 18:30:13.049513 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5a44a1f2-be71-4ae0-b115-9ca979c931be-dns-swift-storage-0\") pod \"dnsmasq-dns-5c78787df7-xttcf\" (UID: 
\"5a44a1f2-be71-4ae0-b115-9ca979c931be\") " pod="openstack/dnsmasq-dns-5c78787df7-xttcf" Oct 08 18:30:13 crc kubenswrapper[4750]: I1008 18:30:13.049571 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a44a1f2-be71-4ae0-b115-9ca979c931be-config\") pod \"dnsmasq-dns-5c78787df7-xttcf\" (UID: \"5a44a1f2-be71-4ae0-b115-9ca979c931be\") " pod="openstack/dnsmasq-dns-5c78787df7-xttcf" Oct 08 18:30:13 crc kubenswrapper[4750]: I1008 18:30:13.049624 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9qgn\" (UniqueName: \"kubernetes.io/projected/5a44a1f2-be71-4ae0-b115-9ca979c931be-kube-api-access-g9qgn\") pod \"dnsmasq-dns-5c78787df7-xttcf\" (UID: \"5a44a1f2-be71-4ae0-b115-9ca979c931be\") " pod="openstack/dnsmasq-dns-5c78787df7-xttcf" Oct 08 18:30:13 crc kubenswrapper[4750]: I1008 18:30:13.049696 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5a44a1f2-be71-4ae0-b115-9ca979c931be-ovsdbserver-nb\") pod \"dnsmasq-dns-5c78787df7-xttcf\" (UID: \"5a44a1f2-be71-4ae0-b115-9ca979c931be\") " pod="openstack/dnsmasq-dns-5c78787df7-xttcf" Oct 08 18:30:13 crc kubenswrapper[4750]: I1008 18:30:13.051029 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5a44a1f2-be71-4ae0-b115-9ca979c931be-ovsdbserver-nb\") pod \"dnsmasq-dns-5c78787df7-xttcf\" (UID: \"5a44a1f2-be71-4ae0-b115-9ca979c931be\") " pod="openstack/dnsmasq-dns-5c78787df7-xttcf" Oct 08 18:30:13 crc kubenswrapper[4750]: I1008 18:30:13.051042 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5a44a1f2-be71-4ae0-b115-9ca979c931be-dns-swift-storage-0\") pod \"dnsmasq-dns-5c78787df7-xttcf\" (UID: \"5a44a1f2-be71-4ae0-b115-9ca979c931be\") " 
pod="openstack/dnsmasq-dns-5c78787df7-xttcf" Oct 08 18:30:13 crc kubenswrapper[4750]: I1008 18:30:13.051714 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5a44a1f2-be71-4ae0-b115-9ca979c931be-ovsdbserver-sb\") pod \"dnsmasq-dns-5c78787df7-xttcf\" (UID: \"5a44a1f2-be71-4ae0-b115-9ca979c931be\") " pod="openstack/dnsmasq-dns-5c78787df7-xttcf" Oct 08 18:30:13 crc kubenswrapper[4750]: I1008 18:30:13.051725 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a44a1f2-be71-4ae0-b115-9ca979c931be-config\") pod \"dnsmasq-dns-5c78787df7-xttcf\" (UID: \"5a44a1f2-be71-4ae0-b115-9ca979c931be\") " pod="openstack/dnsmasq-dns-5c78787df7-xttcf" Oct 08 18:30:13 crc kubenswrapper[4750]: I1008 18:30:13.052221 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a44a1f2-be71-4ae0-b115-9ca979c931be-dns-svc\") pod \"dnsmasq-dns-5c78787df7-xttcf\" (UID: \"5a44a1f2-be71-4ae0-b115-9ca979c931be\") " pod="openstack/dnsmasq-dns-5c78787df7-xttcf" Oct 08 18:30:13 crc kubenswrapper[4750]: I1008 18:30:13.072872 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 18:30:13 crc kubenswrapper[4750]: I1008 18:30:13.105696 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9qgn\" (UniqueName: \"kubernetes.io/projected/5a44a1f2-be71-4ae0-b115-9ca979c931be-kube-api-access-g9qgn\") pod \"dnsmasq-dns-5c78787df7-xttcf\" (UID: \"5a44a1f2-be71-4ae0-b115-9ca979c931be\") " pod="openstack/dnsmasq-dns-5c78787df7-xttcf" Oct 08 18:30:13 crc kubenswrapper[4750]: I1008 18:30:13.150973 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42135b66-6641-4c85-9958-ae210a3de33f-combined-ca-bundle\") pod \"neutron-5c5ddbcd5b-g9rd8\" (UID: \"42135b66-6641-4c85-9958-ae210a3de33f\") " pod="openstack/neutron-5c5ddbcd5b-g9rd8" Oct 08 18:30:13 crc kubenswrapper[4750]: I1008 18:30:13.151032 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/42135b66-6641-4c85-9958-ae210a3de33f-ovndb-tls-certs\") pod \"neutron-5c5ddbcd5b-g9rd8\" (UID: \"42135b66-6641-4c85-9958-ae210a3de33f\") " pod="openstack/neutron-5c5ddbcd5b-g9rd8" Oct 08 18:30:13 crc kubenswrapper[4750]: I1008 18:30:13.151059 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/42135b66-6641-4c85-9958-ae210a3de33f-config\") pod \"neutron-5c5ddbcd5b-g9rd8\" (UID: \"42135b66-6641-4c85-9958-ae210a3de33f\") " pod="openstack/neutron-5c5ddbcd5b-g9rd8" Oct 08 18:30:13 crc kubenswrapper[4750]: I1008 18:30:13.151127 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/42135b66-6641-4c85-9958-ae210a3de33f-httpd-config\") pod \"neutron-5c5ddbcd5b-g9rd8\" (UID: \"42135b66-6641-4c85-9958-ae210a3de33f\") " 
pod="openstack/neutron-5c5ddbcd5b-g9rd8" Oct 08 18:30:13 crc kubenswrapper[4750]: I1008 18:30:13.151149 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdqct\" (UniqueName: \"kubernetes.io/projected/42135b66-6641-4c85-9958-ae210a3de33f-kube-api-access-jdqct\") pod \"neutron-5c5ddbcd5b-g9rd8\" (UID: \"42135b66-6641-4c85-9958-ae210a3de33f\") " pod="openstack/neutron-5c5ddbcd5b-g9rd8" Oct 08 18:30:13 crc kubenswrapper[4750]: I1008 18:30:13.187056 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c78787df7-xttcf" Oct 08 18:30:13 crc kubenswrapper[4750]: I1008 18:30:13.290848 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42135b66-6641-4c85-9958-ae210a3de33f-combined-ca-bundle\") pod \"neutron-5c5ddbcd5b-g9rd8\" (UID: \"42135b66-6641-4c85-9958-ae210a3de33f\") " pod="openstack/neutron-5c5ddbcd5b-g9rd8" Oct 08 18:30:13 crc kubenswrapper[4750]: I1008 18:30:13.290920 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/42135b66-6641-4c85-9958-ae210a3de33f-ovndb-tls-certs\") pod \"neutron-5c5ddbcd5b-g9rd8\" (UID: \"42135b66-6641-4c85-9958-ae210a3de33f\") " pod="openstack/neutron-5c5ddbcd5b-g9rd8" Oct 08 18:30:13 crc kubenswrapper[4750]: I1008 18:30:13.290953 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/42135b66-6641-4c85-9958-ae210a3de33f-config\") pod \"neutron-5c5ddbcd5b-g9rd8\" (UID: \"42135b66-6641-4c85-9958-ae210a3de33f\") " pod="openstack/neutron-5c5ddbcd5b-g9rd8" Oct 08 18:30:13 crc kubenswrapper[4750]: I1008 18:30:13.291053 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/42135b66-6641-4c85-9958-ae210a3de33f-httpd-config\") pod \"neutron-5c5ddbcd5b-g9rd8\" (UID: \"42135b66-6641-4c85-9958-ae210a3de33f\") " pod="openstack/neutron-5c5ddbcd5b-g9rd8" Oct 08 18:30:13 crc kubenswrapper[4750]: I1008 18:30:13.291084 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdqct\" (UniqueName: \"kubernetes.io/projected/42135b66-6641-4c85-9958-ae210a3de33f-kube-api-access-jdqct\") pod \"neutron-5c5ddbcd5b-g9rd8\" (UID: \"42135b66-6641-4c85-9958-ae210a3de33f\") " pod="openstack/neutron-5c5ddbcd5b-g9rd8" Oct 08 18:30:13 crc kubenswrapper[4750]: I1008 18:30:13.308388 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/42135b66-6641-4c85-9958-ae210a3de33f-httpd-config\") pod \"neutron-5c5ddbcd5b-g9rd8\" (UID: \"42135b66-6641-4c85-9958-ae210a3de33f\") " pod="openstack/neutron-5c5ddbcd5b-g9rd8" Oct 08 18:30:13 crc kubenswrapper[4750]: I1008 18:30:13.319783 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdqct\" (UniqueName: \"kubernetes.io/projected/42135b66-6641-4c85-9958-ae210a3de33f-kube-api-access-jdqct\") pod \"neutron-5c5ddbcd5b-g9rd8\" (UID: \"42135b66-6641-4c85-9958-ae210a3de33f\") " pod="openstack/neutron-5c5ddbcd5b-g9rd8" Oct 08 18:30:13 crc kubenswrapper[4750]: I1008 18:30:13.325342 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42135b66-6641-4c85-9958-ae210a3de33f-combined-ca-bundle\") pod \"neutron-5c5ddbcd5b-g9rd8\" (UID: \"42135b66-6641-4c85-9958-ae210a3de33f\") " pod="openstack/neutron-5c5ddbcd5b-g9rd8" Oct 08 18:30:13 crc kubenswrapper[4750]: I1008 18:30:13.331417 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/42135b66-6641-4c85-9958-ae210a3de33f-config\") pod \"neutron-5c5ddbcd5b-g9rd8\" (UID: 
\"42135b66-6641-4c85-9958-ae210a3de33f\") " pod="openstack/neutron-5c5ddbcd5b-g9rd8" Oct 08 18:30:13 crc kubenswrapper[4750]: I1008 18:30:13.356432 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/42135b66-6641-4c85-9958-ae210a3de33f-ovndb-tls-certs\") pod \"neutron-5c5ddbcd5b-g9rd8\" (UID: \"42135b66-6641-4c85-9958-ae210a3de33f\") " pod="openstack/neutron-5c5ddbcd5b-g9rd8" Oct 08 18:30:13 crc kubenswrapper[4750]: I1008 18:30:13.366073 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5c5ddbcd5b-g9rd8" Oct 08 18:30:13 crc kubenswrapper[4750]: I1008 18:30:13.611366 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-594d9fc688-28msd" event={"ID":"6e0647ee-68b0-4b8b-9bf0-066dc7a274d0","Type":"ContainerStarted","Data":"653d2c90232b23be484b8fc7b55378350753180183fba74f1dbd9d6ac1dde268"} Oct 08 18:30:13 crc kubenswrapper[4750]: I1008 18:30:13.611404 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54f9cb888f-9b4vd" podUID="1b533003-3aca-4347-b38f-ee183857019f" containerName="dnsmasq-dns" containerID="cri-o://9b9a79cc163461c513b6babd7c8c7f2494ff1089febcdf1d14fb0472e402a175" gracePeriod=10 Oct 08 18:30:13 crc kubenswrapper[4750]: I1008 18:30:13.615449 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 08 18:30:13 crc kubenswrapper[4750]: I1008 18:30:13.615481 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 08 18:30:13 crc kubenswrapper[4750]: I1008 18:30:13.615491 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-594d9fc688-28msd" event={"ID":"6e0647ee-68b0-4b8b-9bf0-066dc7a274d0","Type":"ContainerStarted","Data":"117039dd02728f4fc08e14b3a5a43fb419d8337ae8cb024f43567ad225a8798b"} Oct 08 18:30:13 
crc kubenswrapper[4750]: I1008 18:30:13.657111 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 18:30:13 crc kubenswrapper[4750]: I1008 18:30:13.854644 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c78787df7-xttcf"] Oct 08 18:30:14 crc kubenswrapper[4750]: I1008 18:30:14.137141 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5c5ddbcd5b-g9rd8"] Oct 08 18:30:14 crc kubenswrapper[4750]: I1008 18:30:14.157657 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f9cb888f-9b4vd" Oct 08 18:30:14 crc kubenswrapper[4750]: I1008 18:30:14.222501 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1b533003-3aca-4347-b38f-ee183857019f-dns-swift-storage-0\") pod \"1b533003-3aca-4347-b38f-ee183857019f\" (UID: \"1b533003-3aca-4347-b38f-ee183857019f\") " Oct 08 18:30:14 crc kubenswrapper[4750]: I1008 18:30:14.222623 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b533003-3aca-4347-b38f-ee183857019f-ovsdbserver-sb\") pod \"1b533003-3aca-4347-b38f-ee183857019f\" (UID: \"1b533003-3aca-4347-b38f-ee183857019f\") " Oct 08 18:30:14 crc kubenswrapper[4750]: I1008 18:30:14.222676 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b533003-3aca-4347-b38f-ee183857019f-ovsdbserver-nb\") pod \"1b533003-3aca-4347-b38f-ee183857019f\" (UID: \"1b533003-3aca-4347-b38f-ee183857019f\") " Oct 08 18:30:14 crc kubenswrapper[4750]: I1008 18:30:14.223037 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b533003-3aca-4347-b38f-ee183857019f-config\") pod 
\"1b533003-3aca-4347-b38f-ee183857019f\" (UID: \"1b533003-3aca-4347-b38f-ee183857019f\") " Oct 08 18:30:14 crc kubenswrapper[4750]: I1008 18:30:14.223069 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b533003-3aca-4347-b38f-ee183857019f-dns-svc\") pod \"1b533003-3aca-4347-b38f-ee183857019f\" (UID: \"1b533003-3aca-4347-b38f-ee183857019f\") " Oct 08 18:30:14 crc kubenswrapper[4750]: I1008 18:30:14.223103 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krj7h\" (UniqueName: \"kubernetes.io/projected/1b533003-3aca-4347-b38f-ee183857019f-kube-api-access-krj7h\") pod \"1b533003-3aca-4347-b38f-ee183857019f\" (UID: \"1b533003-3aca-4347-b38f-ee183857019f\") " Oct 08 18:30:14 crc kubenswrapper[4750]: I1008 18:30:14.247308 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b533003-3aca-4347-b38f-ee183857019f-kube-api-access-krj7h" (OuterVolumeSpecName: "kube-api-access-krj7h") pod "1b533003-3aca-4347-b38f-ee183857019f" (UID: "1b533003-3aca-4347-b38f-ee183857019f"). InnerVolumeSpecName "kube-api-access-krj7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:30:14 crc kubenswrapper[4750]: I1008 18:30:14.325216 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krj7h\" (UniqueName: \"kubernetes.io/projected/1b533003-3aca-4347-b38f-ee183857019f-kube-api-access-krj7h\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:14 crc kubenswrapper[4750]: I1008 18:30:14.327569 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b533003-3aca-4347-b38f-ee183857019f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1b533003-3aca-4347-b38f-ee183857019f" (UID: "1b533003-3aca-4347-b38f-ee183857019f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:30:14 crc kubenswrapper[4750]: I1008 18:30:14.368289 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b533003-3aca-4347-b38f-ee183857019f-config" (OuterVolumeSpecName: "config") pod "1b533003-3aca-4347-b38f-ee183857019f" (UID: "1b533003-3aca-4347-b38f-ee183857019f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:30:14 crc kubenswrapper[4750]: I1008 18:30:14.374192 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b533003-3aca-4347-b38f-ee183857019f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1b533003-3aca-4347-b38f-ee183857019f" (UID: "1b533003-3aca-4347-b38f-ee183857019f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:30:14 crc kubenswrapper[4750]: I1008 18:30:14.377667 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b533003-3aca-4347-b38f-ee183857019f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1b533003-3aca-4347-b38f-ee183857019f" (UID: "1b533003-3aca-4347-b38f-ee183857019f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:30:14 crc kubenswrapper[4750]: I1008 18:30:14.396279 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b533003-3aca-4347-b38f-ee183857019f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1b533003-3aca-4347-b38f-ee183857019f" (UID: "1b533003-3aca-4347-b38f-ee183857019f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:30:14 crc kubenswrapper[4750]: I1008 18:30:14.427153 4750 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b533003-3aca-4347-b38f-ee183857019f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:14 crc kubenswrapper[4750]: I1008 18:30:14.427194 4750 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b533003-3aca-4347-b38f-ee183857019f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:14 crc kubenswrapper[4750]: I1008 18:30:14.427209 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b533003-3aca-4347-b38f-ee183857019f-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:14 crc kubenswrapper[4750]: I1008 18:30:14.427220 4750 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b533003-3aca-4347-b38f-ee183857019f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:14 crc kubenswrapper[4750]: I1008 18:30:14.427232 4750 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1b533003-3aca-4347-b38f-ee183857019f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:14 crc kubenswrapper[4750]: I1008 18:30:14.624020 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c5ddbcd5b-g9rd8" event={"ID":"42135b66-6641-4c85-9958-ae210a3de33f","Type":"ContainerStarted","Data":"98d46e3019cc140d2edfd1ae1ba45f4848c71cbcd5aae8bb050ca125efcaa4bc"} Oct 08 18:30:14 crc kubenswrapper[4750]: I1008 18:30:14.626310 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-594d9fc688-28msd" 
event={"ID":"6e0647ee-68b0-4b8b-9bf0-066dc7a274d0","Type":"ContainerStarted","Data":"2d2ea6e9b74498814dcf6fb556ced530c433e9440f72f77d7890312774245a65"} Oct 08 18:30:14 crc kubenswrapper[4750]: I1008 18:30:14.626511 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-594d9fc688-28msd" Oct 08 18:30:14 crc kubenswrapper[4750]: I1008 18:30:14.626734 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-594d9fc688-28msd" Oct 08 18:30:14 crc kubenswrapper[4750]: I1008 18:30:14.629704 4750 generic.go:334] "Generic (PLEG): container finished" podID="1b533003-3aca-4347-b38f-ee183857019f" containerID="9b9a79cc163461c513b6babd7c8c7f2494ff1089febcdf1d14fb0472e402a175" exitCode=0 Oct 08 18:30:14 crc kubenswrapper[4750]: I1008 18:30:14.629768 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f9cb888f-9b4vd" Oct 08 18:30:14 crc kubenswrapper[4750]: I1008 18:30:14.629798 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9cb888f-9b4vd" event={"ID":"1b533003-3aca-4347-b38f-ee183857019f","Type":"ContainerDied","Data":"9b9a79cc163461c513b6babd7c8c7f2494ff1089febcdf1d14fb0472e402a175"} Oct 08 18:30:14 crc kubenswrapper[4750]: I1008 18:30:14.629834 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9cb888f-9b4vd" event={"ID":"1b533003-3aca-4347-b38f-ee183857019f","Type":"ContainerDied","Data":"c12dcea495964304c3935fce76e1957fef6e4b0fa893c78ed7d5c88d10bb609d"} Oct 08 18:30:14 crc kubenswrapper[4750]: I1008 18:30:14.629863 4750 scope.go:117] "RemoveContainer" containerID="9b9a79cc163461c513b6babd7c8c7f2494ff1089febcdf1d14fb0472e402a175" Oct 08 18:30:14 crc kubenswrapper[4750]: I1008 18:30:14.632451 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c78787df7-xttcf" 
event={"ID":"5a44a1f2-be71-4ae0-b115-9ca979c931be","Type":"ContainerStarted","Data":"679d39226c41d5cc084ef441560d140a170510ddedb7cca364928eecf7577857"} Oct 08 18:30:14 crc kubenswrapper[4750]: I1008 18:30:14.632501 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c78787df7-xttcf" event={"ID":"5a44a1f2-be71-4ae0-b115-9ca979c931be","Type":"ContainerStarted","Data":"865dbac6ce37b74ef33df3db129d708788d0b7dfac77f48e171969df671b386b"} Oct 08 18:30:14 crc kubenswrapper[4750]: I1008 18:30:14.636325 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b09ec256-29a4-497c-a23c-eef4c9cd47c8","Type":"ContainerStarted","Data":"a0add41a6aef45c34e9eaf41b006c76f23e1bee22bfbeb15b9ec05de1bc1146d"} Oct 08 18:30:14 crc kubenswrapper[4750]: I1008 18:30:14.650078 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-594d9fc688-28msd" podStartSLOduration=3.6500587749999998 podStartE2EDuration="3.650058775s" podCreationTimestamp="2025-10-08 18:30:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:30:14.649394169 +0000 UTC m=+1170.562365192" watchObservedRunningTime="2025-10-08 18:30:14.650058775 +0000 UTC m=+1170.563029788" Oct 08 18:30:14 crc kubenswrapper[4750]: I1008 18:30:14.659771 4750 scope.go:117] "RemoveContainer" containerID="bda12012b9b4bed602522bfff191ca33bb01854c2d43e3f749b8f43cf6ebcddc" Oct 08 18:30:14 crc kubenswrapper[4750]: I1008 18:30:14.676617 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54f9cb888f-9b4vd"] Oct 08 18:30:14 crc kubenswrapper[4750]: I1008 18:30:14.681540 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54f9cb888f-9b4vd"] Oct 08 18:30:14 crc kubenswrapper[4750]: I1008 18:30:14.699704 4750 scope.go:117] "RemoveContainer" 
containerID="9b9a79cc163461c513b6babd7c8c7f2494ff1089febcdf1d14fb0472e402a175" Oct 08 18:30:14 crc kubenswrapper[4750]: E1008 18:30:14.702420 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b9a79cc163461c513b6babd7c8c7f2494ff1089febcdf1d14fb0472e402a175\": container with ID starting with 9b9a79cc163461c513b6babd7c8c7f2494ff1089febcdf1d14fb0472e402a175 not found: ID does not exist" containerID="9b9a79cc163461c513b6babd7c8c7f2494ff1089febcdf1d14fb0472e402a175" Oct 08 18:30:14 crc kubenswrapper[4750]: I1008 18:30:14.702456 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b9a79cc163461c513b6babd7c8c7f2494ff1089febcdf1d14fb0472e402a175"} err="failed to get container status \"9b9a79cc163461c513b6babd7c8c7f2494ff1089febcdf1d14fb0472e402a175\": rpc error: code = NotFound desc = could not find container \"9b9a79cc163461c513b6babd7c8c7f2494ff1089febcdf1d14fb0472e402a175\": container with ID starting with 9b9a79cc163461c513b6babd7c8c7f2494ff1089febcdf1d14fb0472e402a175 not found: ID does not exist" Oct 08 18:30:14 crc kubenswrapper[4750]: I1008 18:30:14.702480 4750 scope.go:117] "RemoveContainer" containerID="bda12012b9b4bed602522bfff191ca33bb01854c2d43e3f749b8f43cf6ebcddc" Oct 08 18:30:14 crc kubenswrapper[4750]: E1008 18:30:14.705634 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bda12012b9b4bed602522bfff191ca33bb01854c2d43e3f749b8f43cf6ebcddc\": container with ID starting with bda12012b9b4bed602522bfff191ca33bb01854c2d43e3f749b8f43cf6ebcddc not found: ID does not exist" containerID="bda12012b9b4bed602522bfff191ca33bb01854c2d43e3f749b8f43cf6ebcddc" Oct 08 18:30:14 crc kubenswrapper[4750]: I1008 18:30:14.705688 4750 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bda12012b9b4bed602522bfff191ca33bb01854c2d43e3f749b8f43cf6ebcddc"} err="failed to get container status \"bda12012b9b4bed602522bfff191ca33bb01854c2d43e3f749b8f43cf6ebcddc\": rpc error: code = NotFound desc = could not find container \"bda12012b9b4bed602522bfff191ca33bb01854c2d43e3f749b8f43cf6ebcddc\": container with ID starting with bda12012b9b4bed602522bfff191ca33bb01854c2d43e3f749b8f43cf6ebcddc not found: ID does not exist" Oct 08 18:30:14 crc kubenswrapper[4750]: I1008 18:30:14.827117 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b533003-3aca-4347-b38f-ee183857019f" path="/var/lib/kubelet/pods/1b533003-3aca-4347-b38f-ee183857019f/volumes" Oct 08 18:30:15 crc kubenswrapper[4750]: I1008 18:30:15.606527 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-8d8895f6c-zszml"] Oct 08 18:30:15 crc kubenswrapper[4750]: E1008 18:30:15.607289 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b533003-3aca-4347-b38f-ee183857019f" containerName="dnsmasq-dns" Oct 08 18:30:15 crc kubenswrapper[4750]: I1008 18:30:15.607306 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b533003-3aca-4347-b38f-ee183857019f" containerName="dnsmasq-dns" Oct 08 18:30:15 crc kubenswrapper[4750]: E1008 18:30:15.607322 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b533003-3aca-4347-b38f-ee183857019f" containerName="init" Oct 08 18:30:15 crc kubenswrapper[4750]: I1008 18:30:15.607330 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b533003-3aca-4347-b38f-ee183857019f" containerName="init" Oct 08 18:30:15 crc kubenswrapper[4750]: I1008 18:30:15.607644 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b533003-3aca-4347-b38f-ee183857019f" containerName="dnsmasq-dns" Oct 08 18:30:15 crc kubenswrapper[4750]: I1008 18:30:15.608713 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-8d8895f6c-zszml" Oct 08 18:30:15 crc kubenswrapper[4750]: I1008 18:30:15.610462 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 08 18:30:15 crc kubenswrapper[4750]: I1008 18:30:15.614091 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 08 18:30:15 crc kubenswrapper[4750]: I1008 18:30:15.630402 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8d8895f6c-zszml"] Oct 08 18:30:15 crc kubenswrapper[4750]: I1008 18:30:15.650933 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b751e62-8a05-413c-9f82-e9f28230e5ba-public-tls-certs\") pod \"neutron-8d8895f6c-zszml\" (UID: \"7b751e62-8a05-413c-9f82-e9f28230e5ba\") " pod="openstack/neutron-8d8895f6c-zszml" Oct 08 18:30:15 crc kubenswrapper[4750]: I1008 18:30:15.650978 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7b751e62-8a05-413c-9f82-e9f28230e5ba-config\") pod \"neutron-8d8895f6c-zszml\" (UID: \"7b751e62-8a05-413c-9f82-e9f28230e5ba\") " pod="openstack/neutron-8d8895f6c-zszml" Oct 08 18:30:15 crc kubenswrapper[4750]: I1008 18:30:15.650997 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b751e62-8a05-413c-9f82-e9f28230e5ba-internal-tls-certs\") pod \"neutron-8d8895f6c-zszml\" (UID: \"7b751e62-8a05-413c-9f82-e9f28230e5ba\") " pod="openstack/neutron-8d8895f6c-zszml" Oct 08 18:30:15 crc kubenswrapper[4750]: I1008 18:30:15.651032 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/7b751e62-8a05-413c-9f82-e9f28230e5ba-httpd-config\") pod \"neutron-8d8895f6c-zszml\" (UID: \"7b751e62-8a05-413c-9f82-e9f28230e5ba\") " pod="openstack/neutron-8d8895f6c-zszml" Oct 08 18:30:15 crc kubenswrapper[4750]: I1008 18:30:15.651061 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b751e62-8a05-413c-9f82-e9f28230e5ba-combined-ca-bundle\") pod \"neutron-8d8895f6c-zszml\" (UID: \"7b751e62-8a05-413c-9f82-e9f28230e5ba\") " pod="openstack/neutron-8d8895f6c-zszml" Oct 08 18:30:15 crc kubenswrapper[4750]: I1008 18:30:15.651137 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b751e62-8a05-413c-9f82-e9f28230e5ba-ovndb-tls-certs\") pod \"neutron-8d8895f6c-zszml\" (UID: \"7b751e62-8a05-413c-9f82-e9f28230e5ba\") " pod="openstack/neutron-8d8895f6c-zszml" Oct 08 18:30:15 crc kubenswrapper[4750]: I1008 18:30:15.651160 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm7dn\" (UniqueName: \"kubernetes.io/projected/7b751e62-8a05-413c-9f82-e9f28230e5ba-kube-api-access-vm7dn\") pod \"neutron-8d8895f6c-zszml\" (UID: \"7b751e62-8a05-413c-9f82-e9f28230e5ba\") " pod="openstack/neutron-8d8895f6c-zszml" Oct 08 18:30:15 crc kubenswrapper[4750]: I1008 18:30:15.688367 4750 generic.go:334] "Generic (PLEG): container finished" podID="5a44a1f2-be71-4ae0-b115-9ca979c931be" containerID="679d39226c41d5cc084ef441560d140a170510ddedb7cca364928eecf7577857" exitCode=0 Oct 08 18:30:15 crc kubenswrapper[4750]: I1008 18:30:15.688628 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c78787df7-xttcf" event={"ID":"5a44a1f2-be71-4ae0-b115-9ca979c931be","Type":"ContainerDied","Data":"679d39226c41d5cc084ef441560d140a170510ddedb7cca364928eecf7577857"} Oct 08 18:30:15 crc 
kubenswrapper[4750]: I1008 18:30:15.696823 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b09ec256-29a4-497c-a23c-eef4c9cd47c8","Type":"ContainerStarted","Data":"a6c5333f97073c7e0a7764e13969d39084a115cf6189c8ed7e25a6e119e5c1aa"} Oct 08 18:30:15 crc kubenswrapper[4750]: I1008 18:30:15.698485 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c5ddbcd5b-g9rd8" event={"ID":"42135b66-6641-4c85-9958-ae210a3de33f","Type":"ContainerStarted","Data":"52f43a6675edc077e2b597f2582515bfa7e90a4f56cba817daa6d003af17b5a6"} Oct 08 18:30:15 crc kubenswrapper[4750]: I1008 18:30:15.698516 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c5ddbcd5b-g9rd8" event={"ID":"42135b66-6641-4c85-9958-ae210a3de33f","Type":"ContainerStarted","Data":"eaa050621406cb9a363caa6299f0498132427ba0b4ca6a7dd901a5fa9d867723"} Oct 08 18:30:15 crc kubenswrapper[4750]: I1008 18:30:15.699222 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5c5ddbcd5b-g9rd8" Oct 08 18:30:15 crc kubenswrapper[4750]: I1008 18:30:15.711610 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-scvbb" event={"ID":"c9e2d08e-75b7-445b-b563-affabf6d8af6","Type":"ContainerStarted","Data":"615f108be73c8808e6f727e994a17ccb8fea35ef950e043806543c45a2c404b7"} Oct 08 18:30:15 crc kubenswrapper[4750]: I1008 18:30:15.711626 4750 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 08 18:30:15 crc kubenswrapper[4750]: I1008 18:30:15.763366 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vm7dn\" (UniqueName: \"kubernetes.io/projected/7b751e62-8a05-413c-9f82-e9f28230e5ba-kube-api-access-vm7dn\") pod \"neutron-8d8895f6c-zszml\" (UID: \"7b751e62-8a05-413c-9f82-e9f28230e5ba\") " pod="openstack/neutron-8d8895f6c-zszml" Oct 08 18:30:15 crc kubenswrapper[4750]: I1008 18:30:15.763584 4750 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b751e62-8a05-413c-9f82-e9f28230e5ba-public-tls-certs\") pod \"neutron-8d8895f6c-zszml\" (UID: \"7b751e62-8a05-413c-9f82-e9f28230e5ba\") " pod="openstack/neutron-8d8895f6c-zszml" Oct 08 18:30:15 crc kubenswrapper[4750]: I1008 18:30:15.763645 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7b751e62-8a05-413c-9f82-e9f28230e5ba-config\") pod \"neutron-8d8895f6c-zszml\" (UID: \"7b751e62-8a05-413c-9f82-e9f28230e5ba\") " pod="openstack/neutron-8d8895f6c-zszml" Oct 08 18:30:15 crc kubenswrapper[4750]: I1008 18:30:15.763678 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b751e62-8a05-413c-9f82-e9f28230e5ba-internal-tls-certs\") pod \"neutron-8d8895f6c-zszml\" (UID: \"7b751e62-8a05-413c-9f82-e9f28230e5ba\") " pod="openstack/neutron-8d8895f6c-zszml" Oct 08 18:30:15 crc kubenswrapper[4750]: I1008 18:30:15.763820 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7b751e62-8a05-413c-9f82-e9f28230e5ba-httpd-config\") pod \"neutron-8d8895f6c-zszml\" (UID: \"7b751e62-8a05-413c-9f82-e9f28230e5ba\") " pod="openstack/neutron-8d8895f6c-zszml" Oct 08 18:30:15 crc kubenswrapper[4750]: I1008 18:30:15.764105 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b751e62-8a05-413c-9f82-e9f28230e5ba-combined-ca-bundle\") pod \"neutron-8d8895f6c-zszml\" (UID: \"7b751e62-8a05-413c-9f82-e9f28230e5ba\") " pod="openstack/neutron-8d8895f6c-zszml" Oct 08 18:30:15 crc kubenswrapper[4750]: I1008 18:30:15.765227 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7b751e62-8a05-413c-9f82-e9f28230e5ba-ovndb-tls-certs\") pod \"neutron-8d8895f6c-zszml\" (UID: \"7b751e62-8a05-413c-9f82-e9f28230e5ba\") " pod="openstack/neutron-8d8895f6c-zszml" Oct 08 18:30:15 crc kubenswrapper[4750]: I1008 18:30:15.791812 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b751e62-8a05-413c-9f82-e9f28230e5ba-public-tls-certs\") pod \"neutron-8d8895f6c-zszml\" (UID: \"7b751e62-8a05-413c-9f82-e9f28230e5ba\") " pod="openstack/neutron-8d8895f6c-zszml" Oct 08 18:30:15 crc kubenswrapper[4750]: I1008 18:30:15.821412 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b751e62-8a05-413c-9f82-e9f28230e5ba-combined-ca-bundle\") pod \"neutron-8d8895f6c-zszml\" (UID: \"7b751e62-8a05-413c-9f82-e9f28230e5ba\") " pod="openstack/neutron-8d8895f6c-zszml" Oct 08 18:30:15 crc kubenswrapper[4750]: I1008 18:30:15.833630 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7b751e62-8a05-413c-9f82-e9f28230e5ba-config\") pod \"neutron-8d8895f6c-zszml\" (UID: \"7b751e62-8a05-413c-9f82-e9f28230e5ba\") " pod="openstack/neutron-8d8895f6c-zszml" Oct 08 18:30:15 crc kubenswrapper[4750]: I1008 18:30:15.839399 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b751e62-8a05-413c-9f82-e9f28230e5ba-ovndb-tls-certs\") pod \"neutron-8d8895f6c-zszml\" (UID: \"7b751e62-8a05-413c-9f82-e9f28230e5ba\") " pod="openstack/neutron-8d8895f6c-zszml" Oct 08 18:30:15 crc kubenswrapper[4750]: I1008 18:30:15.858804 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm7dn\" (UniqueName: \"kubernetes.io/projected/7b751e62-8a05-413c-9f82-e9f28230e5ba-kube-api-access-vm7dn\") pod \"neutron-8d8895f6c-zszml\" (UID: 
\"7b751e62-8a05-413c-9f82-e9f28230e5ba\") " pod="openstack/neutron-8d8895f6c-zszml" Oct 08 18:30:15 crc kubenswrapper[4750]: I1008 18:30:15.871200 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b751e62-8a05-413c-9f82-e9f28230e5ba-internal-tls-certs\") pod \"neutron-8d8895f6c-zszml\" (UID: \"7b751e62-8a05-413c-9f82-e9f28230e5ba\") " pod="openstack/neutron-8d8895f6c-zszml" Oct 08 18:30:15 crc kubenswrapper[4750]: I1008 18:30:15.872337 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7b751e62-8a05-413c-9f82-e9f28230e5ba-httpd-config\") pod \"neutron-8d8895f6c-zszml\" (UID: \"7b751e62-8a05-413c-9f82-e9f28230e5ba\") " pod="openstack/neutron-8d8895f6c-zszml" Oct 08 18:30:15 crc kubenswrapper[4750]: I1008 18:30:15.931197 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8d8895f6c-zszml" Oct 08 18:30:15 crc kubenswrapper[4750]: I1008 18:30:15.945943 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-scvbb" podStartSLOduration=4.081101552 podStartE2EDuration="40.945915955s" podCreationTimestamp="2025-10-08 18:29:35 +0000 UTC" firstStartedPulling="2025-10-08 18:29:36.485315463 +0000 UTC m=+1132.398286476" lastFinishedPulling="2025-10-08 18:30:13.350129866 +0000 UTC m=+1169.263100879" observedRunningTime="2025-10-08 18:30:15.792164688 +0000 UTC m=+1171.705135701" watchObservedRunningTime="2025-10-08 18:30:15.945915955 +0000 UTC m=+1171.858886978" Oct 08 18:30:15 crc kubenswrapper[4750]: I1008 18:30:15.993992 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5c5ddbcd5b-g9rd8" podStartSLOduration=3.9939725619999997 podStartE2EDuration="3.993972562s" podCreationTimestamp="2025-10-08 18:30:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:30:15.876049663 +0000 UTC m=+1171.789020676" watchObservedRunningTime="2025-10-08 18:30:15.993972562 +0000 UTC m=+1171.906943575" Oct 08 18:30:16 crc kubenswrapper[4750]: I1008 18:30:16.639955 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8d8895f6c-zszml"] Oct 08 18:30:16 crc kubenswrapper[4750]: I1008 18:30:16.759254 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8d8895f6c-zszml" event={"ID":"7b751e62-8a05-413c-9f82-e9f28230e5ba","Type":"ContainerStarted","Data":"d3aee10f5eb2213bed14729101e10990fa8a38feb2dfe34d59f150028a3cb3ae"} Oct 08 18:30:16 crc kubenswrapper[4750]: I1008 18:30:16.768027 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c78787df7-xttcf" event={"ID":"5a44a1f2-be71-4ae0-b115-9ca979c931be","Type":"ContainerStarted","Data":"636c0fd3299ccd708561f50ef31954da172919d0ab9e81edc53a06d0698f61da"} Oct 08 18:30:16 crc kubenswrapper[4750]: I1008 18:30:16.769171 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c78787df7-xttcf" Oct 08 18:30:16 crc kubenswrapper[4750]: I1008 18:30:16.784338 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b09ec256-29a4-497c-a23c-eef4c9cd47c8","Type":"ContainerStarted","Data":"449568f0c50c9704c3f47a5824cc562e8d8af6eba1cf615d282d1b577637bc7f"} Oct 08 18:30:16 crc kubenswrapper[4750]: I1008 18:30:16.799455 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c78787df7-xttcf" podStartSLOduration=4.799430706 podStartE2EDuration="4.799430706s" podCreationTimestamp="2025-10-08 18:30:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:30:16.787095334 +0000 UTC m=+1172.700066357" watchObservedRunningTime="2025-10-08 
18:30:16.799430706 +0000 UTC m=+1172.712401719" Oct 08 18:30:16 crc kubenswrapper[4750]: I1008 18:30:16.945343 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 08 18:30:16 crc kubenswrapper[4750]: I1008 18:30:16.945757 4750 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 08 18:30:16 crc kubenswrapper[4750]: I1008 18:30:16.976371 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 08 18:30:17 crc kubenswrapper[4750]: I1008 18:30:17.196603 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 08 18:30:17 crc kubenswrapper[4750]: I1008 18:30:17.793163 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b09ec256-29a4-497c-a23c-eef4c9cd47c8","Type":"ContainerStarted","Data":"7823d1e431dc2a1096cdb43de1b9c21e22ffbea3005521e897408676d332fb99"} Oct 08 18:30:17 crc kubenswrapper[4750]: I1008 18:30:17.795325 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8d8895f6c-zszml" event={"ID":"7b751e62-8a05-413c-9f82-e9f28230e5ba","Type":"ContainerStarted","Data":"c354d8f2f57d9ac9743cc4e26bc6b9e3f99a80b563c28a39f8bf835593a34cc2"} Oct 08 18:30:17 crc kubenswrapper[4750]: I1008 18:30:17.795368 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8d8895f6c-zszml" event={"ID":"7b751e62-8a05-413c-9f82-e9f28230e5ba","Type":"ContainerStarted","Data":"f50e4ae5919552806c39393795c8e3f0588d8c870ffd52952ea7b85a3e4511a9"} Oct 08 18:30:18 crc kubenswrapper[4750]: I1008 18:30:18.344445 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-548c7c66b4-b72bl" Oct 08 18:30:18 crc kubenswrapper[4750]: I1008 18:30:18.367053 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/neutron-8d8895f6c-zszml" podStartSLOduration=3.367031964 podStartE2EDuration="3.367031964s" podCreationTimestamp="2025-10-08 18:30:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:30:17.824116962 +0000 UTC m=+1173.737087995" watchObservedRunningTime="2025-10-08 18:30:18.367031964 +0000 UTC m=+1174.280002987" Oct 08 18:30:18 crc kubenswrapper[4750]: I1008 18:30:18.803605 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-8d8895f6c-zszml" Oct 08 18:30:19 crc kubenswrapper[4750]: I1008 18:30:19.449662 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 08 18:30:19 crc kubenswrapper[4750]: I1008 18:30:19.824182 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b09ec256-29a4-497c-a23c-eef4c9cd47c8","Type":"ContainerStarted","Data":"643e190066b829a3083477da09f84a967e35d9d74ad7ccefb591fccad3bd45f4"} Oct 08 18:30:19 crc kubenswrapper[4750]: I1008 18:30:19.824246 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 08 18:30:19 crc kubenswrapper[4750]: I1008 18:30:19.843479 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.454558364 podStartE2EDuration="7.843465628s" podCreationTimestamp="2025-10-08 18:30:12 +0000 UTC" firstStartedPulling="2025-10-08 18:30:13.715073886 +0000 UTC m=+1169.628044899" lastFinishedPulling="2025-10-08 18:30:19.10398115 +0000 UTC m=+1175.016952163" observedRunningTime="2025-10-08 18:30:19.840487115 +0000 UTC m=+1175.753458128" watchObservedRunningTime="2025-10-08 18:30:19.843465628 +0000 UTC m=+1175.756436641" Oct 08 18:30:20 crc kubenswrapper[4750]: I1008 18:30:20.143695 4750 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/openstackclient"] Oct 08 18:30:20 crc kubenswrapper[4750]: I1008 18:30:20.144855 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 08 18:30:20 crc kubenswrapper[4750]: I1008 18:30:20.152416 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 08 18:30:20 crc kubenswrapper[4750]: I1008 18:30:20.153171 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 08 18:30:20 crc kubenswrapper[4750]: I1008 18:30:20.153441 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 08 18:30:20 crc kubenswrapper[4750]: I1008 18:30:20.153574 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-5d8h5" Oct 08 18:30:20 crc kubenswrapper[4750]: I1008 18:30:20.254562 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbfft\" (UniqueName: \"kubernetes.io/projected/845a93a1-bf5f-4820-a580-e01d2ed59416-kube-api-access-rbfft\") pod \"openstackclient\" (UID: \"845a93a1-bf5f-4820-a580-e01d2ed59416\") " pod="openstack/openstackclient" Oct 08 18:30:20 crc kubenswrapper[4750]: I1008 18:30:20.254607 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/845a93a1-bf5f-4820-a580-e01d2ed59416-openstack-config\") pod \"openstackclient\" (UID: \"845a93a1-bf5f-4820-a580-e01d2ed59416\") " pod="openstack/openstackclient" Oct 08 18:30:20 crc kubenswrapper[4750]: I1008 18:30:20.254695 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/845a93a1-bf5f-4820-a580-e01d2ed59416-openstack-config-secret\") pod \"openstackclient\" (UID: 
\"845a93a1-bf5f-4820-a580-e01d2ed59416\") " pod="openstack/openstackclient" Oct 08 18:30:20 crc kubenswrapper[4750]: I1008 18:30:20.254975 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/845a93a1-bf5f-4820-a580-e01d2ed59416-combined-ca-bundle\") pod \"openstackclient\" (UID: \"845a93a1-bf5f-4820-a580-e01d2ed59416\") " pod="openstack/openstackclient" Oct 08 18:30:20 crc kubenswrapper[4750]: I1008 18:30:20.356626 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/845a93a1-bf5f-4820-a580-e01d2ed59416-openstack-config-secret\") pod \"openstackclient\" (UID: \"845a93a1-bf5f-4820-a580-e01d2ed59416\") " pod="openstack/openstackclient" Oct 08 18:30:20 crc kubenswrapper[4750]: I1008 18:30:20.357400 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/845a93a1-bf5f-4820-a580-e01d2ed59416-combined-ca-bundle\") pod \"openstackclient\" (UID: \"845a93a1-bf5f-4820-a580-e01d2ed59416\") " pod="openstack/openstackclient" Oct 08 18:30:20 crc kubenswrapper[4750]: I1008 18:30:20.357455 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/845a93a1-bf5f-4820-a580-e01d2ed59416-openstack-config\") pod \"openstackclient\" (UID: \"845a93a1-bf5f-4820-a580-e01d2ed59416\") " pod="openstack/openstackclient" Oct 08 18:30:20 crc kubenswrapper[4750]: I1008 18:30:20.357470 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbfft\" (UniqueName: \"kubernetes.io/projected/845a93a1-bf5f-4820-a580-e01d2ed59416-kube-api-access-rbfft\") pod \"openstackclient\" (UID: \"845a93a1-bf5f-4820-a580-e01d2ed59416\") " pod="openstack/openstackclient" Oct 08 18:30:20 crc kubenswrapper[4750]: I1008 
18:30:20.358326 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/845a93a1-bf5f-4820-a580-e01d2ed59416-openstack-config\") pod \"openstackclient\" (UID: \"845a93a1-bf5f-4820-a580-e01d2ed59416\") " pod="openstack/openstackclient" Oct 08 18:30:20 crc kubenswrapper[4750]: I1008 18:30:20.373430 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/845a93a1-bf5f-4820-a580-e01d2ed59416-openstack-config-secret\") pod \"openstackclient\" (UID: \"845a93a1-bf5f-4820-a580-e01d2ed59416\") " pod="openstack/openstackclient" Oct 08 18:30:20 crc kubenswrapper[4750]: I1008 18:30:20.376193 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/845a93a1-bf5f-4820-a580-e01d2ed59416-combined-ca-bundle\") pod \"openstackclient\" (UID: \"845a93a1-bf5f-4820-a580-e01d2ed59416\") " pod="openstack/openstackclient" Oct 08 18:30:20 crc kubenswrapper[4750]: I1008 18:30:20.383182 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbfft\" (UniqueName: \"kubernetes.io/projected/845a93a1-bf5f-4820-a580-e01d2ed59416-kube-api-access-rbfft\") pod \"openstackclient\" (UID: \"845a93a1-bf5f-4820-a580-e01d2ed59416\") " pod="openstack/openstackclient" Oct 08 18:30:20 crc kubenswrapper[4750]: I1008 18:30:20.458311 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 08 18:30:21 crc kubenswrapper[4750]: I1008 18:30:21.011949 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 08 18:30:21 crc kubenswrapper[4750]: W1008 18:30:21.020696 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod845a93a1_bf5f_4820_a580_e01d2ed59416.slice/crio-eaf7f56b50a7dd66f324a7556703d12f9d47b1153b0c174bda59e9d136e1013e WatchSource:0}: Error finding container eaf7f56b50a7dd66f324a7556703d12f9d47b1153b0c174bda59e9d136e1013e: Status 404 returned error can't find the container with id eaf7f56b50a7dd66f324a7556703d12f9d47b1153b0c174bda59e9d136e1013e Oct 08 18:30:21 crc kubenswrapper[4750]: I1008 18:30:21.130883 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-84b5b4bdc4-tgxnj" Oct 08 18:30:21 crc kubenswrapper[4750]: I1008 18:30:21.140917 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-84b5b4bdc4-tgxnj" Oct 08 18:30:21 crc kubenswrapper[4750]: I1008 18:30:21.853618 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"845a93a1-bf5f-4820-a580-e01d2ed59416","Type":"ContainerStarted","Data":"eaf7f56b50a7dd66f324a7556703d12f9d47b1153b0c174bda59e9d136e1013e"} Oct 08 18:30:22 crc kubenswrapper[4750]: I1008 18:30:22.866726 4750 generic.go:334] "Generic (PLEG): container finished" podID="c9e2d08e-75b7-445b-b563-affabf6d8af6" containerID="615f108be73c8808e6f727e994a17ccb8fea35ef950e043806543c45a2c404b7" exitCode=0 Oct 08 18:30:22 crc kubenswrapper[4750]: I1008 18:30:22.866806 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-scvbb" event={"ID":"c9e2d08e-75b7-445b-b563-affabf6d8af6","Type":"ContainerDied","Data":"615f108be73c8808e6f727e994a17ccb8fea35ef950e043806543c45a2c404b7"} Oct 08 18:30:23 crc 
kubenswrapper[4750]: I1008 18:30:23.188745 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c78787df7-xttcf" Oct 08 18:30:23 crc kubenswrapper[4750]: I1008 18:30:23.249057 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dc68bd5-hzlg6"] Oct 08 18:30:23 crc kubenswrapper[4750]: I1008 18:30:23.249290 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5dc68bd5-hzlg6" podUID="47630cbc-a12d-4112-8f89-4f203bdc9649" containerName="dnsmasq-dns" containerID="cri-o://ca4029d61ca8c3a69a52b031c6cf5be834790b73d98c1a84ffbda4221e421b3e" gracePeriod=10 Oct 08 18:30:23 crc kubenswrapper[4750]: I1008 18:30:23.884276 4750 generic.go:334] "Generic (PLEG): container finished" podID="47630cbc-a12d-4112-8f89-4f203bdc9649" containerID="ca4029d61ca8c3a69a52b031c6cf5be834790b73d98c1a84ffbda4221e421b3e" exitCode=0 Oct 08 18:30:23 crc kubenswrapper[4750]: I1008 18:30:23.884513 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc68bd5-hzlg6" event={"ID":"47630cbc-a12d-4112-8f89-4f203bdc9649","Type":"ContainerDied","Data":"ca4029d61ca8c3a69a52b031c6cf5be834790b73d98c1a84ffbda4221e421b3e"} Oct 08 18:30:23 crc kubenswrapper[4750]: I1008 18:30:23.884732 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc68bd5-hzlg6" event={"ID":"47630cbc-a12d-4112-8f89-4f203bdc9649","Type":"ContainerDied","Data":"5e932747dfe270e08f17fe18c54d358f15690fd2f2dad4cc4df48ce8041b471b"} Oct 08 18:30:23 crc kubenswrapper[4750]: I1008 18:30:23.884752 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e932747dfe270e08f17fe18c54d358f15690fd2f2dad4cc4df48ce8041b471b" Oct 08 18:30:23 crc kubenswrapper[4750]: I1008 18:30:23.894865 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5dc68bd5-hzlg6" Oct 08 18:30:24 crc kubenswrapper[4750]: I1008 18:30:24.038699 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/47630cbc-a12d-4112-8f89-4f203bdc9649-dns-swift-storage-0\") pod \"47630cbc-a12d-4112-8f89-4f203bdc9649\" (UID: \"47630cbc-a12d-4112-8f89-4f203bdc9649\") " Oct 08 18:30:24 crc kubenswrapper[4750]: I1008 18:30:24.038856 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/47630cbc-a12d-4112-8f89-4f203bdc9649-dns-svc\") pod \"47630cbc-a12d-4112-8f89-4f203bdc9649\" (UID: \"47630cbc-a12d-4112-8f89-4f203bdc9649\") " Oct 08 18:30:24 crc kubenswrapper[4750]: I1008 18:30:24.038895 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97sxm\" (UniqueName: \"kubernetes.io/projected/47630cbc-a12d-4112-8f89-4f203bdc9649-kube-api-access-97sxm\") pod \"47630cbc-a12d-4112-8f89-4f203bdc9649\" (UID: \"47630cbc-a12d-4112-8f89-4f203bdc9649\") " Oct 08 18:30:24 crc kubenswrapper[4750]: I1008 18:30:24.038916 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/47630cbc-a12d-4112-8f89-4f203bdc9649-ovsdbserver-nb\") pod \"47630cbc-a12d-4112-8f89-4f203bdc9649\" (UID: \"47630cbc-a12d-4112-8f89-4f203bdc9649\") " Oct 08 18:30:24 crc kubenswrapper[4750]: I1008 18:30:24.038952 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/47630cbc-a12d-4112-8f89-4f203bdc9649-ovsdbserver-sb\") pod \"47630cbc-a12d-4112-8f89-4f203bdc9649\" (UID: \"47630cbc-a12d-4112-8f89-4f203bdc9649\") " Oct 08 18:30:24 crc kubenswrapper[4750]: I1008 18:30:24.038969 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/47630cbc-a12d-4112-8f89-4f203bdc9649-config\") pod \"47630cbc-a12d-4112-8f89-4f203bdc9649\" (UID: \"47630cbc-a12d-4112-8f89-4f203bdc9649\") " Oct 08 18:30:24 crc kubenswrapper[4750]: I1008 18:30:24.054722 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47630cbc-a12d-4112-8f89-4f203bdc9649-kube-api-access-97sxm" (OuterVolumeSpecName: "kube-api-access-97sxm") pod "47630cbc-a12d-4112-8f89-4f203bdc9649" (UID: "47630cbc-a12d-4112-8f89-4f203bdc9649"). InnerVolumeSpecName "kube-api-access-97sxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:30:24 crc kubenswrapper[4750]: I1008 18:30:24.122024 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47630cbc-a12d-4112-8f89-4f203bdc9649-config" (OuterVolumeSpecName: "config") pod "47630cbc-a12d-4112-8f89-4f203bdc9649" (UID: "47630cbc-a12d-4112-8f89-4f203bdc9649"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:30:24 crc kubenswrapper[4750]: I1008 18:30:24.124210 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47630cbc-a12d-4112-8f89-4f203bdc9649-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "47630cbc-a12d-4112-8f89-4f203bdc9649" (UID: "47630cbc-a12d-4112-8f89-4f203bdc9649"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:30:24 crc kubenswrapper[4750]: I1008 18:30:24.143581 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47630cbc-a12d-4112-8f89-4f203bdc9649-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "47630cbc-a12d-4112-8f89-4f203bdc9649" (UID: "47630cbc-a12d-4112-8f89-4f203bdc9649"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:30:24 crc kubenswrapper[4750]: I1008 18:30:24.144237 4750 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/47630cbc-a12d-4112-8f89-4f203bdc9649-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:24 crc kubenswrapper[4750]: I1008 18:30:24.144266 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47630cbc-a12d-4112-8f89-4f203bdc9649-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:24 crc kubenswrapper[4750]: I1008 18:30:24.144283 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97sxm\" (UniqueName: \"kubernetes.io/projected/47630cbc-a12d-4112-8f89-4f203bdc9649-kube-api-access-97sxm\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:24 crc kubenswrapper[4750]: I1008 18:30:24.144296 4750 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/47630cbc-a12d-4112-8f89-4f203bdc9649-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:24 crc kubenswrapper[4750]: I1008 18:30:24.177941 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47630cbc-a12d-4112-8f89-4f203bdc9649-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "47630cbc-a12d-4112-8f89-4f203bdc9649" (UID: "47630cbc-a12d-4112-8f89-4f203bdc9649"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:30:24 crc kubenswrapper[4750]: I1008 18:30:24.217702 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47630cbc-a12d-4112-8f89-4f203bdc9649-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "47630cbc-a12d-4112-8f89-4f203bdc9649" (UID: "47630cbc-a12d-4112-8f89-4f203bdc9649"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:30:24 crc kubenswrapper[4750]: I1008 18:30:24.247780 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-scvbb" Oct 08 18:30:24 crc kubenswrapper[4750]: I1008 18:30:24.248529 4750 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/47630cbc-a12d-4112-8f89-4f203bdc9649-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:24 crc kubenswrapper[4750]: I1008 18:30:24.248656 4750 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/47630cbc-a12d-4112-8f89-4f203bdc9649-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:24 crc kubenswrapper[4750]: I1008 18:30:24.349826 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e2d08e-75b7-445b-b563-affabf6d8af6-config-data\") pod \"c9e2d08e-75b7-445b-b563-affabf6d8af6\" (UID: \"c9e2d08e-75b7-445b-b563-affabf6d8af6\") " Oct 08 18:30:24 crc kubenswrapper[4750]: I1008 18:30:24.349932 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c9e2d08e-75b7-445b-b563-affabf6d8af6-db-sync-config-data\") pod \"c9e2d08e-75b7-445b-b563-affabf6d8af6\" (UID: \"c9e2d08e-75b7-445b-b563-affabf6d8af6\") " Oct 08 18:30:24 crc kubenswrapper[4750]: I1008 18:30:24.349997 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c9e2d08e-75b7-445b-b563-affabf6d8af6-etc-machine-id\") pod \"c9e2d08e-75b7-445b-b563-affabf6d8af6\" (UID: \"c9e2d08e-75b7-445b-b563-affabf6d8af6\") " Oct 08 18:30:24 crc kubenswrapper[4750]: I1008 18:30:24.350029 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/c9e2d08e-75b7-445b-b563-affabf6d8af6-scripts\") pod \"c9e2d08e-75b7-445b-b563-affabf6d8af6\" (UID: \"c9e2d08e-75b7-445b-b563-affabf6d8af6\") " Oct 08 18:30:24 crc kubenswrapper[4750]: I1008 18:30:24.350053 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dn9hq\" (UniqueName: \"kubernetes.io/projected/c9e2d08e-75b7-445b-b563-affabf6d8af6-kube-api-access-dn9hq\") pod \"c9e2d08e-75b7-445b-b563-affabf6d8af6\" (UID: \"c9e2d08e-75b7-445b-b563-affabf6d8af6\") " Oct 08 18:30:24 crc kubenswrapper[4750]: I1008 18:30:24.350102 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e2d08e-75b7-445b-b563-affabf6d8af6-combined-ca-bundle\") pod \"c9e2d08e-75b7-445b-b563-affabf6d8af6\" (UID: \"c9e2d08e-75b7-445b-b563-affabf6d8af6\") " Oct 08 18:30:24 crc kubenswrapper[4750]: I1008 18:30:24.350297 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9e2d08e-75b7-445b-b563-affabf6d8af6-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c9e2d08e-75b7-445b-b563-affabf6d8af6" (UID: "c9e2d08e-75b7-445b-b563-affabf6d8af6"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 18:30:24 crc kubenswrapper[4750]: I1008 18:30:24.350700 4750 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c9e2d08e-75b7-445b-b563-affabf6d8af6-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:24 crc kubenswrapper[4750]: I1008 18:30:24.355088 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9e2d08e-75b7-445b-b563-affabf6d8af6-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c9e2d08e-75b7-445b-b563-affabf6d8af6" (UID: "c9e2d08e-75b7-445b-b563-affabf6d8af6"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:24 crc kubenswrapper[4750]: I1008 18:30:24.361249 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9e2d08e-75b7-445b-b563-affabf6d8af6-scripts" (OuterVolumeSpecName: "scripts") pod "c9e2d08e-75b7-445b-b563-affabf6d8af6" (UID: "c9e2d08e-75b7-445b-b563-affabf6d8af6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:24 crc kubenswrapper[4750]: I1008 18:30:24.371358 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9e2d08e-75b7-445b-b563-affabf6d8af6-kube-api-access-dn9hq" (OuterVolumeSpecName: "kube-api-access-dn9hq") pod "c9e2d08e-75b7-445b-b563-affabf6d8af6" (UID: "c9e2d08e-75b7-445b-b563-affabf6d8af6"). InnerVolumeSpecName "kube-api-access-dn9hq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:30:24 crc kubenswrapper[4750]: I1008 18:30:24.377323 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-594d9fc688-28msd" Oct 08 18:30:24 crc kubenswrapper[4750]: I1008 18:30:24.414706 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9e2d08e-75b7-445b-b563-affabf6d8af6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9e2d08e-75b7-445b-b563-affabf6d8af6" (UID: "c9e2d08e-75b7-445b-b563-affabf6d8af6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:24 crc kubenswrapper[4750]: I1008 18:30:24.426618 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9e2d08e-75b7-445b-b563-affabf6d8af6-config-data" (OuterVolumeSpecName: "config-data") pod "c9e2d08e-75b7-445b-b563-affabf6d8af6" (UID: "c9e2d08e-75b7-445b-b563-affabf6d8af6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:24 crc kubenswrapper[4750]: I1008 18:30:24.452072 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e2d08e-75b7-445b-b563-affabf6d8af6-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:24 crc kubenswrapper[4750]: I1008 18:30:24.452105 4750 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c9e2d08e-75b7-445b-b563-affabf6d8af6-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:24 crc kubenswrapper[4750]: I1008 18:30:24.452114 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9e2d08e-75b7-445b-b563-affabf6d8af6-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:24 crc kubenswrapper[4750]: I1008 18:30:24.452230 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dn9hq\" (UniqueName: \"kubernetes.io/projected/c9e2d08e-75b7-445b-b563-affabf6d8af6-kube-api-access-dn9hq\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:24 crc kubenswrapper[4750]: I1008 18:30:24.452275 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e2d08e-75b7-445b-b563-affabf6d8af6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:24 crc kubenswrapper[4750]: I1008 18:30:24.765372 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-594d9fc688-28msd" Oct 08 18:30:24 crc kubenswrapper[4750]: I1008 18:30:24.873450 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-84b5b4bdc4-tgxnj"] Oct 08 18:30:24 crc kubenswrapper[4750]: I1008 18:30:24.873743 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-84b5b4bdc4-tgxnj" podUID="2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15" 
containerName="barbican-api-log" containerID="cri-o://4332f3ddc51ef6473a87ed30ca245bc270e0920c831c2122d5168973241da467" gracePeriod=30 Oct 08 18:30:24 crc kubenswrapper[4750]: I1008 18:30:24.873920 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-84b5b4bdc4-tgxnj" podUID="2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15" containerName="barbican-api" containerID="cri-o://84581a0cc525dfa4e10d17d674b20e53250a48975865466afd060d5ffe3f0edf" gracePeriod=30 Oct 08 18:30:24 crc kubenswrapper[4750]: I1008 18:30:24.933967 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-scvbb" event={"ID":"c9e2d08e-75b7-445b-b563-affabf6d8af6","Type":"ContainerDied","Data":"452a29d9d307706cab1d8c8b5a9e21cfabaa134cb4287bdd0aa50bc007ca14fd"} Oct 08 18:30:24 crc kubenswrapper[4750]: I1008 18:30:24.934022 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="452a29d9d307706cab1d8c8b5a9e21cfabaa134cb4287bdd0aa50bc007ca14fd" Oct 08 18:30:24 crc kubenswrapper[4750]: I1008 18:30:24.934119 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-scvbb" Oct 08 18:30:24 crc kubenswrapper[4750]: I1008 18:30:24.934569 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5dc68bd5-hzlg6" Oct 08 18:30:24 crc kubenswrapper[4750]: I1008 18:30:24.986222 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dc68bd5-hzlg6"] Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.029912 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5dc68bd5-hzlg6"] Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.176640 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 18:30:25 crc kubenswrapper[4750]: E1008 18:30:25.177001 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47630cbc-a12d-4112-8f89-4f203bdc9649" containerName="dnsmasq-dns" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.177019 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="47630cbc-a12d-4112-8f89-4f203bdc9649" containerName="dnsmasq-dns" Oct 08 18:30:25 crc kubenswrapper[4750]: E1008 18:30:25.177034 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47630cbc-a12d-4112-8f89-4f203bdc9649" containerName="init" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.177040 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="47630cbc-a12d-4112-8f89-4f203bdc9649" containerName="init" Oct 08 18:30:25 crc kubenswrapper[4750]: E1008 18:30:25.177047 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9e2d08e-75b7-445b-b563-affabf6d8af6" containerName="cinder-db-sync" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.177055 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9e2d08e-75b7-445b-b563-affabf6d8af6" containerName="cinder-db-sync" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.177254 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="47630cbc-a12d-4112-8f89-4f203bdc9649" containerName="dnsmasq-dns" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.177272 4750 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="c9e2d08e-75b7-445b-b563-affabf6d8af6" containerName="cinder-db-sync" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.178588 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.187258 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-s54km" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.187563 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.187669 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.187756 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.203153 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.236845 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84bd785c49-qsxts"] Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.240266 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84bd785c49-qsxts" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.277948 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/173266a7-3201-41b0-bf79-72d53fd66c2a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"173266a7-3201-41b0-bf79-72d53fd66c2a\") " pod="openstack/cinder-scheduler-0" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.278006 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/173266a7-3201-41b0-bf79-72d53fd66c2a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"173266a7-3201-41b0-bf79-72d53fd66c2a\") " pod="openstack/cinder-scheduler-0" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.278071 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/173266a7-3201-41b0-bf79-72d53fd66c2a-scripts\") pod \"cinder-scheduler-0\" (UID: \"173266a7-3201-41b0-bf79-72d53fd66c2a\") " pod="openstack/cinder-scheduler-0" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.278091 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk6r4\" (UniqueName: \"kubernetes.io/projected/173266a7-3201-41b0-bf79-72d53fd66c2a-kube-api-access-kk6r4\") pod \"cinder-scheduler-0\" (UID: \"173266a7-3201-41b0-bf79-72d53fd66c2a\") " pod="openstack/cinder-scheduler-0" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.278114 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/173266a7-3201-41b0-bf79-72d53fd66c2a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"173266a7-3201-41b0-bf79-72d53fd66c2a\") " 
pod="openstack/cinder-scheduler-0" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.278155 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/173266a7-3201-41b0-bf79-72d53fd66c2a-config-data\") pod \"cinder-scheduler-0\" (UID: \"173266a7-3201-41b0-bf79-72d53fd66c2a\") " pod="openstack/cinder-scheduler-0" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.281344 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84bd785c49-qsxts"] Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.344393 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.346278 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.350686 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.373260 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.383568 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/173266a7-3201-41b0-bf79-72d53fd66c2a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"173266a7-3201-41b0-bf79-72d53fd66c2a\") " pod="openstack/cinder-scheduler-0" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.383628 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gmvl\" (UniqueName: \"kubernetes.io/projected/93470753-915e-4675-9ecc-6942de332cd4-kube-api-access-6gmvl\") pod \"dnsmasq-dns-84bd785c49-qsxts\" (UID: \"93470753-915e-4675-9ecc-6942de332cd4\") " 
pod="openstack/dnsmasq-dns-84bd785c49-qsxts" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.383655 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/173266a7-3201-41b0-bf79-72d53fd66c2a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"173266a7-3201-41b0-bf79-72d53fd66c2a\") " pod="openstack/cinder-scheduler-0" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.383704 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93470753-915e-4675-9ecc-6942de332cd4-ovsdbserver-nb\") pod \"dnsmasq-dns-84bd785c49-qsxts\" (UID: \"93470753-915e-4675-9ecc-6942de332cd4\") " pod="openstack/dnsmasq-dns-84bd785c49-qsxts" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.383724 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93470753-915e-4675-9ecc-6942de332cd4-ovsdbserver-sb\") pod \"dnsmasq-dns-84bd785c49-qsxts\" (UID: \"93470753-915e-4675-9ecc-6942de332cd4\") " pod="openstack/dnsmasq-dns-84bd785c49-qsxts" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.383753 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/173266a7-3201-41b0-bf79-72d53fd66c2a-scripts\") pod \"cinder-scheduler-0\" (UID: \"173266a7-3201-41b0-bf79-72d53fd66c2a\") " pod="openstack/cinder-scheduler-0" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.383775 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kk6r4\" (UniqueName: \"kubernetes.io/projected/173266a7-3201-41b0-bf79-72d53fd66c2a-kube-api-access-kk6r4\") pod \"cinder-scheduler-0\" (UID: \"173266a7-3201-41b0-bf79-72d53fd66c2a\") " pod="openstack/cinder-scheduler-0" Oct 08 18:30:25 crc 
kubenswrapper[4750]: I1008 18:30:25.383845 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/173266a7-3201-41b0-bf79-72d53fd66c2a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"173266a7-3201-41b0-bf79-72d53fd66c2a\") " pod="openstack/cinder-scheduler-0" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.383872 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93470753-915e-4675-9ecc-6942de332cd4-dns-svc\") pod \"dnsmasq-dns-84bd785c49-qsxts\" (UID: \"93470753-915e-4675-9ecc-6942de332cd4\") " pod="openstack/dnsmasq-dns-84bd785c49-qsxts" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.383895 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93470753-915e-4675-9ecc-6942de332cd4-config\") pod \"dnsmasq-dns-84bd785c49-qsxts\" (UID: \"93470753-915e-4675-9ecc-6942de332cd4\") " pod="openstack/dnsmasq-dns-84bd785c49-qsxts" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.383910 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/93470753-915e-4675-9ecc-6942de332cd4-dns-swift-storage-0\") pod \"dnsmasq-dns-84bd785c49-qsxts\" (UID: \"93470753-915e-4675-9ecc-6942de332cd4\") " pod="openstack/dnsmasq-dns-84bd785c49-qsxts" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.383941 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/173266a7-3201-41b0-bf79-72d53fd66c2a-config-data\") pod \"cinder-scheduler-0\" (UID: \"173266a7-3201-41b0-bf79-72d53fd66c2a\") " pod="openstack/cinder-scheduler-0" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.385746 4750 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/173266a7-3201-41b0-bf79-72d53fd66c2a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"173266a7-3201-41b0-bf79-72d53fd66c2a\") " pod="openstack/cinder-scheduler-0" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.403231 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/173266a7-3201-41b0-bf79-72d53fd66c2a-scripts\") pod \"cinder-scheduler-0\" (UID: \"173266a7-3201-41b0-bf79-72d53fd66c2a\") " pod="openstack/cinder-scheduler-0" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.403404 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/173266a7-3201-41b0-bf79-72d53fd66c2a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"173266a7-3201-41b0-bf79-72d53fd66c2a\") " pod="openstack/cinder-scheduler-0" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.410474 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/173266a7-3201-41b0-bf79-72d53fd66c2a-config-data\") pod \"cinder-scheduler-0\" (UID: \"173266a7-3201-41b0-bf79-72d53fd66c2a\") " pod="openstack/cinder-scheduler-0" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.416145 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/173266a7-3201-41b0-bf79-72d53fd66c2a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"173266a7-3201-41b0-bf79-72d53fd66c2a\") " pod="openstack/cinder-scheduler-0" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.417256 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk6r4\" (UniqueName: \"kubernetes.io/projected/173266a7-3201-41b0-bf79-72d53fd66c2a-kube-api-access-kk6r4\") pod 
\"cinder-scheduler-0\" (UID: \"173266a7-3201-41b0-bf79-72d53fd66c2a\") " pod="openstack/cinder-scheduler-0" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.487465 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gmvl\" (UniqueName: \"kubernetes.io/projected/93470753-915e-4675-9ecc-6942de332cd4-kube-api-access-6gmvl\") pod \"dnsmasq-dns-84bd785c49-qsxts\" (UID: \"93470753-915e-4675-9ecc-6942de332cd4\") " pod="openstack/dnsmasq-dns-84bd785c49-qsxts" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.487526 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19fe7265-7f86-4acd-95a1-0729ed834f0d-logs\") pod \"cinder-api-0\" (UID: \"19fe7265-7f86-4acd-95a1-0729ed834f0d\") " pod="openstack/cinder-api-0" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.487588 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/19fe7265-7f86-4acd-95a1-0729ed834f0d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"19fe7265-7f86-4acd-95a1-0729ed834f0d\") " pod="openstack/cinder-api-0" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.487622 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnzsm\" (UniqueName: \"kubernetes.io/projected/19fe7265-7f86-4acd-95a1-0729ed834f0d-kube-api-access-pnzsm\") pod \"cinder-api-0\" (UID: \"19fe7265-7f86-4acd-95a1-0729ed834f0d\") " pod="openstack/cinder-api-0" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.487651 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93470753-915e-4675-9ecc-6942de332cd4-ovsdbserver-nb\") pod \"dnsmasq-dns-84bd785c49-qsxts\" (UID: \"93470753-915e-4675-9ecc-6942de332cd4\") " 
pod="openstack/dnsmasq-dns-84bd785c49-qsxts" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.487671 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93470753-915e-4675-9ecc-6942de332cd4-ovsdbserver-sb\") pod \"dnsmasq-dns-84bd785c49-qsxts\" (UID: \"93470753-915e-4675-9ecc-6942de332cd4\") " pod="openstack/dnsmasq-dns-84bd785c49-qsxts" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.487701 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19fe7265-7f86-4acd-95a1-0729ed834f0d-config-data\") pod \"cinder-api-0\" (UID: \"19fe7265-7f86-4acd-95a1-0729ed834f0d\") " pod="openstack/cinder-api-0" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.487734 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93470753-915e-4675-9ecc-6942de332cd4-dns-svc\") pod \"dnsmasq-dns-84bd785c49-qsxts\" (UID: \"93470753-915e-4675-9ecc-6942de332cd4\") " pod="openstack/dnsmasq-dns-84bd785c49-qsxts" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.487752 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93470753-915e-4675-9ecc-6942de332cd4-config\") pod \"dnsmasq-dns-84bd785c49-qsxts\" (UID: \"93470753-915e-4675-9ecc-6942de332cd4\") " pod="openstack/dnsmasq-dns-84bd785c49-qsxts" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.487769 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/93470753-915e-4675-9ecc-6942de332cd4-dns-swift-storage-0\") pod \"dnsmasq-dns-84bd785c49-qsxts\" (UID: \"93470753-915e-4675-9ecc-6942de332cd4\") " pod="openstack/dnsmasq-dns-84bd785c49-qsxts" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 
18:30:25.487788 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19fe7265-7f86-4acd-95a1-0729ed834f0d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"19fe7265-7f86-4acd-95a1-0729ed834f0d\") " pod="openstack/cinder-api-0" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.487816 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19fe7265-7f86-4acd-95a1-0729ed834f0d-scripts\") pod \"cinder-api-0\" (UID: \"19fe7265-7f86-4acd-95a1-0729ed834f0d\") " pod="openstack/cinder-api-0" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.487858 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/19fe7265-7f86-4acd-95a1-0729ed834f0d-config-data-custom\") pod \"cinder-api-0\" (UID: \"19fe7265-7f86-4acd-95a1-0729ed834f0d\") " pod="openstack/cinder-api-0" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.488870 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93470753-915e-4675-9ecc-6942de332cd4-ovsdbserver-nb\") pod \"dnsmasq-dns-84bd785c49-qsxts\" (UID: \"93470753-915e-4675-9ecc-6942de332cd4\") " pod="openstack/dnsmasq-dns-84bd785c49-qsxts" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.489338 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93470753-915e-4675-9ecc-6942de332cd4-ovsdbserver-sb\") pod \"dnsmasq-dns-84bd785c49-qsxts\" (UID: \"93470753-915e-4675-9ecc-6942de332cd4\") " pod="openstack/dnsmasq-dns-84bd785c49-qsxts" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.489849 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/93470753-915e-4675-9ecc-6942de332cd4-dns-svc\") pod \"dnsmasq-dns-84bd785c49-qsxts\" (UID: \"93470753-915e-4675-9ecc-6942de332cd4\") " pod="openstack/dnsmasq-dns-84bd785c49-qsxts" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.490348 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93470753-915e-4675-9ecc-6942de332cd4-config\") pod \"dnsmasq-dns-84bd785c49-qsxts\" (UID: \"93470753-915e-4675-9ecc-6942de332cd4\") " pod="openstack/dnsmasq-dns-84bd785c49-qsxts" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.505402 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/93470753-915e-4675-9ecc-6942de332cd4-dns-swift-storage-0\") pod \"dnsmasq-dns-84bd785c49-qsxts\" (UID: \"93470753-915e-4675-9ecc-6942de332cd4\") " pod="openstack/dnsmasq-dns-84bd785c49-qsxts" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.512019 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.524501 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gmvl\" (UniqueName: \"kubernetes.io/projected/93470753-915e-4675-9ecc-6942de332cd4-kube-api-access-6gmvl\") pod \"dnsmasq-dns-84bd785c49-qsxts\" (UID: \"93470753-915e-4675-9ecc-6942de332cd4\") " pod="openstack/dnsmasq-dns-84bd785c49-qsxts" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.578143 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84bd785c49-qsxts" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.589636 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19fe7265-7f86-4acd-95a1-0729ed834f0d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"19fe7265-7f86-4acd-95a1-0729ed834f0d\") " pod="openstack/cinder-api-0" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.589678 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19fe7265-7f86-4acd-95a1-0729ed834f0d-scripts\") pod \"cinder-api-0\" (UID: \"19fe7265-7f86-4acd-95a1-0729ed834f0d\") " pod="openstack/cinder-api-0" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.589726 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/19fe7265-7f86-4acd-95a1-0729ed834f0d-config-data-custom\") pod \"cinder-api-0\" (UID: \"19fe7265-7f86-4acd-95a1-0729ed834f0d\") " pod="openstack/cinder-api-0" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.589770 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19fe7265-7f86-4acd-95a1-0729ed834f0d-logs\") pod \"cinder-api-0\" (UID: \"19fe7265-7f86-4acd-95a1-0729ed834f0d\") " pod="openstack/cinder-api-0" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.589804 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/19fe7265-7f86-4acd-95a1-0729ed834f0d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"19fe7265-7f86-4acd-95a1-0729ed834f0d\") " pod="openstack/cinder-api-0" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.589831 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnzsm\" 
(UniqueName: \"kubernetes.io/projected/19fe7265-7f86-4acd-95a1-0729ed834f0d-kube-api-access-pnzsm\") pod \"cinder-api-0\" (UID: \"19fe7265-7f86-4acd-95a1-0729ed834f0d\") " pod="openstack/cinder-api-0" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.589868 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19fe7265-7f86-4acd-95a1-0729ed834f0d-config-data\") pod \"cinder-api-0\" (UID: \"19fe7265-7f86-4acd-95a1-0729ed834f0d\") " pod="openstack/cinder-api-0" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.590474 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19fe7265-7f86-4acd-95a1-0729ed834f0d-logs\") pod \"cinder-api-0\" (UID: \"19fe7265-7f86-4acd-95a1-0729ed834f0d\") " pod="openstack/cinder-api-0" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.591774 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/19fe7265-7f86-4acd-95a1-0729ed834f0d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"19fe7265-7f86-4acd-95a1-0729ed834f0d\") " pod="openstack/cinder-api-0" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.594683 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19fe7265-7f86-4acd-95a1-0729ed834f0d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"19fe7265-7f86-4acd-95a1-0729ed834f0d\") " pod="openstack/cinder-api-0" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.597971 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/19fe7265-7f86-4acd-95a1-0729ed834f0d-config-data-custom\") pod \"cinder-api-0\" (UID: \"19fe7265-7f86-4acd-95a1-0729ed834f0d\") " pod="openstack/cinder-api-0" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.599203 
4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19fe7265-7f86-4acd-95a1-0729ed834f0d-config-data\") pod \"cinder-api-0\" (UID: \"19fe7265-7f86-4acd-95a1-0729ed834f0d\") " pod="openstack/cinder-api-0" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.620199 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19fe7265-7f86-4acd-95a1-0729ed834f0d-scripts\") pod \"cinder-api-0\" (UID: \"19fe7265-7f86-4acd-95a1-0729ed834f0d\") " pod="openstack/cinder-api-0" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.627317 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnzsm\" (UniqueName: \"kubernetes.io/projected/19fe7265-7f86-4acd-95a1-0729ed834f0d-kube-api-access-pnzsm\") pod \"cinder-api-0\" (UID: \"19fe7265-7f86-4acd-95a1-0729ed834f0d\") " pod="openstack/cinder-api-0" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.674985 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.952529 4750 generic.go:334] "Generic (PLEG): container finished" podID="2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15" containerID="4332f3ddc51ef6473a87ed30ca245bc270e0920c831c2122d5168973241da467" exitCode=143 Oct 08 18:30:25 crc kubenswrapper[4750]: I1008 18:30:25.952826 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84b5b4bdc4-tgxnj" event={"ID":"2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15","Type":"ContainerDied","Data":"4332f3ddc51ef6473a87ed30ca245bc270e0920c831c2122d5168973241da467"} Oct 08 18:30:26 crc kubenswrapper[4750]: I1008 18:30:26.145930 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 18:30:26 crc kubenswrapper[4750]: W1008 18:30:26.152028 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod173266a7_3201_41b0_bf79_72d53fd66c2a.slice/crio-1d3fe6e41369e7f9f0fc7b0b8f076b10d7a03b3c78a7e88364bf72a098630996 WatchSource:0}: Error finding container 1d3fe6e41369e7f9f0fc7b0b8f076b10d7a03b3c78a7e88364bf72a098630996: Status 404 returned error can't find the container with id 1d3fe6e41369e7f9f0fc7b0b8f076b10d7a03b3c78a7e88364bf72a098630996 Oct 08 18:30:26 crc kubenswrapper[4750]: I1008 18:30:26.275809 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 08 18:30:26 crc kubenswrapper[4750]: W1008 18:30:26.286677 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19fe7265_7f86_4acd_95a1_0729ed834f0d.slice/crio-62c505b289e16e3be66c0dd3b412405b43e776f7e048d950e81dbcd2595b644a WatchSource:0}: Error finding container 62c505b289e16e3be66c0dd3b412405b43e776f7e048d950e81dbcd2595b644a: Status 404 returned error can't find the container with id 
62c505b289e16e3be66c0dd3b412405b43e776f7e048d950e81dbcd2595b644a Oct 08 18:30:26 crc kubenswrapper[4750]: I1008 18:30:26.376991 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84bd785c49-qsxts"] Oct 08 18:30:26 crc kubenswrapper[4750]: W1008 18:30:26.381382 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93470753_915e_4675_9ecc_6942de332cd4.slice/crio-4e29f45e32330d153539c27dcf19fd6eeb652daa9b10aad6cd8dbf6c716c6eb3 WatchSource:0}: Error finding container 4e29f45e32330d153539c27dcf19fd6eeb652daa9b10aad6cd8dbf6c716c6eb3: Status 404 returned error can't find the container with id 4e29f45e32330d153539c27dcf19fd6eeb652daa9b10aad6cd8dbf6c716c6eb3 Oct 08 18:30:26 crc kubenswrapper[4750]: I1008 18:30:26.660248 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 18:30:26 crc kubenswrapper[4750]: I1008 18:30:26.661871 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="4562f82c-cf9e-4b63-bc8a-079a35401232" containerName="glance-log" containerID="cri-o://07f77d60fd9c23dd677f1ff78fefb6c3cf307b23e20e0edb4ff4bf60ed6b2d0c" gracePeriod=30 Oct 08 18:30:26 crc kubenswrapper[4750]: I1008 18:30:26.662409 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="4562f82c-cf9e-4b63-bc8a-079a35401232" containerName="glance-httpd" containerID="cri-o://38270156d6423a8c5864c7bc1267940a2d7348e6f8289504cdf3ae28b7a1663d" gracePeriod=30 Oct 08 18:30:26 crc kubenswrapper[4750]: I1008 18:30:26.755461 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47630cbc-a12d-4112-8f89-4f203bdc9649" path="/var/lib/kubelet/pods/47630cbc-a12d-4112-8f89-4f203bdc9649/volumes" Oct 08 18:30:26 crc kubenswrapper[4750]: I1008 18:30:26.967885 4750 generic.go:334] "Generic 
(PLEG): container finished" podID="93470753-915e-4675-9ecc-6942de332cd4" containerID="1ebda8fdacd7b2332560b60b9863fe605e2cd281473db07a2fc870cc4e3902b5" exitCode=0 Oct 08 18:30:26 crc kubenswrapper[4750]: I1008 18:30:26.967967 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bd785c49-qsxts" event={"ID":"93470753-915e-4675-9ecc-6942de332cd4","Type":"ContainerDied","Data":"1ebda8fdacd7b2332560b60b9863fe605e2cd281473db07a2fc870cc4e3902b5"} Oct 08 18:30:26 crc kubenswrapper[4750]: I1008 18:30:26.967995 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bd785c49-qsxts" event={"ID":"93470753-915e-4675-9ecc-6942de332cd4","Type":"ContainerStarted","Data":"4e29f45e32330d153539c27dcf19fd6eeb652daa9b10aad6cd8dbf6c716c6eb3"} Oct 08 18:30:27 crc kubenswrapper[4750]: I1008 18:30:27.002898 4750 generic.go:334] "Generic (PLEG): container finished" podID="4562f82c-cf9e-4b63-bc8a-079a35401232" containerID="07f77d60fd9c23dd677f1ff78fefb6c3cf307b23e20e0edb4ff4bf60ed6b2d0c" exitCode=143 Oct 08 18:30:27 crc kubenswrapper[4750]: I1008 18:30:27.002957 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4562f82c-cf9e-4b63-bc8a-079a35401232","Type":"ContainerDied","Data":"07f77d60fd9c23dd677f1ff78fefb6c3cf307b23e20e0edb4ff4bf60ed6b2d0c"} Oct 08 18:30:27 crc kubenswrapper[4750]: I1008 18:30:27.013971 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"173266a7-3201-41b0-bf79-72d53fd66c2a","Type":"ContainerStarted","Data":"1d3fe6e41369e7f9f0fc7b0b8f076b10d7a03b3c78a7e88364bf72a098630996"} Oct 08 18:30:27 crc kubenswrapper[4750]: I1008 18:30:27.017657 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"19fe7265-7f86-4acd-95a1-0729ed834f0d","Type":"ContainerStarted","Data":"62c505b289e16e3be66c0dd3b412405b43e776f7e048d950e81dbcd2595b644a"} Oct 08 18:30:27 crc 
kubenswrapper[4750]: I1008 18:30:27.069503 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 08 18:30:27 crc kubenswrapper[4750]: I1008 18:30:27.478124 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 18:30:27 crc kubenswrapper[4750]: I1008 18:30:27.478670 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b09ec256-29a4-497c-a23c-eef4c9cd47c8" containerName="ceilometer-central-agent" containerID="cri-o://a6c5333f97073c7e0a7764e13969d39084a115cf6189c8ed7e25a6e119e5c1aa" gracePeriod=30 Oct 08 18:30:27 crc kubenswrapper[4750]: I1008 18:30:27.478797 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b09ec256-29a4-497c-a23c-eef4c9cd47c8" containerName="proxy-httpd" containerID="cri-o://643e190066b829a3083477da09f84a967e35d9d74ad7ccefb591fccad3bd45f4" gracePeriod=30 Oct 08 18:30:27 crc kubenswrapper[4750]: I1008 18:30:27.479024 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b09ec256-29a4-497c-a23c-eef4c9cd47c8" containerName="ceilometer-notification-agent" containerID="cri-o://449568f0c50c9704c3f47a5824cc562e8d8af6eba1cf615d282d1b577637bc7f" gracePeriod=30 Oct 08 18:30:27 crc kubenswrapper[4750]: I1008 18:30:27.481268 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b09ec256-29a4-497c-a23c-eef4c9cd47c8" containerName="sg-core" containerID="cri-o://7823d1e431dc2a1096cdb43de1b9c21e22ffbea3005521e897408676d332fb99" gracePeriod=30 Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.042989 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bd785c49-qsxts" event={"ID":"93470753-915e-4675-9ecc-6942de332cd4","Type":"ContainerStarted","Data":"e491d282a1964ea6486700cedfdc02fe6e80d07443042915a7c5898b1fe1097c"} Oct 08 
18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.044408 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-84bd785c49-qsxts" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.048132 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"173266a7-3201-41b0-bf79-72d53fd66c2a","Type":"ContainerStarted","Data":"5e638894504edd794d7ce2c06f083eaa1e1b56d38dfb14d325c0dcf4b159293a"} Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.054448 4750 generic.go:334] "Generic (PLEG): container finished" podID="b09ec256-29a4-497c-a23c-eef4c9cd47c8" containerID="643e190066b829a3083477da09f84a967e35d9d74ad7ccefb591fccad3bd45f4" exitCode=0 Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.054476 4750 generic.go:334] "Generic (PLEG): container finished" podID="b09ec256-29a4-497c-a23c-eef4c9cd47c8" containerID="7823d1e431dc2a1096cdb43de1b9c21e22ffbea3005521e897408676d332fb99" exitCode=2 Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.054484 4750 generic.go:334] "Generic (PLEG): container finished" podID="b09ec256-29a4-497c-a23c-eef4c9cd47c8" containerID="a6c5333f97073c7e0a7764e13969d39084a115cf6189c8ed7e25a6e119e5c1aa" exitCode=0 Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.054520 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b09ec256-29a4-497c-a23c-eef4c9cd47c8","Type":"ContainerDied","Data":"643e190066b829a3083477da09f84a967e35d9d74ad7ccefb591fccad3bd45f4"} Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.054541 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b09ec256-29a4-497c-a23c-eef4c9cd47c8","Type":"ContainerDied","Data":"7823d1e431dc2a1096cdb43de1b9c21e22ffbea3005521e897408676d332fb99"} Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.054564 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"b09ec256-29a4-497c-a23c-eef4c9cd47c8","Type":"ContainerDied","Data":"a6c5333f97073c7e0a7764e13969d39084a115cf6189c8ed7e25a6e119e5c1aa"} Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.057179 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"19fe7265-7f86-4acd-95a1-0729ed834f0d","Type":"ContainerStarted","Data":"9c04d574f99861a3dae0161b412e284aed8e897fc1e2f491dddbadbcf904004c"} Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.057204 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"19fe7265-7f86-4acd-95a1-0729ed834f0d","Type":"ContainerStarted","Data":"71adc0ddaf1cd4398295556bffc05adb8fde142d5f65177770ffeb1ed8604897"} Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.057326 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="19fe7265-7f86-4acd-95a1-0729ed834f0d" containerName="cinder-api-log" containerID="cri-o://71adc0ddaf1cd4398295556bffc05adb8fde142d5f65177770ffeb1ed8604897" gracePeriod=30 Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.057404 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.057434 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="19fe7265-7f86-4acd-95a1-0729ed834f0d" containerName="cinder-api" containerID="cri-o://9c04d574f99861a3dae0161b412e284aed8e897fc1e2f491dddbadbcf904004c" gracePeriod=30 Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.061537 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84bd785c49-qsxts" podStartSLOduration=3.061526219 podStartE2EDuration="3.061526219s" podCreationTimestamp="2025-10-08 18:30:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-10-08 18:30:28.058034183 +0000 UTC m=+1183.971005196" watchObservedRunningTime="2025-10-08 18:30:28.061526219 +0000 UTC m=+1183.974497232" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.458419 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.458402632 podStartE2EDuration="3.458402632s" podCreationTimestamp="2025-10-08 18:30:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:30:28.084630695 +0000 UTC m=+1183.997601708" watchObservedRunningTime="2025-10-08 18:30:28.458402632 +0000 UTC m=+1184.371373645" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.465015 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-74b95c857c-677wg"] Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.471760 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-74b95c857c-677wg" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.474539 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.474728 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.476523 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.503162 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-74b95c857c-677wg"] Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.563529 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/c218b865-c7d1-4f46-ad6d-8e102b6af491-etc-swift\") pod \"swift-proxy-74b95c857c-677wg\" (UID: \"c218b865-c7d1-4f46-ad6d-8e102b6af491\") " pod="openstack/swift-proxy-74b95c857c-677wg" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.563592 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c218b865-c7d1-4f46-ad6d-8e102b6af491-config-data\") pod \"swift-proxy-74b95c857c-677wg\" (UID: \"c218b865-c7d1-4f46-ad6d-8e102b6af491\") " pod="openstack/swift-proxy-74b95c857c-677wg" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.563613 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c218b865-c7d1-4f46-ad6d-8e102b6af491-internal-tls-certs\") pod \"swift-proxy-74b95c857c-677wg\" (UID: \"c218b865-c7d1-4f46-ad6d-8e102b6af491\") " pod="openstack/swift-proxy-74b95c857c-677wg" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.563636 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c218b865-c7d1-4f46-ad6d-8e102b6af491-run-httpd\") pod \"swift-proxy-74b95c857c-677wg\" (UID: \"c218b865-c7d1-4f46-ad6d-8e102b6af491\") " pod="openstack/swift-proxy-74b95c857c-677wg" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.563722 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqfql\" (UniqueName: \"kubernetes.io/projected/c218b865-c7d1-4f46-ad6d-8e102b6af491-kube-api-access-wqfql\") pod \"swift-proxy-74b95c857c-677wg\" (UID: \"c218b865-c7d1-4f46-ad6d-8e102b6af491\") " pod="openstack/swift-proxy-74b95c857c-677wg" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.563805 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c218b865-c7d1-4f46-ad6d-8e102b6af491-log-httpd\") pod \"swift-proxy-74b95c857c-677wg\" (UID: \"c218b865-c7d1-4f46-ad6d-8e102b6af491\") " pod="openstack/swift-proxy-74b95c857c-677wg" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.563831 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c218b865-c7d1-4f46-ad6d-8e102b6af491-public-tls-certs\") pod \"swift-proxy-74b95c857c-677wg\" (UID: \"c218b865-c7d1-4f46-ad6d-8e102b6af491\") " pod="openstack/swift-proxy-74b95c857c-677wg" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.563857 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c218b865-c7d1-4f46-ad6d-8e102b6af491-combined-ca-bundle\") pod \"swift-proxy-74b95c857c-677wg\" (UID: \"c218b865-c7d1-4f46-ad6d-8e102b6af491\") " pod="openstack/swift-proxy-74b95c857c-677wg" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.601704 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-84b5b4bdc4-tgxnj" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.665150 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8b2sv\" (UniqueName: \"kubernetes.io/projected/2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15-kube-api-access-8b2sv\") pod \"2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15\" (UID: \"2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15\") " Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.665209 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15-logs\") pod \"2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15\" (UID: \"2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15\") " Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.665349 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15-config-data-custom\") pod \"2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15\" (UID: \"2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15\") " Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.665431 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15-config-data\") pod \"2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15\" (UID: \"2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15\") " Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.665457 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15-combined-ca-bundle\") pod \"2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15\" (UID: \"2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15\") " Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.665683 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c218b865-c7d1-4f46-ad6d-8e102b6af491-internal-tls-certs\") pod \"swift-proxy-74b95c857c-677wg\" (UID: \"c218b865-c7d1-4f46-ad6d-8e102b6af491\") " pod="openstack/swift-proxy-74b95c857c-677wg" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.665704 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c218b865-c7d1-4f46-ad6d-8e102b6af491-run-httpd\") pod \"swift-proxy-74b95c857c-677wg\" (UID: \"c218b865-c7d1-4f46-ad6d-8e102b6af491\") " pod="openstack/swift-proxy-74b95c857c-677wg" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.665756 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqfql\" (UniqueName: \"kubernetes.io/projected/c218b865-c7d1-4f46-ad6d-8e102b6af491-kube-api-access-wqfql\") pod \"swift-proxy-74b95c857c-677wg\" (UID: \"c218b865-c7d1-4f46-ad6d-8e102b6af491\") " pod="openstack/swift-proxy-74b95c857c-677wg" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.665881 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15-logs" (OuterVolumeSpecName: "logs") pod "2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15" (UID: "2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.666440 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c218b865-c7d1-4f46-ad6d-8e102b6af491-log-httpd\") pod \"swift-proxy-74b95c857c-677wg\" (UID: \"c218b865-c7d1-4f46-ad6d-8e102b6af491\") " pod="openstack/swift-proxy-74b95c857c-677wg" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.666487 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c218b865-c7d1-4f46-ad6d-8e102b6af491-public-tls-certs\") pod \"swift-proxy-74b95c857c-677wg\" (UID: \"c218b865-c7d1-4f46-ad6d-8e102b6af491\") " pod="openstack/swift-proxy-74b95c857c-677wg" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.666516 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c218b865-c7d1-4f46-ad6d-8e102b6af491-combined-ca-bundle\") pod \"swift-proxy-74b95c857c-677wg\" (UID: \"c218b865-c7d1-4f46-ad6d-8e102b6af491\") " pod="openstack/swift-proxy-74b95c857c-677wg" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.666623 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c218b865-c7d1-4f46-ad6d-8e102b6af491-etc-swift\") pod \"swift-proxy-74b95c857c-677wg\" (UID: \"c218b865-c7d1-4f46-ad6d-8e102b6af491\") " pod="openstack/swift-proxy-74b95c857c-677wg" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.666646 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c218b865-c7d1-4f46-ad6d-8e102b6af491-config-data\") pod \"swift-proxy-74b95c857c-677wg\" (UID: \"c218b865-c7d1-4f46-ad6d-8e102b6af491\") " pod="openstack/swift-proxy-74b95c857c-677wg" Oct 08 18:30:28 crc 
kubenswrapper[4750]: I1008 18:30:28.667942 4750 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15-logs\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.668381 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c218b865-c7d1-4f46-ad6d-8e102b6af491-run-httpd\") pod \"swift-proxy-74b95c857c-677wg\" (UID: \"c218b865-c7d1-4f46-ad6d-8e102b6af491\") " pod="openstack/swift-proxy-74b95c857c-677wg" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.670310 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c218b865-c7d1-4f46-ad6d-8e102b6af491-log-httpd\") pod \"swift-proxy-74b95c857c-677wg\" (UID: \"c218b865-c7d1-4f46-ad6d-8e102b6af491\") " pod="openstack/swift-proxy-74b95c857c-677wg" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.674639 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c218b865-c7d1-4f46-ad6d-8e102b6af491-internal-tls-certs\") pod \"swift-proxy-74b95c857c-677wg\" (UID: \"c218b865-c7d1-4f46-ad6d-8e102b6af491\") " pod="openstack/swift-proxy-74b95c857c-677wg" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.674696 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c218b865-c7d1-4f46-ad6d-8e102b6af491-public-tls-certs\") pod \"swift-proxy-74b95c857c-677wg\" (UID: \"c218b865-c7d1-4f46-ad6d-8e102b6af491\") " pod="openstack/swift-proxy-74b95c857c-677wg" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.675793 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15-config-data-custom" (OuterVolumeSpecName: 
"config-data-custom") pod "2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15" (UID: "2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.676574 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c218b865-c7d1-4f46-ad6d-8e102b6af491-combined-ca-bundle\") pod \"swift-proxy-74b95c857c-677wg\" (UID: \"c218b865-c7d1-4f46-ad6d-8e102b6af491\") " pod="openstack/swift-proxy-74b95c857c-677wg" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.676769 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15-kube-api-access-8b2sv" (OuterVolumeSpecName: "kube-api-access-8b2sv") pod "2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15" (UID: "2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15"). InnerVolumeSpecName "kube-api-access-8b2sv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.677457 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c218b865-c7d1-4f46-ad6d-8e102b6af491-etc-swift\") pod \"swift-proxy-74b95c857c-677wg\" (UID: \"c218b865-c7d1-4f46-ad6d-8e102b6af491\") " pod="openstack/swift-proxy-74b95c857c-677wg" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.688620 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c218b865-c7d1-4f46-ad6d-8e102b6af491-config-data\") pod \"swift-proxy-74b95c857c-677wg\" (UID: \"c218b865-c7d1-4f46-ad6d-8e102b6af491\") " pod="openstack/swift-proxy-74b95c857c-677wg" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.706260 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqfql\" (UniqueName: 
\"kubernetes.io/projected/c218b865-c7d1-4f46-ad6d-8e102b6af491-kube-api-access-wqfql\") pod \"swift-proxy-74b95c857c-677wg\" (UID: \"c218b865-c7d1-4f46-ad6d-8e102b6af491\") " pod="openstack/swift-proxy-74b95c857c-677wg" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.714610 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15" (UID: "2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.750442 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15-config-data" (OuterVolumeSpecName: "config-data") pod "2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15" (UID: "2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.769896 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8b2sv\" (UniqueName: \"kubernetes.io/projected/2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15-kube-api-access-8b2sv\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.769930 4750 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.769944 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.769956 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.854954 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.871657 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddgbr\" (UniqueName: \"kubernetes.io/projected/b09ec256-29a4-497c-a23c-eef4c9cd47c8-kube-api-access-ddgbr\") pod \"b09ec256-29a4-497c-a23c-eef4c9cd47c8\" (UID: \"b09ec256-29a4-497c-a23c-eef4c9cd47c8\") " Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.871858 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b09ec256-29a4-497c-a23c-eef4c9cd47c8-log-httpd\") pod \"b09ec256-29a4-497c-a23c-eef4c9cd47c8\" (UID: \"b09ec256-29a4-497c-a23c-eef4c9cd47c8\") " Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.871904 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b09ec256-29a4-497c-a23c-eef4c9cd47c8-sg-core-conf-yaml\") pod \"b09ec256-29a4-497c-a23c-eef4c9cd47c8\" (UID: \"b09ec256-29a4-497c-a23c-eef4c9cd47c8\") " Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.871956 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b09ec256-29a4-497c-a23c-eef4c9cd47c8-scripts\") pod \"b09ec256-29a4-497c-a23c-eef4c9cd47c8\" (UID: \"b09ec256-29a4-497c-a23c-eef4c9cd47c8\") " Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.872041 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b09ec256-29a4-497c-a23c-eef4c9cd47c8-run-httpd\") pod \"b09ec256-29a4-497c-a23c-eef4c9cd47c8\" (UID: \"b09ec256-29a4-497c-a23c-eef4c9cd47c8\") " Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.872106 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b09ec256-29a4-497c-a23c-eef4c9cd47c8-combined-ca-bundle\") pod \"b09ec256-29a4-497c-a23c-eef4c9cd47c8\" (UID: \"b09ec256-29a4-497c-a23c-eef4c9cd47c8\") " Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.872169 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b09ec256-29a4-497c-a23c-eef4c9cd47c8-config-data\") pod \"b09ec256-29a4-497c-a23c-eef4c9cd47c8\" (UID: \"b09ec256-29a4-497c-a23c-eef4c9cd47c8\") " Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.872495 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b09ec256-29a4-497c-a23c-eef4c9cd47c8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b09ec256-29a4-497c-a23c-eef4c9cd47c8" (UID: "b09ec256-29a4-497c-a23c-eef4c9cd47c8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.874380 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b09ec256-29a4-497c-a23c-eef4c9cd47c8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b09ec256-29a4-497c-a23c-eef4c9cd47c8" (UID: "b09ec256-29a4-497c-a23c-eef4c9cd47c8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.880696 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b09ec256-29a4-497c-a23c-eef4c9cd47c8-scripts" (OuterVolumeSpecName: "scripts") pod "b09ec256-29a4-497c-a23c-eef4c9cd47c8" (UID: "b09ec256-29a4-497c-a23c-eef4c9cd47c8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.881087 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b09ec256-29a4-497c-a23c-eef4c9cd47c8-kube-api-access-ddgbr" (OuterVolumeSpecName: "kube-api-access-ddgbr") pod "b09ec256-29a4-497c-a23c-eef4c9cd47c8" (UID: "b09ec256-29a4-497c-a23c-eef4c9cd47c8"). InnerVolumeSpecName "kube-api-access-ddgbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.900067 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-74b95c857c-677wg" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.905705 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b09ec256-29a4-497c-a23c-eef4c9cd47c8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b09ec256-29a4-497c-a23c-eef4c9cd47c8" (UID: "b09ec256-29a4-497c-a23c-eef4c9cd47c8"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.975155 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddgbr\" (UniqueName: \"kubernetes.io/projected/b09ec256-29a4-497c-a23c-eef4c9cd47c8-kube-api-access-ddgbr\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.975196 4750 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b09ec256-29a4-497c-a23c-eef4c9cd47c8-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.975209 4750 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b09ec256-29a4-497c-a23c-eef4c9cd47c8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.975220 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b09ec256-29a4-497c-a23c-eef4c9cd47c8-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.975231 4750 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b09ec256-29a4-497c-a23c-eef4c9cd47c8-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.976508 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b09ec256-29a4-497c-a23c-eef4c9cd47c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b09ec256-29a4-497c-a23c-eef4c9cd47c8" (UID: "b09ec256-29a4-497c-a23c-eef4c9cd47c8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:28 crc kubenswrapper[4750]: I1008 18:30:28.993711 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b09ec256-29a4-497c-a23c-eef4c9cd47c8-config-data" (OuterVolumeSpecName: "config-data") pod "b09ec256-29a4-497c-a23c-eef4c9cd47c8" (UID: "b09ec256-29a4-497c-a23c-eef4c9cd47c8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.076124 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b09ec256-29a4-497c-a23c-eef4c9cd47c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.076413 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b09ec256-29a4-497c-a23c-eef4c9cd47c8-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.083251 4750 generic.go:334] "Generic (PLEG): container finished" podID="2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15" containerID="84581a0cc525dfa4e10d17d674b20e53250a48975865466afd060d5ffe3f0edf" exitCode=0 Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.083300 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84b5b4bdc4-tgxnj" event={"ID":"2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15","Type":"ContainerDied","Data":"84581a0cc525dfa4e10d17d674b20e53250a48975865466afd060d5ffe3f0edf"} Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.083329 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84b5b4bdc4-tgxnj" event={"ID":"2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15","Type":"ContainerDied","Data":"0e0b72be0fa7bcd6c46bf3aca3ede9f21bd6a744612403cb0ad79ecf20b53e42"} Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.083345 4750 scope.go:117] "RemoveContainer" 
containerID="84581a0cc525dfa4e10d17d674b20e53250a48975865466afd060d5ffe3f0edf" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.083435 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-84b5b4bdc4-tgxnj" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.102397 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"173266a7-3201-41b0-bf79-72d53fd66c2a","Type":"ContainerStarted","Data":"f4e83d2e05f58b0eee54ea56e59dc9b3132704a6e200a10d9b647d4a0023fb44"} Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.116177 4750 generic.go:334] "Generic (PLEG): container finished" podID="b09ec256-29a4-497c-a23c-eef4c9cd47c8" containerID="449568f0c50c9704c3f47a5824cc562e8d8af6eba1cf615d282d1b577637bc7f" exitCode=0 Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.116238 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b09ec256-29a4-497c-a23c-eef4c9cd47c8","Type":"ContainerDied","Data":"449568f0c50c9704c3f47a5824cc562e8d8af6eba1cf615d282d1b577637bc7f"} Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.116316 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b09ec256-29a4-497c-a23c-eef4c9cd47c8","Type":"ContainerDied","Data":"a0add41a6aef45c34e9eaf41b006c76f23e1bee22bfbeb15b9ec05de1bc1146d"} Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.116384 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.120627 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-84b5b4bdc4-tgxnj"] Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.125314 4750 generic.go:334] "Generic (PLEG): container finished" podID="19fe7265-7f86-4acd-95a1-0729ed834f0d" containerID="71adc0ddaf1cd4398295556bffc05adb8fde142d5f65177770ffeb1ed8604897" exitCode=143 Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.126933 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"19fe7265-7f86-4acd-95a1-0729ed834f0d","Type":"ContainerDied","Data":"71adc0ddaf1cd4398295556bffc05adb8fde142d5f65177770ffeb1ed8604897"} Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.131755 4750 scope.go:117] "RemoveContainer" containerID="4332f3ddc51ef6473a87ed30ca245bc270e0920c831c2122d5168973241da467" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.136410 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-84b5b4bdc4-tgxnj"] Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.147108 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.233874362 podStartE2EDuration="4.147086996s" podCreationTimestamp="2025-10-08 18:30:25 +0000 UTC" firstStartedPulling="2025-10-08 18:30:26.157802736 +0000 UTC m=+1182.070773749" lastFinishedPulling="2025-10-08 18:30:27.07101537 +0000 UTC m=+1182.983986383" observedRunningTime="2025-10-08 18:30:29.120929195 +0000 UTC m=+1185.033900228" watchObservedRunningTime="2025-10-08 18:30:29.147086996 +0000 UTC m=+1185.060058009" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.190117 4750 scope.go:117] "RemoveContainer" containerID="84581a0cc525dfa4e10d17d674b20e53250a48975865466afd060d5ffe3f0edf" Oct 08 18:30:29 crc kubenswrapper[4750]: E1008 18:30:29.199329 4750 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84581a0cc525dfa4e10d17d674b20e53250a48975865466afd060d5ffe3f0edf\": container with ID starting with 84581a0cc525dfa4e10d17d674b20e53250a48975865466afd060d5ffe3f0edf not found: ID does not exist" containerID="84581a0cc525dfa4e10d17d674b20e53250a48975865466afd060d5ffe3f0edf" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.199372 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84581a0cc525dfa4e10d17d674b20e53250a48975865466afd060d5ffe3f0edf"} err="failed to get container status \"84581a0cc525dfa4e10d17d674b20e53250a48975865466afd060d5ffe3f0edf\": rpc error: code = NotFound desc = could not find container \"84581a0cc525dfa4e10d17d674b20e53250a48975865466afd060d5ffe3f0edf\": container with ID starting with 84581a0cc525dfa4e10d17d674b20e53250a48975865466afd060d5ffe3f0edf not found: ID does not exist" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.199396 4750 scope.go:117] "RemoveContainer" containerID="4332f3ddc51ef6473a87ed30ca245bc270e0920c831c2122d5168973241da467" Oct 08 18:30:29 crc kubenswrapper[4750]: E1008 18:30:29.200008 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4332f3ddc51ef6473a87ed30ca245bc270e0920c831c2122d5168973241da467\": container with ID starting with 4332f3ddc51ef6473a87ed30ca245bc270e0920c831c2122d5168973241da467 not found: ID does not exist" containerID="4332f3ddc51ef6473a87ed30ca245bc270e0920c831c2122d5168973241da467" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.200034 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4332f3ddc51ef6473a87ed30ca245bc270e0920c831c2122d5168973241da467"} err="failed to get container status \"4332f3ddc51ef6473a87ed30ca245bc270e0920c831c2122d5168973241da467\": rpc error: code = NotFound desc = could 
not find container \"4332f3ddc51ef6473a87ed30ca245bc270e0920c831c2122d5168973241da467\": container with ID starting with 4332f3ddc51ef6473a87ed30ca245bc270e0920c831c2122d5168973241da467 not found: ID does not exist" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.200051 4750 scope.go:117] "RemoveContainer" containerID="643e190066b829a3083477da09f84a967e35d9d74ad7ccefb591fccad3bd45f4" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.216080 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.236851 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.255967 4750 scope.go:117] "RemoveContainer" containerID="7823d1e431dc2a1096cdb43de1b9c21e22ffbea3005521e897408676d332fb99" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.256090 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 08 18:30:29 crc kubenswrapper[4750]: E1008 18:30:29.256437 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b09ec256-29a4-497c-a23c-eef4c9cd47c8" containerName="sg-core" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.256449 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="b09ec256-29a4-497c-a23c-eef4c9cd47c8" containerName="sg-core" Oct 08 18:30:29 crc kubenswrapper[4750]: E1008 18:30:29.256466 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b09ec256-29a4-497c-a23c-eef4c9cd47c8" containerName="ceilometer-central-agent" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.256474 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="b09ec256-29a4-497c-a23c-eef4c9cd47c8" containerName="ceilometer-central-agent" Oct 08 18:30:29 crc kubenswrapper[4750]: E1008 18:30:29.256488 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b09ec256-29a4-497c-a23c-eef4c9cd47c8" 
containerName="proxy-httpd" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.256495 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="b09ec256-29a4-497c-a23c-eef4c9cd47c8" containerName="proxy-httpd" Oct 08 18:30:29 crc kubenswrapper[4750]: E1008 18:30:29.256510 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15" containerName="barbican-api" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.256515 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15" containerName="barbican-api" Oct 08 18:30:29 crc kubenswrapper[4750]: E1008 18:30:29.256525 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15" containerName="barbican-api-log" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.256531 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15" containerName="barbican-api-log" Oct 08 18:30:29 crc kubenswrapper[4750]: E1008 18:30:29.256560 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b09ec256-29a4-497c-a23c-eef4c9cd47c8" containerName="ceilometer-notification-agent" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.256566 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="b09ec256-29a4-497c-a23c-eef4c9cd47c8" containerName="ceilometer-notification-agent" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.256739 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="b09ec256-29a4-497c-a23c-eef4c9cd47c8" containerName="proxy-httpd" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.256750 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="b09ec256-29a4-497c-a23c-eef4c9cd47c8" containerName="ceilometer-notification-agent" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.256765 4750 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15" containerName="barbican-api" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.256771 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="b09ec256-29a4-497c-a23c-eef4c9cd47c8" containerName="ceilometer-central-agent" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.256781 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="b09ec256-29a4-497c-a23c-eef4c9cd47c8" containerName="sg-core" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.256793 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15" containerName="barbican-api-log" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.259499 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.262409 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.262681 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.273970 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.283226 4750 scope.go:117] "RemoveContainer" containerID="449568f0c50c9704c3f47a5824cc562e8d8af6eba1cf615d282d1b577637bc7f" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.303224 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69942e34-56b9-44e8-a3ea-025aaef2bcb7-run-httpd\") pod \"ceilometer-0\" (UID: \"69942e34-56b9-44e8-a3ea-025aaef2bcb7\") " pod="openstack/ceilometer-0" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.303649 4750 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69942e34-56b9-44e8-a3ea-025aaef2bcb7-scripts\") pod \"ceilometer-0\" (UID: \"69942e34-56b9-44e8-a3ea-025aaef2bcb7\") " pod="openstack/ceilometer-0" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.303809 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69942e34-56b9-44e8-a3ea-025aaef2bcb7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"69942e34-56b9-44e8-a3ea-025aaef2bcb7\") " pod="openstack/ceilometer-0" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.303984 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69942e34-56b9-44e8-a3ea-025aaef2bcb7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"69942e34-56b9-44e8-a3ea-025aaef2bcb7\") " pod="openstack/ceilometer-0" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.304095 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69942e34-56b9-44e8-a3ea-025aaef2bcb7-config-data\") pod \"ceilometer-0\" (UID: \"69942e34-56b9-44e8-a3ea-025aaef2bcb7\") " pod="openstack/ceilometer-0" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.304257 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldjfq\" (UniqueName: \"kubernetes.io/projected/69942e34-56b9-44e8-a3ea-025aaef2bcb7-kube-api-access-ldjfq\") pod \"ceilometer-0\" (UID: \"69942e34-56b9-44e8-a3ea-025aaef2bcb7\") " pod="openstack/ceilometer-0" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.304598 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/69942e34-56b9-44e8-a3ea-025aaef2bcb7-log-httpd\") pod \"ceilometer-0\" (UID: \"69942e34-56b9-44e8-a3ea-025aaef2bcb7\") " pod="openstack/ceilometer-0" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.307153 4750 scope.go:117] "RemoveContainer" containerID="a6c5333f97073c7e0a7764e13969d39084a115cf6189c8ed7e25a6e119e5c1aa" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.333803 4750 scope.go:117] "RemoveContainer" containerID="643e190066b829a3083477da09f84a967e35d9d74ad7ccefb591fccad3bd45f4" Oct 08 18:30:29 crc kubenswrapper[4750]: E1008 18:30:29.334409 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"643e190066b829a3083477da09f84a967e35d9d74ad7ccefb591fccad3bd45f4\": container with ID starting with 643e190066b829a3083477da09f84a967e35d9d74ad7ccefb591fccad3bd45f4 not found: ID does not exist" containerID="643e190066b829a3083477da09f84a967e35d9d74ad7ccefb591fccad3bd45f4" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.334451 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"643e190066b829a3083477da09f84a967e35d9d74ad7ccefb591fccad3bd45f4"} err="failed to get container status \"643e190066b829a3083477da09f84a967e35d9d74ad7ccefb591fccad3bd45f4\": rpc error: code = NotFound desc = could not find container \"643e190066b829a3083477da09f84a967e35d9d74ad7ccefb591fccad3bd45f4\": container with ID starting with 643e190066b829a3083477da09f84a967e35d9d74ad7ccefb591fccad3bd45f4 not found: ID does not exist" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.334476 4750 scope.go:117] "RemoveContainer" containerID="7823d1e431dc2a1096cdb43de1b9c21e22ffbea3005521e897408676d332fb99" Oct 08 18:30:29 crc kubenswrapper[4750]: E1008 18:30:29.335036 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7823d1e431dc2a1096cdb43de1b9c21e22ffbea3005521e897408676d332fb99\": container with ID starting with 7823d1e431dc2a1096cdb43de1b9c21e22ffbea3005521e897408676d332fb99 not found: ID does not exist" containerID="7823d1e431dc2a1096cdb43de1b9c21e22ffbea3005521e897408676d332fb99" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.335079 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7823d1e431dc2a1096cdb43de1b9c21e22ffbea3005521e897408676d332fb99"} err="failed to get container status \"7823d1e431dc2a1096cdb43de1b9c21e22ffbea3005521e897408676d332fb99\": rpc error: code = NotFound desc = could not find container \"7823d1e431dc2a1096cdb43de1b9c21e22ffbea3005521e897408676d332fb99\": container with ID starting with 7823d1e431dc2a1096cdb43de1b9c21e22ffbea3005521e897408676d332fb99 not found: ID does not exist" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.335109 4750 scope.go:117] "RemoveContainer" containerID="449568f0c50c9704c3f47a5824cc562e8d8af6eba1cf615d282d1b577637bc7f" Oct 08 18:30:29 crc kubenswrapper[4750]: E1008 18:30:29.336325 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"449568f0c50c9704c3f47a5824cc562e8d8af6eba1cf615d282d1b577637bc7f\": container with ID starting with 449568f0c50c9704c3f47a5824cc562e8d8af6eba1cf615d282d1b577637bc7f not found: ID does not exist" containerID="449568f0c50c9704c3f47a5824cc562e8d8af6eba1cf615d282d1b577637bc7f" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.336351 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"449568f0c50c9704c3f47a5824cc562e8d8af6eba1cf615d282d1b577637bc7f"} err="failed to get container status \"449568f0c50c9704c3f47a5824cc562e8d8af6eba1cf615d282d1b577637bc7f\": rpc error: code = NotFound desc = could not find container \"449568f0c50c9704c3f47a5824cc562e8d8af6eba1cf615d282d1b577637bc7f\": container with ID 
starting with 449568f0c50c9704c3f47a5824cc562e8d8af6eba1cf615d282d1b577637bc7f not found: ID does not exist" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.336367 4750 scope.go:117] "RemoveContainer" containerID="a6c5333f97073c7e0a7764e13969d39084a115cf6189c8ed7e25a6e119e5c1aa" Oct 08 18:30:29 crc kubenswrapper[4750]: E1008 18:30:29.336893 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6c5333f97073c7e0a7764e13969d39084a115cf6189c8ed7e25a6e119e5c1aa\": container with ID starting with a6c5333f97073c7e0a7764e13969d39084a115cf6189c8ed7e25a6e119e5c1aa not found: ID does not exist" containerID="a6c5333f97073c7e0a7764e13969d39084a115cf6189c8ed7e25a6e119e5c1aa" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.337026 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6c5333f97073c7e0a7764e13969d39084a115cf6189c8ed7e25a6e119e5c1aa"} err="failed to get container status \"a6c5333f97073c7e0a7764e13969d39084a115cf6189c8ed7e25a6e119e5c1aa\": rpc error: code = NotFound desc = could not find container \"a6c5333f97073c7e0a7764e13969d39084a115cf6189c8ed7e25a6e119e5c1aa\": container with ID starting with a6c5333f97073c7e0a7764e13969d39084a115cf6189c8ed7e25a6e119e5c1aa not found: ID does not exist" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.405466 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69942e34-56b9-44e8-a3ea-025aaef2bcb7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"69942e34-56b9-44e8-a3ea-025aaef2bcb7\") " pod="openstack/ceilometer-0" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.405509 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69942e34-56b9-44e8-a3ea-025aaef2bcb7-config-data\") pod \"ceilometer-0\" (UID: 
\"69942e34-56b9-44e8-a3ea-025aaef2bcb7\") " pod="openstack/ceilometer-0" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.405527 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldjfq\" (UniqueName: \"kubernetes.io/projected/69942e34-56b9-44e8-a3ea-025aaef2bcb7-kube-api-access-ldjfq\") pod \"ceilometer-0\" (UID: \"69942e34-56b9-44e8-a3ea-025aaef2bcb7\") " pod="openstack/ceilometer-0" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.405605 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69942e34-56b9-44e8-a3ea-025aaef2bcb7-log-httpd\") pod \"ceilometer-0\" (UID: \"69942e34-56b9-44e8-a3ea-025aaef2bcb7\") " pod="openstack/ceilometer-0" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.405639 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69942e34-56b9-44e8-a3ea-025aaef2bcb7-run-httpd\") pod \"ceilometer-0\" (UID: \"69942e34-56b9-44e8-a3ea-025aaef2bcb7\") " pod="openstack/ceilometer-0" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.405659 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69942e34-56b9-44e8-a3ea-025aaef2bcb7-scripts\") pod \"ceilometer-0\" (UID: \"69942e34-56b9-44e8-a3ea-025aaef2bcb7\") " pod="openstack/ceilometer-0" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.405695 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69942e34-56b9-44e8-a3ea-025aaef2bcb7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"69942e34-56b9-44e8-a3ea-025aaef2bcb7\") " pod="openstack/ceilometer-0" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.406493 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/69942e34-56b9-44e8-a3ea-025aaef2bcb7-log-httpd\") pod \"ceilometer-0\" (UID: \"69942e34-56b9-44e8-a3ea-025aaef2bcb7\") " pod="openstack/ceilometer-0" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.407357 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69942e34-56b9-44e8-a3ea-025aaef2bcb7-run-httpd\") pod \"ceilometer-0\" (UID: \"69942e34-56b9-44e8-a3ea-025aaef2bcb7\") " pod="openstack/ceilometer-0" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.410186 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69942e34-56b9-44e8-a3ea-025aaef2bcb7-config-data\") pod \"ceilometer-0\" (UID: \"69942e34-56b9-44e8-a3ea-025aaef2bcb7\") " pod="openstack/ceilometer-0" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.410742 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69942e34-56b9-44e8-a3ea-025aaef2bcb7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"69942e34-56b9-44e8-a3ea-025aaef2bcb7\") " pod="openstack/ceilometer-0" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.412861 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69942e34-56b9-44e8-a3ea-025aaef2bcb7-scripts\") pod \"ceilometer-0\" (UID: \"69942e34-56b9-44e8-a3ea-025aaef2bcb7\") " pod="openstack/ceilometer-0" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.418979 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69942e34-56b9-44e8-a3ea-025aaef2bcb7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"69942e34-56b9-44e8-a3ea-025aaef2bcb7\") " pod="openstack/ceilometer-0" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.427269 4750 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldjfq\" (UniqueName: \"kubernetes.io/projected/69942e34-56b9-44e8-a3ea-025aaef2bcb7-kube-api-access-ldjfq\") pod \"ceilometer-0\" (UID: \"69942e34-56b9-44e8-a3ea-025aaef2bcb7\") " pod="openstack/ceilometer-0" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.465629 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-74b95c857c-677wg"] Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.584341 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.839388 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.842200 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="014b5ab1-7e11-42b6-b4ef-c3dba510298f" containerName="glance-log" containerID="cri-o://bfd9719615835450521b7511f861f62cb2bb85abf40f24ff9d817e7a2d193db4" gracePeriod=30 Oct 08 18:30:29 crc kubenswrapper[4750]: I1008 18:30:29.842648 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="014b5ab1-7e11-42b6-b4ef-c3dba510298f" containerName="glance-httpd" containerID="cri-o://2bebcd680b17c03745fe629ed9bbe0c732bfc09be9c86b9e75d9c81f409a018a" gracePeriod=30 Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.124797 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.139832 4750 generic.go:334] "Generic (PLEG): container finished" podID="4562f82c-cf9e-4b63-bc8a-079a35401232" containerID="38270156d6423a8c5864c7bc1267940a2d7348e6f8289504cdf3ae28b7a1663d" exitCode=0 Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.139885 4750 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4562f82c-cf9e-4b63-bc8a-079a35401232","Type":"ContainerDied","Data":"38270156d6423a8c5864c7bc1267940a2d7348e6f8289504cdf3ae28b7a1663d"} Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.141092 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-74b95c857c-677wg" event={"ID":"c218b865-c7d1-4f46-ad6d-8e102b6af491","Type":"ContainerStarted","Data":"726de4b1d2b400cc335f7c66a47cd7bc83547d6f298c85f9ac80978cd6e0ca61"} Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.141124 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-74b95c857c-677wg" event={"ID":"c218b865-c7d1-4f46-ad6d-8e102b6af491","Type":"ContainerStarted","Data":"0d6c9db01902a53d96492abe2fa964b6a118c8bcd236fbfdaca40d28257dc94f"} Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.141133 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-74b95c857c-677wg" event={"ID":"c218b865-c7d1-4f46-ad6d-8e102b6af491","Type":"ContainerStarted","Data":"8a607a1ae0f7352862e8ea719dfdd3057a21089276a0355a348ce53dcaba80ef"} Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.141377 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-74b95c857c-677wg" Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.141431 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-74b95c857c-677wg" Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.144517 4750 generic.go:334] "Generic (PLEG): container finished" podID="014b5ab1-7e11-42b6-b4ef-c3dba510298f" containerID="bfd9719615835450521b7511f861f62cb2bb85abf40f24ff9d817e7a2d193db4" exitCode=143 Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.150687 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"014b5ab1-7e11-42b6-b4ef-c3dba510298f","Type":"ContainerDied","Data":"bfd9719615835450521b7511f861f62cb2bb85abf40f24ff9d817e7a2d193db4"} Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.157338 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-74b95c857c-677wg" podStartSLOduration=2.157323658 podStartE2EDuration="2.157323658s" podCreationTimestamp="2025-10-08 18:30:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:30:30.156802145 +0000 UTC m=+1186.069773158" watchObservedRunningTime="2025-10-08 18:30:30.157323658 +0000 UTC m=+1186.070294671" Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.360032 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.447731 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.513252 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.629979 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4562f82c-cf9e-4b63-bc8a-079a35401232-config-data\") pod \"4562f82c-cf9e-4b63-bc8a-079a35401232\" (UID: \"4562f82c-cf9e-4b63-bc8a-079a35401232\") " Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.630049 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4562f82c-cf9e-4b63-bc8a-079a35401232-combined-ca-bundle\") pod \"4562f82c-cf9e-4b63-bc8a-079a35401232\" (UID: \"4562f82c-cf9e-4b63-bc8a-079a35401232\") " Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.630125 
4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"4562f82c-cf9e-4b63-bc8a-079a35401232\" (UID: \"4562f82c-cf9e-4b63-bc8a-079a35401232\") " Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.630148 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4562f82c-cf9e-4b63-bc8a-079a35401232-logs\") pod \"4562f82c-cf9e-4b63-bc8a-079a35401232\" (UID: \"4562f82c-cf9e-4b63-bc8a-079a35401232\") " Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.630178 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4562f82c-cf9e-4b63-bc8a-079a35401232-scripts\") pod \"4562f82c-cf9e-4b63-bc8a-079a35401232\" (UID: \"4562f82c-cf9e-4b63-bc8a-079a35401232\") " Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.630203 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcbv6\" (UniqueName: \"kubernetes.io/projected/4562f82c-cf9e-4b63-bc8a-079a35401232-kube-api-access-mcbv6\") pod \"4562f82c-cf9e-4b63-bc8a-079a35401232\" (UID: \"4562f82c-cf9e-4b63-bc8a-079a35401232\") " Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.630298 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4562f82c-cf9e-4b63-bc8a-079a35401232-httpd-run\") pod \"4562f82c-cf9e-4b63-bc8a-079a35401232\" (UID: \"4562f82c-cf9e-4b63-bc8a-079a35401232\") " Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.630351 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4562f82c-cf9e-4b63-bc8a-079a35401232-public-tls-certs\") pod \"4562f82c-cf9e-4b63-bc8a-079a35401232\" (UID: \"4562f82c-cf9e-4b63-bc8a-079a35401232\") " Oct 08 
18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.630642 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4562f82c-cf9e-4b63-bc8a-079a35401232-logs" (OuterVolumeSpecName: "logs") pod "4562f82c-cf9e-4b63-bc8a-079a35401232" (UID: "4562f82c-cf9e-4b63-bc8a-079a35401232"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.631042 4750 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4562f82c-cf9e-4b63-bc8a-079a35401232-logs\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.631064 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4562f82c-cf9e-4b63-bc8a-079a35401232-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4562f82c-cf9e-4b63-bc8a-079a35401232" (UID: "4562f82c-cf9e-4b63-bc8a-079a35401232"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.638046 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "4562f82c-cf9e-4b63-bc8a-079a35401232" (UID: "4562f82c-cf9e-4b63-bc8a-079a35401232"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.638246 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4562f82c-cf9e-4b63-bc8a-079a35401232-scripts" (OuterVolumeSpecName: "scripts") pod "4562f82c-cf9e-4b63-bc8a-079a35401232" (UID: "4562f82c-cf9e-4b63-bc8a-079a35401232"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.642273 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4562f82c-cf9e-4b63-bc8a-079a35401232-kube-api-access-mcbv6" (OuterVolumeSpecName: "kube-api-access-mcbv6") pod "4562f82c-cf9e-4b63-bc8a-079a35401232" (UID: "4562f82c-cf9e-4b63-bc8a-079a35401232"). InnerVolumeSpecName "kube-api-access-mcbv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.681703 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4562f82c-cf9e-4b63-bc8a-079a35401232-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4562f82c-cf9e-4b63-bc8a-079a35401232" (UID: "4562f82c-cf9e-4b63-bc8a-079a35401232"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.713707 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4562f82c-cf9e-4b63-bc8a-079a35401232-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4562f82c-cf9e-4b63-bc8a-079a35401232" (UID: "4562f82c-cf9e-4b63-bc8a-079a35401232"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.732659 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4562f82c-cf9e-4b63-bc8a-079a35401232-config-data" (OuterVolumeSpecName: "config-data") pod "4562f82c-cf9e-4b63-bc8a-079a35401232" (UID: "4562f82c-cf9e-4b63-bc8a-079a35401232"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.734186 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4562f82c-cf9e-4b63-bc8a-079a35401232-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.734210 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4562f82c-cf9e-4b63-bc8a-079a35401232-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.734242 4750 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.734253 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4562f82c-cf9e-4b63-bc8a-079a35401232-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.734262 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcbv6\" (UniqueName: \"kubernetes.io/projected/4562f82c-cf9e-4b63-bc8a-079a35401232-kube-api-access-mcbv6\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.734272 4750 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4562f82c-cf9e-4b63-bc8a-079a35401232-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.734282 4750 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4562f82c-cf9e-4b63-bc8a-079a35401232-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.748646 4750 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15" path="/var/lib/kubelet/pods/2bcdb7bc-2242-4e1d-bbef-1c2d99edbd15/volumes" Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.749294 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b09ec256-29a4-497c-a23c-eef4c9cd47c8" path="/var/lib/kubelet/pods/b09ec256-29a4-497c-a23c-eef4c9cd47c8/volumes" Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.768360 4750 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.836292 4750 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.847234 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-ghwp7"] Oct 08 18:30:30 crc kubenswrapper[4750]: E1008 18:30:30.847627 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4562f82c-cf9e-4b63-bc8a-079a35401232" containerName="glance-log" Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.847647 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="4562f82c-cf9e-4b63-bc8a-079a35401232" containerName="glance-log" Oct 08 18:30:30 crc kubenswrapper[4750]: E1008 18:30:30.847684 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4562f82c-cf9e-4b63-bc8a-079a35401232" containerName="glance-httpd" Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.847691 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="4562f82c-cf9e-4b63-bc8a-079a35401232" containerName="glance-httpd" Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.847920 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="4562f82c-cf9e-4b63-bc8a-079a35401232" 
containerName="glance-httpd" Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.847941 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="4562f82c-cf9e-4b63-bc8a-079a35401232" containerName="glance-log" Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.848503 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-ghwp7" Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.886734 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-ghwp7"] Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.955083 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-tqdjh"] Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.957450 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-tqdjh" Oct 08 18:30:30 crc kubenswrapper[4750]: I1008 18:30:30.966589 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-tqdjh"] Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.043071 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92mfm\" (UniqueName: \"kubernetes.io/projected/154374c5-fc90-40da-9ac7-a98f99aca0a1-kube-api-access-92mfm\") pod \"nova-api-db-create-ghwp7\" (UID: \"154374c5-fc90-40da-9ac7-a98f99aca0a1\") " pod="openstack/nova-api-db-create-ghwp7" Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.063192 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-ljvpd"] Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.065197 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-ljvpd" Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.106130 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-ljvpd"] Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.145101 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdnmx\" (UniqueName: \"kubernetes.io/projected/41591e4c-def9-4152-8bec-7c47ed4367e8-kube-api-access-hdnmx\") pod \"nova-cell0-db-create-tqdjh\" (UID: \"41591e4c-def9-4152-8bec-7c47ed4367e8\") " pod="openstack/nova-cell0-db-create-tqdjh" Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.145176 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92mfm\" (UniqueName: \"kubernetes.io/projected/154374c5-fc90-40da-9ac7-a98f99aca0a1-kube-api-access-92mfm\") pod \"nova-api-db-create-ghwp7\" (UID: \"154374c5-fc90-40da-9ac7-a98f99aca0a1\") " pod="openstack/nova-api-db-create-ghwp7" Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.157692 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69942e34-56b9-44e8-a3ea-025aaef2bcb7","Type":"ContainerStarted","Data":"4e3a3c3bd972dc06625721a5713b1696557110f3b25a44dd0c3949447f86e466"} Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.165138 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92mfm\" (UniqueName: \"kubernetes.io/projected/154374c5-fc90-40da-9ac7-a98f99aca0a1-kube-api-access-92mfm\") pod \"nova-api-db-create-ghwp7\" (UID: \"154374c5-fc90-40da-9ac7-a98f99aca0a1\") " pod="openstack/nova-api-db-create-ghwp7" Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.166478 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"4562f82c-cf9e-4b63-bc8a-079a35401232","Type":"ContainerDied","Data":"ffa5cfecc1d3ff393e180d97588f990caaf57a734c75baf7e08d86acd23002d7"} Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.166516 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.166532 4750 scope.go:117] "RemoveContainer" containerID="38270156d6423a8c5864c7bc1267940a2d7348e6f8289504cdf3ae28b7a1663d" Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.172245 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-ghwp7" Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.245321 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.246402 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdnmx\" (UniqueName: \"kubernetes.io/projected/41591e4c-def9-4152-8bec-7c47ed4367e8-kube-api-access-hdnmx\") pod \"nova-cell0-db-create-tqdjh\" (UID: \"41591e4c-def9-4152-8bec-7c47ed4367e8\") " pod="openstack/nova-cell0-db-create-tqdjh" Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.246455 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fcdk\" (UniqueName: \"kubernetes.io/projected/3ab07eac-5578-43a8-979b-d3dba99ce3ba-kube-api-access-9fcdk\") pod \"nova-cell1-db-create-ljvpd\" (UID: \"3ab07eac-5578-43a8-979b-d3dba99ce3ba\") " pod="openstack/nova-cell1-db-create-ljvpd" Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.262780 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.266116 4750 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/glance-default-external-api-0"] Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.267680 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.270217 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdnmx\" (UniqueName: \"kubernetes.io/projected/41591e4c-def9-4152-8bec-7c47ed4367e8-kube-api-access-hdnmx\") pod \"nova-cell0-db-create-tqdjh\" (UID: \"41591e4c-def9-4152-8bec-7c47ed4367e8\") " pod="openstack/nova-cell0-db-create-tqdjh" Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.271993 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.272321 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.288149 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-tqdjh" Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.288315 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.348507 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fcdk\" (UniqueName: \"kubernetes.io/projected/3ab07eac-5578-43a8-979b-d3dba99ce3ba-kube-api-access-9fcdk\") pod \"nova-cell1-db-create-ljvpd\" (UID: \"3ab07eac-5578-43a8-979b-d3dba99ce3ba\") " pod="openstack/nova-cell1-db-create-ljvpd" Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.363938 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fcdk\" (UniqueName: \"kubernetes.io/projected/3ab07eac-5578-43a8-979b-d3dba99ce3ba-kube-api-access-9fcdk\") pod \"nova-cell1-db-create-ljvpd\" (UID: \"3ab07eac-5578-43a8-979b-d3dba99ce3ba\") " pod="openstack/nova-cell1-db-create-ljvpd" Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.395386 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-ljvpd" Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.450411 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a25ebe44-c330-48f8-9df7-5f8517cd96bd-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a25ebe44-c330-48f8-9df7-5f8517cd96bd\") " pod="openstack/glance-default-external-api-0" Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.450512 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"a25ebe44-c330-48f8-9df7-5f8517cd96bd\") " pod="openstack/glance-default-external-api-0" Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.450543 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a25ebe44-c330-48f8-9df7-5f8517cd96bd-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a25ebe44-c330-48f8-9df7-5f8517cd96bd\") " pod="openstack/glance-default-external-api-0" Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.450581 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbrnf\" (UniqueName: \"kubernetes.io/projected/a25ebe44-c330-48f8-9df7-5f8517cd96bd-kube-api-access-vbrnf\") pod \"glance-default-external-api-0\" (UID: \"a25ebe44-c330-48f8-9df7-5f8517cd96bd\") " pod="openstack/glance-default-external-api-0" Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.450603 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a25ebe44-c330-48f8-9df7-5f8517cd96bd-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"a25ebe44-c330-48f8-9df7-5f8517cd96bd\") " pod="openstack/glance-default-external-api-0" Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.450624 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a25ebe44-c330-48f8-9df7-5f8517cd96bd-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a25ebe44-c330-48f8-9df7-5f8517cd96bd\") " pod="openstack/glance-default-external-api-0" Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.450650 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a25ebe44-c330-48f8-9df7-5f8517cd96bd-logs\") pod \"glance-default-external-api-0\" (UID: \"a25ebe44-c330-48f8-9df7-5f8517cd96bd\") " pod="openstack/glance-default-external-api-0" Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.450696 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a25ebe44-c330-48f8-9df7-5f8517cd96bd-config-data\") pod \"glance-default-external-api-0\" (UID: \"a25ebe44-c330-48f8-9df7-5f8517cd96bd\") " pod="openstack/glance-default-external-api-0" Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.552004 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"a25ebe44-c330-48f8-9df7-5f8517cd96bd\") " pod="openstack/glance-default-external-api-0" Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.552068 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a25ebe44-c330-48f8-9df7-5f8517cd96bd-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: 
\"a25ebe44-c330-48f8-9df7-5f8517cd96bd\") " pod="openstack/glance-default-external-api-0" Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.552086 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbrnf\" (UniqueName: \"kubernetes.io/projected/a25ebe44-c330-48f8-9df7-5f8517cd96bd-kube-api-access-vbrnf\") pod \"glance-default-external-api-0\" (UID: \"a25ebe44-c330-48f8-9df7-5f8517cd96bd\") " pod="openstack/glance-default-external-api-0" Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.552103 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a25ebe44-c330-48f8-9df7-5f8517cd96bd-scripts\") pod \"glance-default-external-api-0\" (UID: \"a25ebe44-c330-48f8-9df7-5f8517cd96bd\") " pod="openstack/glance-default-external-api-0" Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.552127 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a25ebe44-c330-48f8-9df7-5f8517cd96bd-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a25ebe44-c330-48f8-9df7-5f8517cd96bd\") " pod="openstack/glance-default-external-api-0" Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.552150 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a25ebe44-c330-48f8-9df7-5f8517cd96bd-logs\") pod \"glance-default-external-api-0\" (UID: \"a25ebe44-c330-48f8-9df7-5f8517cd96bd\") " pod="openstack/glance-default-external-api-0" Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.552200 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a25ebe44-c330-48f8-9df7-5f8517cd96bd-config-data\") pod \"glance-default-external-api-0\" (UID: \"a25ebe44-c330-48f8-9df7-5f8517cd96bd\") " 
pod="openstack/glance-default-external-api-0" Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.552234 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a25ebe44-c330-48f8-9df7-5f8517cd96bd-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a25ebe44-c330-48f8-9df7-5f8517cd96bd\") " pod="openstack/glance-default-external-api-0" Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.552326 4750 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"a25ebe44-c330-48f8-9df7-5f8517cd96bd\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.552715 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a25ebe44-c330-48f8-9df7-5f8517cd96bd-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a25ebe44-c330-48f8-9df7-5f8517cd96bd\") " pod="openstack/glance-default-external-api-0" Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.552965 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a25ebe44-c330-48f8-9df7-5f8517cd96bd-logs\") pod \"glance-default-external-api-0\" (UID: \"a25ebe44-c330-48f8-9df7-5f8517cd96bd\") " pod="openstack/glance-default-external-api-0" Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.557394 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a25ebe44-c330-48f8-9df7-5f8517cd96bd-scripts\") pod \"glance-default-external-api-0\" (UID: \"a25ebe44-c330-48f8-9df7-5f8517cd96bd\") " pod="openstack/glance-default-external-api-0" Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.566474 
4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a25ebe44-c330-48f8-9df7-5f8517cd96bd-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a25ebe44-c330-48f8-9df7-5f8517cd96bd\") " pod="openstack/glance-default-external-api-0" Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.566673 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a25ebe44-c330-48f8-9df7-5f8517cd96bd-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a25ebe44-c330-48f8-9df7-5f8517cd96bd\") " pod="openstack/glance-default-external-api-0" Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.567254 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a25ebe44-c330-48f8-9df7-5f8517cd96bd-config-data\") pod \"glance-default-external-api-0\" (UID: \"a25ebe44-c330-48f8-9df7-5f8517cd96bd\") " pod="openstack/glance-default-external-api-0" Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.570896 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbrnf\" (UniqueName: \"kubernetes.io/projected/a25ebe44-c330-48f8-9df7-5f8517cd96bd-kube-api-access-vbrnf\") pod \"glance-default-external-api-0\" (UID: \"a25ebe44-c330-48f8-9df7-5f8517cd96bd\") " pod="openstack/glance-default-external-api-0" Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.601271 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"a25ebe44-c330-48f8-9df7-5f8517cd96bd\") " pod="openstack/glance-default-external-api-0" Oct 08 18:30:31 crc kubenswrapper[4750]: I1008 18:30:31.891432 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 18:30:32 crc kubenswrapper[4750]: I1008 18:30:32.751142 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4562f82c-cf9e-4b63-bc8a-079a35401232" path="/var/lib/kubelet/pods/4562f82c-cf9e-4b63-bc8a-079a35401232/volumes" Oct 08 18:30:32 crc kubenswrapper[4750]: I1008 18:30:32.996263 4750 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="014b5ab1-7e11-42b6-b4ef-c3dba510298f" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.153:9292/healthcheck\": read tcp 10.217.0.2:40618->10.217.0.153:9292: read: connection reset by peer" Oct 08 18:30:32 crc kubenswrapper[4750]: I1008 18:30:32.996350 4750 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="014b5ab1-7e11-42b6-b4ef-c3dba510298f" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.153:9292/healthcheck\": read tcp 10.217.0.2:40624->10.217.0.153:9292: read: connection reset by peer" Oct 08 18:30:33 crc kubenswrapper[4750]: I1008 18:30:33.187707 4750 generic.go:334] "Generic (PLEG): container finished" podID="014b5ab1-7e11-42b6-b4ef-c3dba510298f" containerID="2bebcd680b17c03745fe629ed9bbe0c732bfc09be9c86b9e75d9c81f409a018a" exitCode=0 Oct 08 18:30:33 crc kubenswrapper[4750]: I1008 18:30:33.187750 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"014b5ab1-7e11-42b6-b4ef-c3dba510298f","Type":"ContainerDied","Data":"2bebcd680b17c03745fe629ed9bbe0c732bfc09be9c86b9e75d9c81f409a018a"} Oct 08 18:30:35 crc kubenswrapper[4750]: I1008 18:30:35.580727 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-84bd785c49-qsxts" Oct 08 18:30:35 crc kubenswrapper[4750]: I1008 18:30:35.637232 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-5c78787df7-xttcf"] Oct 08 18:30:35 crc kubenswrapper[4750]: I1008 18:30:35.637526 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c78787df7-xttcf" podUID="5a44a1f2-be71-4ae0-b115-9ca979c931be" containerName="dnsmasq-dns" containerID="cri-o://636c0fd3299ccd708561f50ef31954da172919d0ab9e81edc53a06d0698f61da" gracePeriod=10 Oct 08 18:30:35 crc kubenswrapper[4750]: I1008 18:30:35.894201 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 08 18:30:35 crc kubenswrapper[4750]: I1008 18:30:35.943005 4750 scope.go:117] "RemoveContainer" containerID="07f77d60fd9c23dd677f1ff78fefb6c3cf307b23e20e0edb4ff4bf60ed6b2d0c" Oct 08 18:30:35 crc kubenswrapper[4750]: I1008 18:30:35.955062 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.223164 4750 generic.go:334] "Generic (PLEG): container finished" podID="5a44a1f2-be71-4ae0-b115-9ca979c931be" containerID="636c0fd3299ccd708561f50ef31954da172919d0ab9e81edc53a06d0698f61da" exitCode=0 Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.223241 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c78787df7-xttcf" event={"ID":"5a44a1f2-be71-4ae0-b115-9ca979c931be","Type":"ContainerDied","Data":"636c0fd3299ccd708561f50ef31954da172919d0ab9e81edc53a06d0698f61da"} Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.226230 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="173266a7-3201-41b0-bf79-72d53fd66c2a" containerName="cinder-scheduler" containerID="cri-o://5e638894504edd794d7ce2c06f083eaa1e1b56d38dfb14d325c0dcf4b159293a" gracePeriod=30 Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.226782 4750 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/cinder-scheduler-0" podUID="173266a7-3201-41b0-bf79-72d53fd66c2a" containerName="probe" containerID="cri-o://f4e83d2e05f58b0eee54ea56e59dc9b3132704a6e200a10d9b647d4a0023fb44" gracePeriod=30 Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.353079 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c78787df7-xttcf" Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.460849 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a44a1f2-be71-4ae0-b115-9ca979c931be-config\") pod \"5a44a1f2-be71-4ae0-b115-9ca979c931be\" (UID: \"5a44a1f2-be71-4ae0-b115-9ca979c931be\") " Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.461232 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a44a1f2-be71-4ae0-b115-9ca979c931be-dns-svc\") pod \"5a44a1f2-be71-4ae0-b115-9ca979c931be\" (UID: \"5a44a1f2-be71-4ae0-b115-9ca979c931be\") " Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.461406 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5a44a1f2-be71-4ae0-b115-9ca979c931be-ovsdbserver-sb\") pod \"5a44a1f2-be71-4ae0-b115-9ca979c931be\" (UID: \"5a44a1f2-be71-4ae0-b115-9ca979c931be\") " Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.461567 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5a44a1f2-be71-4ae0-b115-9ca979c931be-ovsdbserver-nb\") pod \"5a44a1f2-be71-4ae0-b115-9ca979c931be\" (UID: \"5a44a1f2-be71-4ae0-b115-9ca979c931be\") " Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.461844 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9qgn\" (UniqueName: 
\"kubernetes.io/projected/5a44a1f2-be71-4ae0-b115-9ca979c931be-kube-api-access-g9qgn\") pod \"5a44a1f2-be71-4ae0-b115-9ca979c931be\" (UID: \"5a44a1f2-be71-4ae0-b115-9ca979c931be\") " Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.462049 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5a44a1f2-be71-4ae0-b115-9ca979c931be-dns-swift-storage-0\") pod \"5a44a1f2-be71-4ae0-b115-9ca979c931be\" (UID: \"5a44a1f2-be71-4ae0-b115-9ca979c931be\") " Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.484620 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.487058 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a44a1f2-be71-4ae0-b115-9ca979c931be-kube-api-access-g9qgn" (OuterVolumeSpecName: "kube-api-access-g9qgn") pod "5a44a1f2-be71-4ae0-b115-9ca979c931be" (UID: "5a44a1f2-be71-4ae0-b115-9ca979c931be"). InnerVolumeSpecName "kube-api-access-g9qgn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.565947 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9qgn\" (UniqueName: \"kubernetes.io/projected/5a44a1f2-be71-4ae0-b115-9ca979c931be-kube-api-access-g9qgn\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.577401 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a44a1f2-be71-4ae0-b115-9ca979c931be-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5a44a1f2-be71-4ae0-b115-9ca979c931be" (UID: "5a44a1f2-be71-4ae0-b115-9ca979c931be"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.577946 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a44a1f2-be71-4ae0-b115-9ca979c931be-config" (OuterVolumeSpecName: "config") pod "5a44a1f2-be71-4ae0-b115-9ca979c931be" (UID: "5a44a1f2-be71-4ae0-b115-9ca979c931be"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.613534 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a44a1f2-be71-4ae0-b115-9ca979c931be-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5a44a1f2-be71-4ae0-b115-9ca979c931be" (UID: "5a44a1f2-be71-4ae0-b115-9ca979c931be"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.615811 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a44a1f2-be71-4ae0-b115-9ca979c931be-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5a44a1f2-be71-4ae0-b115-9ca979c931be" (UID: "5a44a1f2-be71-4ae0-b115-9ca979c931be"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.619672 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-ghwp7"] Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.639068 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a44a1f2-be71-4ae0-b115-9ca979c931be-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5a44a1f2-be71-4ae0-b115-9ca979c931be" (UID: "5a44a1f2-be71-4ae0-b115-9ca979c931be"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.670168 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksg6k\" (UniqueName: \"kubernetes.io/projected/014b5ab1-7e11-42b6-b4ef-c3dba510298f-kube-api-access-ksg6k\") pod \"014b5ab1-7e11-42b6-b4ef-c3dba510298f\" (UID: \"014b5ab1-7e11-42b6-b4ef-c3dba510298f\") " Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.670240 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/014b5ab1-7e11-42b6-b4ef-c3dba510298f-internal-tls-certs\") pod \"014b5ab1-7e11-42b6-b4ef-c3dba510298f\" (UID: \"014b5ab1-7e11-42b6-b4ef-c3dba510298f\") " Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.670318 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"014b5ab1-7e11-42b6-b4ef-c3dba510298f\" (UID: \"014b5ab1-7e11-42b6-b4ef-c3dba510298f\") " Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.670391 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/014b5ab1-7e11-42b6-b4ef-c3dba510298f-config-data\") pod \"014b5ab1-7e11-42b6-b4ef-c3dba510298f\" (UID: \"014b5ab1-7e11-42b6-b4ef-c3dba510298f\") " Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.670442 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/014b5ab1-7e11-42b6-b4ef-c3dba510298f-combined-ca-bundle\") pod \"014b5ab1-7e11-42b6-b4ef-c3dba510298f\" (UID: \"014b5ab1-7e11-42b6-b4ef-c3dba510298f\") " Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.670470 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/014b5ab1-7e11-42b6-b4ef-c3dba510298f-scripts\") pod \"014b5ab1-7e11-42b6-b4ef-c3dba510298f\" (UID: \"014b5ab1-7e11-42b6-b4ef-c3dba510298f\") " Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.670508 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/014b5ab1-7e11-42b6-b4ef-c3dba510298f-httpd-run\") pod \"014b5ab1-7e11-42b6-b4ef-c3dba510298f\" (UID: \"014b5ab1-7e11-42b6-b4ef-c3dba510298f\") " Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.670586 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/014b5ab1-7e11-42b6-b4ef-c3dba510298f-logs\") pod \"014b5ab1-7e11-42b6-b4ef-c3dba510298f\" (UID: \"014b5ab1-7e11-42b6-b4ef-c3dba510298f\") " Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.670943 4750 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5a44a1f2-be71-4ae0-b115-9ca979c931be-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.670962 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a44a1f2-be71-4ae0-b115-9ca979c931be-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.670971 4750 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a44a1f2-be71-4ae0-b115-9ca979c931be-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.670980 4750 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5a44a1f2-be71-4ae0-b115-9ca979c931be-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.670987 4750 reconciler_common.go:293] "Volume 
detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5a44a1f2-be71-4ae0-b115-9ca979c931be-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.671315 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/014b5ab1-7e11-42b6-b4ef-c3dba510298f-logs" (OuterVolumeSpecName: "logs") pod "014b5ab1-7e11-42b6-b4ef-c3dba510298f" (UID: "014b5ab1-7e11-42b6-b4ef-c3dba510298f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.671600 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/014b5ab1-7e11-42b6-b4ef-c3dba510298f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "014b5ab1-7e11-42b6-b4ef-c3dba510298f" (UID: "014b5ab1-7e11-42b6-b4ef-c3dba510298f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.684476 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/014b5ab1-7e11-42b6-b4ef-c3dba510298f-scripts" (OuterVolumeSpecName: "scripts") pod "014b5ab1-7e11-42b6-b4ef-c3dba510298f" (UID: "014b5ab1-7e11-42b6-b4ef-c3dba510298f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.684590 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "014b5ab1-7e11-42b6-b4ef-c3dba510298f" (UID: "014b5ab1-7e11-42b6-b4ef-c3dba510298f"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.684957 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/014b5ab1-7e11-42b6-b4ef-c3dba510298f-kube-api-access-ksg6k" (OuterVolumeSpecName: "kube-api-access-ksg6k") pod "014b5ab1-7e11-42b6-b4ef-c3dba510298f" (UID: "014b5ab1-7e11-42b6-b4ef-c3dba510298f"). InnerVolumeSpecName "kube-api-access-ksg6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.736341 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/014b5ab1-7e11-42b6-b4ef-c3dba510298f-config-data" (OuterVolumeSpecName: "config-data") pod "014b5ab1-7e11-42b6-b4ef-c3dba510298f" (UID: "014b5ab1-7e11-42b6-b4ef-c3dba510298f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.750407 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/014b5ab1-7e11-42b6-b4ef-c3dba510298f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "014b5ab1-7e11-42b6-b4ef-c3dba510298f" (UID: "014b5ab1-7e11-42b6-b4ef-c3dba510298f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.751071 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/014b5ab1-7e11-42b6-b4ef-c3dba510298f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "014b5ab1-7e11-42b6-b4ef-c3dba510298f" (UID: "014b5ab1-7e11-42b6-b4ef-c3dba510298f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.773261 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/014b5ab1-7e11-42b6-b4ef-c3dba510298f-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.774422 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/014b5ab1-7e11-42b6-b4ef-c3dba510298f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.774848 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/014b5ab1-7e11-42b6-b4ef-c3dba510298f-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.774934 4750 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/014b5ab1-7e11-42b6-b4ef-c3dba510298f-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.775077 4750 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/014b5ab1-7e11-42b6-b4ef-c3dba510298f-logs\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.775180 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksg6k\" (UniqueName: \"kubernetes.io/projected/014b5ab1-7e11-42b6-b4ef-c3dba510298f-kube-api-access-ksg6k\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.775264 4750 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/014b5ab1-7e11-42b6-b4ef-c3dba510298f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.775357 4750 reconciler_common.go:286] 
"operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.814769 4750 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.873668 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-ljvpd"] Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.878259 4750 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.893380 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-tqdjh"] Oct 08 18:30:36 crc kubenswrapper[4750]: I1008 18:30:36.994431 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.244998 4750 generic.go:334] "Generic (PLEG): container finished" podID="154374c5-fc90-40da-9ac7-a98f99aca0a1" containerID="3a4d37b7d84777e50274ee7ecd6a495b1deb334ff5445b8593c34b3c4c6874f5" exitCode=0 Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.245062 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-ghwp7" event={"ID":"154374c5-fc90-40da-9ac7-a98f99aca0a1","Type":"ContainerDied","Data":"3a4d37b7d84777e50274ee7ecd6a495b1deb334ff5445b8593c34b3c4c6874f5"} Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.245086 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-ghwp7" event={"ID":"154374c5-fc90-40da-9ac7-a98f99aca0a1","Type":"ContainerStarted","Data":"ca4c1e9f537fd11a511199a2cf3636be5be163f641b1091f8517ba168081ebb1"} 
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.246827 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-tqdjh" event={"ID":"41591e4c-def9-4152-8bec-7c47ed4367e8","Type":"ContainerStarted","Data":"7b203fa29d56c786fbbfd77e88e6cc33a3546e496994e2ee4124b4307f75912d"} Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.252243 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-ljvpd" event={"ID":"3ab07eac-5578-43a8-979b-d3dba99ce3ba","Type":"ContainerStarted","Data":"7ea0afff5bdb9f582859dc596ceccf1118ad346f839baedefa4367e2be82fa4a"} Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.252287 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-ljvpd" event={"ID":"3ab07eac-5578-43a8-979b-d3dba99ce3ba","Type":"ContainerStarted","Data":"56ab38b457cc7f771d906c9e6cff3a39178244aabf9223b3b03ef9d5905e5729"} Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.254903 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69942e34-56b9-44e8-a3ea-025aaef2bcb7","Type":"ContainerStarted","Data":"36d30f06376d1697987c938239f416b0cf77e8cae26c1a9050e9ce70057d0869"} Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.280175 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-ljvpd" podStartSLOduration=6.280158064 podStartE2EDuration="6.280158064s" podCreationTimestamp="2025-10-08 18:30:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:30:37.270041196 +0000 UTC m=+1193.183012229" watchObservedRunningTime="2025-10-08 18:30:37.280158064 +0000 UTC m=+1193.193129077" Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.280620 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"014b5ab1-7e11-42b6-b4ef-c3dba510298f","Type":"ContainerDied","Data":"5ae3cd11d3ed6737715a8fd72e49a3fbcc66817aa8d3bfbb2acdc67dfea4dfd2"}
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.280664 4750 scope.go:117] "RemoveContainer" containerID="2bebcd680b17c03745fe629ed9bbe0c732bfc09be9c86b9e75d9c81f409a018a"
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.280796 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.300742 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c78787df7-xttcf" event={"ID":"5a44a1f2-be71-4ae0-b115-9ca979c931be","Type":"ContainerDied","Data":"865dbac6ce37b74ef33df3db129d708788d0b7dfac77f48e171969df671b386b"}
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.300848 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c78787df7-xttcf"
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.307488 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"845a93a1-bf5f-4820-a580-e01d2ed59416","Type":"ContainerStarted","Data":"be3fa37eeeee66c52135106d0b32e1df0c8a601b9cfff61c6d2080884d388f7a"}
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.310384 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a25ebe44-c330-48f8-9df7-5f8517cd96bd","Type":"ContainerStarted","Data":"b5d1eb65b8a7dbfa308c5f325abdf0c17ea06ea08ecbd28b4eab4670f53030dd"}
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.313408 4750 generic.go:334] "Generic (PLEG): container finished" podID="173266a7-3201-41b0-bf79-72d53fd66c2a" containerID="f4e83d2e05f58b0eee54ea56e59dc9b3132704a6e200a10d9b647d4a0023fb44" exitCode=0
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.313467 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"173266a7-3201-41b0-bf79-72d53fd66c2a","Type":"ContainerDied","Data":"f4e83d2e05f58b0eee54ea56e59dc9b3132704a6e200a10d9b647d4a0023fb44"}
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.332218 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.284493456 podStartE2EDuration="17.332194859s" podCreationTimestamp="2025-10-08 18:30:20 +0000 UTC" firstStartedPulling="2025-10-08 18:30:21.023680935 +0000 UTC m=+1176.936651938" lastFinishedPulling="2025-10-08 18:30:36.071382328 +0000 UTC m=+1191.984353341" observedRunningTime="2025-10-08 18:30:37.325853323 +0000 UTC m=+1193.238824336" watchObservedRunningTime="2025-10-08 18:30:37.332194859 +0000 UTC m=+1193.245165872"
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.359093 4750 scope.go:117] "RemoveContainer" containerID="bfd9719615835450521b7511f861f62cb2bb85abf40f24ff9d817e7a2d193db4"
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.364879 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c78787df7-xttcf"]
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.377038 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c78787df7-xttcf"]
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.396777 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.398642 4750 scope.go:117] "RemoveContainer" containerID="636c0fd3299ccd708561f50ef31954da172919d0ab9e81edc53a06d0698f61da"
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.406039 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.413029 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 08 18:30:37 crc kubenswrapper[4750]: E1008 18:30:37.413439 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="014b5ab1-7e11-42b6-b4ef-c3dba510298f" containerName="glance-log"
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.413461 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="014b5ab1-7e11-42b6-b4ef-c3dba510298f" containerName="glance-log"
Oct 08 18:30:37 crc kubenswrapper[4750]: E1008 18:30:37.413473 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a44a1f2-be71-4ae0-b115-9ca979c931be" containerName="dnsmasq-dns"
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.413480 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a44a1f2-be71-4ae0-b115-9ca979c931be" containerName="dnsmasq-dns"
Oct 08 18:30:37 crc kubenswrapper[4750]: E1008 18:30:37.413502 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a44a1f2-be71-4ae0-b115-9ca979c931be" containerName="init"
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.413507 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a44a1f2-be71-4ae0-b115-9ca979c931be" containerName="init"
Oct 08 18:30:37 crc kubenswrapper[4750]: E1008 18:30:37.413532 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="014b5ab1-7e11-42b6-b4ef-c3dba510298f" containerName="glance-httpd"
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.413585 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="014b5ab1-7e11-42b6-b4ef-c3dba510298f" containerName="glance-httpd"
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.413776 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="014b5ab1-7e11-42b6-b4ef-c3dba510298f" containerName="glance-httpd"
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.413799 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a44a1f2-be71-4ae0-b115-9ca979c931be" containerName="dnsmasq-dns"
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.413813 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="014b5ab1-7e11-42b6-b4ef-c3dba510298f" containerName="glance-log"
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.414754 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.420278 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.422195 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.431829 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.440701 4750 scope.go:117] "RemoveContainer" containerID="679d39226c41d5cc084ef441560d140a170510ddedb7cca364928eecf7577857"
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.510714 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/899027b7-067b-4ce1-a8f1-deaee627aa51-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"899027b7-067b-4ce1-a8f1-deaee627aa51\") " pod="openstack/glance-default-internal-api-0"
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.510770 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/899027b7-067b-4ce1-a8f1-deaee627aa51-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"899027b7-067b-4ce1-a8f1-deaee627aa51\") " pod="openstack/glance-default-internal-api-0"
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.510807 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/899027b7-067b-4ce1-a8f1-deaee627aa51-logs\") pod \"glance-default-internal-api-0\" (UID: \"899027b7-067b-4ce1-a8f1-deaee627aa51\") " pod="openstack/glance-default-internal-api-0"
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.510838 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/899027b7-067b-4ce1-a8f1-deaee627aa51-config-data\") pod \"glance-default-internal-api-0\" (UID: \"899027b7-067b-4ce1-a8f1-deaee627aa51\") " pod="openstack/glance-default-internal-api-0"
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.510858 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"899027b7-067b-4ce1-a8f1-deaee627aa51\") " pod="openstack/glance-default-internal-api-0"
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.510882 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/899027b7-067b-4ce1-a8f1-deaee627aa51-scripts\") pod \"glance-default-internal-api-0\" (UID: \"899027b7-067b-4ce1-a8f1-deaee627aa51\") " pod="openstack/glance-default-internal-api-0"
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.511034 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxdmc\" (UniqueName: \"kubernetes.io/projected/899027b7-067b-4ce1-a8f1-deaee627aa51-kube-api-access-sxdmc\") pod \"glance-default-internal-api-0\" (UID: \"899027b7-067b-4ce1-a8f1-deaee627aa51\") " pod="openstack/glance-default-internal-api-0"
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.511172 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/899027b7-067b-4ce1-a8f1-deaee627aa51-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"899027b7-067b-4ce1-a8f1-deaee627aa51\") " pod="openstack/glance-default-internal-api-0"
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.613451 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/899027b7-067b-4ce1-a8f1-deaee627aa51-logs\") pod \"glance-default-internal-api-0\" (UID: \"899027b7-067b-4ce1-a8f1-deaee627aa51\") " pod="openstack/glance-default-internal-api-0"
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.613514 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/899027b7-067b-4ce1-a8f1-deaee627aa51-config-data\") pod \"glance-default-internal-api-0\" (UID: \"899027b7-067b-4ce1-a8f1-deaee627aa51\") " pod="openstack/glance-default-internal-api-0"
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.613543 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"899027b7-067b-4ce1-a8f1-deaee627aa51\") " pod="openstack/glance-default-internal-api-0"
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.613591 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/899027b7-067b-4ce1-a8f1-deaee627aa51-scripts\") pod \"glance-default-internal-api-0\" (UID: \"899027b7-067b-4ce1-a8f1-deaee627aa51\") " pod="openstack/glance-default-internal-api-0"
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.613633 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxdmc\" (UniqueName: \"kubernetes.io/projected/899027b7-067b-4ce1-a8f1-deaee627aa51-kube-api-access-sxdmc\") pod \"glance-default-internal-api-0\" (UID: \"899027b7-067b-4ce1-a8f1-deaee627aa51\") " pod="openstack/glance-default-internal-api-0"
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.613684 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/899027b7-067b-4ce1-a8f1-deaee627aa51-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"899027b7-067b-4ce1-a8f1-deaee627aa51\") " pod="openstack/glance-default-internal-api-0"
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.613808 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/899027b7-067b-4ce1-a8f1-deaee627aa51-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"899027b7-067b-4ce1-a8f1-deaee627aa51\") " pod="openstack/glance-default-internal-api-0"
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.613847 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/899027b7-067b-4ce1-a8f1-deaee627aa51-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"899027b7-067b-4ce1-a8f1-deaee627aa51\") " pod="openstack/glance-default-internal-api-0"
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.613936 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/899027b7-067b-4ce1-a8f1-deaee627aa51-logs\") pod \"glance-default-internal-api-0\" (UID: \"899027b7-067b-4ce1-a8f1-deaee627aa51\") " pod="openstack/glance-default-internal-api-0"
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.614124 4750 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"899027b7-067b-4ce1-a8f1-deaee627aa51\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0"
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.626999 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/899027b7-067b-4ce1-a8f1-deaee627aa51-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"899027b7-067b-4ce1-a8f1-deaee627aa51\") " pod="openstack/glance-default-internal-api-0"
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.628559 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/899027b7-067b-4ce1-a8f1-deaee627aa51-config-data\") pod \"glance-default-internal-api-0\" (UID: \"899027b7-067b-4ce1-a8f1-deaee627aa51\") " pod="openstack/glance-default-internal-api-0"
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.631717 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/899027b7-067b-4ce1-a8f1-deaee627aa51-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"899027b7-067b-4ce1-a8f1-deaee627aa51\") " pod="openstack/glance-default-internal-api-0"
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.631894 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxdmc\" (UniqueName: \"kubernetes.io/projected/899027b7-067b-4ce1-a8f1-deaee627aa51-kube-api-access-sxdmc\") pod \"glance-default-internal-api-0\" (UID: \"899027b7-067b-4ce1-a8f1-deaee627aa51\") " pod="openstack/glance-default-internal-api-0"
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.633406 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/899027b7-067b-4ce1-a8f1-deaee627aa51-scripts\") pod \"glance-default-internal-api-0\" (UID: \"899027b7-067b-4ce1-a8f1-deaee627aa51\") " pod="openstack/glance-default-internal-api-0"
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.635146 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/899027b7-067b-4ce1-a8f1-deaee627aa51-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"899027b7-067b-4ce1-a8f1-deaee627aa51\") " pod="openstack/glance-default-internal-api-0"
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.667189 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"899027b7-067b-4ce1-a8f1-deaee627aa51\") " pod="openstack/glance-default-internal-api-0"
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.741128 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.905533 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-788b97745d-6snpn"
Oct 08 18:30:37 crc kubenswrapper[4750]: I1008 18:30:37.994952 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-788b97745d-6snpn"
Oct 08 18:30:38 crc kubenswrapper[4750]: I1008 18:30:38.326167 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69942e34-56b9-44e8-a3ea-025aaef2bcb7","Type":"ContainerStarted","Data":"dfd8fba95d457ec8fa669bbd02e8f82fa9b760dc5a21b1df87e99eab486aa353"}
Oct 08 18:30:38 crc kubenswrapper[4750]: I1008 18:30:38.326505 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69942e34-56b9-44e8-a3ea-025aaef2bcb7","Type":"ContainerStarted","Data":"692f961adfb1fcda00379d67765cfe93b588b9cc8e69fee04b9c31dd0fbb8e64"}
Oct 08 18:30:38 crc kubenswrapper[4750]: I1008 18:30:38.331330 4750 generic.go:334] "Generic (PLEG): container finished" podID="41591e4c-def9-4152-8bec-7c47ed4367e8" containerID="0cbdb9ccec21a587535b2790946f321f9a1473089f9904fa3e1601f209baa312" exitCode=0
Oct 08 18:30:38 crc kubenswrapper[4750]: I1008 18:30:38.331421 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-tqdjh" event={"ID":"41591e4c-def9-4152-8bec-7c47ed4367e8","Type":"ContainerDied","Data":"0cbdb9ccec21a587535b2790946f321f9a1473089f9904fa3e1601f209baa312"}
Oct 08 18:30:38 crc kubenswrapper[4750]: I1008 18:30:38.345685 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a25ebe44-c330-48f8-9df7-5f8517cd96bd","Type":"ContainerStarted","Data":"ea10ae3da743eb1f3f0c357c0ff737b1ceb0a0e3e2a2d97d9e6932ea21abcefc"}
Oct 08 18:30:38 crc kubenswrapper[4750]: I1008 18:30:38.354162 4750 generic.go:334] "Generic (PLEG): container finished" podID="3ab07eac-5578-43a8-979b-d3dba99ce3ba" containerID="7ea0afff5bdb9f582859dc596ceccf1118ad346f839baedefa4367e2be82fa4a" exitCode=0
Oct 08 18:30:38 crc kubenswrapper[4750]: I1008 18:30:38.355479 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-ljvpd" event={"ID":"3ab07eac-5578-43a8-979b-d3dba99ce3ba","Type":"ContainerDied","Data":"7ea0afff5bdb9f582859dc596ceccf1118ad346f839baedefa4367e2be82fa4a"}
Oct 08 18:30:38 crc kubenswrapper[4750]: I1008 18:30:38.414301 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 08 18:30:38 crc kubenswrapper[4750]: I1008 18:30:38.434030 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Oct 08 18:30:38 crc kubenswrapper[4750]: I1008 18:30:38.704760 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-ghwp7"
Oct 08 18:30:38 crc kubenswrapper[4750]: I1008 18:30:38.750441 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92mfm\" (UniqueName: \"kubernetes.io/projected/154374c5-fc90-40da-9ac7-a98f99aca0a1-kube-api-access-92mfm\") pod \"154374c5-fc90-40da-9ac7-a98f99aca0a1\" (UID: \"154374c5-fc90-40da-9ac7-a98f99aca0a1\") "
Oct 08 18:30:38 crc kubenswrapper[4750]: I1008 18:30:38.761136 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/154374c5-fc90-40da-9ac7-a98f99aca0a1-kube-api-access-92mfm" (OuterVolumeSpecName: "kube-api-access-92mfm") pod "154374c5-fc90-40da-9ac7-a98f99aca0a1" (UID: "154374c5-fc90-40da-9ac7-a98f99aca0a1"). InnerVolumeSpecName "kube-api-access-92mfm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 18:30:38 crc kubenswrapper[4750]: I1008 18:30:38.779538 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="014b5ab1-7e11-42b6-b4ef-c3dba510298f" path="/var/lib/kubelet/pods/014b5ab1-7e11-42b6-b4ef-c3dba510298f/volumes"
Oct 08 18:30:38 crc kubenswrapper[4750]: I1008 18:30:38.780535 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a44a1f2-be71-4ae0-b115-9ca979c931be" path="/var/lib/kubelet/pods/5a44a1f2-be71-4ae0-b115-9ca979c931be/volumes"
Oct 08 18:30:38 crc kubenswrapper[4750]: I1008 18:30:38.857322 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92mfm\" (UniqueName: \"kubernetes.io/projected/154374c5-fc90-40da-9ac7-a98f99aca0a1-kube-api-access-92mfm\") on node \"crc\" DevicePath \"\""
Oct 08 18:30:38 crc kubenswrapper[4750]: I1008 18:30:38.914829 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-74b95c857c-677wg"
Oct 08 18:30:38 crc kubenswrapper[4750]: I1008 18:30:38.914882 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-74b95c857c-677wg"
Oct 08 18:30:39 crc kubenswrapper[4750]: I1008 18:30:39.370247 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-ghwp7" event={"ID":"154374c5-fc90-40da-9ac7-a98f99aca0a1","Type":"ContainerDied","Data":"ca4c1e9f537fd11a511199a2cf3636be5be163f641b1091f8517ba168081ebb1"}
Oct 08 18:30:39 crc kubenswrapper[4750]: I1008 18:30:39.370530 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca4c1e9f537fd11a511199a2cf3636be5be163f641b1091f8517ba168081ebb1"
Oct 08 18:30:39 crc kubenswrapper[4750]: I1008 18:30:39.370442 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-ghwp7"
Oct 08 18:30:39 crc kubenswrapper[4750]: I1008 18:30:39.379644 4750 generic.go:334] "Generic (PLEG): container finished" podID="173266a7-3201-41b0-bf79-72d53fd66c2a" containerID="5e638894504edd794d7ce2c06f083eaa1e1b56d38dfb14d325c0dcf4b159293a" exitCode=0
Oct 08 18:30:39 crc kubenswrapper[4750]: I1008 18:30:39.379717 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"173266a7-3201-41b0-bf79-72d53fd66c2a","Type":"ContainerDied","Data":"5e638894504edd794d7ce2c06f083eaa1e1b56d38dfb14d325c0dcf4b159293a"}
Oct 08 18:30:39 crc kubenswrapper[4750]: I1008 18:30:39.386088 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"899027b7-067b-4ce1-a8f1-deaee627aa51","Type":"ContainerStarted","Data":"f9e9d1fec0d6d33d2df5f63b0549ef2ff909ed2e48e6665827f62dfd948e7ac9"}
Oct 08 18:30:39 crc kubenswrapper[4750]: I1008 18:30:39.386131 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"899027b7-067b-4ce1-a8f1-deaee627aa51","Type":"ContainerStarted","Data":"a38418e1c7a283b12f176b73e0375e2e95ef5f7c49d26db91f928f456ea00d69"}
Oct 08 18:30:39 crc kubenswrapper[4750]: I1008 18:30:39.397399 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a25ebe44-c330-48f8-9df7-5f8517cd96bd","Type":"ContainerStarted","Data":"e8cbb35efffa0dfc5317921463dc9d7e30007f7e471dd1e64c7dfbb694f8a5d9"}
Oct 08 18:30:39 crc kubenswrapper[4750]: I1008 18:30:39.424569 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=8.424533783 podStartE2EDuration="8.424533783s" podCreationTimestamp="2025-10-08 18:30:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:30:39.419653004 +0000 UTC m=+1195.332624017" watchObservedRunningTime="2025-10-08 18:30:39.424533783 +0000 UTC m=+1195.337504796"
Oct 08 18:30:39 crc kubenswrapper[4750]: I1008 18:30:39.781985 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-tqdjh"
Oct 08 18:30:39 crc kubenswrapper[4750]: I1008 18:30:39.878884 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdnmx\" (UniqueName: \"kubernetes.io/projected/41591e4c-def9-4152-8bec-7c47ed4367e8-kube-api-access-hdnmx\") pod \"41591e4c-def9-4152-8bec-7c47ed4367e8\" (UID: \"41591e4c-def9-4152-8bec-7c47ed4367e8\") "
Oct 08 18:30:39 crc kubenswrapper[4750]: I1008 18:30:39.888192 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41591e4c-def9-4152-8bec-7c47ed4367e8-kube-api-access-hdnmx" (OuterVolumeSpecName: "kube-api-access-hdnmx") pod "41591e4c-def9-4152-8bec-7c47ed4367e8" (UID: "41591e4c-def9-4152-8bec-7c47ed4367e8"). InnerVolumeSpecName "kube-api-access-hdnmx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 18:30:39 crc kubenswrapper[4750]: I1008 18:30:39.975532 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 08 18:30:39 crc kubenswrapper[4750]: I1008 18:30:39.982119 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdnmx\" (UniqueName: \"kubernetes.io/projected/41591e4c-def9-4152-8bec-7c47ed4367e8-kube-api-access-hdnmx\") on node \"crc\" DevicePath \"\""
Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.083235 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/173266a7-3201-41b0-bf79-72d53fd66c2a-combined-ca-bundle\") pod \"173266a7-3201-41b0-bf79-72d53fd66c2a\" (UID: \"173266a7-3201-41b0-bf79-72d53fd66c2a\") "
Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.083369 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/173266a7-3201-41b0-bf79-72d53fd66c2a-scripts\") pod \"173266a7-3201-41b0-bf79-72d53fd66c2a\" (UID: \"173266a7-3201-41b0-bf79-72d53fd66c2a\") "
Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.083438 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/173266a7-3201-41b0-bf79-72d53fd66c2a-config-data-custom\") pod \"173266a7-3201-41b0-bf79-72d53fd66c2a\" (UID: \"173266a7-3201-41b0-bf79-72d53fd66c2a\") "
Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.083540 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/173266a7-3201-41b0-bf79-72d53fd66c2a-etc-machine-id\") pod \"173266a7-3201-41b0-bf79-72d53fd66c2a\" (UID: \"173266a7-3201-41b0-bf79-72d53fd66c2a\") "
Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.083599 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kk6r4\" (UniqueName: \"kubernetes.io/projected/173266a7-3201-41b0-bf79-72d53fd66c2a-kube-api-access-kk6r4\") pod \"173266a7-3201-41b0-bf79-72d53fd66c2a\" (UID: \"173266a7-3201-41b0-bf79-72d53fd66c2a\") "
Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.083682 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/173266a7-3201-41b0-bf79-72d53fd66c2a-config-data\") pod \"173266a7-3201-41b0-bf79-72d53fd66c2a\" (UID: \"173266a7-3201-41b0-bf79-72d53fd66c2a\") "
Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.085653 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/173266a7-3201-41b0-bf79-72d53fd66c2a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "173266a7-3201-41b0-bf79-72d53fd66c2a" (UID: "173266a7-3201-41b0-bf79-72d53fd66c2a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.087932 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/173266a7-3201-41b0-bf79-72d53fd66c2a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "173266a7-3201-41b0-bf79-72d53fd66c2a" (UID: "173266a7-3201-41b0-bf79-72d53fd66c2a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.088452 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-ljvpd"
Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.090265 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/173266a7-3201-41b0-bf79-72d53fd66c2a-kube-api-access-kk6r4" (OuterVolumeSpecName: "kube-api-access-kk6r4") pod "173266a7-3201-41b0-bf79-72d53fd66c2a" (UID: "173266a7-3201-41b0-bf79-72d53fd66c2a"). InnerVolumeSpecName "kube-api-access-kk6r4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.093636 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/173266a7-3201-41b0-bf79-72d53fd66c2a-scripts" (OuterVolumeSpecName: "scripts") pod "173266a7-3201-41b0-bf79-72d53fd66c2a" (UID: "173266a7-3201-41b0-bf79-72d53fd66c2a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.149664 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/173266a7-3201-41b0-bf79-72d53fd66c2a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "173266a7-3201-41b0-bf79-72d53fd66c2a" (UID: "173266a7-3201-41b0-bf79-72d53fd66c2a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.185058 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fcdk\" (UniqueName: \"kubernetes.io/projected/3ab07eac-5578-43a8-979b-d3dba99ce3ba-kube-api-access-9fcdk\") pod \"3ab07eac-5578-43a8-979b-d3dba99ce3ba\" (UID: \"3ab07eac-5578-43a8-979b-d3dba99ce3ba\") "
Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.185699 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/173266a7-3201-41b0-bf79-72d53fd66c2a-scripts\") on node \"crc\" DevicePath \"\""
Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.185719 4750 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/173266a7-3201-41b0-bf79-72d53fd66c2a-config-data-custom\") on node \"crc\" DevicePath \"\""
Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.185731 4750 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/173266a7-3201-41b0-bf79-72d53fd66c2a-etc-machine-id\") on node \"crc\" DevicePath \"\""
Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.185739 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kk6r4\" (UniqueName: \"kubernetes.io/projected/173266a7-3201-41b0-bf79-72d53fd66c2a-kube-api-access-kk6r4\") on node \"crc\" DevicePath \"\""
Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.185747 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/173266a7-3201-41b0-bf79-72d53fd66c2a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.188187 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab07eac-5578-43a8-979b-d3dba99ce3ba-kube-api-access-9fcdk" (OuterVolumeSpecName: "kube-api-access-9fcdk") pod "3ab07eac-5578-43a8-979b-d3dba99ce3ba" (UID: "3ab07eac-5578-43a8-979b-d3dba99ce3ba"). InnerVolumeSpecName "kube-api-access-9fcdk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.188685 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/173266a7-3201-41b0-bf79-72d53fd66c2a-config-data" (OuterVolumeSpecName: "config-data") pod "173266a7-3201-41b0-bf79-72d53fd66c2a" (UID: "173266a7-3201-41b0-bf79-72d53fd66c2a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.287106 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fcdk\" (UniqueName: \"kubernetes.io/projected/3ab07eac-5578-43a8-979b-d3dba99ce3ba-kube-api-access-9fcdk\") on node \"crc\" DevicePath \"\""
Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.287420 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/173266a7-3201-41b0-bf79-72d53fd66c2a-config-data\") on node \"crc\" DevicePath \"\""
Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.409531 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69942e34-56b9-44e8-a3ea-025aaef2bcb7","Type":"ContainerStarted","Data":"946061db579d7ba5d01bb265ffbd09e5ae31c00a58ff36a5e266a46a97f74702"}
Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.409733 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="69942e34-56b9-44e8-a3ea-025aaef2bcb7" containerName="ceilometer-central-agent" containerID="cri-o://36d30f06376d1697987c938239f416b0cf77e8cae26c1a9050e9ce70057d0869" gracePeriod=30
Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.409963 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.410200 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="69942e34-56b9-44e8-a3ea-025aaef2bcb7" containerName="proxy-httpd" containerID="cri-o://946061db579d7ba5d01bb265ffbd09e5ae31c00a58ff36a5e266a46a97f74702" gracePeriod=30
Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.410247 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="69942e34-56b9-44e8-a3ea-025aaef2bcb7" containerName="sg-core" containerID="cri-o://dfd8fba95d457ec8fa669bbd02e8f82fa9b760dc5a21b1df87e99eab486aa353" gracePeriod=30
Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.410280 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="69942e34-56b9-44e8-a3ea-025aaef2bcb7" containerName="ceilometer-notification-agent" containerID="cri-o://692f961adfb1fcda00379d67765cfe93b588b9cc8e69fee04b9c31dd0fbb8e64" gracePeriod=30
Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.412815 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-tqdjh" event={"ID":"41591e4c-def9-4152-8bec-7c47ed4367e8","Type":"ContainerDied","Data":"7b203fa29d56c786fbbfd77e88e6cc33a3546e496994e2ee4124b4307f75912d"}
Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.412850 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b203fa29d56c786fbbfd77e88e6cc33a3546e496994e2ee4124b4307f75912d"
Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.412909 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-tqdjh"
Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.420171 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"173266a7-3201-41b0-bf79-72d53fd66c2a","Type":"ContainerDied","Data":"1d3fe6e41369e7f9f0fc7b0b8f076b10d7a03b3c78a7e88364bf72a098630996"}
Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.420213 4750 scope.go:117] "RemoveContainer" containerID="f4e83d2e05f58b0eee54ea56e59dc9b3132704a6e200a10d9b647d4a0023fb44"
Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.420319 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.432611 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"899027b7-067b-4ce1-a8f1-deaee627aa51","Type":"ContainerStarted","Data":"3d510e42701bfd35cfca1c00a2144607ca13cee7addd057debd015b58c8b3227"}
Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.441053 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-ljvpd"
Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.441309 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-ljvpd" event={"ID":"3ab07eac-5578-43a8-979b-d3dba99ce3ba","Type":"ContainerDied","Data":"56ab38b457cc7f771d906c9e6cff3a39178244aabf9223b3b03ef9d5905e5729"}
Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.441341 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56ab38b457cc7f771d906c9e6cff3a39178244aabf9223b3b03ef9d5905e5729"
Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.441689 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.133667179 podStartE2EDuration="11.441679914s" podCreationTimestamp="2025-10-08 18:30:29 +0000 UTC" firstStartedPulling="2025-10-08 18:30:30.130184623 +0000 UTC m=+1186.043155636" lastFinishedPulling="2025-10-08 18:30:39.438197358 +0000 UTC m=+1195.351168371" observedRunningTime="2025-10-08 18:30:40.435600585 +0000 UTC m=+1196.348571618" watchObservedRunningTime="2025-10-08 18:30:40.441679914 +0000 UTC m=+1196.354650927"
Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.499175 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.499153522 podStartE2EDuration="3.499153522s" podCreationTimestamp="2025-10-08 18:30:37 +0000
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:30:40.476656221 +0000 UTC m=+1196.389627234" watchObservedRunningTime="2025-10-08 18:30:40.499153522 +0000 UTC m=+1196.412124535" Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.501021 4750 scope.go:117] "RemoveContainer" containerID="5e638894504edd794d7ce2c06f083eaa1e1b56d38dfb14d325c0dcf4b159293a" Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.517879 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.530847 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.548584 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 18:30:40 crc kubenswrapper[4750]: E1008 18:30:40.549038 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="154374c5-fc90-40da-9ac7-a98f99aca0a1" containerName="mariadb-database-create" Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.549054 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="154374c5-fc90-40da-9ac7-a98f99aca0a1" containerName="mariadb-database-create" Oct 08 18:30:40 crc kubenswrapper[4750]: E1008 18:30:40.549066 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="173266a7-3201-41b0-bf79-72d53fd66c2a" containerName="cinder-scheduler" Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.549074 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="173266a7-3201-41b0-bf79-72d53fd66c2a" containerName="cinder-scheduler" Oct 08 18:30:40 crc kubenswrapper[4750]: E1008 18:30:40.549083 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ab07eac-5578-43a8-979b-d3dba99ce3ba" containerName="mariadb-database-create" Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.549094 
4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ab07eac-5578-43a8-979b-d3dba99ce3ba" containerName="mariadb-database-create" Oct 08 18:30:40 crc kubenswrapper[4750]: E1008 18:30:40.549124 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="173266a7-3201-41b0-bf79-72d53fd66c2a" containerName="probe" Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.549131 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="173266a7-3201-41b0-bf79-72d53fd66c2a" containerName="probe" Oct 08 18:30:40 crc kubenswrapper[4750]: E1008 18:30:40.549146 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41591e4c-def9-4152-8bec-7c47ed4367e8" containerName="mariadb-database-create" Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.549153 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="41591e4c-def9-4152-8bec-7c47ed4367e8" containerName="mariadb-database-create" Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.549367 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="173266a7-3201-41b0-bf79-72d53fd66c2a" containerName="cinder-scheduler" Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.549381 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="173266a7-3201-41b0-bf79-72d53fd66c2a" containerName="probe" Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.549394 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ab07eac-5578-43a8-979b-d3dba99ce3ba" containerName="mariadb-database-create" Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.549409 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="41591e4c-def9-4152-8bec-7c47ed4367e8" containerName="mariadb-database-create" Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.549425 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="154374c5-fc90-40da-9ac7-a98f99aca0a1" containerName="mariadb-database-create" Oct 08 18:30:40 crc kubenswrapper[4750]: 
I1008 18:30:40.554823 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.558628 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.568099 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.693501 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60379ea9-0750-4de0-9d3b-13af080eea8f-scripts\") pod \"cinder-scheduler-0\" (UID: \"60379ea9-0750-4de0-9d3b-13af080eea8f\") " pod="openstack/cinder-scheduler-0" Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.693608 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60379ea9-0750-4de0-9d3b-13af080eea8f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"60379ea9-0750-4de0-9d3b-13af080eea8f\") " pod="openstack/cinder-scheduler-0" Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.693671 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60379ea9-0750-4de0-9d3b-13af080eea8f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"60379ea9-0750-4de0-9d3b-13af080eea8f\") " pod="openstack/cinder-scheduler-0" Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.693743 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60379ea9-0750-4de0-9d3b-13af080eea8f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"60379ea9-0750-4de0-9d3b-13af080eea8f\") " 
pod="openstack/cinder-scheduler-0" Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.693845 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60379ea9-0750-4de0-9d3b-13af080eea8f-config-data\") pod \"cinder-scheduler-0\" (UID: \"60379ea9-0750-4de0-9d3b-13af080eea8f\") " pod="openstack/cinder-scheduler-0" Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.693916 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7hzp\" (UniqueName: \"kubernetes.io/projected/60379ea9-0750-4de0-9d3b-13af080eea8f-kube-api-access-l7hzp\") pod \"cinder-scheduler-0\" (UID: \"60379ea9-0750-4de0-9d3b-13af080eea8f\") " pod="openstack/cinder-scheduler-0" Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.747361 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="173266a7-3201-41b0-bf79-72d53fd66c2a" path="/var/lib/kubelet/pods/173266a7-3201-41b0-bf79-72d53fd66c2a/volumes" Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.795334 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60379ea9-0750-4de0-9d3b-13af080eea8f-config-data\") pod \"cinder-scheduler-0\" (UID: \"60379ea9-0750-4de0-9d3b-13af080eea8f\") " pod="openstack/cinder-scheduler-0" Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.795385 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7hzp\" (UniqueName: \"kubernetes.io/projected/60379ea9-0750-4de0-9d3b-13af080eea8f-kube-api-access-l7hzp\") pod \"cinder-scheduler-0\" (UID: \"60379ea9-0750-4de0-9d3b-13af080eea8f\") " pod="openstack/cinder-scheduler-0" Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.795460 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/60379ea9-0750-4de0-9d3b-13af080eea8f-scripts\") pod \"cinder-scheduler-0\" (UID: \"60379ea9-0750-4de0-9d3b-13af080eea8f\") " pod="openstack/cinder-scheduler-0" Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.795493 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60379ea9-0750-4de0-9d3b-13af080eea8f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"60379ea9-0750-4de0-9d3b-13af080eea8f\") " pod="openstack/cinder-scheduler-0" Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.795524 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60379ea9-0750-4de0-9d3b-13af080eea8f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"60379ea9-0750-4de0-9d3b-13af080eea8f\") " pod="openstack/cinder-scheduler-0" Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.795543 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60379ea9-0750-4de0-9d3b-13af080eea8f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"60379ea9-0750-4de0-9d3b-13af080eea8f\") " pod="openstack/cinder-scheduler-0" Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.796235 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60379ea9-0750-4de0-9d3b-13af080eea8f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"60379ea9-0750-4de0-9d3b-13af080eea8f\") " pod="openstack/cinder-scheduler-0" Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.800884 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60379ea9-0750-4de0-9d3b-13af080eea8f-scripts\") pod \"cinder-scheduler-0\" (UID: \"60379ea9-0750-4de0-9d3b-13af080eea8f\") " 
pod="openstack/cinder-scheduler-0" Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.801001 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60379ea9-0750-4de0-9d3b-13af080eea8f-config-data\") pod \"cinder-scheduler-0\" (UID: \"60379ea9-0750-4de0-9d3b-13af080eea8f\") " pod="openstack/cinder-scheduler-0" Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.801152 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60379ea9-0750-4de0-9d3b-13af080eea8f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"60379ea9-0750-4de0-9d3b-13af080eea8f\") " pod="openstack/cinder-scheduler-0" Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.812203 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60379ea9-0750-4de0-9d3b-13af080eea8f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"60379ea9-0750-4de0-9d3b-13af080eea8f\") " pod="openstack/cinder-scheduler-0" Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.816234 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7hzp\" (UniqueName: \"kubernetes.io/projected/60379ea9-0750-4de0-9d3b-13af080eea8f-kube-api-access-l7hzp\") pod \"cinder-scheduler-0\" (UID: \"60379ea9-0750-4de0-9d3b-13af080eea8f\") " pod="openstack/cinder-scheduler-0" Oct 08 18:30:40 crc kubenswrapper[4750]: I1008 18:30:40.883713 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.091919 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-d8db-account-create-hswqx"] Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.093199 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-d8db-account-create-hswqx" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.095784 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.111851 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-d8db-account-create-hswqx"] Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.204029 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnrw4\" (UniqueName: \"kubernetes.io/projected/5253a72b-fa57-4150-92f0-0d6172aca7f0-kube-api-access-gnrw4\") pod \"nova-api-d8db-account-create-hswqx\" (UID: \"5253a72b-fa57-4150-92f0-0d6172aca7f0\") " pod="openstack/nova-api-d8db-account-create-hswqx" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.292648 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-a9e2-account-create-btj9m"] Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.305509 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnrw4\" (UniqueName: \"kubernetes.io/projected/5253a72b-fa57-4150-92f0-0d6172aca7f0-kube-api-access-gnrw4\") pod \"nova-api-d8db-account-create-hswqx\" (UID: \"5253a72b-fa57-4150-92f0-0d6172aca7f0\") " pod="openstack/nova-api-d8db-account-create-hswqx" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.308185 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-a9e2-account-create-btj9m" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.309982 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.320265 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-a9e2-account-create-btj9m"] Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.329249 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnrw4\" (UniqueName: \"kubernetes.io/projected/5253a72b-fa57-4150-92f0-0d6172aca7f0-kube-api-access-gnrw4\") pod \"nova-api-d8db-account-create-hswqx\" (UID: \"5253a72b-fa57-4150-92f0-0d6172aca7f0\") " pod="openstack/nova-api-d8db-account-create-hswqx" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.344872 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.406280 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69942e34-56b9-44e8-a3ea-025aaef2bcb7-sg-core-conf-yaml\") pod \"69942e34-56b9-44e8-a3ea-025aaef2bcb7\" (UID: \"69942e34-56b9-44e8-a3ea-025aaef2bcb7\") " Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.406450 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69942e34-56b9-44e8-a3ea-025aaef2bcb7-config-data\") pod \"69942e34-56b9-44e8-a3ea-025aaef2bcb7\" (UID: \"69942e34-56b9-44e8-a3ea-025aaef2bcb7\") " Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.406498 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldjfq\" (UniqueName: \"kubernetes.io/projected/69942e34-56b9-44e8-a3ea-025aaef2bcb7-kube-api-access-ldjfq\") pod 
\"69942e34-56b9-44e8-a3ea-025aaef2bcb7\" (UID: \"69942e34-56b9-44e8-a3ea-025aaef2bcb7\") " Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.406617 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69942e34-56b9-44e8-a3ea-025aaef2bcb7-run-httpd\") pod \"69942e34-56b9-44e8-a3ea-025aaef2bcb7\" (UID: \"69942e34-56b9-44e8-a3ea-025aaef2bcb7\") " Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.406665 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69942e34-56b9-44e8-a3ea-025aaef2bcb7-combined-ca-bundle\") pod \"69942e34-56b9-44e8-a3ea-025aaef2bcb7\" (UID: \"69942e34-56b9-44e8-a3ea-025aaef2bcb7\") " Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.406686 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69942e34-56b9-44e8-a3ea-025aaef2bcb7-scripts\") pod \"69942e34-56b9-44e8-a3ea-025aaef2bcb7\" (UID: \"69942e34-56b9-44e8-a3ea-025aaef2bcb7\") " Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.406718 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69942e34-56b9-44e8-a3ea-025aaef2bcb7-log-httpd\") pod \"69942e34-56b9-44e8-a3ea-025aaef2bcb7\" (UID: \"69942e34-56b9-44e8-a3ea-025aaef2bcb7\") " Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.406939 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rppks\" (UniqueName: \"kubernetes.io/projected/099005c9-52af-4489-9103-d6c82b1c82b2-kube-api-access-rppks\") pod \"nova-cell0-a9e2-account-create-btj9m\" (UID: \"099005c9-52af-4489-9103-d6c82b1c82b2\") " pod="openstack/nova-cell0-a9e2-account-create-btj9m" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.407061 4750 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69942e34-56b9-44e8-a3ea-025aaef2bcb7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "69942e34-56b9-44e8-a3ea-025aaef2bcb7" (UID: "69942e34-56b9-44e8-a3ea-025aaef2bcb7"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.407430 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69942e34-56b9-44e8-a3ea-025aaef2bcb7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "69942e34-56b9-44e8-a3ea-025aaef2bcb7" (UID: "69942e34-56b9-44e8-a3ea-025aaef2bcb7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.410540 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69942e34-56b9-44e8-a3ea-025aaef2bcb7-scripts" (OuterVolumeSpecName: "scripts") pod "69942e34-56b9-44e8-a3ea-025aaef2bcb7" (UID: "69942e34-56b9-44e8-a3ea-025aaef2bcb7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.410833 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69942e34-56b9-44e8-a3ea-025aaef2bcb7-kube-api-access-ldjfq" (OuterVolumeSpecName: "kube-api-access-ldjfq") pod "69942e34-56b9-44e8-a3ea-025aaef2bcb7" (UID: "69942e34-56b9-44e8-a3ea-025aaef2bcb7"). InnerVolumeSpecName "kube-api-access-ldjfq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.444863 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69942e34-56b9-44e8-a3ea-025aaef2bcb7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "69942e34-56b9-44e8-a3ea-025aaef2bcb7" (UID: "69942e34-56b9-44e8-a3ea-025aaef2bcb7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.455637 4750 generic.go:334] "Generic (PLEG): container finished" podID="69942e34-56b9-44e8-a3ea-025aaef2bcb7" containerID="946061db579d7ba5d01bb265ffbd09e5ae31c00a58ff36a5e266a46a97f74702" exitCode=0 Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.455678 4750 generic.go:334] "Generic (PLEG): container finished" podID="69942e34-56b9-44e8-a3ea-025aaef2bcb7" containerID="dfd8fba95d457ec8fa669bbd02e8f82fa9b760dc5a21b1df87e99eab486aa353" exitCode=2 Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.455687 4750 generic.go:334] "Generic (PLEG): container finished" podID="69942e34-56b9-44e8-a3ea-025aaef2bcb7" containerID="692f961adfb1fcda00379d67765cfe93b588b9cc8e69fee04b9c31dd0fbb8e64" exitCode=0 Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.455694 4750 generic.go:334] "Generic (PLEG): container finished" podID="69942e34-56b9-44e8-a3ea-025aaef2bcb7" containerID="36d30f06376d1697987c938239f416b0cf77e8cae26c1a9050e9ce70057d0869" exitCode=0 Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.455798 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.455825 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69942e34-56b9-44e8-a3ea-025aaef2bcb7","Type":"ContainerDied","Data":"946061db579d7ba5d01bb265ffbd09e5ae31c00a58ff36a5e266a46a97f74702"} Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.455907 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69942e34-56b9-44e8-a3ea-025aaef2bcb7","Type":"ContainerDied","Data":"dfd8fba95d457ec8fa669bbd02e8f82fa9b760dc5a21b1df87e99eab486aa353"} Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.455950 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69942e34-56b9-44e8-a3ea-025aaef2bcb7","Type":"ContainerDied","Data":"692f961adfb1fcda00379d67765cfe93b588b9cc8e69fee04b9c31dd0fbb8e64"} Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.455965 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69942e34-56b9-44e8-a3ea-025aaef2bcb7","Type":"ContainerDied","Data":"36d30f06376d1697987c938239f416b0cf77e8cae26c1a9050e9ce70057d0869"} Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.455979 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69942e34-56b9-44e8-a3ea-025aaef2bcb7","Type":"ContainerDied","Data":"4e3a3c3bd972dc06625721a5713b1696557110f3b25a44dd0c3949447f86e466"} Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.456023 4750 scope.go:117] "RemoveContainer" containerID="946061db579d7ba5d01bb265ffbd09e5ae31c00a58ff36a5e266a46a97f74702" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.463891 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-d8db-account-create-hswqx" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.497688 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-88ae-account-create-dzf7x"] Oct 08 18:30:41 crc kubenswrapper[4750]: E1008 18:30:41.498123 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69942e34-56b9-44e8-a3ea-025aaef2bcb7" containerName="ceilometer-notification-agent" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.498140 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="69942e34-56b9-44e8-a3ea-025aaef2bcb7" containerName="ceilometer-notification-agent" Oct 08 18:30:41 crc kubenswrapper[4750]: E1008 18:30:41.498228 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69942e34-56b9-44e8-a3ea-025aaef2bcb7" containerName="proxy-httpd" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.498239 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="69942e34-56b9-44e8-a3ea-025aaef2bcb7" containerName="proxy-httpd" Oct 08 18:30:41 crc kubenswrapper[4750]: E1008 18:30:41.498264 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69942e34-56b9-44e8-a3ea-025aaef2bcb7" containerName="ceilometer-central-agent" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.498272 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="69942e34-56b9-44e8-a3ea-025aaef2bcb7" containerName="ceilometer-central-agent" Oct 08 18:30:41 crc kubenswrapper[4750]: E1008 18:30:41.498293 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69942e34-56b9-44e8-a3ea-025aaef2bcb7" containerName="sg-core" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.498300 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="69942e34-56b9-44e8-a3ea-025aaef2bcb7" containerName="sg-core" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.498526 4750 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="69942e34-56b9-44e8-a3ea-025aaef2bcb7" containerName="sg-core" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.498537 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="69942e34-56b9-44e8-a3ea-025aaef2bcb7" containerName="ceilometer-notification-agent" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.498571 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="69942e34-56b9-44e8-a3ea-025aaef2bcb7" containerName="proxy-httpd" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.498595 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="69942e34-56b9-44e8-a3ea-025aaef2bcb7" containerName="ceilometer-central-agent" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.499285 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-88ae-account-create-dzf7x" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.502384 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.505191 4750 scope.go:117] "RemoveContainer" containerID="dfd8fba95d457ec8fa669bbd02e8f82fa9b760dc5a21b1df87e99eab486aa353" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.509916 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rppks\" (UniqueName: \"kubernetes.io/projected/099005c9-52af-4489-9103-d6c82b1c82b2-kube-api-access-rppks\") pod \"nova-cell0-a9e2-account-create-btj9m\" (UID: \"099005c9-52af-4489-9103-d6c82b1c82b2\") " pod="openstack/nova-cell0-a9e2-account-create-btj9m" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.510071 4750 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69942e34-56b9-44e8-a3ea-025aaef2bcb7-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.510096 4750 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69942e34-56b9-44e8-a3ea-025aaef2bcb7-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.510108 4750 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69942e34-56b9-44e8-a3ea-025aaef2bcb7-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.510252 4750 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69942e34-56b9-44e8-a3ea-025aaef2bcb7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.510268 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldjfq\" (UniqueName: \"kubernetes.io/projected/69942e34-56b9-44e8-a3ea-025aaef2bcb7-kube-api-access-ldjfq\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.520363 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.530587 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-88ae-account-create-dzf7x"] Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.532713 4750 scope.go:117] "RemoveContainer" containerID="692f961adfb1fcda00379d67765cfe93b588b9cc8e69fee04b9c31dd0fbb8e64" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.542828 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rppks\" (UniqueName: \"kubernetes.io/projected/099005c9-52af-4489-9103-d6c82b1c82b2-kube-api-access-rppks\") pod \"nova-cell0-a9e2-account-create-btj9m\" (UID: \"099005c9-52af-4489-9103-d6c82b1c82b2\") " pod="openstack/nova-cell0-a9e2-account-create-btj9m" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.557404 4750 scope.go:117] 
"RemoveContainer" containerID="36d30f06376d1697987c938239f416b0cf77e8cae26c1a9050e9ce70057d0869" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.562654 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69942e34-56b9-44e8-a3ea-025aaef2bcb7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69942e34-56b9-44e8-a3ea-025aaef2bcb7" (UID: "69942e34-56b9-44e8-a3ea-025aaef2bcb7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.574379 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69942e34-56b9-44e8-a3ea-025aaef2bcb7-config-data" (OuterVolumeSpecName: "config-data") pod "69942e34-56b9-44e8-a3ea-025aaef2bcb7" (UID: "69942e34-56b9-44e8-a3ea-025aaef2bcb7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.586608 4750 scope.go:117] "RemoveContainer" containerID="946061db579d7ba5d01bb265ffbd09e5ae31c00a58ff36a5e266a46a97f74702" Oct 08 18:30:41 crc kubenswrapper[4750]: E1008 18:30:41.610772 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"946061db579d7ba5d01bb265ffbd09e5ae31c00a58ff36a5e266a46a97f74702\": container with ID starting with 946061db579d7ba5d01bb265ffbd09e5ae31c00a58ff36a5e266a46a97f74702 not found: ID does not exist" containerID="946061db579d7ba5d01bb265ffbd09e5ae31c00a58ff36a5e266a46a97f74702" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.610962 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"946061db579d7ba5d01bb265ffbd09e5ae31c00a58ff36a5e266a46a97f74702"} err="failed to get container status \"946061db579d7ba5d01bb265ffbd09e5ae31c00a58ff36a5e266a46a97f74702\": rpc error: code = NotFound desc = could not 
find container \"946061db579d7ba5d01bb265ffbd09e5ae31c00a58ff36a5e266a46a97f74702\": container with ID starting with 946061db579d7ba5d01bb265ffbd09e5ae31c00a58ff36a5e266a46a97f74702 not found: ID does not exist" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.610992 4750 scope.go:117] "RemoveContainer" containerID="dfd8fba95d457ec8fa669bbd02e8f82fa9b760dc5a21b1df87e99eab486aa353" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.612135 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbm7r\" (UniqueName: \"kubernetes.io/projected/e175b5f7-2b09-423e-8492-7802da4e1ec1-kube-api-access-lbm7r\") pod \"nova-cell1-88ae-account-create-dzf7x\" (UID: \"e175b5f7-2b09-423e-8492-7802da4e1ec1\") " pod="openstack/nova-cell1-88ae-account-create-dzf7x" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.612258 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69942e34-56b9-44e8-a3ea-025aaef2bcb7-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.612269 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69942e34-56b9-44e8-a3ea-025aaef2bcb7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:41 crc kubenswrapper[4750]: E1008 18:30:41.619806 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfd8fba95d457ec8fa669bbd02e8f82fa9b760dc5a21b1df87e99eab486aa353\": container with ID starting with dfd8fba95d457ec8fa669bbd02e8f82fa9b760dc5a21b1df87e99eab486aa353 not found: ID does not exist" containerID="dfd8fba95d457ec8fa669bbd02e8f82fa9b760dc5a21b1df87e99eab486aa353" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.619855 4750 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"dfd8fba95d457ec8fa669bbd02e8f82fa9b760dc5a21b1df87e99eab486aa353"} err="failed to get container status \"dfd8fba95d457ec8fa669bbd02e8f82fa9b760dc5a21b1df87e99eab486aa353\": rpc error: code = NotFound desc = could not find container \"dfd8fba95d457ec8fa669bbd02e8f82fa9b760dc5a21b1df87e99eab486aa353\": container with ID starting with dfd8fba95d457ec8fa669bbd02e8f82fa9b760dc5a21b1df87e99eab486aa353 not found: ID does not exist" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.619888 4750 scope.go:117] "RemoveContainer" containerID="692f961adfb1fcda00379d67765cfe93b588b9cc8e69fee04b9c31dd0fbb8e64" Oct 08 18:30:41 crc kubenswrapper[4750]: E1008 18:30:41.624293 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"692f961adfb1fcda00379d67765cfe93b588b9cc8e69fee04b9c31dd0fbb8e64\": container with ID starting with 692f961adfb1fcda00379d67765cfe93b588b9cc8e69fee04b9c31dd0fbb8e64 not found: ID does not exist" containerID="692f961adfb1fcda00379d67765cfe93b588b9cc8e69fee04b9c31dd0fbb8e64" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.624321 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"692f961adfb1fcda00379d67765cfe93b588b9cc8e69fee04b9c31dd0fbb8e64"} err="failed to get container status \"692f961adfb1fcda00379d67765cfe93b588b9cc8e69fee04b9c31dd0fbb8e64\": rpc error: code = NotFound desc = could not find container \"692f961adfb1fcda00379d67765cfe93b588b9cc8e69fee04b9c31dd0fbb8e64\": container with ID starting with 692f961adfb1fcda00379d67765cfe93b588b9cc8e69fee04b9c31dd0fbb8e64 not found: ID does not exist" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.624340 4750 scope.go:117] "RemoveContainer" containerID="36d30f06376d1697987c938239f416b0cf77e8cae26c1a9050e9ce70057d0869" Oct 08 18:30:41 crc kubenswrapper[4750]: E1008 18:30:41.633298 4750 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"36d30f06376d1697987c938239f416b0cf77e8cae26c1a9050e9ce70057d0869\": container with ID starting with 36d30f06376d1697987c938239f416b0cf77e8cae26c1a9050e9ce70057d0869 not found: ID does not exist" containerID="36d30f06376d1697987c938239f416b0cf77e8cae26c1a9050e9ce70057d0869" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.633342 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36d30f06376d1697987c938239f416b0cf77e8cae26c1a9050e9ce70057d0869"} err="failed to get container status \"36d30f06376d1697987c938239f416b0cf77e8cae26c1a9050e9ce70057d0869\": rpc error: code = NotFound desc = could not find container \"36d30f06376d1697987c938239f416b0cf77e8cae26c1a9050e9ce70057d0869\": container with ID starting with 36d30f06376d1697987c938239f416b0cf77e8cae26c1a9050e9ce70057d0869 not found: ID does not exist" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.633395 4750 scope.go:117] "RemoveContainer" containerID="946061db579d7ba5d01bb265ffbd09e5ae31c00a58ff36a5e266a46a97f74702" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.633698 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"946061db579d7ba5d01bb265ffbd09e5ae31c00a58ff36a5e266a46a97f74702"} err="failed to get container status \"946061db579d7ba5d01bb265ffbd09e5ae31c00a58ff36a5e266a46a97f74702\": rpc error: code = NotFound desc = could not find container \"946061db579d7ba5d01bb265ffbd09e5ae31c00a58ff36a5e266a46a97f74702\": container with ID starting with 946061db579d7ba5d01bb265ffbd09e5ae31c00a58ff36a5e266a46a97f74702 not found: ID does not exist" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.633719 4750 scope.go:117] "RemoveContainer" containerID="dfd8fba95d457ec8fa669bbd02e8f82fa9b760dc5a21b1df87e99eab486aa353" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.634358 4750 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfd8fba95d457ec8fa669bbd02e8f82fa9b760dc5a21b1df87e99eab486aa353"} err="failed to get container status \"dfd8fba95d457ec8fa669bbd02e8f82fa9b760dc5a21b1df87e99eab486aa353\": rpc error: code = NotFound desc = could not find container \"dfd8fba95d457ec8fa669bbd02e8f82fa9b760dc5a21b1df87e99eab486aa353\": container with ID starting with dfd8fba95d457ec8fa669bbd02e8f82fa9b760dc5a21b1df87e99eab486aa353 not found: ID does not exist" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.634402 4750 scope.go:117] "RemoveContainer" containerID="692f961adfb1fcda00379d67765cfe93b588b9cc8e69fee04b9c31dd0fbb8e64" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.634659 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"692f961adfb1fcda00379d67765cfe93b588b9cc8e69fee04b9c31dd0fbb8e64"} err="failed to get container status \"692f961adfb1fcda00379d67765cfe93b588b9cc8e69fee04b9c31dd0fbb8e64\": rpc error: code = NotFound desc = could not find container \"692f961adfb1fcda00379d67765cfe93b588b9cc8e69fee04b9c31dd0fbb8e64\": container with ID starting with 692f961adfb1fcda00379d67765cfe93b588b9cc8e69fee04b9c31dd0fbb8e64 not found: ID does not exist" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.634679 4750 scope.go:117] "RemoveContainer" containerID="36d30f06376d1697987c938239f416b0cf77e8cae26c1a9050e9ce70057d0869" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.634921 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36d30f06376d1697987c938239f416b0cf77e8cae26c1a9050e9ce70057d0869"} err="failed to get container status \"36d30f06376d1697987c938239f416b0cf77e8cae26c1a9050e9ce70057d0869\": rpc error: code = NotFound desc = could not find container \"36d30f06376d1697987c938239f416b0cf77e8cae26c1a9050e9ce70057d0869\": container with ID starting with 
36d30f06376d1697987c938239f416b0cf77e8cae26c1a9050e9ce70057d0869 not found: ID does not exist" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.634941 4750 scope.go:117] "RemoveContainer" containerID="946061db579d7ba5d01bb265ffbd09e5ae31c00a58ff36a5e266a46a97f74702" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.635191 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"946061db579d7ba5d01bb265ffbd09e5ae31c00a58ff36a5e266a46a97f74702"} err="failed to get container status \"946061db579d7ba5d01bb265ffbd09e5ae31c00a58ff36a5e266a46a97f74702\": rpc error: code = NotFound desc = could not find container \"946061db579d7ba5d01bb265ffbd09e5ae31c00a58ff36a5e266a46a97f74702\": container with ID starting with 946061db579d7ba5d01bb265ffbd09e5ae31c00a58ff36a5e266a46a97f74702 not found: ID does not exist" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.635229 4750 scope.go:117] "RemoveContainer" containerID="dfd8fba95d457ec8fa669bbd02e8f82fa9b760dc5a21b1df87e99eab486aa353" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.635428 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfd8fba95d457ec8fa669bbd02e8f82fa9b760dc5a21b1df87e99eab486aa353"} err="failed to get container status \"dfd8fba95d457ec8fa669bbd02e8f82fa9b760dc5a21b1df87e99eab486aa353\": rpc error: code = NotFound desc = could not find container \"dfd8fba95d457ec8fa669bbd02e8f82fa9b760dc5a21b1df87e99eab486aa353\": container with ID starting with dfd8fba95d457ec8fa669bbd02e8f82fa9b760dc5a21b1df87e99eab486aa353 not found: ID does not exist" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.635462 4750 scope.go:117] "RemoveContainer" containerID="692f961adfb1fcda00379d67765cfe93b588b9cc8e69fee04b9c31dd0fbb8e64" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.635758 4750 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"692f961adfb1fcda00379d67765cfe93b588b9cc8e69fee04b9c31dd0fbb8e64"} err="failed to get container status \"692f961adfb1fcda00379d67765cfe93b588b9cc8e69fee04b9c31dd0fbb8e64\": rpc error: code = NotFound desc = could not find container \"692f961adfb1fcda00379d67765cfe93b588b9cc8e69fee04b9c31dd0fbb8e64\": container with ID starting with 692f961adfb1fcda00379d67765cfe93b588b9cc8e69fee04b9c31dd0fbb8e64 not found: ID does not exist" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.635776 4750 scope.go:117] "RemoveContainer" containerID="36d30f06376d1697987c938239f416b0cf77e8cae26c1a9050e9ce70057d0869" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.635964 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36d30f06376d1697987c938239f416b0cf77e8cae26c1a9050e9ce70057d0869"} err="failed to get container status \"36d30f06376d1697987c938239f416b0cf77e8cae26c1a9050e9ce70057d0869\": rpc error: code = NotFound desc = could not find container \"36d30f06376d1697987c938239f416b0cf77e8cae26c1a9050e9ce70057d0869\": container with ID starting with 36d30f06376d1697987c938239f416b0cf77e8cae26c1a9050e9ce70057d0869 not found: ID does not exist" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.635977 4750 scope.go:117] "RemoveContainer" containerID="946061db579d7ba5d01bb265ffbd09e5ae31c00a58ff36a5e266a46a97f74702" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.636118 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"946061db579d7ba5d01bb265ffbd09e5ae31c00a58ff36a5e266a46a97f74702"} err="failed to get container status \"946061db579d7ba5d01bb265ffbd09e5ae31c00a58ff36a5e266a46a97f74702\": rpc error: code = NotFound desc = could not find container \"946061db579d7ba5d01bb265ffbd09e5ae31c00a58ff36a5e266a46a97f74702\": container with ID starting with 946061db579d7ba5d01bb265ffbd09e5ae31c00a58ff36a5e266a46a97f74702 not found: ID does not 
exist" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.636130 4750 scope.go:117] "RemoveContainer" containerID="dfd8fba95d457ec8fa669bbd02e8f82fa9b760dc5a21b1df87e99eab486aa353" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.636385 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfd8fba95d457ec8fa669bbd02e8f82fa9b760dc5a21b1df87e99eab486aa353"} err="failed to get container status \"dfd8fba95d457ec8fa669bbd02e8f82fa9b760dc5a21b1df87e99eab486aa353\": rpc error: code = NotFound desc = could not find container \"dfd8fba95d457ec8fa669bbd02e8f82fa9b760dc5a21b1df87e99eab486aa353\": container with ID starting with dfd8fba95d457ec8fa669bbd02e8f82fa9b760dc5a21b1df87e99eab486aa353 not found: ID does not exist" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.636398 4750 scope.go:117] "RemoveContainer" containerID="692f961adfb1fcda00379d67765cfe93b588b9cc8e69fee04b9c31dd0fbb8e64" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.636636 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"692f961adfb1fcda00379d67765cfe93b588b9cc8e69fee04b9c31dd0fbb8e64"} err="failed to get container status \"692f961adfb1fcda00379d67765cfe93b588b9cc8e69fee04b9c31dd0fbb8e64\": rpc error: code = NotFound desc = could not find container \"692f961adfb1fcda00379d67765cfe93b588b9cc8e69fee04b9c31dd0fbb8e64\": container with ID starting with 692f961adfb1fcda00379d67765cfe93b588b9cc8e69fee04b9c31dd0fbb8e64 not found: ID does not exist" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.636649 4750 scope.go:117] "RemoveContainer" containerID="36d30f06376d1697987c938239f416b0cf77e8cae26c1a9050e9ce70057d0869" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.636900 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36d30f06376d1697987c938239f416b0cf77e8cae26c1a9050e9ce70057d0869"} err="failed to get container status 
\"36d30f06376d1697987c938239f416b0cf77e8cae26c1a9050e9ce70057d0869\": rpc error: code = NotFound desc = could not find container \"36d30f06376d1697987c938239f416b0cf77e8cae26c1a9050e9ce70057d0869\": container with ID starting with 36d30f06376d1697987c938239f416b0cf77e8cae26c1a9050e9ce70057d0869 not found: ID does not exist" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.651305 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-a9e2-account-create-btj9m" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.713439 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbm7r\" (UniqueName: \"kubernetes.io/projected/e175b5f7-2b09-423e-8492-7802da4e1ec1-kube-api-access-lbm7r\") pod \"nova-cell1-88ae-account-create-dzf7x\" (UID: \"e175b5f7-2b09-423e-8492-7802da4e1ec1\") " pod="openstack/nova-cell1-88ae-account-create-dzf7x" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.737941 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbm7r\" (UniqueName: \"kubernetes.io/projected/e175b5f7-2b09-423e-8492-7802da4e1ec1-kube-api-access-lbm7r\") pod \"nova-cell1-88ae-account-create-dzf7x\" (UID: \"e175b5f7-2b09-423e-8492-7802da4e1ec1\") " pod="openstack/nova-cell1-88ae-account-create-dzf7x" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.798751 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.810571 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.831716 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.835662 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.837282 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-88ae-account-create-dzf7x" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.837598 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.837880 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.854215 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.892283 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.892324 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.963288 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 08 18:30:41 crc kubenswrapper[4750]: I1008 18:30:41.966981 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-d8db-account-create-hswqx"] Oct 08 18:30:42 crc kubenswrapper[4750]: I1008 18:30:42.019771 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9415f0f2-766a-4cde-aecb-b0d89cf8326e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9415f0f2-766a-4cde-aecb-b0d89cf8326e\") " pod="openstack/ceilometer-0" Oct 08 18:30:42 crc kubenswrapper[4750]: I1008 18:30:42.019866 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9415f0f2-766a-4cde-aecb-b0d89cf8326e-run-httpd\") pod \"ceilometer-0\" (UID: \"9415f0f2-766a-4cde-aecb-b0d89cf8326e\") " pod="openstack/ceilometer-0" Oct 08 18:30:42 crc kubenswrapper[4750]: I1008 18:30:42.019892 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9415f0f2-766a-4cde-aecb-b0d89cf8326e-config-data\") pod \"ceilometer-0\" (UID: \"9415f0f2-766a-4cde-aecb-b0d89cf8326e\") " pod="openstack/ceilometer-0" Oct 08 18:30:42 crc kubenswrapper[4750]: I1008 18:30:42.019955 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9415f0f2-766a-4cde-aecb-b0d89cf8326e-scripts\") pod \"ceilometer-0\" (UID: \"9415f0f2-766a-4cde-aecb-b0d89cf8326e\") " pod="openstack/ceilometer-0" Oct 08 18:30:42 crc kubenswrapper[4750]: I1008 18:30:42.020032 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9415f0f2-766a-4cde-aecb-b0d89cf8326e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9415f0f2-766a-4cde-aecb-b0d89cf8326e\") " pod="openstack/ceilometer-0" Oct 08 18:30:42 crc kubenswrapper[4750]: I1008 18:30:42.020064 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9415f0f2-766a-4cde-aecb-b0d89cf8326e-log-httpd\") pod \"ceilometer-0\" (UID: \"9415f0f2-766a-4cde-aecb-b0d89cf8326e\") " pod="openstack/ceilometer-0" Oct 08 18:30:42 crc kubenswrapper[4750]: I1008 18:30:42.020078 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5tp2\" (UniqueName: \"kubernetes.io/projected/9415f0f2-766a-4cde-aecb-b0d89cf8326e-kube-api-access-k5tp2\") pod \"ceilometer-0\" (UID: 
\"9415f0f2-766a-4cde-aecb-b0d89cf8326e\") " pod="openstack/ceilometer-0" Oct 08 18:30:42 crc kubenswrapper[4750]: I1008 18:30:42.026215 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 08 18:30:42 crc kubenswrapper[4750]: I1008 18:30:42.121471 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9415f0f2-766a-4cde-aecb-b0d89cf8326e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9415f0f2-766a-4cde-aecb-b0d89cf8326e\") " pod="openstack/ceilometer-0" Oct 08 18:30:42 crc kubenswrapper[4750]: I1008 18:30:42.121764 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9415f0f2-766a-4cde-aecb-b0d89cf8326e-log-httpd\") pod \"ceilometer-0\" (UID: \"9415f0f2-766a-4cde-aecb-b0d89cf8326e\") " pod="openstack/ceilometer-0" Oct 08 18:30:42 crc kubenswrapper[4750]: I1008 18:30:42.121785 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5tp2\" (UniqueName: \"kubernetes.io/projected/9415f0f2-766a-4cde-aecb-b0d89cf8326e-kube-api-access-k5tp2\") pod \"ceilometer-0\" (UID: \"9415f0f2-766a-4cde-aecb-b0d89cf8326e\") " pod="openstack/ceilometer-0" Oct 08 18:30:42 crc kubenswrapper[4750]: I1008 18:30:42.121811 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9415f0f2-766a-4cde-aecb-b0d89cf8326e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9415f0f2-766a-4cde-aecb-b0d89cf8326e\") " pod="openstack/ceilometer-0" Oct 08 18:30:42 crc kubenswrapper[4750]: I1008 18:30:42.121849 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9415f0f2-766a-4cde-aecb-b0d89cf8326e-run-httpd\") pod \"ceilometer-0\" (UID: 
\"9415f0f2-766a-4cde-aecb-b0d89cf8326e\") " pod="openstack/ceilometer-0" Oct 08 18:30:42 crc kubenswrapper[4750]: I1008 18:30:42.121869 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9415f0f2-766a-4cde-aecb-b0d89cf8326e-config-data\") pod \"ceilometer-0\" (UID: \"9415f0f2-766a-4cde-aecb-b0d89cf8326e\") " pod="openstack/ceilometer-0" Oct 08 18:30:42 crc kubenswrapper[4750]: I1008 18:30:42.121925 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9415f0f2-766a-4cde-aecb-b0d89cf8326e-scripts\") pod \"ceilometer-0\" (UID: \"9415f0f2-766a-4cde-aecb-b0d89cf8326e\") " pod="openstack/ceilometer-0" Oct 08 18:30:42 crc kubenswrapper[4750]: I1008 18:30:42.122299 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9415f0f2-766a-4cde-aecb-b0d89cf8326e-log-httpd\") pod \"ceilometer-0\" (UID: \"9415f0f2-766a-4cde-aecb-b0d89cf8326e\") " pod="openstack/ceilometer-0" Oct 08 18:30:42 crc kubenswrapper[4750]: I1008 18:30:42.122610 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9415f0f2-766a-4cde-aecb-b0d89cf8326e-run-httpd\") pod \"ceilometer-0\" (UID: \"9415f0f2-766a-4cde-aecb-b0d89cf8326e\") " pod="openstack/ceilometer-0" Oct 08 18:30:42 crc kubenswrapper[4750]: I1008 18:30:42.127400 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9415f0f2-766a-4cde-aecb-b0d89cf8326e-scripts\") pod \"ceilometer-0\" (UID: \"9415f0f2-766a-4cde-aecb-b0d89cf8326e\") " pod="openstack/ceilometer-0" Oct 08 18:30:42 crc kubenswrapper[4750]: I1008 18:30:42.127544 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/9415f0f2-766a-4cde-aecb-b0d89cf8326e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9415f0f2-766a-4cde-aecb-b0d89cf8326e\") " pod="openstack/ceilometer-0" Oct 08 18:30:42 crc kubenswrapper[4750]: I1008 18:30:42.129205 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9415f0f2-766a-4cde-aecb-b0d89cf8326e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9415f0f2-766a-4cde-aecb-b0d89cf8326e\") " pod="openstack/ceilometer-0" Oct 08 18:30:42 crc kubenswrapper[4750]: I1008 18:30:42.131563 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9415f0f2-766a-4cde-aecb-b0d89cf8326e-config-data\") pod \"ceilometer-0\" (UID: \"9415f0f2-766a-4cde-aecb-b0d89cf8326e\") " pod="openstack/ceilometer-0" Oct 08 18:30:42 crc kubenswrapper[4750]: I1008 18:30:42.141322 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5tp2\" (UniqueName: \"kubernetes.io/projected/9415f0f2-766a-4cde-aecb-b0d89cf8326e-kube-api-access-k5tp2\") pod \"ceilometer-0\" (UID: \"9415f0f2-766a-4cde-aecb-b0d89cf8326e\") " pod="openstack/ceilometer-0" Oct 08 18:30:42 crc kubenswrapper[4750]: I1008 18:30:42.167628 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 18:30:42 crc kubenswrapper[4750]: I1008 18:30:42.217118 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-a9e2-account-create-btj9m"] Oct 08 18:30:42 crc kubenswrapper[4750]: I1008 18:30:42.452624 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-88ae-account-create-dzf7x"] Oct 08 18:30:42 crc kubenswrapper[4750]: I1008 18:30:42.481157 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"60379ea9-0750-4de0-9d3b-13af080eea8f","Type":"ContainerStarted","Data":"531a039dc690be8ef20b0b6f5062a698798a803816f610a232671935cec2a8cc"} Oct 08 18:30:42 crc kubenswrapper[4750]: I1008 18:30:42.481223 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"60379ea9-0750-4de0-9d3b-13af080eea8f","Type":"ContainerStarted","Data":"9206ef57f28e2ee895a8e287047555fefbdb8ec79245d11843edf6b4acf69881"} Oct 08 18:30:42 crc kubenswrapper[4750]: I1008 18:30:42.483163 4750 generic.go:334] "Generic (PLEG): container finished" podID="5253a72b-fa57-4150-92f0-0d6172aca7f0" containerID="e3598e5f4e24bd3d7dfa1cc40e2b04f8ead19e68ba96e9840955fd5275a178f5" exitCode=0 Oct 08 18:30:42 crc kubenswrapper[4750]: I1008 18:30:42.483240 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d8db-account-create-hswqx" event={"ID":"5253a72b-fa57-4150-92f0-0d6172aca7f0","Type":"ContainerDied","Data":"e3598e5f4e24bd3d7dfa1cc40e2b04f8ead19e68ba96e9840955fd5275a178f5"} Oct 08 18:30:42 crc kubenswrapper[4750]: I1008 18:30:42.483262 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d8db-account-create-hswqx" event={"ID":"5253a72b-fa57-4150-92f0-0d6172aca7f0","Type":"ContainerStarted","Data":"e4db672350164453b044aca6f19bb44ff5b41368f6dc0a2aa7918b100bf3d9a6"} Oct 08 18:30:42 crc kubenswrapper[4750]: I1008 18:30:42.485315 4750 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-cell0-a9e2-account-create-btj9m" event={"ID":"099005c9-52af-4489-9103-d6c82b1c82b2","Type":"ContainerStarted","Data":"7e63b792c72e898e6ca6bd94a48e607c11b60f66e6e14588b86b0aefe59acfac"} Oct 08 18:30:42 crc kubenswrapper[4750]: I1008 18:30:42.528279 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-a9e2-account-create-btj9m" podStartSLOduration=1.528262057 podStartE2EDuration="1.528262057s" podCreationTimestamp="2025-10-08 18:30:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:30:42.528021832 +0000 UTC m=+1198.440992845" watchObservedRunningTime="2025-10-08 18:30:42.528262057 +0000 UTC m=+1198.441233070" Oct 08 18:30:42 crc kubenswrapper[4750]: I1008 18:30:42.530024 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 08 18:30:42 crc kubenswrapper[4750]: I1008 18:30:42.530055 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 08 18:30:42 crc kubenswrapper[4750]: I1008 18:30:42.725676 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 18:30:42 crc kubenswrapper[4750]: I1008 18:30:42.749895 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69942e34-56b9-44e8-a3ea-025aaef2bcb7" path="/var/lib/kubelet/pods/69942e34-56b9-44e8-a3ea-025aaef2bcb7/volumes" Oct 08 18:30:43 crc kubenswrapper[4750]: I1008 18:30:43.383048 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5c5ddbcd5b-g9rd8" Oct 08 18:30:43 crc kubenswrapper[4750]: I1008 18:30:43.541716 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"60379ea9-0750-4de0-9d3b-13af080eea8f","Type":"ContainerStarted","Data":"be50775e9b6d25d8067aa8220a05db2eb7a3fed17a5981e8b008927ca24a65a9"}
Oct 08 18:30:43 crc kubenswrapper[4750]: I1008 18:30:43.543798 4750 generic.go:334] "Generic (PLEG): container finished" podID="e175b5f7-2b09-423e-8492-7802da4e1ec1" containerID="4a872e638239e5f6665265383ac7fe690b7080fde4880ee3d5dcdc9e1bf48f36" exitCode=0
Oct 08 18:30:43 crc kubenswrapper[4750]: I1008 18:30:43.543861 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-88ae-account-create-dzf7x" event={"ID":"e175b5f7-2b09-423e-8492-7802da4e1ec1","Type":"ContainerDied","Data":"4a872e638239e5f6665265383ac7fe690b7080fde4880ee3d5dcdc9e1bf48f36"}
Oct 08 18:30:43 crc kubenswrapper[4750]: I1008 18:30:43.543884 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-88ae-account-create-dzf7x" event={"ID":"e175b5f7-2b09-423e-8492-7802da4e1ec1","Type":"ContainerStarted","Data":"250c539c0919ef5f759dc9858470afb49073cc93a153dfaa5b5e2f5c1031f573"}
Oct 08 18:30:43 crc kubenswrapper[4750]: I1008 18:30:43.547895 4750 generic.go:334] "Generic (PLEG): container finished" podID="099005c9-52af-4489-9103-d6c82b1c82b2" containerID="502688bbe286affcf30f98fc226d9553cd6032ff06bd5cff82113e535ec1397d" exitCode=0
Oct 08 18:30:43 crc kubenswrapper[4750]: I1008 18:30:43.547957 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-a9e2-account-create-btj9m" event={"ID":"099005c9-52af-4489-9103-d6c82b1c82b2","Type":"ContainerDied","Data":"502688bbe286affcf30f98fc226d9553cd6032ff06bd5cff82113e535ec1397d"}
Oct 08 18:30:43 crc kubenswrapper[4750]: I1008 18:30:43.557363 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9415f0f2-766a-4cde-aecb-b0d89cf8326e","Type":"ContainerStarted","Data":"ff383390c02bd88ed2cece70660933293f84fa0c7cfed4457fb8de26e6ad96f7"}
Oct 08 18:30:43 crc kubenswrapper[4750]: I1008 18:30:43.557412 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9415f0f2-766a-4cde-aecb-b0d89cf8326e","Type":"ContainerStarted","Data":"cda02b7fa5ae7839f8c47abfea351641171175754f69a4e7ab8ebcc8350e4b3b"}
Oct 08 18:30:43 crc kubenswrapper[4750]: I1008 18:30:43.561988 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.561973064 podStartE2EDuration="3.561973064s" podCreationTimestamp="2025-10-08 18:30:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:30:43.558033818 +0000 UTC m=+1199.471004831" watchObservedRunningTime="2025-10-08 18:30:43.561973064 +0000 UTC m=+1199.474944077"
Oct 08 18:30:43 crc kubenswrapper[4750]: I1008 18:30:43.988107 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-d8db-account-create-hswqx"
Oct 08 18:30:44 crc kubenswrapper[4750]: I1008 18:30:44.159138 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnrw4\" (UniqueName: \"kubernetes.io/projected/5253a72b-fa57-4150-92f0-0d6172aca7f0-kube-api-access-gnrw4\") pod \"5253a72b-fa57-4150-92f0-0d6172aca7f0\" (UID: \"5253a72b-fa57-4150-92f0-0d6172aca7f0\") "
Oct 08 18:30:44 crc kubenswrapper[4750]: I1008 18:30:44.165725 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5253a72b-fa57-4150-92f0-0d6172aca7f0-kube-api-access-gnrw4" (OuterVolumeSpecName: "kube-api-access-gnrw4") pod "5253a72b-fa57-4150-92f0-0d6172aca7f0" (UID: "5253a72b-fa57-4150-92f0-0d6172aca7f0"). InnerVolumeSpecName "kube-api-access-gnrw4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 18:30:44 crc kubenswrapper[4750]: I1008 18:30:44.262932 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnrw4\" (UniqueName: \"kubernetes.io/projected/5253a72b-fa57-4150-92f0-0d6172aca7f0-kube-api-access-gnrw4\") on node \"crc\" DevicePath \"\""
Oct 08 18:30:44 crc kubenswrapper[4750]: I1008 18:30:44.569805 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d8db-account-create-hswqx" event={"ID":"5253a72b-fa57-4150-92f0-0d6172aca7f0","Type":"ContainerDied","Data":"e4db672350164453b044aca6f19bb44ff5b41368f6dc0a2aa7918b100bf3d9a6"}
Oct 08 18:30:44 crc kubenswrapper[4750]: I1008 18:30:44.570123 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4db672350164453b044aca6f19bb44ff5b41368f6dc0a2aa7918b100bf3d9a6"
Oct 08 18:30:44 crc kubenswrapper[4750]: I1008 18:30:44.569816 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-d8db-account-create-hswqx"
Oct 08 18:30:44 crc kubenswrapper[4750]: I1008 18:30:44.571887 4750 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Oct 08 18:30:44 crc kubenswrapper[4750]: I1008 18:30:44.571915 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9415f0f2-766a-4cde-aecb-b0d89cf8326e","Type":"ContainerStarted","Data":"74cb7426c0f70b5ec395901e6a58475a18c8ff591fb31ded8277938dff088679"}
Oct 08 18:30:44 crc kubenswrapper[4750]: I1008 18:30:44.789850 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Oct 08 18:30:45 crc kubenswrapper[4750]: I1008 18:30:45.141943 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-88ae-account-create-dzf7x"
Oct 08 18:30:45 crc kubenswrapper[4750]: I1008 18:30:45.149356 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-a9e2-account-create-btj9m"
Oct 08 18:30:45 crc kubenswrapper[4750]: I1008 18:30:45.306396 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbm7r\" (UniqueName: \"kubernetes.io/projected/e175b5f7-2b09-423e-8492-7802da4e1ec1-kube-api-access-lbm7r\") pod \"e175b5f7-2b09-423e-8492-7802da4e1ec1\" (UID: \"e175b5f7-2b09-423e-8492-7802da4e1ec1\") "
Oct 08 18:30:45 crc kubenswrapper[4750]: I1008 18:30:45.306444 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rppks\" (UniqueName: \"kubernetes.io/projected/099005c9-52af-4489-9103-d6c82b1c82b2-kube-api-access-rppks\") pod \"099005c9-52af-4489-9103-d6c82b1c82b2\" (UID: \"099005c9-52af-4489-9103-d6c82b1c82b2\") "
Oct 08 18:30:45 crc kubenswrapper[4750]: I1008 18:30:45.311795 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/099005c9-52af-4489-9103-d6c82b1c82b2-kube-api-access-rppks" (OuterVolumeSpecName: "kube-api-access-rppks") pod "099005c9-52af-4489-9103-d6c82b1c82b2" (UID: "099005c9-52af-4489-9103-d6c82b1c82b2"). InnerVolumeSpecName "kube-api-access-rppks". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 18:30:45 crc kubenswrapper[4750]: I1008 18:30:45.311885 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e175b5f7-2b09-423e-8492-7802da4e1ec1-kube-api-access-lbm7r" (OuterVolumeSpecName: "kube-api-access-lbm7r") pod "e175b5f7-2b09-423e-8492-7802da4e1ec1" (UID: "e175b5f7-2b09-423e-8492-7802da4e1ec1"). InnerVolumeSpecName "kube-api-access-lbm7r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 18:30:45 crc kubenswrapper[4750]: I1008 18:30:45.408531 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbm7r\" (UniqueName: \"kubernetes.io/projected/e175b5f7-2b09-423e-8492-7802da4e1ec1-kube-api-access-lbm7r\") on node \"crc\" DevicePath \"\""
Oct 08 18:30:45 crc kubenswrapper[4750]: I1008 18:30:45.408600 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rppks\" (UniqueName: \"kubernetes.io/projected/099005c9-52af-4489-9103-d6c82b1c82b2-kube-api-access-rppks\") on node \"crc\" DevicePath \"\""
Oct 08 18:30:45 crc kubenswrapper[4750]: I1008 18:30:45.463670 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 08 18:30:45 crc kubenswrapper[4750]: I1008 18:30:45.581704 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-88ae-account-create-dzf7x"
Oct 08 18:30:45 crc kubenswrapper[4750]: I1008 18:30:45.581704 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-88ae-account-create-dzf7x" event={"ID":"e175b5f7-2b09-423e-8492-7802da4e1ec1","Type":"ContainerDied","Data":"250c539c0919ef5f759dc9858470afb49073cc93a153dfaa5b5e2f5c1031f573"}
Oct 08 18:30:45 crc kubenswrapper[4750]: I1008 18:30:45.581809 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="250c539c0919ef5f759dc9858470afb49073cc93a153dfaa5b5e2f5c1031f573"
Oct 08 18:30:45 crc kubenswrapper[4750]: I1008 18:30:45.584746 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-a9e2-account-create-btj9m"
Oct 08 18:30:45 crc kubenswrapper[4750]: I1008 18:30:45.584748 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-a9e2-account-create-btj9m" event={"ID":"099005c9-52af-4489-9103-d6c82b1c82b2","Type":"ContainerDied","Data":"7e63b792c72e898e6ca6bd94a48e607c11b60f66e6e14588b86b0aefe59acfac"}
Oct 08 18:30:45 crc kubenswrapper[4750]: I1008 18:30:45.584864 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e63b792c72e898e6ca6bd94a48e607c11b60f66e6e14588b86b0aefe59acfac"
Oct 08 18:30:45 crc kubenswrapper[4750]: I1008 18:30:45.588615 4750 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Oct 08 18:30:45 crc kubenswrapper[4750]: I1008 18:30:45.588616 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9415f0f2-766a-4cde-aecb-b0d89cf8326e","Type":"ContainerStarted","Data":"1407f1071110e4b3c4ee405aa9bfca8a5d4abf60a64a0f3113661db2e1f9f23c"}
Oct 08 18:30:45 crc kubenswrapper[4750]: I1008 18:30:45.884850 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Oct 08 18:30:45 crc kubenswrapper[4750]: I1008 18:30:45.902071 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Oct 08 18:30:45 crc kubenswrapper[4750]: I1008 18:30:45.973863 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-8d8895f6c-zszml"
Oct 08 18:30:46 crc kubenswrapper[4750]: I1008 18:30:46.064204 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5c5ddbcd5b-g9rd8"]
Oct 08 18:30:46 crc kubenswrapper[4750]: I1008 18:30:46.064414 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5c5ddbcd5b-g9rd8" podUID="42135b66-6641-4c85-9958-ae210a3de33f" containerName="neutron-api" containerID="cri-o://eaa050621406cb9a363caa6299f0498132427ba0b4ca6a7dd901a5fa9d867723" gracePeriod=30
Oct 08 18:30:46 crc kubenswrapper[4750]: I1008 18:30:46.064789 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5c5ddbcd5b-g9rd8" podUID="42135b66-6641-4c85-9958-ae210a3de33f" containerName="neutron-httpd" containerID="cri-o://52f43a6675edc077e2b597f2582515bfa7e90a4f56cba817daa6d003af17b5a6" gracePeriod=30
Oct 08 18:30:46 crc kubenswrapper[4750]: I1008 18:30:46.549765 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9n56l"]
Oct 08 18:30:46 crc kubenswrapper[4750]: E1008 18:30:46.550497 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="099005c9-52af-4489-9103-d6c82b1c82b2" containerName="mariadb-account-create"
Oct 08 18:30:46 crc kubenswrapper[4750]: I1008 18:30:46.550520 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="099005c9-52af-4489-9103-d6c82b1c82b2" containerName="mariadb-account-create"
Oct 08 18:30:46 crc kubenswrapper[4750]: E1008 18:30:46.550534 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5253a72b-fa57-4150-92f0-0d6172aca7f0" containerName="mariadb-account-create"
Oct 08 18:30:46 crc kubenswrapper[4750]: I1008 18:30:46.550541 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="5253a72b-fa57-4150-92f0-0d6172aca7f0" containerName="mariadb-account-create"
Oct 08 18:30:46 crc kubenswrapper[4750]: E1008 18:30:46.552007 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e175b5f7-2b09-423e-8492-7802da4e1ec1" containerName="mariadb-account-create"
Oct 08 18:30:46 crc kubenswrapper[4750]: I1008 18:30:46.552025 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="e175b5f7-2b09-423e-8492-7802da4e1ec1" containerName="mariadb-account-create"
Oct 08 18:30:46 crc kubenswrapper[4750]: I1008 18:30:46.553435 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="e175b5f7-2b09-423e-8492-7802da4e1ec1" containerName="mariadb-account-create"
Oct 08 18:30:46 crc kubenswrapper[4750]: I1008 18:30:46.553483 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="099005c9-52af-4489-9103-d6c82b1c82b2" containerName="mariadb-account-create"
Oct 08 18:30:46 crc kubenswrapper[4750]: I1008 18:30:46.553507 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="5253a72b-fa57-4150-92f0-0d6172aca7f0" containerName="mariadb-account-create"
Oct 08 18:30:46 crc kubenswrapper[4750]: I1008 18:30:46.555708 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-9n56l"
Oct 08 18:30:46 crc kubenswrapper[4750]: I1008 18:30:46.559967 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Oct 08 18:30:46 crc kubenswrapper[4750]: I1008 18:30:46.560257 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-b8qmk"
Oct 08 18:30:46 crc kubenswrapper[4750]: I1008 18:30:46.562079 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Oct 08 18:30:46 crc kubenswrapper[4750]: I1008 18:30:46.591836 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9n56l"]
Oct 08 18:30:46 crc kubenswrapper[4750]: I1008 18:30:46.613998 4750 generic.go:334] "Generic (PLEG): container finished" podID="42135b66-6641-4c85-9958-ae210a3de33f" containerID="52f43a6675edc077e2b597f2582515bfa7e90a4f56cba817daa6d003af17b5a6" exitCode=0
Oct 08 18:30:46 crc kubenswrapper[4750]: I1008 18:30:46.614899 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c5ddbcd5b-g9rd8" event={"ID":"42135b66-6641-4c85-9958-ae210a3de33f","Type":"ContainerDied","Data":"52f43a6675edc077e2b597f2582515bfa7e90a4f56cba817daa6d003af17b5a6"}
Oct 08 18:30:46 crc kubenswrapper[4750]: I1008 18:30:46.630716 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65d622c4-c0ae-4cd3-bdab-101eb0783cc3-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-9n56l\" (UID: \"65d622c4-c0ae-4cd3-bdab-101eb0783cc3\") " pod="openstack/nova-cell0-conductor-db-sync-9n56l"
Oct 08 18:30:46 crc kubenswrapper[4750]: I1008 18:30:46.630802 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nbwp\" (UniqueName: \"kubernetes.io/projected/65d622c4-c0ae-4cd3-bdab-101eb0783cc3-kube-api-access-6nbwp\") pod \"nova-cell0-conductor-db-sync-9n56l\" (UID: \"65d622c4-c0ae-4cd3-bdab-101eb0783cc3\") " pod="openstack/nova-cell0-conductor-db-sync-9n56l"
Oct 08 18:30:46 crc kubenswrapper[4750]: I1008 18:30:46.630878 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65d622c4-c0ae-4cd3-bdab-101eb0783cc3-config-data\") pod \"nova-cell0-conductor-db-sync-9n56l\" (UID: \"65d622c4-c0ae-4cd3-bdab-101eb0783cc3\") " pod="openstack/nova-cell0-conductor-db-sync-9n56l"
Oct 08 18:30:46 crc kubenswrapper[4750]: I1008 18:30:46.630905 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65d622c4-c0ae-4cd3-bdab-101eb0783cc3-scripts\") pod \"nova-cell0-conductor-db-sync-9n56l\" (UID: \"65d622c4-c0ae-4cd3-bdab-101eb0783cc3\") " pod="openstack/nova-cell0-conductor-db-sync-9n56l"
Oct 08 18:30:46 crc kubenswrapper[4750]: I1008 18:30:46.732565 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65d622c4-c0ae-4cd3-bdab-101eb0783cc3-scripts\") pod \"nova-cell0-conductor-db-sync-9n56l\" (UID: \"65d622c4-c0ae-4cd3-bdab-101eb0783cc3\") " pod="openstack/nova-cell0-conductor-db-sync-9n56l"
Oct 08 18:30:46 crc kubenswrapper[4750]: I1008 18:30:46.732726 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65d622c4-c0ae-4cd3-bdab-101eb0783cc3-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-9n56l\" (UID: \"65d622c4-c0ae-4cd3-bdab-101eb0783cc3\") " pod="openstack/nova-cell0-conductor-db-sync-9n56l"
Oct 08 18:30:46 crc kubenswrapper[4750]: I1008 18:30:46.732783 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nbwp\" (UniqueName: \"kubernetes.io/projected/65d622c4-c0ae-4cd3-bdab-101eb0783cc3-kube-api-access-6nbwp\") pod \"nova-cell0-conductor-db-sync-9n56l\" (UID: \"65d622c4-c0ae-4cd3-bdab-101eb0783cc3\") " pod="openstack/nova-cell0-conductor-db-sync-9n56l"
Oct 08 18:30:46 crc kubenswrapper[4750]: I1008 18:30:46.732852 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65d622c4-c0ae-4cd3-bdab-101eb0783cc3-config-data\") pod \"nova-cell0-conductor-db-sync-9n56l\" (UID: \"65d622c4-c0ae-4cd3-bdab-101eb0783cc3\") " pod="openstack/nova-cell0-conductor-db-sync-9n56l"
Oct 08 18:30:46 crc kubenswrapper[4750]: I1008 18:30:46.737676 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65d622c4-c0ae-4cd3-bdab-101eb0783cc3-config-data\") pod \"nova-cell0-conductor-db-sync-9n56l\" (UID: \"65d622c4-c0ae-4cd3-bdab-101eb0783cc3\") " pod="openstack/nova-cell0-conductor-db-sync-9n56l"
Oct 08 18:30:46 crc kubenswrapper[4750]: I1008 18:30:46.738046 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65d622c4-c0ae-4cd3-bdab-101eb0783cc3-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-9n56l\" (UID: \"65d622c4-c0ae-4cd3-bdab-101eb0783cc3\") " pod="openstack/nova-cell0-conductor-db-sync-9n56l"
Oct 08 18:30:46 crc kubenswrapper[4750]: I1008 18:30:46.752123 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65d622c4-c0ae-4cd3-bdab-101eb0783cc3-scripts\") pod \"nova-cell0-conductor-db-sync-9n56l\" (UID: \"65d622c4-c0ae-4cd3-bdab-101eb0783cc3\") " pod="openstack/nova-cell0-conductor-db-sync-9n56l"
Oct 08 18:30:46 crc kubenswrapper[4750]: I1008 18:30:46.756411 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nbwp\" (UniqueName: \"kubernetes.io/projected/65d622c4-c0ae-4cd3-bdab-101eb0783cc3-kube-api-access-6nbwp\") pod \"nova-cell0-conductor-db-sync-9n56l\" (UID: \"65d622c4-c0ae-4cd3-bdab-101eb0783cc3\") " pod="openstack/nova-cell0-conductor-db-sync-9n56l"
Oct 08 18:30:46 crc kubenswrapper[4750]: I1008 18:30:46.907834 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-9n56l"
Oct 08 18:30:47 crc kubenswrapper[4750]: I1008 18:30:47.419659 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9n56l"]
Oct 08 18:30:47 crc kubenswrapper[4750]: I1008 18:30:47.626623 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-9n56l" event={"ID":"65d622c4-c0ae-4cd3-bdab-101eb0783cc3","Type":"ContainerStarted","Data":"92ddb6487bee146bf7d9f9629d471f4f733ce9733f97533459f6d699d5a543d9"}
Oct 08 18:30:47 crc kubenswrapper[4750]: I1008 18:30:47.629705 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9415f0f2-766a-4cde-aecb-b0d89cf8326e","Type":"ContainerStarted","Data":"30a104a722ddd8380e531e4665c437eea755c67a159a411d8f4f066da5ad1019"}
Oct 08 18:30:47 crc kubenswrapper[4750]: I1008 18:30:47.629895 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9415f0f2-766a-4cde-aecb-b0d89cf8326e" containerName="ceilometer-central-agent" containerID="cri-o://ff383390c02bd88ed2cece70660933293f84fa0c7cfed4457fb8de26e6ad96f7" gracePeriod=30
Oct 08 18:30:47 crc kubenswrapper[4750]: I1008 18:30:47.629950 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Oct 08 18:30:47 crc kubenswrapper[4750]: I1008 18:30:47.629994 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9415f0f2-766a-4cde-aecb-b0d89cf8326e" containerName="proxy-httpd" containerID="cri-o://30a104a722ddd8380e531e4665c437eea755c67a159a411d8f4f066da5ad1019" gracePeriod=30
Oct 08 18:30:47 crc kubenswrapper[4750]: I1008 18:30:47.630030 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9415f0f2-766a-4cde-aecb-b0d89cf8326e" containerName="ceilometer-notification-agent" containerID="cri-o://74cb7426c0f70b5ec395901e6a58475a18c8ff591fb31ded8277938dff088679" gracePeriod=30
Oct 08 18:30:47 crc kubenswrapper[4750]: I1008 18:30:47.630080 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9415f0f2-766a-4cde-aecb-b0d89cf8326e" containerName="sg-core" containerID="cri-o://1407f1071110e4b3c4ee405aa9bfca8a5d4abf60a64a0f3113661db2e1f9f23c" gracePeriod=30
Oct 08 18:30:47 crc kubenswrapper[4750]: I1008 18:30:47.658910 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.931929699 podStartE2EDuration="6.658891452s" podCreationTimestamp="2025-10-08 18:30:41 +0000 UTC" firstStartedPulling="2025-10-08 18:30:42.726423683 +0000 UTC m=+1198.639394696" lastFinishedPulling="2025-10-08 18:30:46.453385436 +0000 UTC m=+1202.366356449" observedRunningTime="2025-10-08 18:30:47.650108787 +0000 UTC m=+1203.563079810" watchObservedRunningTime="2025-10-08 18:30:47.658891452 +0000 UTC m=+1203.571862465"
Oct 08 18:30:47 crc kubenswrapper[4750]: I1008 18:30:47.747031 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Oct 08 18:30:47 crc kubenswrapper[4750]: I1008 18:30:47.747089 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Oct 08 18:30:47 crc kubenswrapper[4750]: I1008 18:30:47.780493 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Oct 08 18:30:47 crc kubenswrapper[4750]: I1008 18:30:47.791768 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Oct 08 18:30:48 crc kubenswrapper[4750]: I1008 18:30:48.499153 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5c5ddbcd5b-g9rd8"
Oct 08 18:30:48 crc kubenswrapper[4750]: I1008 18:30:48.574950 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/42135b66-6641-4c85-9958-ae210a3de33f-config\") pod \"42135b66-6641-4c85-9958-ae210a3de33f\" (UID: \"42135b66-6641-4c85-9958-ae210a3de33f\") "
Oct 08 18:30:48 crc kubenswrapper[4750]: I1008 18:30:48.575009 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/42135b66-6641-4c85-9958-ae210a3de33f-ovndb-tls-certs\") pod \"42135b66-6641-4c85-9958-ae210a3de33f\" (UID: \"42135b66-6641-4c85-9958-ae210a3de33f\") "
Oct 08 18:30:48 crc kubenswrapper[4750]: I1008 18:30:48.575073 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42135b66-6641-4c85-9958-ae210a3de33f-combined-ca-bundle\") pod \"42135b66-6641-4c85-9958-ae210a3de33f\" (UID: \"42135b66-6641-4c85-9958-ae210a3de33f\") "
Oct 08 18:30:48 crc kubenswrapper[4750]: I1008 18:30:48.575112 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/42135b66-6641-4c85-9958-ae210a3de33f-httpd-config\") pod \"42135b66-6641-4c85-9958-ae210a3de33f\" (UID: \"42135b66-6641-4c85-9958-ae210a3de33f\") "
Oct 08 18:30:48 crc kubenswrapper[4750]: I1008 18:30:48.575208 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdqct\" (UniqueName: \"kubernetes.io/projected/42135b66-6641-4c85-9958-ae210a3de33f-kube-api-access-jdqct\") pod \"42135b66-6641-4c85-9958-ae210a3de33f\" (UID: \"42135b66-6641-4c85-9958-ae210a3de33f\") "
Oct 08 18:30:48 crc kubenswrapper[4750]: I1008 18:30:48.581174 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42135b66-6641-4c85-9958-ae210a3de33f-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "42135b66-6641-4c85-9958-ae210a3de33f" (UID: "42135b66-6641-4c85-9958-ae210a3de33f"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 18:30:48 crc kubenswrapper[4750]: I1008 18:30:48.606749 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42135b66-6641-4c85-9958-ae210a3de33f-kube-api-access-jdqct" (OuterVolumeSpecName: "kube-api-access-jdqct") pod "42135b66-6641-4c85-9958-ae210a3de33f" (UID: "42135b66-6641-4c85-9958-ae210a3de33f"). InnerVolumeSpecName "kube-api-access-jdqct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 18:30:48 crc kubenswrapper[4750]: I1008 18:30:48.653445 4750 generic.go:334] "Generic (PLEG): container finished" podID="42135b66-6641-4c85-9958-ae210a3de33f" containerID="eaa050621406cb9a363caa6299f0498132427ba0b4ca6a7dd901a5fa9d867723" exitCode=0
Oct 08 18:30:48 crc kubenswrapper[4750]: I1008 18:30:48.653492 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c5ddbcd5b-g9rd8" event={"ID":"42135b66-6641-4c85-9958-ae210a3de33f","Type":"ContainerDied","Data":"eaa050621406cb9a363caa6299f0498132427ba0b4ca6a7dd901a5fa9d867723"}
Oct 08 18:30:48 crc kubenswrapper[4750]: I1008 18:30:48.653675 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c5ddbcd5b-g9rd8" event={"ID":"42135b66-6641-4c85-9958-ae210a3de33f","Type":"ContainerDied","Data":"98d46e3019cc140d2edfd1ae1ba45f4848c71cbcd5aae8bb050ca125efcaa4bc"}
Oct 08 18:30:48 crc kubenswrapper[4750]: I1008 18:30:48.653694 4750 scope.go:117] "RemoveContainer" containerID="52f43a6675edc077e2b597f2582515bfa7e90a4f56cba817daa6d003af17b5a6"
Oct 08 18:30:48 crc kubenswrapper[4750]: I1008 18:30:48.653541 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5c5ddbcd5b-g9rd8"
Oct 08 18:30:48 crc kubenswrapper[4750]: I1008 18:30:48.655993 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42135b66-6641-4c85-9958-ae210a3de33f-config" (OuterVolumeSpecName: "config") pod "42135b66-6641-4c85-9958-ae210a3de33f" (UID: "42135b66-6641-4c85-9958-ae210a3de33f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 18:30:48 crc kubenswrapper[4750]: I1008 18:30:48.656164 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42135b66-6641-4c85-9958-ae210a3de33f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "42135b66-6641-4c85-9958-ae210a3de33f" (UID: "42135b66-6641-4c85-9958-ae210a3de33f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 18:30:48 crc kubenswrapper[4750]: I1008 18:30:48.666478 4750 generic.go:334] "Generic (PLEG): container finished" podID="9415f0f2-766a-4cde-aecb-b0d89cf8326e" containerID="30a104a722ddd8380e531e4665c437eea755c67a159a411d8f4f066da5ad1019" exitCode=0
Oct 08 18:30:48 crc kubenswrapper[4750]: I1008 18:30:48.666511 4750 generic.go:334] "Generic (PLEG): container finished" podID="9415f0f2-766a-4cde-aecb-b0d89cf8326e" containerID="1407f1071110e4b3c4ee405aa9bfca8a5d4abf60a64a0f3113661db2e1f9f23c" exitCode=2
Oct 08 18:30:48 crc kubenswrapper[4750]: I1008 18:30:48.666520 4750 generic.go:334] "Generic (PLEG): container finished" podID="9415f0f2-766a-4cde-aecb-b0d89cf8326e" containerID="74cb7426c0f70b5ec395901e6a58475a18c8ff591fb31ded8277938dff088679" exitCode=0
Oct 08 18:30:48 crc kubenswrapper[4750]: I1008 18:30:48.666545 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9415f0f2-766a-4cde-aecb-b0d89cf8326e","Type":"ContainerDied","Data":"30a104a722ddd8380e531e4665c437eea755c67a159a411d8f4f066da5ad1019"}
Oct 08 18:30:48 crc kubenswrapper[4750]: I1008 18:30:48.666595 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9415f0f2-766a-4cde-aecb-b0d89cf8326e","Type":"ContainerDied","Data":"1407f1071110e4b3c4ee405aa9bfca8a5d4abf60a64a0f3113661db2e1f9f23c"}
Oct 08 18:30:48 crc kubenswrapper[4750]: I1008 18:30:48.666608 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9415f0f2-766a-4cde-aecb-b0d89cf8326e","Type":"ContainerDied","Data":"74cb7426c0f70b5ec395901e6a58475a18c8ff591fb31ded8277938dff088679"}
Oct 08 18:30:48 crc kubenswrapper[4750]: I1008 18:30:48.666780 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Oct 08 18:30:48 crc kubenswrapper[4750]: I1008 18:30:48.667019 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Oct 08 18:30:48 crc kubenswrapper[4750]: I1008 18:30:48.676714 4750 scope.go:117] "RemoveContainer" containerID="eaa050621406cb9a363caa6299f0498132427ba0b4ca6a7dd901a5fa9d867723"
Oct 08 18:30:48 crc kubenswrapper[4750]: I1008 18:30:48.677827 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdqct\" (UniqueName: \"kubernetes.io/projected/42135b66-6641-4c85-9958-ae210a3de33f-kube-api-access-jdqct\") on node \"crc\" DevicePath \"\""
Oct 08 18:30:48 crc kubenswrapper[4750]: I1008 18:30:48.677858 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/42135b66-6641-4c85-9958-ae210a3de33f-config\") on node \"crc\" DevicePath \"\""
Oct 08 18:30:48 crc kubenswrapper[4750]: I1008 18:30:48.677871 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42135b66-6641-4c85-9958-ae210a3de33f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 08 18:30:48 crc kubenswrapper[4750]: I1008 18:30:48.677881 4750 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/42135b66-6641-4c85-9958-ae210a3de33f-httpd-config\") on node \"crc\" DevicePath \"\""
Oct 08 18:30:48 crc kubenswrapper[4750]: I1008 18:30:48.686060 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42135b66-6641-4c85-9958-ae210a3de33f-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "42135b66-6641-4c85-9958-ae210a3de33f" (UID: "42135b66-6641-4c85-9958-ae210a3de33f"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 18:30:48 crc kubenswrapper[4750]: I1008 18:30:48.697406 4750 scope.go:117] "RemoveContainer" containerID="52f43a6675edc077e2b597f2582515bfa7e90a4f56cba817daa6d003af17b5a6"
Oct 08 18:30:48 crc kubenswrapper[4750]: E1008 18:30:48.697913 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52f43a6675edc077e2b597f2582515bfa7e90a4f56cba817daa6d003af17b5a6\": container with ID starting with 52f43a6675edc077e2b597f2582515bfa7e90a4f56cba817daa6d003af17b5a6 not found: ID does not exist" containerID="52f43a6675edc077e2b597f2582515bfa7e90a4f56cba817daa6d003af17b5a6"
Oct 08 18:30:48 crc kubenswrapper[4750]: I1008 18:30:48.697946 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52f43a6675edc077e2b597f2582515bfa7e90a4f56cba817daa6d003af17b5a6"} err="failed to get container status \"52f43a6675edc077e2b597f2582515bfa7e90a4f56cba817daa6d003af17b5a6\": rpc error: code = NotFound desc = could not find container \"52f43a6675edc077e2b597f2582515bfa7e90a4f56cba817daa6d003af17b5a6\": container with ID starting with 52f43a6675edc077e2b597f2582515bfa7e90a4f56cba817daa6d003af17b5a6 not found: ID does not exist"
Oct 08 18:30:48 crc kubenswrapper[4750]: I1008 18:30:48.697971 4750 scope.go:117] "RemoveContainer" containerID="eaa050621406cb9a363caa6299f0498132427ba0b4ca6a7dd901a5fa9d867723"
Oct 08 18:30:48 crc kubenswrapper[4750]: E1008 18:30:48.698323 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eaa050621406cb9a363caa6299f0498132427ba0b4ca6a7dd901a5fa9d867723\": container with ID starting with eaa050621406cb9a363caa6299f0498132427ba0b4ca6a7dd901a5fa9d867723 not found: ID does not exist" containerID="eaa050621406cb9a363caa6299f0498132427ba0b4ca6a7dd901a5fa9d867723"
Oct 08 18:30:48 crc kubenswrapper[4750]: I1008 18:30:48.698353 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaa050621406cb9a363caa6299f0498132427ba0b4ca6a7dd901a5fa9d867723"} err="failed to get container status \"eaa050621406cb9a363caa6299f0498132427ba0b4ca6a7dd901a5fa9d867723\": rpc error: code = NotFound desc = could not find container \"eaa050621406cb9a363caa6299f0498132427ba0b4ca6a7dd901a5fa9d867723\": container with ID starting with eaa050621406cb9a363caa6299f0498132427ba0b4ca6a7dd901a5fa9d867723 not found: ID does not exist"
Oct 08 18:30:48 crc kubenswrapper[4750]: I1008 18:30:48.779939 4750 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/42135b66-6641-4c85-9958-ae210a3de33f-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 08 18:30:48 crc kubenswrapper[4750]: I1008 18:30:48.978960 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5c5ddbcd5b-g9rd8"]
Oct 08 18:30:48 crc kubenswrapper[4750]: I1008 18:30:48.989862 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5c5ddbcd5b-g9rd8"]
Oct 08 18:30:50 crc kubenswrapper[4750]: I1008 18:30:50.522022 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Oct 08 18:30:50 crc kubenswrapper[4750]: I1008 18:30:50.522391 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Oct 08 18:30:50 crc kubenswrapper[4750]: I1008 18:30:50.749505 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42135b66-6641-4c85-9958-ae210a3de33f" path="/var/lib/kubelet/pods/42135b66-6641-4c85-9958-ae210a3de33f/volumes"
Oct 08 18:30:51 crc kubenswrapper[4750]: I1008 18:30:51.138030 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Oct 08 18:30:51 crc kubenswrapper[4750]: I1008 18:30:51.726857 4750 generic.go:334] "Generic (PLEG): container finished" podID="9415f0f2-766a-4cde-aecb-b0d89cf8326e" containerID="ff383390c02bd88ed2cece70660933293f84fa0c7cfed4457fb8de26e6ad96f7" exitCode=0
Oct 08 18:30:51 crc kubenswrapper[4750]: I1008 18:30:51.726916 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9415f0f2-766a-4cde-aecb-b0d89cf8326e","Type":"ContainerDied","Data":"ff383390c02bd88ed2cece70660933293f84fa0c7cfed4457fb8de26e6ad96f7"}
Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.331734 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.483145 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9415f0f2-766a-4cde-aecb-b0d89cf8326e-config-data\") pod \"9415f0f2-766a-4cde-aecb-b0d89cf8326e\" (UID: \"9415f0f2-766a-4cde-aecb-b0d89cf8326e\") "
Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.483221 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9415f0f2-766a-4cde-aecb-b0d89cf8326e-run-httpd\") pod \"9415f0f2-766a-4cde-aecb-b0d89cf8326e\" (UID: \"9415f0f2-766a-4cde-aecb-b0d89cf8326e\") "
Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.483353 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9415f0f2-766a-4cde-aecb-b0d89cf8326e-sg-core-conf-yaml\") pod \"9415f0f2-766a-4cde-aecb-b0d89cf8326e\" (UID: \"9415f0f2-766a-4cde-aecb-b0d89cf8326e\") "
Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.483436 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9415f0f2-766a-4cde-aecb-b0d89cf8326e-log-httpd\") pod \"9415f0f2-766a-4cde-aecb-b0d89cf8326e\" (UID: \"9415f0f2-766a-4cde-aecb-b0d89cf8326e\") "
Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.483464 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9415f0f2-766a-4cde-aecb-b0d89cf8326e-scripts\") pod \"9415f0f2-766a-4cde-aecb-b0d89cf8326e\" (UID: \"9415f0f2-766a-4cde-aecb-b0d89cf8326e\") "
Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.483513 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9415f0f2-766a-4cde-aecb-b0d89cf8326e-combined-ca-bundle\") pod \"9415f0f2-766a-4cde-aecb-b0d89cf8326e\" (UID: \"9415f0f2-766a-4cde-aecb-b0d89cf8326e\") "
Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.483571 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5tp2\" (UniqueName: \"kubernetes.io/projected/9415f0f2-766a-4cde-aecb-b0d89cf8326e-kube-api-access-k5tp2\") pod \"9415f0f2-766a-4cde-aecb-b0d89cf8326e\" (UID: \"9415f0f2-766a-4cde-aecb-b0d89cf8326e\") "
Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.483799 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9415f0f2-766a-4cde-aecb-b0d89cf8326e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9415f0f2-766a-4cde-aecb-b0d89cf8326e" (UID: "9415f0f2-766a-4cde-aecb-b0d89cf8326e"). InnerVolumeSpecName "run-httpd".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.484011 4750 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9415f0f2-766a-4cde-aecb-b0d89cf8326e-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.484161 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9415f0f2-766a-4cde-aecb-b0d89cf8326e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9415f0f2-766a-4cde-aecb-b0d89cf8326e" (UID: "9415f0f2-766a-4cde-aecb-b0d89cf8326e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.488306 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9415f0f2-766a-4cde-aecb-b0d89cf8326e-scripts" (OuterVolumeSpecName: "scripts") pod "9415f0f2-766a-4cde-aecb-b0d89cf8326e" (UID: "9415f0f2-766a-4cde-aecb-b0d89cf8326e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.489685 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9415f0f2-766a-4cde-aecb-b0d89cf8326e-kube-api-access-k5tp2" (OuterVolumeSpecName: "kube-api-access-k5tp2") pod "9415f0f2-766a-4cde-aecb-b0d89cf8326e" (UID: "9415f0f2-766a-4cde-aecb-b0d89cf8326e"). InnerVolumeSpecName "kube-api-access-k5tp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.507763 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9415f0f2-766a-4cde-aecb-b0d89cf8326e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9415f0f2-766a-4cde-aecb-b0d89cf8326e" (UID: "9415f0f2-766a-4cde-aecb-b0d89cf8326e"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.560166 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9415f0f2-766a-4cde-aecb-b0d89cf8326e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9415f0f2-766a-4cde-aecb-b0d89cf8326e" (UID: "9415f0f2-766a-4cde-aecb-b0d89cf8326e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.573264 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9415f0f2-766a-4cde-aecb-b0d89cf8326e-config-data" (OuterVolumeSpecName: "config-data") pod "9415f0f2-766a-4cde-aecb-b0d89cf8326e" (UID: "9415f0f2-766a-4cde-aecb-b0d89cf8326e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.585822 4750 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9415f0f2-766a-4cde-aecb-b0d89cf8326e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.585892 4750 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9415f0f2-766a-4cde-aecb-b0d89cf8326e-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.585906 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9415f0f2-766a-4cde-aecb-b0d89cf8326e-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.585916 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9415f0f2-766a-4cde-aecb-b0d89cf8326e-combined-ca-bundle\") on node 
\"crc\" DevicePath \"\"" Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.585929 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5tp2\" (UniqueName: \"kubernetes.io/projected/9415f0f2-766a-4cde-aecb-b0d89cf8326e-kube-api-access-k5tp2\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.585938 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9415f0f2-766a-4cde-aecb-b0d89cf8326e-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.787847 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9415f0f2-766a-4cde-aecb-b0d89cf8326e","Type":"ContainerDied","Data":"cda02b7fa5ae7839f8c47abfea351641171175754f69a4e7ab8ebcc8350e4b3b"} Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.787893 4750 scope.go:117] "RemoveContainer" containerID="30a104a722ddd8380e531e4665c437eea755c67a159a411d8f4f066da5ad1019" Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.787893 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.789671 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-9n56l" event={"ID":"65d622c4-c0ae-4cd3-bdab-101eb0783cc3","Type":"ContainerStarted","Data":"8e086cb3ed64079b1841a6cb5b0743cfbcec3a0c2ccd2a64235e1bcc59f94726"} Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.808861 4750 scope.go:117] "RemoveContainer" containerID="1407f1071110e4b3c4ee405aa9bfca8a5d4abf60a64a0f3113661db2e1f9f23c" Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.828621 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-9n56l" podStartSLOduration=2.165581457 podStartE2EDuration="11.82860055s" podCreationTimestamp="2025-10-08 18:30:46 +0000 UTC" firstStartedPulling="2025-10-08 18:30:47.423056464 +0000 UTC m=+1203.336027477" lastFinishedPulling="2025-10-08 18:30:57.086075557 +0000 UTC m=+1212.999046570" observedRunningTime="2025-10-08 18:30:57.814002182 +0000 UTC m=+1213.726973225" watchObservedRunningTime="2025-10-08 18:30:57.82860055 +0000 UTC m=+1213.741571573" Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.835273 4750 scope.go:117] "RemoveContainer" containerID="74cb7426c0f70b5ec395901e6a58475a18c8ff591fb31ded8277938dff088679" Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.837862 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.858337 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.870845 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 08 18:30:57 crc kubenswrapper[4750]: E1008 18:30:57.871320 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42135b66-6641-4c85-9958-ae210a3de33f" 
containerName="neutron-httpd" Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.871339 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="42135b66-6641-4c85-9958-ae210a3de33f" containerName="neutron-httpd" Oct 08 18:30:57 crc kubenswrapper[4750]: E1008 18:30:57.871352 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9415f0f2-766a-4cde-aecb-b0d89cf8326e" containerName="ceilometer-central-agent" Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.871359 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="9415f0f2-766a-4cde-aecb-b0d89cf8326e" containerName="ceilometer-central-agent" Oct 08 18:30:57 crc kubenswrapper[4750]: E1008 18:30:57.871378 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42135b66-6641-4c85-9958-ae210a3de33f" containerName="neutron-api" Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.871384 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="42135b66-6641-4c85-9958-ae210a3de33f" containerName="neutron-api" Oct 08 18:30:57 crc kubenswrapper[4750]: E1008 18:30:57.871409 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9415f0f2-766a-4cde-aecb-b0d89cf8326e" containerName="ceilometer-notification-agent" Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.871415 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="9415f0f2-766a-4cde-aecb-b0d89cf8326e" containerName="ceilometer-notification-agent" Oct 08 18:30:57 crc kubenswrapper[4750]: E1008 18:30:57.871426 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9415f0f2-766a-4cde-aecb-b0d89cf8326e" containerName="sg-core" Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.871433 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="9415f0f2-766a-4cde-aecb-b0d89cf8326e" containerName="sg-core" Oct 08 18:30:57 crc kubenswrapper[4750]: E1008 18:30:57.871446 4750 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9415f0f2-766a-4cde-aecb-b0d89cf8326e" containerName="proxy-httpd" Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.871451 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="9415f0f2-766a-4cde-aecb-b0d89cf8326e" containerName="proxy-httpd" Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.871623 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="42135b66-6641-4c85-9958-ae210a3de33f" containerName="neutron-api" Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.871641 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="9415f0f2-766a-4cde-aecb-b0d89cf8326e" containerName="sg-core" Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.871654 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="9415f0f2-766a-4cde-aecb-b0d89cf8326e" containerName="ceilometer-central-agent" Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.871660 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="42135b66-6641-4c85-9958-ae210a3de33f" containerName="neutron-httpd" Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.871666 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="9415f0f2-766a-4cde-aecb-b0d89cf8326e" containerName="proxy-httpd" Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.871683 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="9415f0f2-766a-4cde-aecb-b0d89cf8326e" containerName="ceilometer-notification-agent" Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.872035 4750 scope.go:117] "RemoveContainer" containerID="ff383390c02bd88ed2cece70660933293f84fa0c7cfed4457fb8de26e6ad96f7" Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.873436 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.876362 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.877060 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.902652 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.995437 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6757dc4d-417f-4ee5-a705-32cf0203cb39-config-data\") pod \"ceilometer-0\" (UID: \"6757dc4d-417f-4ee5-a705-32cf0203cb39\") " pod="openstack/ceilometer-0" Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.995489 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6757dc4d-417f-4ee5-a705-32cf0203cb39-scripts\") pod \"ceilometer-0\" (UID: \"6757dc4d-417f-4ee5-a705-32cf0203cb39\") " pod="openstack/ceilometer-0" Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.995528 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6757dc4d-417f-4ee5-a705-32cf0203cb39-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6757dc4d-417f-4ee5-a705-32cf0203cb39\") " pod="openstack/ceilometer-0" Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.995648 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6757dc4d-417f-4ee5-a705-32cf0203cb39-log-httpd\") pod \"ceilometer-0\" (UID: \"6757dc4d-417f-4ee5-a705-32cf0203cb39\") " 
pod="openstack/ceilometer-0" Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.995676 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc5nx\" (UniqueName: \"kubernetes.io/projected/6757dc4d-417f-4ee5-a705-32cf0203cb39-kube-api-access-lc5nx\") pod \"ceilometer-0\" (UID: \"6757dc4d-417f-4ee5-a705-32cf0203cb39\") " pod="openstack/ceilometer-0" Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.995719 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6757dc4d-417f-4ee5-a705-32cf0203cb39-run-httpd\") pod \"ceilometer-0\" (UID: \"6757dc4d-417f-4ee5-a705-32cf0203cb39\") " pod="openstack/ceilometer-0" Oct 08 18:30:57 crc kubenswrapper[4750]: I1008 18:30:57.995758 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6757dc4d-417f-4ee5-a705-32cf0203cb39-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6757dc4d-417f-4ee5-a705-32cf0203cb39\") " pod="openstack/ceilometer-0" Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.096974 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6757dc4d-417f-4ee5-a705-32cf0203cb39-log-httpd\") pod \"ceilometer-0\" (UID: \"6757dc4d-417f-4ee5-a705-32cf0203cb39\") " pod="openstack/ceilometer-0" Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.097017 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc5nx\" (UniqueName: \"kubernetes.io/projected/6757dc4d-417f-4ee5-a705-32cf0203cb39-kube-api-access-lc5nx\") pod \"ceilometer-0\" (UID: \"6757dc4d-417f-4ee5-a705-32cf0203cb39\") " pod="openstack/ceilometer-0" Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.097056 4750 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6757dc4d-417f-4ee5-a705-32cf0203cb39-run-httpd\") pod \"ceilometer-0\" (UID: \"6757dc4d-417f-4ee5-a705-32cf0203cb39\") " pod="openstack/ceilometer-0" Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.097090 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6757dc4d-417f-4ee5-a705-32cf0203cb39-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6757dc4d-417f-4ee5-a705-32cf0203cb39\") " pod="openstack/ceilometer-0" Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.097121 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6757dc4d-417f-4ee5-a705-32cf0203cb39-config-data\") pod \"ceilometer-0\" (UID: \"6757dc4d-417f-4ee5-a705-32cf0203cb39\") " pod="openstack/ceilometer-0" Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.097140 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6757dc4d-417f-4ee5-a705-32cf0203cb39-scripts\") pod \"ceilometer-0\" (UID: \"6757dc4d-417f-4ee5-a705-32cf0203cb39\") " pod="openstack/ceilometer-0" Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.097164 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6757dc4d-417f-4ee5-a705-32cf0203cb39-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6757dc4d-417f-4ee5-a705-32cf0203cb39\") " pod="openstack/ceilometer-0" Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.097367 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6757dc4d-417f-4ee5-a705-32cf0203cb39-log-httpd\") pod \"ceilometer-0\" (UID: \"6757dc4d-417f-4ee5-a705-32cf0203cb39\") " 
pod="openstack/ceilometer-0" Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.097738 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6757dc4d-417f-4ee5-a705-32cf0203cb39-run-httpd\") pod \"ceilometer-0\" (UID: \"6757dc4d-417f-4ee5-a705-32cf0203cb39\") " pod="openstack/ceilometer-0" Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.102095 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6757dc4d-417f-4ee5-a705-32cf0203cb39-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6757dc4d-417f-4ee5-a705-32cf0203cb39\") " pod="openstack/ceilometer-0" Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.111066 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6757dc4d-417f-4ee5-a705-32cf0203cb39-scripts\") pod \"ceilometer-0\" (UID: \"6757dc4d-417f-4ee5-a705-32cf0203cb39\") " pod="openstack/ceilometer-0" Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.119123 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6757dc4d-417f-4ee5-a705-32cf0203cb39-config-data\") pod \"ceilometer-0\" (UID: \"6757dc4d-417f-4ee5-a705-32cf0203cb39\") " pod="openstack/ceilometer-0" Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.124202 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6757dc4d-417f-4ee5-a705-32cf0203cb39-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6757dc4d-417f-4ee5-a705-32cf0203cb39\") " pod="openstack/ceilometer-0" Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.142787 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc5nx\" (UniqueName: 
\"kubernetes.io/projected/6757dc4d-417f-4ee5-a705-32cf0203cb39-kube-api-access-lc5nx\") pod \"ceilometer-0\" (UID: \"6757dc4d-417f-4ee5-a705-32cf0203cb39\") " pod="openstack/ceilometer-0" Oct 08 18:30:58 crc kubenswrapper[4750]: W1008 18:30:58.179795 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9415f0f2_766a_4cde_aecb_b0d89cf8326e.slice/crio-1407f1071110e4b3c4ee405aa9bfca8a5d4abf60a64a0f3113661db2e1f9f23c.scope WatchSource:0}: Error finding container 1407f1071110e4b3c4ee405aa9bfca8a5d4abf60a64a0f3113661db2e1f9f23c: Status 404 returned error can't find the container with id 1407f1071110e4b3c4ee405aa9bfca8a5d4abf60a64a0f3113661db2e1f9f23c Oct 08 18:30:58 crc kubenswrapper[4750]: W1008 18:30:58.183826 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9415f0f2_766a_4cde_aecb_b0d89cf8326e.slice/crio-30a104a722ddd8380e531e4665c437eea755c67a159a411d8f4f066da5ad1019.scope WatchSource:0}: Error finding container 30a104a722ddd8380e531e4665c437eea755c67a159a411d8f4f066da5ad1019: Status 404 returned error can't find the container with id 30a104a722ddd8380e531e4665c437eea755c67a159a411d8f4f066da5ad1019 Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.196113 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 18:30:58 crc kubenswrapper[4750]: E1008 18:30:58.419029 4750 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19fe7265_7f86_4acd_95a1_0729ed834f0d.slice/crio-conmon-9c04d574f99861a3dae0161b412e284aed8e897fc1e2f491dddbadbcf904004c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19fe7265_7f86_4acd_95a1_0729ed834f0d.slice/crio-9c04d574f99861a3dae0161b412e284aed8e897fc1e2f491dddbadbcf904004c.scope\": RecentStats: unable to find data in memory cache]" Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.538493 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.712308 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/19fe7265-7f86-4acd-95a1-0729ed834f0d-etc-machine-id\") pod \"19fe7265-7f86-4acd-95a1-0729ed834f0d\" (UID: \"19fe7265-7f86-4acd-95a1-0729ed834f0d\") " Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.712392 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnzsm\" (UniqueName: \"kubernetes.io/projected/19fe7265-7f86-4acd-95a1-0729ed834f0d-kube-api-access-pnzsm\") pod \"19fe7265-7f86-4acd-95a1-0729ed834f0d\" (UID: \"19fe7265-7f86-4acd-95a1-0729ed834f0d\") " Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.712444 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/19fe7265-7f86-4acd-95a1-0729ed834f0d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "19fe7265-7f86-4acd-95a1-0729ed834f0d" (UID: "19fe7265-7f86-4acd-95a1-0729ed834f0d"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.712480 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19fe7265-7f86-4acd-95a1-0729ed834f0d-config-data\") pod \"19fe7265-7f86-4acd-95a1-0729ed834f0d\" (UID: \"19fe7265-7f86-4acd-95a1-0729ed834f0d\") " Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.712526 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/19fe7265-7f86-4acd-95a1-0729ed834f0d-config-data-custom\") pod \"19fe7265-7f86-4acd-95a1-0729ed834f0d\" (UID: \"19fe7265-7f86-4acd-95a1-0729ed834f0d\") " Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.712593 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19fe7265-7f86-4acd-95a1-0729ed834f0d-logs\") pod \"19fe7265-7f86-4acd-95a1-0729ed834f0d\" (UID: \"19fe7265-7f86-4acd-95a1-0729ed834f0d\") " Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.712628 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19fe7265-7f86-4acd-95a1-0729ed834f0d-scripts\") pod \"19fe7265-7f86-4acd-95a1-0729ed834f0d\" (UID: \"19fe7265-7f86-4acd-95a1-0729ed834f0d\") " Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.712675 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19fe7265-7f86-4acd-95a1-0729ed834f0d-combined-ca-bundle\") pod \"19fe7265-7f86-4acd-95a1-0729ed834f0d\" (UID: \"19fe7265-7f86-4acd-95a1-0729ed834f0d\") " Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.713215 4750 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/19fe7265-7f86-4acd-95a1-0729ed834f0d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.717252 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19fe7265-7f86-4acd-95a1-0729ed834f0d-logs" (OuterVolumeSpecName: "logs") pod "19fe7265-7f86-4acd-95a1-0729ed834f0d" (UID: "19fe7265-7f86-4acd-95a1-0729ed834f0d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.718969 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19fe7265-7f86-4acd-95a1-0729ed834f0d-kube-api-access-pnzsm" (OuterVolumeSpecName: "kube-api-access-pnzsm") pod "19fe7265-7f86-4acd-95a1-0729ed834f0d" (UID: "19fe7265-7f86-4acd-95a1-0729ed834f0d"). InnerVolumeSpecName "kube-api-access-pnzsm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.719714 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19fe7265-7f86-4acd-95a1-0729ed834f0d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "19fe7265-7f86-4acd-95a1-0729ed834f0d" (UID: "19fe7265-7f86-4acd-95a1-0729ed834f0d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.719984 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19fe7265-7f86-4acd-95a1-0729ed834f0d-scripts" (OuterVolumeSpecName: "scripts") pod "19fe7265-7f86-4acd-95a1-0729ed834f0d" (UID: "19fe7265-7f86-4acd-95a1-0729ed834f0d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.748193 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19fe7265-7f86-4acd-95a1-0729ed834f0d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "19fe7265-7f86-4acd-95a1-0729ed834f0d" (UID: "19fe7265-7f86-4acd-95a1-0729ed834f0d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.748737 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9415f0f2-766a-4cde-aecb-b0d89cf8326e" path="/var/lib/kubelet/pods/9415f0f2-766a-4cde-aecb-b0d89cf8326e/volumes" Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.770195 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.785968 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19fe7265-7f86-4acd-95a1-0729ed834f0d-config-data" (OuterVolumeSpecName: "config-data") pod "19fe7265-7f86-4acd-95a1-0729ed834f0d" (UID: "19fe7265-7f86-4acd-95a1-0729ed834f0d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.799081 4750 generic.go:334] "Generic (PLEG): container finished" podID="19fe7265-7f86-4acd-95a1-0729ed834f0d" containerID="9c04d574f99861a3dae0161b412e284aed8e897fc1e2f491dddbadbcf904004c" exitCode=137 Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.799145 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.799153 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"19fe7265-7f86-4acd-95a1-0729ed834f0d","Type":"ContainerDied","Data":"9c04d574f99861a3dae0161b412e284aed8e897fc1e2f491dddbadbcf904004c"} Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.799182 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"19fe7265-7f86-4acd-95a1-0729ed834f0d","Type":"ContainerDied","Data":"62c505b289e16e3be66c0dd3b412405b43e776f7e048d950e81dbcd2595b644a"} Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.799204 4750 scope.go:117] "RemoveContainer" containerID="9c04d574f99861a3dae0161b412e284aed8e897fc1e2f491dddbadbcf904004c" Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.804630 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6757dc4d-417f-4ee5-a705-32cf0203cb39","Type":"ContainerStarted","Data":"e4c16370a8d799d54a1da3f833d07e05ea89e882e8640d4c42d8cec6204e004b"} Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.814149 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnzsm\" (UniqueName: \"kubernetes.io/projected/19fe7265-7f86-4acd-95a1-0729ed834f0d-kube-api-access-pnzsm\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.814297 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19fe7265-7f86-4acd-95a1-0729ed834f0d-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.814352 4750 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/19fe7265-7f86-4acd-95a1-0729ed834f0d-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 
18:30:58.814433 4750 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19fe7265-7f86-4acd-95a1-0729ed834f0d-logs\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.814495 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19fe7265-7f86-4acd-95a1-0729ed834f0d-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.814562 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19fe7265-7f86-4acd-95a1-0729ed834f0d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.867231 4750 scope.go:117] "RemoveContainer" containerID="71adc0ddaf1cd4398295556bffc05adb8fde142d5f65177770ffeb1ed8604897" Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.887665 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.910310 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.912757 4750 scope.go:117] "RemoveContainer" containerID="9c04d574f99861a3dae0161b412e284aed8e897fc1e2f491dddbadbcf904004c" Oct 08 18:30:58 crc kubenswrapper[4750]: E1008 18:30:58.913372 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c04d574f99861a3dae0161b412e284aed8e897fc1e2f491dddbadbcf904004c\": container with ID starting with 9c04d574f99861a3dae0161b412e284aed8e897fc1e2f491dddbadbcf904004c not found: ID does not exist" containerID="9c04d574f99861a3dae0161b412e284aed8e897fc1e2f491dddbadbcf904004c" Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.913417 4750 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9c04d574f99861a3dae0161b412e284aed8e897fc1e2f491dddbadbcf904004c"} err="failed to get container status \"9c04d574f99861a3dae0161b412e284aed8e897fc1e2f491dddbadbcf904004c\": rpc error: code = NotFound desc = could not find container \"9c04d574f99861a3dae0161b412e284aed8e897fc1e2f491dddbadbcf904004c\": container with ID starting with 9c04d574f99861a3dae0161b412e284aed8e897fc1e2f491dddbadbcf904004c not found: ID does not exist" Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.913444 4750 scope.go:117] "RemoveContainer" containerID="71adc0ddaf1cd4398295556bffc05adb8fde142d5f65177770ffeb1ed8604897" Oct 08 18:30:58 crc kubenswrapper[4750]: E1008 18:30:58.920512 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71adc0ddaf1cd4398295556bffc05adb8fde142d5f65177770ffeb1ed8604897\": container with ID starting with 71adc0ddaf1cd4398295556bffc05adb8fde142d5f65177770ffeb1ed8604897 not found: ID does not exist" containerID="71adc0ddaf1cd4398295556bffc05adb8fde142d5f65177770ffeb1ed8604897" Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.920560 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71adc0ddaf1cd4398295556bffc05adb8fde142d5f65177770ffeb1ed8604897"} err="failed to get container status \"71adc0ddaf1cd4398295556bffc05adb8fde142d5f65177770ffeb1ed8604897\": rpc error: code = NotFound desc = could not find container \"71adc0ddaf1cd4398295556bffc05adb8fde142d5f65177770ffeb1ed8604897\": container with ID starting with 71adc0ddaf1cd4398295556bffc05adb8fde142d5f65177770ffeb1ed8604897 not found: ID does not exist" Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.922622 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 08 18:30:58 crc kubenswrapper[4750]: E1008 18:30:58.923127 4750 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="19fe7265-7f86-4acd-95a1-0729ed834f0d" containerName="cinder-api-log" Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.923151 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="19fe7265-7f86-4acd-95a1-0729ed834f0d" containerName="cinder-api-log" Oct 08 18:30:58 crc kubenswrapper[4750]: E1008 18:30:58.923171 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19fe7265-7f86-4acd-95a1-0729ed834f0d" containerName="cinder-api" Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.923178 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="19fe7265-7f86-4acd-95a1-0729ed834f0d" containerName="cinder-api" Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.923345 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="19fe7265-7f86-4acd-95a1-0729ed834f0d" containerName="cinder-api-log" Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.923370 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="19fe7265-7f86-4acd-95a1-0729ed834f0d" containerName="cinder-api" Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.924413 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.926404 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.928811 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.929177 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 08 18:30:58 crc kubenswrapper[4750]: I1008 18:30:58.936571 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 08 18:30:59 crc kubenswrapper[4750]: I1008 18:30:59.017729 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f1b5ad2e-1ee1-4955-99c2-8daed456b21c\") " pod="openstack/cinder-api-0" Oct 08 18:30:59 crc kubenswrapper[4750]: I1008 18:30:59.017994 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-config-data\") pod \"cinder-api-0\" (UID: \"f1b5ad2e-1ee1-4955-99c2-8daed456b21c\") " pod="openstack/cinder-api-0" Oct 08 18:30:59 crc kubenswrapper[4750]: I1008 18:30:59.018097 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45fjp\" (UniqueName: \"kubernetes.io/projected/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-kube-api-access-45fjp\") pod \"cinder-api-0\" (UID: \"f1b5ad2e-1ee1-4955-99c2-8daed456b21c\") " pod="openstack/cinder-api-0" Oct 08 18:30:59 crc kubenswrapper[4750]: I1008 18:30:59.018203 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-config-data-custom\") pod \"cinder-api-0\" (UID: \"f1b5ad2e-1ee1-4955-99c2-8daed456b21c\") " pod="openstack/cinder-api-0" Oct 08 18:30:59 crc kubenswrapper[4750]: I1008 18:30:59.018275 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f1b5ad2e-1ee1-4955-99c2-8daed456b21c\") " pod="openstack/cinder-api-0" Oct 08 18:30:59 crc kubenswrapper[4750]: I1008 18:30:59.018353 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f1b5ad2e-1ee1-4955-99c2-8daed456b21c\") " pod="openstack/cinder-api-0" Oct 08 18:30:59 crc kubenswrapper[4750]: I1008 18:30:59.018652 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-scripts\") pod \"cinder-api-0\" (UID: \"f1b5ad2e-1ee1-4955-99c2-8daed456b21c\") " pod="openstack/cinder-api-0" Oct 08 18:30:59 crc kubenswrapper[4750]: I1008 18:30:59.018838 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-logs\") pod \"cinder-api-0\" (UID: \"f1b5ad2e-1ee1-4955-99c2-8daed456b21c\") " pod="openstack/cinder-api-0" Oct 08 18:30:59 crc kubenswrapper[4750]: I1008 18:30:59.018932 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-internal-tls-certs\") pod \"cinder-api-0\" 
(UID: \"f1b5ad2e-1ee1-4955-99c2-8daed456b21c\") " pod="openstack/cinder-api-0" Oct 08 18:30:59 crc kubenswrapper[4750]: I1008 18:30:59.120351 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-config-data-custom\") pod \"cinder-api-0\" (UID: \"f1b5ad2e-1ee1-4955-99c2-8daed456b21c\") " pod="openstack/cinder-api-0" Oct 08 18:30:59 crc kubenswrapper[4750]: I1008 18:30:59.120415 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f1b5ad2e-1ee1-4955-99c2-8daed456b21c\") " pod="openstack/cinder-api-0" Oct 08 18:30:59 crc kubenswrapper[4750]: I1008 18:30:59.120448 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f1b5ad2e-1ee1-4955-99c2-8daed456b21c\") " pod="openstack/cinder-api-0" Oct 08 18:30:59 crc kubenswrapper[4750]: I1008 18:30:59.120532 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-scripts\") pod \"cinder-api-0\" (UID: \"f1b5ad2e-1ee1-4955-99c2-8daed456b21c\") " pod="openstack/cinder-api-0" Oct 08 18:30:59 crc kubenswrapper[4750]: I1008 18:30:59.120630 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-logs\") pod \"cinder-api-0\" (UID: \"f1b5ad2e-1ee1-4955-99c2-8daed456b21c\") " pod="openstack/cinder-api-0" Oct 08 18:30:59 crc kubenswrapper[4750]: I1008 18:30:59.120683 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f1b5ad2e-1ee1-4955-99c2-8daed456b21c\") " pod="openstack/cinder-api-0" Oct 08 18:30:59 crc kubenswrapper[4750]: I1008 18:30:59.120713 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f1b5ad2e-1ee1-4955-99c2-8daed456b21c\") " pod="openstack/cinder-api-0" Oct 08 18:30:59 crc kubenswrapper[4750]: I1008 18:30:59.120756 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-config-data\") pod \"cinder-api-0\" (UID: \"f1b5ad2e-1ee1-4955-99c2-8daed456b21c\") " pod="openstack/cinder-api-0" Oct 08 18:30:59 crc kubenswrapper[4750]: I1008 18:30:59.120784 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45fjp\" (UniqueName: \"kubernetes.io/projected/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-kube-api-access-45fjp\") pod \"cinder-api-0\" (UID: \"f1b5ad2e-1ee1-4955-99c2-8daed456b21c\") " pod="openstack/cinder-api-0" Oct 08 18:30:59 crc kubenswrapper[4750]: I1008 18:30:59.120918 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f1b5ad2e-1ee1-4955-99c2-8daed456b21c\") " pod="openstack/cinder-api-0" Oct 08 18:30:59 crc kubenswrapper[4750]: I1008 18:30:59.121152 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-logs\") pod \"cinder-api-0\" (UID: \"f1b5ad2e-1ee1-4955-99c2-8daed456b21c\") " pod="openstack/cinder-api-0" Oct 08 18:30:59 crc 
kubenswrapper[4750]: I1008 18:30:59.124327 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f1b5ad2e-1ee1-4955-99c2-8daed456b21c\") " pod="openstack/cinder-api-0" Oct 08 18:30:59 crc kubenswrapper[4750]: I1008 18:30:59.124801 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-config-data-custom\") pod \"cinder-api-0\" (UID: \"f1b5ad2e-1ee1-4955-99c2-8daed456b21c\") " pod="openstack/cinder-api-0" Oct 08 18:30:59 crc kubenswrapper[4750]: I1008 18:30:59.125310 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f1b5ad2e-1ee1-4955-99c2-8daed456b21c\") " pod="openstack/cinder-api-0" Oct 08 18:30:59 crc kubenswrapper[4750]: I1008 18:30:59.126260 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-config-data\") pod \"cinder-api-0\" (UID: \"f1b5ad2e-1ee1-4955-99c2-8daed456b21c\") " pod="openstack/cinder-api-0" Oct 08 18:30:59 crc kubenswrapper[4750]: I1008 18:30:59.126288 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f1b5ad2e-1ee1-4955-99c2-8daed456b21c\") " pod="openstack/cinder-api-0" Oct 08 18:30:59 crc kubenswrapper[4750]: I1008 18:30:59.141733 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-scripts\") pod \"cinder-api-0\" (UID: 
\"f1b5ad2e-1ee1-4955-99c2-8daed456b21c\") " pod="openstack/cinder-api-0" Oct 08 18:30:59 crc kubenswrapper[4750]: I1008 18:30:59.147524 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45fjp\" (UniqueName: \"kubernetes.io/projected/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-kube-api-access-45fjp\") pod \"cinder-api-0\" (UID: \"f1b5ad2e-1ee1-4955-99c2-8daed456b21c\") " pod="openstack/cinder-api-0" Oct 08 18:30:59 crc kubenswrapper[4750]: I1008 18:30:59.243644 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 08 18:30:59 crc kubenswrapper[4750]: I1008 18:30:59.774840 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 08 18:30:59 crc kubenswrapper[4750]: I1008 18:30:59.815844 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6757dc4d-417f-4ee5-a705-32cf0203cb39","Type":"ContainerStarted","Data":"ce8746ddceec675f97f535436ab598a5f6adfde4bad74ea69e12334b2ca2f974"} Oct 08 18:30:59 crc kubenswrapper[4750]: I1008 18:30:59.817815 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f1b5ad2e-1ee1-4955-99c2-8daed456b21c","Type":"ContainerStarted","Data":"ff15fd208575a05ef4944ca42fa9ef211d8a8fc43f95afbae8019c7bbc5ec593"} Oct 08 18:31:00 crc kubenswrapper[4750]: I1008 18:31:00.749125 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19fe7265-7f86-4acd-95a1-0729ed834f0d" path="/var/lib/kubelet/pods/19fe7265-7f86-4acd-95a1-0729ed834f0d/volumes" Oct 08 18:31:00 crc kubenswrapper[4750]: I1008 18:31:00.833222 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f1b5ad2e-1ee1-4955-99c2-8daed456b21c","Type":"ContainerStarted","Data":"ea1294fdb6395353dc77235f2f999e3289f6e83d82e7aa71233fb0228a7f37d4"} Oct 08 18:31:00 crc kubenswrapper[4750]: I1008 18:31:00.836631 4750 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6757dc4d-417f-4ee5-a705-32cf0203cb39","Type":"ContainerStarted","Data":"2838c747ad2a44a5ba3a1f15b89b0f5f171fd8814eb13cfeda6c47d83bd77db7"} Oct 08 18:31:01 crc kubenswrapper[4750]: I1008 18:31:01.849729 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f1b5ad2e-1ee1-4955-99c2-8daed456b21c","Type":"ContainerStarted","Data":"e2474855cd1e381a74ca688d20c9d1a36e8ad10d4315f341b6812d9c8681f6e7"} Oct 08 18:31:01 crc kubenswrapper[4750]: I1008 18:31:01.850406 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 08 18:31:01 crc kubenswrapper[4750]: I1008 18:31:01.851806 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6757dc4d-417f-4ee5-a705-32cf0203cb39","Type":"ContainerStarted","Data":"97ab7ebd2889c9d1c76fc352669db328a276b75c294806f628647edabd22ef2b"} Oct 08 18:31:02 crc kubenswrapper[4750]: I1008 18:31:02.814197 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.814180172 podStartE2EDuration="4.814180172s" podCreationTimestamp="2025-10-08 18:30:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:31:01.869009174 +0000 UTC m=+1217.781980187" watchObservedRunningTime="2025-10-08 18:31:02.814180172 +0000 UTC m=+1218.727151185" Oct 08 18:31:02 crc kubenswrapper[4750]: I1008 18:31:02.821325 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 18:31:03 crc kubenswrapper[4750]: I1008 18:31:03.879703 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6757dc4d-417f-4ee5-a705-32cf0203cb39","Type":"ContainerStarted","Data":"bc43f25fa1070dc72f81a7df5e9dab184040bdb949affb3807455b09ce52d025"} Oct 08 18:31:03 crc 
kubenswrapper[4750]: I1008 18:31:03.881217 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6757dc4d-417f-4ee5-a705-32cf0203cb39" containerName="ceilometer-central-agent" containerID="cri-o://ce8746ddceec675f97f535436ab598a5f6adfde4bad74ea69e12334b2ca2f974" gracePeriod=30 Oct 08 18:31:03 crc kubenswrapper[4750]: I1008 18:31:03.881834 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 08 18:31:03 crc kubenswrapper[4750]: I1008 18:31:03.883071 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6757dc4d-417f-4ee5-a705-32cf0203cb39" containerName="proxy-httpd" containerID="cri-o://bc43f25fa1070dc72f81a7df5e9dab184040bdb949affb3807455b09ce52d025" gracePeriod=30 Oct 08 18:31:03 crc kubenswrapper[4750]: I1008 18:31:03.883196 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6757dc4d-417f-4ee5-a705-32cf0203cb39" containerName="sg-core" containerID="cri-o://97ab7ebd2889c9d1c76fc352669db328a276b75c294806f628647edabd22ef2b" gracePeriod=30 Oct 08 18:31:03 crc kubenswrapper[4750]: I1008 18:31:03.883294 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6757dc4d-417f-4ee5-a705-32cf0203cb39" containerName="ceilometer-notification-agent" containerID="cri-o://2838c747ad2a44a5ba3a1f15b89b0f5f171fd8814eb13cfeda6c47d83bd77db7" gracePeriod=30 Oct 08 18:31:03 crc kubenswrapper[4750]: I1008 18:31:03.907406 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.242484702 podStartE2EDuration="6.907388676s" podCreationTimestamp="2025-10-08 18:30:57 +0000 UTC" firstStartedPulling="2025-10-08 18:30:58.770805015 +0000 UTC m=+1214.683776018" lastFinishedPulling="2025-10-08 18:31:03.435708979 +0000 UTC m=+1219.348679992" 
observedRunningTime="2025-10-08 18:31:03.898118639 +0000 UTC m=+1219.811089662" watchObservedRunningTime="2025-10-08 18:31:03.907388676 +0000 UTC m=+1219.820359689" Oct 08 18:31:04 crc kubenswrapper[4750]: I1008 18:31:04.890309 4750 generic.go:334] "Generic (PLEG): container finished" podID="6757dc4d-417f-4ee5-a705-32cf0203cb39" containerID="bc43f25fa1070dc72f81a7df5e9dab184040bdb949affb3807455b09ce52d025" exitCode=0 Oct 08 18:31:04 crc kubenswrapper[4750]: I1008 18:31:04.890628 4750 generic.go:334] "Generic (PLEG): container finished" podID="6757dc4d-417f-4ee5-a705-32cf0203cb39" containerID="97ab7ebd2889c9d1c76fc352669db328a276b75c294806f628647edabd22ef2b" exitCode=2 Oct 08 18:31:04 crc kubenswrapper[4750]: I1008 18:31:04.890639 4750 generic.go:334] "Generic (PLEG): container finished" podID="6757dc4d-417f-4ee5-a705-32cf0203cb39" containerID="2838c747ad2a44a5ba3a1f15b89b0f5f171fd8814eb13cfeda6c47d83bd77db7" exitCode=0 Oct 08 18:31:04 crc kubenswrapper[4750]: I1008 18:31:04.890374 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6757dc4d-417f-4ee5-a705-32cf0203cb39","Type":"ContainerDied","Data":"bc43f25fa1070dc72f81a7df5e9dab184040bdb949affb3807455b09ce52d025"} Oct 08 18:31:04 crc kubenswrapper[4750]: I1008 18:31:04.890668 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6757dc4d-417f-4ee5-a705-32cf0203cb39","Type":"ContainerDied","Data":"97ab7ebd2889c9d1c76fc352669db328a276b75c294806f628647edabd22ef2b"} Oct 08 18:31:04 crc kubenswrapper[4750]: I1008 18:31:04.890678 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6757dc4d-417f-4ee5-a705-32cf0203cb39","Type":"ContainerDied","Data":"2838c747ad2a44a5ba3a1f15b89b0f5f171fd8814eb13cfeda6c47d83bd77db7"} Oct 08 18:31:05 crc kubenswrapper[4750]: I1008 18:31:05.294701 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 18:31:05 crc kubenswrapper[4750]: I1008 18:31:05.434495 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6757dc4d-417f-4ee5-a705-32cf0203cb39-scripts\") pod \"6757dc4d-417f-4ee5-a705-32cf0203cb39\" (UID: \"6757dc4d-417f-4ee5-a705-32cf0203cb39\") " Oct 08 18:31:05 crc kubenswrapper[4750]: I1008 18:31:05.434544 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6757dc4d-417f-4ee5-a705-32cf0203cb39-combined-ca-bundle\") pod \"6757dc4d-417f-4ee5-a705-32cf0203cb39\" (UID: \"6757dc4d-417f-4ee5-a705-32cf0203cb39\") " Oct 08 18:31:05 crc kubenswrapper[4750]: I1008 18:31:05.434605 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6757dc4d-417f-4ee5-a705-32cf0203cb39-run-httpd\") pod \"6757dc4d-417f-4ee5-a705-32cf0203cb39\" (UID: \"6757dc4d-417f-4ee5-a705-32cf0203cb39\") " Oct 08 18:31:05 crc kubenswrapper[4750]: I1008 18:31:05.434681 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6757dc4d-417f-4ee5-a705-32cf0203cb39-log-httpd\") pod \"6757dc4d-417f-4ee5-a705-32cf0203cb39\" (UID: \"6757dc4d-417f-4ee5-a705-32cf0203cb39\") " Oct 08 18:31:05 crc kubenswrapper[4750]: I1008 18:31:05.434748 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6757dc4d-417f-4ee5-a705-32cf0203cb39-sg-core-conf-yaml\") pod \"6757dc4d-417f-4ee5-a705-32cf0203cb39\" (UID: \"6757dc4d-417f-4ee5-a705-32cf0203cb39\") " Oct 08 18:31:05 crc kubenswrapper[4750]: I1008 18:31:05.434812 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6757dc4d-417f-4ee5-a705-32cf0203cb39-config-data\") pod \"6757dc4d-417f-4ee5-a705-32cf0203cb39\" (UID: \"6757dc4d-417f-4ee5-a705-32cf0203cb39\") " Oct 08 18:31:05 crc kubenswrapper[4750]: I1008 18:31:05.434851 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lc5nx\" (UniqueName: \"kubernetes.io/projected/6757dc4d-417f-4ee5-a705-32cf0203cb39-kube-api-access-lc5nx\") pod \"6757dc4d-417f-4ee5-a705-32cf0203cb39\" (UID: \"6757dc4d-417f-4ee5-a705-32cf0203cb39\") " Oct 08 18:31:05 crc kubenswrapper[4750]: I1008 18:31:05.436663 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6757dc4d-417f-4ee5-a705-32cf0203cb39-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6757dc4d-417f-4ee5-a705-32cf0203cb39" (UID: "6757dc4d-417f-4ee5-a705-32cf0203cb39"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:31:05 crc kubenswrapper[4750]: I1008 18:31:05.436701 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6757dc4d-417f-4ee5-a705-32cf0203cb39-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6757dc4d-417f-4ee5-a705-32cf0203cb39" (UID: "6757dc4d-417f-4ee5-a705-32cf0203cb39"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:31:05 crc kubenswrapper[4750]: I1008 18:31:05.442941 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6757dc4d-417f-4ee5-a705-32cf0203cb39-kube-api-access-lc5nx" (OuterVolumeSpecName: "kube-api-access-lc5nx") pod "6757dc4d-417f-4ee5-a705-32cf0203cb39" (UID: "6757dc4d-417f-4ee5-a705-32cf0203cb39"). InnerVolumeSpecName "kube-api-access-lc5nx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:31:05 crc kubenswrapper[4750]: I1008 18:31:05.443456 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6757dc4d-417f-4ee5-a705-32cf0203cb39-scripts" (OuterVolumeSpecName: "scripts") pod "6757dc4d-417f-4ee5-a705-32cf0203cb39" (UID: "6757dc4d-417f-4ee5-a705-32cf0203cb39"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:31:05 crc kubenswrapper[4750]: I1008 18:31:05.474950 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6757dc4d-417f-4ee5-a705-32cf0203cb39-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6757dc4d-417f-4ee5-a705-32cf0203cb39" (UID: "6757dc4d-417f-4ee5-a705-32cf0203cb39"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:31:05 crc kubenswrapper[4750]: I1008 18:31:05.537659 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6757dc4d-417f-4ee5-a705-32cf0203cb39-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6757dc4d-417f-4ee5-a705-32cf0203cb39" (UID: "6757dc4d-417f-4ee5-a705-32cf0203cb39"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:31:05 crc kubenswrapper[4750]: I1008 18:31:05.538378 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lc5nx\" (UniqueName: \"kubernetes.io/projected/6757dc4d-417f-4ee5-a705-32cf0203cb39-kube-api-access-lc5nx\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:05 crc kubenswrapper[4750]: I1008 18:31:05.538400 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6757dc4d-417f-4ee5-a705-32cf0203cb39-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:05 crc kubenswrapper[4750]: I1008 18:31:05.538409 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6757dc4d-417f-4ee5-a705-32cf0203cb39-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:05 crc kubenswrapper[4750]: I1008 18:31:05.538418 4750 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6757dc4d-417f-4ee5-a705-32cf0203cb39-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:05 crc kubenswrapper[4750]: I1008 18:31:05.538426 4750 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6757dc4d-417f-4ee5-a705-32cf0203cb39-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:05 crc kubenswrapper[4750]: I1008 18:31:05.538433 4750 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6757dc4d-417f-4ee5-a705-32cf0203cb39-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:05 crc kubenswrapper[4750]: I1008 18:31:05.547894 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6757dc4d-417f-4ee5-a705-32cf0203cb39-config-data" (OuterVolumeSpecName: "config-data") pod "6757dc4d-417f-4ee5-a705-32cf0203cb39" (UID: "6757dc4d-417f-4ee5-a705-32cf0203cb39"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:31:05 crc kubenswrapper[4750]: I1008 18:31:05.640950 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6757dc4d-417f-4ee5-a705-32cf0203cb39-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:05 crc kubenswrapper[4750]: I1008 18:31:05.901962 4750 generic.go:334] "Generic (PLEG): container finished" podID="6757dc4d-417f-4ee5-a705-32cf0203cb39" containerID="ce8746ddceec675f97f535436ab598a5f6adfde4bad74ea69e12334b2ca2f974" exitCode=0 Oct 08 18:31:05 crc kubenswrapper[4750]: I1008 18:31:05.902000 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6757dc4d-417f-4ee5-a705-32cf0203cb39","Type":"ContainerDied","Data":"ce8746ddceec675f97f535436ab598a5f6adfde4bad74ea69e12334b2ca2f974"} Oct 08 18:31:05 crc kubenswrapper[4750]: I1008 18:31:05.902047 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6757dc4d-417f-4ee5-a705-32cf0203cb39","Type":"ContainerDied","Data":"e4c16370a8d799d54a1da3f833d07e05ea89e882e8640d4c42d8cec6204e004b"} Oct 08 18:31:05 crc kubenswrapper[4750]: I1008 18:31:05.902066 4750 scope.go:117] "RemoveContainer" containerID="bc43f25fa1070dc72f81a7df5e9dab184040bdb949affb3807455b09ce52d025" Oct 08 18:31:05 crc kubenswrapper[4750]: I1008 18:31:05.902138 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 18:31:05 crc kubenswrapper[4750]: I1008 18:31:05.936563 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 18:31:05 crc kubenswrapper[4750]: I1008 18:31:05.937697 4750 scope.go:117] "RemoveContainer" containerID="97ab7ebd2889c9d1c76fc352669db328a276b75c294806f628647edabd22ef2b" Oct 08 18:31:05 crc kubenswrapper[4750]: I1008 18:31:05.943769 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 08 18:31:05 crc kubenswrapper[4750]: I1008 18:31:05.958026 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 08 18:31:05 crc kubenswrapper[4750]: E1008 18:31:05.958402 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6757dc4d-417f-4ee5-a705-32cf0203cb39" containerName="proxy-httpd" Oct 08 18:31:05 crc kubenswrapper[4750]: I1008 18:31:05.958414 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="6757dc4d-417f-4ee5-a705-32cf0203cb39" containerName="proxy-httpd" Oct 08 18:31:05 crc kubenswrapper[4750]: E1008 18:31:05.958436 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6757dc4d-417f-4ee5-a705-32cf0203cb39" containerName="sg-core" Oct 08 18:31:05 crc kubenswrapper[4750]: I1008 18:31:05.958444 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="6757dc4d-417f-4ee5-a705-32cf0203cb39" containerName="sg-core" Oct 08 18:31:05 crc kubenswrapper[4750]: E1008 18:31:05.958468 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6757dc4d-417f-4ee5-a705-32cf0203cb39" containerName="ceilometer-central-agent" Oct 08 18:31:05 crc kubenswrapper[4750]: I1008 18:31:05.958476 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="6757dc4d-417f-4ee5-a705-32cf0203cb39" containerName="ceilometer-central-agent" Oct 08 18:31:05 crc kubenswrapper[4750]: E1008 18:31:05.958490 4750 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6757dc4d-417f-4ee5-a705-32cf0203cb39" containerName="ceilometer-notification-agent" Oct 08 18:31:05 crc kubenswrapper[4750]: I1008 18:31:05.958495 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="6757dc4d-417f-4ee5-a705-32cf0203cb39" containerName="ceilometer-notification-agent" Oct 08 18:31:05 crc kubenswrapper[4750]: I1008 18:31:05.958667 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="6757dc4d-417f-4ee5-a705-32cf0203cb39" containerName="sg-core" Oct 08 18:31:05 crc kubenswrapper[4750]: I1008 18:31:05.958682 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="6757dc4d-417f-4ee5-a705-32cf0203cb39" containerName="ceilometer-central-agent" Oct 08 18:31:05 crc kubenswrapper[4750]: I1008 18:31:05.958695 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="6757dc4d-417f-4ee5-a705-32cf0203cb39" containerName="proxy-httpd" Oct 08 18:31:05 crc kubenswrapper[4750]: I1008 18:31:05.958716 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="6757dc4d-417f-4ee5-a705-32cf0203cb39" containerName="ceilometer-notification-agent" Oct 08 18:31:05 crc kubenswrapper[4750]: I1008 18:31:05.960994 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 18:31:05 crc kubenswrapper[4750]: I1008 18:31:05.964183 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 08 18:31:05 crc kubenswrapper[4750]: I1008 18:31:05.965127 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 08 18:31:05 crc kubenswrapper[4750]: I1008 18:31:05.971191 4750 scope.go:117] "RemoveContainer" containerID="2838c747ad2a44a5ba3a1f15b89b0f5f171fd8814eb13cfeda6c47d83bd77db7" Oct 08 18:31:05 crc kubenswrapper[4750]: I1008 18:31:05.975616 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 18:31:06 crc kubenswrapper[4750]: I1008 18:31:05.999725 4750 scope.go:117] "RemoveContainer" containerID="ce8746ddceec675f97f535436ab598a5f6adfde4bad74ea69e12334b2ca2f974" Oct 08 18:31:06 crc kubenswrapper[4750]: I1008 18:31:06.024122 4750 scope.go:117] "RemoveContainer" containerID="bc43f25fa1070dc72f81a7df5e9dab184040bdb949affb3807455b09ce52d025" Oct 08 18:31:06 crc kubenswrapper[4750]: E1008 18:31:06.025004 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc43f25fa1070dc72f81a7df5e9dab184040bdb949affb3807455b09ce52d025\": container with ID starting with bc43f25fa1070dc72f81a7df5e9dab184040bdb949affb3807455b09ce52d025 not found: ID does not exist" containerID="bc43f25fa1070dc72f81a7df5e9dab184040bdb949affb3807455b09ce52d025" Oct 08 18:31:06 crc kubenswrapper[4750]: I1008 18:31:06.025048 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc43f25fa1070dc72f81a7df5e9dab184040bdb949affb3807455b09ce52d025"} err="failed to get container status \"bc43f25fa1070dc72f81a7df5e9dab184040bdb949affb3807455b09ce52d025\": rpc error: code = NotFound desc = could not find container \"bc43f25fa1070dc72f81a7df5e9dab184040bdb949affb3807455b09ce52d025\": 
container with ID starting with bc43f25fa1070dc72f81a7df5e9dab184040bdb949affb3807455b09ce52d025 not found: ID does not exist" Oct 08 18:31:06 crc kubenswrapper[4750]: I1008 18:31:06.025076 4750 scope.go:117] "RemoveContainer" containerID="97ab7ebd2889c9d1c76fc352669db328a276b75c294806f628647edabd22ef2b" Oct 08 18:31:06 crc kubenswrapper[4750]: E1008 18:31:06.025365 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97ab7ebd2889c9d1c76fc352669db328a276b75c294806f628647edabd22ef2b\": container with ID starting with 97ab7ebd2889c9d1c76fc352669db328a276b75c294806f628647edabd22ef2b not found: ID does not exist" containerID="97ab7ebd2889c9d1c76fc352669db328a276b75c294806f628647edabd22ef2b" Oct 08 18:31:06 crc kubenswrapper[4750]: I1008 18:31:06.025399 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97ab7ebd2889c9d1c76fc352669db328a276b75c294806f628647edabd22ef2b"} err="failed to get container status \"97ab7ebd2889c9d1c76fc352669db328a276b75c294806f628647edabd22ef2b\": rpc error: code = NotFound desc = could not find container \"97ab7ebd2889c9d1c76fc352669db328a276b75c294806f628647edabd22ef2b\": container with ID starting with 97ab7ebd2889c9d1c76fc352669db328a276b75c294806f628647edabd22ef2b not found: ID does not exist" Oct 08 18:31:06 crc kubenswrapper[4750]: I1008 18:31:06.025426 4750 scope.go:117] "RemoveContainer" containerID="2838c747ad2a44a5ba3a1f15b89b0f5f171fd8814eb13cfeda6c47d83bd77db7" Oct 08 18:31:06 crc kubenswrapper[4750]: E1008 18:31:06.025677 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2838c747ad2a44a5ba3a1f15b89b0f5f171fd8814eb13cfeda6c47d83bd77db7\": container with ID starting with 2838c747ad2a44a5ba3a1f15b89b0f5f171fd8814eb13cfeda6c47d83bd77db7 not found: ID does not exist" 
containerID="2838c747ad2a44a5ba3a1f15b89b0f5f171fd8814eb13cfeda6c47d83bd77db7" Oct 08 18:31:06 crc kubenswrapper[4750]: I1008 18:31:06.025697 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2838c747ad2a44a5ba3a1f15b89b0f5f171fd8814eb13cfeda6c47d83bd77db7"} err="failed to get container status \"2838c747ad2a44a5ba3a1f15b89b0f5f171fd8814eb13cfeda6c47d83bd77db7\": rpc error: code = NotFound desc = could not find container \"2838c747ad2a44a5ba3a1f15b89b0f5f171fd8814eb13cfeda6c47d83bd77db7\": container with ID starting with 2838c747ad2a44a5ba3a1f15b89b0f5f171fd8814eb13cfeda6c47d83bd77db7 not found: ID does not exist" Oct 08 18:31:06 crc kubenswrapper[4750]: I1008 18:31:06.025714 4750 scope.go:117] "RemoveContainer" containerID="ce8746ddceec675f97f535436ab598a5f6adfde4bad74ea69e12334b2ca2f974" Oct 08 18:31:06 crc kubenswrapper[4750]: E1008 18:31:06.026018 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce8746ddceec675f97f535436ab598a5f6adfde4bad74ea69e12334b2ca2f974\": container with ID starting with ce8746ddceec675f97f535436ab598a5f6adfde4bad74ea69e12334b2ca2f974 not found: ID does not exist" containerID="ce8746ddceec675f97f535436ab598a5f6adfde4bad74ea69e12334b2ca2f974" Oct 08 18:31:06 crc kubenswrapper[4750]: I1008 18:31:06.026036 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce8746ddceec675f97f535436ab598a5f6adfde4bad74ea69e12334b2ca2f974"} err="failed to get container status \"ce8746ddceec675f97f535436ab598a5f6adfde4bad74ea69e12334b2ca2f974\": rpc error: code = NotFound desc = could not find container \"ce8746ddceec675f97f535436ab598a5f6adfde4bad74ea69e12334b2ca2f974\": container with ID starting with ce8746ddceec675f97f535436ab598a5f6adfde4bad74ea69e12334b2ca2f974 not found: ID does not exist" Oct 08 18:31:06 crc kubenswrapper[4750]: I1008 18:31:06.047166 4750 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1b9dae8f-58c7-4841-82f8-9cb0b37af29b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1b9dae8f-58c7-4841-82f8-9cb0b37af29b\") " pod="openstack/ceilometer-0" Oct 08 18:31:06 crc kubenswrapper[4750]: I1008 18:31:06.047204 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rkgg\" (UniqueName: \"kubernetes.io/projected/1b9dae8f-58c7-4841-82f8-9cb0b37af29b-kube-api-access-4rkgg\") pod \"ceilometer-0\" (UID: \"1b9dae8f-58c7-4841-82f8-9cb0b37af29b\") " pod="openstack/ceilometer-0" Oct 08 18:31:06 crc kubenswrapper[4750]: I1008 18:31:06.048013 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b9dae8f-58c7-4841-82f8-9cb0b37af29b-log-httpd\") pod \"ceilometer-0\" (UID: \"1b9dae8f-58c7-4841-82f8-9cb0b37af29b\") " pod="openstack/ceilometer-0" Oct 08 18:31:06 crc kubenswrapper[4750]: I1008 18:31:06.048046 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b9dae8f-58c7-4841-82f8-9cb0b37af29b-scripts\") pod \"ceilometer-0\" (UID: \"1b9dae8f-58c7-4841-82f8-9cb0b37af29b\") " pod="openstack/ceilometer-0" Oct 08 18:31:06 crc kubenswrapper[4750]: I1008 18:31:06.048171 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b9dae8f-58c7-4841-82f8-9cb0b37af29b-config-data\") pod \"ceilometer-0\" (UID: \"1b9dae8f-58c7-4841-82f8-9cb0b37af29b\") " pod="openstack/ceilometer-0" Oct 08 18:31:06 crc kubenswrapper[4750]: I1008 18:31:06.048201 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/1b9dae8f-58c7-4841-82f8-9cb0b37af29b-run-httpd\") pod \"ceilometer-0\" (UID: \"1b9dae8f-58c7-4841-82f8-9cb0b37af29b\") " pod="openstack/ceilometer-0" Oct 08 18:31:06 crc kubenswrapper[4750]: I1008 18:31:06.048225 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b9dae8f-58c7-4841-82f8-9cb0b37af29b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1b9dae8f-58c7-4841-82f8-9cb0b37af29b\") " pod="openstack/ceilometer-0" Oct 08 18:31:06 crc kubenswrapper[4750]: I1008 18:31:06.150013 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b9dae8f-58c7-4841-82f8-9cb0b37af29b-config-data\") pod \"ceilometer-0\" (UID: \"1b9dae8f-58c7-4841-82f8-9cb0b37af29b\") " pod="openstack/ceilometer-0" Oct 08 18:31:06 crc kubenswrapper[4750]: I1008 18:31:06.150064 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b9dae8f-58c7-4841-82f8-9cb0b37af29b-run-httpd\") pod \"ceilometer-0\" (UID: \"1b9dae8f-58c7-4841-82f8-9cb0b37af29b\") " pod="openstack/ceilometer-0" Oct 08 18:31:06 crc kubenswrapper[4750]: I1008 18:31:06.150101 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b9dae8f-58c7-4841-82f8-9cb0b37af29b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1b9dae8f-58c7-4841-82f8-9cb0b37af29b\") " pod="openstack/ceilometer-0" Oct 08 18:31:06 crc kubenswrapper[4750]: I1008 18:31:06.150121 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1b9dae8f-58c7-4841-82f8-9cb0b37af29b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1b9dae8f-58c7-4841-82f8-9cb0b37af29b\") " pod="openstack/ceilometer-0" Oct 08 18:31:06 crc 
kubenswrapper[4750]: I1008 18:31:06.150137 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rkgg\" (UniqueName: \"kubernetes.io/projected/1b9dae8f-58c7-4841-82f8-9cb0b37af29b-kube-api-access-4rkgg\") pod \"ceilometer-0\" (UID: \"1b9dae8f-58c7-4841-82f8-9cb0b37af29b\") " pod="openstack/ceilometer-0" Oct 08 18:31:06 crc kubenswrapper[4750]: I1008 18:31:06.150211 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b9dae8f-58c7-4841-82f8-9cb0b37af29b-log-httpd\") pod \"ceilometer-0\" (UID: \"1b9dae8f-58c7-4841-82f8-9cb0b37af29b\") " pod="openstack/ceilometer-0" Oct 08 18:31:06 crc kubenswrapper[4750]: I1008 18:31:06.150249 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b9dae8f-58c7-4841-82f8-9cb0b37af29b-scripts\") pod \"ceilometer-0\" (UID: \"1b9dae8f-58c7-4841-82f8-9cb0b37af29b\") " pod="openstack/ceilometer-0" Oct 08 18:31:06 crc kubenswrapper[4750]: I1008 18:31:06.150534 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b9dae8f-58c7-4841-82f8-9cb0b37af29b-run-httpd\") pod \"ceilometer-0\" (UID: \"1b9dae8f-58c7-4841-82f8-9cb0b37af29b\") " pod="openstack/ceilometer-0" Oct 08 18:31:06 crc kubenswrapper[4750]: I1008 18:31:06.150783 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b9dae8f-58c7-4841-82f8-9cb0b37af29b-log-httpd\") pod \"ceilometer-0\" (UID: \"1b9dae8f-58c7-4841-82f8-9cb0b37af29b\") " pod="openstack/ceilometer-0" Oct 08 18:31:06 crc kubenswrapper[4750]: I1008 18:31:06.153816 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1b9dae8f-58c7-4841-82f8-9cb0b37af29b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"1b9dae8f-58c7-4841-82f8-9cb0b37af29b\") " pod="openstack/ceilometer-0" Oct 08 18:31:06 crc kubenswrapper[4750]: I1008 18:31:06.155056 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b9dae8f-58c7-4841-82f8-9cb0b37af29b-config-data\") pod \"ceilometer-0\" (UID: \"1b9dae8f-58c7-4841-82f8-9cb0b37af29b\") " pod="openstack/ceilometer-0" Oct 08 18:31:06 crc kubenswrapper[4750]: I1008 18:31:06.155117 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b9dae8f-58c7-4841-82f8-9cb0b37af29b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1b9dae8f-58c7-4841-82f8-9cb0b37af29b\") " pod="openstack/ceilometer-0" Oct 08 18:31:06 crc kubenswrapper[4750]: I1008 18:31:06.155447 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b9dae8f-58c7-4841-82f8-9cb0b37af29b-scripts\") pod \"ceilometer-0\" (UID: \"1b9dae8f-58c7-4841-82f8-9cb0b37af29b\") " pod="openstack/ceilometer-0" Oct 08 18:31:06 crc kubenswrapper[4750]: I1008 18:31:06.166038 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rkgg\" (UniqueName: \"kubernetes.io/projected/1b9dae8f-58c7-4841-82f8-9cb0b37af29b-kube-api-access-4rkgg\") pod \"ceilometer-0\" (UID: \"1b9dae8f-58c7-4841-82f8-9cb0b37af29b\") " pod="openstack/ceilometer-0" Oct 08 18:31:06 crc kubenswrapper[4750]: I1008 18:31:06.285309 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 18:31:06 crc kubenswrapper[4750]: I1008 18:31:06.718302 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 18:31:06 crc kubenswrapper[4750]: I1008 18:31:06.744774 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6757dc4d-417f-4ee5-a705-32cf0203cb39" path="/var/lib/kubelet/pods/6757dc4d-417f-4ee5-a705-32cf0203cb39/volumes" Oct 08 18:31:06 crc kubenswrapper[4750]: I1008 18:31:06.910585 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b9dae8f-58c7-4841-82f8-9cb0b37af29b","Type":"ContainerStarted","Data":"8f85c44d2f99d5d5093d0b82130fe74e10e5fb6edd9e311e77cc6bc35bf5e331"} Oct 08 18:31:07 crc kubenswrapper[4750]: I1008 18:31:07.921159 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b9dae8f-58c7-4841-82f8-9cb0b37af29b","Type":"ContainerStarted","Data":"1364739c761c3065bfc80f1d7b2eaabc00aad0bb9bf3a44b0b16f89ab3bb69c9"} Oct 08 18:31:08 crc kubenswrapper[4750]: I1008 18:31:08.931624 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b9dae8f-58c7-4841-82f8-9cb0b37af29b","Type":"ContainerStarted","Data":"148da3e0d1bb9d40e19fbc06b08276f602828a60fe6c71caab07af65fed0297e"} Oct 08 18:31:09 crc kubenswrapper[4750]: I1008 18:31:09.945843 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b9dae8f-58c7-4841-82f8-9cb0b37af29b","Type":"ContainerStarted","Data":"3c8343be783dcee750325e416eea9af5780cc5a6c8f96058e4257b642ee93b27"} Oct 08 18:31:09 crc kubenswrapper[4750]: I1008 18:31:09.947888 4750 generic.go:334] "Generic (PLEG): container finished" podID="65d622c4-c0ae-4cd3-bdab-101eb0783cc3" containerID="8e086cb3ed64079b1841a6cb5b0743cfbcec3a0c2ccd2a64235e1bcc59f94726" exitCode=0 Oct 08 18:31:09 crc kubenswrapper[4750]: I1008 18:31:09.947944 4750 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-9n56l" event={"ID":"65d622c4-c0ae-4cd3-bdab-101eb0783cc3","Type":"ContainerDied","Data":"8e086cb3ed64079b1841a6cb5b0743cfbcec3a0c2ccd2a64235e1bcc59f94726"} Oct 08 18:31:10 crc kubenswrapper[4750]: I1008 18:31:10.958963 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b9dae8f-58c7-4841-82f8-9cb0b37af29b","Type":"ContainerStarted","Data":"1ea5fa6432d4b81778294382a4ec09ac70111387f9a6ea8e463c289eb696c051"} Oct 08 18:31:10 crc kubenswrapper[4750]: I1008 18:31:10.992540 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.053903629 podStartE2EDuration="5.992524349s" podCreationTimestamp="2025-10-08 18:31:05 +0000 UTC" firstStartedPulling="2025-10-08 18:31:06.729291746 +0000 UTC m=+1222.642262759" lastFinishedPulling="2025-10-08 18:31:10.667912466 +0000 UTC m=+1226.580883479" observedRunningTime="2025-10-08 18:31:10.986162923 +0000 UTC m=+1226.899133956" watchObservedRunningTime="2025-10-08 18:31:10.992524349 +0000 UTC m=+1226.905495362" Oct 08 18:31:11 crc kubenswrapper[4750]: I1008 18:31:11.267041 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 08 18:31:11 crc kubenswrapper[4750]: I1008 18:31:11.490052 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-9n56l" Oct 08 18:31:11 crc kubenswrapper[4750]: I1008 18:31:11.547164 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65d622c4-c0ae-4cd3-bdab-101eb0783cc3-scripts\") pod \"65d622c4-c0ae-4cd3-bdab-101eb0783cc3\" (UID: \"65d622c4-c0ae-4cd3-bdab-101eb0783cc3\") " Oct 08 18:31:11 crc kubenswrapper[4750]: I1008 18:31:11.547364 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65d622c4-c0ae-4cd3-bdab-101eb0783cc3-combined-ca-bundle\") pod \"65d622c4-c0ae-4cd3-bdab-101eb0783cc3\" (UID: \"65d622c4-c0ae-4cd3-bdab-101eb0783cc3\") " Oct 08 18:31:11 crc kubenswrapper[4750]: I1008 18:31:11.547428 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nbwp\" (UniqueName: \"kubernetes.io/projected/65d622c4-c0ae-4cd3-bdab-101eb0783cc3-kube-api-access-6nbwp\") pod \"65d622c4-c0ae-4cd3-bdab-101eb0783cc3\" (UID: \"65d622c4-c0ae-4cd3-bdab-101eb0783cc3\") " Oct 08 18:31:11 crc kubenswrapper[4750]: I1008 18:31:11.547458 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65d622c4-c0ae-4cd3-bdab-101eb0783cc3-config-data\") pod \"65d622c4-c0ae-4cd3-bdab-101eb0783cc3\" (UID: \"65d622c4-c0ae-4cd3-bdab-101eb0783cc3\") " Oct 08 18:31:11 crc kubenswrapper[4750]: I1008 18:31:11.553914 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65d622c4-c0ae-4cd3-bdab-101eb0783cc3-kube-api-access-6nbwp" (OuterVolumeSpecName: "kube-api-access-6nbwp") pod "65d622c4-c0ae-4cd3-bdab-101eb0783cc3" (UID: "65d622c4-c0ae-4cd3-bdab-101eb0783cc3"). InnerVolumeSpecName "kube-api-access-6nbwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:31:11 crc kubenswrapper[4750]: I1008 18:31:11.555762 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65d622c4-c0ae-4cd3-bdab-101eb0783cc3-scripts" (OuterVolumeSpecName: "scripts") pod "65d622c4-c0ae-4cd3-bdab-101eb0783cc3" (UID: "65d622c4-c0ae-4cd3-bdab-101eb0783cc3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:31:11 crc kubenswrapper[4750]: I1008 18:31:11.577249 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65d622c4-c0ae-4cd3-bdab-101eb0783cc3-config-data" (OuterVolumeSpecName: "config-data") pod "65d622c4-c0ae-4cd3-bdab-101eb0783cc3" (UID: "65d622c4-c0ae-4cd3-bdab-101eb0783cc3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:31:11 crc kubenswrapper[4750]: I1008 18:31:11.588024 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65d622c4-c0ae-4cd3-bdab-101eb0783cc3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65d622c4-c0ae-4cd3-bdab-101eb0783cc3" (UID: "65d622c4-c0ae-4cd3-bdab-101eb0783cc3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:31:11 crc kubenswrapper[4750]: I1008 18:31:11.650414 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65d622c4-c0ae-4cd3-bdab-101eb0783cc3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:11 crc kubenswrapper[4750]: I1008 18:31:11.650452 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nbwp\" (UniqueName: \"kubernetes.io/projected/65d622c4-c0ae-4cd3-bdab-101eb0783cc3-kube-api-access-6nbwp\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:11 crc kubenswrapper[4750]: I1008 18:31:11.650464 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65d622c4-c0ae-4cd3-bdab-101eb0783cc3-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:11 crc kubenswrapper[4750]: I1008 18:31:11.650473 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65d622c4-c0ae-4cd3-bdab-101eb0783cc3-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:11 crc kubenswrapper[4750]: I1008 18:31:11.968736 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-9n56l" event={"ID":"65d622c4-c0ae-4cd3-bdab-101eb0783cc3","Type":"ContainerDied","Data":"92ddb6487bee146bf7d9f9629d471f4f733ce9733f97533459f6d699d5a543d9"} Oct 08 18:31:11 crc kubenswrapper[4750]: I1008 18:31:11.968770 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-9n56l" Oct 08 18:31:11 crc kubenswrapper[4750]: I1008 18:31:11.968791 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92ddb6487bee146bf7d9f9629d471f4f733ce9733f97533459f6d699d5a543d9" Oct 08 18:31:11 crc kubenswrapper[4750]: I1008 18:31:11.968904 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 08 18:31:12 crc kubenswrapper[4750]: I1008 18:31:12.063866 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 08 18:31:12 crc kubenswrapper[4750]: E1008 18:31:12.064513 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65d622c4-c0ae-4cd3-bdab-101eb0783cc3" containerName="nova-cell0-conductor-db-sync" Oct 08 18:31:12 crc kubenswrapper[4750]: I1008 18:31:12.064718 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="65d622c4-c0ae-4cd3-bdab-101eb0783cc3" containerName="nova-cell0-conductor-db-sync" Oct 08 18:31:12 crc kubenswrapper[4750]: I1008 18:31:12.064928 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="65d622c4-c0ae-4cd3-bdab-101eb0783cc3" containerName="nova-cell0-conductor-db-sync" Oct 08 18:31:12 crc kubenswrapper[4750]: I1008 18:31:12.065499 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 08 18:31:12 crc kubenswrapper[4750]: I1008 18:31:12.067455 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 08 18:31:12 crc kubenswrapper[4750]: I1008 18:31:12.080703 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-b8qmk" Oct 08 18:31:12 crc kubenswrapper[4750]: I1008 18:31:12.082716 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 08 18:31:12 crc kubenswrapper[4750]: I1008 18:31:12.159583 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nh8n\" (UniqueName: \"kubernetes.io/projected/038b3881-b266-4878-b395-87d7bf986446-kube-api-access-8nh8n\") pod \"nova-cell0-conductor-0\" (UID: \"038b3881-b266-4878-b395-87d7bf986446\") " pod="openstack/nova-cell0-conductor-0" Oct 08 18:31:12 crc kubenswrapper[4750]: I1008 18:31:12.160028 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/038b3881-b266-4878-b395-87d7bf986446-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"038b3881-b266-4878-b395-87d7bf986446\") " pod="openstack/nova-cell0-conductor-0" Oct 08 18:31:12 crc kubenswrapper[4750]: I1008 18:31:12.160159 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/038b3881-b266-4878-b395-87d7bf986446-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"038b3881-b266-4878-b395-87d7bf986446\") " pod="openstack/nova-cell0-conductor-0" Oct 08 18:31:12 crc kubenswrapper[4750]: I1008 18:31:12.261632 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/038b3881-b266-4878-b395-87d7bf986446-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"038b3881-b266-4878-b395-87d7bf986446\") " pod="openstack/nova-cell0-conductor-0" Oct 08 18:31:12 crc kubenswrapper[4750]: I1008 18:31:12.261710 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/038b3881-b266-4878-b395-87d7bf986446-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"038b3881-b266-4878-b395-87d7bf986446\") " pod="openstack/nova-cell0-conductor-0" Oct 08 18:31:12 crc kubenswrapper[4750]: I1008 18:31:12.261753 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nh8n\" (UniqueName: \"kubernetes.io/projected/038b3881-b266-4878-b395-87d7bf986446-kube-api-access-8nh8n\") pod \"nova-cell0-conductor-0\" (UID: \"038b3881-b266-4878-b395-87d7bf986446\") " pod="openstack/nova-cell0-conductor-0" Oct 08 18:31:12 crc kubenswrapper[4750]: I1008 18:31:12.268024 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/038b3881-b266-4878-b395-87d7bf986446-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"038b3881-b266-4878-b395-87d7bf986446\") " pod="openstack/nova-cell0-conductor-0" Oct 08 18:31:12 crc kubenswrapper[4750]: I1008 18:31:12.269219 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/038b3881-b266-4878-b395-87d7bf986446-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"038b3881-b266-4878-b395-87d7bf986446\") " pod="openstack/nova-cell0-conductor-0" Oct 08 18:31:12 crc kubenswrapper[4750]: I1008 18:31:12.277836 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nh8n\" (UniqueName: \"kubernetes.io/projected/038b3881-b266-4878-b395-87d7bf986446-kube-api-access-8nh8n\") pod \"nova-cell0-conductor-0\" 
(UID: \"038b3881-b266-4878-b395-87d7bf986446\") " pod="openstack/nova-cell0-conductor-0" Oct 08 18:31:12 crc kubenswrapper[4750]: I1008 18:31:12.382927 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 08 18:31:12 crc kubenswrapper[4750]: I1008 18:31:12.862333 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 08 18:31:12 crc kubenswrapper[4750]: W1008 18:31:12.867372 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod038b3881_b266_4878_b395_87d7bf986446.slice/crio-58b8db3b53f1a400f73a597c0554b6ffbc1c1d9f9071d20e55d07f52500acfd4 WatchSource:0}: Error finding container 58b8db3b53f1a400f73a597c0554b6ffbc1c1d9f9071d20e55d07f52500acfd4: Status 404 returned error can't find the container with id 58b8db3b53f1a400f73a597c0554b6ffbc1c1d9f9071d20e55d07f52500acfd4 Oct 08 18:31:12 crc kubenswrapper[4750]: I1008 18:31:12.981183 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"038b3881-b266-4878-b395-87d7bf986446","Type":"ContainerStarted","Data":"58b8db3b53f1a400f73a597c0554b6ffbc1c1d9f9071d20e55d07f52500acfd4"} Oct 08 18:31:13 crc kubenswrapper[4750]: I1008 18:31:13.993831 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"038b3881-b266-4878-b395-87d7bf986446","Type":"ContainerStarted","Data":"7b92d9e428bd59e74196a828e532527c2bee6127a6c6018b44f016e271d5efb0"} Oct 08 18:31:13 crc kubenswrapper[4750]: I1008 18:31:13.994673 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 08 18:31:14 crc kubenswrapper[4750]: I1008 18:31:14.012257 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.012234144 podStartE2EDuration="2.012234144s" 
podCreationTimestamp="2025-10-08 18:31:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:31:14.009485767 +0000 UTC m=+1229.922456800" watchObservedRunningTime="2025-10-08 18:31:14.012234144 +0000 UTC m=+1229.925205167" Oct 08 18:31:22 crc kubenswrapper[4750]: I1008 18:31:22.424038 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 08 18:31:22 crc kubenswrapper[4750]: I1008 18:31:22.887863 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-ftcwm"] Oct 08 18:31:22 crc kubenswrapper[4750]: I1008 18:31:22.889219 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-ftcwm" Oct 08 18:31:22 crc kubenswrapper[4750]: I1008 18:31:22.891274 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 08 18:31:22 crc kubenswrapper[4750]: I1008 18:31:22.892795 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 08 18:31:22 crc kubenswrapper[4750]: I1008 18:31:22.904831 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-ftcwm"] Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.059108 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg5tv\" (UniqueName: \"kubernetes.io/projected/1f613fe6-8980-4ded-8c2f-c4222c597cf1-kube-api-access-mg5tv\") pod \"nova-cell0-cell-mapping-ftcwm\" (UID: \"1f613fe6-8980-4ded-8c2f-c4222c597cf1\") " pod="openstack/nova-cell0-cell-mapping-ftcwm" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.059405 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1f613fe6-8980-4ded-8c2f-c4222c597cf1-config-data\") pod \"nova-cell0-cell-mapping-ftcwm\" (UID: \"1f613fe6-8980-4ded-8c2f-c4222c597cf1\") " pod="openstack/nova-cell0-cell-mapping-ftcwm" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.059612 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f613fe6-8980-4ded-8c2f-c4222c597cf1-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-ftcwm\" (UID: \"1f613fe6-8980-4ded-8c2f-c4222c597cf1\") " pod="openstack/nova-cell0-cell-mapping-ftcwm" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.059765 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f613fe6-8980-4ded-8c2f-c4222c597cf1-scripts\") pod \"nova-cell0-cell-mapping-ftcwm\" (UID: \"1f613fe6-8980-4ded-8c2f-c4222c597cf1\") " pod="openstack/nova-cell0-cell-mapping-ftcwm" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.163921 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f613fe6-8980-4ded-8c2f-c4222c597cf1-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-ftcwm\" (UID: \"1f613fe6-8980-4ded-8c2f-c4222c597cf1\") " pod="openstack/nova-cell0-cell-mapping-ftcwm" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.164047 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f613fe6-8980-4ded-8c2f-c4222c597cf1-scripts\") pod \"nova-cell0-cell-mapping-ftcwm\" (UID: \"1f613fe6-8980-4ded-8c2f-c4222c597cf1\") " pod="openstack/nova-cell0-cell-mapping-ftcwm" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.164203 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mg5tv\" (UniqueName: 
\"kubernetes.io/projected/1f613fe6-8980-4ded-8c2f-c4222c597cf1-kube-api-access-mg5tv\") pod \"nova-cell0-cell-mapping-ftcwm\" (UID: \"1f613fe6-8980-4ded-8c2f-c4222c597cf1\") " pod="openstack/nova-cell0-cell-mapping-ftcwm" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.164259 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f613fe6-8980-4ded-8c2f-c4222c597cf1-config-data\") pod \"nova-cell0-cell-mapping-ftcwm\" (UID: \"1f613fe6-8980-4ded-8c2f-c4222c597cf1\") " pod="openstack/nova-cell0-cell-mapping-ftcwm" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.177427 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.179958 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.182071 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f613fe6-8980-4ded-8c2f-c4222c597cf1-config-data\") pod \"nova-cell0-cell-mapping-ftcwm\" (UID: \"1f613fe6-8980-4ded-8c2f-c4222c597cf1\") " pod="openstack/nova-cell0-cell-mapping-ftcwm" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.184746 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f613fe6-8980-4ded-8c2f-c4222c597cf1-scripts\") pod \"nova-cell0-cell-mapping-ftcwm\" (UID: \"1f613fe6-8980-4ded-8c2f-c4222c597cf1\") " pod="openstack/nova-cell0-cell-mapping-ftcwm" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.194070 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.194859 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1f613fe6-8980-4ded-8c2f-c4222c597cf1-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-ftcwm\" (UID: \"1f613fe6-8980-4ded-8c2f-c4222c597cf1\") " pod="openstack/nova-cell0-cell-mapping-ftcwm" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.228732 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg5tv\" (UniqueName: \"kubernetes.io/projected/1f613fe6-8980-4ded-8c2f-c4222c597cf1-kube-api-access-mg5tv\") pod \"nova-cell0-cell-mapping-ftcwm\" (UID: \"1f613fe6-8980-4ded-8c2f-c4222c597cf1\") " pod="openstack/nova-cell0-cell-mapping-ftcwm" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.248395 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.260763 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.262372 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.273379 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.305195 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.308604 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.318832 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.318994 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.333734 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.373148 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/924f1d18-46bd-420b-8250-6100ae1c7120-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"924f1d18-46bd-420b-8250-6100ae1c7120\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.373195 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90f77db6-aa1c-4d0c-8598-51de62f090d5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"90f77db6-aa1c-4d0c-8598-51de62f090d5\") " pod="openstack/nova-api-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.373330 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90f77db6-aa1c-4d0c-8598-51de62f090d5-config-data\") pod \"nova-api-0\" (UID: \"90f77db6-aa1c-4d0c-8598-51de62f090d5\") " pod="openstack/nova-api-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.373370 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8prhb\" (UniqueName: \"kubernetes.io/projected/90f77db6-aa1c-4d0c-8598-51de62f090d5-kube-api-access-8prhb\") pod \"nova-api-0\" (UID: 
\"90f77db6-aa1c-4d0c-8598-51de62f090d5\") " pod="openstack/nova-api-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.373427 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/924f1d18-46bd-420b-8250-6100ae1c7120-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"924f1d18-46bd-420b-8250-6100ae1c7120\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.373500 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvxmh\" (UniqueName: \"kubernetes.io/projected/924f1d18-46bd-420b-8250-6100ae1c7120-kube-api-access-bvxmh\") pod \"nova-cell1-novncproxy-0\" (UID: \"924f1d18-46bd-420b-8250-6100ae1c7120\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.373676 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90f77db6-aa1c-4d0c-8598-51de62f090d5-logs\") pod \"nova-api-0\" (UID: \"90f77db6-aa1c-4d0c-8598-51de62f090d5\") " pod="openstack/nova-api-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.384606 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6ffc974fdf-tl7wv"] Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.386766 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ffc974fdf-tl7wv" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.396096 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6ffc974fdf-tl7wv"] Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.415525 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.416731 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.419834 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.453591 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.477128 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90f77db6-aa1c-4d0c-8598-51de62f090d5-config-data\") pod \"nova-api-0\" (UID: \"90f77db6-aa1c-4d0c-8598-51de62f090d5\") " pod="openstack/nova-api-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.477211 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8prhb\" (UniqueName: \"kubernetes.io/projected/90f77db6-aa1c-4d0c-8598-51de62f090d5-kube-api-access-8prhb\") pod \"nova-api-0\" (UID: \"90f77db6-aa1c-4d0c-8598-51de62f090d5\") " pod="openstack/nova-api-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.477593 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c72144e8-79ad-4e02-91fe-bafa5de04a58-config-data\") pod \"nova-metadata-0\" (UID: \"c72144e8-79ad-4e02-91fe-bafa5de04a58\") " pod="openstack/nova-metadata-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.477710 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c72144e8-79ad-4e02-91fe-bafa5de04a58-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c72144e8-79ad-4e02-91fe-bafa5de04a58\") " pod="openstack/nova-metadata-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.477816 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/924f1d18-46bd-420b-8250-6100ae1c7120-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"924f1d18-46bd-420b-8250-6100ae1c7120\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.477905 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvxmh\" (UniqueName: \"kubernetes.io/projected/924f1d18-46bd-420b-8250-6100ae1c7120-kube-api-access-bvxmh\") pod \"nova-cell1-novncproxy-0\" (UID: \"924f1d18-46bd-420b-8250-6100ae1c7120\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.478039 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90f77db6-aa1c-4d0c-8598-51de62f090d5-logs\") pod \"nova-api-0\" (UID: \"90f77db6-aa1c-4d0c-8598-51de62f090d5\") " pod="openstack/nova-api-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.478136 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c72144e8-79ad-4e02-91fe-bafa5de04a58-logs\") pod \"nova-metadata-0\" (UID: \"c72144e8-79ad-4e02-91fe-bafa5de04a58\") " pod="openstack/nova-metadata-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.478212 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/924f1d18-46bd-420b-8250-6100ae1c7120-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"924f1d18-46bd-420b-8250-6100ae1c7120\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.478288 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90f77db6-aa1c-4d0c-8598-51de62f090d5-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"90f77db6-aa1c-4d0c-8598-51de62f090d5\") " pod="openstack/nova-api-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.478375 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq5q4\" (UniqueName: \"kubernetes.io/projected/c72144e8-79ad-4e02-91fe-bafa5de04a58-kube-api-access-kq5q4\") pod \"nova-metadata-0\" (UID: \"c72144e8-79ad-4e02-91fe-bafa5de04a58\") " pod="openstack/nova-metadata-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.479721 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90f77db6-aa1c-4d0c-8598-51de62f090d5-logs\") pod \"nova-api-0\" (UID: \"90f77db6-aa1c-4d0c-8598-51de62f090d5\") " pod="openstack/nova-api-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.483764 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/924f1d18-46bd-420b-8250-6100ae1c7120-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"924f1d18-46bd-420b-8250-6100ae1c7120\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.488722 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90f77db6-aa1c-4d0c-8598-51de62f090d5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"90f77db6-aa1c-4d0c-8598-51de62f090d5\") " pod="openstack/nova-api-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.495227 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90f77db6-aa1c-4d0c-8598-51de62f090d5-config-data\") pod \"nova-api-0\" (UID: \"90f77db6-aa1c-4d0c-8598-51de62f090d5\") " pod="openstack/nova-api-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.497229 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/924f1d18-46bd-420b-8250-6100ae1c7120-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"924f1d18-46bd-420b-8250-6100ae1c7120\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.499315 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvxmh\" (UniqueName: \"kubernetes.io/projected/924f1d18-46bd-420b-8250-6100ae1c7120-kube-api-access-bvxmh\") pod \"nova-cell1-novncproxy-0\" (UID: \"924f1d18-46bd-420b-8250-6100ae1c7120\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.502212 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8prhb\" (UniqueName: \"kubernetes.io/projected/90f77db6-aa1c-4d0c-8598-51de62f090d5-kube-api-access-8prhb\") pod \"nova-api-0\" (UID: \"90f77db6-aa1c-4d0c-8598-51de62f090d5\") " pod="openstack/nova-api-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.518053 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-ftcwm" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.582679 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kq5q4\" (UniqueName: \"kubernetes.io/projected/c72144e8-79ad-4e02-91fe-bafa5de04a58-kube-api-access-kq5q4\") pod \"nova-metadata-0\" (UID: \"c72144e8-79ad-4e02-91fe-bafa5de04a58\") " pod="openstack/nova-metadata-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.582997 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7968787-4100-4e44-b289-0511fe895128-ovsdbserver-nb\") pod \"dnsmasq-dns-6ffc974fdf-tl7wv\" (UID: \"f7968787-4100-4e44-b289-0511fe895128\") " pod="openstack/dnsmasq-dns-6ffc974fdf-tl7wv" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.583020 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7968787-4100-4e44-b289-0511fe895128-ovsdbserver-sb\") pod \"dnsmasq-dns-6ffc974fdf-tl7wv\" (UID: \"f7968787-4100-4e44-b289-0511fe895128\") " pod="openstack/dnsmasq-dns-6ffc974fdf-tl7wv" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.583156 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f7968787-4100-4e44-b289-0511fe895128-dns-swift-storage-0\") pod \"dnsmasq-dns-6ffc974fdf-tl7wv\" (UID: \"f7968787-4100-4e44-b289-0511fe895128\") " pod="openstack/dnsmasq-dns-6ffc974fdf-tl7wv" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.583192 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fwzl\" (UniqueName: \"kubernetes.io/projected/f7968787-4100-4e44-b289-0511fe895128-kube-api-access-4fwzl\") pod 
\"dnsmasq-dns-6ffc974fdf-tl7wv\" (UID: \"f7968787-4100-4e44-b289-0511fe895128\") " pod="openstack/dnsmasq-dns-6ffc974fdf-tl7wv" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.583236 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcc5w\" (UniqueName: \"kubernetes.io/projected/4460ef21-2426-4fc2-bba3-147fdd612a0c-kube-api-access-kcc5w\") pod \"nova-scheduler-0\" (UID: \"4460ef21-2426-4fc2-bba3-147fdd612a0c\") " pod="openstack/nova-scheduler-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.583319 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c72144e8-79ad-4e02-91fe-bafa5de04a58-config-data\") pod \"nova-metadata-0\" (UID: \"c72144e8-79ad-4e02-91fe-bafa5de04a58\") " pod="openstack/nova-metadata-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.583344 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c72144e8-79ad-4e02-91fe-bafa5de04a58-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c72144e8-79ad-4e02-91fe-bafa5de04a58\") " pod="openstack/nova-metadata-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.583504 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7968787-4100-4e44-b289-0511fe895128-dns-svc\") pod \"dnsmasq-dns-6ffc974fdf-tl7wv\" (UID: \"f7968787-4100-4e44-b289-0511fe895128\") " pod="openstack/dnsmasq-dns-6ffc974fdf-tl7wv" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.583574 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4460ef21-2426-4fc2-bba3-147fdd612a0c-config-data\") pod \"nova-scheduler-0\" (UID: \"4460ef21-2426-4fc2-bba3-147fdd612a0c\") " 
pod="openstack/nova-scheduler-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.583592 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4460ef21-2426-4fc2-bba3-147fdd612a0c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4460ef21-2426-4fc2-bba3-147fdd612a0c\") " pod="openstack/nova-scheduler-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.583628 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7968787-4100-4e44-b289-0511fe895128-config\") pod \"dnsmasq-dns-6ffc974fdf-tl7wv\" (UID: \"f7968787-4100-4e44-b289-0511fe895128\") " pod="openstack/dnsmasq-dns-6ffc974fdf-tl7wv" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.583653 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c72144e8-79ad-4e02-91fe-bafa5de04a58-logs\") pod \"nova-metadata-0\" (UID: \"c72144e8-79ad-4e02-91fe-bafa5de04a58\") " pod="openstack/nova-metadata-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.584716 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c72144e8-79ad-4e02-91fe-bafa5de04a58-logs\") pod \"nova-metadata-0\" (UID: \"c72144e8-79ad-4e02-91fe-bafa5de04a58\") " pod="openstack/nova-metadata-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.589452 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c72144e8-79ad-4e02-91fe-bafa5de04a58-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c72144e8-79ad-4e02-91fe-bafa5de04a58\") " pod="openstack/nova-metadata-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.590792 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/c72144e8-79ad-4e02-91fe-bafa5de04a58-config-data\") pod \"nova-metadata-0\" (UID: \"c72144e8-79ad-4e02-91fe-bafa5de04a58\") " pod="openstack/nova-metadata-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.602211 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq5q4\" (UniqueName: \"kubernetes.io/projected/c72144e8-79ad-4e02-91fe-bafa5de04a58-kube-api-access-kq5q4\") pod \"nova-metadata-0\" (UID: \"c72144e8-79ad-4e02-91fe-bafa5de04a58\") " pod="openstack/nova-metadata-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.619157 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.638771 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.667714 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.685033 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7968787-4100-4e44-b289-0511fe895128-config\") pod \"dnsmasq-dns-6ffc974fdf-tl7wv\" (UID: \"f7968787-4100-4e44-b289-0511fe895128\") " pod="openstack/dnsmasq-dns-6ffc974fdf-tl7wv" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.685119 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7968787-4100-4e44-b289-0511fe895128-ovsdbserver-nb\") pod \"dnsmasq-dns-6ffc974fdf-tl7wv\" (UID: \"f7968787-4100-4e44-b289-0511fe895128\") " pod="openstack/dnsmasq-dns-6ffc974fdf-tl7wv" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.685167 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7968787-4100-4e44-b289-0511fe895128-ovsdbserver-sb\") pod \"dnsmasq-dns-6ffc974fdf-tl7wv\" (UID: \"f7968787-4100-4e44-b289-0511fe895128\") " pod="openstack/dnsmasq-dns-6ffc974fdf-tl7wv" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.685217 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f7968787-4100-4e44-b289-0511fe895128-dns-swift-storage-0\") pod \"dnsmasq-dns-6ffc974fdf-tl7wv\" (UID: \"f7968787-4100-4e44-b289-0511fe895128\") " pod="openstack/dnsmasq-dns-6ffc974fdf-tl7wv" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.685238 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fwzl\" (UniqueName: \"kubernetes.io/projected/f7968787-4100-4e44-b289-0511fe895128-kube-api-access-4fwzl\") pod \"dnsmasq-dns-6ffc974fdf-tl7wv\" (UID: \"f7968787-4100-4e44-b289-0511fe895128\") " 
pod="openstack/dnsmasq-dns-6ffc974fdf-tl7wv" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.685261 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcc5w\" (UniqueName: \"kubernetes.io/projected/4460ef21-2426-4fc2-bba3-147fdd612a0c-kube-api-access-kcc5w\") pod \"nova-scheduler-0\" (UID: \"4460ef21-2426-4fc2-bba3-147fdd612a0c\") " pod="openstack/nova-scheduler-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.685338 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7968787-4100-4e44-b289-0511fe895128-dns-svc\") pod \"dnsmasq-dns-6ffc974fdf-tl7wv\" (UID: \"f7968787-4100-4e44-b289-0511fe895128\") " pod="openstack/dnsmasq-dns-6ffc974fdf-tl7wv" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.685369 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4460ef21-2426-4fc2-bba3-147fdd612a0c-config-data\") pod \"nova-scheduler-0\" (UID: \"4460ef21-2426-4fc2-bba3-147fdd612a0c\") " pod="openstack/nova-scheduler-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.685385 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4460ef21-2426-4fc2-bba3-147fdd612a0c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4460ef21-2426-4fc2-bba3-147fdd612a0c\") " pod="openstack/nova-scheduler-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.689205 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4460ef21-2426-4fc2-bba3-147fdd612a0c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4460ef21-2426-4fc2-bba3-147fdd612a0c\") " pod="openstack/nova-scheduler-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.690199 4750 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7968787-4100-4e44-b289-0511fe895128-dns-svc\") pod \"dnsmasq-dns-6ffc974fdf-tl7wv\" (UID: \"f7968787-4100-4e44-b289-0511fe895128\") " pod="openstack/dnsmasq-dns-6ffc974fdf-tl7wv" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.690430 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7968787-4100-4e44-b289-0511fe895128-ovsdbserver-nb\") pod \"dnsmasq-dns-6ffc974fdf-tl7wv\" (UID: \"f7968787-4100-4e44-b289-0511fe895128\") " pod="openstack/dnsmasq-dns-6ffc974fdf-tl7wv" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.691238 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7968787-4100-4e44-b289-0511fe895128-ovsdbserver-sb\") pod \"dnsmasq-dns-6ffc974fdf-tl7wv\" (UID: \"f7968787-4100-4e44-b289-0511fe895128\") " pod="openstack/dnsmasq-dns-6ffc974fdf-tl7wv" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.693140 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7968787-4100-4e44-b289-0511fe895128-config\") pod \"dnsmasq-dns-6ffc974fdf-tl7wv\" (UID: \"f7968787-4100-4e44-b289-0511fe895128\") " pod="openstack/dnsmasq-dns-6ffc974fdf-tl7wv" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.700251 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f7968787-4100-4e44-b289-0511fe895128-dns-swift-storage-0\") pod \"dnsmasq-dns-6ffc974fdf-tl7wv\" (UID: \"f7968787-4100-4e44-b289-0511fe895128\") " pod="openstack/dnsmasq-dns-6ffc974fdf-tl7wv" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.707568 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4460ef21-2426-4fc2-bba3-147fdd612a0c-config-data\") pod \"nova-scheduler-0\" (UID: \"4460ef21-2426-4fc2-bba3-147fdd612a0c\") " pod="openstack/nova-scheduler-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.714168 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcc5w\" (UniqueName: \"kubernetes.io/projected/4460ef21-2426-4fc2-bba3-147fdd612a0c-kube-api-access-kcc5w\") pod \"nova-scheduler-0\" (UID: \"4460ef21-2426-4fc2-bba3-147fdd612a0c\") " pod="openstack/nova-scheduler-0" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.726395 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fwzl\" (UniqueName: \"kubernetes.io/projected/f7968787-4100-4e44-b289-0511fe895128-kube-api-access-4fwzl\") pod \"dnsmasq-dns-6ffc974fdf-tl7wv\" (UID: \"f7968787-4100-4e44-b289-0511fe895128\") " pod="openstack/dnsmasq-dns-6ffc974fdf-tl7wv" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.754946 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ffc974fdf-tl7wv" Oct 08 18:31:23 crc kubenswrapper[4750]: I1008 18:31:23.763114 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 18:31:24 crc kubenswrapper[4750]: I1008 18:31:24.009612 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-kxxb9"] Oct 08 18:31:24 crc kubenswrapper[4750]: I1008 18:31:24.010878 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-kxxb9" Oct 08 18:31:24 crc kubenswrapper[4750]: I1008 18:31:24.013883 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 08 18:31:24 crc kubenswrapper[4750]: I1008 18:31:24.014215 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 08 18:31:24 crc kubenswrapper[4750]: I1008 18:31:24.017242 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-kxxb9"] Oct 08 18:31:24 crc kubenswrapper[4750]: I1008 18:31:24.096806 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-ftcwm"] Oct 08 18:31:24 crc kubenswrapper[4750]: I1008 18:31:24.198361 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c616741-8a43-456e-a249-aee7e4d3764f-scripts\") pod \"nova-cell1-conductor-db-sync-kxxb9\" (UID: \"6c616741-8a43-456e-a249-aee7e4d3764f\") " pod="openstack/nova-cell1-conductor-db-sync-kxxb9" Oct 08 18:31:24 crc kubenswrapper[4750]: I1008 18:31:24.198876 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p66pl\" (UniqueName: \"kubernetes.io/projected/6c616741-8a43-456e-a249-aee7e4d3764f-kube-api-access-p66pl\") pod \"nova-cell1-conductor-db-sync-kxxb9\" (UID: \"6c616741-8a43-456e-a249-aee7e4d3764f\") " pod="openstack/nova-cell1-conductor-db-sync-kxxb9" Oct 08 18:31:24 crc kubenswrapper[4750]: I1008 18:31:24.198952 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c616741-8a43-456e-a249-aee7e4d3764f-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-kxxb9\" (UID: \"6c616741-8a43-456e-a249-aee7e4d3764f\") " 
pod="openstack/nova-cell1-conductor-db-sync-kxxb9" Oct 08 18:31:24 crc kubenswrapper[4750]: I1008 18:31:24.198991 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c616741-8a43-456e-a249-aee7e4d3764f-config-data\") pod \"nova-cell1-conductor-db-sync-kxxb9\" (UID: \"6c616741-8a43-456e-a249-aee7e4d3764f\") " pod="openstack/nova-cell1-conductor-db-sync-kxxb9" Oct 08 18:31:24 crc kubenswrapper[4750]: I1008 18:31:24.202733 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 18:31:24 crc kubenswrapper[4750]: I1008 18:31:24.231674 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 18:31:24 crc kubenswrapper[4750]: I1008 18:31:24.301592 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c616741-8a43-456e-a249-aee7e4d3764f-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-kxxb9\" (UID: \"6c616741-8a43-456e-a249-aee7e4d3764f\") " pod="openstack/nova-cell1-conductor-db-sync-kxxb9" Oct 08 18:31:24 crc kubenswrapper[4750]: I1008 18:31:24.301652 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c616741-8a43-456e-a249-aee7e4d3764f-config-data\") pod \"nova-cell1-conductor-db-sync-kxxb9\" (UID: \"6c616741-8a43-456e-a249-aee7e4d3764f\") " pod="openstack/nova-cell1-conductor-db-sync-kxxb9" Oct 08 18:31:24 crc kubenswrapper[4750]: I1008 18:31:24.301723 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c616741-8a43-456e-a249-aee7e4d3764f-scripts\") pod \"nova-cell1-conductor-db-sync-kxxb9\" (UID: \"6c616741-8a43-456e-a249-aee7e4d3764f\") " pod="openstack/nova-cell1-conductor-db-sync-kxxb9" Oct 08 18:31:24 crc kubenswrapper[4750]: I1008 
18:31:24.301779 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p66pl\" (UniqueName: \"kubernetes.io/projected/6c616741-8a43-456e-a249-aee7e4d3764f-kube-api-access-p66pl\") pod \"nova-cell1-conductor-db-sync-kxxb9\" (UID: \"6c616741-8a43-456e-a249-aee7e4d3764f\") " pod="openstack/nova-cell1-conductor-db-sync-kxxb9" Oct 08 18:31:24 crc kubenswrapper[4750]: I1008 18:31:24.312586 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c616741-8a43-456e-a249-aee7e4d3764f-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-kxxb9\" (UID: \"6c616741-8a43-456e-a249-aee7e4d3764f\") " pod="openstack/nova-cell1-conductor-db-sync-kxxb9" Oct 08 18:31:24 crc kubenswrapper[4750]: I1008 18:31:24.328273 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c616741-8a43-456e-a249-aee7e4d3764f-config-data\") pod \"nova-cell1-conductor-db-sync-kxxb9\" (UID: \"6c616741-8a43-456e-a249-aee7e4d3764f\") " pod="openstack/nova-cell1-conductor-db-sync-kxxb9" Oct 08 18:31:24 crc kubenswrapper[4750]: I1008 18:31:24.335102 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c616741-8a43-456e-a249-aee7e4d3764f-scripts\") pod \"nova-cell1-conductor-db-sync-kxxb9\" (UID: \"6c616741-8a43-456e-a249-aee7e4d3764f\") " pod="openstack/nova-cell1-conductor-db-sync-kxxb9" Oct 08 18:31:24 crc kubenswrapper[4750]: I1008 18:31:24.336030 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p66pl\" (UniqueName: \"kubernetes.io/projected/6c616741-8a43-456e-a249-aee7e4d3764f-kube-api-access-p66pl\") pod \"nova-cell1-conductor-db-sync-kxxb9\" (UID: \"6c616741-8a43-456e-a249-aee7e4d3764f\") " pod="openstack/nova-cell1-conductor-db-sync-kxxb9" Oct 08 18:31:24 crc kubenswrapper[4750]: I1008 18:31:24.341073 
4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-kxxb9" Oct 08 18:31:24 crc kubenswrapper[4750]: I1008 18:31:24.379921 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 18:31:24 crc kubenswrapper[4750]: I1008 18:31:24.453229 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 18:31:24 crc kubenswrapper[4750]: I1008 18:31:24.512075 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6ffc974fdf-tl7wv"] Oct 08 18:31:24 crc kubenswrapper[4750]: W1008 18:31:24.521862 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4460ef21_2426_4fc2_bba3_147fdd612a0c.slice/crio-5d9d23d7d6037a4011884b0e12bf635fd88493b701923002ffce41e007d31d3b WatchSource:0}: Error finding container 5d9d23d7d6037a4011884b0e12bf635fd88493b701923002ffce41e007d31d3b: Status 404 returned error can't find the container with id 5d9d23d7d6037a4011884b0e12bf635fd88493b701923002ffce41e007d31d3b Oct 08 18:31:24 crc kubenswrapper[4750]: I1008 18:31:24.930489 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-kxxb9"] Oct 08 18:31:24 crc kubenswrapper[4750]: W1008 18:31:24.931015 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c616741_8a43_456e_a249_aee7e4d3764f.slice/crio-696e58c930f2a4eb34503f3f66372c56fe5d870dfc6f0e1bfede49cd63181314 WatchSource:0}: Error finding container 696e58c930f2a4eb34503f3f66372c56fe5d870dfc6f0e1bfede49cd63181314: Status 404 returned error can't find the container with id 696e58c930f2a4eb34503f3f66372c56fe5d870dfc6f0e1bfede49cd63181314 Oct 08 18:31:25 crc kubenswrapper[4750]: I1008 18:31:25.109610 4750 generic.go:334] "Generic (PLEG): container finished" 
podID="f7968787-4100-4e44-b289-0511fe895128" containerID="122c652fb6fa64bb4ca43c1395d1689ed663f70dc2e7d93218a62b55f12cc350" exitCode=0 Oct 08 18:31:25 crc kubenswrapper[4750]: I1008 18:31:25.109964 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ffc974fdf-tl7wv" event={"ID":"f7968787-4100-4e44-b289-0511fe895128","Type":"ContainerDied","Data":"122c652fb6fa64bb4ca43c1395d1689ed663f70dc2e7d93218a62b55f12cc350"} Oct 08 18:31:25 crc kubenswrapper[4750]: I1008 18:31:25.110022 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ffc974fdf-tl7wv" event={"ID":"f7968787-4100-4e44-b289-0511fe895128","Type":"ContainerStarted","Data":"948d4d1fc08cc6a261b2b0e6273ecc24d68f6ca1a93578b39fea5101cb3169ff"} Oct 08 18:31:25 crc kubenswrapper[4750]: I1008 18:31:25.115816 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-kxxb9" event={"ID":"6c616741-8a43-456e-a249-aee7e4d3764f","Type":"ContainerStarted","Data":"696e58c930f2a4eb34503f3f66372c56fe5d870dfc6f0e1bfede49cd63181314"} Oct 08 18:31:25 crc kubenswrapper[4750]: I1008 18:31:25.119109 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"924f1d18-46bd-420b-8250-6100ae1c7120","Type":"ContainerStarted","Data":"ef3acda9cc3a846bd768d47f43084e805f1568bdcf8bd4589ecfdfa8032c1aed"} Oct 08 18:31:25 crc kubenswrapper[4750]: I1008 18:31:25.120598 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4460ef21-2426-4fc2-bba3-147fdd612a0c","Type":"ContainerStarted","Data":"5d9d23d7d6037a4011884b0e12bf635fd88493b701923002ffce41e007d31d3b"} Oct 08 18:31:25 crc kubenswrapper[4750]: I1008 18:31:25.124005 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-ftcwm" 
event={"ID":"1f613fe6-8980-4ded-8c2f-c4222c597cf1","Type":"ContainerStarted","Data":"1b2e074006d40f2b6f4feb94e1154de720671b362e1e247d9fcda194a8920fbe"} Oct 08 18:31:25 crc kubenswrapper[4750]: I1008 18:31:25.124051 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-ftcwm" event={"ID":"1f613fe6-8980-4ded-8c2f-c4222c597cf1","Type":"ContainerStarted","Data":"2201a273fa8ec2a7be98c0582117e5f6bf636bef133a5b367ee175447ea57b52"} Oct 08 18:31:25 crc kubenswrapper[4750]: I1008 18:31:25.141107 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c72144e8-79ad-4e02-91fe-bafa5de04a58","Type":"ContainerStarted","Data":"d6a61689980b2c4b15c5bd87e7994e8805df81a2d75b23439652e542b6f4907a"} Oct 08 18:31:25 crc kubenswrapper[4750]: I1008 18:31:25.152775 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"90f77db6-aa1c-4d0c-8598-51de62f090d5","Type":"ContainerStarted","Data":"7a80924022559a90007625acbeb30d74a2105b59f60b48e3dd488328b27d5b85"} Oct 08 18:31:25 crc kubenswrapper[4750]: I1008 18:31:25.158526 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-ftcwm" podStartSLOduration=3.158513018 podStartE2EDuration="3.158513018s" podCreationTimestamp="2025-10-08 18:31:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:31:25.14879319 +0000 UTC m=+1241.061764203" watchObservedRunningTime="2025-10-08 18:31:25.158513018 +0000 UTC m=+1241.071484051" Oct 08 18:31:26 crc kubenswrapper[4750]: I1008 18:31:26.167352 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ffc974fdf-tl7wv" event={"ID":"f7968787-4100-4e44-b289-0511fe895128","Type":"ContainerStarted","Data":"a1e16b95c1e9f4b306eabd552266a1ee9b208c2b5a53dcd2a29124e046deb750"} Oct 08 18:31:26 crc kubenswrapper[4750]: I1008 
18:31:26.167659 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6ffc974fdf-tl7wv" Oct 08 18:31:26 crc kubenswrapper[4750]: I1008 18:31:26.170927 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-kxxb9" event={"ID":"6c616741-8a43-456e-a249-aee7e4d3764f","Type":"ContainerStarted","Data":"0ba5b8a05e8f46a8ac74a1a7956f2c4cec61b825654ab0658e1c4de2d90cdd8b"} Oct 08 18:31:26 crc kubenswrapper[4750]: I1008 18:31:26.199132 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6ffc974fdf-tl7wv" podStartSLOduration=3.199114665 podStartE2EDuration="3.199114665s" podCreationTimestamp="2025-10-08 18:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:31:26.188274961 +0000 UTC m=+1242.101245984" watchObservedRunningTime="2025-10-08 18:31:26.199114665 +0000 UTC m=+1242.112085668" Oct 08 18:31:26 crc kubenswrapper[4750]: I1008 18:31:26.210010 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-kxxb9" podStartSLOduration=3.209991061 podStartE2EDuration="3.209991061s" podCreationTimestamp="2025-10-08 18:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:31:26.205261536 +0000 UTC m=+1242.118232559" watchObservedRunningTime="2025-10-08 18:31:26.209991061 +0000 UTC m=+1242.122962084" Oct 08 18:31:26 crc kubenswrapper[4750]: I1008 18:31:26.874054 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 18:31:26 crc kubenswrapper[4750]: I1008 18:31:26.904454 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 18:31:29 crc kubenswrapper[4750]: I1008 18:31:29.208652 4750 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4460ef21-2426-4fc2-bba3-147fdd612a0c","Type":"ContainerStarted","Data":"7fe5d92f744690c60f4ac91791fe5b26e5406da0338f6de8b129cdadf8f86cc7"} Oct 08 18:31:29 crc kubenswrapper[4750]: I1008 18:31:29.216446 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c72144e8-79ad-4e02-91fe-bafa5de04a58","Type":"ContainerStarted","Data":"6ff019da0078532aa5e91316e838a7b9fdeb0ad4f4c84ef48ffb63f239738d9f"} Oct 08 18:31:29 crc kubenswrapper[4750]: I1008 18:31:29.216855 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c72144e8-79ad-4e02-91fe-bafa5de04a58","Type":"ContainerStarted","Data":"0a34ea04b52bfb3c5b20876768aa7d59e74ef658d644eff2f1acf912a1e79c54"} Oct 08 18:31:29 crc kubenswrapper[4750]: I1008 18:31:29.216714 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c72144e8-79ad-4e02-91fe-bafa5de04a58" containerName="nova-metadata-metadata" containerID="cri-o://6ff019da0078532aa5e91316e838a7b9fdeb0ad4f4c84ef48ffb63f239738d9f" gracePeriod=30 Oct 08 18:31:29 crc kubenswrapper[4750]: I1008 18:31:29.216640 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c72144e8-79ad-4e02-91fe-bafa5de04a58" containerName="nova-metadata-log" containerID="cri-o://0a34ea04b52bfb3c5b20876768aa7d59e74ef658d644eff2f1acf912a1e79c54" gracePeriod=30 Oct 08 18:31:29 crc kubenswrapper[4750]: I1008 18:31:29.220578 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"90f77db6-aa1c-4d0c-8598-51de62f090d5","Type":"ContainerStarted","Data":"861166aace877e8169d1bf56dc7962b13a10be5206cb6ddc4af53ef8b1d2664f"} Oct 08 18:31:29 crc kubenswrapper[4750]: I1008 18:31:29.220614 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"90f77db6-aa1c-4d0c-8598-51de62f090d5","Type":"ContainerStarted","Data":"29220343c3824ad846ea0a14811432ed315928ed7631d48843ec7b6a2209798f"} Oct 08 18:31:29 crc kubenswrapper[4750]: I1008 18:31:29.222351 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"924f1d18-46bd-420b-8250-6100ae1c7120","Type":"ContainerStarted","Data":"e98db408712902040839723e74638fedfb88161852d8e65e86a8ac80bfc7ab63"} Oct 08 18:31:29 crc kubenswrapper[4750]: I1008 18:31:29.222456 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="924f1d18-46bd-420b-8250-6100ae1c7120" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://e98db408712902040839723e74638fedfb88161852d8e65e86a8ac80bfc7ab63" gracePeriod=30 Oct 08 18:31:29 crc kubenswrapper[4750]: I1008 18:31:29.229891 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.797254712 podStartE2EDuration="6.229875162s" podCreationTimestamp="2025-10-08 18:31:23 +0000 UTC" firstStartedPulling="2025-10-08 18:31:24.544873008 +0000 UTC m=+1240.457844011" lastFinishedPulling="2025-10-08 18:31:27.977493458 +0000 UTC m=+1243.890464461" observedRunningTime="2025-10-08 18:31:29.223816604 +0000 UTC m=+1245.136787617" watchObservedRunningTime="2025-10-08 18:31:29.229875162 +0000 UTC m=+1245.142846175" Oct 08 18:31:29 crc kubenswrapper[4750]: I1008 18:31:29.254204 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.761941468 podStartE2EDuration="6.254187516s" podCreationTimestamp="2025-10-08 18:31:23 +0000 UTC" firstStartedPulling="2025-10-08 18:31:24.48524863 +0000 UTC m=+1240.398219643" lastFinishedPulling="2025-10-08 18:31:27.977494678 +0000 UTC m=+1243.890465691" observedRunningTime="2025-10-08 18:31:29.248061777 +0000 UTC m=+1245.161032800" 
watchObservedRunningTime="2025-10-08 18:31:29.254187516 +0000 UTC m=+1245.167158529" Oct 08 18:31:29 crc kubenswrapper[4750]: I1008 18:31:29.298478 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.520746583 podStartE2EDuration="6.298458839s" podCreationTimestamp="2025-10-08 18:31:23 +0000 UTC" firstStartedPulling="2025-10-08 18:31:24.212714438 +0000 UTC m=+1240.125685451" lastFinishedPulling="2025-10-08 18:31:27.990426684 +0000 UTC m=+1243.903397707" observedRunningTime="2025-10-08 18:31:29.275303482 +0000 UTC m=+1245.188274495" watchObservedRunningTime="2025-10-08 18:31:29.298458839 +0000 UTC m=+1245.211429862" Oct 08 18:31:29 crc kubenswrapper[4750]: I1008 18:31:29.299207 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.566091832 podStartE2EDuration="6.299198107s" podCreationTimestamp="2025-10-08 18:31:23 +0000 UTC" firstStartedPulling="2025-10-08 18:31:24.24221685 +0000 UTC m=+1240.155187863" lastFinishedPulling="2025-10-08 18:31:27.975323125 +0000 UTC m=+1243.888294138" observedRunningTime="2025-10-08 18:31:29.292052442 +0000 UTC m=+1245.205023455" watchObservedRunningTime="2025-10-08 18:31:29.299198107 +0000 UTC m=+1245.212169120" Oct 08 18:31:29 crc kubenswrapper[4750]: I1008 18:31:29.786756 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 18:31:29 crc kubenswrapper[4750]: I1008 18:31:29.831127 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c72144e8-79ad-4e02-91fe-bafa5de04a58-config-data\") pod \"c72144e8-79ad-4e02-91fe-bafa5de04a58\" (UID: \"c72144e8-79ad-4e02-91fe-bafa5de04a58\") " Oct 08 18:31:29 crc kubenswrapper[4750]: I1008 18:31:29.831241 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c72144e8-79ad-4e02-91fe-bafa5de04a58-combined-ca-bundle\") pod \"c72144e8-79ad-4e02-91fe-bafa5de04a58\" (UID: \"c72144e8-79ad-4e02-91fe-bafa5de04a58\") " Oct 08 18:31:29 crc kubenswrapper[4750]: I1008 18:31:29.831427 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c72144e8-79ad-4e02-91fe-bafa5de04a58-logs\") pod \"c72144e8-79ad-4e02-91fe-bafa5de04a58\" (UID: \"c72144e8-79ad-4e02-91fe-bafa5de04a58\") " Oct 08 18:31:29 crc kubenswrapper[4750]: I1008 18:31:29.831582 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kq5q4\" (UniqueName: \"kubernetes.io/projected/c72144e8-79ad-4e02-91fe-bafa5de04a58-kube-api-access-kq5q4\") pod \"c72144e8-79ad-4e02-91fe-bafa5de04a58\" (UID: \"c72144e8-79ad-4e02-91fe-bafa5de04a58\") " Oct 08 18:31:29 crc kubenswrapper[4750]: I1008 18:31:29.832230 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c72144e8-79ad-4e02-91fe-bafa5de04a58-logs" (OuterVolumeSpecName: "logs") pod "c72144e8-79ad-4e02-91fe-bafa5de04a58" (UID: "c72144e8-79ad-4e02-91fe-bafa5de04a58"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:31:29 crc kubenswrapper[4750]: I1008 18:31:29.837176 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c72144e8-79ad-4e02-91fe-bafa5de04a58-kube-api-access-kq5q4" (OuterVolumeSpecName: "kube-api-access-kq5q4") pod "c72144e8-79ad-4e02-91fe-bafa5de04a58" (UID: "c72144e8-79ad-4e02-91fe-bafa5de04a58"). InnerVolumeSpecName "kube-api-access-kq5q4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:31:29 crc kubenswrapper[4750]: I1008 18:31:29.863569 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c72144e8-79ad-4e02-91fe-bafa5de04a58-config-data" (OuterVolumeSpecName: "config-data") pod "c72144e8-79ad-4e02-91fe-bafa5de04a58" (UID: "c72144e8-79ad-4e02-91fe-bafa5de04a58"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:31:29 crc kubenswrapper[4750]: I1008 18:31:29.864874 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c72144e8-79ad-4e02-91fe-bafa5de04a58-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c72144e8-79ad-4e02-91fe-bafa5de04a58" (UID: "c72144e8-79ad-4e02-91fe-bafa5de04a58"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:31:29 crc kubenswrapper[4750]: I1008 18:31:29.933987 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c72144e8-79ad-4e02-91fe-bafa5de04a58-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:29 crc kubenswrapper[4750]: I1008 18:31:29.934021 4750 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c72144e8-79ad-4e02-91fe-bafa5de04a58-logs\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:29 crc kubenswrapper[4750]: I1008 18:31:29.934030 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kq5q4\" (UniqueName: \"kubernetes.io/projected/c72144e8-79ad-4e02-91fe-bafa5de04a58-kube-api-access-kq5q4\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:29 crc kubenswrapper[4750]: I1008 18:31:29.934039 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c72144e8-79ad-4e02-91fe-bafa5de04a58-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:30 crc kubenswrapper[4750]: I1008 18:31:30.235332 4750 generic.go:334] "Generic (PLEG): container finished" podID="c72144e8-79ad-4e02-91fe-bafa5de04a58" containerID="6ff019da0078532aa5e91316e838a7b9fdeb0ad4f4c84ef48ffb63f239738d9f" exitCode=0 Oct 08 18:31:30 crc kubenswrapper[4750]: I1008 18:31:30.235369 4750 generic.go:334] "Generic (PLEG): container finished" podID="c72144e8-79ad-4e02-91fe-bafa5de04a58" containerID="0a34ea04b52bfb3c5b20876768aa7d59e74ef658d644eff2f1acf912a1e79c54" exitCode=143 Oct 08 18:31:30 crc kubenswrapper[4750]: I1008 18:31:30.235399 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 18:31:30 crc kubenswrapper[4750]: I1008 18:31:30.235984 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c72144e8-79ad-4e02-91fe-bafa5de04a58","Type":"ContainerDied","Data":"6ff019da0078532aa5e91316e838a7b9fdeb0ad4f4c84ef48ffb63f239738d9f"} Oct 08 18:31:30 crc kubenswrapper[4750]: I1008 18:31:30.236048 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c72144e8-79ad-4e02-91fe-bafa5de04a58","Type":"ContainerDied","Data":"0a34ea04b52bfb3c5b20876768aa7d59e74ef658d644eff2f1acf912a1e79c54"} Oct 08 18:31:30 crc kubenswrapper[4750]: I1008 18:31:30.236067 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c72144e8-79ad-4e02-91fe-bafa5de04a58","Type":"ContainerDied","Data":"d6a61689980b2c4b15c5bd87e7994e8805df81a2d75b23439652e542b6f4907a"} Oct 08 18:31:30 crc kubenswrapper[4750]: I1008 18:31:30.236086 4750 scope.go:117] "RemoveContainer" containerID="6ff019da0078532aa5e91316e838a7b9fdeb0ad4f4c84ef48ffb63f239738d9f" Oct 08 18:31:30 crc kubenswrapper[4750]: I1008 18:31:30.254152 4750 scope.go:117] "RemoveContainer" containerID="0a34ea04b52bfb3c5b20876768aa7d59e74ef658d644eff2f1acf912a1e79c54" Oct 08 18:31:30 crc kubenswrapper[4750]: I1008 18:31:30.277540 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 18:31:30 crc kubenswrapper[4750]: I1008 18:31:30.286244 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 18:31:30 crc kubenswrapper[4750]: I1008 18:31:30.288871 4750 scope.go:117] "RemoveContainer" containerID="6ff019da0078532aa5e91316e838a7b9fdeb0ad4f4c84ef48ffb63f239738d9f" Oct 08 18:31:30 crc kubenswrapper[4750]: E1008 18:31:30.289793 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6ff019da0078532aa5e91316e838a7b9fdeb0ad4f4c84ef48ffb63f239738d9f\": container with ID starting with 6ff019da0078532aa5e91316e838a7b9fdeb0ad4f4c84ef48ffb63f239738d9f not found: ID does not exist" containerID="6ff019da0078532aa5e91316e838a7b9fdeb0ad4f4c84ef48ffb63f239738d9f" Oct 08 18:31:30 crc kubenswrapper[4750]: I1008 18:31:30.289828 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ff019da0078532aa5e91316e838a7b9fdeb0ad4f4c84ef48ffb63f239738d9f"} err="failed to get container status \"6ff019da0078532aa5e91316e838a7b9fdeb0ad4f4c84ef48ffb63f239738d9f\": rpc error: code = NotFound desc = could not find container \"6ff019da0078532aa5e91316e838a7b9fdeb0ad4f4c84ef48ffb63f239738d9f\": container with ID starting with 6ff019da0078532aa5e91316e838a7b9fdeb0ad4f4c84ef48ffb63f239738d9f not found: ID does not exist" Oct 08 18:31:30 crc kubenswrapper[4750]: I1008 18:31:30.289853 4750 scope.go:117] "RemoveContainer" containerID="0a34ea04b52bfb3c5b20876768aa7d59e74ef658d644eff2f1acf912a1e79c54" Oct 08 18:31:30 crc kubenswrapper[4750]: E1008 18:31:30.290194 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a34ea04b52bfb3c5b20876768aa7d59e74ef658d644eff2f1acf912a1e79c54\": container with ID starting with 0a34ea04b52bfb3c5b20876768aa7d59e74ef658d644eff2f1acf912a1e79c54 not found: ID does not exist" containerID="0a34ea04b52bfb3c5b20876768aa7d59e74ef658d644eff2f1acf912a1e79c54" Oct 08 18:31:30 crc kubenswrapper[4750]: I1008 18:31:30.290265 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a34ea04b52bfb3c5b20876768aa7d59e74ef658d644eff2f1acf912a1e79c54"} err="failed to get container status \"0a34ea04b52bfb3c5b20876768aa7d59e74ef658d644eff2f1acf912a1e79c54\": rpc error: code = NotFound desc = could not find container \"0a34ea04b52bfb3c5b20876768aa7d59e74ef658d644eff2f1acf912a1e79c54\": container with ID 
starting with 0a34ea04b52bfb3c5b20876768aa7d59e74ef658d644eff2f1acf912a1e79c54 not found: ID does not exist" Oct 08 18:31:30 crc kubenswrapper[4750]: I1008 18:31:30.290297 4750 scope.go:117] "RemoveContainer" containerID="6ff019da0078532aa5e91316e838a7b9fdeb0ad4f4c84ef48ffb63f239738d9f" Oct 08 18:31:30 crc kubenswrapper[4750]: I1008 18:31:30.290831 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ff019da0078532aa5e91316e838a7b9fdeb0ad4f4c84ef48ffb63f239738d9f"} err="failed to get container status \"6ff019da0078532aa5e91316e838a7b9fdeb0ad4f4c84ef48ffb63f239738d9f\": rpc error: code = NotFound desc = could not find container \"6ff019da0078532aa5e91316e838a7b9fdeb0ad4f4c84ef48ffb63f239738d9f\": container with ID starting with 6ff019da0078532aa5e91316e838a7b9fdeb0ad4f4c84ef48ffb63f239738d9f not found: ID does not exist" Oct 08 18:31:30 crc kubenswrapper[4750]: I1008 18:31:30.290869 4750 scope.go:117] "RemoveContainer" containerID="0a34ea04b52bfb3c5b20876768aa7d59e74ef658d644eff2f1acf912a1e79c54" Oct 08 18:31:30 crc kubenswrapper[4750]: I1008 18:31:30.291979 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a34ea04b52bfb3c5b20876768aa7d59e74ef658d644eff2f1acf912a1e79c54"} err="failed to get container status \"0a34ea04b52bfb3c5b20876768aa7d59e74ef658d644eff2f1acf912a1e79c54\": rpc error: code = NotFound desc = could not find container \"0a34ea04b52bfb3c5b20876768aa7d59e74ef658d644eff2f1acf912a1e79c54\": container with ID starting with 0a34ea04b52bfb3c5b20876768aa7d59e74ef658d644eff2f1acf912a1e79c54 not found: ID does not exist" Oct 08 18:31:30 crc kubenswrapper[4750]: I1008 18:31:30.294534 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 08 18:31:30 crc kubenswrapper[4750]: E1008 18:31:30.295072 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c72144e8-79ad-4e02-91fe-bafa5de04a58" 
containerName="nova-metadata-metadata" Oct 08 18:31:30 crc kubenswrapper[4750]: I1008 18:31:30.295098 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="c72144e8-79ad-4e02-91fe-bafa5de04a58" containerName="nova-metadata-metadata" Oct 08 18:31:30 crc kubenswrapper[4750]: E1008 18:31:30.295121 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c72144e8-79ad-4e02-91fe-bafa5de04a58" containerName="nova-metadata-log" Oct 08 18:31:30 crc kubenswrapper[4750]: I1008 18:31:30.295132 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="c72144e8-79ad-4e02-91fe-bafa5de04a58" containerName="nova-metadata-log" Oct 08 18:31:30 crc kubenswrapper[4750]: I1008 18:31:30.295369 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="c72144e8-79ad-4e02-91fe-bafa5de04a58" containerName="nova-metadata-metadata" Oct 08 18:31:30 crc kubenswrapper[4750]: I1008 18:31:30.295406 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="c72144e8-79ad-4e02-91fe-bafa5de04a58" containerName="nova-metadata-log" Oct 08 18:31:30 crc kubenswrapper[4750]: I1008 18:31:30.301214 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 18:31:30 crc kubenswrapper[4750]: I1008 18:31:30.303119 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 18:31:30 crc kubenswrapper[4750]: I1008 18:31:30.303724 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 08 18:31:30 crc kubenswrapper[4750]: I1008 18:31:30.303927 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 08 18:31:30 crc kubenswrapper[4750]: I1008 18:31:30.442824 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02e6583b-2f8d-4d50-9146-65c3f281283e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"02e6583b-2f8d-4d50-9146-65c3f281283e\") " pod="openstack/nova-metadata-0" Oct 08 18:31:30 crc kubenswrapper[4750]: I1008 18:31:30.443157 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cccj2\" (UniqueName: \"kubernetes.io/projected/02e6583b-2f8d-4d50-9146-65c3f281283e-kube-api-access-cccj2\") pod \"nova-metadata-0\" (UID: \"02e6583b-2f8d-4d50-9146-65c3f281283e\") " pod="openstack/nova-metadata-0" Oct 08 18:31:30 crc kubenswrapper[4750]: I1008 18:31:30.443274 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/02e6583b-2f8d-4d50-9146-65c3f281283e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"02e6583b-2f8d-4d50-9146-65c3f281283e\") " pod="openstack/nova-metadata-0" Oct 08 18:31:30 crc kubenswrapper[4750]: I1008 18:31:30.443305 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02e6583b-2f8d-4d50-9146-65c3f281283e-logs\") pod 
\"nova-metadata-0\" (UID: \"02e6583b-2f8d-4d50-9146-65c3f281283e\") " pod="openstack/nova-metadata-0" Oct 08 18:31:30 crc kubenswrapper[4750]: I1008 18:31:30.443347 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02e6583b-2f8d-4d50-9146-65c3f281283e-config-data\") pod \"nova-metadata-0\" (UID: \"02e6583b-2f8d-4d50-9146-65c3f281283e\") " pod="openstack/nova-metadata-0" Oct 08 18:31:30 crc kubenswrapper[4750]: I1008 18:31:30.544857 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/02e6583b-2f8d-4d50-9146-65c3f281283e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"02e6583b-2f8d-4d50-9146-65c3f281283e\") " pod="openstack/nova-metadata-0" Oct 08 18:31:30 crc kubenswrapper[4750]: I1008 18:31:30.544910 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02e6583b-2f8d-4d50-9146-65c3f281283e-logs\") pod \"nova-metadata-0\" (UID: \"02e6583b-2f8d-4d50-9146-65c3f281283e\") " pod="openstack/nova-metadata-0" Oct 08 18:31:30 crc kubenswrapper[4750]: I1008 18:31:30.544955 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02e6583b-2f8d-4d50-9146-65c3f281283e-config-data\") pod \"nova-metadata-0\" (UID: \"02e6583b-2f8d-4d50-9146-65c3f281283e\") " pod="openstack/nova-metadata-0" Oct 08 18:31:30 crc kubenswrapper[4750]: I1008 18:31:30.545073 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02e6583b-2f8d-4d50-9146-65c3f281283e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"02e6583b-2f8d-4d50-9146-65c3f281283e\") " pod="openstack/nova-metadata-0" Oct 08 18:31:30 crc kubenswrapper[4750]: I1008 18:31:30.545122 4750 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cccj2\" (UniqueName: \"kubernetes.io/projected/02e6583b-2f8d-4d50-9146-65c3f281283e-kube-api-access-cccj2\") pod \"nova-metadata-0\" (UID: \"02e6583b-2f8d-4d50-9146-65c3f281283e\") " pod="openstack/nova-metadata-0" Oct 08 18:31:30 crc kubenswrapper[4750]: I1008 18:31:30.545449 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02e6583b-2f8d-4d50-9146-65c3f281283e-logs\") pod \"nova-metadata-0\" (UID: \"02e6583b-2f8d-4d50-9146-65c3f281283e\") " pod="openstack/nova-metadata-0" Oct 08 18:31:30 crc kubenswrapper[4750]: I1008 18:31:30.548846 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/02e6583b-2f8d-4d50-9146-65c3f281283e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"02e6583b-2f8d-4d50-9146-65c3f281283e\") " pod="openstack/nova-metadata-0" Oct 08 18:31:30 crc kubenswrapper[4750]: I1008 18:31:30.554399 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02e6583b-2f8d-4d50-9146-65c3f281283e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"02e6583b-2f8d-4d50-9146-65c3f281283e\") " pod="openstack/nova-metadata-0" Oct 08 18:31:30 crc kubenswrapper[4750]: I1008 18:31:30.560185 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02e6583b-2f8d-4d50-9146-65c3f281283e-config-data\") pod \"nova-metadata-0\" (UID: \"02e6583b-2f8d-4d50-9146-65c3f281283e\") " pod="openstack/nova-metadata-0" Oct 08 18:31:30 crc kubenswrapper[4750]: I1008 18:31:30.564453 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cccj2\" (UniqueName: \"kubernetes.io/projected/02e6583b-2f8d-4d50-9146-65c3f281283e-kube-api-access-cccj2\") pod 
\"nova-metadata-0\" (UID: \"02e6583b-2f8d-4d50-9146-65c3f281283e\") " pod="openstack/nova-metadata-0" Oct 08 18:31:30 crc kubenswrapper[4750]: I1008 18:31:30.646173 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 18:31:30 crc kubenswrapper[4750]: I1008 18:31:30.755192 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c72144e8-79ad-4e02-91fe-bafa5de04a58" path="/var/lib/kubelet/pods/c72144e8-79ad-4e02-91fe-bafa5de04a58/volumes" Oct 08 18:31:31 crc kubenswrapper[4750]: I1008 18:31:31.116188 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 18:31:31 crc kubenswrapper[4750]: W1008 18:31:31.157615 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02e6583b_2f8d_4d50_9146_65c3f281283e.slice/crio-abbdfaa6ba52a34b12503584de6cca216a3b10157bf5703271d6330a9a81eac1 WatchSource:0}: Error finding container abbdfaa6ba52a34b12503584de6cca216a3b10157bf5703271d6330a9a81eac1: Status 404 returned error can't find the container with id abbdfaa6ba52a34b12503584de6cca216a3b10157bf5703271d6330a9a81eac1 Oct 08 18:31:31 crc kubenswrapper[4750]: I1008 18:31:31.244391 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"02e6583b-2f8d-4d50-9146-65c3f281283e","Type":"ContainerStarted","Data":"abbdfaa6ba52a34b12503584de6cca216a3b10157bf5703271d6330a9a81eac1"} Oct 08 18:31:31 crc kubenswrapper[4750]: I1008 18:31:31.248306 4750 generic.go:334] "Generic (PLEG): container finished" podID="1f613fe6-8980-4ded-8c2f-c4222c597cf1" containerID="1b2e074006d40f2b6f4feb94e1154de720671b362e1e247d9fcda194a8920fbe" exitCode=0 Oct 08 18:31:31 crc kubenswrapper[4750]: I1008 18:31:31.248331 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-ftcwm" 
event={"ID":"1f613fe6-8980-4ded-8c2f-c4222c597cf1","Type":"ContainerDied","Data":"1b2e074006d40f2b6f4feb94e1154de720671b362e1e247d9fcda194a8920fbe"} Oct 08 18:31:32 crc kubenswrapper[4750]: I1008 18:31:32.271295 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"02e6583b-2f8d-4d50-9146-65c3f281283e","Type":"ContainerStarted","Data":"d652e2b03305677ef44b919b7d42df2ab33acd8b817dcd5ce57ada13c119affe"} Oct 08 18:31:32 crc kubenswrapper[4750]: I1008 18:31:32.271607 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"02e6583b-2f8d-4d50-9146-65c3f281283e","Type":"ContainerStarted","Data":"0f0f16b89b3bf19ab974afdcdb07075bbdd7ea04f744c9988f331e7fdc1317a3"} Oct 08 18:31:32 crc kubenswrapper[4750]: I1008 18:31:32.273355 4750 generic.go:334] "Generic (PLEG): container finished" podID="6c616741-8a43-456e-a249-aee7e4d3764f" containerID="0ba5b8a05e8f46a8ac74a1a7956f2c4cec61b825654ab0658e1c4de2d90cdd8b" exitCode=0 Oct 08 18:31:32 crc kubenswrapper[4750]: I1008 18:31:32.273505 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-kxxb9" event={"ID":"6c616741-8a43-456e-a249-aee7e4d3764f","Type":"ContainerDied","Data":"0ba5b8a05e8f46a8ac74a1a7956f2c4cec61b825654ab0658e1c4de2d90cdd8b"} Oct 08 18:31:32 crc kubenswrapper[4750]: I1008 18:31:32.316685 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.316660239 podStartE2EDuration="2.316660239s" podCreationTimestamp="2025-10-08 18:31:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:31:32.29627959 +0000 UTC m=+1248.209250653" watchObservedRunningTime="2025-10-08 18:31:32.316660239 +0000 UTC m=+1248.229631272" Oct 08 18:31:32 crc kubenswrapper[4750]: I1008 18:31:32.669530 4750 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-ftcwm" Oct 08 18:31:32 crc kubenswrapper[4750]: I1008 18:31:32.788857 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f613fe6-8980-4ded-8c2f-c4222c597cf1-scripts\") pod \"1f613fe6-8980-4ded-8c2f-c4222c597cf1\" (UID: \"1f613fe6-8980-4ded-8c2f-c4222c597cf1\") " Oct 08 18:31:32 crc kubenswrapper[4750]: I1008 18:31:32.788900 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f613fe6-8980-4ded-8c2f-c4222c597cf1-combined-ca-bundle\") pod \"1f613fe6-8980-4ded-8c2f-c4222c597cf1\" (UID: \"1f613fe6-8980-4ded-8c2f-c4222c597cf1\") " Oct 08 18:31:32 crc kubenswrapper[4750]: I1008 18:31:32.788972 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f613fe6-8980-4ded-8c2f-c4222c597cf1-config-data\") pod \"1f613fe6-8980-4ded-8c2f-c4222c597cf1\" (UID: \"1f613fe6-8980-4ded-8c2f-c4222c597cf1\") " Oct 08 18:31:32 crc kubenswrapper[4750]: I1008 18:31:32.789088 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5tv\" (UniqueName: \"kubernetes.io/projected/1f613fe6-8980-4ded-8c2f-c4222c597cf1-kube-api-access-mg5tv\") pod \"1f613fe6-8980-4ded-8c2f-c4222c597cf1\" (UID: \"1f613fe6-8980-4ded-8c2f-c4222c597cf1\") " Oct 08 18:31:32 crc kubenswrapper[4750]: I1008 18:31:32.794453 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f613fe6-8980-4ded-8c2f-c4222c597cf1-kube-api-access-mg5tv" (OuterVolumeSpecName: "kube-api-access-mg5tv") pod "1f613fe6-8980-4ded-8c2f-c4222c597cf1" (UID: "1f613fe6-8980-4ded-8c2f-c4222c597cf1"). InnerVolumeSpecName "kube-api-access-mg5tv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:31:32 crc kubenswrapper[4750]: I1008 18:31:32.795711 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f613fe6-8980-4ded-8c2f-c4222c597cf1-scripts" (OuterVolumeSpecName: "scripts") pod "1f613fe6-8980-4ded-8c2f-c4222c597cf1" (UID: "1f613fe6-8980-4ded-8c2f-c4222c597cf1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:31:32 crc kubenswrapper[4750]: I1008 18:31:32.816519 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f613fe6-8980-4ded-8c2f-c4222c597cf1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f613fe6-8980-4ded-8c2f-c4222c597cf1" (UID: "1f613fe6-8980-4ded-8c2f-c4222c597cf1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:31:32 crc kubenswrapper[4750]: I1008 18:31:32.824346 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f613fe6-8980-4ded-8c2f-c4222c597cf1-config-data" (OuterVolumeSpecName: "config-data") pod "1f613fe6-8980-4ded-8c2f-c4222c597cf1" (UID: "1f613fe6-8980-4ded-8c2f-c4222c597cf1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:31:32 crc kubenswrapper[4750]: I1008 18:31:32.891726 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f613fe6-8980-4ded-8c2f-c4222c597cf1-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:32 crc kubenswrapper[4750]: I1008 18:31:32.891756 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5tv\" (UniqueName: \"kubernetes.io/projected/1f613fe6-8980-4ded-8c2f-c4222c597cf1-kube-api-access-mg5tv\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:32 crc kubenswrapper[4750]: I1008 18:31:32.891765 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f613fe6-8980-4ded-8c2f-c4222c597cf1-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:32 crc kubenswrapper[4750]: I1008 18:31:32.891773 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f613fe6-8980-4ded-8c2f-c4222c597cf1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:33 crc kubenswrapper[4750]: I1008 18:31:33.286618 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-ftcwm" event={"ID":"1f613fe6-8980-4ded-8c2f-c4222c597cf1","Type":"ContainerDied","Data":"2201a273fa8ec2a7be98c0582117e5f6bf636bef133a5b367ee175447ea57b52"} Oct 08 18:31:33 crc kubenswrapper[4750]: I1008 18:31:33.286697 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2201a273fa8ec2a7be98c0582117e5f6bf636bef133a5b367ee175447ea57b52" Oct 08 18:31:33 crc kubenswrapper[4750]: I1008 18:31:33.286754 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-ftcwm" Oct 08 18:31:33 crc kubenswrapper[4750]: I1008 18:31:33.469639 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 18:31:33 crc kubenswrapper[4750]: I1008 18:31:33.470214 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="4460ef21-2426-4fc2-bba3-147fdd612a0c" containerName="nova-scheduler-scheduler" containerID="cri-o://7fe5d92f744690c60f4ac91791fe5b26e5406da0338f6de8b129cdadf8f86cc7" gracePeriod=30 Oct 08 18:31:33 crc kubenswrapper[4750]: I1008 18:31:33.490482 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 08 18:31:33 crc kubenswrapper[4750]: I1008 18:31:33.490707 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="90f77db6-aa1c-4d0c-8598-51de62f090d5" containerName="nova-api-log" containerID="cri-o://29220343c3824ad846ea0a14811432ed315928ed7631d48843ec7b6a2209798f" gracePeriod=30 Oct 08 18:31:33 crc kubenswrapper[4750]: I1008 18:31:33.490813 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="90f77db6-aa1c-4d0c-8598-51de62f090d5" containerName="nova-api-api" containerID="cri-o://861166aace877e8169d1bf56dc7962b13a10be5206cb6ddc4af53ef8b1d2664f" gracePeriod=30 Oct 08 18:31:33 crc kubenswrapper[4750]: I1008 18:31:33.513864 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 18:31:33 crc kubenswrapper[4750]: I1008 18:31:33.638965 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 08 18:31:33 crc kubenswrapper[4750]: I1008 18:31:33.756751 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6ffc974fdf-tl7wv" Oct 08 18:31:33 crc kubenswrapper[4750]: I1008 18:31:33.764482 4750 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 08 18:31:33 crc kubenswrapper[4750]: I1008 18:31:33.821878 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-kxxb9" Oct 08 18:31:33 crc kubenswrapper[4750]: I1008 18:31:33.829350 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84bd785c49-qsxts"] Oct 08 18:31:33 crc kubenswrapper[4750]: I1008 18:31:33.844800 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-84bd785c49-qsxts" podUID="93470753-915e-4675-9ecc-6942de332cd4" containerName="dnsmasq-dns" containerID="cri-o://e491d282a1964ea6486700cedfdc02fe6e80d07443042915a7c5898b1fe1097c" gracePeriod=10 Oct 08 18:31:33 crc kubenswrapper[4750]: I1008 18:31:33.909915 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c616741-8a43-456e-a249-aee7e4d3764f-combined-ca-bundle\") pod \"6c616741-8a43-456e-a249-aee7e4d3764f\" (UID: \"6c616741-8a43-456e-a249-aee7e4d3764f\") " Oct 08 18:31:33 crc kubenswrapper[4750]: I1008 18:31:33.910010 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c616741-8a43-456e-a249-aee7e4d3764f-scripts\") pod \"6c616741-8a43-456e-a249-aee7e4d3764f\" (UID: \"6c616741-8a43-456e-a249-aee7e4d3764f\") " Oct 08 18:31:33 crc kubenswrapper[4750]: I1008 18:31:33.910032 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c616741-8a43-456e-a249-aee7e4d3764f-config-data\") pod \"6c616741-8a43-456e-a249-aee7e4d3764f\" (UID: \"6c616741-8a43-456e-a249-aee7e4d3764f\") " Oct 08 18:31:33 crc kubenswrapper[4750]: I1008 18:31:33.910181 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-p66pl\" (UniqueName: \"kubernetes.io/projected/6c616741-8a43-456e-a249-aee7e4d3764f-kube-api-access-p66pl\") pod \"6c616741-8a43-456e-a249-aee7e4d3764f\" (UID: \"6c616741-8a43-456e-a249-aee7e4d3764f\") " Oct 08 18:31:33 crc kubenswrapper[4750]: I1008 18:31:33.915645 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c616741-8a43-456e-a249-aee7e4d3764f-scripts" (OuterVolumeSpecName: "scripts") pod "6c616741-8a43-456e-a249-aee7e4d3764f" (UID: "6c616741-8a43-456e-a249-aee7e4d3764f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:31:33 crc kubenswrapper[4750]: I1008 18:31:33.915688 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c616741-8a43-456e-a249-aee7e4d3764f-kube-api-access-p66pl" (OuterVolumeSpecName: "kube-api-access-p66pl") pod "6c616741-8a43-456e-a249-aee7e4d3764f" (UID: "6c616741-8a43-456e-a249-aee7e4d3764f"). InnerVolumeSpecName "kube-api-access-p66pl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:31:33 crc kubenswrapper[4750]: I1008 18:31:33.945577 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c616741-8a43-456e-a249-aee7e4d3764f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c616741-8a43-456e-a249-aee7e4d3764f" (UID: "6c616741-8a43-456e-a249-aee7e4d3764f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:31:33 crc kubenswrapper[4750]: I1008 18:31:33.964775 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c616741-8a43-456e-a249-aee7e4d3764f-config-data" (OuterVolumeSpecName: "config-data") pod "6c616741-8a43-456e-a249-aee7e4d3764f" (UID: "6c616741-8a43-456e-a249-aee7e4d3764f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.014108 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c616741-8a43-456e-a249-aee7e4d3764f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.014139 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c616741-8a43-456e-a249-aee7e4d3764f-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.014148 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c616741-8a43-456e-a249-aee7e4d3764f-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.014157 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p66pl\" (UniqueName: \"kubernetes.io/projected/6c616741-8a43-456e-a249-aee7e4d3764f-kube-api-access-p66pl\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.033595 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.216791 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90f77db6-aa1c-4d0c-8598-51de62f090d5-combined-ca-bundle\") pod \"90f77db6-aa1c-4d0c-8598-51de62f090d5\" (UID: \"90f77db6-aa1c-4d0c-8598-51de62f090d5\") " Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.216929 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90f77db6-aa1c-4d0c-8598-51de62f090d5-config-data\") pod \"90f77db6-aa1c-4d0c-8598-51de62f090d5\" (UID: \"90f77db6-aa1c-4d0c-8598-51de62f090d5\") " Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.217007 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8prhb\" (UniqueName: \"kubernetes.io/projected/90f77db6-aa1c-4d0c-8598-51de62f090d5-kube-api-access-8prhb\") pod \"90f77db6-aa1c-4d0c-8598-51de62f090d5\" (UID: \"90f77db6-aa1c-4d0c-8598-51de62f090d5\") " Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.217079 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90f77db6-aa1c-4d0c-8598-51de62f090d5-logs\") pod \"90f77db6-aa1c-4d0c-8598-51de62f090d5\" (UID: \"90f77db6-aa1c-4d0c-8598-51de62f090d5\") " Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.217675 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90f77db6-aa1c-4d0c-8598-51de62f090d5-logs" (OuterVolumeSpecName: "logs") pod "90f77db6-aa1c-4d0c-8598-51de62f090d5" (UID: "90f77db6-aa1c-4d0c-8598-51de62f090d5"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.223106 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90f77db6-aa1c-4d0c-8598-51de62f090d5-kube-api-access-8prhb" (OuterVolumeSpecName: "kube-api-access-8prhb") pod "90f77db6-aa1c-4d0c-8598-51de62f090d5" (UID: "90f77db6-aa1c-4d0c-8598-51de62f090d5"). InnerVolumeSpecName "kube-api-access-8prhb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.241325 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90f77db6-aa1c-4d0c-8598-51de62f090d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90f77db6-aa1c-4d0c-8598-51de62f090d5" (UID: "90f77db6-aa1c-4d0c-8598-51de62f090d5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.253854 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90f77db6-aa1c-4d0c-8598-51de62f090d5-config-data" (OuterVolumeSpecName: "config-data") pod "90f77db6-aa1c-4d0c-8598-51de62f090d5" (UID: "90f77db6-aa1c-4d0c-8598-51de62f090d5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.296534 4750 generic.go:334] "Generic (PLEG): container finished" podID="93470753-915e-4675-9ecc-6942de332cd4" containerID="e491d282a1964ea6486700cedfdc02fe6e80d07443042915a7c5898b1fe1097c" exitCode=0 Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.296620 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bd785c49-qsxts" event={"ID":"93470753-915e-4675-9ecc-6942de332cd4","Type":"ContainerDied","Data":"e491d282a1964ea6486700cedfdc02fe6e80d07443042915a7c5898b1fe1097c"} Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.298743 4750 generic.go:334] "Generic (PLEG): container finished" podID="90f77db6-aa1c-4d0c-8598-51de62f090d5" containerID="861166aace877e8169d1bf56dc7962b13a10be5206cb6ddc4af53ef8b1d2664f" exitCode=0 Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.298765 4750 generic.go:334] "Generic (PLEG): container finished" podID="90f77db6-aa1c-4d0c-8598-51de62f090d5" containerID="29220343c3824ad846ea0a14811432ed315928ed7631d48843ec7b6a2209798f" exitCode=143 Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.298814 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"90f77db6-aa1c-4d0c-8598-51de62f090d5","Type":"ContainerDied","Data":"861166aace877e8169d1bf56dc7962b13a10be5206cb6ddc4af53ef8b1d2664f"} Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.298844 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.298924 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"90f77db6-aa1c-4d0c-8598-51de62f090d5","Type":"ContainerDied","Data":"29220343c3824ad846ea0a14811432ed315928ed7631d48843ec7b6a2209798f"} Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.298950 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"90f77db6-aa1c-4d0c-8598-51de62f090d5","Type":"ContainerDied","Data":"7a80924022559a90007625acbeb30d74a2105b59f60b48e3dd488328b27d5b85"} Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.298954 4750 scope.go:117] "RemoveContainer" containerID="861166aace877e8169d1bf56dc7962b13a10be5206cb6ddc4af53ef8b1d2664f" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.301951 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="02e6583b-2f8d-4d50-9146-65c3f281283e" containerName="nova-metadata-log" containerID="cri-o://0f0f16b89b3bf19ab974afdcdb07075bbdd7ea04f744c9988f331e7fdc1317a3" gracePeriod=30 Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.302507 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-kxxb9" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.302573 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="02e6583b-2f8d-4d50-9146-65c3f281283e" containerName="nova-metadata-metadata" containerID="cri-o://d652e2b03305677ef44b919b7d42df2ab33acd8b817dcd5ce57ada13c119affe" gracePeriod=30 Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.302699 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-kxxb9" event={"ID":"6c616741-8a43-456e-a249-aee7e4d3764f","Type":"ContainerDied","Data":"696e58c930f2a4eb34503f3f66372c56fe5d870dfc6f0e1bfede49cd63181314"} Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.302734 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="696e58c930f2a4eb34503f3f66372c56fe5d870dfc6f0e1bfede49cd63181314" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.312313 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84bd785c49-qsxts" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.319171 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90f77db6-aa1c-4d0c-8598-51de62f090d5-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.319205 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8prhb\" (UniqueName: \"kubernetes.io/projected/90f77db6-aa1c-4d0c-8598-51de62f090d5-kube-api-access-8prhb\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.319219 4750 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90f77db6-aa1c-4d0c-8598-51de62f090d5-logs\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.319232 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90f77db6-aa1c-4d0c-8598-51de62f090d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.322820 4750 scope.go:117] "RemoveContainer" containerID="29220343c3824ad846ea0a14811432ed315928ed7631d48843ec7b6a2209798f" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.383915 4750 scope.go:117] "RemoveContainer" containerID="861166aace877e8169d1bf56dc7962b13a10be5206cb6ddc4af53ef8b1d2664f" Oct 08 18:31:34 crc kubenswrapper[4750]: E1008 18:31:34.387030 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"861166aace877e8169d1bf56dc7962b13a10be5206cb6ddc4af53ef8b1d2664f\": container with ID starting with 861166aace877e8169d1bf56dc7962b13a10be5206cb6ddc4af53ef8b1d2664f not found: ID does not exist" containerID="861166aace877e8169d1bf56dc7962b13a10be5206cb6ddc4af53ef8b1d2664f" Oct 08 18:31:34 crc kubenswrapper[4750]: 
I1008 18:31:34.387072 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"861166aace877e8169d1bf56dc7962b13a10be5206cb6ddc4af53ef8b1d2664f"} err="failed to get container status \"861166aace877e8169d1bf56dc7962b13a10be5206cb6ddc4af53ef8b1d2664f\": rpc error: code = NotFound desc = could not find container \"861166aace877e8169d1bf56dc7962b13a10be5206cb6ddc4af53ef8b1d2664f\": container with ID starting with 861166aace877e8169d1bf56dc7962b13a10be5206cb6ddc4af53ef8b1d2664f not found: ID does not exist" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.387129 4750 scope.go:117] "RemoveContainer" containerID="29220343c3824ad846ea0a14811432ed315928ed7631d48843ec7b6a2209798f" Oct 08 18:31:34 crc kubenswrapper[4750]: E1008 18:31:34.399171 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29220343c3824ad846ea0a14811432ed315928ed7631d48843ec7b6a2209798f\": container with ID starting with 29220343c3824ad846ea0a14811432ed315928ed7631d48843ec7b6a2209798f not found: ID does not exist" containerID="29220343c3824ad846ea0a14811432ed315928ed7631d48843ec7b6a2209798f" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.399218 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29220343c3824ad846ea0a14811432ed315928ed7631d48843ec7b6a2209798f"} err="failed to get container status \"29220343c3824ad846ea0a14811432ed315928ed7631d48843ec7b6a2209798f\": rpc error: code = NotFound desc = could not find container \"29220343c3824ad846ea0a14811432ed315928ed7631d48843ec7b6a2209798f\": container with ID starting with 29220343c3824ad846ea0a14811432ed315928ed7631d48843ec7b6a2209798f not found: ID does not exist" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.399265 4750 scope.go:117] "RemoveContainer" containerID="861166aace877e8169d1bf56dc7962b13a10be5206cb6ddc4af53ef8b1d2664f" Oct 08 18:31:34 crc 
kubenswrapper[4750]: I1008 18:31:34.400085 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"861166aace877e8169d1bf56dc7962b13a10be5206cb6ddc4af53ef8b1d2664f"} err="failed to get container status \"861166aace877e8169d1bf56dc7962b13a10be5206cb6ddc4af53ef8b1d2664f\": rpc error: code = NotFound desc = could not find container \"861166aace877e8169d1bf56dc7962b13a10be5206cb6ddc4af53ef8b1d2664f\": container with ID starting with 861166aace877e8169d1bf56dc7962b13a10be5206cb6ddc4af53ef8b1d2664f not found: ID does not exist" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.400108 4750 scope.go:117] "RemoveContainer" containerID="29220343c3824ad846ea0a14811432ed315928ed7631d48843ec7b6a2209798f" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.400361 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29220343c3824ad846ea0a14811432ed315928ed7631d48843ec7b6a2209798f"} err="failed to get container status \"29220343c3824ad846ea0a14811432ed315928ed7631d48843ec7b6a2209798f\": rpc error: code = NotFound desc = could not find container \"29220343c3824ad846ea0a14811432ed315928ed7631d48843ec7b6a2209798f\": container with ID starting with 29220343c3824ad846ea0a14811432ed315928ed7631d48843ec7b6a2209798f not found: ID does not exist" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.421502 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93470753-915e-4675-9ecc-6942de332cd4-ovsdbserver-nb\") pod \"93470753-915e-4675-9ecc-6942de332cd4\" (UID: \"93470753-915e-4675-9ecc-6942de332cd4\") " Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.421564 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gmvl\" (UniqueName: \"kubernetes.io/projected/93470753-915e-4675-9ecc-6942de332cd4-kube-api-access-6gmvl\") pod 
\"93470753-915e-4675-9ecc-6942de332cd4\" (UID: \"93470753-915e-4675-9ecc-6942de332cd4\") " Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.421616 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93470753-915e-4675-9ecc-6942de332cd4-ovsdbserver-sb\") pod \"93470753-915e-4675-9ecc-6942de332cd4\" (UID: \"93470753-915e-4675-9ecc-6942de332cd4\") " Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.421697 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/93470753-915e-4675-9ecc-6942de332cd4-dns-swift-storage-0\") pod \"93470753-915e-4675-9ecc-6942de332cd4\" (UID: \"93470753-915e-4675-9ecc-6942de332cd4\") " Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.421722 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93470753-915e-4675-9ecc-6942de332cd4-dns-svc\") pod \"93470753-915e-4675-9ecc-6942de332cd4\" (UID: \"93470753-915e-4675-9ecc-6942de332cd4\") " Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.421760 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93470753-915e-4675-9ecc-6942de332cd4-config\") pod \"93470753-915e-4675-9ecc-6942de332cd4\" (UID: \"93470753-915e-4675-9ecc-6942de332cd4\") " Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.433836 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.445383 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93470753-915e-4675-9ecc-6942de332cd4-kube-api-access-6gmvl" (OuterVolumeSpecName: "kube-api-access-6gmvl") pod "93470753-915e-4675-9ecc-6942de332cd4" (UID: "93470753-915e-4675-9ecc-6942de332cd4"). 
InnerVolumeSpecName "kube-api-access-6gmvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.451361 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.458732 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 08 18:31:34 crc kubenswrapper[4750]: E1008 18:31:34.463941 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93470753-915e-4675-9ecc-6942de332cd4" containerName="init" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.463992 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="93470753-915e-4675-9ecc-6942de332cd4" containerName="init" Oct 08 18:31:34 crc kubenswrapper[4750]: E1008 18:31:34.464009 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f613fe6-8980-4ded-8c2f-c4222c597cf1" containerName="nova-manage" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.464017 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f613fe6-8980-4ded-8c2f-c4222c597cf1" containerName="nova-manage" Oct 08 18:31:34 crc kubenswrapper[4750]: E1008 18:31:34.464031 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90f77db6-aa1c-4d0c-8598-51de62f090d5" containerName="nova-api-api" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.464040 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="90f77db6-aa1c-4d0c-8598-51de62f090d5" containerName="nova-api-api" Oct 08 18:31:34 crc kubenswrapper[4750]: E1008 18:31:34.464055 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90f77db6-aa1c-4d0c-8598-51de62f090d5" containerName="nova-api-log" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.464062 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="90f77db6-aa1c-4d0c-8598-51de62f090d5" containerName="nova-api-log" Oct 08 18:31:34 crc kubenswrapper[4750]: E1008 
18:31:34.464074 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93470753-915e-4675-9ecc-6942de332cd4" containerName="dnsmasq-dns" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.464081 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="93470753-915e-4675-9ecc-6942de332cd4" containerName="dnsmasq-dns" Oct 08 18:31:34 crc kubenswrapper[4750]: E1008 18:31:34.464108 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c616741-8a43-456e-a249-aee7e4d3764f" containerName="nova-cell1-conductor-db-sync" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.464116 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c616741-8a43-456e-a249-aee7e4d3764f" containerName="nova-cell1-conductor-db-sync" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.464406 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="93470753-915e-4675-9ecc-6942de332cd4" containerName="dnsmasq-dns" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.464471 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="90f77db6-aa1c-4d0c-8598-51de62f090d5" containerName="nova-api-api" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.464498 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c616741-8a43-456e-a249-aee7e4d3764f" containerName="nova-cell1-conductor-db-sync" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.464514 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f613fe6-8980-4ded-8c2f-c4222c597cf1" containerName="nova-manage" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.464531 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="90f77db6-aa1c-4d0c-8598-51de62f090d5" containerName="nova-api-log" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.465578 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.475644 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.479617 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.482758 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.484622 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.496674 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93470753-915e-4675-9ecc-6942de332cd4-config" (OuterVolumeSpecName: "config") pod "93470753-915e-4675-9ecc-6942de332cd4" (UID: "93470753-915e-4675-9ecc-6942de332cd4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.499293 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.499328 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93470753-915e-4675-9ecc-6942de332cd4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "93470753-915e-4675-9ecc-6942de332cd4" (UID: "93470753-915e-4675-9ecc-6942de332cd4"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.512286 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.515334 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93470753-915e-4675-9ecc-6942de332cd4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "93470753-915e-4675-9ecc-6942de332cd4" (UID: "93470753-915e-4675-9ecc-6942de332cd4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.518242 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93470753-915e-4675-9ecc-6942de332cd4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "93470753-915e-4675-9ecc-6942de332cd4" (UID: "93470753-915e-4675-9ecc-6942de332cd4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.523455 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93470753-915e-4675-9ecc-6942de332cd4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "93470753-915e-4675-9ecc-6942de332cd4" (UID: "93470753-915e-4675-9ecc-6942de332cd4"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.525060 4750 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/93470753-915e-4675-9ecc-6942de332cd4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.525129 4750 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93470753-915e-4675-9ecc-6942de332cd4-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.525142 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93470753-915e-4675-9ecc-6942de332cd4-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.525167 4750 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93470753-915e-4675-9ecc-6942de332cd4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.525179 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gmvl\" (UniqueName: \"kubernetes.io/projected/93470753-915e-4675-9ecc-6942de332cd4-kube-api-access-6gmvl\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.525191 4750 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93470753-915e-4675-9ecc-6942de332cd4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.626469 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vctmd\" (UniqueName: \"kubernetes.io/projected/85db2d5f-53d2-4875-b334-8b3689922391-kube-api-access-vctmd\") pod \"nova-api-0\" (UID: 
\"85db2d5f-53d2-4875-b334-8b3689922391\") " pod="openstack/nova-api-0" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.626520 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af09729b-3284-4dcd-91a1-5763d28daaf5-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"af09729b-3284-4dcd-91a1-5763d28daaf5\") " pod="openstack/nova-cell1-conductor-0" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.626584 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af09729b-3284-4dcd-91a1-5763d28daaf5-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"af09729b-3284-4dcd-91a1-5763d28daaf5\") " pod="openstack/nova-cell1-conductor-0" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.626631 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnpm7\" (UniqueName: \"kubernetes.io/projected/af09729b-3284-4dcd-91a1-5763d28daaf5-kube-api-access-qnpm7\") pod \"nova-cell1-conductor-0\" (UID: \"af09729b-3284-4dcd-91a1-5763d28daaf5\") " pod="openstack/nova-cell1-conductor-0" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.626656 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85db2d5f-53d2-4875-b334-8b3689922391-config-data\") pod \"nova-api-0\" (UID: \"85db2d5f-53d2-4875-b334-8b3689922391\") " pod="openstack/nova-api-0" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.626686 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85db2d5f-53d2-4875-b334-8b3689922391-logs\") pod \"nova-api-0\" (UID: \"85db2d5f-53d2-4875-b334-8b3689922391\") " pod="openstack/nova-api-0" Oct 08 18:31:34 
crc kubenswrapper[4750]: I1008 18:31:34.626714 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85db2d5f-53d2-4875-b334-8b3689922391-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"85db2d5f-53d2-4875-b334-8b3689922391\") " pod="openstack/nova-api-0" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.729206 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vctmd\" (UniqueName: \"kubernetes.io/projected/85db2d5f-53d2-4875-b334-8b3689922391-kube-api-access-vctmd\") pod \"nova-api-0\" (UID: \"85db2d5f-53d2-4875-b334-8b3689922391\") " pod="openstack/nova-api-0" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.729267 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af09729b-3284-4dcd-91a1-5763d28daaf5-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"af09729b-3284-4dcd-91a1-5763d28daaf5\") " pod="openstack/nova-cell1-conductor-0" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.729337 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af09729b-3284-4dcd-91a1-5763d28daaf5-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"af09729b-3284-4dcd-91a1-5763d28daaf5\") " pod="openstack/nova-cell1-conductor-0" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.729409 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnpm7\" (UniqueName: \"kubernetes.io/projected/af09729b-3284-4dcd-91a1-5763d28daaf5-kube-api-access-qnpm7\") pod \"nova-cell1-conductor-0\" (UID: \"af09729b-3284-4dcd-91a1-5763d28daaf5\") " pod="openstack/nova-cell1-conductor-0" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.729438 4750 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85db2d5f-53d2-4875-b334-8b3689922391-config-data\") pod \"nova-api-0\" (UID: \"85db2d5f-53d2-4875-b334-8b3689922391\") " pod="openstack/nova-api-0" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.729496 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85db2d5f-53d2-4875-b334-8b3689922391-logs\") pod \"nova-api-0\" (UID: \"85db2d5f-53d2-4875-b334-8b3689922391\") " pod="openstack/nova-api-0" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.729544 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85db2d5f-53d2-4875-b334-8b3689922391-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"85db2d5f-53d2-4875-b334-8b3689922391\") " pod="openstack/nova-api-0" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.730388 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85db2d5f-53d2-4875-b334-8b3689922391-logs\") pod \"nova-api-0\" (UID: \"85db2d5f-53d2-4875-b334-8b3689922391\") " pod="openstack/nova-api-0" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.733531 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85db2d5f-53d2-4875-b334-8b3689922391-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"85db2d5f-53d2-4875-b334-8b3689922391\") " pod="openstack/nova-api-0" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.735152 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af09729b-3284-4dcd-91a1-5763d28daaf5-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"af09729b-3284-4dcd-91a1-5763d28daaf5\") " pod="openstack/nova-cell1-conductor-0" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 
18:31:34.736439 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85db2d5f-53d2-4875-b334-8b3689922391-config-data\") pod \"nova-api-0\" (UID: \"85db2d5f-53d2-4875-b334-8b3689922391\") " pod="openstack/nova-api-0" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.741965 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af09729b-3284-4dcd-91a1-5763d28daaf5-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"af09729b-3284-4dcd-91a1-5763d28daaf5\") " pod="openstack/nova-cell1-conductor-0" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.747020 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90f77db6-aa1c-4d0c-8598-51de62f090d5" path="/var/lib/kubelet/pods/90f77db6-aa1c-4d0c-8598-51de62f090d5/volumes" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.748179 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnpm7\" (UniqueName: \"kubernetes.io/projected/af09729b-3284-4dcd-91a1-5763d28daaf5-kube-api-access-qnpm7\") pod \"nova-cell1-conductor-0\" (UID: \"af09729b-3284-4dcd-91a1-5763d28daaf5\") " pod="openstack/nova-cell1-conductor-0" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.750284 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vctmd\" (UniqueName: \"kubernetes.io/projected/85db2d5f-53d2-4875-b334-8b3689922391-kube-api-access-vctmd\") pod \"nova-api-0\" (UID: \"85db2d5f-53d2-4875-b334-8b3689922391\") " pod="openstack/nova-api-0" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.843511 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.848543 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 08 18:31:34 crc kubenswrapper[4750]: I1008 18:31:34.937941 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.034080 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cccj2\" (UniqueName: \"kubernetes.io/projected/02e6583b-2f8d-4d50-9146-65c3f281283e-kube-api-access-cccj2\") pod \"02e6583b-2f8d-4d50-9146-65c3f281283e\" (UID: \"02e6583b-2f8d-4d50-9146-65c3f281283e\") " Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.034127 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02e6583b-2f8d-4d50-9146-65c3f281283e-config-data\") pod \"02e6583b-2f8d-4d50-9146-65c3f281283e\" (UID: \"02e6583b-2f8d-4d50-9146-65c3f281283e\") " Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.034208 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02e6583b-2f8d-4d50-9146-65c3f281283e-combined-ca-bundle\") pod \"02e6583b-2f8d-4d50-9146-65c3f281283e\" (UID: \"02e6583b-2f8d-4d50-9146-65c3f281283e\") " Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.034257 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/02e6583b-2f8d-4d50-9146-65c3f281283e-nova-metadata-tls-certs\") pod \"02e6583b-2f8d-4d50-9146-65c3f281283e\" (UID: \"02e6583b-2f8d-4d50-9146-65c3f281283e\") " Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.034320 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02e6583b-2f8d-4d50-9146-65c3f281283e-logs\") pod \"02e6583b-2f8d-4d50-9146-65c3f281283e\" (UID: 
\"02e6583b-2f8d-4d50-9146-65c3f281283e\") " Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.035571 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02e6583b-2f8d-4d50-9146-65c3f281283e-logs" (OuterVolumeSpecName: "logs") pod "02e6583b-2f8d-4d50-9146-65c3f281283e" (UID: "02e6583b-2f8d-4d50-9146-65c3f281283e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.043501 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02e6583b-2f8d-4d50-9146-65c3f281283e-kube-api-access-cccj2" (OuterVolumeSpecName: "kube-api-access-cccj2") pod "02e6583b-2f8d-4d50-9146-65c3f281283e" (UID: "02e6583b-2f8d-4d50-9146-65c3f281283e"). InnerVolumeSpecName "kube-api-access-cccj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.062365 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02e6583b-2f8d-4d50-9146-65c3f281283e-config-data" (OuterVolumeSpecName: "config-data") pod "02e6583b-2f8d-4d50-9146-65c3f281283e" (UID: "02e6583b-2f8d-4d50-9146-65c3f281283e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.068029 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02e6583b-2f8d-4d50-9146-65c3f281283e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "02e6583b-2f8d-4d50-9146-65c3f281283e" (UID: "02e6583b-2f8d-4d50-9146-65c3f281283e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.090973 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02e6583b-2f8d-4d50-9146-65c3f281283e-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "02e6583b-2f8d-4d50-9146-65c3f281283e" (UID: "02e6583b-2f8d-4d50-9146-65c3f281283e"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.136861 4750 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02e6583b-2f8d-4d50-9146-65c3f281283e-logs\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.137342 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cccj2\" (UniqueName: \"kubernetes.io/projected/02e6583b-2f8d-4d50-9146-65c3f281283e-kube-api-access-cccj2\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.137364 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02e6583b-2f8d-4d50-9146-65c3f281283e-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.137373 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02e6583b-2f8d-4d50-9146-65c3f281283e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.137384 4750 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/02e6583b-2f8d-4d50-9146-65c3f281283e-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.315947 4750 generic.go:334] "Generic (PLEG): container finished" 
podID="02e6583b-2f8d-4d50-9146-65c3f281283e" containerID="d652e2b03305677ef44b919b7d42df2ab33acd8b817dcd5ce57ada13c119affe" exitCode=0 Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.315986 4750 generic.go:334] "Generic (PLEG): container finished" podID="02e6583b-2f8d-4d50-9146-65c3f281283e" containerID="0f0f16b89b3bf19ab974afdcdb07075bbdd7ea04f744c9988f331e7fdc1317a3" exitCode=143 Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.316031 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"02e6583b-2f8d-4d50-9146-65c3f281283e","Type":"ContainerDied","Data":"d652e2b03305677ef44b919b7d42df2ab33acd8b817dcd5ce57ada13c119affe"} Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.316063 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"02e6583b-2f8d-4d50-9146-65c3f281283e","Type":"ContainerDied","Data":"0f0f16b89b3bf19ab974afdcdb07075bbdd7ea04f744c9988f331e7fdc1317a3"} Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.316077 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"02e6583b-2f8d-4d50-9146-65c3f281283e","Type":"ContainerDied","Data":"abbdfaa6ba52a34b12503584de6cca216a3b10157bf5703271d6330a9a81eac1"} Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.316094 4750 scope.go:117] "RemoveContainer" containerID="d652e2b03305677ef44b919b7d42df2ab33acd8b817dcd5ce57ada13c119affe" Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.316189 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.322851 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bd785c49-qsxts" event={"ID":"93470753-915e-4675-9ecc-6942de332cd4","Type":"ContainerDied","Data":"4e29f45e32330d153539c27dcf19fd6eeb652daa9b10aad6cd8dbf6c716c6eb3"} Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.322986 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bd785c49-qsxts" Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.356104 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.361685 4750 scope.go:117] "RemoveContainer" containerID="0f0f16b89b3bf19ab974afdcdb07075bbdd7ea04f744c9988f331e7fdc1317a3" Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.384779 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.399660 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.423724 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84bd785c49-qsxts"] Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.423790 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84bd785c49-qsxts"] Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.430995 4750 scope.go:117] "RemoveContainer" containerID="d652e2b03305677ef44b919b7d42df2ab33acd8b817dcd5ce57ada13c119affe" Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.431919 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 08 18:31:35 crc kubenswrapper[4750]: E1008 18:31:35.432276 4750 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="02e6583b-2f8d-4d50-9146-65c3f281283e" containerName="nova-metadata-metadata" Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.432293 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="02e6583b-2f8d-4d50-9146-65c3f281283e" containerName="nova-metadata-metadata" Oct 08 18:31:35 crc kubenswrapper[4750]: E1008 18:31:35.432334 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02e6583b-2f8d-4d50-9146-65c3f281283e" containerName="nova-metadata-log" Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.432341 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="02e6583b-2f8d-4d50-9146-65c3f281283e" containerName="nova-metadata-log" Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.432503 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="02e6583b-2f8d-4d50-9146-65c3f281283e" containerName="nova-metadata-metadata" Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.432716 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="02e6583b-2f8d-4d50-9146-65c3f281283e" containerName="nova-metadata-log" Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.433669 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.438266 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 08 18:31:35 crc kubenswrapper[4750]: E1008 18:31:35.441911 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d652e2b03305677ef44b919b7d42df2ab33acd8b817dcd5ce57ada13c119affe\": container with ID starting with d652e2b03305677ef44b919b7d42df2ab33acd8b817dcd5ce57ada13c119affe not found: ID does not exist" containerID="d652e2b03305677ef44b919b7d42df2ab33acd8b817dcd5ce57ada13c119affe" Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.441956 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d652e2b03305677ef44b919b7d42df2ab33acd8b817dcd5ce57ada13c119affe"} err="failed to get container status \"d652e2b03305677ef44b919b7d42df2ab33acd8b817dcd5ce57ada13c119affe\": rpc error: code = NotFound desc = could not find container \"d652e2b03305677ef44b919b7d42df2ab33acd8b817dcd5ce57ada13c119affe\": container with ID starting with d652e2b03305677ef44b919b7d42df2ab33acd8b817dcd5ce57ada13c119affe not found: ID does not exist" Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.441985 4750 scope.go:117] "RemoveContainer" containerID="0f0f16b89b3bf19ab974afdcdb07075bbdd7ea04f744c9988f331e7fdc1317a3" Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.448758 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 08 18:31:35 crc kubenswrapper[4750]: E1008 18:31:35.449104 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f0f16b89b3bf19ab974afdcdb07075bbdd7ea04f744c9988f331e7fdc1317a3\": container with ID starting with 0f0f16b89b3bf19ab974afdcdb07075bbdd7ea04f744c9988f331e7fdc1317a3 not 
found: ID does not exist" containerID="0f0f16b89b3bf19ab974afdcdb07075bbdd7ea04f744c9988f331e7fdc1317a3" Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.449136 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f0f16b89b3bf19ab974afdcdb07075bbdd7ea04f744c9988f331e7fdc1317a3"} err="failed to get container status \"0f0f16b89b3bf19ab974afdcdb07075bbdd7ea04f744c9988f331e7fdc1317a3\": rpc error: code = NotFound desc = could not find container \"0f0f16b89b3bf19ab974afdcdb07075bbdd7ea04f744c9988f331e7fdc1317a3\": container with ID starting with 0f0f16b89b3bf19ab974afdcdb07075bbdd7ea04f744c9988f331e7fdc1317a3 not found: ID does not exist" Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.449168 4750 scope.go:117] "RemoveContainer" containerID="d652e2b03305677ef44b919b7d42df2ab33acd8b817dcd5ce57ada13c119affe" Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.468660 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d652e2b03305677ef44b919b7d42df2ab33acd8b817dcd5ce57ada13c119affe"} err="failed to get container status \"d652e2b03305677ef44b919b7d42df2ab33acd8b817dcd5ce57ada13c119affe\": rpc error: code = NotFound desc = could not find container \"d652e2b03305677ef44b919b7d42df2ab33acd8b817dcd5ce57ada13c119affe\": container with ID starting with d652e2b03305677ef44b919b7d42df2ab33acd8b817dcd5ce57ada13c119affe not found: ID does not exist" Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.468699 4750 scope.go:117] "RemoveContainer" containerID="0f0f16b89b3bf19ab974afdcdb07075bbdd7ea04f744c9988f331e7fdc1317a3" Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.478987 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f0f16b89b3bf19ab974afdcdb07075bbdd7ea04f744c9988f331e7fdc1317a3"} err="failed to get container status \"0f0f16b89b3bf19ab974afdcdb07075bbdd7ea04f744c9988f331e7fdc1317a3\": rpc error: 
code = NotFound desc = could not find container \"0f0f16b89b3bf19ab974afdcdb07075bbdd7ea04f744c9988f331e7fdc1317a3\": container with ID starting with 0f0f16b89b3bf19ab974afdcdb07075bbdd7ea04f744c9988f331e7fdc1317a3 not found: ID does not exist" Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.479021 4750 scope.go:117] "RemoveContainer" containerID="e491d282a1964ea6486700cedfdc02fe6e80d07443042915a7c5898b1fe1097c" Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.485670 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.546607 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1df3c5c3-627e-4bfb-9dd0-16976cc25f3d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1df3c5c3-627e-4bfb-9dd0-16976cc25f3d\") " pod="openstack/nova-metadata-0" Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.546711 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1df3c5c3-627e-4bfb-9dd0-16976cc25f3d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1df3c5c3-627e-4bfb-9dd0-16976cc25f3d\") " pod="openstack/nova-metadata-0" Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.546742 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1df3c5c3-627e-4bfb-9dd0-16976cc25f3d-logs\") pod \"nova-metadata-0\" (UID: \"1df3c5c3-627e-4bfb-9dd0-16976cc25f3d\") " pod="openstack/nova-metadata-0" Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.546763 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlvzh\" (UniqueName: 
\"kubernetes.io/projected/1df3c5c3-627e-4bfb-9dd0-16976cc25f3d-kube-api-access-qlvzh\") pod \"nova-metadata-0\" (UID: \"1df3c5c3-627e-4bfb-9dd0-16976cc25f3d\") " pod="openstack/nova-metadata-0" Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.546792 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1df3c5c3-627e-4bfb-9dd0-16976cc25f3d-config-data\") pod \"nova-metadata-0\" (UID: \"1df3c5c3-627e-4bfb-9dd0-16976cc25f3d\") " pod="openstack/nova-metadata-0" Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.547426 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.636416 4750 scope.go:117] "RemoveContainer" containerID="1ebda8fdacd7b2332560b60b9863fe605e2cd281473db07a2fc870cc4e3902b5" Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.652354 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1df3c5c3-627e-4bfb-9dd0-16976cc25f3d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1df3c5c3-627e-4bfb-9dd0-16976cc25f3d\") " pod="openstack/nova-metadata-0" Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.652407 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1df3c5c3-627e-4bfb-9dd0-16976cc25f3d-logs\") pod \"nova-metadata-0\" (UID: \"1df3c5c3-627e-4bfb-9dd0-16976cc25f3d\") " pod="openstack/nova-metadata-0" Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.652430 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlvzh\" (UniqueName: \"kubernetes.io/projected/1df3c5c3-627e-4bfb-9dd0-16976cc25f3d-kube-api-access-qlvzh\") pod \"nova-metadata-0\" (UID: \"1df3c5c3-627e-4bfb-9dd0-16976cc25f3d\") " pod="openstack/nova-metadata-0" Oct 08 
18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.652456 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1df3c5c3-627e-4bfb-9dd0-16976cc25f3d-config-data\") pod \"nova-metadata-0\" (UID: \"1df3c5c3-627e-4bfb-9dd0-16976cc25f3d\") " pod="openstack/nova-metadata-0" Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.652542 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1df3c5c3-627e-4bfb-9dd0-16976cc25f3d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1df3c5c3-627e-4bfb-9dd0-16976cc25f3d\") " pod="openstack/nova-metadata-0" Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.653540 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1df3c5c3-627e-4bfb-9dd0-16976cc25f3d-logs\") pod \"nova-metadata-0\" (UID: \"1df3c5c3-627e-4bfb-9dd0-16976cc25f3d\") " pod="openstack/nova-metadata-0" Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.656451 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1df3c5c3-627e-4bfb-9dd0-16976cc25f3d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1df3c5c3-627e-4bfb-9dd0-16976cc25f3d\") " pod="openstack/nova-metadata-0" Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.656967 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1df3c5c3-627e-4bfb-9dd0-16976cc25f3d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1df3c5c3-627e-4bfb-9dd0-16976cc25f3d\") " pod="openstack/nova-metadata-0" Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.661328 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1df3c5c3-627e-4bfb-9dd0-16976cc25f3d-config-data\") pod \"nova-metadata-0\" (UID: \"1df3c5c3-627e-4bfb-9dd0-16976cc25f3d\") " pod="openstack/nova-metadata-0" Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.673214 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlvzh\" (UniqueName: \"kubernetes.io/projected/1df3c5c3-627e-4bfb-9dd0-16976cc25f3d-kube-api-access-qlvzh\") pod \"nova-metadata-0\" (UID: \"1df3c5c3-627e-4bfb-9dd0-16976cc25f3d\") " pod="openstack/nova-metadata-0" Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.866034 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 18:31:35 crc kubenswrapper[4750]: I1008 18:31:35.911410 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 18:31:36 crc kubenswrapper[4750]: I1008 18:31:36.058468 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcc5w\" (UniqueName: \"kubernetes.io/projected/4460ef21-2426-4fc2-bba3-147fdd612a0c-kube-api-access-kcc5w\") pod \"4460ef21-2426-4fc2-bba3-147fdd612a0c\" (UID: \"4460ef21-2426-4fc2-bba3-147fdd612a0c\") " Oct 08 18:31:36 crc kubenswrapper[4750]: I1008 18:31:36.058940 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4460ef21-2426-4fc2-bba3-147fdd612a0c-config-data\") pod \"4460ef21-2426-4fc2-bba3-147fdd612a0c\" (UID: \"4460ef21-2426-4fc2-bba3-147fdd612a0c\") " Oct 08 18:31:36 crc kubenswrapper[4750]: I1008 18:31:36.058997 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4460ef21-2426-4fc2-bba3-147fdd612a0c-combined-ca-bundle\") pod \"4460ef21-2426-4fc2-bba3-147fdd612a0c\" (UID: \"4460ef21-2426-4fc2-bba3-147fdd612a0c\") " Oct 08 18:31:36 crc 
kubenswrapper[4750]: I1008 18:31:36.062983 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4460ef21-2426-4fc2-bba3-147fdd612a0c-kube-api-access-kcc5w" (OuterVolumeSpecName: "kube-api-access-kcc5w") pod "4460ef21-2426-4fc2-bba3-147fdd612a0c" (UID: "4460ef21-2426-4fc2-bba3-147fdd612a0c"). InnerVolumeSpecName "kube-api-access-kcc5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:31:36 crc kubenswrapper[4750]: I1008 18:31:36.103107 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4460ef21-2426-4fc2-bba3-147fdd612a0c-config-data" (OuterVolumeSpecName: "config-data") pod "4460ef21-2426-4fc2-bba3-147fdd612a0c" (UID: "4460ef21-2426-4fc2-bba3-147fdd612a0c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:31:36 crc kubenswrapper[4750]: I1008 18:31:36.104042 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4460ef21-2426-4fc2-bba3-147fdd612a0c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4460ef21-2426-4fc2-bba3-147fdd612a0c" (UID: "4460ef21-2426-4fc2-bba3-147fdd612a0c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:31:36 crc kubenswrapper[4750]: I1008 18:31:36.161260 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcc5w\" (UniqueName: \"kubernetes.io/projected/4460ef21-2426-4fc2-bba3-147fdd612a0c-kube-api-access-kcc5w\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:36 crc kubenswrapper[4750]: I1008 18:31:36.161297 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4460ef21-2426-4fc2-bba3-147fdd612a0c-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:36 crc kubenswrapper[4750]: I1008 18:31:36.161310 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4460ef21-2426-4fc2-bba3-147fdd612a0c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:36 crc kubenswrapper[4750]: I1008 18:31:36.294468 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 08 18:31:36 crc kubenswrapper[4750]: I1008 18:31:36.316581 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 18:31:36 crc kubenswrapper[4750]: W1008 18:31:36.318087 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1df3c5c3_627e_4bfb_9dd0_16976cc25f3d.slice/crio-dbfc4f38b84ecd9ecc42d498baaaf06774e43287d385c11988214c2ae759476a WatchSource:0}: Error finding container dbfc4f38b84ecd9ecc42d498baaaf06774e43287d385c11988214c2ae759476a: Status 404 returned error can't find the container with id dbfc4f38b84ecd9ecc42d498baaaf06774e43287d385c11988214c2ae759476a Oct 08 18:31:36 crc kubenswrapper[4750]: I1008 18:31:36.347511 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"1df3c5c3-627e-4bfb-9dd0-16976cc25f3d","Type":"ContainerStarted","Data":"dbfc4f38b84ecd9ecc42d498baaaf06774e43287d385c11988214c2ae759476a"} Oct 08 18:31:36 crc kubenswrapper[4750]: I1008 18:31:36.348865 4750 generic.go:334] "Generic (PLEG): container finished" podID="4460ef21-2426-4fc2-bba3-147fdd612a0c" containerID="7fe5d92f744690c60f4ac91791fe5b26e5406da0338f6de8b129cdadf8f86cc7" exitCode=0 Oct 08 18:31:36 crc kubenswrapper[4750]: I1008 18:31:36.348906 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4460ef21-2426-4fc2-bba3-147fdd612a0c","Type":"ContainerDied","Data":"7fe5d92f744690c60f4ac91791fe5b26e5406da0338f6de8b129cdadf8f86cc7"} Oct 08 18:31:36 crc kubenswrapper[4750]: I1008 18:31:36.348929 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4460ef21-2426-4fc2-bba3-147fdd612a0c","Type":"ContainerDied","Data":"5d9d23d7d6037a4011884b0e12bf635fd88493b701923002ffce41e007d31d3b"} Oct 08 18:31:36 crc kubenswrapper[4750]: I1008 18:31:36.348945 4750 scope.go:117] "RemoveContainer" containerID="7fe5d92f744690c60f4ac91791fe5b26e5406da0338f6de8b129cdadf8f86cc7" Oct 08 18:31:36 crc kubenswrapper[4750]: I1008 18:31:36.348955 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 18:31:36 crc kubenswrapper[4750]: I1008 18:31:36.354054 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"85db2d5f-53d2-4875-b334-8b3689922391","Type":"ContainerStarted","Data":"f62652d1a85f2cd9154f4bdc39a53f179e23061e3f9401355832d5c8c6907da6"} Oct 08 18:31:36 crc kubenswrapper[4750]: I1008 18:31:36.354089 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"85db2d5f-53d2-4875-b334-8b3689922391","Type":"ContainerStarted","Data":"fbb43a2bb1a642127ed4909d702e507174f5e1278c2f9af80dcc642f5cf15ae7"} Oct 08 18:31:36 crc kubenswrapper[4750]: I1008 18:31:36.354099 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"85db2d5f-53d2-4875-b334-8b3689922391","Type":"ContainerStarted","Data":"8c1445e38159c1fb76ae461a1d9e4a6d90e6e3ab9a816965938ba2a5322739af"} Oct 08 18:31:36 crc kubenswrapper[4750]: I1008 18:31:36.378204 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.378187451 podStartE2EDuration="2.378187451s" podCreationTimestamp="2025-10-08 18:31:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:31:36.376455169 +0000 UTC m=+1252.289426202" watchObservedRunningTime="2025-10-08 18:31:36.378187451 +0000 UTC m=+1252.291158464" Oct 08 18:31:36 crc kubenswrapper[4750]: I1008 18:31:36.386257 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"af09729b-3284-4dcd-91a1-5763d28daaf5","Type":"ContainerStarted","Data":"c89b5ffde0edcb272b86a232392e60a596a9950b2acdc482f24207d4aea9c839"} Oct 08 18:31:36 crc kubenswrapper[4750]: I1008 18:31:36.386308 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"af09729b-3284-4dcd-91a1-5763d28daaf5","Type":"ContainerStarted","Data":"5ef135a41ae873ae93db09009b8373bbb16fc3f1f05dbfb10390bcd13e9c641b"} Oct 08 18:31:36 crc kubenswrapper[4750]: I1008 18:31:36.387486 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 08 18:31:36 crc kubenswrapper[4750]: I1008 18:31:36.405819 4750 scope.go:117] "RemoveContainer" containerID="7fe5d92f744690c60f4ac91791fe5b26e5406da0338f6de8b129cdadf8f86cc7" Oct 08 18:31:36 crc kubenswrapper[4750]: E1008 18:31:36.406294 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fe5d92f744690c60f4ac91791fe5b26e5406da0338f6de8b129cdadf8f86cc7\": container with ID starting with 7fe5d92f744690c60f4ac91791fe5b26e5406da0338f6de8b129cdadf8f86cc7 not found: ID does not exist" containerID="7fe5d92f744690c60f4ac91791fe5b26e5406da0338f6de8b129cdadf8f86cc7" Oct 08 18:31:36 crc kubenswrapper[4750]: I1008 18:31:36.406333 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fe5d92f744690c60f4ac91791fe5b26e5406da0338f6de8b129cdadf8f86cc7"} err="failed to get container status \"7fe5d92f744690c60f4ac91791fe5b26e5406da0338f6de8b129cdadf8f86cc7\": rpc error: code = NotFound desc = could not find container \"7fe5d92f744690c60f4ac91791fe5b26e5406da0338f6de8b129cdadf8f86cc7\": container with ID starting with 7fe5d92f744690c60f4ac91791fe5b26e5406da0338f6de8b129cdadf8f86cc7 not found: ID does not exist" Oct 08 18:31:36 crc kubenswrapper[4750]: I1008 18:31:36.426906 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 18:31:36 crc kubenswrapper[4750]: I1008 18:31:36.443747 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 18:31:36 crc kubenswrapper[4750]: I1008 18:31:36.450376 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.450360156 podStartE2EDuration="2.450360156s" podCreationTimestamp="2025-10-08 18:31:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:31:36.407453607 +0000 UTC m=+1252.320424630" watchObservedRunningTime="2025-10-08 18:31:36.450360156 +0000 UTC m=+1252.363331169" Oct 08 18:31:36 crc kubenswrapper[4750]: I1008 18:31:36.457694 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 18:31:36 crc kubenswrapper[4750]: E1008 18:31:36.458222 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4460ef21-2426-4fc2-bba3-147fdd612a0c" containerName="nova-scheduler-scheduler" Oct 08 18:31:36 crc kubenswrapper[4750]: I1008 18:31:36.458252 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="4460ef21-2426-4fc2-bba3-147fdd612a0c" containerName="nova-scheduler-scheduler" Oct 08 18:31:36 crc kubenswrapper[4750]: I1008 18:31:36.458516 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="4460ef21-2426-4fc2-bba3-147fdd612a0c" containerName="nova-scheduler-scheduler" Oct 08 18:31:36 crc kubenswrapper[4750]: I1008 18:31:36.459315 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 18:31:36 crc kubenswrapper[4750]: I1008 18:31:36.461448 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 08 18:31:36 crc kubenswrapper[4750]: I1008 18:31:36.465222 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 18:31:36 crc kubenswrapper[4750]: I1008 18:31:36.571991 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15c75a6e-1d37-4345-a579-62cb0ac8c3fe-config-data\") pod \"nova-scheduler-0\" (UID: \"15c75a6e-1d37-4345-a579-62cb0ac8c3fe\") " pod="openstack/nova-scheduler-0" Oct 08 18:31:36 crc kubenswrapper[4750]: I1008 18:31:36.572085 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15c75a6e-1d37-4345-a579-62cb0ac8c3fe-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"15c75a6e-1d37-4345-a579-62cb0ac8c3fe\") " pod="openstack/nova-scheduler-0" Oct 08 18:31:36 crc kubenswrapper[4750]: I1008 18:31:36.572293 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxkr5\" (UniqueName: \"kubernetes.io/projected/15c75a6e-1d37-4345-a579-62cb0ac8c3fe-kube-api-access-xxkr5\") pod \"nova-scheduler-0\" (UID: \"15c75a6e-1d37-4345-a579-62cb0ac8c3fe\") " pod="openstack/nova-scheduler-0" Oct 08 18:31:36 crc kubenswrapper[4750]: I1008 18:31:36.674151 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15c75a6e-1d37-4345-a579-62cb0ac8c3fe-config-data\") pod \"nova-scheduler-0\" (UID: \"15c75a6e-1d37-4345-a579-62cb0ac8c3fe\") " pod="openstack/nova-scheduler-0" Oct 08 18:31:36 crc kubenswrapper[4750]: I1008 18:31:36.675474 4750 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15c75a6e-1d37-4345-a579-62cb0ac8c3fe-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"15c75a6e-1d37-4345-a579-62cb0ac8c3fe\") " pod="openstack/nova-scheduler-0" Oct 08 18:31:36 crc kubenswrapper[4750]: I1008 18:31:36.675591 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxkr5\" (UniqueName: \"kubernetes.io/projected/15c75a6e-1d37-4345-a579-62cb0ac8c3fe-kube-api-access-xxkr5\") pod \"nova-scheduler-0\" (UID: \"15c75a6e-1d37-4345-a579-62cb0ac8c3fe\") " pod="openstack/nova-scheduler-0" Oct 08 18:31:36 crc kubenswrapper[4750]: I1008 18:31:36.682856 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15c75a6e-1d37-4345-a579-62cb0ac8c3fe-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"15c75a6e-1d37-4345-a579-62cb0ac8c3fe\") " pod="openstack/nova-scheduler-0" Oct 08 18:31:36 crc kubenswrapper[4750]: I1008 18:31:36.682953 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15c75a6e-1d37-4345-a579-62cb0ac8c3fe-config-data\") pod \"nova-scheduler-0\" (UID: \"15c75a6e-1d37-4345-a579-62cb0ac8c3fe\") " pod="openstack/nova-scheduler-0" Oct 08 18:31:36 crc kubenswrapper[4750]: I1008 18:31:36.690448 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxkr5\" (UniqueName: \"kubernetes.io/projected/15c75a6e-1d37-4345-a579-62cb0ac8c3fe-kube-api-access-xxkr5\") pod \"nova-scheduler-0\" (UID: \"15c75a6e-1d37-4345-a579-62cb0ac8c3fe\") " pod="openstack/nova-scheduler-0" Oct 08 18:31:36 crc kubenswrapper[4750]: I1008 18:31:36.746780 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02e6583b-2f8d-4d50-9146-65c3f281283e" path="/var/lib/kubelet/pods/02e6583b-2f8d-4d50-9146-65c3f281283e/volumes" Oct 08 
18:31:36 crc kubenswrapper[4750]: I1008 18:31:36.747916 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4460ef21-2426-4fc2-bba3-147fdd612a0c" path="/var/lib/kubelet/pods/4460ef21-2426-4fc2-bba3-147fdd612a0c/volumes" Oct 08 18:31:36 crc kubenswrapper[4750]: I1008 18:31:36.748652 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93470753-915e-4675-9ecc-6942de332cd4" path="/var/lib/kubelet/pods/93470753-915e-4675-9ecc-6942de332cd4/volumes" Oct 08 18:31:36 crc kubenswrapper[4750]: I1008 18:31:36.785053 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 18:31:37 crc kubenswrapper[4750]: I1008 18:31:37.198475 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 18:31:37 crc kubenswrapper[4750]: I1008 18:31:37.398339 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1df3c5c3-627e-4bfb-9dd0-16976cc25f3d","Type":"ContainerStarted","Data":"a602b19189755c874600794f7440ec259a3b2c64913b3122661d9227a9bf77dd"} Oct 08 18:31:37 crc kubenswrapper[4750]: I1008 18:31:37.398381 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1df3c5c3-627e-4bfb-9dd0-16976cc25f3d","Type":"ContainerStarted","Data":"d46b15d4798ebc7d868aadfede508f86cd7efdd6d4843dec421ee6b41081accc"} Oct 08 18:31:37 crc kubenswrapper[4750]: I1008 18:31:37.403537 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"15c75a6e-1d37-4345-a579-62cb0ac8c3fe","Type":"ContainerStarted","Data":"9bad6f98b1b86ebd196d24fe336fd50ac64decc62f1a30579eb4319296bcc551"} Oct 08 18:31:37 crc kubenswrapper[4750]: I1008 18:31:37.403586 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"15c75a6e-1d37-4345-a579-62cb0ac8c3fe","Type":"ContainerStarted","Data":"6bd4bcd958ffecb7a4c0fbf6a172399e254402b278f8c9052cbea728cb7c88d9"} Oct 08 18:31:37 crc kubenswrapper[4750]: I1008 18:31:37.434177 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.434157725 podStartE2EDuration="1.434157725s" podCreationTimestamp="2025-10-08 18:31:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:31:37.428843795 +0000 UTC m=+1253.341814828" watchObservedRunningTime="2025-10-08 18:31:37.434157725 +0000 UTC m=+1253.347128738" Oct 08 18:31:37 crc kubenswrapper[4750]: I1008 18:31:37.434805 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.4347969 podStartE2EDuration="2.4347969s" podCreationTimestamp="2025-10-08 18:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:31:37.415060238 +0000 UTC m=+1253.328031251" watchObservedRunningTime="2025-10-08 18:31:37.4347969 +0000 UTC m=+1253.347767913" Oct 08 18:31:39 crc kubenswrapper[4750]: I1008 18:31:39.524343 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 18:31:39 crc kubenswrapper[4750]: I1008 18:31:39.525095 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="9f1aa3e0-67d1-4e4d-b162-b21c9f185d3e" containerName="kube-state-metrics" containerID="cri-o://aff78b03ad9b7af1c0ed4fcc7ecb7e13bec75ee5635ed6c1396b62f06bda2e49" gracePeriod=30 Oct 08 18:31:40 crc kubenswrapper[4750]: I1008 18:31:40.013849 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 08 18:31:40 crc kubenswrapper[4750]: I1008 18:31:40.144146 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wz2tm\" (UniqueName: \"kubernetes.io/projected/9f1aa3e0-67d1-4e4d-b162-b21c9f185d3e-kube-api-access-wz2tm\") pod \"9f1aa3e0-67d1-4e4d-b162-b21c9f185d3e\" (UID: \"9f1aa3e0-67d1-4e4d-b162-b21c9f185d3e\") " Oct 08 18:31:40 crc kubenswrapper[4750]: I1008 18:31:40.149915 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f1aa3e0-67d1-4e4d-b162-b21c9f185d3e-kube-api-access-wz2tm" (OuterVolumeSpecName: "kube-api-access-wz2tm") pod "9f1aa3e0-67d1-4e4d-b162-b21c9f185d3e" (UID: "9f1aa3e0-67d1-4e4d-b162-b21c9f185d3e"). InnerVolumeSpecName "kube-api-access-wz2tm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:31:40 crc kubenswrapper[4750]: I1008 18:31:40.247032 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wz2tm\" (UniqueName: \"kubernetes.io/projected/9f1aa3e0-67d1-4e4d-b162-b21c9f185d3e-kube-api-access-wz2tm\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:40 crc kubenswrapper[4750]: I1008 18:31:40.428086 4750 generic.go:334] "Generic (PLEG): container finished" podID="9f1aa3e0-67d1-4e4d-b162-b21c9f185d3e" containerID="aff78b03ad9b7af1c0ed4fcc7ecb7e13bec75ee5635ed6c1396b62f06bda2e49" exitCode=2 Oct 08 18:31:40 crc kubenswrapper[4750]: I1008 18:31:40.428128 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9f1aa3e0-67d1-4e4d-b162-b21c9f185d3e","Type":"ContainerDied","Data":"aff78b03ad9b7af1c0ed4fcc7ecb7e13bec75ee5635ed6c1396b62f06bda2e49"} Oct 08 18:31:40 crc kubenswrapper[4750]: I1008 18:31:40.428160 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"9f1aa3e0-67d1-4e4d-b162-b21c9f185d3e","Type":"ContainerDied","Data":"92c19aa0fc4019d8868e8a01e9e8729c234e58647563aa88315e7b75a63d334c"} Oct 08 18:31:40 crc kubenswrapper[4750]: I1008 18:31:40.428176 4750 scope.go:117] "RemoveContainer" containerID="aff78b03ad9b7af1c0ed4fcc7ecb7e13bec75ee5635ed6c1396b62f06bda2e49" Oct 08 18:31:40 crc kubenswrapper[4750]: I1008 18:31:40.428300 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 08 18:31:40 crc kubenswrapper[4750]: I1008 18:31:40.466404 4750 scope.go:117] "RemoveContainer" containerID="aff78b03ad9b7af1c0ed4fcc7ecb7e13bec75ee5635ed6c1396b62f06bda2e49" Oct 08 18:31:40 crc kubenswrapper[4750]: E1008 18:31:40.467490 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aff78b03ad9b7af1c0ed4fcc7ecb7e13bec75ee5635ed6c1396b62f06bda2e49\": container with ID starting with aff78b03ad9b7af1c0ed4fcc7ecb7e13bec75ee5635ed6c1396b62f06bda2e49 not found: ID does not exist" containerID="aff78b03ad9b7af1c0ed4fcc7ecb7e13bec75ee5635ed6c1396b62f06bda2e49" Oct 08 18:31:40 crc kubenswrapper[4750]: I1008 18:31:40.467522 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aff78b03ad9b7af1c0ed4fcc7ecb7e13bec75ee5635ed6c1396b62f06bda2e49"} err="failed to get container status \"aff78b03ad9b7af1c0ed4fcc7ecb7e13bec75ee5635ed6c1396b62f06bda2e49\": rpc error: code = NotFound desc = could not find container \"aff78b03ad9b7af1c0ed4fcc7ecb7e13bec75ee5635ed6c1396b62f06bda2e49\": container with ID starting with aff78b03ad9b7af1c0ed4fcc7ecb7e13bec75ee5635ed6c1396b62f06bda2e49 not found: ID does not exist" Oct 08 18:31:40 crc kubenswrapper[4750]: I1008 18:31:40.470613 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 18:31:40 crc kubenswrapper[4750]: I1008 18:31:40.484629 4750 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 18:31:40 crc kubenswrapper[4750]: I1008 18:31:40.498930 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 18:31:40 crc kubenswrapper[4750]: E1008 18:31:40.499522 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f1aa3e0-67d1-4e4d-b162-b21c9f185d3e" containerName="kube-state-metrics" Oct 08 18:31:40 crc kubenswrapper[4750]: I1008 18:31:40.499536 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f1aa3e0-67d1-4e4d-b162-b21c9f185d3e" containerName="kube-state-metrics" Oct 08 18:31:40 crc kubenswrapper[4750]: I1008 18:31:40.499754 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f1aa3e0-67d1-4e4d-b162-b21c9f185d3e" containerName="kube-state-metrics" Oct 08 18:31:40 crc kubenswrapper[4750]: I1008 18:31:40.501073 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 08 18:31:40 crc kubenswrapper[4750]: I1008 18:31:40.503094 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 08 18:31:40 crc kubenswrapper[4750]: I1008 18:31:40.503260 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 08 18:31:40 crc kubenswrapper[4750]: I1008 18:31:40.515322 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 18:31:40 crc kubenswrapper[4750]: I1008 18:31:40.551635 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f\") " pod="openstack/kube-state-metrics-0" Oct 08 18:31:40 crc kubenswrapper[4750]: I1008 18:31:40.551713 4750 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbjzw\" (UniqueName: \"kubernetes.io/projected/ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f-kube-api-access-mbjzw\") pod \"kube-state-metrics-0\" (UID: \"ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f\") " pod="openstack/kube-state-metrics-0" Oct 08 18:31:40 crc kubenswrapper[4750]: I1008 18:31:40.551745 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f\") " pod="openstack/kube-state-metrics-0" Oct 08 18:31:40 crc kubenswrapper[4750]: I1008 18:31:40.551811 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f\") " pod="openstack/kube-state-metrics-0" Oct 08 18:31:40 crc kubenswrapper[4750]: I1008 18:31:40.653795 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f\") " pod="openstack/kube-state-metrics-0" Oct 08 18:31:40 crc kubenswrapper[4750]: I1008 18:31:40.654091 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbjzw\" (UniqueName: \"kubernetes.io/projected/ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f-kube-api-access-mbjzw\") pod \"kube-state-metrics-0\" (UID: \"ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f\") " pod="openstack/kube-state-metrics-0" Oct 08 18:31:40 crc kubenswrapper[4750]: I1008 
18:31:40.654196 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f\") " pod="openstack/kube-state-metrics-0" Oct 08 18:31:40 crc kubenswrapper[4750]: I1008 18:31:40.654307 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f\") " pod="openstack/kube-state-metrics-0" Oct 08 18:31:40 crc kubenswrapper[4750]: I1008 18:31:40.657392 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f\") " pod="openstack/kube-state-metrics-0" Oct 08 18:31:40 crc kubenswrapper[4750]: I1008 18:31:40.658300 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f\") " pod="openstack/kube-state-metrics-0" Oct 08 18:31:40 crc kubenswrapper[4750]: I1008 18:31:40.670651 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f\") " pod="openstack/kube-state-metrics-0" Oct 08 18:31:40 crc kubenswrapper[4750]: I1008 18:31:40.671363 4750 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-mbjzw\" (UniqueName: \"kubernetes.io/projected/ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f-kube-api-access-mbjzw\") pod \"kube-state-metrics-0\" (UID: \"ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f\") " pod="openstack/kube-state-metrics-0" Oct 08 18:31:40 crc kubenswrapper[4750]: I1008 18:31:40.760312 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f1aa3e0-67d1-4e4d-b162-b21c9f185d3e" path="/var/lib/kubelet/pods/9f1aa3e0-67d1-4e4d-b162-b21c9f185d3e/volumes" Oct 08 18:31:40 crc kubenswrapper[4750]: I1008 18:31:40.828389 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 08 18:31:40 crc kubenswrapper[4750]: I1008 18:31:40.867183 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 08 18:31:40 crc kubenswrapper[4750]: I1008 18:31:40.868365 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 08 18:31:41 crc kubenswrapper[4750]: I1008 18:31:41.217906 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 18:31:41 crc kubenswrapper[4750]: I1008 18:31:41.218517 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1b9dae8f-58c7-4841-82f8-9cb0b37af29b" containerName="ceilometer-central-agent" containerID="cri-o://1364739c761c3065bfc80f1d7b2eaabc00aad0bb9bf3a44b0b16f89ab3bb69c9" gracePeriod=30 Oct 08 18:31:41 crc kubenswrapper[4750]: I1008 18:31:41.218665 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1b9dae8f-58c7-4841-82f8-9cb0b37af29b" containerName="proxy-httpd" containerID="cri-o://1ea5fa6432d4b81778294382a4ec09ac70111387f9a6ea8e463c289eb696c051" gracePeriod=30 Oct 08 18:31:41 crc kubenswrapper[4750]: I1008 18:31:41.218718 4750 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/ceilometer-0" podUID="1b9dae8f-58c7-4841-82f8-9cb0b37af29b" containerName="sg-core" containerID="cri-o://3c8343be783dcee750325e416eea9af5780cc5a6c8f96058e4257b642ee93b27" gracePeriod=30 Oct 08 18:31:41 crc kubenswrapper[4750]: I1008 18:31:41.218752 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1b9dae8f-58c7-4841-82f8-9cb0b37af29b" containerName="ceilometer-notification-agent" containerID="cri-o://148da3e0d1bb9d40e19fbc06b08276f602828a60fe6c71caab07af65fed0297e" gracePeriod=30 Oct 08 18:31:41 crc kubenswrapper[4750]: I1008 18:31:41.318349 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 18:31:41 crc kubenswrapper[4750]: W1008 18:31:41.320013 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac899792_9b55_4ae1_b9f2_24d8bb4ebb2f.slice/crio-831ad7f019599efb4d7fd03244f36bc634ee3d8556b35361f0d0cf6685fe1016 WatchSource:0}: Error finding container 831ad7f019599efb4d7fd03244f36bc634ee3d8556b35361f0d0cf6685fe1016: Status 404 returned error can't find the container with id 831ad7f019599efb4d7fd03244f36bc634ee3d8556b35361f0d0cf6685fe1016 Oct 08 18:31:41 crc kubenswrapper[4750]: I1008 18:31:41.439786 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f","Type":"ContainerStarted","Data":"831ad7f019599efb4d7fd03244f36bc634ee3d8556b35361f0d0cf6685fe1016"} Oct 08 18:31:41 crc kubenswrapper[4750]: I1008 18:31:41.442337 4750 generic.go:334] "Generic (PLEG): container finished" podID="1b9dae8f-58c7-4841-82f8-9cb0b37af29b" containerID="1ea5fa6432d4b81778294382a4ec09ac70111387f9a6ea8e463c289eb696c051" exitCode=0 Oct 08 18:31:41 crc kubenswrapper[4750]: I1008 18:31:41.442444 4750 generic.go:334] "Generic (PLEG): container finished" podID="1b9dae8f-58c7-4841-82f8-9cb0b37af29b" 
containerID="3c8343be783dcee750325e416eea9af5780cc5a6c8f96058e4257b642ee93b27" exitCode=2 Oct 08 18:31:41 crc kubenswrapper[4750]: I1008 18:31:41.442419 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b9dae8f-58c7-4841-82f8-9cb0b37af29b","Type":"ContainerDied","Data":"1ea5fa6432d4b81778294382a4ec09ac70111387f9a6ea8e463c289eb696c051"} Oct 08 18:31:41 crc kubenswrapper[4750]: I1008 18:31:41.442542 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b9dae8f-58c7-4841-82f8-9cb0b37af29b","Type":"ContainerDied","Data":"3c8343be783dcee750325e416eea9af5780cc5a6c8f96058e4257b642ee93b27"} Oct 08 18:31:41 crc kubenswrapper[4750]: I1008 18:31:41.785666 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 08 18:31:42 crc kubenswrapper[4750]: I1008 18:31:42.454418 4750 generic.go:334] "Generic (PLEG): container finished" podID="1b9dae8f-58c7-4841-82f8-9cb0b37af29b" containerID="1364739c761c3065bfc80f1d7b2eaabc00aad0bb9bf3a44b0b16f89ab3bb69c9" exitCode=0 Oct 08 18:31:42 crc kubenswrapper[4750]: I1008 18:31:42.454506 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b9dae8f-58c7-4841-82f8-9cb0b37af29b","Type":"ContainerDied","Data":"1364739c761c3065bfc80f1d7b2eaabc00aad0bb9bf3a44b0b16f89ab3bb69c9"} Oct 08 18:31:43 crc kubenswrapper[4750]: I1008 18:31:43.466303 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f","Type":"ContainerStarted","Data":"2108bb388209a941f9c3535c4f71606c4ec642e6c54810dace8a7237dd2d6c2e"} Oct 08 18:31:43 crc kubenswrapper[4750]: I1008 18:31:43.466660 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 08 18:31:43 crc kubenswrapper[4750]: I1008 18:31:43.486688 4750 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.095763341 podStartE2EDuration="3.486671957s" podCreationTimestamp="2025-10-08 18:31:40 +0000 UTC" firstStartedPulling="2025-10-08 18:31:41.32213073 +0000 UTC m=+1257.235101743" lastFinishedPulling="2025-10-08 18:31:42.713039346 +0000 UTC m=+1258.626010359" observedRunningTime="2025-10-08 18:31:43.4822849 +0000 UTC m=+1259.395255923" watchObservedRunningTime="2025-10-08 18:31:43.486671957 +0000 UTC m=+1259.399642960" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.302757 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.465021 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b9dae8f-58c7-4841-82f8-9cb0b37af29b-config-data\") pod \"1b9dae8f-58c7-4841-82f8-9cb0b37af29b\" (UID: \"1b9dae8f-58c7-4841-82f8-9cb0b37af29b\") " Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.465092 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b9dae8f-58c7-4841-82f8-9cb0b37af29b-run-httpd\") pod \"1b9dae8f-58c7-4841-82f8-9cb0b37af29b\" (UID: \"1b9dae8f-58c7-4841-82f8-9cb0b37af29b\") " Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.465145 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b9dae8f-58c7-4841-82f8-9cb0b37af29b-scripts\") pod \"1b9dae8f-58c7-4841-82f8-9cb0b37af29b\" (UID: \"1b9dae8f-58c7-4841-82f8-9cb0b37af29b\") " Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.465175 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b9dae8f-58c7-4841-82f8-9cb0b37af29b-log-httpd\") pod \"1b9dae8f-58c7-4841-82f8-9cb0b37af29b\" (UID: 
\"1b9dae8f-58c7-4841-82f8-9cb0b37af29b\") " Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.465202 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rkgg\" (UniqueName: \"kubernetes.io/projected/1b9dae8f-58c7-4841-82f8-9cb0b37af29b-kube-api-access-4rkgg\") pod \"1b9dae8f-58c7-4841-82f8-9cb0b37af29b\" (UID: \"1b9dae8f-58c7-4841-82f8-9cb0b37af29b\") " Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.465261 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b9dae8f-58c7-4841-82f8-9cb0b37af29b-combined-ca-bundle\") pod \"1b9dae8f-58c7-4841-82f8-9cb0b37af29b\" (UID: \"1b9dae8f-58c7-4841-82f8-9cb0b37af29b\") " Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.465365 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1b9dae8f-58c7-4841-82f8-9cb0b37af29b-sg-core-conf-yaml\") pod \"1b9dae8f-58c7-4841-82f8-9cb0b37af29b\" (UID: \"1b9dae8f-58c7-4841-82f8-9cb0b37af29b\") " Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.465723 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b9dae8f-58c7-4841-82f8-9cb0b37af29b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1b9dae8f-58c7-4841-82f8-9cb0b37af29b" (UID: "1b9dae8f-58c7-4841-82f8-9cb0b37af29b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.466264 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b9dae8f-58c7-4841-82f8-9cb0b37af29b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1b9dae8f-58c7-4841-82f8-9cb0b37af29b" (UID: "1b9dae8f-58c7-4841-82f8-9cb0b37af29b"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.471084 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b9dae8f-58c7-4841-82f8-9cb0b37af29b-scripts" (OuterVolumeSpecName: "scripts") pod "1b9dae8f-58c7-4841-82f8-9cb0b37af29b" (UID: "1b9dae8f-58c7-4841-82f8-9cb0b37af29b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.473331 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b9dae8f-58c7-4841-82f8-9cb0b37af29b-kube-api-access-4rkgg" (OuterVolumeSpecName: "kube-api-access-4rkgg") pod "1b9dae8f-58c7-4841-82f8-9cb0b37af29b" (UID: "1b9dae8f-58c7-4841-82f8-9cb0b37af29b"). InnerVolumeSpecName "kube-api-access-4rkgg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.476359 4750 generic.go:334] "Generic (PLEG): container finished" podID="1b9dae8f-58c7-4841-82f8-9cb0b37af29b" containerID="148da3e0d1bb9d40e19fbc06b08276f602828a60fe6c71caab07af65fed0297e" exitCode=0 Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.476433 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b9dae8f-58c7-4841-82f8-9cb0b37af29b","Type":"ContainerDied","Data":"148da3e0d1bb9d40e19fbc06b08276f602828a60fe6c71caab07af65fed0297e"} Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.476468 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b9dae8f-58c7-4841-82f8-9cb0b37af29b","Type":"ContainerDied","Data":"8f85c44d2f99d5d5093d0b82130fe74e10e5fb6edd9e311e77cc6bc35bf5e331"} Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.476477 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.476488 4750 scope.go:117] "RemoveContainer" containerID="1ea5fa6432d4b81778294382a4ec09ac70111387f9a6ea8e463c289eb696c051" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.507364 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b9dae8f-58c7-4841-82f8-9cb0b37af29b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1b9dae8f-58c7-4841-82f8-9cb0b37af29b" (UID: "1b9dae8f-58c7-4841-82f8-9cb0b37af29b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.561648 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b9dae8f-58c7-4841-82f8-9cb0b37af29b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b9dae8f-58c7-4841-82f8-9cb0b37af29b" (UID: "1b9dae8f-58c7-4841-82f8-9cb0b37af29b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.568050 4750 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b9dae8f-58c7-4841-82f8-9cb0b37af29b-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.568234 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b9dae8f-58c7-4841-82f8-9cb0b37af29b-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.568343 4750 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b9dae8f-58c7-4841-82f8-9cb0b37af29b-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.568432 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rkgg\" (UniqueName: \"kubernetes.io/projected/1b9dae8f-58c7-4841-82f8-9cb0b37af29b-kube-api-access-4rkgg\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.568512 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b9dae8f-58c7-4841-82f8-9cb0b37af29b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.568624 4750 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1b9dae8f-58c7-4841-82f8-9cb0b37af29b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.571179 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b9dae8f-58c7-4841-82f8-9cb0b37af29b-config-data" (OuterVolumeSpecName: "config-data") pod "1b9dae8f-58c7-4841-82f8-9cb0b37af29b" (UID: "1b9dae8f-58c7-4841-82f8-9cb0b37af29b"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.574564 4750 scope.go:117] "RemoveContainer" containerID="3c8343be783dcee750325e416eea9af5780cc5a6c8f96058e4257b642ee93b27" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.603787 4750 scope.go:117] "RemoveContainer" containerID="148da3e0d1bb9d40e19fbc06b08276f602828a60fe6c71caab07af65fed0297e" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.624143 4750 scope.go:117] "RemoveContainer" containerID="1364739c761c3065bfc80f1d7b2eaabc00aad0bb9bf3a44b0b16f89ab3bb69c9" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.650249 4750 scope.go:117] "RemoveContainer" containerID="1ea5fa6432d4b81778294382a4ec09ac70111387f9a6ea8e463c289eb696c051" Oct 08 18:31:44 crc kubenswrapper[4750]: E1008 18:31:44.650874 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ea5fa6432d4b81778294382a4ec09ac70111387f9a6ea8e463c289eb696c051\": container with ID starting with 1ea5fa6432d4b81778294382a4ec09ac70111387f9a6ea8e463c289eb696c051 not found: ID does not exist" containerID="1ea5fa6432d4b81778294382a4ec09ac70111387f9a6ea8e463c289eb696c051" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.650979 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ea5fa6432d4b81778294382a4ec09ac70111387f9a6ea8e463c289eb696c051"} err="failed to get container status \"1ea5fa6432d4b81778294382a4ec09ac70111387f9a6ea8e463c289eb696c051\": rpc error: code = NotFound desc = could not find container \"1ea5fa6432d4b81778294382a4ec09ac70111387f9a6ea8e463c289eb696c051\": container with ID starting with 1ea5fa6432d4b81778294382a4ec09ac70111387f9a6ea8e463c289eb696c051 not found: ID does not exist" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.651065 4750 scope.go:117] "RemoveContainer" 
containerID="3c8343be783dcee750325e416eea9af5780cc5a6c8f96058e4257b642ee93b27" Oct 08 18:31:44 crc kubenswrapper[4750]: E1008 18:31:44.651524 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c8343be783dcee750325e416eea9af5780cc5a6c8f96058e4257b642ee93b27\": container with ID starting with 3c8343be783dcee750325e416eea9af5780cc5a6c8f96058e4257b642ee93b27 not found: ID does not exist" containerID="3c8343be783dcee750325e416eea9af5780cc5a6c8f96058e4257b642ee93b27" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.651560 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c8343be783dcee750325e416eea9af5780cc5a6c8f96058e4257b642ee93b27"} err="failed to get container status \"3c8343be783dcee750325e416eea9af5780cc5a6c8f96058e4257b642ee93b27\": rpc error: code = NotFound desc = could not find container \"3c8343be783dcee750325e416eea9af5780cc5a6c8f96058e4257b642ee93b27\": container with ID starting with 3c8343be783dcee750325e416eea9af5780cc5a6c8f96058e4257b642ee93b27 not found: ID does not exist" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.651577 4750 scope.go:117] "RemoveContainer" containerID="148da3e0d1bb9d40e19fbc06b08276f602828a60fe6c71caab07af65fed0297e" Oct 08 18:31:44 crc kubenswrapper[4750]: E1008 18:31:44.651817 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"148da3e0d1bb9d40e19fbc06b08276f602828a60fe6c71caab07af65fed0297e\": container with ID starting with 148da3e0d1bb9d40e19fbc06b08276f602828a60fe6c71caab07af65fed0297e not found: ID does not exist" containerID="148da3e0d1bb9d40e19fbc06b08276f602828a60fe6c71caab07af65fed0297e" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.651847 4750 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"148da3e0d1bb9d40e19fbc06b08276f602828a60fe6c71caab07af65fed0297e"} err="failed to get container status \"148da3e0d1bb9d40e19fbc06b08276f602828a60fe6c71caab07af65fed0297e\": rpc error: code = NotFound desc = could not find container \"148da3e0d1bb9d40e19fbc06b08276f602828a60fe6c71caab07af65fed0297e\": container with ID starting with 148da3e0d1bb9d40e19fbc06b08276f602828a60fe6c71caab07af65fed0297e not found: ID does not exist" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.651868 4750 scope.go:117] "RemoveContainer" containerID="1364739c761c3065bfc80f1d7b2eaabc00aad0bb9bf3a44b0b16f89ab3bb69c9" Oct 08 18:31:44 crc kubenswrapper[4750]: E1008 18:31:44.652121 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1364739c761c3065bfc80f1d7b2eaabc00aad0bb9bf3a44b0b16f89ab3bb69c9\": container with ID starting with 1364739c761c3065bfc80f1d7b2eaabc00aad0bb9bf3a44b0b16f89ab3bb69c9 not found: ID does not exist" containerID="1364739c761c3065bfc80f1d7b2eaabc00aad0bb9bf3a44b0b16f89ab3bb69c9" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.652210 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1364739c761c3065bfc80f1d7b2eaabc00aad0bb9bf3a44b0b16f89ab3bb69c9"} err="failed to get container status \"1364739c761c3065bfc80f1d7b2eaabc00aad0bb9bf3a44b0b16f89ab3bb69c9\": rpc error: code = NotFound desc = could not find container \"1364739c761c3065bfc80f1d7b2eaabc00aad0bb9bf3a44b0b16f89ab3bb69c9\": container with ID starting with 1364739c761c3065bfc80f1d7b2eaabc00aad0bb9bf3a44b0b16f89ab3bb69c9 not found: ID does not exist" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.671107 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b9dae8f-58c7-4841-82f8-9cb0b37af29b-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:44 crc kubenswrapper[4750]: 
I1008 18:31:44.804997 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.813442 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.828015 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 08 18:31:44 crc kubenswrapper[4750]: E1008 18:31:44.828453 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b9dae8f-58c7-4841-82f8-9cb0b37af29b" containerName="sg-core" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.828503 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b9dae8f-58c7-4841-82f8-9cb0b37af29b" containerName="sg-core" Oct 08 18:31:44 crc kubenswrapper[4750]: E1008 18:31:44.828534 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b9dae8f-58c7-4841-82f8-9cb0b37af29b" containerName="ceilometer-central-agent" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.828560 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b9dae8f-58c7-4841-82f8-9cb0b37af29b" containerName="ceilometer-central-agent" Oct 08 18:31:44 crc kubenswrapper[4750]: E1008 18:31:44.828588 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b9dae8f-58c7-4841-82f8-9cb0b37af29b" containerName="proxy-httpd" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.828595 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b9dae8f-58c7-4841-82f8-9cb0b37af29b" containerName="proxy-httpd" Oct 08 18:31:44 crc kubenswrapper[4750]: E1008 18:31:44.828611 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b9dae8f-58c7-4841-82f8-9cb0b37af29b" containerName="ceilometer-notification-agent" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.828618 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b9dae8f-58c7-4841-82f8-9cb0b37af29b" 
containerName="ceilometer-notification-agent" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.828820 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b9dae8f-58c7-4841-82f8-9cb0b37af29b" containerName="ceilometer-notification-agent" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.828836 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b9dae8f-58c7-4841-82f8-9cb0b37af29b" containerName="proxy-httpd" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.828860 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b9dae8f-58c7-4841-82f8-9cb0b37af29b" containerName="sg-core" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.828880 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b9dae8f-58c7-4841-82f8-9cb0b37af29b" containerName="ceilometer-central-agent" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.831339 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.833941 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.834247 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.834416 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.839835 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.851247 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.851290 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-api-0" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.879773 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.975994 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7-log-httpd\") pod \"ceilometer-0\" (UID: \"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7\") " pod="openstack/ceilometer-0" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.976122 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7\") " pod="openstack/ceilometer-0" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.976161 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7-config-data\") pod \"ceilometer-0\" (UID: \"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7\") " pod="openstack/ceilometer-0" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.976186 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7-scripts\") pod \"ceilometer-0\" (UID: \"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7\") " pod="openstack/ceilometer-0" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.976201 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7-run-httpd\") pod \"ceilometer-0\" (UID: 
\"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7\") " pod="openstack/ceilometer-0" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.976251 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cn4cq\" (UniqueName: \"kubernetes.io/projected/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7-kube-api-access-cn4cq\") pod \"ceilometer-0\" (UID: \"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7\") " pod="openstack/ceilometer-0" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.976270 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7\") " pod="openstack/ceilometer-0" Oct 08 18:31:44 crc kubenswrapper[4750]: I1008 18:31:44.976293 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7\") " pod="openstack/ceilometer-0" Oct 08 18:31:45 crc kubenswrapper[4750]: I1008 18:31:45.078268 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7-log-httpd\") pod \"ceilometer-0\" (UID: \"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7\") " pod="openstack/ceilometer-0" Oct 08 18:31:45 crc kubenswrapper[4750]: I1008 18:31:45.078355 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7\") " pod="openstack/ceilometer-0" Oct 08 18:31:45 crc kubenswrapper[4750]: I1008 18:31:45.078383 
4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7-config-data\") pod \"ceilometer-0\" (UID: \"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7\") " pod="openstack/ceilometer-0" Oct 08 18:31:45 crc kubenswrapper[4750]: I1008 18:31:45.078406 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7-scripts\") pod \"ceilometer-0\" (UID: \"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7\") " pod="openstack/ceilometer-0" Oct 08 18:31:45 crc kubenswrapper[4750]: I1008 18:31:45.078427 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7-run-httpd\") pod \"ceilometer-0\" (UID: \"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7\") " pod="openstack/ceilometer-0" Oct 08 18:31:45 crc kubenswrapper[4750]: I1008 18:31:45.078460 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cn4cq\" (UniqueName: \"kubernetes.io/projected/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7-kube-api-access-cn4cq\") pod \"ceilometer-0\" (UID: \"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7\") " pod="openstack/ceilometer-0" Oct 08 18:31:45 crc kubenswrapper[4750]: I1008 18:31:45.078481 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7\") " pod="openstack/ceilometer-0" Oct 08 18:31:45 crc kubenswrapper[4750]: I1008 18:31:45.078503 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7\") " pod="openstack/ceilometer-0" Oct 08 18:31:45 crc kubenswrapper[4750]: I1008 18:31:45.081435 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7-run-httpd\") pod \"ceilometer-0\" (UID: \"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7\") " pod="openstack/ceilometer-0" Oct 08 18:31:45 crc kubenswrapper[4750]: I1008 18:31:45.081543 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7-log-httpd\") pod \"ceilometer-0\" (UID: \"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7\") " pod="openstack/ceilometer-0" Oct 08 18:31:45 crc kubenswrapper[4750]: I1008 18:31:45.085033 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7\") " pod="openstack/ceilometer-0" Oct 08 18:31:45 crc kubenswrapper[4750]: I1008 18:31:45.085034 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7\") " pod="openstack/ceilometer-0" Oct 08 18:31:45 crc kubenswrapper[4750]: I1008 18:31:45.089068 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7\") " pod="openstack/ceilometer-0" Oct 08 18:31:45 crc kubenswrapper[4750]: I1008 18:31:45.089377 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7-config-data\") pod \"ceilometer-0\" (UID: \"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7\") " pod="openstack/ceilometer-0" Oct 08 18:31:45 crc kubenswrapper[4750]: I1008 18:31:45.089496 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7-scripts\") pod \"ceilometer-0\" (UID: \"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7\") " pod="openstack/ceilometer-0" Oct 08 18:31:45 crc kubenswrapper[4750]: I1008 18:31:45.098229 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cn4cq\" (UniqueName: \"kubernetes.io/projected/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7-kube-api-access-cn4cq\") pod \"ceilometer-0\" (UID: \"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7\") " pod="openstack/ceilometer-0" Oct 08 18:31:45 crc kubenswrapper[4750]: I1008 18:31:45.167667 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 18:31:45 crc kubenswrapper[4750]: I1008 18:31:45.647272 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 18:31:45 crc kubenswrapper[4750]: W1008 18:31:45.648521 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68b0cbcf_9157_4cfb_b0df_f7d5cf74add7.slice/crio-faf837e50dbd714ddc8a6fb1c3ef7500705b7847f96faca692890c001fe5e82e WatchSource:0}: Error finding container faf837e50dbd714ddc8a6fb1c3ef7500705b7847f96faca692890c001fe5e82e: Status 404 returned error can't find the container with id faf837e50dbd714ddc8a6fb1c3ef7500705b7847f96faca692890c001fe5e82e Oct 08 18:31:45 crc kubenswrapper[4750]: I1008 18:31:45.866823 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 08 18:31:45 crc kubenswrapper[4750]: I1008 18:31:45.866864 4750 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 08 18:31:45 crc kubenswrapper[4750]: I1008 18:31:45.934964 4750 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="85db2d5f-53d2-4875-b334-8b3689922391" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.194:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 08 18:31:45 crc kubenswrapper[4750]: I1008 18:31:45.936653 4750 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="85db2d5f-53d2-4875-b334-8b3689922391" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.194:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 08 18:31:46 crc kubenswrapper[4750]: I1008 18:31:46.495773 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7","Type":"ContainerStarted","Data":"4ab5e26cd9f018da3dfa800d543d53fc54df18c14d51eadde10ba9a5112e980e"} Oct 08 18:31:46 crc kubenswrapper[4750]: I1008 18:31:46.495823 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7","Type":"ContainerStarted","Data":"faf837e50dbd714ddc8a6fb1c3ef7500705b7847f96faca692890c001fe5e82e"} Oct 08 18:31:46 crc kubenswrapper[4750]: I1008 18:31:46.743611 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b9dae8f-58c7-4841-82f8-9cb0b37af29b" path="/var/lib/kubelet/pods/1b9dae8f-58c7-4841-82f8-9cb0b37af29b/volumes" Oct 08 18:31:46 crc kubenswrapper[4750]: I1008 18:31:46.786215 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 08 18:31:46 crc kubenswrapper[4750]: I1008 18:31:46.819300 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 08 
18:31:46 crc kubenswrapper[4750]: I1008 18:31:46.877681 4750 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="1df3c5c3-627e-4bfb-9dd0-16976cc25f3d" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 08 18:31:46 crc kubenswrapper[4750]: I1008 18:31:46.877934 4750 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="1df3c5c3-627e-4bfb-9dd0-16976cc25f3d" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 08 18:31:47 crc kubenswrapper[4750]: I1008 18:31:47.518645 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7","Type":"ContainerStarted","Data":"94dcc9e4ab3d6a286dcba8c3ca560440bb255c0ad7898ae46de3fb41892f8af1"} Oct 08 18:31:47 crc kubenswrapper[4750]: I1008 18:31:47.546660 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 08 18:31:48 crc kubenswrapper[4750]: I1008 18:31:48.529686 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7","Type":"ContainerStarted","Data":"12cc86dd77bd20ed0c3692022051cdf553346bbaedc2e90138dc5797735d6e0d"} Oct 08 18:31:50 crc kubenswrapper[4750]: I1008 18:31:50.561884 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7","Type":"ContainerStarted","Data":"e79a0406f3121c11705a01fbc436441e903563b724060791c32013cbc39d322e"} Oct 08 18:31:50 crc kubenswrapper[4750]: I1008 18:31:50.562500 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 08 18:31:50 crc 
kubenswrapper[4750]: I1008 18:31:50.585433 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.672438763 podStartE2EDuration="6.585417225s" podCreationTimestamp="2025-10-08 18:31:44 +0000 UTC" firstStartedPulling="2025-10-08 18:31:45.651150468 +0000 UTC m=+1261.564121481" lastFinishedPulling="2025-10-08 18:31:49.56412893 +0000 UTC m=+1265.477099943" observedRunningTime="2025-10-08 18:31:50.580974227 +0000 UTC m=+1266.493945240" watchObservedRunningTime="2025-10-08 18:31:50.585417225 +0000 UTC m=+1266.498388238" Oct 08 18:31:50 crc kubenswrapper[4750]: I1008 18:31:50.838868 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 08 18:31:54 crc kubenswrapper[4750]: I1008 18:31:54.853166 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 08 18:31:54 crc kubenswrapper[4750]: I1008 18:31:54.854435 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 08 18:31:54 crc kubenswrapper[4750]: I1008 18:31:54.854721 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 08 18:31:54 crc kubenswrapper[4750]: I1008 18:31:54.858396 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 08 18:31:55 crc kubenswrapper[4750]: I1008 18:31:55.606644 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 08 18:31:55 crc kubenswrapper[4750]: I1008 18:31:55.609933 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 08 18:31:55 crc kubenswrapper[4750]: I1008 18:31:55.783868 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d4d96bb9-84nq2"] Oct 08 18:31:55 crc kubenswrapper[4750]: I1008 18:31:55.785276 4750 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d4d96bb9-84nq2" Oct 08 18:31:55 crc kubenswrapper[4750]: I1008 18:31:55.804206 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d4d96bb9-84nq2"] Oct 08 18:31:55 crc kubenswrapper[4750]: I1008 18:31:55.873971 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 08 18:31:55 crc kubenswrapper[4750]: I1008 18:31:55.876258 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 08 18:31:55 crc kubenswrapper[4750]: I1008 18:31:55.882219 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 08 18:31:55 crc kubenswrapper[4750]: I1008 18:31:55.890565 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfa3814f-a0f4-4d53-9c08-44d7b45dd662-dns-svc\") pod \"dnsmasq-dns-6d4d96bb9-84nq2\" (UID: \"bfa3814f-a0f4-4d53-9c08-44d7b45dd662\") " pod="openstack/dnsmasq-dns-6d4d96bb9-84nq2" Oct 08 18:31:55 crc kubenswrapper[4750]: I1008 18:31:55.890642 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfa3814f-a0f4-4d53-9c08-44d7b45dd662-config\") pod \"dnsmasq-dns-6d4d96bb9-84nq2\" (UID: \"bfa3814f-a0f4-4d53-9c08-44d7b45dd662\") " pod="openstack/dnsmasq-dns-6d4d96bb9-84nq2" Oct 08 18:31:55 crc kubenswrapper[4750]: I1008 18:31:55.890752 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tqml\" (UniqueName: \"kubernetes.io/projected/bfa3814f-a0f4-4d53-9c08-44d7b45dd662-kube-api-access-8tqml\") pod \"dnsmasq-dns-6d4d96bb9-84nq2\" (UID: \"bfa3814f-a0f4-4d53-9c08-44d7b45dd662\") " pod="openstack/dnsmasq-dns-6d4d96bb9-84nq2" Oct 08 18:31:55 crc kubenswrapper[4750]: 
I1008 18:31:55.890806 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bfa3814f-a0f4-4d53-9c08-44d7b45dd662-dns-swift-storage-0\") pod \"dnsmasq-dns-6d4d96bb9-84nq2\" (UID: \"bfa3814f-a0f4-4d53-9c08-44d7b45dd662\") " pod="openstack/dnsmasq-dns-6d4d96bb9-84nq2" Oct 08 18:31:55 crc kubenswrapper[4750]: I1008 18:31:55.890842 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bfa3814f-a0f4-4d53-9c08-44d7b45dd662-ovsdbserver-nb\") pod \"dnsmasq-dns-6d4d96bb9-84nq2\" (UID: \"bfa3814f-a0f4-4d53-9c08-44d7b45dd662\") " pod="openstack/dnsmasq-dns-6d4d96bb9-84nq2" Oct 08 18:31:55 crc kubenswrapper[4750]: I1008 18:31:55.890909 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bfa3814f-a0f4-4d53-9c08-44d7b45dd662-ovsdbserver-sb\") pod \"dnsmasq-dns-6d4d96bb9-84nq2\" (UID: \"bfa3814f-a0f4-4d53-9c08-44d7b45dd662\") " pod="openstack/dnsmasq-dns-6d4d96bb9-84nq2" Oct 08 18:31:55 crc kubenswrapper[4750]: I1008 18:31:55.992534 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bfa3814f-a0f4-4d53-9c08-44d7b45dd662-dns-swift-storage-0\") pod \"dnsmasq-dns-6d4d96bb9-84nq2\" (UID: \"bfa3814f-a0f4-4d53-9c08-44d7b45dd662\") " pod="openstack/dnsmasq-dns-6d4d96bb9-84nq2" Oct 08 18:31:55 crc kubenswrapper[4750]: I1008 18:31:55.993115 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bfa3814f-a0f4-4d53-9c08-44d7b45dd662-ovsdbserver-nb\") pod \"dnsmasq-dns-6d4d96bb9-84nq2\" (UID: \"bfa3814f-a0f4-4d53-9c08-44d7b45dd662\") " pod="openstack/dnsmasq-dns-6d4d96bb9-84nq2" Oct 08 18:31:55 crc 
kubenswrapper[4750]: I1008 18:31:55.993160 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bfa3814f-a0f4-4d53-9c08-44d7b45dd662-ovsdbserver-sb\") pod \"dnsmasq-dns-6d4d96bb9-84nq2\" (UID: \"bfa3814f-a0f4-4d53-9c08-44d7b45dd662\") " pod="openstack/dnsmasq-dns-6d4d96bb9-84nq2" Oct 08 18:31:55 crc kubenswrapper[4750]: I1008 18:31:55.993196 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfa3814f-a0f4-4d53-9c08-44d7b45dd662-dns-svc\") pod \"dnsmasq-dns-6d4d96bb9-84nq2\" (UID: \"bfa3814f-a0f4-4d53-9c08-44d7b45dd662\") " pod="openstack/dnsmasq-dns-6d4d96bb9-84nq2" Oct 08 18:31:55 crc kubenswrapper[4750]: I1008 18:31:55.993227 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfa3814f-a0f4-4d53-9c08-44d7b45dd662-config\") pod \"dnsmasq-dns-6d4d96bb9-84nq2\" (UID: \"bfa3814f-a0f4-4d53-9c08-44d7b45dd662\") " pod="openstack/dnsmasq-dns-6d4d96bb9-84nq2" Oct 08 18:31:55 crc kubenswrapper[4750]: I1008 18:31:55.993287 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tqml\" (UniqueName: \"kubernetes.io/projected/bfa3814f-a0f4-4d53-9c08-44d7b45dd662-kube-api-access-8tqml\") pod \"dnsmasq-dns-6d4d96bb9-84nq2\" (UID: \"bfa3814f-a0f4-4d53-9c08-44d7b45dd662\") " pod="openstack/dnsmasq-dns-6d4d96bb9-84nq2" Oct 08 18:31:55 crc kubenswrapper[4750]: I1008 18:31:55.993395 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bfa3814f-a0f4-4d53-9c08-44d7b45dd662-dns-swift-storage-0\") pod \"dnsmasq-dns-6d4d96bb9-84nq2\" (UID: \"bfa3814f-a0f4-4d53-9c08-44d7b45dd662\") " pod="openstack/dnsmasq-dns-6d4d96bb9-84nq2" Oct 08 18:31:55 crc kubenswrapper[4750]: I1008 18:31:55.994062 4750 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bfa3814f-a0f4-4d53-9c08-44d7b45dd662-ovsdbserver-sb\") pod \"dnsmasq-dns-6d4d96bb9-84nq2\" (UID: \"bfa3814f-a0f4-4d53-9c08-44d7b45dd662\") " pod="openstack/dnsmasq-dns-6d4d96bb9-84nq2" Oct 08 18:31:55 crc kubenswrapper[4750]: I1008 18:31:55.994296 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfa3814f-a0f4-4d53-9c08-44d7b45dd662-dns-svc\") pod \"dnsmasq-dns-6d4d96bb9-84nq2\" (UID: \"bfa3814f-a0f4-4d53-9c08-44d7b45dd662\") " pod="openstack/dnsmasq-dns-6d4d96bb9-84nq2" Oct 08 18:31:55 crc kubenswrapper[4750]: I1008 18:31:55.994396 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfa3814f-a0f4-4d53-9c08-44d7b45dd662-config\") pod \"dnsmasq-dns-6d4d96bb9-84nq2\" (UID: \"bfa3814f-a0f4-4d53-9c08-44d7b45dd662\") " pod="openstack/dnsmasq-dns-6d4d96bb9-84nq2" Oct 08 18:31:55 crc kubenswrapper[4750]: I1008 18:31:55.994887 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bfa3814f-a0f4-4d53-9c08-44d7b45dd662-ovsdbserver-nb\") pod \"dnsmasq-dns-6d4d96bb9-84nq2\" (UID: \"bfa3814f-a0f4-4d53-9c08-44d7b45dd662\") " pod="openstack/dnsmasq-dns-6d4d96bb9-84nq2" Oct 08 18:31:56 crc kubenswrapper[4750]: I1008 18:31:56.020396 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tqml\" (UniqueName: \"kubernetes.io/projected/bfa3814f-a0f4-4d53-9c08-44d7b45dd662-kube-api-access-8tqml\") pod \"dnsmasq-dns-6d4d96bb9-84nq2\" (UID: \"bfa3814f-a0f4-4d53-9c08-44d7b45dd662\") " pod="openstack/dnsmasq-dns-6d4d96bb9-84nq2" Oct 08 18:31:56 crc kubenswrapper[4750]: I1008 18:31:56.111030 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d4d96bb9-84nq2" Oct 08 18:31:56 crc kubenswrapper[4750]: I1008 18:31:56.586018 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d4d96bb9-84nq2"] Oct 08 18:31:56 crc kubenswrapper[4750]: W1008 18:31:56.589486 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbfa3814f_a0f4_4d53_9c08_44d7b45dd662.slice/crio-549dcd959f15f574b0307661b07bb0215a375979613dfe4cd26c7e6bb3eec6ba WatchSource:0}: Error finding container 549dcd959f15f574b0307661b07bb0215a375979613dfe4cd26c7e6bb3eec6ba: Status 404 returned error can't find the container with id 549dcd959f15f574b0307661b07bb0215a375979613dfe4cd26c7e6bb3eec6ba Oct 08 18:31:56 crc kubenswrapper[4750]: I1008 18:31:56.619238 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d4d96bb9-84nq2" event={"ID":"bfa3814f-a0f4-4d53-9c08-44d7b45dd662","Type":"ContainerStarted","Data":"549dcd959f15f574b0307661b07bb0215a375979613dfe4cd26c7e6bb3eec6ba"} Oct 08 18:31:56 crc kubenswrapper[4750]: I1008 18:31:56.625527 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 08 18:31:57 crc kubenswrapper[4750]: I1008 18:31:57.545340 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 18:31:57 crc kubenswrapper[4750]: I1008 18:31:57.546120 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="68b0cbcf-9157-4cfb-b0df-f7d5cf74add7" containerName="ceilometer-central-agent" containerID="cri-o://4ab5e26cd9f018da3dfa800d543d53fc54df18c14d51eadde10ba9a5112e980e" gracePeriod=30 Oct 08 18:31:57 crc kubenswrapper[4750]: I1008 18:31:57.546133 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="68b0cbcf-9157-4cfb-b0df-f7d5cf74add7" 
containerName="proxy-httpd" containerID="cri-o://e79a0406f3121c11705a01fbc436441e903563b724060791c32013cbc39d322e" gracePeriod=30 Oct 08 18:31:57 crc kubenswrapper[4750]: I1008 18:31:57.546226 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="68b0cbcf-9157-4cfb-b0df-f7d5cf74add7" containerName="ceilometer-notification-agent" containerID="cri-o://94dcc9e4ab3d6a286dcba8c3ca560440bb255c0ad7898ae46de3fb41892f8af1" gracePeriod=30 Oct 08 18:31:57 crc kubenswrapper[4750]: I1008 18:31:57.546239 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="68b0cbcf-9157-4cfb-b0df-f7d5cf74add7" containerName="sg-core" containerID="cri-o://12cc86dd77bd20ed0c3692022051cdf553346bbaedc2e90138dc5797735d6e0d" gracePeriod=30 Oct 08 18:31:57 crc kubenswrapper[4750]: I1008 18:31:57.635934 4750 generic.go:334] "Generic (PLEG): container finished" podID="bfa3814f-a0f4-4d53-9c08-44d7b45dd662" containerID="2c9770274063e17c68b83e18ee0724852e8b71a48e0e550558f209c61ad073ee" exitCode=0 Oct 08 18:31:57 crc kubenswrapper[4750]: I1008 18:31:57.636032 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d4d96bb9-84nq2" event={"ID":"bfa3814f-a0f4-4d53-9c08-44d7b45dd662","Type":"ContainerDied","Data":"2c9770274063e17c68b83e18ee0724852e8b71a48e0e550558f209c61ad073ee"} Oct 08 18:31:58 crc kubenswrapper[4750]: I1008 18:31:58.181625 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 08 18:31:58 crc kubenswrapper[4750]: I1008 18:31:58.649066 4750 generic.go:334] "Generic (PLEG): container finished" podID="68b0cbcf-9157-4cfb-b0df-f7d5cf74add7" containerID="e79a0406f3121c11705a01fbc436441e903563b724060791c32013cbc39d322e" exitCode=0 Oct 08 18:31:58 crc kubenswrapper[4750]: I1008 18:31:58.649470 4750 generic.go:334] "Generic (PLEG): container finished" podID="68b0cbcf-9157-4cfb-b0df-f7d5cf74add7" 
containerID="12cc86dd77bd20ed0c3692022051cdf553346bbaedc2e90138dc5797735d6e0d" exitCode=2 Oct 08 18:31:58 crc kubenswrapper[4750]: I1008 18:31:58.649488 4750 generic.go:334] "Generic (PLEG): container finished" podID="68b0cbcf-9157-4cfb-b0df-f7d5cf74add7" containerID="4ab5e26cd9f018da3dfa800d543d53fc54df18c14d51eadde10ba9a5112e980e" exitCode=0 Oct 08 18:31:58 crc kubenswrapper[4750]: I1008 18:31:58.649144 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7","Type":"ContainerDied","Data":"e79a0406f3121c11705a01fbc436441e903563b724060791c32013cbc39d322e"} Oct 08 18:31:58 crc kubenswrapper[4750]: I1008 18:31:58.649592 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7","Type":"ContainerDied","Data":"12cc86dd77bd20ed0c3692022051cdf553346bbaedc2e90138dc5797735d6e0d"} Oct 08 18:31:58 crc kubenswrapper[4750]: I1008 18:31:58.649607 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7","Type":"ContainerDied","Data":"4ab5e26cd9f018da3dfa800d543d53fc54df18c14d51eadde10ba9a5112e980e"} Oct 08 18:31:58 crc kubenswrapper[4750]: I1008 18:31:58.652342 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="85db2d5f-53d2-4875-b334-8b3689922391" containerName="nova-api-log" containerID="cri-o://fbb43a2bb1a642127ed4909d702e507174f5e1278c2f9af80dcc642f5cf15ae7" gracePeriod=30 Oct 08 18:31:58 crc kubenswrapper[4750]: I1008 18:31:58.653967 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d4d96bb9-84nq2" event={"ID":"bfa3814f-a0f4-4d53-9c08-44d7b45dd662","Type":"ContainerStarted","Data":"b4f13f7200451000f175c62b160ddb55de7cf3bd3d9beaaa3673047fa05bb9c7"} Oct 08 18:31:58 crc kubenswrapper[4750]: I1008 18:31:58.654008 4750 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/dnsmasq-dns-6d4d96bb9-84nq2" Oct 08 18:31:58 crc kubenswrapper[4750]: I1008 18:31:58.657016 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="85db2d5f-53d2-4875-b334-8b3689922391" containerName="nova-api-api" containerID="cri-o://f62652d1a85f2cd9154f4bdc39a53f179e23061e3f9401355832d5c8c6907da6" gracePeriod=30 Oct 08 18:31:58 crc kubenswrapper[4750]: I1008 18:31:58.686377 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d4d96bb9-84nq2" podStartSLOduration=3.6863597820000003 podStartE2EDuration="3.686359782s" podCreationTimestamp="2025-10-08 18:31:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:31:58.677207268 +0000 UTC m=+1274.590178301" watchObservedRunningTime="2025-10-08 18:31:58.686359782 +0000 UTC m=+1274.599330785" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.054792 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.070440 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7-ceilometer-tls-certs\") pod \"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7\" (UID: \"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7\") " Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.070506 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7-combined-ca-bundle\") pod \"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7\" (UID: \"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7\") " Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.070537 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7-scripts\") pod \"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7\" (UID: \"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7\") " Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.070680 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cn4cq\" (UniqueName: \"kubernetes.io/projected/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7-kube-api-access-cn4cq\") pod \"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7\" (UID: \"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7\") " Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.070783 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7-run-httpd\") pod \"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7\" (UID: \"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7\") " Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.070982 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7-config-data\") pod \"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7\" (UID: \"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7\") " Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.071013 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7-sg-core-conf-yaml\") pod \"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7\" (UID: \"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7\") " Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.071035 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7-log-httpd\") pod \"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7\" (UID: \"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7\") " Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.071422 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "68b0cbcf-9157-4cfb-b0df-f7d5cf74add7" (UID: "68b0cbcf-9157-4cfb-b0df-f7d5cf74add7"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.071674 4750 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.071726 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "68b0cbcf-9157-4cfb-b0df-f7d5cf74add7" (UID: "68b0cbcf-9157-4cfb-b0df-f7d5cf74add7"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.079601 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7-scripts" (OuterVolumeSpecName: "scripts") pod "68b0cbcf-9157-4cfb-b0df-f7d5cf74add7" (UID: "68b0cbcf-9157-4cfb-b0df-f7d5cf74add7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.092157 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7-kube-api-access-cn4cq" (OuterVolumeSpecName: "kube-api-access-cn4cq") pod "68b0cbcf-9157-4cfb-b0df-f7d5cf74add7" (UID: "68b0cbcf-9157-4cfb-b0df-f7d5cf74add7"). InnerVolumeSpecName "kube-api-access-cn4cq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.126267 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "68b0cbcf-9157-4cfb-b0df-f7d5cf74add7" (UID: "68b0cbcf-9157-4cfb-b0df-f7d5cf74add7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.159128 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "68b0cbcf-9157-4cfb-b0df-f7d5cf74add7" (UID: "68b0cbcf-9157-4cfb-b0df-f7d5cf74add7"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.172290 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "68b0cbcf-9157-4cfb-b0df-f7d5cf74add7" (UID: "68b0cbcf-9157-4cfb-b0df-f7d5cf74add7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.173681 4750 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.173710 4750 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.173721 4750 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.173734 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.173744 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.173756 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cn4cq\" (UniqueName: 
\"kubernetes.io/projected/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7-kube-api-access-cn4cq\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.220063 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7-config-data" (OuterVolumeSpecName: "config-data") pod "68b0cbcf-9157-4cfb-b0df-f7d5cf74add7" (UID: "68b0cbcf-9157-4cfb-b0df-f7d5cf74add7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.275806 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.500421 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.581426 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvxmh\" (UniqueName: \"kubernetes.io/projected/924f1d18-46bd-420b-8250-6100ae1c7120-kube-api-access-bvxmh\") pod \"924f1d18-46bd-420b-8250-6100ae1c7120\" (UID: \"924f1d18-46bd-420b-8250-6100ae1c7120\") " Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.581656 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/924f1d18-46bd-420b-8250-6100ae1c7120-combined-ca-bundle\") pod \"924f1d18-46bd-420b-8250-6100ae1c7120\" (UID: \"924f1d18-46bd-420b-8250-6100ae1c7120\") " Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.581817 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/924f1d18-46bd-420b-8250-6100ae1c7120-config-data\") pod 
\"924f1d18-46bd-420b-8250-6100ae1c7120\" (UID: \"924f1d18-46bd-420b-8250-6100ae1c7120\") " Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.587498 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/924f1d18-46bd-420b-8250-6100ae1c7120-kube-api-access-bvxmh" (OuterVolumeSpecName: "kube-api-access-bvxmh") pod "924f1d18-46bd-420b-8250-6100ae1c7120" (UID: "924f1d18-46bd-420b-8250-6100ae1c7120"). InnerVolumeSpecName "kube-api-access-bvxmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.607949 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/924f1d18-46bd-420b-8250-6100ae1c7120-config-data" (OuterVolumeSpecName: "config-data") pod "924f1d18-46bd-420b-8250-6100ae1c7120" (UID: "924f1d18-46bd-420b-8250-6100ae1c7120"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.613194 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/924f1d18-46bd-420b-8250-6100ae1c7120-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "924f1d18-46bd-420b-8250-6100ae1c7120" (UID: "924f1d18-46bd-420b-8250-6100ae1c7120"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.668299 4750 generic.go:334] "Generic (PLEG): container finished" podID="85db2d5f-53d2-4875-b334-8b3689922391" containerID="fbb43a2bb1a642127ed4909d702e507174f5e1278c2f9af80dcc642f5cf15ae7" exitCode=143 Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.668391 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"85db2d5f-53d2-4875-b334-8b3689922391","Type":"ContainerDied","Data":"fbb43a2bb1a642127ed4909d702e507174f5e1278c2f9af80dcc642f5cf15ae7"} Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.670864 4750 generic.go:334] "Generic (PLEG): container finished" podID="68b0cbcf-9157-4cfb-b0df-f7d5cf74add7" containerID="94dcc9e4ab3d6a286dcba8c3ca560440bb255c0ad7898ae46de3fb41892f8af1" exitCode=0 Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.670909 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7","Type":"ContainerDied","Data":"94dcc9e4ab3d6a286dcba8c3ca560440bb255c0ad7898ae46de3fb41892f8af1"} Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.670935 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68b0cbcf-9157-4cfb-b0df-f7d5cf74add7","Type":"ContainerDied","Data":"faf837e50dbd714ddc8a6fb1c3ef7500705b7847f96faca692890c001fe5e82e"} Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.670951 4750 scope.go:117] "RemoveContainer" containerID="e79a0406f3121c11705a01fbc436441e903563b724060791c32013cbc39d322e" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.671069 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.674024 4750 generic.go:334] "Generic (PLEG): container finished" podID="924f1d18-46bd-420b-8250-6100ae1c7120" containerID="e98db408712902040839723e74638fedfb88161852d8e65e86a8ac80bfc7ab63" exitCode=137 Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.674079 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.674096 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"924f1d18-46bd-420b-8250-6100ae1c7120","Type":"ContainerDied","Data":"e98db408712902040839723e74638fedfb88161852d8e65e86a8ac80bfc7ab63"} Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.674284 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"924f1d18-46bd-420b-8250-6100ae1c7120","Type":"ContainerDied","Data":"ef3acda9cc3a846bd768d47f43084e805f1568bdcf8bd4589ecfdfa8032c1aed"} Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.683228 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/924f1d18-46bd-420b-8250-6100ae1c7120-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.683250 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/924f1d18-46bd-420b-8250-6100ae1c7120-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.683258 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvxmh\" (UniqueName: \"kubernetes.io/projected/924f1d18-46bd-420b-8250-6100ae1c7120-kube-api-access-bvxmh\") on node \"crc\" DevicePath \"\"" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.754414 4750 
scope.go:117] "RemoveContainer" containerID="12cc86dd77bd20ed0c3692022051cdf553346bbaedc2e90138dc5797735d6e0d" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.765210 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.780911 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.795238 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.799695 4750 scope.go:117] "RemoveContainer" containerID="94dcc9e4ab3d6a286dcba8c3ca560440bb255c0ad7898ae46de3fb41892f8af1" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.828371 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.828473 4750 scope.go:117] "RemoveContainer" containerID="4ab5e26cd9f018da3dfa800d543d53fc54df18c14d51eadde10ba9a5112e980e" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.829794 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 08 18:31:59 crc kubenswrapper[4750]: E1008 18:31:59.830226 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68b0cbcf-9157-4cfb-b0df-f7d5cf74add7" containerName="ceilometer-notification-agent" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.830243 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="68b0cbcf-9157-4cfb-b0df-f7d5cf74add7" containerName="ceilometer-notification-agent" Oct 08 18:31:59 crc kubenswrapper[4750]: E1008 18:31:59.830258 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68b0cbcf-9157-4cfb-b0df-f7d5cf74add7" containerName="ceilometer-central-agent" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.830264 4750 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="68b0cbcf-9157-4cfb-b0df-f7d5cf74add7" containerName="ceilometer-central-agent" Oct 08 18:31:59 crc kubenswrapper[4750]: E1008 18:31:59.830280 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68b0cbcf-9157-4cfb-b0df-f7d5cf74add7" containerName="sg-core" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.830285 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="68b0cbcf-9157-4cfb-b0df-f7d5cf74add7" containerName="sg-core" Oct 08 18:31:59 crc kubenswrapper[4750]: E1008 18:31:59.830295 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68b0cbcf-9157-4cfb-b0df-f7d5cf74add7" containerName="proxy-httpd" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.830300 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="68b0cbcf-9157-4cfb-b0df-f7d5cf74add7" containerName="proxy-httpd" Oct 08 18:31:59 crc kubenswrapper[4750]: E1008 18:31:59.830315 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="924f1d18-46bd-420b-8250-6100ae1c7120" containerName="nova-cell1-novncproxy-novncproxy" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.830321 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="924f1d18-46bd-420b-8250-6100ae1c7120" containerName="nova-cell1-novncproxy-novncproxy" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.830479 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="68b0cbcf-9157-4cfb-b0df-f7d5cf74add7" containerName="sg-core" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.830496 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="68b0cbcf-9157-4cfb-b0df-f7d5cf74add7" containerName="ceilometer-central-agent" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.830511 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="68b0cbcf-9157-4cfb-b0df-f7d5cf74add7" containerName="proxy-httpd" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.830523 4750 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="68b0cbcf-9157-4cfb-b0df-f7d5cf74add7" containerName="ceilometer-notification-agent" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.830533 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="924f1d18-46bd-420b-8250-6100ae1c7120" containerName="nova-cell1-novncproxy-novncproxy" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.832152 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.834583 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.834843 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.834966 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.837718 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.839155 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.840853 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.841035 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.841145 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.846695 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.855731 4750 scope.go:117] "RemoveContainer" containerID="e79a0406f3121c11705a01fbc436441e903563b724060791c32013cbc39d322e" Oct 08 18:31:59 crc kubenswrapper[4750]: E1008 18:31:59.856201 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e79a0406f3121c11705a01fbc436441e903563b724060791c32013cbc39d322e\": container with ID starting with e79a0406f3121c11705a01fbc436441e903563b724060791c32013cbc39d322e not found: ID does not exist" containerID="e79a0406f3121c11705a01fbc436441e903563b724060791c32013cbc39d322e" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.856246 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e79a0406f3121c11705a01fbc436441e903563b724060791c32013cbc39d322e"} err="failed to get container status \"e79a0406f3121c11705a01fbc436441e903563b724060791c32013cbc39d322e\": rpc error: code = NotFound desc = could not find container \"e79a0406f3121c11705a01fbc436441e903563b724060791c32013cbc39d322e\": container with ID starting with e79a0406f3121c11705a01fbc436441e903563b724060791c32013cbc39d322e not found: ID does not exist" Oct 08 
18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.856274 4750 scope.go:117] "RemoveContainer" containerID="12cc86dd77bd20ed0c3692022051cdf553346bbaedc2e90138dc5797735d6e0d" Oct 08 18:31:59 crc kubenswrapper[4750]: E1008 18:31:59.858656 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12cc86dd77bd20ed0c3692022051cdf553346bbaedc2e90138dc5797735d6e0d\": container with ID starting with 12cc86dd77bd20ed0c3692022051cdf553346bbaedc2e90138dc5797735d6e0d not found: ID does not exist" containerID="12cc86dd77bd20ed0c3692022051cdf553346bbaedc2e90138dc5797735d6e0d" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.858688 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12cc86dd77bd20ed0c3692022051cdf553346bbaedc2e90138dc5797735d6e0d"} err="failed to get container status \"12cc86dd77bd20ed0c3692022051cdf553346bbaedc2e90138dc5797735d6e0d\": rpc error: code = NotFound desc = could not find container \"12cc86dd77bd20ed0c3692022051cdf553346bbaedc2e90138dc5797735d6e0d\": container with ID starting with 12cc86dd77bd20ed0c3692022051cdf553346bbaedc2e90138dc5797735d6e0d not found: ID does not exist" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.858710 4750 scope.go:117] "RemoveContainer" containerID="94dcc9e4ab3d6a286dcba8c3ca560440bb255c0ad7898ae46de3fb41892f8af1" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.858984 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 18:31:59 crc kubenswrapper[4750]: E1008 18:31:59.860524 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94dcc9e4ab3d6a286dcba8c3ca560440bb255c0ad7898ae46de3fb41892f8af1\": container with ID starting with 94dcc9e4ab3d6a286dcba8c3ca560440bb255c0ad7898ae46de3fb41892f8af1 not found: ID does not exist" 
containerID="94dcc9e4ab3d6a286dcba8c3ca560440bb255c0ad7898ae46de3fb41892f8af1" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.860599 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94dcc9e4ab3d6a286dcba8c3ca560440bb255c0ad7898ae46de3fb41892f8af1"} err="failed to get container status \"94dcc9e4ab3d6a286dcba8c3ca560440bb255c0ad7898ae46de3fb41892f8af1\": rpc error: code = NotFound desc = could not find container \"94dcc9e4ab3d6a286dcba8c3ca560440bb255c0ad7898ae46de3fb41892f8af1\": container with ID starting with 94dcc9e4ab3d6a286dcba8c3ca560440bb255c0ad7898ae46de3fb41892f8af1 not found: ID does not exist" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.860654 4750 scope.go:117] "RemoveContainer" containerID="4ab5e26cd9f018da3dfa800d543d53fc54df18c14d51eadde10ba9a5112e980e" Oct 08 18:31:59 crc kubenswrapper[4750]: E1008 18:31:59.861405 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ab5e26cd9f018da3dfa800d543d53fc54df18c14d51eadde10ba9a5112e980e\": container with ID starting with 4ab5e26cd9f018da3dfa800d543d53fc54df18c14d51eadde10ba9a5112e980e not found: ID does not exist" containerID="4ab5e26cd9f018da3dfa800d543d53fc54df18c14d51eadde10ba9a5112e980e" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.861449 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ab5e26cd9f018da3dfa800d543d53fc54df18c14d51eadde10ba9a5112e980e"} err="failed to get container status \"4ab5e26cd9f018da3dfa800d543d53fc54df18c14d51eadde10ba9a5112e980e\": rpc error: code = NotFound desc = could not find container \"4ab5e26cd9f018da3dfa800d543d53fc54df18c14d51eadde10ba9a5112e980e\": container with ID starting with 4ab5e26cd9f018da3dfa800d543d53fc54df18c14d51eadde10ba9a5112e980e not found: ID does not exist" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.861464 4750 scope.go:117] 
"RemoveContainer" containerID="e98db408712902040839723e74638fedfb88161852d8e65e86a8ac80bfc7ab63" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.891070 4750 scope.go:117] "RemoveContainer" containerID="e98db408712902040839723e74638fedfb88161852d8e65e86a8ac80bfc7ab63" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.891086 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe7385d5-3c78-4238-96be-78392eddee4b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fe7385d5-3c78-4238-96be-78392eddee4b\") " pod="openstack/ceilometer-0" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.891227 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsnwh\" (UniqueName: \"kubernetes.io/projected/fe7385d5-3c78-4238-96be-78392eddee4b-kube-api-access-hsnwh\") pod \"ceilometer-0\" (UID: \"fe7385d5-3c78-4238-96be-78392eddee4b\") " pod="openstack/ceilometer-0" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.891314 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsjz2\" (UniqueName: \"kubernetes.io/projected/c4cbc20b-7898-4a47-99f6-80436897042c-kube-api-access-vsjz2\") pod \"nova-cell1-novncproxy-0\" (UID: \"c4cbc20b-7898-4a47-99f6-80436897042c\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.891412 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe7385d5-3c78-4238-96be-78392eddee4b-log-httpd\") pod \"ceilometer-0\" (UID: \"fe7385d5-3c78-4238-96be-78392eddee4b\") " pod="openstack/ceilometer-0" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.891456 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/c4cbc20b-7898-4a47-99f6-80436897042c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c4cbc20b-7898-4a47-99f6-80436897042c\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.891490 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe7385d5-3c78-4238-96be-78392eddee4b-run-httpd\") pod \"ceilometer-0\" (UID: \"fe7385d5-3c78-4238-96be-78392eddee4b\") " pod="openstack/ceilometer-0" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.891513 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe7385d5-3c78-4238-96be-78392eddee4b-scripts\") pod \"ceilometer-0\" (UID: \"fe7385d5-3c78-4238-96be-78392eddee4b\") " pod="openstack/ceilometer-0" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.891574 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe7385d5-3c78-4238-96be-78392eddee4b-config-data\") pod \"ceilometer-0\" (UID: \"fe7385d5-3c78-4238-96be-78392eddee4b\") " pod="openstack/ceilometer-0" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.891620 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4cbc20b-7898-4a47-99f6-80436897042c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c4cbc20b-7898-4a47-99f6-80436897042c\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.891668 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c4cbc20b-7898-4a47-99f6-80436897042c-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c4cbc20b-7898-4a47-99f6-80436897042c\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.891728 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe7385d5-3c78-4238-96be-78392eddee4b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fe7385d5-3c78-4238-96be-78392eddee4b\") " pod="openstack/ceilometer-0" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.891770 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe7385d5-3c78-4238-96be-78392eddee4b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fe7385d5-3c78-4238-96be-78392eddee4b\") " pod="openstack/ceilometer-0" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.891801 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4cbc20b-7898-4a47-99f6-80436897042c-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c4cbc20b-7898-4a47-99f6-80436897042c\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 18:31:59 crc kubenswrapper[4750]: E1008 18:31:59.892738 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e98db408712902040839723e74638fedfb88161852d8e65e86a8ac80bfc7ab63\": container with ID starting with e98db408712902040839723e74638fedfb88161852d8e65e86a8ac80bfc7ab63 not found: ID does not exist" containerID="e98db408712902040839723e74638fedfb88161852d8e65e86a8ac80bfc7ab63" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.892780 4750 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e98db408712902040839723e74638fedfb88161852d8e65e86a8ac80bfc7ab63"} err="failed to get container status \"e98db408712902040839723e74638fedfb88161852d8e65e86a8ac80bfc7ab63\": rpc error: code = NotFound desc = could not find container \"e98db408712902040839723e74638fedfb88161852d8e65e86a8ac80bfc7ab63\": container with ID starting with e98db408712902040839723e74638fedfb88161852d8e65e86a8ac80bfc7ab63 not found: ID does not exist" Oct 08 18:31:59 crc kubenswrapper[4750]: E1008 18:31:59.911896 4750 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod924f1d18_46bd_420b_8250_6100ae1c7120.slice/crio-ef3acda9cc3a846bd768d47f43084e805f1568bdcf8bd4589ecfdfa8032c1aed\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod924f1d18_46bd_420b_8250_6100ae1c7120.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68b0cbcf_9157_4cfb_b0df_f7d5cf74add7.slice\": RecentStats: unable to find data in memory cache]" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.995459 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsjz2\" (UniqueName: \"kubernetes.io/projected/c4cbc20b-7898-4a47-99f6-80436897042c-kube-api-access-vsjz2\") pod \"nova-cell1-novncproxy-0\" (UID: \"c4cbc20b-7898-4a47-99f6-80436897042c\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.995522 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe7385d5-3c78-4238-96be-78392eddee4b-log-httpd\") pod \"ceilometer-0\" (UID: \"fe7385d5-3c78-4238-96be-78392eddee4b\") " pod="openstack/ceilometer-0" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 
18:31:59.995556 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4cbc20b-7898-4a47-99f6-80436897042c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c4cbc20b-7898-4a47-99f6-80436897042c\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.995582 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe7385d5-3c78-4238-96be-78392eddee4b-run-httpd\") pod \"ceilometer-0\" (UID: \"fe7385d5-3c78-4238-96be-78392eddee4b\") " pod="openstack/ceilometer-0" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.995600 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe7385d5-3c78-4238-96be-78392eddee4b-scripts\") pod \"ceilometer-0\" (UID: \"fe7385d5-3c78-4238-96be-78392eddee4b\") " pod="openstack/ceilometer-0" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.995628 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe7385d5-3c78-4238-96be-78392eddee4b-config-data\") pod \"ceilometer-0\" (UID: \"fe7385d5-3c78-4238-96be-78392eddee4b\") " pod="openstack/ceilometer-0" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.995655 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4cbc20b-7898-4a47-99f6-80436897042c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c4cbc20b-7898-4a47-99f6-80436897042c\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.995687 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c4cbc20b-7898-4a47-99f6-80436897042c-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c4cbc20b-7898-4a47-99f6-80436897042c\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.995722 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe7385d5-3c78-4238-96be-78392eddee4b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fe7385d5-3c78-4238-96be-78392eddee4b\") " pod="openstack/ceilometer-0" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.995744 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe7385d5-3c78-4238-96be-78392eddee4b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fe7385d5-3c78-4238-96be-78392eddee4b\") " pod="openstack/ceilometer-0" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.995764 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4cbc20b-7898-4a47-99f6-80436897042c-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c4cbc20b-7898-4a47-99f6-80436897042c\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.995792 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe7385d5-3c78-4238-96be-78392eddee4b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fe7385d5-3c78-4238-96be-78392eddee4b\") " pod="openstack/ceilometer-0" Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.995813 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsnwh\" (UniqueName: \"kubernetes.io/projected/fe7385d5-3c78-4238-96be-78392eddee4b-kube-api-access-hsnwh\") pod \"ceilometer-0\" (UID: 
\"fe7385d5-3c78-4238-96be-78392eddee4b\") " pod="openstack/ceilometer-0"
Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.997139 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe7385d5-3c78-4238-96be-78392eddee4b-run-httpd\") pod \"ceilometer-0\" (UID: \"fe7385d5-3c78-4238-96be-78392eddee4b\") " pod="openstack/ceilometer-0"
Oct 08 18:31:59 crc kubenswrapper[4750]: I1008 18:31:59.999600 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe7385d5-3c78-4238-96be-78392eddee4b-log-httpd\") pod \"ceilometer-0\" (UID: \"fe7385d5-3c78-4238-96be-78392eddee4b\") " pod="openstack/ceilometer-0"
Oct 08 18:32:00 crc kubenswrapper[4750]: I1008 18:32:00.000831 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe7385d5-3c78-4238-96be-78392eddee4b-scripts\") pod \"ceilometer-0\" (UID: \"fe7385d5-3c78-4238-96be-78392eddee4b\") " pod="openstack/ceilometer-0"
Oct 08 18:32:00 crc kubenswrapper[4750]: I1008 18:32:00.002339 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe7385d5-3c78-4238-96be-78392eddee4b-config-data\") pod \"ceilometer-0\" (UID: \"fe7385d5-3c78-4238-96be-78392eddee4b\") " pod="openstack/ceilometer-0"
Oct 08 18:32:00 crc kubenswrapper[4750]: I1008 18:32:00.002356 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4cbc20b-7898-4a47-99f6-80436897042c-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c4cbc20b-7898-4a47-99f6-80436897042c\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 08 18:32:00 crc kubenswrapper[4750]: I1008 18:32:00.003231 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4cbc20b-7898-4a47-99f6-80436897042c-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c4cbc20b-7898-4a47-99f6-80436897042c\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 08 18:32:00 crc kubenswrapper[4750]: I1008 18:32:00.007484 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe7385d5-3c78-4238-96be-78392eddee4b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fe7385d5-3c78-4238-96be-78392eddee4b\") " pod="openstack/ceilometer-0"
Oct 08 18:32:00 crc kubenswrapper[4750]: I1008 18:32:00.008036 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe7385d5-3c78-4238-96be-78392eddee4b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fe7385d5-3c78-4238-96be-78392eddee4b\") " pod="openstack/ceilometer-0"
Oct 08 18:32:00 crc kubenswrapper[4750]: I1008 18:32:00.008094 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4cbc20b-7898-4a47-99f6-80436897042c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c4cbc20b-7898-4a47-99f6-80436897042c\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 08 18:32:00 crc kubenswrapper[4750]: I1008 18:32:00.008182 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe7385d5-3c78-4238-96be-78392eddee4b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fe7385d5-3c78-4238-96be-78392eddee4b\") " pod="openstack/ceilometer-0"
Oct 08 18:32:00 crc kubenswrapper[4750]: I1008 18:32:00.010576 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsnwh\" (UniqueName: \"kubernetes.io/projected/fe7385d5-3c78-4238-96be-78392eddee4b-kube-api-access-hsnwh\") pod \"ceilometer-0\" (UID: \"fe7385d5-3c78-4238-96be-78392eddee4b\") " pod="openstack/ceilometer-0"
Oct 08 18:32:00 crc kubenswrapper[4750]: I1008 18:32:00.011986 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsjz2\" (UniqueName: \"kubernetes.io/projected/c4cbc20b-7898-4a47-99f6-80436897042c-kube-api-access-vsjz2\") pod \"nova-cell1-novncproxy-0\" (UID: \"c4cbc20b-7898-4a47-99f6-80436897042c\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 08 18:32:00 crc kubenswrapper[4750]: I1008 18:32:00.012185 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4cbc20b-7898-4a47-99f6-80436897042c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c4cbc20b-7898-4a47-99f6-80436897042c\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 08 18:32:00 crc kubenswrapper[4750]: I1008 18:32:00.162301 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 08 18:32:00 crc kubenswrapper[4750]: I1008 18:32:00.176801 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 08 18:32:00 crc kubenswrapper[4750]: I1008 18:32:00.627039 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 08 18:32:00 crc kubenswrapper[4750]: W1008 18:32:00.633874 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe7385d5_3c78_4238_96be_78392eddee4b.slice/crio-769d822cdc11463d5e21060d884a2f91d15ce758b94be8b5865d2a388641f9e6 WatchSource:0}: Error finding container 769d822cdc11463d5e21060d884a2f91d15ce758b94be8b5865d2a388641f9e6: Status 404 returned error can't find the container with id 769d822cdc11463d5e21060d884a2f91d15ce758b94be8b5865d2a388641f9e6
Oct 08 18:32:00 crc kubenswrapper[4750]: W1008 18:32:00.638839 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4cbc20b_7898_4a47_99f6_80436897042c.slice/crio-eb6209054bb93711f8a78b8668dce20ee628e400ffe428c429f8a1d1ba06bff3 WatchSource:0}: Error finding container eb6209054bb93711f8a78b8668dce20ee628e400ffe428c429f8a1d1ba06bff3: Status 404 returned error can't find the container with id eb6209054bb93711f8a78b8668dce20ee628e400ffe428c429f8a1d1ba06bff3
Oct 08 18:32:00 crc kubenswrapper[4750]: I1008 18:32:00.640948 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 08 18:32:00 crc kubenswrapper[4750]: I1008 18:32:00.689064 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"c4cbc20b-7898-4a47-99f6-80436897042c","Type":"ContainerStarted","Data":"eb6209054bb93711f8a78b8668dce20ee628e400ffe428c429f8a1d1ba06bff3"}
Oct 08 18:32:00 crc kubenswrapper[4750]: I1008 18:32:00.690993 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe7385d5-3c78-4238-96be-78392eddee4b","Type":"ContainerStarted","Data":"769d822cdc11463d5e21060d884a2f91d15ce758b94be8b5865d2a388641f9e6"}
Oct 08 18:32:00 crc kubenswrapper[4750]: I1008 18:32:00.751139 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68b0cbcf-9157-4cfb-b0df-f7d5cf74add7" path="/var/lib/kubelet/pods/68b0cbcf-9157-4cfb-b0df-f7d5cf74add7/volumes"
Oct 08 18:32:00 crc kubenswrapper[4750]: I1008 18:32:00.752017 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="924f1d18-46bd-420b-8250-6100ae1c7120" path="/var/lib/kubelet/pods/924f1d18-46bd-420b-8250-6100ae1c7120/volumes"
Oct 08 18:32:01 crc kubenswrapper[4750]: I1008 18:32:01.717017 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"c4cbc20b-7898-4a47-99f6-80436897042c","Type":"ContainerStarted","Data":"3f09bf21e5c4d3c261d96fe4ae9e3aa7f61c0a08319b33de766658446000ec50"}
Oct 08 18:32:01 crc kubenswrapper[4750]: I1008 18:32:01.721632 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe7385d5-3c78-4238-96be-78392eddee4b","Type":"ContainerStarted","Data":"225967b62a0a3c9db87ba69878ef83c98fd4d002b6b485a3ba2e44f7c9932962"}
Oct 08 18:32:01 crc kubenswrapper[4750]: I1008 18:32:01.741958 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.741938935 podStartE2EDuration="2.741938935s" podCreationTimestamp="2025-10-08 18:31:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:32:01.734351319 +0000 UTC m=+1277.647322362" watchObservedRunningTime="2025-10-08 18:32:01.741938935 +0000 UTC m=+1277.654909948"
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.278188 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.352998 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85db2d5f-53d2-4875-b334-8b3689922391-config-data\") pod \"85db2d5f-53d2-4875-b334-8b3689922391\" (UID: \"85db2d5f-53d2-4875-b334-8b3689922391\") "
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.355131 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vctmd\" (UniqueName: \"kubernetes.io/projected/85db2d5f-53d2-4875-b334-8b3689922391-kube-api-access-vctmd\") pod \"85db2d5f-53d2-4875-b334-8b3689922391\" (UID: \"85db2d5f-53d2-4875-b334-8b3689922391\") "
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.355264 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85db2d5f-53d2-4875-b334-8b3689922391-logs\") pod \"85db2d5f-53d2-4875-b334-8b3689922391\" (UID: \"85db2d5f-53d2-4875-b334-8b3689922391\") "
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.355350 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85db2d5f-53d2-4875-b334-8b3689922391-combined-ca-bundle\") pod \"85db2d5f-53d2-4875-b334-8b3689922391\" (UID: \"85db2d5f-53d2-4875-b334-8b3689922391\") "
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.356580 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85db2d5f-53d2-4875-b334-8b3689922391-logs" (OuterVolumeSpecName: "logs") pod "85db2d5f-53d2-4875-b334-8b3689922391" (UID: "85db2d5f-53d2-4875-b334-8b3689922391"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.360990 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85db2d5f-53d2-4875-b334-8b3689922391-kube-api-access-vctmd" (OuterVolumeSpecName: "kube-api-access-vctmd") pod "85db2d5f-53d2-4875-b334-8b3689922391" (UID: "85db2d5f-53d2-4875-b334-8b3689922391"). InnerVolumeSpecName "kube-api-access-vctmd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.392135 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85db2d5f-53d2-4875-b334-8b3689922391-config-data" (OuterVolumeSpecName: "config-data") pod "85db2d5f-53d2-4875-b334-8b3689922391" (UID: "85db2d5f-53d2-4875-b334-8b3689922391"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.411018 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85db2d5f-53d2-4875-b334-8b3689922391-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "85db2d5f-53d2-4875-b334-8b3689922391" (UID: "85db2d5f-53d2-4875-b334-8b3689922391"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.459013 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85db2d5f-53d2-4875-b334-8b3689922391-config-data\") on node \"crc\" DevicePath \"\""
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.459052 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vctmd\" (UniqueName: \"kubernetes.io/projected/85db2d5f-53d2-4875-b334-8b3689922391-kube-api-access-vctmd\") on node \"crc\" DevicePath \"\""
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.459066 4750 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85db2d5f-53d2-4875-b334-8b3689922391-logs\") on node \"crc\" DevicePath \"\""
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.459079 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85db2d5f-53d2-4875-b334-8b3689922391-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.732328 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe7385d5-3c78-4238-96be-78392eddee4b","Type":"ContainerStarted","Data":"ef0cea034999dc054ed498ebdf790b29ef4af6c72ed138ea7f570a475564630d"}
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.732420 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe7385d5-3c78-4238-96be-78392eddee4b","Type":"ContainerStarted","Data":"19ff8cd5ce4ea2ca5a48e87f8b09c6d1da6004603a55895cb9163299bb16a295"}
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.739472 4750 generic.go:334] "Generic (PLEG): container finished" podID="85db2d5f-53d2-4875-b334-8b3689922391" containerID="f62652d1a85f2cd9154f4bdc39a53f179e23061e3f9401355832d5c8c6907da6" exitCode=0
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.740103 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.749813 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"85db2d5f-53d2-4875-b334-8b3689922391","Type":"ContainerDied","Data":"f62652d1a85f2cd9154f4bdc39a53f179e23061e3f9401355832d5c8c6907da6"}
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.749857 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"85db2d5f-53d2-4875-b334-8b3689922391","Type":"ContainerDied","Data":"8c1445e38159c1fb76ae461a1d9e4a6d90e6e3ab9a816965938ba2a5322739af"}
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.749874 4750 scope.go:117] "RemoveContainer" containerID="f62652d1a85f2cd9154f4bdc39a53f179e23061e3f9401355832d5c8c6907da6"
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.773530 4750 scope.go:117] "RemoveContainer" containerID="fbb43a2bb1a642127ed4909d702e507174f5e1278c2f9af80dcc642f5cf15ae7"
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.773662 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.782432 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.797349 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Oct 08 18:32:02 crc kubenswrapper[4750]: E1008 18:32:02.798219 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85db2d5f-53d2-4875-b334-8b3689922391" containerName="nova-api-log"
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.798232 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="85db2d5f-53d2-4875-b334-8b3689922391" containerName="nova-api-log"
Oct 08 18:32:02 crc kubenswrapper[4750]: E1008 18:32:02.798257 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85db2d5f-53d2-4875-b334-8b3689922391" containerName="nova-api-api"
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.798263 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="85db2d5f-53d2-4875-b334-8b3689922391" containerName="nova-api-api"
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.798628 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="85db2d5f-53d2-4875-b334-8b3689922391" containerName="nova-api-log"
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.798643 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="85db2d5f-53d2-4875-b334-8b3689922391" containerName="nova-api-api"
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.800520 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.800933 4750 scope.go:117] "RemoveContainer" containerID="f62652d1a85f2cd9154f4bdc39a53f179e23061e3f9401355832d5c8c6907da6"
Oct 08 18:32:02 crc kubenswrapper[4750]: E1008 18:32:02.801962 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f62652d1a85f2cd9154f4bdc39a53f179e23061e3f9401355832d5c8c6907da6\": container with ID starting with f62652d1a85f2cd9154f4bdc39a53f179e23061e3f9401355832d5c8c6907da6 not found: ID does not exist" containerID="f62652d1a85f2cd9154f4bdc39a53f179e23061e3f9401355832d5c8c6907da6"
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.802163 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f62652d1a85f2cd9154f4bdc39a53f179e23061e3f9401355832d5c8c6907da6"} err="failed to get container status \"f62652d1a85f2cd9154f4bdc39a53f179e23061e3f9401355832d5c8c6907da6\": rpc error: code = NotFound desc = could not find container \"f62652d1a85f2cd9154f4bdc39a53f179e23061e3f9401355832d5c8c6907da6\": container with ID starting with f62652d1a85f2cd9154f4bdc39a53f179e23061e3f9401355832d5c8c6907da6 not found: ID does not exist"
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.802297 4750 scope.go:117] "RemoveContainer" containerID="fbb43a2bb1a642127ed4909d702e507174f5e1278c2f9af80dcc642f5cf15ae7"
Oct 08 18:32:02 crc kubenswrapper[4750]: E1008 18:32:02.802933 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbb43a2bb1a642127ed4909d702e507174f5e1278c2f9af80dcc642f5cf15ae7\": container with ID starting with fbb43a2bb1a642127ed4909d702e507174f5e1278c2f9af80dcc642f5cf15ae7 not found: ID does not exist" containerID="fbb43a2bb1a642127ed4909d702e507174f5e1278c2f9af80dcc642f5cf15ae7"
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.802972 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbb43a2bb1a642127ed4909d702e507174f5e1278c2f9af80dcc642f5cf15ae7"} err="failed to get container status \"fbb43a2bb1a642127ed4909d702e507174f5e1278c2f9af80dcc642f5cf15ae7\": rpc error: code = NotFound desc = could not find container \"fbb43a2bb1a642127ed4909d702e507174f5e1278c2f9af80dcc642f5cf15ae7\": container with ID starting with fbb43a2bb1a642127ed4909d702e507174f5e1278c2f9af80dcc642f5cf15ae7 not found: ID does not exist"
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.803127 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.803149 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.803952 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.818853 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.867369 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twtz7\" (UniqueName: \"kubernetes.io/projected/f19173b1-326c-4494-9781-a8295882102b-kube-api-access-twtz7\") pod \"nova-api-0\" (UID: \"f19173b1-326c-4494-9781-a8295882102b\") " pod="openstack/nova-api-0"
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.867715 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f19173b1-326c-4494-9781-a8295882102b-logs\") pod \"nova-api-0\" (UID: \"f19173b1-326c-4494-9781-a8295882102b\") " pod="openstack/nova-api-0"
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.867876 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f19173b1-326c-4494-9781-a8295882102b-config-data\") pod \"nova-api-0\" (UID: \"f19173b1-326c-4494-9781-a8295882102b\") " pod="openstack/nova-api-0"
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.868018 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f19173b1-326c-4494-9781-a8295882102b-public-tls-certs\") pod \"nova-api-0\" (UID: \"f19173b1-326c-4494-9781-a8295882102b\") " pod="openstack/nova-api-0"
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.868139 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f19173b1-326c-4494-9781-a8295882102b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f19173b1-326c-4494-9781-a8295882102b\") " pod="openstack/nova-api-0"
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.868308 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f19173b1-326c-4494-9781-a8295882102b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f19173b1-326c-4494-9781-a8295882102b\") " pod="openstack/nova-api-0"
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.969785 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f19173b1-326c-4494-9781-a8295882102b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f19173b1-326c-4494-9781-a8295882102b\") " pod="openstack/nova-api-0"
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.970684 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twtz7\" (UniqueName: \"kubernetes.io/projected/f19173b1-326c-4494-9781-a8295882102b-kube-api-access-twtz7\") pod \"nova-api-0\" (UID: \"f19173b1-326c-4494-9781-a8295882102b\") " pod="openstack/nova-api-0"
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.971465 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f19173b1-326c-4494-9781-a8295882102b-logs\") pod \"nova-api-0\" (UID: \"f19173b1-326c-4494-9781-a8295882102b\") " pod="openstack/nova-api-0"
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.971569 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f19173b1-326c-4494-9781-a8295882102b-config-data\") pod \"nova-api-0\" (UID: \"f19173b1-326c-4494-9781-a8295882102b\") " pod="openstack/nova-api-0"
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.971630 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f19173b1-326c-4494-9781-a8295882102b-public-tls-certs\") pod \"nova-api-0\" (UID: \"f19173b1-326c-4494-9781-a8295882102b\") " pod="openstack/nova-api-0"
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.971681 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f19173b1-326c-4494-9781-a8295882102b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f19173b1-326c-4494-9781-a8295882102b\") " pod="openstack/nova-api-0"
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.973274 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f19173b1-326c-4494-9781-a8295882102b-logs\") pod \"nova-api-0\" (UID: \"f19173b1-326c-4494-9781-a8295882102b\") " pod="openstack/nova-api-0"
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.977388 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f19173b1-326c-4494-9781-a8295882102b-public-tls-certs\") pod \"nova-api-0\" (UID: \"f19173b1-326c-4494-9781-a8295882102b\") " pod="openstack/nova-api-0"
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.979479 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f19173b1-326c-4494-9781-a8295882102b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f19173b1-326c-4494-9781-a8295882102b\") " pod="openstack/nova-api-0"
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.979668 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f19173b1-326c-4494-9781-a8295882102b-config-data\") pod \"nova-api-0\" (UID: \"f19173b1-326c-4494-9781-a8295882102b\") " pod="openstack/nova-api-0"
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.981668 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f19173b1-326c-4494-9781-a8295882102b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f19173b1-326c-4494-9781-a8295882102b\") " pod="openstack/nova-api-0"
Oct 08 18:32:02 crc kubenswrapper[4750]: I1008 18:32:02.988091 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twtz7\" (UniqueName: \"kubernetes.io/projected/f19173b1-326c-4494-9781-a8295882102b-kube-api-access-twtz7\") pod \"nova-api-0\" (UID: \"f19173b1-326c-4494-9781-a8295882102b\") " pod="openstack/nova-api-0"
Oct 08 18:32:03 crc kubenswrapper[4750]: I1008 18:32:03.160373 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 08 18:32:03 crc kubenswrapper[4750]: I1008 18:32:03.616374 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 08 18:32:03 crc kubenswrapper[4750]: I1008 18:32:03.753329 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f19173b1-326c-4494-9781-a8295882102b","Type":"ContainerStarted","Data":"9036682e7aeb7d321be2ded71d4bf8d3291435e969e4a74dd5a0e76e237098c2"}
Oct 08 18:32:04 crc kubenswrapper[4750]: I1008 18:32:04.758158 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85db2d5f-53d2-4875-b334-8b3689922391" path="/var/lib/kubelet/pods/85db2d5f-53d2-4875-b334-8b3689922391/volumes"
Oct 08 18:32:04 crc kubenswrapper[4750]: I1008 18:32:04.766458 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f19173b1-326c-4494-9781-a8295882102b","Type":"ContainerStarted","Data":"5021ee4c026ca9f157f7fda91855ac96e719bf6752caff78f473e240b9b67b2e"}
Oct 08 18:32:04 crc kubenswrapper[4750]: I1008 18:32:04.766507 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f19173b1-326c-4494-9781-a8295882102b","Type":"ContainerStarted","Data":"0254d111587e316a9523bb9f18f579915092e7c31cb6f72ff7b68e04110499d3"}
Oct 08 18:32:04 crc kubenswrapper[4750]: I1008 18:32:04.769580 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe7385d5-3c78-4238-96be-78392eddee4b","Type":"ContainerStarted","Data":"5eedb186a864de7ffc8ffb0c7d12aa7cbfd51fbb9c7ce15f42d2611d1dc2df3a"}
Oct 08 18:32:04 crc kubenswrapper[4750]: I1008 18:32:04.769763 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Oct 08 18:32:04 crc kubenswrapper[4750]: I1008 18:32:04.855965 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.8559406960000002 podStartE2EDuration="2.855940696s" podCreationTimestamp="2025-10-08 18:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:32:04.839801442 +0000 UTC m=+1280.752772465" watchObservedRunningTime="2025-10-08 18:32:04.855940696 +0000 UTC m=+1280.768911709"
Oct 08 18:32:04 crc kubenswrapper[4750]: I1008 18:32:04.869443 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.265555119 podStartE2EDuration="5.869424335s" podCreationTimestamp="2025-10-08 18:31:59 +0000 UTC" firstStartedPulling="2025-10-08 18:32:00.637335123 +0000 UTC m=+1276.550306136" lastFinishedPulling="2025-10-08 18:32:04.241204339 +0000 UTC m=+1280.154175352" observedRunningTime="2025-10-08 18:32:04.857062893 +0000 UTC m=+1280.770033916" watchObservedRunningTime="2025-10-08 18:32:04.869424335 +0000 UTC m=+1280.782395348"
Oct 08 18:32:05 crc kubenswrapper[4750]: I1008 18:32:05.177924 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Oct 08 18:32:06 crc kubenswrapper[4750]: I1008 18:32:06.114690 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d4d96bb9-84nq2"
Oct 08 18:32:06 crc kubenswrapper[4750]: I1008 18:32:06.197257 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6ffc974fdf-tl7wv"]
Oct 08 18:32:06 crc kubenswrapper[4750]: I1008 18:32:06.197736 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6ffc974fdf-tl7wv" podUID="f7968787-4100-4e44-b289-0511fe895128" containerName="dnsmasq-dns" containerID="cri-o://a1e16b95c1e9f4b306eabd552266a1ee9b208c2b5a53dcd2a29124e046deb750" gracePeriod=10
Oct 08 18:32:06 crc kubenswrapper[4750]: I1008 18:32:06.692880 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ffc974fdf-tl7wv"
Oct 08 18:32:06 crc kubenswrapper[4750]: I1008 18:32:06.741259 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7968787-4100-4e44-b289-0511fe895128-dns-svc\") pod \"f7968787-4100-4e44-b289-0511fe895128\" (UID: \"f7968787-4100-4e44-b289-0511fe895128\") "
Oct 08 18:32:06 crc kubenswrapper[4750]: I1008 18:32:06.741314 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7968787-4100-4e44-b289-0511fe895128-ovsdbserver-sb\") pod \"f7968787-4100-4e44-b289-0511fe895128\" (UID: \"f7968787-4100-4e44-b289-0511fe895128\") "
Oct 08 18:32:06 crc kubenswrapper[4750]: I1008 18:32:06.741501 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fwzl\" (UniqueName: \"kubernetes.io/projected/f7968787-4100-4e44-b289-0511fe895128-kube-api-access-4fwzl\") pod \"f7968787-4100-4e44-b289-0511fe895128\" (UID: \"f7968787-4100-4e44-b289-0511fe895128\") "
Oct 08 18:32:06 crc kubenswrapper[4750]: I1008 18:32:06.741528 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7968787-4100-4e44-b289-0511fe895128-config\") pod \"f7968787-4100-4e44-b289-0511fe895128\" (UID: \"f7968787-4100-4e44-b289-0511fe895128\") "
Oct 08 18:32:06 crc kubenswrapper[4750]: I1008 18:32:06.741581 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f7968787-4100-4e44-b289-0511fe895128-dns-swift-storage-0\") pod \"f7968787-4100-4e44-b289-0511fe895128\" (UID: \"f7968787-4100-4e44-b289-0511fe895128\") "
Oct 08 18:32:06 crc kubenswrapper[4750]: I1008 18:32:06.741608 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7968787-4100-4e44-b289-0511fe895128-ovsdbserver-nb\") pod \"f7968787-4100-4e44-b289-0511fe895128\" (UID: \"f7968787-4100-4e44-b289-0511fe895128\") "
Oct 08 18:32:06 crc kubenswrapper[4750]: I1008 18:32:06.763168 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7968787-4100-4e44-b289-0511fe895128-kube-api-access-4fwzl" (OuterVolumeSpecName: "kube-api-access-4fwzl") pod "f7968787-4100-4e44-b289-0511fe895128" (UID: "f7968787-4100-4e44-b289-0511fe895128"). InnerVolumeSpecName "kube-api-access-4fwzl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 18:32:06 crc kubenswrapper[4750]: I1008 18:32:06.793237 4750 generic.go:334] "Generic (PLEG): container finished" podID="f7968787-4100-4e44-b289-0511fe895128" containerID="a1e16b95c1e9f4b306eabd552266a1ee9b208c2b5a53dcd2a29124e046deb750" exitCode=0
Oct 08 18:32:06 crc kubenswrapper[4750]: I1008 18:32:06.793330 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ffc974fdf-tl7wv"
Oct 08 18:32:06 crc kubenswrapper[4750]: I1008 18:32:06.798356 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7968787-4100-4e44-b289-0511fe895128-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f7968787-4100-4e44-b289-0511fe895128" (UID: "f7968787-4100-4e44-b289-0511fe895128"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 18:32:06 crc kubenswrapper[4750]: I1008 18:32:06.815414 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7968787-4100-4e44-b289-0511fe895128-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f7968787-4100-4e44-b289-0511fe895128" (UID: "f7968787-4100-4e44-b289-0511fe895128"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 18:32:06 crc kubenswrapper[4750]: I1008 18:32:06.815477 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7968787-4100-4e44-b289-0511fe895128-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f7968787-4100-4e44-b289-0511fe895128" (UID: "f7968787-4100-4e44-b289-0511fe895128"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 18:32:06 crc kubenswrapper[4750]: I1008 18:32:06.820978 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7968787-4100-4e44-b289-0511fe895128-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f7968787-4100-4e44-b289-0511fe895128" (UID: "f7968787-4100-4e44-b289-0511fe895128"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 18:32:06 crc kubenswrapper[4750]: I1008 18:32:06.844067 4750 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7968787-4100-4e44-b289-0511fe895128-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 08 18:32:06 crc kubenswrapper[4750]: I1008 18:32:06.844100 4750 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7968787-4100-4e44-b289-0511fe895128-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 08 18:32:06 crc kubenswrapper[4750]: I1008 18:32:06.844113 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fwzl\" (UniqueName: \"kubernetes.io/projected/f7968787-4100-4e44-b289-0511fe895128-kube-api-access-4fwzl\") on node \"crc\" DevicePath \"\""
Oct 08 18:32:06 crc kubenswrapper[4750]: I1008 18:32:06.844125 4750 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f7968787-4100-4e44-b289-0511fe895128-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Oct 08 18:32:06 crc kubenswrapper[4750]: I1008 18:32:06.844137 4750 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7968787-4100-4e44-b289-0511fe895128-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 08 18:32:06 crc kubenswrapper[4750]: I1008 18:32:06.848009 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7968787-4100-4e44-b289-0511fe895128-config" (OuterVolumeSpecName: "config") pod "f7968787-4100-4e44-b289-0511fe895128" (UID: "f7968787-4100-4e44-b289-0511fe895128"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 18:32:06 crc kubenswrapper[4750]: I1008 18:32:06.850103 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ffc974fdf-tl7wv" event={"ID":"f7968787-4100-4e44-b289-0511fe895128","Type":"ContainerDied","Data":"a1e16b95c1e9f4b306eabd552266a1ee9b208c2b5a53dcd2a29124e046deb750"}
Oct 08 18:32:06 crc kubenswrapper[4750]: I1008 18:32:06.850136 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ffc974fdf-tl7wv" event={"ID":"f7968787-4100-4e44-b289-0511fe895128","Type":"ContainerDied","Data":"948d4d1fc08cc6a261b2b0e6273ecc24d68f6ca1a93578b39fea5101cb3169ff"}
Oct 08 18:32:06 crc kubenswrapper[4750]: I1008 18:32:06.850153 4750 scope.go:117] "RemoveContainer" containerID="a1e16b95c1e9f4b306eabd552266a1ee9b208c2b5a53dcd2a29124e046deb750"
Oct 08 18:32:06 crc kubenswrapper[4750]: I1008 18:32:06.869094 4750 scope.go:117] "RemoveContainer" containerID="122c652fb6fa64bb4ca43c1395d1689ed663f70dc2e7d93218a62b55f12cc350"
Oct 08 18:32:06 crc kubenswrapper[4750]: I1008 18:32:06.894735 4750 scope.go:117] "RemoveContainer" containerID="a1e16b95c1e9f4b306eabd552266a1ee9b208c2b5a53dcd2a29124e046deb750"
Oct 08 18:32:06 crc kubenswrapper[4750]: E1008 18:32:06.895157 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1e16b95c1e9f4b306eabd552266a1ee9b208c2b5a53dcd2a29124e046deb750\": container with ID starting with a1e16b95c1e9f4b306eabd552266a1ee9b208c2b5a53dcd2a29124e046deb750 not found: ID does not exist" containerID="a1e16b95c1e9f4b306eabd552266a1ee9b208c2b5a53dcd2a29124e046deb750"
Oct 08 18:32:06 crc kubenswrapper[4750]: I1008 18:32:06.895199 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1e16b95c1e9f4b306eabd552266a1ee9b208c2b5a53dcd2a29124e046deb750"} err="failed to get container status \"a1e16b95c1e9f4b306eabd552266a1ee9b208c2b5a53dcd2a29124e046deb750\": rpc error: code = NotFound desc = could not find container \"a1e16b95c1e9f4b306eabd552266a1ee9b208c2b5a53dcd2a29124e046deb750\": container with ID starting with a1e16b95c1e9f4b306eabd552266a1ee9b208c2b5a53dcd2a29124e046deb750 not found: ID does not exist"
Oct 08 18:32:06 crc kubenswrapper[4750]: I1008 18:32:06.895227 4750 scope.go:117] "RemoveContainer" containerID="122c652fb6fa64bb4ca43c1395d1689ed663f70dc2e7d93218a62b55f12cc350"
Oct 08 18:32:06 crc kubenswrapper[4750]: E1008 18:32:06.895568 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"122c652fb6fa64bb4ca43c1395d1689ed663f70dc2e7d93218a62b55f12cc350\": container with ID starting with 122c652fb6fa64bb4ca43c1395d1689ed663f70dc2e7d93218a62b55f12cc350 not found: ID does not exist" containerID="122c652fb6fa64bb4ca43c1395d1689ed663f70dc2e7d93218a62b55f12cc350"
Oct 08 18:32:06 crc kubenswrapper[4750]: I1008 18:32:06.895597 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"122c652fb6fa64bb4ca43c1395d1689ed663f70dc2e7d93218a62b55f12cc350"} err="failed to get container status \"122c652fb6fa64bb4ca43c1395d1689ed663f70dc2e7d93218a62b55f12cc350\": rpc error: code = NotFound desc = could not find container \"122c652fb6fa64bb4ca43c1395d1689ed663f70dc2e7d93218a62b55f12cc350\": container with ID starting with 122c652fb6fa64bb4ca43c1395d1689ed663f70dc2e7d93218a62b55f12cc350 not found: ID does not exist"
Oct 08 18:32:06 crc kubenswrapper[4750]: I1008 18:32:06.946223 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7968787-4100-4e44-b289-0511fe895128-config\") on node \"crc\" DevicePath \"\""
Oct 08 18:32:07 crc kubenswrapper[4750]: I1008 18:32:07.156204 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6ffc974fdf-tl7wv"]
Oct 08 18:32:07 crc 
kubenswrapper[4750]: I1008 18:32:07.171355 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6ffc974fdf-tl7wv"] Oct 08 18:32:08 crc kubenswrapper[4750]: I1008 18:32:08.749002 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7968787-4100-4e44-b289-0511fe895128" path="/var/lib/kubelet/pods/f7968787-4100-4e44-b289-0511fe895128/volumes" Oct 08 18:32:10 crc kubenswrapper[4750]: I1008 18:32:10.187838 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 08 18:32:10 crc kubenswrapper[4750]: I1008 18:32:10.217276 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 08 18:32:10 crc kubenswrapper[4750]: I1008 18:32:10.855280 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 08 18:32:11 crc kubenswrapper[4750]: I1008 18:32:11.035303 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-2klwz"] Oct 08 18:32:11 crc kubenswrapper[4750]: E1008 18:32:11.035829 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7968787-4100-4e44-b289-0511fe895128" containerName="init" Oct 08 18:32:11 crc kubenswrapper[4750]: I1008 18:32:11.035853 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7968787-4100-4e44-b289-0511fe895128" containerName="init" Oct 08 18:32:11 crc kubenswrapper[4750]: E1008 18:32:11.035901 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7968787-4100-4e44-b289-0511fe895128" containerName="dnsmasq-dns" Oct 08 18:32:11 crc kubenswrapper[4750]: I1008 18:32:11.035911 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7968787-4100-4e44-b289-0511fe895128" containerName="dnsmasq-dns" Oct 08 18:32:11 crc kubenswrapper[4750]: I1008 18:32:11.036140 4750 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f7968787-4100-4e44-b289-0511fe895128" containerName="dnsmasq-dns" Oct 08 18:32:11 crc kubenswrapper[4750]: I1008 18:32:11.036967 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-2klwz" Oct 08 18:32:11 crc kubenswrapper[4750]: I1008 18:32:11.039326 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 08 18:32:11 crc kubenswrapper[4750]: I1008 18:32:11.039410 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 08 18:32:11 crc kubenswrapper[4750]: I1008 18:32:11.047066 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-2klwz"] Oct 08 18:32:11 crc kubenswrapper[4750]: I1008 18:32:11.215253 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5smgz\" (UniqueName: \"kubernetes.io/projected/5dca9e23-fccb-4dae-98c1-f6670caac28c-kube-api-access-5smgz\") pod \"nova-cell1-cell-mapping-2klwz\" (UID: \"5dca9e23-fccb-4dae-98c1-f6670caac28c\") " pod="openstack/nova-cell1-cell-mapping-2klwz" Oct 08 18:32:11 crc kubenswrapper[4750]: I1008 18:32:11.215367 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dca9e23-fccb-4dae-98c1-f6670caac28c-config-data\") pod \"nova-cell1-cell-mapping-2klwz\" (UID: \"5dca9e23-fccb-4dae-98c1-f6670caac28c\") " pod="openstack/nova-cell1-cell-mapping-2klwz" Oct 08 18:32:11 crc kubenswrapper[4750]: I1008 18:32:11.215473 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dca9e23-fccb-4dae-98c1-f6670caac28c-scripts\") pod \"nova-cell1-cell-mapping-2klwz\" (UID: \"5dca9e23-fccb-4dae-98c1-f6670caac28c\") " pod="openstack/nova-cell1-cell-mapping-2klwz" Oct 08 
18:32:11 crc kubenswrapper[4750]: I1008 18:32:11.215525 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dca9e23-fccb-4dae-98c1-f6670caac28c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-2klwz\" (UID: \"5dca9e23-fccb-4dae-98c1-f6670caac28c\") " pod="openstack/nova-cell1-cell-mapping-2klwz" Oct 08 18:32:11 crc kubenswrapper[4750]: I1008 18:32:11.316914 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5smgz\" (UniqueName: \"kubernetes.io/projected/5dca9e23-fccb-4dae-98c1-f6670caac28c-kube-api-access-5smgz\") pod \"nova-cell1-cell-mapping-2klwz\" (UID: \"5dca9e23-fccb-4dae-98c1-f6670caac28c\") " pod="openstack/nova-cell1-cell-mapping-2klwz" Oct 08 18:32:11 crc kubenswrapper[4750]: I1008 18:32:11.317005 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dca9e23-fccb-4dae-98c1-f6670caac28c-config-data\") pod \"nova-cell1-cell-mapping-2klwz\" (UID: \"5dca9e23-fccb-4dae-98c1-f6670caac28c\") " pod="openstack/nova-cell1-cell-mapping-2klwz" Oct 08 18:32:11 crc kubenswrapper[4750]: I1008 18:32:11.317095 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dca9e23-fccb-4dae-98c1-f6670caac28c-scripts\") pod \"nova-cell1-cell-mapping-2klwz\" (UID: \"5dca9e23-fccb-4dae-98c1-f6670caac28c\") " pod="openstack/nova-cell1-cell-mapping-2klwz" Oct 08 18:32:11 crc kubenswrapper[4750]: I1008 18:32:11.317145 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dca9e23-fccb-4dae-98c1-f6670caac28c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-2klwz\" (UID: \"5dca9e23-fccb-4dae-98c1-f6670caac28c\") " pod="openstack/nova-cell1-cell-mapping-2klwz" Oct 08 18:32:11 crc 
kubenswrapper[4750]: I1008 18:32:11.323387 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dca9e23-fccb-4dae-98c1-f6670caac28c-scripts\") pod \"nova-cell1-cell-mapping-2klwz\" (UID: \"5dca9e23-fccb-4dae-98c1-f6670caac28c\") " pod="openstack/nova-cell1-cell-mapping-2klwz" Oct 08 18:32:11 crc kubenswrapper[4750]: I1008 18:32:11.324313 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dca9e23-fccb-4dae-98c1-f6670caac28c-config-data\") pod \"nova-cell1-cell-mapping-2klwz\" (UID: \"5dca9e23-fccb-4dae-98c1-f6670caac28c\") " pod="openstack/nova-cell1-cell-mapping-2klwz" Oct 08 18:32:11 crc kubenswrapper[4750]: I1008 18:32:11.324355 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dca9e23-fccb-4dae-98c1-f6670caac28c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-2klwz\" (UID: \"5dca9e23-fccb-4dae-98c1-f6670caac28c\") " pod="openstack/nova-cell1-cell-mapping-2klwz" Oct 08 18:32:11 crc kubenswrapper[4750]: I1008 18:32:11.334194 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5smgz\" (UniqueName: \"kubernetes.io/projected/5dca9e23-fccb-4dae-98c1-f6670caac28c-kube-api-access-5smgz\") pod \"nova-cell1-cell-mapping-2klwz\" (UID: \"5dca9e23-fccb-4dae-98c1-f6670caac28c\") " pod="openstack/nova-cell1-cell-mapping-2klwz" Oct 08 18:32:11 crc kubenswrapper[4750]: I1008 18:32:11.356195 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-2klwz" Oct 08 18:32:11 crc kubenswrapper[4750]: I1008 18:32:11.821411 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-2klwz"] Oct 08 18:32:11 crc kubenswrapper[4750]: W1008 18:32:11.825120 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5dca9e23_fccb_4dae_98c1_f6670caac28c.slice/crio-85db75297048506f79d084ccc39254c4d89065af83e38e28ffed23f5a2cdbfe2 WatchSource:0}: Error finding container 85db75297048506f79d084ccc39254c4d89065af83e38e28ffed23f5a2cdbfe2: Status 404 returned error can't find the container with id 85db75297048506f79d084ccc39254c4d89065af83e38e28ffed23f5a2cdbfe2 Oct 08 18:32:11 crc kubenswrapper[4750]: I1008 18:32:11.851003 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-2klwz" event={"ID":"5dca9e23-fccb-4dae-98c1-f6670caac28c","Type":"ContainerStarted","Data":"85db75297048506f79d084ccc39254c4d89065af83e38e28ffed23f5a2cdbfe2"} Oct 08 18:32:12 crc kubenswrapper[4750]: I1008 18:32:12.861497 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-2klwz" event={"ID":"5dca9e23-fccb-4dae-98c1-f6670caac28c","Type":"ContainerStarted","Data":"e214212500f39b2cfd5d5ceb30ff46551ecb5483a36e38847a82604576ad8136"} Oct 08 18:32:12 crc kubenswrapper[4750]: I1008 18:32:12.883943 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-2klwz" podStartSLOduration=1.883926059 podStartE2EDuration="1.883926059s" podCreationTimestamp="2025-10-08 18:32:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:32:12.873924925 +0000 UTC m=+1288.786895958" watchObservedRunningTime="2025-10-08 18:32:12.883926059 +0000 UTC m=+1288.796897072" Oct 08 18:32:13 crc 
kubenswrapper[4750]: I1008 18:32:13.161612 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 08 18:32:13 crc kubenswrapper[4750]: I1008 18:32:13.161940 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 08 18:32:14 crc kubenswrapper[4750]: I1008 18:32:14.175912 4750 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f19173b1-326c-4494-9781-a8295882102b" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.202:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 08 18:32:14 crc kubenswrapper[4750]: I1008 18:32:14.175958 4750 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f19173b1-326c-4494-9781-a8295882102b" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.202:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 08 18:32:16 crc kubenswrapper[4750]: I1008 18:32:16.904829 4750 generic.go:334] "Generic (PLEG): container finished" podID="5dca9e23-fccb-4dae-98c1-f6670caac28c" containerID="e214212500f39b2cfd5d5ceb30ff46551ecb5483a36e38847a82604576ad8136" exitCode=0 Oct 08 18:32:16 crc kubenswrapper[4750]: I1008 18:32:16.904896 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-2klwz" event={"ID":"5dca9e23-fccb-4dae-98c1-f6670caac28c","Type":"ContainerDied","Data":"e214212500f39b2cfd5d5ceb30ff46551ecb5483a36e38847a82604576ad8136"} Oct 08 18:32:18 crc kubenswrapper[4750]: I1008 18:32:18.351853 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-2klwz" Oct 08 18:32:18 crc kubenswrapper[4750]: I1008 18:32:18.484413 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dca9e23-fccb-4dae-98c1-f6670caac28c-scripts\") pod \"5dca9e23-fccb-4dae-98c1-f6670caac28c\" (UID: \"5dca9e23-fccb-4dae-98c1-f6670caac28c\") " Oct 08 18:32:18 crc kubenswrapper[4750]: I1008 18:32:18.484470 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dca9e23-fccb-4dae-98c1-f6670caac28c-config-data\") pod \"5dca9e23-fccb-4dae-98c1-f6670caac28c\" (UID: \"5dca9e23-fccb-4dae-98c1-f6670caac28c\") " Oct 08 18:32:18 crc kubenswrapper[4750]: I1008 18:32:18.484629 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dca9e23-fccb-4dae-98c1-f6670caac28c-combined-ca-bundle\") pod \"5dca9e23-fccb-4dae-98c1-f6670caac28c\" (UID: \"5dca9e23-fccb-4dae-98c1-f6670caac28c\") " Oct 08 18:32:18 crc kubenswrapper[4750]: I1008 18:32:18.484690 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5smgz\" (UniqueName: \"kubernetes.io/projected/5dca9e23-fccb-4dae-98c1-f6670caac28c-kube-api-access-5smgz\") pod \"5dca9e23-fccb-4dae-98c1-f6670caac28c\" (UID: \"5dca9e23-fccb-4dae-98c1-f6670caac28c\") " Oct 08 18:32:18 crc kubenswrapper[4750]: I1008 18:32:18.492267 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dca9e23-fccb-4dae-98c1-f6670caac28c-kube-api-access-5smgz" (OuterVolumeSpecName: "kube-api-access-5smgz") pod "5dca9e23-fccb-4dae-98c1-f6670caac28c" (UID: "5dca9e23-fccb-4dae-98c1-f6670caac28c"). InnerVolumeSpecName "kube-api-access-5smgz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:32:18 crc kubenswrapper[4750]: I1008 18:32:18.497731 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dca9e23-fccb-4dae-98c1-f6670caac28c-scripts" (OuterVolumeSpecName: "scripts") pod "5dca9e23-fccb-4dae-98c1-f6670caac28c" (UID: "5dca9e23-fccb-4dae-98c1-f6670caac28c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:32:18 crc kubenswrapper[4750]: I1008 18:32:18.513765 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dca9e23-fccb-4dae-98c1-f6670caac28c-config-data" (OuterVolumeSpecName: "config-data") pod "5dca9e23-fccb-4dae-98c1-f6670caac28c" (UID: "5dca9e23-fccb-4dae-98c1-f6670caac28c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:32:18 crc kubenswrapper[4750]: I1008 18:32:18.515402 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dca9e23-fccb-4dae-98c1-f6670caac28c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5dca9e23-fccb-4dae-98c1-f6670caac28c" (UID: "5dca9e23-fccb-4dae-98c1-f6670caac28c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:32:18 crc kubenswrapper[4750]: I1008 18:32:18.587664 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5smgz\" (UniqueName: \"kubernetes.io/projected/5dca9e23-fccb-4dae-98c1-f6670caac28c-kube-api-access-5smgz\") on node \"crc\" DevicePath \"\"" Oct 08 18:32:18 crc kubenswrapper[4750]: I1008 18:32:18.587701 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dca9e23-fccb-4dae-98c1-f6670caac28c-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 18:32:18 crc kubenswrapper[4750]: I1008 18:32:18.587716 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dca9e23-fccb-4dae-98c1-f6670caac28c-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:32:18 crc kubenswrapper[4750]: I1008 18:32:18.587730 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dca9e23-fccb-4dae-98c1-f6670caac28c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:32:18 crc kubenswrapper[4750]: I1008 18:32:18.927435 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-2klwz" event={"ID":"5dca9e23-fccb-4dae-98c1-f6670caac28c","Type":"ContainerDied","Data":"85db75297048506f79d084ccc39254c4d89065af83e38e28ffed23f5a2cdbfe2"} Oct 08 18:32:18 crc kubenswrapper[4750]: I1008 18:32:18.927473 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85db75297048506f79d084ccc39254c4d89065af83e38e28ffed23f5a2cdbfe2" Oct 08 18:32:18 crc kubenswrapper[4750]: I1008 18:32:18.927519 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-2klwz" Oct 08 18:32:19 crc kubenswrapper[4750]: I1008 18:32:19.138493 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 08 18:32:19 crc kubenswrapper[4750]: I1008 18:32:19.139045 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f19173b1-326c-4494-9781-a8295882102b" containerName="nova-api-log" containerID="cri-o://0254d111587e316a9523bb9f18f579915092e7c31cb6f72ff7b68e04110499d3" gracePeriod=30 Oct 08 18:32:19 crc kubenswrapper[4750]: I1008 18:32:19.139880 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f19173b1-326c-4494-9781-a8295882102b" containerName="nova-api-api" containerID="cri-o://5021ee4c026ca9f157f7fda91855ac96e719bf6752caff78f473e240b9b67b2e" gracePeriod=30 Oct 08 18:32:19 crc kubenswrapper[4750]: I1008 18:32:19.151951 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 18:32:19 crc kubenswrapper[4750]: I1008 18:32:19.152146 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="15c75a6e-1d37-4345-a579-62cb0ac8c3fe" containerName="nova-scheduler-scheduler" containerID="cri-o://9bad6f98b1b86ebd196d24fe336fd50ac64decc62f1a30579eb4319296bcc551" gracePeriod=30 Oct 08 18:32:19 crc kubenswrapper[4750]: I1008 18:32:19.247426 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 18:32:19 crc kubenswrapper[4750]: I1008 18:32:19.247756 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1df3c5c3-627e-4bfb-9dd0-16976cc25f3d" containerName="nova-metadata-log" containerID="cri-o://d46b15d4798ebc7d868aadfede508f86cd7efdd6d4843dec421ee6b41081accc" gracePeriod=30 Oct 08 18:32:19 crc kubenswrapper[4750]: I1008 18:32:19.247888 4750 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1df3c5c3-627e-4bfb-9dd0-16976cc25f3d" containerName="nova-metadata-metadata" containerID="cri-o://a602b19189755c874600794f7440ec259a3b2c64913b3122661d9227a9bf77dd" gracePeriod=30 Oct 08 18:32:19 crc kubenswrapper[4750]: I1008 18:32:19.938827 4750 generic.go:334] "Generic (PLEG): container finished" podID="f19173b1-326c-4494-9781-a8295882102b" containerID="0254d111587e316a9523bb9f18f579915092e7c31cb6f72ff7b68e04110499d3" exitCode=143 Oct 08 18:32:19 crc kubenswrapper[4750]: I1008 18:32:19.938866 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f19173b1-326c-4494-9781-a8295882102b","Type":"ContainerDied","Data":"0254d111587e316a9523bb9f18f579915092e7c31cb6f72ff7b68e04110499d3"} Oct 08 18:32:19 crc kubenswrapper[4750]: I1008 18:32:19.942768 4750 generic.go:334] "Generic (PLEG): container finished" podID="1df3c5c3-627e-4bfb-9dd0-16976cc25f3d" containerID="d46b15d4798ebc7d868aadfede508f86cd7efdd6d4843dec421ee6b41081accc" exitCode=143 Oct 08 18:32:19 crc kubenswrapper[4750]: I1008 18:32:19.942810 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1df3c5c3-627e-4bfb-9dd0-16976cc25f3d","Type":"ContainerDied","Data":"d46b15d4798ebc7d868aadfede508f86cd7efdd6d4843dec421ee6b41081accc"} Oct 08 18:32:20 crc kubenswrapper[4750]: I1008 18:32:20.544774 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 18:32:20 crc kubenswrapper[4750]: I1008 18:32:20.733107 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15c75a6e-1d37-4345-a579-62cb0ac8c3fe-combined-ca-bundle\") pod \"15c75a6e-1d37-4345-a579-62cb0ac8c3fe\" (UID: \"15c75a6e-1d37-4345-a579-62cb0ac8c3fe\") " Oct 08 18:32:20 crc kubenswrapper[4750]: I1008 18:32:20.733228 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxkr5\" (UniqueName: \"kubernetes.io/projected/15c75a6e-1d37-4345-a579-62cb0ac8c3fe-kube-api-access-xxkr5\") pod \"15c75a6e-1d37-4345-a579-62cb0ac8c3fe\" (UID: \"15c75a6e-1d37-4345-a579-62cb0ac8c3fe\") " Oct 08 18:32:20 crc kubenswrapper[4750]: I1008 18:32:20.733304 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15c75a6e-1d37-4345-a579-62cb0ac8c3fe-config-data\") pod \"15c75a6e-1d37-4345-a579-62cb0ac8c3fe\" (UID: \"15c75a6e-1d37-4345-a579-62cb0ac8c3fe\") " Oct 08 18:32:20 crc kubenswrapper[4750]: I1008 18:32:20.746127 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15c75a6e-1d37-4345-a579-62cb0ac8c3fe-kube-api-access-xxkr5" (OuterVolumeSpecName: "kube-api-access-xxkr5") pod "15c75a6e-1d37-4345-a579-62cb0ac8c3fe" (UID: "15c75a6e-1d37-4345-a579-62cb0ac8c3fe"). InnerVolumeSpecName "kube-api-access-xxkr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:32:20 crc kubenswrapper[4750]: I1008 18:32:20.767368 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15c75a6e-1d37-4345-a579-62cb0ac8c3fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "15c75a6e-1d37-4345-a579-62cb0ac8c3fe" (UID: "15c75a6e-1d37-4345-a579-62cb0ac8c3fe"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:32:20 crc kubenswrapper[4750]: I1008 18:32:20.780783 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15c75a6e-1d37-4345-a579-62cb0ac8c3fe-config-data" (OuterVolumeSpecName: "config-data") pod "15c75a6e-1d37-4345-a579-62cb0ac8c3fe" (UID: "15c75a6e-1d37-4345-a579-62cb0ac8c3fe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:32:20 crc kubenswrapper[4750]: I1008 18:32:20.837222 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15c75a6e-1d37-4345-a579-62cb0ac8c3fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:32:20 crc kubenswrapper[4750]: I1008 18:32:20.837574 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxkr5\" (UniqueName: \"kubernetes.io/projected/15c75a6e-1d37-4345-a579-62cb0ac8c3fe-kube-api-access-xxkr5\") on node \"crc\" DevicePath \"\"" Oct 08 18:32:20 crc kubenswrapper[4750]: I1008 18:32:20.837596 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15c75a6e-1d37-4345-a579-62cb0ac8c3fe-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:32:20 crc kubenswrapper[4750]: I1008 18:32:20.954124 4750 generic.go:334] "Generic (PLEG): container finished" podID="15c75a6e-1d37-4345-a579-62cb0ac8c3fe" containerID="9bad6f98b1b86ebd196d24fe336fd50ac64decc62f1a30579eb4319296bcc551" exitCode=0 Oct 08 18:32:20 crc kubenswrapper[4750]: I1008 18:32:20.954174 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"15c75a6e-1d37-4345-a579-62cb0ac8c3fe","Type":"ContainerDied","Data":"9bad6f98b1b86ebd196d24fe336fd50ac64decc62f1a30579eb4319296bcc551"} Oct 08 18:32:20 crc kubenswrapper[4750]: I1008 18:32:20.954216 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 18:32:20 crc kubenswrapper[4750]: I1008 18:32:20.954241 4750 scope.go:117] "RemoveContainer" containerID="9bad6f98b1b86ebd196d24fe336fd50ac64decc62f1a30579eb4319296bcc551" Oct 08 18:32:20 crc kubenswrapper[4750]: I1008 18:32:20.954225 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"15c75a6e-1d37-4345-a579-62cb0ac8c3fe","Type":"ContainerDied","Data":"6bd4bcd958ffecb7a4c0fbf6a172399e254402b278f8c9052cbea728cb7c88d9"} Oct 08 18:32:20 crc kubenswrapper[4750]: I1008 18:32:20.993668 4750 scope.go:117] "RemoveContainer" containerID="9bad6f98b1b86ebd196d24fe336fd50ac64decc62f1a30579eb4319296bcc551" Oct 08 18:32:20 crc kubenswrapper[4750]: E1008 18:32:20.994073 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bad6f98b1b86ebd196d24fe336fd50ac64decc62f1a30579eb4319296bcc551\": container with ID starting with 9bad6f98b1b86ebd196d24fe336fd50ac64decc62f1a30579eb4319296bcc551 not found: ID does not exist" containerID="9bad6f98b1b86ebd196d24fe336fd50ac64decc62f1a30579eb4319296bcc551" Oct 08 18:32:20 crc kubenswrapper[4750]: I1008 18:32:20.994118 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bad6f98b1b86ebd196d24fe336fd50ac64decc62f1a30579eb4319296bcc551"} err="failed to get container status \"9bad6f98b1b86ebd196d24fe336fd50ac64decc62f1a30579eb4319296bcc551\": rpc error: code = NotFound desc = could not find container \"9bad6f98b1b86ebd196d24fe336fd50ac64decc62f1a30579eb4319296bcc551\": container with ID starting with 9bad6f98b1b86ebd196d24fe336fd50ac64decc62f1a30579eb4319296bcc551 not found: ID does not exist" Oct 08 18:32:20 crc kubenswrapper[4750]: I1008 18:32:20.996408 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 18:32:21 crc kubenswrapper[4750]: I1008 18:32:21.011975 4750 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 18:32:21 crc kubenswrapper[4750]: I1008 18:32:21.025089 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 18:32:21 crc kubenswrapper[4750]: E1008 18:32:21.025531 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dca9e23-fccb-4dae-98c1-f6670caac28c" containerName="nova-manage" Oct 08 18:32:21 crc kubenswrapper[4750]: I1008 18:32:21.025558 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dca9e23-fccb-4dae-98c1-f6670caac28c" containerName="nova-manage" Oct 08 18:32:21 crc kubenswrapper[4750]: E1008 18:32:21.025580 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15c75a6e-1d37-4345-a579-62cb0ac8c3fe" containerName="nova-scheduler-scheduler" Oct 08 18:32:21 crc kubenswrapper[4750]: I1008 18:32:21.025589 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="15c75a6e-1d37-4345-a579-62cb0ac8c3fe" containerName="nova-scheduler-scheduler" Oct 08 18:32:21 crc kubenswrapper[4750]: I1008 18:32:21.025779 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dca9e23-fccb-4dae-98c1-f6670caac28c" containerName="nova-manage" Oct 08 18:32:21 crc kubenswrapper[4750]: I1008 18:32:21.025798 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="15c75a6e-1d37-4345-a579-62cb0ac8c3fe" containerName="nova-scheduler-scheduler" Oct 08 18:32:21 crc kubenswrapper[4750]: I1008 18:32:21.026419 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 18:32:21 crc kubenswrapper[4750]: I1008 18:32:21.041024 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 18:32:21 crc kubenswrapper[4750]: I1008 18:32:21.050768 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 08 18:32:21 crc kubenswrapper[4750]: I1008 18:32:21.145381 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dba17973-3023-43ae-9b75-a8e1dc7f16cc-config-data\") pod \"nova-scheduler-0\" (UID: \"dba17973-3023-43ae-9b75-a8e1dc7f16cc\") " pod="openstack/nova-scheduler-0" Oct 08 18:32:21 crc kubenswrapper[4750]: I1008 18:32:21.145425 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bvf6\" (UniqueName: \"kubernetes.io/projected/dba17973-3023-43ae-9b75-a8e1dc7f16cc-kube-api-access-5bvf6\") pod \"nova-scheduler-0\" (UID: \"dba17973-3023-43ae-9b75-a8e1dc7f16cc\") " pod="openstack/nova-scheduler-0" Oct 08 18:32:21 crc kubenswrapper[4750]: I1008 18:32:21.145499 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dba17973-3023-43ae-9b75-a8e1dc7f16cc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dba17973-3023-43ae-9b75-a8e1dc7f16cc\") " pod="openstack/nova-scheduler-0" Oct 08 18:32:21 crc kubenswrapper[4750]: I1008 18:32:21.248361 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dba17973-3023-43ae-9b75-a8e1dc7f16cc-config-data\") pod \"nova-scheduler-0\" (UID: \"dba17973-3023-43ae-9b75-a8e1dc7f16cc\") " pod="openstack/nova-scheduler-0" Oct 08 18:32:21 crc kubenswrapper[4750]: I1008 18:32:21.248438 4750 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5bvf6\" (UniqueName: \"kubernetes.io/projected/dba17973-3023-43ae-9b75-a8e1dc7f16cc-kube-api-access-5bvf6\") pod \"nova-scheduler-0\" (UID: \"dba17973-3023-43ae-9b75-a8e1dc7f16cc\") " pod="openstack/nova-scheduler-0" Oct 08 18:32:21 crc kubenswrapper[4750]: I1008 18:32:21.248644 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dba17973-3023-43ae-9b75-a8e1dc7f16cc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dba17973-3023-43ae-9b75-a8e1dc7f16cc\") " pod="openstack/nova-scheduler-0" Oct 08 18:32:21 crc kubenswrapper[4750]: I1008 18:32:21.259796 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dba17973-3023-43ae-9b75-a8e1dc7f16cc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dba17973-3023-43ae-9b75-a8e1dc7f16cc\") " pod="openstack/nova-scheduler-0" Oct 08 18:32:21 crc kubenswrapper[4750]: I1008 18:32:21.266914 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dba17973-3023-43ae-9b75-a8e1dc7f16cc-config-data\") pod \"nova-scheduler-0\" (UID: \"dba17973-3023-43ae-9b75-a8e1dc7f16cc\") " pod="openstack/nova-scheduler-0" Oct 08 18:32:21 crc kubenswrapper[4750]: I1008 18:32:21.268509 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bvf6\" (UniqueName: \"kubernetes.io/projected/dba17973-3023-43ae-9b75-a8e1dc7f16cc-kube-api-access-5bvf6\") pod \"nova-scheduler-0\" (UID: \"dba17973-3023-43ae-9b75-a8e1dc7f16cc\") " pod="openstack/nova-scheduler-0" Oct 08 18:32:21 crc kubenswrapper[4750]: I1008 18:32:21.368212 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 18:32:21 crc kubenswrapper[4750]: I1008 18:32:21.856703 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 18:32:21 crc kubenswrapper[4750]: W1008 18:32:21.864487 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddba17973_3023_43ae_9b75_a8e1dc7f16cc.slice/crio-d6e654b894d5cded9abe8a9c678d1182bce2b85cc38e33fb9bcf87c625b94b36 WatchSource:0}: Error finding container d6e654b894d5cded9abe8a9c678d1182bce2b85cc38e33fb9bcf87c625b94b36: Status 404 returned error can't find the container with id d6e654b894d5cded9abe8a9c678d1182bce2b85cc38e33fb9bcf87c625b94b36 Oct 08 18:32:21 crc kubenswrapper[4750]: I1008 18:32:21.964859 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dba17973-3023-43ae-9b75-a8e1dc7f16cc","Type":"ContainerStarted","Data":"d6e654b894d5cded9abe8a9c678d1182bce2b85cc38e33fb9bcf87c625b94b36"} Oct 08 18:32:22 crc kubenswrapper[4750]: I1008 18:32:22.396767 4750 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="1df3c5c3-627e-4bfb-9dd0-16976cc25f3d" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": read tcp 10.217.0.2:52592->10.217.0.195:8775: read: connection reset by peer" Oct 08 18:32:22 crc kubenswrapper[4750]: I1008 18:32:22.396794 4750 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="1df3c5c3-627e-4bfb-9dd0-16976cc25f3d" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": read tcp 10.217.0.2:52606->10.217.0.195:8775: read: connection reset by peer" Oct 08 18:32:22 crc kubenswrapper[4750]: I1008 18:32:22.750324 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15c75a6e-1d37-4345-a579-62cb0ac8c3fe" 
path="/var/lib/kubelet/pods/15c75a6e-1d37-4345-a579-62cb0ac8c3fe/volumes" Oct 08 18:32:22 crc kubenswrapper[4750]: I1008 18:32:22.765874 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 18:32:22 crc kubenswrapper[4750]: I1008 18:32:22.870885 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 18:32:22 crc kubenswrapper[4750]: I1008 18:32:22.882430 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twtz7\" (UniqueName: \"kubernetes.io/projected/f19173b1-326c-4494-9781-a8295882102b-kube-api-access-twtz7\") pod \"f19173b1-326c-4494-9781-a8295882102b\" (UID: \"f19173b1-326c-4494-9781-a8295882102b\") " Oct 08 18:32:22 crc kubenswrapper[4750]: I1008 18:32:22.883344 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f19173b1-326c-4494-9781-a8295882102b-internal-tls-certs\") pod \"f19173b1-326c-4494-9781-a8295882102b\" (UID: \"f19173b1-326c-4494-9781-a8295882102b\") " Oct 08 18:32:22 crc kubenswrapper[4750]: I1008 18:32:22.883542 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f19173b1-326c-4494-9781-a8295882102b-config-data\") pod \"f19173b1-326c-4494-9781-a8295882102b\" (UID: \"f19173b1-326c-4494-9781-a8295882102b\") " Oct 08 18:32:22 crc kubenswrapper[4750]: I1008 18:32:22.883693 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f19173b1-326c-4494-9781-a8295882102b-combined-ca-bundle\") pod \"f19173b1-326c-4494-9781-a8295882102b\" (UID: \"f19173b1-326c-4494-9781-a8295882102b\") " Oct 08 18:32:22 crc kubenswrapper[4750]: I1008 18:32:22.883813 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f19173b1-326c-4494-9781-a8295882102b-public-tls-certs\") pod \"f19173b1-326c-4494-9781-a8295882102b\" (UID: \"f19173b1-326c-4494-9781-a8295882102b\") " Oct 08 18:32:22 crc kubenswrapper[4750]: I1008 18:32:22.883885 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f19173b1-326c-4494-9781-a8295882102b-logs\") pod \"f19173b1-326c-4494-9781-a8295882102b\" (UID: \"f19173b1-326c-4494-9781-a8295882102b\") " Oct 08 18:32:22 crc kubenswrapper[4750]: I1008 18:32:22.885030 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f19173b1-326c-4494-9781-a8295882102b-logs" (OuterVolumeSpecName: "logs") pod "f19173b1-326c-4494-9781-a8295882102b" (UID: "f19173b1-326c-4494-9781-a8295882102b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:32:22 crc kubenswrapper[4750]: I1008 18:32:22.891568 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f19173b1-326c-4494-9781-a8295882102b-kube-api-access-twtz7" (OuterVolumeSpecName: "kube-api-access-twtz7") pod "f19173b1-326c-4494-9781-a8295882102b" (UID: "f19173b1-326c-4494-9781-a8295882102b"). InnerVolumeSpecName "kube-api-access-twtz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:32:22 crc kubenswrapper[4750]: I1008 18:32:22.928888 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f19173b1-326c-4494-9781-a8295882102b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f19173b1-326c-4494-9781-a8295882102b" (UID: "f19173b1-326c-4494-9781-a8295882102b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:32:22 crc kubenswrapper[4750]: I1008 18:32:22.954396 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f19173b1-326c-4494-9781-a8295882102b-config-data" (OuterVolumeSpecName: "config-data") pod "f19173b1-326c-4494-9781-a8295882102b" (UID: "f19173b1-326c-4494-9781-a8295882102b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:32:22 crc kubenswrapper[4750]: I1008 18:32:22.961281 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f19173b1-326c-4494-9781-a8295882102b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f19173b1-326c-4494-9781-a8295882102b" (UID: "f19173b1-326c-4494-9781-a8295882102b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:32:22 crc kubenswrapper[4750]: I1008 18:32:22.983079 4750 generic.go:334] "Generic (PLEG): container finished" podID="f19173b1-326c-4494-9781-a8295882102b" containerID="5021ee4c026ca9f157f7fda91855ac96e719bf6752caff78f473e240b9b67b2e" exitCode=0 Oct 08 18:32:22 crc kubenswrapper[4750]: I1008 18:32:22.983210 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 08 18:32:22 crc kubenswrapper[4750]: I1008 18:32:22.983632 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f19173b1-326c-4494-9781-a8295882102b","Type":"ContainerDied","Data":"5021ee4c026ca9f157f7fda91855ac96e719bf6752caff78f473e240b9b67b2e"} Oct 08 18:32:22 crc kubenswrapper[4750]: I1008 18:32:22.983681 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f19173b1-326c-4494-9781-a8295882102b","Type":"ContainerDied","Data":"9036682e7aeb7d321be2ded71d4bf8d3291435e969e4a74dd5a0e76e237098c2"} Oct 08 18:32:22 crc kubenswrapper[4750]: I1008 18:32:22.983698 4750 scope.go:117] "RemoveContainer" containerID="5021ee4c026ca9f157f7fda91855ac96e719bf6752caff78f473e240b9b67b2e" Oct 08 18:32:22 crc kubenswrapper[4750]: I1008 18:32:22.986946 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1df3c5c3-627e-4bfb-9dd0-16976cc25f3d-combined-ca-bundle\") pod \"1df3c5c3-627e-4bfb-9dd0-16976cc25f3d\" (UID: \"1df3c5c3-627e-4bfb-9dd0-16976cc25f3d\") " Oct 08 18:32:22 crc kubenswrapper[4750]: I1008 18:32:22.987083 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1df3c5c3-627e-4bfb-9dd0-16976cc25f3d-logs\") pod \"1df3c5c3-627e-4bfb-9dd0-16976cc25f3d\" (UID: \"1df3c5c3-627e-4bfb-9dd0-16976cc25f3d\") " Oct 08 18:32:22 crc kubenswrapper[4750]: I1008 18:32:22.987106 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlvzh\" (UniqueName: \"kubernetes.io/projected/1df3c5c3-627e-4bfb-9dd0-16976cc25f3d-kube-api-access-qlvzh\") pod \"1df3c5c3-627e-4bfb-9dd0-16976cc25f3d\" (UID: \"1df3c5c3-627e-4bfb-9dd0-16976cc25f3d\") " Oct 08 18:32:22 crc kubenswrapper[4750]: I1008 18:32:22.987136 4750 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1df3c5c3-627e-4bfb-9dd0-16976cc25f3d-config-data\") pod \"1df3c5c3-627e-4bfb-9dd0-16976cc25f3d\" (UID: \"1df3c5c3-627e-4bfb-9dd0-16976cc25f3d\") " Oct 08 18:32:22 crc kubenswrapper[4750]: I1008 18:32:22.987285 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1df3c5c3-627e-4bfb-9dd0-16976cc25f3d-nova-metadata-tls-certs\") pod \"1df3c5c3-627e-4bfb-9dd0-16976cc25f3d\" (UID: \"1df3c5c3-627e-4bfb-9dd0-16976cc25f3d\") " Oct 08 18:32:22 crc kubenswrapper[4750]: I1008 18:32:22.987804 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1df3c5c3-627e-4bfb-9dd0-16976cc25f3d-logs" (OuterVolumeSpecName: "logs") pod "1df3c5c3-627e-4bfb-9dd0-16976cc25f3d" (UID: "1df3c5c3-627e-4bfb-9dd0-16976cc25f3d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:32:22 crc kubenswrapper[4750]: I1008 18:32:22.988201 4750 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f19173b1-326c-4494-9781-a8295882102b-logs\") on node \"crc\" DevicePath \"\"" Oct 08 18:32:22 crc kubenswrapper[4750]: I1008 18:32:22.988214 4750 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1df3c5c3-627e-4bfb-9dd0-16976cc25f3d-logs\") on node \"crc\" DevicePath \"\"" Oct 08 18:32:22 crc kubenswrapper[4750]: I1008 18:32:22.988225 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twtz7\" (UniqueName: \"kubernetes.io/projected/f19173b1-326c-4494-9781-a8295882102b-kube-api-access-twtz7\") on node \"crc\" DevicePath \"\"" Oct 08 18:32:22 crc kubenswrapper[4750]: I1008 18:32:22.988237 4750 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f19173b1-326c-4494-9781-a8295882102b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 18:32:22 crc kubenswrapper[4750]: I1008 18:32:22.988246 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f19173b1-326c-4494-9781-a8295882102b-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:32:22 crc kubenswrapper[4750]: I1008 18:32:22.988255 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f19173b1-326c-4494-9781-a8295882102b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:32:22 crc kubenswrapper[4750]: I1008 18:32:22.990115 4750 generic.go:334] "Generic (PLEG): container finished" podID="1df3c5c3-627e-4bfb-9dd0-16976cc25f3d" containerID="a602b19189755c874600794f7440ec259a3b2c64913b3122661d9227a9bf77dd" exitCode=0 Oct 08 18:32:22 crc kubenswrapper[4750]: I1008 18:32:22.990205 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1df3c5c3-627e-4bfb-9dd0-16976cc25f3d","Type":"ContainerDied","Data":"a602b19189755c874600794f7440ec259a3b2c64913b3122661d9227a9bf77dd"} Oct 08 18:32:22 crc kubenswrapper[4750]: I1008 18:32:22.990244 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1df3c5c3-627e-4bfb-9dd0-16976cc25f3d","Type":"ContainerDied","Data":"dbfc4f38b84ecd9ecc42d498baaaf06774e43287d385c11988214c2ae759476a"} Oct 08 18:32:22 crc kubenswrapper[4750]: I1008 18:32:22.990301 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 18:32:22 crc kubenswrapper[4750]: I1008 18:32:22.994145 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dba17973-3023-43ae-9b75-a8e1dc7f16cc","Type":"ContainerStarted","Data":"295b9725591cb856aaa6df5c90337486a2c9085eabdd2a6955b57c6bb2f78110"} Oct 08 18:32:22 crc kubenswrapper[4750]: I1008 18:32:22.995445 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f19173b1-326c-4494-9781-a8295882102b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f19173b1-326c-4494-9781-a8295882102b" (UID: "f19173b1-326c-4494-9781-a8295882102b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:32:22 crc kubenswrapper[4750]: I1008 18:32:22.997356 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1df3c5c3-627e-4bfb-9dd0-16976cc25f3d-kube-api-access-qlvzh" (OuterVolumeSpecName: "kube-api-access-qlvzh") pod "1df3c5c3-627e-4bfb-9dd0-16976cc25f3d" (UID: "1df3c5c3-627e-4bfb-9dd0-16976cc25f3d"). InnerVolumeSpecName "kube-api-access-qlvzh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.010472 4750 scope.go:117] "RemoveContainer" containerID="0254d111587e316a9523bb9f18f579915092e7c31cb6f72ff7b68e04110499d3" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.016743 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.016725924 podStartE2EDuration="3.016725924s" podCreationTimestamp="2025-10-08 18:32:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:32:23.008032352 +0000 UTC m=+1298.921003365" watchObservedRunningTime="2025-10-08 18:32:23.016725924 +0000 UTC m=+1298.929696937" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.027391 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1df3c5c3-627e-4bfb-9dd0-16976cc25f3d-config-data" (OuterVolumeSpecName: "config-data") pod "1df3c5c3-627e-4bfb-9dd0-16976cc25f3d" (UID: "1df3c5c3-627e-4bfb-9dd0-16976cc25f3d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.029028 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1df3c5c3-627e-4bfb-9dd0-16976cc25f3d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1df3c5c3-627e-4bfb-9dd0-16976cc25f3d" (UID: "1df3c5c3-627e-4bfb-9dd0-16976cc25f3d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.035825 4750 scope.go:117] "RemoveContainer" containerID="5021ee4c026ca9f157f7fda91855ac96e719bf6752caff78f473e240b9b67b2e" Oct 08 18:32:23 crc kubenswrapper[4750]: E1008 18:32:23.036533 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5021ee4c026ca9f157f7fda91855ac96e719bf6752caff78f473e240b9b67b2e\": container with ID starting with 5021ee4c026ca9f157f7fda91855ac96e719bf6752caff78f473e240b9b67b2e not found: ID does not exist" containerID="5021ee4c026ca9f157f7fda91855ac96e719bf6752caff78f473e240b9b67b2e" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.036605 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5021ee4c026ca9f157f7fda91855ac96e719bf6752caff78f473e240b9b67b2e"} err="failed to get container status \"5021ee4c026ca9f157f7fda91855ac96e719bf6752caff78f473e240b9b67b2e\": rpc error: code = NotFound desc = could not find container \"5021ee4c026ca9f157f7fda91855ac96e719bf6752caff78f473e240b9b67b2e\": container with ID starting with 5021ee4c026ca9f157f7fda91855ac96e719bf6752caff78f473e240b9b67b2e not found: ID does not exist" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.036660 4750 scope.go:117] "RemoveContainer" containerID="0254d111587e316a9523bb9f18f579915092e7c31cb6f72ff7b68e04110499d3" Oct 08 18:32:23 crc kubenswrapper[4750]: E1008 18:32:23.037123 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0254d111587e316a9523bb9f18f579915092e7c31cb6f72ff7b68e04110499d3\": container with ID starting with 0254d111587e316a9523bb9f18f579915092e7c31cb6f72ff7b68e04110499d3 not found: ID does not exist" containerID="0254d111587e316a9523bb9f18f579915092e7c31cb6f72ff7b68e04110499d3" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.037356 
4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0254d111587e316a9523bb9f18f579915092e7c31cb6f72ff7b68e04110499d3"} err="failed to get container status \"0254d111587e316a9523bb9f18f579915092e7c31cb6f72ff7b68e04110499d3\": rpc error: code = NotFound desc = could not find container \"0254d111587e316a9523bb9f18f579915092e7c31cb6f72ff7b68e04110499d3\": container with ID starting with 0254d111587e316a9523bb9f18f579915092e7c31cb6f72ff7b68e04110499d3 not found: ID does not exist" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.037423 4750 scope.go:117] "RemoveContainer" containerID="a602b19189755c874600794f7440ec259a3b2c64913b3122661d9227a9bf77dd" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.043329 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1df3c5c3-627e-4bfb-9dd0-16976cc25f3d-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "1df3c5c3-627e-4bfb-9dd0-16976cc25f3d" (UID: "1df3c5c3-627e-4bfb-9dd0-16976cc25f3d"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.066862 4750 scope.go:117] "RemoveContainer" containerID="d46b15d4798ebc7d868aadfede508f86cd7efdd6d4843dec421ee6b41081accc" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.086952 4750 scope.go:117] "RemoveContainer" containerID="a602b19189755c874600794f7440ec259a3b2c64913b3122661d9227a9bf77dd" Oct 08 18:32:23 crc kubenswrapper[4750]: E1008 18:32:23.087542 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a602b19189755c874600794f7440ec259a3b2c64913b3122661d9227a9bf77dd\": container with ID starting with a602b19189755c874600794f7440ec259a3b2c64913b3122661d9227a9bf77dd not found: ID does not exist" containerID="a602b19189755c874600794f7440ec259a3b2c64913b3122661d9227a9bf77dd" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.087603 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a602b19189755c874600794f7440ec259a3b2c64913b3122661d9227a9bf77dd"} err="failed to get container status \"a602b19189755c874600794f7440ec259a3b2c64913b3122661d9227a9bf77dd\": rpc error: code = NotFound desc = could not find container \"a602b19189755c874600794f7440ec259a3b2c64913b3122661d9227a9bf77dd\": container with ID starting with a602b19189755c874600794f7440ec259a3b2c64913b3122661d9227a9bf77dd not found: ID does not exist" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.087655 4750 scope.go:117] "RemoveContainer" containerID="d46b15d4798ebc7d868aadfede508f86cd7efdd6d4843dec421ee6b41081accc" Oct 08 18:32:23 crc kubenswrapper[4750]: E1008 18:32:23.088006 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d46b15d4798ebc7d868aadfede508f86cd7efdd6d4843dec421ee6b41081accc\": container with ID starting with 
d46b15d4798ebc7d868aadfede508f86cd7efdd6d4843dec421ee6b41081accc not found: ID does not exist" containerID="d46b15d4798ebc7d868aadfede508f86cd7efdd6d4843dec421ee6b41081accc" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.088038 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d46b15d4798ebc7d868aadfede508f86cd7efdd6d4843dec421ee6b41081accc"} err="failed to get container status \"d46b15d4798ebc7d868aadfede508f86cd7efdd6d4843dec421ee6b41081accc\": rpc error: code = NotFound desc = could not find container \"d46b15d4798ebc7d868aadfede508f86cd7efdd6d4843dec421ee6b41081accc\": container with ID starting with d46b15d4798ebc7d868aadfede508f86cd7efdd6d4843dec421ee6b41081accc not found: ID does not exist" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.090351 4750 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1df3c5c3-627e-4bfb-9dd0-16976cc25f3d-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.090382 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1df3c5c3-627e-4bfb-9dd0-16976cc25f3d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.090396 4750 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f19173b1-326c-4494-9781-a8295882102b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.090950 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlvzh\" (UniqueName: \"kubernetes.io/projected/1df3c5c3-627e-4bfb-9dd0-16976cc25f3d-kube-api-access-qlvzh\") on node \"crc\" DevicePath \"\"" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.090964 4750 reconciler_common.go:293] "Volume detached for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/1df3c5c3-627e-4bfb-9dd0-16976cc25f3d-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.320793 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.333177 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.344339 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.358319 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.367680 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 08 18:32:23 crc kubenswrapper[4750]: E1008 18:32:23.368098 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f19173b1-326c-4494-9781-a8295882102b" containerName="nova-api-log" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.368111 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="f19173b1-326c-4494-9781-a8295882102b" containerName="nova-api-log" Oct 08 18:32:23 crc kubenswrapper[4750]: E1008 18:32:23.368124 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1df3c5c3-627e-4bfb-9dd0-16976cc25f3d" containerName="nova-metadata-metadata" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.368131 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="1df3c5c3-627e-4bfb-9dd0-16976cc25f3d" containerName="nova-metadata-metadata" Oct 08 18:32:23 crc kubenswrapper[4750]: E1008 18:32:23.368146 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1df3c5c3-627e-4bfb-9dd0-16976cc25f3d" containerName="nova-metadata-log" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.368153 4750 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1df3c5c3-627e-4bfb-9dd0-16976cc25f3d" containerName="nova-metadata-log" Oct 08 18:32:23 crc kubenswrapper[4750]: E1008 18:32:23.368178 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f19173b1-326c-4494-9781-a8295882102b" containerName="nova-api-api" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.368185 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="f19173b1-326c-4494-9781-a8295882102b" containerName="nova-api-api" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.368379 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="1df3c5c3-627e-4bfb-9dd0-16976cc25f3d" containerName="nova-metadata-log" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.368396 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="f19173b1-326c-4494-9781-a8295882102b" containerName="nova-api-log" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.368419 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="f19173b1-326c-4494-9781-a8295882102b" containerName="nova-api-api" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.368434 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="1df3c5c3-627e-4bfb-9dd0-16976cc25f3d" containerName="nova-metadata-metadata" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.369423 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.373456 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.373672 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.373983 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.379445 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.381421 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.389156 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.389184 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.389302 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.402627 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.497180 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2aacea2e-e630-4280-8bed-b3b13b67f8ae-logs\") pod \"nova-api-0\" (UID: \"2aacea2e-e630-4280-8bed-b3b13b67f8ae\") " pod="openstack/nova-api-0" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.497259 4750 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2aacea2e-e630-4280-8bed-b3b13b67f8ae-config-data\") pod \"nova-api-0\" (UID: \"2aacea2e-e630-4280-8bed-b3b13b67f8ae\") " pod="openstack/nova-api-0" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.497283 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klvpg\" (UniqueName: \"kubernetes.io/projected/e027e860-d0c0-4b1b-b02b-c374d92ae115-kube-api-access-klvpg\") pod \"nova-metadata-0\" (UID: \"e027e860-d0c0-4b1b-b02b-c374d92ae115\") " pod="openstack/nova-metadata-0" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.497364 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aacea2e-e630-4280-8bed-b3b13b67f8ae-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2aacea2e-e630-4280-8bed-b3b13b67f8ae\") " pod="openstack/nova-api-0" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.497412 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2aacea2e-e630-4280-8bed-b3b13b67f8ae-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2aacea2e-e630-4280-8bed-b3b13b67f8ae\") " pod="openstack/nova-api-0" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.497437 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2aacea2e-e630-4280-8bed-b3b13b67f8ae-public-tls-certs\") pod \"nova-api-0\" (UID: \"2aacea2e-e630-4280-8bed-b3b13b67f8ae\") " pod="openstack/nova-api-0" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.497475 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp54q\" (UniqueName: 
\"kubernetes.io/projected/2aacea2e-e630-4280-8bed-b3b13b67f8ae-kube-api-access-dp54q\") pod \"nova-api-0\" (UID: \"2aacea2e-e630-4280-8bed-b3b13b67f8ae\") " pod="openstack/nova-api-0" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.497524 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e027e860-d0c0-4b1b-b02b-c374d92ae115-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e027e860-d0c0-4b1b-b02b-c374d92ae115\") " pod="openstack/nova-metadata-0" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.497539 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e027e860-d0c0-4b1b-b02b-c374d92ae115-logs\") pod \"nova-metadata-0\" (UID: \"e027e860-d0c0-4b1b-b02b-c374d92ae115\") " pod="openstack/nova-metadata-0" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.497580 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e027e860-d0c0-4b1b-b02b-c374d92ae115-config-data\") pod \"nova-metadata-0\" (UID: \"e027e860-d0c0-4b1b-b02b-c374d92ae115\") " pod="openstack/nova-metadata-0" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.497603 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e027e860-d0c0-4b1b-b02b-c374d92ae115-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e027e860-d0c0-4b1b-b02b-c374d92ae115\") " pod="openstack/nova-metadata-0" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.598890 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2aacea2e-e630-4280-8bed-b3b13b67f8ae-config-data\") pod \"nova-api-0\" (UID: 
\"2aacea2e-e630-4280-8bed-b3b13b67f8ae\") " pod="openstack/nova-api-0" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.599168 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klvpg\" (UniqueName: \"kubernetes.io/projected/e027e860-d0c0-4b1b-b02b-c374d92ae115-kube-api-access-klvpg\") pod \"nova-metadata-0\" (UID: \"e027e860-d0c0-4b1b-b02b-c374d92ae115\") " pod="openstack/nova-metadata-0" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.599221 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aacea2e-e630-4280-8bed-b3b13b67f8ae-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2aacea2e-e630-4280-8bed-b3b13b67f8ae\") " pod="openstack/nova-api-0" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.599257 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2aacea2e-e630-4280-8bed-b3b13b67f8ae-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2aacea2e-e630-4280-8bed-b3b13b67f8ae\") " pod="openstack/nova-api-0" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.599283 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2aacea2e-e630-4280-8bed-b3b13b67f8ae-public-tls-certs\") pod \"nova-api-0\" (UID: \"2aacea2e-e630-4280-8bed-b3b13b67f8ae\") " pod="openstack/nova-api-0" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.599328 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dp54q\" (UniqueName: \"kubernetes.io/projected/2aacea2e-e630-4280-8bed-b3b13b67f8ae-kube-api-access-dp54q\") pod \"nova-api-0\" (UID: \"2aacea2e-e630-4280-8bed-b3b13b67f8ae\") " pod="openstack/nova-api-0" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.599358 4750 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e027e860-d0c0-4b1b-b02b-c374d92ae115-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e027e860-d0c0-4b1b-b02b-c374d92ae115\") " pod="openstack/nova-metadata-0" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.599385 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e027e860-d0c0-4b1b-b02b-c374d92ae115-logs\") pod \"nova-metadata-0\" (UID: \"e027e860-d0c0-4b1b-b02b-c374d92ae115\") " pod="openstack/nova-metadata-0" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.599408 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e027e860-d0c0-4b1b-b02b-c374d92ae115-config-data\") pod \"nova-metadata-0\" (UID: \"e027e860-d0c0-4b1b-b02b-c374d92ae115\") " pod="openstack/nova-metadata-0" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.599432 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e027e860-d0c0-4b1b-b02b-c374d92ae115-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e027e860-d0c0-4b1b-b02b-c374d92ae115\") " pod="openstack/nova-metadata-0" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.599480 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2aacea2e-e630-4280-8bed-b3b13b67f8ae-logs\") pod \"nova-api-0\" (UID: \"2aacea2e-e630-4280-8bed-b3b13b67f8ae\") " pod="openstack/nova-api-0" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.600036 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2aacea2e-e630-4280-8bed-b3b13b67f8ae-logs\") pod \"nova-api-0\" (UID: \"2aacea2e-e630-4280-8bed-b3b13b67f8ae\") " 
pod="openstack/nova-api-0" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.601224 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e027e860-d0c0-4b1b-b02b-c374d92ae115-logs\") pod \"nova-metadata-0\" (UID: \"e027e860-d0c0-4b1b-b02b-c374d92ae115\") " pod="openstack/nova-metadata-0" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.603511 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aacea2e-e630-4280-8bed-b3b13b67f8ae-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2aacea2e-e630-4280-8bed-b3b13b67f8ae\") " pod="openstack/nova-api-0" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.609746 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2aacea2e-e630-4280-8bed-b3b13b67f8ae-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2aacea2e-e630-4280-8bed-b3b13b67f8ae\") " pod="openstack/nova-api-0" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.609959 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e027e860-d0c0-4b1b-b02b-c374d92ae115-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e027e860-d0c0-4b1b-b02b-c374d92ae115\") " pod="openstack/nova-metadata-0" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.610044 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2aacea2e-e630-4280-8bed-b3b13b67f8ae-public-tls-certs\") pod \"nova-api-0\" (UID: \"2aacea2e-e630-4280-8bed-b3b13b67f8ae\") " pod="openstack/nova-api-0" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.610339 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2aacea2e-e630-4280-8bed-b3b13b67f8ae-config-data\") pod \"nova-api-0\" (UID: \"2aacea2e-e630-4280-8bed-b3b13b67f8ae\") " pod="openstack/nova-api-0" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.610672 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e027e860-d0c0-4b1b-b02b-c374d92ae115-config-data\") pod \"nova-metadata-0\" (UID: \"e027e860-d0c0-4b1b-b02b-c374d92ae115\") " pod="openstack/nova-metadata-0" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.613809 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e027e860-d0c0-4b1b-b02b-c374d92ae115-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e027e860-d0c0-4b1b-b02b-c374d92ae115\") " pod="openstack/nova-metadata-0" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.615888 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klvpg\" (UniqueName: \"kubernetes.io/projected/e027e860-d0c0-4b1b-b02b-c374d92ae115-kube-api-access-klvpg\") pod \"nova-metadata-0\" (UID: \"e027e860-d0c0-4b1b-b02b-c374d92ae115\") " pod="openstack/nova-metadata-0" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.616671 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dp54q\" (UniqueName: \"kubernetes.io/projected/2aacea2e-e630-4280-8bed-b3b13b67f8ae-kube-api-access-dp54q\") pod \"nova-api-0\" (UID: \"2aacea2e-e630-4280-8bed-b3b13b67f8ae\") " pod="openstack/nova-api-0" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.711857 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 18:32:23 crc kubenswrapper[4750]: I1008 18:32:23.721926 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 18:32:24 crc kubenswrapper[4750]: I1008 18:32:24.218131 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 18:32:24 crc kubenswrapper[4750]: W1008 18:32:24.276167 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2aacea2e_e630_4280_8bed_b3b13b67f8ae.slice/crio-0a45b0ade489c99b0ed14a68dc0acee9a4911cb2935ce629816e714af5fe4f88 WatchSource:0}: Error finding container 0a45b0ade489c99b0ed14a68dc0acee9a4911cb2935ce629816e714af5fe4f88: Status 404 returned error can't find the container with id 0a45b0ade489c99b0ed14a68dc0acee9a4911cb2935ce629816e714af5fe4f88 Oct 08 18:32:24 crc kubenswrapper[4750]: I1008 18:32:24.283371 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 18:32:24 crc kubenswrapper[4750]: I1008 18:32:24.745352 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1df3c5c3-627e-4bfb-9dd0-16976cc25f3d" path="/var/lib/kubelet/pods/1df3c5c3-627e-4bfb-9dd0-16976cc25f3d/volumes" Oct 08 18:32:24 crc kubenswrapper[4750]: I1008 18:32:24.746478 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f19173b1-326c-4494-9781-a8295882102b" path="/var/lib/kubelet/pods/f19173b1-326c-4494-9781-a8295882102b/volumes" Oct 08 18:32:25 crc kubenswrapper[4750]: I1008 18:32:25.022764 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e027e860-d0c0-4b1b-b02b-c374d92ae115","Type":"ContainerStarted","Data":"919a9750bdfbd9cb9a2ec586476afa3486a7c426aa6ae6693b2697ba44404b4a"} Oct 08 18:32:25 crc kubenswrapper[4750]: I1008 18:32:25.022832 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e027e860-d0c0-4b1b-b02b-c374d92ae115","Type":"ContainerStarted","Data":"0feb4ca16b6758c6207a2a94aa09e8a8463ace7881baffb89c3d90fa392320bd"} 
Oct 08 18:32:25 crc kubenswrapper[4750]: I1008 18:32:25.022843 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e027e860-d0c0-4b1b-b02b-c374d92ae115","Type":"ContainerStarted","Data":"44ad1c9d1b67d5647706f1f61c9937d24a45b59055b837ad4020d8c43132aa71"} Oct 08 18:32:25 crc kubenswrapper[4750]: I1008 18:32:25.024985 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2aacea2e-e630-4280-8bed-b3b13b67f8ae","Type":"ContainerStarted","Data":"7167710fc93b29300ba5e867d0ed8d94f5a2083a6597592a2450ba7cc525c554"} Oct 08 18:32:25 crc kubenswrapper[4750]: I1008 18:32:25.025032 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2aacea2e-e630-4280-8bed-b3b13b67f8ae","Type":"ContainerStarted","Data":"cf0d886f93ff577bb445d8aa80c9cd9e710f2e6e45c5f75d8333a920295edfa7"} Oct 08 18:32:25 crc kubenswrapper[4750]: I1008 18:32:25.025043 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2aacea2e-e630-4280-8bed-b3b13b67f8ae","Type":"ContainerStarted","Data":"0a45b0ade489c99b0ed14a68dc0acee9a4911cb2935ce629816e714af5fe4f88"} Oct 08 18:32:25 crc kubenswrapper[4750]: I1008 18:32:25.049099 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.049081505 podStartE2EDuration="2.049081505s" podCreationTimestamp="2025-10-08 18:32:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:32:25.043753365 +0000 UTC m=+1300.956724388" watchObservedRunningTime="2025-10-08 18:32:25.049081505 +0000 UTC m=+1300.962052518" Oct 08 18:32:25 crc kubenswrapper[4750]: I1008 18:32:25.075312 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.075289646 podStartE2EDuration="2.075289646s" 
podCreationTimestamp="2025-10-08 18:32:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:32:25.068160392 +0000 UTC m=+1300.981131425" watchObservedRunningTime="2025-10-08 18:32:25.075289646 +0000 UTC m=+1300.988260669" Oct 08 18:32:26 crc kubenswrapper[4750]: I1008 18:32:26.368796 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 08 18:32:28 crc kubenswrapper[4750]: I1008 18:32:28.722188 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 08 18:32:28 crc kubenswrapper[4750]: I1008 18:32:28.722532 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 08 18:32:29 crc kubenswrapper[4750]: I1008 18:32:29.707588 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 18:32:29 crc kubenswrapper[4750]: I1008 18:32:29.707664 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 18:32:30 crc kubenswrapper[4750]: I1008 18:32:30.169316 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 08 18:32:31 crc kubenswrapper[4750]: I1008 18:32:31.369250 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 08 18:32:31 crc kubenswrapper[4750]: I1008 18:32:31.408257 4750 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 08 18:32:32 crc kubenswrapper[4750]: I1008 18:32:32.123195 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 08 18:32:33 crc kubenswrapper[4750]: I1008 18:32:33.712217 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 08 18:32:33 crc kubenswrapper[4750]: I1008 18:32:33.712258 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 08 18:32:33 crc kubenswrapper[4750]: I1008 18:32:33.722946 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 08 18:32:33 crc kubenswrapper[4750]: I1008 18:32:33.722976 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 08 18:32:34 crc kubenswrapper[4750]: I1008 18:32:34.727755 4750 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2aacea2e-e630-4280-8bed-b3b13b67f8ae" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.205:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 08 18:32:34 crc kubenswrapper[4750]: I1008 18:32:34.739704 4750 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2aacea2e-e630-4280-8bed-b3b13b67f8ae" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.205:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 08 18:32:34 crc kubenswrapper[4750]: I1008 18:32:34.739755 4750 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e027e860-d0c0-4b1b-b02b-c374d92ae115" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": net/http: request canceled (Client.Timeout exceeded while 
awaiting headers)" Oct 08 18:32:34 crc kubenswrapper[4750]: I1008 18:32:34.739885 4750 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e027e860-d0c0-4b1b-b02b-c374d92ae115" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 08 18:32:43 crc kubenswrapper[4750]: I1008 18:32:43.719814 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 08 18:32:43 crc kubenswrapper[4750]: I1008 18:32:43.721028 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 08 18:32:43 crc kubenswrapper[4750]: I1008 18:32:43.722971 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 08 18:32:43 crc kubenswrapper[4750]: I1008 18:32:43.728184 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 08 18:32:43 crc kubenswrapper[4750]: I1008 18:32:43.732967 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 08 18:32:43 crc kubenswrapper[4750]: I1008 18:32:43.734340 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 08 18:32:43 crc kubenswrapper[4750]: I1008 18:32:43.734927 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 08 18:32:44 crc kubenswrapper[4750]: I1008 18:32:44.211188 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 08 18:32:44 crc kubenswrapper[4750]: I1008 18:32:44.216188 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 08 18:32:44 crc kubenswrapper[4750]: I1008 18:32:44.217008 4750 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 08 18:32:59 crc kubenswrapper[4750]: I1008 18:32:59.706695 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 18:32:59 crc kubenswrapper[4750]: I1008 18:32:59.707302 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 18:33:04 crc kubenswrapper[4750]: I1008 18:33:04.936602 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 08 18:33:04 crc kubenswrapper[4750]: I1008 18:33:04.937290 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="845a93a1-bf5f-4820-a580-e01d2ed59416" containerName="openstackclient" containerID="cri-o://be3fa37eeeee66c52135106d0b32e1df0c8a601b9cfff61c6d2080884d388f7a" gracePeriod=2 Oct 08 18:33:04 crc kubenswrapper[4750]: I1008 18:33:04.950299 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.031303 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.140856 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-6ghsk"] Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.141119 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-6ghsk" 
podUID="7c3207de-78eb-41b2-a2be-163c9a3532af" containerName="openstack-network-exporter" containerID="cri-o://08ee56a39f8f46204120acce7a7bf8c28c87e2b709294c817fd0c13ee4d4a7a9" gracePeriod=30 Oct 08 18:33:05 crc kubenswrapper[4750]: E1008 18:33:05.167666 4750 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 08 18:33:05 crc kubenswrapper[4750]: E1008 18:33:05.167743 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/43a52313-747b-40a7-a7e0-9e18f3c97c42-config-data podName:43a52313-747b-40a7-a7e0-9e18f3c97c42 nodeName:}" failed. No retries permitted until 2025-10-08 18:33:05.667725069 +0000 UTC m=+1341.580696082 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/43a52313-747b-40a7-a7e0-9e18f3c97c42-config-data") pod "rabbitmq-cell1-server-0" (UID: "43a52313-747b-40a7-a7e0-9e18f3c97c42") : configmap "rabbitmq-cell1-config-data" not found Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.208214 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder7a82-account-delete-rvqnc"] Oct 08 18:33:05 crc kubenswrapper[4750]: E1008 18:33:05.208654 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="845a93a1-bf5f-4820-a580-e01d2ed59416" containerName="openstackclient" Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.208864 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="845a93a1-bf5f-4820-a580-e01d2ed59416" containerName="openstackclient" Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.209087 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="845a93a1-bf5f-4820-a580-e01d2ed59416" containerName="openstackclient" Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.209797 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder7a82-account-delete-rvqnc" Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.224035 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-mkxdr"] Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.242639 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder7a82-account-delete-rvqnc"] Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.265759 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance4c02-account-delete-z4mhj"] Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.266954 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance4c02-account-delete-z4mhj" Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.273628 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-658ll\" (UniqueName: \"kubernetes.io/projected/04d7f724-d53d-4412-bce7-cc6da81e45ac-kube-api-access-658ll\") pod \"cinder7a82-account-delete-rvqnc\" (UID: \"04d7f724-d53d-4412-bce7-cc6da81e45ac\") " pod="openstack/cinder7a82-account-delete-rvqnc" Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.324907 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance4c02-account-delete-z4mhj"] Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.341781 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.342018 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="230c02f8-af60-40d6-af19-adf730eec43f" containerName="ovn-northd" containerID="cri-o://c92dbcfbb2d09ff0c337921ca5e5f0d269b51741b6e9f5407d3bb68858e0b7bc" gracePeriod=30 Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.342153 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" 
podUID="230c02f8-af60-40d6-af19-adf730eec43f" containerName="openstack-network-exporter" containerID="cri-o://36265d856955e91441b681c8365511d7edd515654902aa5545ca450e749f6e36" gracePeriod=30 Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.376844 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hglk4\" (UniqueName: \"kubernetes.io/projected/b173b167-1fa4-45ec-98d0-16956f4b0b30-kube-api-access-hglk4\") pod \"glance4c02-account-delete-z4mhj\" (UID: \"b173b167-1fa4-45ec-98d0-16956f4b0b30\") " pod="openstack/glance4c02-account-delete-z4mhj" Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.376915 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-658ll\" (UniqueName: \"kubernetes.io/projected/04d7f724-d53d-4412-bce7-cc6da81e45ac-kube-api-access-658ll\") pod \"cinder7a82-account-delete-rvqnc\" (UID: \"04d7f724-d53d-4412-bce7-cc6da81e45ac\") " pod="openstack/cinder7a82-account-delete-rvqnc" Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.395341 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-57vgx"] Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.449266 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-6ghsk_7c3207de-78eb-41b2-a2be-163c9a3532af/openstack-network-exporter/0.log" Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.449490 4750 generic.go:334] "Generic (PLEG): container finished" podID="7c3207de-78eb-41b2-a2be-163c9a3532af" containerID="08ee56a39f8f46204120acce7a7bf8c28c87e2b709294c817fd0c13ee4d4a7a9" exitCode=2 Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.449519 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6ghsk" event={"ID":"7c3207de-78eb-41b2-a2be-163c9a3532af","Type":"ContainerDied","Data":"08ee56a39f8f46204120acce7a7bf8c28c87e2b709294c817fd0c13ee4d4a7a9"} Oct 08 
18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.452673 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-658ll\" (UniqueName: \"kubernetes.io/projected/04d7f724-d53d-4412-bce7-cc6da81e45ac-kube-api-access-658ll\") pod \"cinder7a82-account-delete-rvqnc\" (UID: \"04d7f724-d53d-4412-bce7-cc6da81e45ac\") " pod="openstack/cinder7a82-account-delete-rvqnc" Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.486665 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hglk4\" (UniqueName: \"kubernetes.io/projected/b173b167-1fa4-45ec-98d0-16956f4b0b30-kube-api-access-hglk4\") pod \"glance4c02-account-delete-z4mhj\" (UID: \"b173b167-1fa4-45ec-98d0-16956f4b0b30\") " pod="openstack/glance4c02-account-delete-z4mhj" Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.541453 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.545151 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hglk4\" (UniqueName: \"kubernetes.io/projected/b173b167-1fa4-45ec-98d0-16956f4b0b30-kube-api-access-hglk4\") pod \"glance4c02-account-delete-z4mhj\" (UID: \"b173b167-1fa4-45ec-98d0-16956f4b0b30\") " pod="openstack/glance4c02-account-delete-z4mhj" Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.555887 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-scvbb"] Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.581484 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement9f1c-account-delete-wx96m"] Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.590005 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement9f1c-account-delete-wx96m" Oct 08 18:33:05 crc kubenswrapper[4750]: E1008 18:33:05.632284 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c92dbcfbb2d09ff0c337921ca5e5f0d269b51741b6e9f5407d3bb68858e0b7bc" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.641154 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-scvbb"] Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.643829 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder7a82-account-delete-rvqnc" Oct 08 18:33:05 crc kubenswrapper[4750]: E1008 18:33:05.654107 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c92dbcfbb2d09ff0c337921ca5e5f0d269b51741b6e9f5407d3bb68858e0b7bc" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 08 18:33:05 crc kubenswrapper[4750]: E1008 18:33:05.661685 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c92dbcfbb2d09ff0c337921ca5e5f0d269b51741b6e9f5407d3bb68858e0b7bc" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 08 18:33:05 crc kubenswrapper[4750]: E1008 18:33:05.661849 4750 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="230c02f8-af60-40d6-af19-adf730eec43f" containerName="ovn-northd" Oct 08 18:33:05 crc 
kubenswrapper[4750]: I1008 18:33:05.680802 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.681401 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="590d851e-4648-48db-b385-aaa732f5c787" containerName="openstack-network-exporter" containerID="cri-o://e646c1e4d2aa9c18afad260eee7d15c40c5f3fc3f863380210aa6106334f6c32" gracePeriod=300 Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.686190 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance4c02-account-delete-z4mhj" Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.697758 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement9f1c-account-delete-wx96m"] Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.699895 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59xbx\" (UniqueName: \"kubernetes.io/projected/0c9b7a5e-9efd-45f8-86bf-730f55c077fd-kube-api-access-59xbx\") pod \"placement9f1c-account-delete-wx96m\" (UID: \"0c9b7a5e-9efd-45f8-86bf-730f55c077fd\") " pod="openstack/placement9f1c-account-delete-wx96m" Oct 08 18:33:05 crc kubenswrapper[4750]: E1008 18:33:05.701185 4750 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 08 18:33:05 crc kubenswrapper[4750]: E1008 18:33:05.701233 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/43a52313-747b-40a7-a7e0-9e18f3c97c42-config-data podName:43a52313-747b-40a7-a7e0-9e18f3c97c42 nodeName:}" failed. No retries permitted until 2025-10-08 18:33:06.70121927 +0000 UTC m=+1342.614190283 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/43a52313-747b-40a7-a7e0-9e18f3c97c42-config-data") pod "rabbitmq-cell1-server-0" (UID: "43a52313-747b-40a7-a7e0-9e18f3c97c42") : configmap "rabbitmq-cell1-config-data" not found Oct 08 18:33:05 crc kubenswrapper[4750]: E1008 18:33:05.701262 4750 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 08 18:33:05 crc kubenswrapper[4750]: E1008 18:33:05.701306 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5b8108eb-834c-44bd-9f39-70c348388ab6-config-data podName:5b8108eb-834c-44bd-9f39-70c348388ab6 nodeName:}" failed. No retries permitted until 2025-10-08 18:33:06.201292452 +0000 UTC m=+1342.114263465 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/5b8108eb-834c-44bd-9f39-70c348388ab6-config-data") pod "rabbitmq-server-0" (UID: "5b8108eb-834c-44bd-9f39-70c348388ab6") : configmap "rabbitmq-config-data" not found Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.719602 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican5ee7-account-delete-nlrfw"] Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.723501 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican5ee7-account-delete-nlrfw" Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.749625 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican5ee7-account-delete-nlrfw"] Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.773950 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-6khnd"] Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.798695 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-6khnd"] Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.802243 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2wmq\" (UniqueName: \"kubernetes.io/projected/c751481b-5934-4262-84ff-106498a453e0-kube-api-access-f2wmq\") pod \"barbican5ee7-account-delete-nlrfw\" (UID: \"c751481b-5934-4262-84ff-106498a453e0\") " pod="openstack/barbican5ee7-account-delete-nlrfw" Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.802320 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59xbx\" (UniqueName: \"kubernetes.io/projected/0c9b7a5e-9efd-45f8-86bf-730f55c077fd-kube-api-access-59xbx\") pod \"placement9f1c-account-delete-wx96m\" (UID: \"0c9b7a5e-9efd-45f8-86bf-730f55c077fd\") " pod="openstack/placement9f1c-account-delete-wx96m" Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.820427 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-rlsgw"] Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.837449 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59xbx\" (UniqueName: \"kubernetes.io/projected/0c9b7a5e-9efd-45f8-86bf-730f55c077fd-kube-api-access-59xbx\") pod \"placement9f1c-account-delete-wx96m\" (UID: \"0c9b7a5e-9efd-45f8-86bf-730f55c077fd\") " pod="openstack/placement9f1c-account-delete-wx96m" Oct 08 18:33:05 crc 
kubenswrapper[4750]: I1008 18:33:05.844662 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-rlsgw"] Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.857903 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron0c01-account-delete-rfw7g"] Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.884160 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron0c01-account-delete-rfw7g"] Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.884686 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron0c01-account-delete-rfw7g" Oct 08 18:33:05 crc kubenswrapper[4750]: E1008 18:33:05.906137 4750 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-mkxdr" message=< Oct 08 18:33:05 crc kubenswrapper[4750]: Exiting ovn-controller (1) [ OK ] Oct 08 18:33:05 crc kubenswrapper[4750]: > Oct 08 18:33:05 crc kubenswrapper[4750]: E1008 18:33:05.906391 4750 kuberuntime_container.go:691] "PreStop hook failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " pod="openstack/ovn-controller-mkxdr" podUID="cc808a1a-9703-4009-8d81-e555a8e25929" containerName="ovn-controller" containerID="cri-o://207cabb8cb0acf23e4bf4e62948733f47bedb3c82673a966db5aa8e4ed65d14b" Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.906423 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-mkxdr" podUID="cc808a1a-9703-4009-8d81-e555a8e25929" containerName="ovn-controller" containerID="cri-o://207cabb8cb0acf23e4bf4e62948733f47bedb3c82673a966db5aa8e4ed65d14b" gracePeriod=30 Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.906767 4750 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-f2wmq\" (UniqueName: \"kubernetes.io/projected/c751481b-5934-4262-84ff-106498a453e0-kube-api-access-f2wmq\") pod \"barbican5ee7-account-delete-nlrfw\" (UID: \"c751481b-5934-4262-84ff-106498a453e0\") " pod="openstack/barbican5ee7-account-delete-nlrfw" Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.929278 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novaapid8db-account-delete-7phjk"] Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.930625 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novaapid8db-account-delete-7phjk" Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.944513 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapid8db-account-delete-7phjk"] Oct 08 18:33:05 crc kubenswrapper[4750]: I1008 18:33:05.976371 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2wmq\" (UniqueName: \"kubernetes.io/projected/c751481b-5934-4262-84ff-106498a453e0-kube-api-access-f2wmq\") pod \"barbican5ee7-account-delete-nlrfw\" (UID: \"c751481b-5934-4262-84ff-106498a453e0\") " pod="openstack/barbican5ee7-account-delete-nlrfw" Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.011665 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="590d851e-4648-48db-b385-aaa732f5c787" containerName="ovsdbserver-sb" containerID="cri-o://74257ca800c53f0c036fde4ead85c592a66b5ea7f444603ad7cbb66458ec4330" gracePeriod=300 Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.012618 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement9f1c-account-delete-wx96m" Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.013656 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grgss\" (UniqueName: \"kubernetes.io/projected/736a9e30-a3de-4e9f-9de7-52015e55443e-kube-api-access-grgss\") pod \"novaapid8db-account-delete-7phjk\" (UID: \"736a9e30-a3de-4e9f-9de7-52015e55443e\") " pod="openstack/novaapid8db-account-delete-7phjk" Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.020003 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85hr9\" (UniqueName: \"kubernetes.io/projected/af33a9f0-c575-46f6-a3cd-71391d454430-kube-api-access-85hr9\") pod \"neutron0c01-account-delete-rfw7g\" (UID: \"af33a9f0-c575-46f6-a3cd-71391d454430\") " pod="openstack/neutron0c01-account-delete-rfw7g" Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.029313 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.029911 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="235930b1-672c-4fc6-bbb4-78204c591aee" containerName="openstack-network-exporter" containerID="cri-o://d8bac502dbc380032817c12bba89e9f6cc302b44cbc72b0455a5c8be03b7c626" gracePeriod=300 Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.038612 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-ndlps"] Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.076211 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-ndlps"] Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.129056 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grgss\" (UniqueName: 
\"kubernetes.io/projected/736a9e30-a3de-4e9f-9de7-52015e55443e-kube-api-access-grgss\") pod \"novaapid8db-account-delete-7phjk\" (UID: \"736a9e30-a3de-4e9f-9de7-52015e55443e\") " pod="openstack/novaapid8db-account-delete-7phjk" Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.129304 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85hr9\" (UniqueName: \"kubernetes.io/projected/af33a9f0-c575-46f6-a3cd-71391d454430-kube-api-access-85hr9\") pod \"neutron0c01-account-delete-rfw7g\" (UID: \"af33a9f0-c575-46f6-a3cd-71391d454430\") " pod="openstack/neutron0c01-account-delete-rfw7g" Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.192840 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grgss\" (UniqueName: \"kubernetes.io/projected/736a9e30-a3de-4e9f-9de7-52015e55443e-kube-api-access-grgss\") pod \"novaapid8db-account-delete-7phjk\" (UID: \"736a9e30-a3de-4e9f-9de7-52015e55443e\") " pod="openstack/novaapid8db-account-delete-7phjk" Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.193185 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-dd8dn"] Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.223812 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican5ee7-account-delete-nlrfw" Oct 08 18:33:06 crc kubenswrapper[4750]: E1008 18:33:06.240676 4750 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 08 18:33:06 crc kubenswrapper[4750]: E1008 18:33:06.240750 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5b8108eb-834c-44bd-9f39-70c348388ab6-config-data podName:5b8108eb-834c-44bd-9f39-70c348388ab6 nodeName:}" failed. No retries permitted until 2025-10-08 18:33:07.240733639 +0000 UTC m=+1343.153704652 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/5b8108eb-834c-44bd-9f39-70c348388ab6-config-data") pod "rabbitmq-server-0" (UID: "5b8108eb-834c-44bd-9f39-70c348388ab6") : configmap "rabbitmq-config-data" not found Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.242427 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85hr9\" (UniqueName: \"kubernetes.io/projected/af33a9f0-c575-46f6-a3cd-71391d454430-kube-api-access-85hr9\") pod \"neutron0c01-account-delete-rfw7g\" (UID: \"af33a9f0-c575-46f6-a3cd-71391d454430\") " pod="openstack/neutron0c01-account-delete-rfw7g" Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.263688 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-dd8dn"] Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.312983 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron0c01-account-delete-rfw7g" Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.347579 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-6ghsk_7c3207de-78eb-41b2-a2be-163c9a3532af/openstack-network-exporter/0.log" Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.347879 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-6ghsk" Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.348138 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novaapid8db-account-delete-7phjk" Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.353211 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d4d96bb9-84nq2"] Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.353594 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d4d96bb9-84nq2" podUID="bfa3814f-a0f4-4d53-9c08-44d7b45dd662" containerName="dnsmasq-dns" containerID="cri-o://b4f13f7200451000f175c62b160ddb55de7cf3bd3d9beaaa3673047fa05bb9c7" gracePeriod=10 Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.483482 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/7c3207de-78eb-41b2-a2be-163c9a3532af-ovs-rundir\") pod \"7c3207de-78eb-41b2-a2be-163c9a3532af\" (UID: \"7c3207de-78eb-41b2-a2be-163c9a3532af\") " Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.483528 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/7c3207de-78eb-41b2-a2be-163c9a3532af-ovn-rundir\") pod \"7c3207de-78eb-41b2-a2be-163c9a3532af\" (UID: \"7c3207de-78eb-41b2-a2be-163c9a3532af\") " Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.483591 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c3207de-78eb-41b2-a2be-163c9a3532af-combined-ca-bundle\") pod \"7c3207de-78eb-41b2-a2be-163c9a3532af\" (UID: \"7c3207de-78eb-41b2-a2be-163c9a3532af\") " Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.483625 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c3207de-78eb-41b2-a2be-163c9a3532af-metrics-certs-tls-certs\") pod \"7c3207de-78eb-41b2-a2be-163c9a3532af\" (UID: 
\"7c3207de-78eb-41b2-a2be-163c9a3532af\") " Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.483683 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbx7d\" (UniqueName: \"kubernetes.io/projected/7c3207de-78eb-41b2-a2be-163c9a3532af-kube-api-access-mbx7d\") pod \"7c3207de-78eb-41b2-a2be-163c9a3532af\" (UID: \"7c3207de-78eb-41b2-a2be-163c9a3532af\") " Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.483712 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c3207de-78eb-41b2-a2be-163c9a3532af-config\") pod \"7c3207de-78eb-41b2-a2be-163c9a3532af\" (UID: \"7c3207de-78eb-41b2-a2be-163c9a3532af\") " Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.486462 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c3207de-78eb-41b2-a2be-163c9a3532af-config" (OuterVolumeSpecName: "config") pod "7c3207de-78eb-41b2-a2be-163c9a3532af" (UID: "7c3207de-78eb-41b2-a2be-163c9a3532af"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.486571 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7c3207de-78eb-41b2-a2be-163c9a3532af-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "7c3207de-78eb-41b2-a2be-163c9a3532af" (UID: "7c3207de-78eb-41b2-a2be-163c9a3532af"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.486674 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7c3207de-78eb-41b2-a2be-163c9a3532af-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "7c3207de-78eb-41b2-a2be-163c9a3532af" (UID: "7c3207de-78eb-41b2-a2be-163c9a3532af"). InnerVolumeSpecName "ovn-rundir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.525700 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.525974 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="f1b5ad2e-1ee1-4955-99c2-8daed456b21c" containerName="cinder-api-log" containerID="cri-o://ea1294fdb6395353dc77235f2f999e3289f6e83d82e7aa71233fb0228a7f37d4" gracePeriod=30 Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.526379 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="f1b5ad2e-1ee1-4955-99c2-8daed456b21c" containerName="cinder-api" containerID="cri-o://e2474855cd1e381a74ca688d20c9d1a36e8ad10d4315f341b6812d9c8681f6e7" gracePeriod=30 Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.528516 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c3207de-78eb-41b2-a2be-163c9a3532af-kube-api-access-mbx7d" (OuterVolumeSpecName: "kube-api-access-mbx7d") pod "7c3207de-78eb-41b2-a2be-163c9a3532af" (UID: "7c3207de-78eb-41b2-a2be-163c9a3532af"). InnerVolumeSpecName "kube-api-access-mbx7d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.549722 4750 generic.go:334] "Generic (PLEG): container finished" podID="cc808a1a-9703-4009-8d81-e555a8e25929" containerID="207cabb8cb0acf23e4bf4e62948733f47bedb3c82673a966db5aa8e4ed65d14b" exitCode=0 Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.549971 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="235930b1-672c-4fc6-bbb4-78204c591aee" containerName="ovsdbserver-nb" containerID="cri-o://e8ed3cdde57c4534decef125e57e81c8e9ddb189aeb9195d0d45761ded957615" gracePeriod=300 Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.550874 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.550906 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mkxdr" event={"ID":"cc808a1a-9703-4009-8d81-e555a8e25929","Type":"ContainerDied","Data":"207cabb8cb0acf23e4bf4e62948733f47bedb3c82673a966db5aa8e4ed65d14b"} Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.551079 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="60379ea9-0750-4de0-9d3b-13af080eea8f" containerName="cinder-scheduler" containerID="cri-o://531a039dc690be8ef20b0b6f5062a698798a803816f610a232671935cec2a8cc" gracePeriod=30 Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.551401 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="60379ea9-0750-4de0-9d3b-13af080eea8f" containerName="probe" containerID="cri-o://be50775e9b6d25d8067aa8220a05db2eb7a3fed17a5981e8b008927ca24a65a9" gracePeriod=30 Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.572012 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-fnr22"] Oct 08 18:33:06 crc 
kubenswrapper[4750]: I1008 18:33:06.579333 4750 generic.go:334] "Generic (PLEG): container finished" podID="230c02f8-af60-40d6-af19-adf730eec43f" containerID="36265d856955e91441b681c8365511d7edd515654902aa5545ca450e749f6e36" exitCode=2 Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.579426 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"230c02f8-af60-40d6-af19-adf730eec43f","Type":"ContainerDied","Data":"36265d856955e91441b681c8365511d7edd515654902aa5545ca450e749f6e36"} Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.582676 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-fnr22"] Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.584234 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c3207de-78eb-41b2-a2be-163c9a3532af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7c3207de-78eb-41b2-a2be-163c9a3532af" (UID: "7c3207de-78eb-41b2-a2be-163c9a3532af"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.585494 4750 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/7c3207de-78eb-41b2-a2be-163c9a3532af-ovs-rundir\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.585510 4750 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/7c3207de-78eb-41b2-a2be-163c9a3532af-ovn-rundir\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.585518 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c3207de-78eb-41b2-a2be-163c9a3532af-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.585527 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbx7d\" (UniqueName: \"kubernetes.io/projected/7c3207de-78eb-41b2-a2be-163c9a3532af-kube-api-access-mbx7d\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.585535 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c3207de-78eb-41b2-a2be-163c9a3532af-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.590988 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-6ghsk_7c3207de-78eb-41b2-a2be-163c9a3532af/openstack-network-exporter/0.log" Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.591062 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6ghsk" event={"ID":"7c3207de-78eb-41b2-a2be-163c9a3532af","Type":"ContainerDied","Data":"705442df5471d8d9123ad6eb4f460952fe7c2f7c1d2b7f92093a2813227350d6"} Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.591097 4750 
scope.go:117] "RemoveContainer" containerID="08ee56a39f8f46204120acce7a7bf8c28c87e2b709294c817fd0c13ee4d4a7a9" Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.591193 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-6ghsk" Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.594035 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-788b97745d-6snpn"] Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.594234 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-788b97745d-6snpn" podUID="ec1950dc-6caf-45f7-9b18-8c12db1b3f25" containerName="placement-log" containerID="cri-o://945094b64faf05a196a97d662a4c3e8b64a2ebe7a2311be55ef9549fcca90849" gracePeriod=30 Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.594328 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-788b97745d-6snpn" podUID="ec1950dc-6caf-45f7-9b18-8c12db1b3f25" containerName="placement-api" containerID="cri-o://f46e6c1f5784d1ac90a72cce013512954360b873208916edc44bc5574486392e" gracePeriod=30 Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.623049 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_590d851e-4648-48db-b385-aaa732f5c787/ovsdbserver-sb/0.log" Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.623093 4750 generic.go:334] "Generic (PLEG): container finished" podID="590d851e-4648-48db-b385-aaa732f5c787" containerID="e646c1e4d2aa9c18afad260eee7d15c40c5f3fc3f863380210aa6106334f6c32" exitCode=2 Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.623111 4750 generic.go:334] "Generic (PLEG): container finished" podID="590d851e-4648-48db-b385-aaa732f5c787" containerID="74257ca800c53f0c036fde4ead85c592a66b5ea7f444603ad7cbb66458ec4330" exitCode=143 Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.623131 4750 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"590d851e-4648-48db-b385-aaa732f5c787","Type":"ContainerDied","Data":"e646c1e4d2aa9c18afad260eee7d15c40c5f3fc3f863380210aa6106334f6c32"} Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.623158 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"590d851e-4648-48db-b385-aaa732f5c787","Type":"ContainerDied","Data":"74257ca800c53f0c036fde4ead85c592a66b5ea7f444603ad7cbb66458ec4330"} Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.632609 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-2klwz"] Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.666076 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-2klwz"] Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.687568 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.687777 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="899027b7-067b-4ce1-a8f1-deaee627aa51" containerName="glance-log" containerID="cri-o://f9e9d1fec0d6d33d2df5f63b0549ef2ff909ed2e48e6665827f62dfd948e7ac9" gracePeriod=30 Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.688086 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="899027b7-067b-4ce1-a8f1-deaee627aa51" containerName="glance-httpd" containerID="cri-o://3d510e42701bfd35cfca1c00a2144607ca13cee7addd057debd015b58c8b3227" gracePeriod=30 Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.724494 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-ftcwm"] Oct 08 18:33:06 crc kubenswrapper[4750]: E1008 18:33:06.801398 4750 configmap.go:193] Couldn't get 
configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 08 18:33:06 crc kubenswrapper[4750]: E1008 18:33:06.802452 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/43a52313-747b-40a7-a7e0-9e18f3c97c42-config-data podName:43a52313-747b-40a7-a7e0-9e18f3c97c42 nodeName:}" failed. No retries permitted until 2025-10-08 18:33:08.802411129 +0000 UTC m=+1344.715382142 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/43a52313-747b-40a7-a7e0-9e18f3c97c42-config-data") pod "rabbitmq-cell1-server-0" (UID: "43a52313-747b-40a7-a7e0-9e18f3c97c42") : configmap "rabbitmq-cell1-config-data" not found Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.828996 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41b4bf15-850b-4082-8053-aee61b75dc58" path="/var/lib/kubelet/pods/41b4bf15-850b-4082-8053-aee61b75dc58/volumes" Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.829764 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dca9e23-fccb-4dae-98c1-f6670caac28c" path="/var/lib/kubelet/pods/5dca9e23-fccb-4dae-98c1-f6670caac28c/volumes" Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.830383 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d635eca-619c-4f52-a9d9-73b42d845fbf" path="/var/lib/kubelet/pods/7d635eca-619c-4f52-a9d9-73b42d845fbf/volumes" Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.869416 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aae9b192-27dc-4bad-a9ae-6e03824c59f0" path="/var/lib/kubelet/pods/aae9b192-27dc-4bad-a9ae-6e03824c59f0/volumes" Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.893739 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7ca8f71-a72e-4a6d-b839-5524cfb3d70b" path="/var/lib/kubelet/pods/b7ca8f71-a72e-4a6d-b839-5524cfb3d70b/volumes" Oct 
08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.894285 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdfe0b7d-f014-4453-b94c-3842ebdd4052" path="/var/lib/kubelet/pods/bdfe0b7d-f014-4453-b94c-3842ebdd4052/volumes" Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.894787 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9e2d08e-75b7-445b-b563-affabf6d8af6" path="/var/lib/kubelet/pods/c9e2d08e-75b7-445b-b563-affabf6d8af6/volumes" Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.933225 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c3207de-78eb-41b2-a2be-163c9a3532af-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "7c3207de-78eb-41b2-a2be-163c9a3532af" (UID: "7c3207de-78eb-41b2-a2be-163c9a3532af"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.936268 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-ftcwm"] Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.936325 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.940211 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a25ebe44-c330-48f8-9df7-5f8517cd96bd" containerName="glance-httpd" containerID="cri-o://e8cbb35efffa0dfc5317921463dc9d7e30007f7e471dd1e64c7dfbb694f8a5d9" gracePeriod=30 Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.936535 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a25ebe44-c330-48f8-9df7-5f8517cd96bd" containerName="glance-log" containerID="cri-o://ea10ae3da743eb1f3f0c357c0ff737b1ceb0a0e3e2a2d97d9e6932ea21abcefc" 
gracePeriod=30 Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.984860 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.985280 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="account-server" containerID="cri-o://d6cdce1fc3c750a32568c75f5f41fd036fc0c6d012614ac48dd3b89a28bddb19" gracePeriod=30 Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.985685 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="swift-recon-cron" containerID="cri-o://427cf714364cd21ce5dde409e29dd3aa65f33832204dfbf8ce289255b5e834c0" gracePeriod=30 Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.985734 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="rsync" containerID="cri-o://d2840db45a1d2b1e671a91bd14ec16767e2beb441096549c8e394b831dc54350" gracePeriod=30 Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.985765 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="object-expirer" containerID="cri-o://3094942c3642c799d4db13b77a9964692bca3d841f52ece0573854c311f0f401" gracePeriod=30 Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.985791 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="object-updater" containerID="cri-o://3013b58e1d5ba309156a64f8003ce218400fb7a18be0cb766461f307776c2a76" gracePeriod=30 Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.985821 4750 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="object-auditor" containerID="cri-o://1d96354096952d01c21e9fe7d3f0bcc4a2f331e8ac84cd83d2b48f90d58cc965" gracePeriod=30 Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.985849 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="object-replicator" containerID="cri-o://b8997ccfc894343c6e131268824155df3c9f2e6d480b351d3f97c9aa1cd88c3f" gracePeriod=30 Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.985886 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="object-server" containerID="cri-o://26c529adb950717fbb00bdeba5f0a5ebfb00cbe7017189d59c51fb3fb5a903e5" gracePeriod=30 Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.985918 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="container-updater" containerID="cri-o://1a4ea5b2811f4515b0049bf0da3ae03b49c47b62fb6d2cddab227c8e93827963" gracePeriod=30 Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.985950 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="container-auditor" containerID="cri-o://c068b85bb1ced1a9426fda1897141d98518bc162a709c23d2054cd4d7cce8209" gracePeriod=30 Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.985985 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="container-replicator" 
containerID="cri-o://ec560a5ed96d75041eddd3e8cea8a27ed5d0b038772f918091c1f3453402ac4f" gracePeriod=30 Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.986021 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="container-server" containerID="cri-o://0daa177c260aa7c024a5646c1d578e111a41bb913177c6089099d8076dcc1b13" gracePeriod=30 Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.986050 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="account-reaper" containerID="cri-o://f20129167bf9bc7eee86dd0ea9d1df44c416086aba87c4cfefb4ac0bb12dca9c" gracePeriod=30 Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.986077 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="account-auditor" containerID="cri-o://52f4f1b7fcee564a61f7c59907f9f37ce855f6b9ccb07d9b5d4dea7ac7e8c754" gracePeriod=30 Oct 08 18:33:06 crc kubenswrapper[4750]: I1008 18:33:06.986102 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="account-replicator" containerID="cri-o://db381222476cb51a6b70b126857ee5e5eff04de66df2645f6737aa6584b15305" gracePeriod=30 Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.040414 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-8d8895f6c-zszml"] Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.040698 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-8d8895f6c-zszml" podUID="7b751e62-8a05-413c-9f82-e9f28230e5ba" containerName="neutron-api" 
containerID="cri-o://f50e4ae5919552806c39393795c8e3f0588d8c870ffd52952ea7b85a3e4511a9" gracePeriod=30 Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.041120 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-8d8895f6c-zszml" podUID="7b751e62-8a05-413c-9f82-e9f28230e5ba" containerName="neutron-httpd" containerID="cri-o://c354d8f2f57d9ac9743cc4e26bc6b9e3f99a80b563c28a39f8bf835593a34cc2" gracePeriod=30 Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.042814 4750 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c3207de-78eb-41b2-a2be-163c9a3532af-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.069079 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.093302 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder7a82-account-delete-rvqnc"] Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.115633 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-llnpr"] Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.122453 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-llnpr"] Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.136050 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-7a82-account-create-cwfdm"] Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.155081 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-74b95c857c-677wg"] Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.155336 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-74b95c857c-677wg" podUID="c218b865-c7d1-4f46-ad6d-8e102b6af491" containerName="proxy-httpd" 
containerID="cri-o://0d6c9db01902a53d96492abe2fa964b6a118c8bcd236fbfdaca40d28257dc94f" gracePeriod=30 Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.155792 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-74b95c857c-677wg" podUID="c218b865-c7d1-4f46-ad6d-8e102b6af491" containerName="proxy-server" containerID="cri-o://726de4b1d2b400cc335f7c66a47cd7bc83547d6f298c85f9ac80978cd6e0ca61" gracePeriod=30 Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.162028 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-7a82-account-create-cwfdm"] Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.226819 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-pbmmr"] Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.236499 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-pbmmr"] Oct 08 18:33:07 crc kubenswrapper[4750]: E1008 18:33:07.250731 4750 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 08 18:33:07 crc kubenswrapper[4750]: E1008 18:33:07.250802 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5b8108eb-834c-44bd-9f39-70c348388ab6-config-data podName:5b8108eb-834c-44bd-9f39-70c348388ab6 nodeName:}" failed. No retries permitted until 2025-10-08 18:33:09.25078462 +0000 UTC m=+1345.163755633 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/5b8108eb-834c-44bd-9f39-70c348388ab6-config-data") pod "rabbitmq-server-0" (UID: "5b8108eb-834c-44bd-9f39-70c348388ab6") : configmap "rabbitmq-config-data" not found Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.263920 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance4c02-account-delete-z4mhj"] Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.300253 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-b49zx"] Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.323653 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-4c02-account-create-ht8qf"] Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.335907 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-b49zx"] Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.385688 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-4c02-account-create-ht8qf"] Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.390888 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.399389 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-cjhvk"] Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.407298 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-cjhvk"] Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.414101 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-5ee7-account-create-6qh5z"] Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.437599 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-5ee7-account-create-6qh5z"] Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.469650 4750 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/placement9f1c-account-delete-wx96m"] Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.493809 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican5ee7-account-delete-nlrfw"] Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.522628 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-9f1c-account-create-6n79f"] Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.535772 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-9f1c-account-create-6n79f"] Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.543022 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-kp6xp"] Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.549897 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-kp6xp"] Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.611990 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.612244 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2aacea2e-e630-4280-8bed-b3b13b67f8ae" containerName="nova-api-log" containerID="cri-o://cf0d886f93ff577bb445d8aa80c9cd9e710f2e6e45c5f75d8333a920295edfa7" gracePeriod=30 Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.612695 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2aacea2e-e630-4280-8bed-b3b13b67f8ae" containerName="nova-api-api" containerID="cri-o://7167710fc93b29300ba5e867d0ed8d94f5a2083a6597592a2450ba7cc525c554" gracePeriod=30 Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.653083 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="43a52313-747b-40a7-a7e0-9e18f3c97c42" containerName="rabbitmq" 
containerID="cri-o://0d4d831252f413ac3fb7facf9f1b774f0cfb6c7ac211742c0a01794b3473bbbd" gracePeriod=604800 Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.686360 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-0c01-account-create-wk5nl"] Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.686519 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-57vgx" podUID="e6709646-0141-474b-b73f-6f451e77f602" containerName="ovs-vswitchd" containerID="cri-o://6320fe2c14d57de0be701a1d75ea1e50bc53072f92cd6178aeb5a15ddadb3e94" gracePeriod=28 Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.691243 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_235930b1-672c-4fc6-bbb4-78204c591aee/ovsdbserver-nb/0.log" Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.691280 4750 generic.go:334] "Generic (PLEG): container finished" podID="235930b1-672c-4fc6-bbb4-78204c591aee" containerID="d8bac502dbc380032817c12bba89e9f6cc302b44cbc72b0455a5c8be03b7c626" exitCode=2 Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.691297 4750 generic.go:334] "Generic (PLEG): container finished" podID="235930b1-672c-4fc6-bbb4-78204c591aee" containerID="e8ed3cdde57c4534decef125e57e81c8e9ddb189aeb9195d0d45761ded957615" exitCode=143 Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.691374 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"235930b1-672c-4fc6-bbb4-78204c591aee","Type":"ContainerDied","Data":"d8bac502dbc380032817c12bba89e9f6cc302b44cbc72b0455a5c8be03b7c626"} Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.691400 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"235930b1-672c-4fc6-bbb4-78204c591aee","Type":"ContainerDied","Data":"e8ed3cdde57c4534decef125e57e81c8e9ddb189aeb9195d0d45761ded957615"} Oct 08 18:33:07 crc kubenswrapper[4750]: 
I1008 18:33:07.715636 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-0c01-account-create-wk5nl"] Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.724731 4750 generic.go:334] "Generic (PLEG): container finished" podID="899027b7-067b-4ce1-a8f1-deaee627aa51" containerID="f9e9d1fec0d6d33d2df5f63b0549ef2ff909ed2e48e6665827f62dfd948e7ac9" exitCode=143 Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.724831 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"899027b7-067b-4ce1-a8f1-deaee627aa51","Type":"ContainerDied","Data":"f9e9d1fec0d6d33d2df5f63b0549ef2ff909ed2e48e6665827f62dfd948e7ac9"} Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.734136 4750 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/nova-cell1-conductor-0" secret="" err="secret \"nova-nova-dockercfg-b8qmk\" not found" Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.736170 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mkxdr" event={"ID":"cc808a1a-9703-4009-8d81-e555a8e25929","Type":"ContainerDied","Data":"4a772474d6df97cd4f6099df8b8304bbd40360378e40889137f79e4742eb9c2d"} Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.736192 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a772474d6df97cd4f6099df8b8304bbd40360378e40889137f79e4742eb9c2d" Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.750326 4750 generic.go:334] "Generic (PLEG): container finished" podID="ec1950dc-6caf-45f7-9b18-8c12db1b3f25" containerID="945094b64faf05a196a97d662a4c3e8b64a2ebe7a2311be55ef9549fcca90849" exitCode=143 Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.750399 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-788b97745d-6snpn" 
event={"ID":"ec1950dc-6caf-45f7-9b18-8c12db1b3f25","Type":"ContainerDied","Data":"945094b64faf05a196a97d662a4c3e8b64a2ebe7a2311be55ef9549fcca90849"} Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.767623 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron0c01-account-delete-rfw7g"] Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.780035 4750 generic.go:334] "Generic (PLEG): container finished" podID="bfa3814f-a0f4-4d53-9c08-44d7b45dd662" containerID="b4f13f7200451000f175c62b160ddb55de7cf3bd3d9beaaa3673047fa05bb9c7" exitCode=0 Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.780108 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-tqdjh"] Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.780130 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d4d96bb9-84nq2" event={"ID":"bfa3814f-a0f4-4d53-9c08-44d7b45dd662","Type":"ContainerDied","Data":"b4f13f7200451000f175c62b160ddb55de7cf3bd3d9beaaa3673047fa05bb9c7"} Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.780149 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d4d96bb9-84nq2" event={"ID":"bfa3814f-a0f4-4d53-9c08-44d7b45dd662","Type":"ContainerDied","Data":"549dcd959f15f574b0307661b07bb0215a375979613dfe4cd26c7e6bb3eec6ba"} Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.780159 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="549dcd959f15f574b0307661b07bb0215a375979613dfe4cd26c7e6bb3eec6ba" Oct 08 18:33:07 crc kubenswrapper[4750]: E1008 18:33:07.781594 4750 secret.go:188] Couldn't get secret openstack/nova-cell1-conductor-config-data: secret "nova-cell1-conductor-config-data" not found Oct 08 18:33:07 crc kubenswrapper[4750]: E1008 18:33:07.781657 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af09729b-3284-4dcd-91a1-5763d28daaf5-config-data 
podName:af09729b-3284-4dcd-91a1-5763d28daaf5 nodeName:}" failed. No retries permitted until 2025-10-08 18:33:08.281642796 +0000 UTC m=+1344.194613809 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/af09729b-3284-4dcd-91a1-5763d28daaf5-config-data") pod "nova-cell1-conductor-0" (UID: "af09729b-3284-4dcd-91a1-5763d28daaf5") : secret "nova-cell1-conductor-config-data" not found Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.800818 4750 generic.go:334] "Generic (PLEG): container finished" podID="f1b5ad2e-1ee1-4955-99c2-8daed456b21c" containerID="ea1294fdb6395353dc77235f2f999e3289f6e83d82e7aa71233fb0228a7f37d4" exitCode=143 Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.800903 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f1b5ad2e-1ee1-4955-99c2-8daed456b21c","Type":"ContainerDied","Data":"ea1294fdb6395353dc77235f2f999e3289f6e83d82e7aa71233fb0228a7f37d4"} Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.807034 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-a9e2-account-create-btj9m"] Oct 08 18:33:07 crc kubenswrapper[4750]: E1008 18:33:07.809773 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 74257ca800c53f0c036fde4ead85c592a66b5ea7f444603ad7cbb66458ec4330 is running failed: container process not found" containerID="74257ca800c53f0c036fde4ead85c592a66b5ea7f444603ad7cbb66458ec4330" cmd=["/usr/bin/pidof","ovsdb-server"] Oct 08 18:33:07 crc kubenswrapper[4750]: E1008 18:33:07.811672 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 74257ca800c53f0c036fde4ead85c592a66b5ea7f444603ad7cbb66458ec4330 is running failed: container process not found" 
containerID="74257ca800c53f0c036fde4ead85c592a66b5ea7f444603ad7cbb66458ec4330" cmd=["/usr/bin/pidof","ovsdb-server"] Oct 08 18:33:07 crc kubenswrapper[4750]: E1008 18:33:07.814792 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 74257ca800c53f0c036fde4ead85c592a66b5ea7f444603ad7cbb66458ec4330 is running failed: container process not found" containerID="74257ca800c53f0c036fde4ead85c592a66b5ea7f444603ad7cbb66458ec4330" cmd=["/usr/bin/pidof","ovsdb-server"] Oct 08 18:33:07 crc kubenswrapper[4750]: E1008 18:33:07.814842 4750 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 74257ca800c53f0c036fde4ead85c592a66b5ea7f444603ad7cbb66458ec4330 is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-sb-0" podUID="590d851e-4648-48db-b385-aaa732f5c787" containerName="ovsdbserver-sb" Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.818098 4750 generic.go:334] "Generic (PLEG): container finished" podID="845a93a1-bf5f-4820-a580-e01d2ed59416" containerID="be3fa37eeeee66c52135106d0b32e1df0c8a601b9cfff61c6d2080884d388f7a" exitCode=137 Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.821784 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-tqdjh"] Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.827216 4750 generic.go:334] "Generic (PLEG): container finished" podID="a25ebe44-c330-48f8-9df7-5f8517cd96bd" containerID="ea10ae3da743eb1f3f0c357c0ff737b1ceb0a0e3e2a2d97d9e6932ea21abcefc" exitCode=143 Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.827335 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a25ebe44-c330-48f8-9df7-5f8517cd96bd","Type":"ContainerDied","Data":"ea10ae3da743eb1f3f0c357c0ff737b1ceb0a0e3e2a2d97d9e6932ea21abcefc"} Oct 
08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.835850 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-a9e2-account-create-btj9m"] Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.845804 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.846008 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e027e860-d0c0-4b1b-b02b-c374d92ae115" containerName="nova-metadata-log" containerID="cri-o://0feb4ca16b6758c6207a2a94aa09e8a8463ace7881baffb89c3d90fa392320bd" gracePeriod=30 Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.846370 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e027e860-d0c0-4b1b-b02b-c374d92ae115" containerName="nova-metadata-metadata" containerID="cri-o://919a9750bdfbd9cb9a2ec586476afa3486a7c426aa6ae6693b2697ba44404b4a" gracePeriod=30 Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.860160 4750 generic.go:334] "Generic (PLEG): container finished" podID="7b751e62-8a05-413c-9f82-e9f28230e5ba" containerID="c354d8f2f57d9ac9743cc4e26bc6b9e3f99a80b563c28a39f8bf835593a34cc2" exitCode=0 Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.860272 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-ljvpd"] Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.860368 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8d8895f6c-zszml" event={"ID":"7b751e62-8a05-413c-9f82-e9f28230e5ba","Type":"ContainerDied","Data":"c354d8f2f57d9ac9743cc4e26bc6b9e3f99a80b563c28a39f8bf835593a34cc2"} Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.868223 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-ljvpd"] Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.878388 4750 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-ghwp7"] Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.886794 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-ghwp7"] Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.902920 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-88ae-account-create-dzf7x"] Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.907882 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-88ae-account-create-dzf7x"] Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.915338 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-d8db-account-create-hswqx"] Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.922489 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-d8db-account-create-hswqx"] Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.931228 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-587d5f7b59-ws4tc"] Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.932414 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-587d5f7b59-ws4tc" podUID="28550569-4c3c-48cf-a621-eddec0919b51" containerName="barbican-worker-log" containerID="cri-o://3dac308349d8e3040066022cf2273a5c40cf9dfc274269fe30212fe125c4aad8" gracePeriod=30 Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.932837 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-587d5f7b59-ws4tc" podUID="28550569-4c3c-48cf-a621-eddec0919b51" containerName="barbican-worker" containerID="cri-o://1a151318fef849fd5d62d1d54d979f4d4b38f1111627d51da48a87b928387362" gracePeriod=30 Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.938459 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/barbican-keystone-listener-689cf77786-nkzv6"] Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.940694 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-689cf77786-nkzv6" podUID="9ecf0d73-0ca5-4124-93fb-348f8769c2e2" containerName="barbican-keystone-listener-log" containerID="cri-o://f8c62ba9b4feac7a06f75a6ce1e507df508740e668c07ed3f788844322a15de7" gracePeriod=30 Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.940925 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-689cf77786-nkzv6" podUID="9ecf0d73-0ca5-4124-93fb-348f8769c2e2" containerName="barbican-keystone-listener" containerID="cri-o://b88fc77d21b840724590332b8c3c0a46a81b3b88285cff5ea88ef4c54f55690c" gracePeriod=30 Oct 08 18:33:07 crc kubenswrapper[4750]: E1008 18:33:07.949421 4750 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Oct 08 18:33:07 crc kubenswrapper[4750]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Oct 08 18:33:07 crc kubenswrapper[4750]: + source /usr/local/bin/container-scripts/functions Oct 08 18:33:07 crc kubenswrapper[4750]: ++ OVNBridge=br-int Oct 08 18:33:07 crc kubenswrapper[4750]: ++ OVNRemote=tcp:localhost:6642 Oct 08 18:33:07 crc kubenswrapper[4750]: ++ OVNEncapType=geneve Oct 08 18:33:07 crc kubenswrapper[4750]: ++ OVNAvailabilityZones= Oct 08 18:33:07 crc kubenswrapper[4750]: ++ EnableChassisAsGateway=true Oct 08 18:33:07 crc kubenswrapper[4750]: ++ PhysicalNetworks= Oct 08 18:33:07 crc kubenswrapper[4750]: ++ OVNHostName= Oct 08 18:33:07 crc kubenswrapper[4750]: ++ DB_FILE=/etc/openvswitch/conf.db Oct 08 18:33:07 crc kubenswrapper[4750]: ++ ovs_dir=/var/lib/openvswitch Oct 08 18:33:07 crc kubenswrapper[4750]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Oct 08 18:33:07 crc 
kubenswrapper[4750]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Oct 08 18:33:07 crc kubenswrapper[4750]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 08 18:33:07 crc kubenswrapper[4750]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 08 18:33:07 crc kubenswrapper[4750]: + sleep 0.5 Oct 08 18:33:07 crc kubenswrapper[4750]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 08 18:33:07 crc kubenswrapper[4750]: + sleep 0.5 Oct 08 18:33:07 crc kubenswrapper[4750]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 08 18:33:07 crc kubenswrapper[4750]: + sleep 0.5 Oct 08 18:33:07 crc kubenswrapper[4750]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 08 18:33:07 crc kubenswrapper[4750]: + sleep 0.5 Oct 08 18:33:07 crc kubenswrapper[4750]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 08 18:33:07 crc kubenswrapper[4750]: + cleanup_ovsdb_server_semaphore Oct 08 18:33:07 crc kubenswrapper[4750]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 08 18:33:07 crc kubenswrapper[4750]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Oct 08 18:33:07 crc kubenswrapper[4750]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-57vgx" message=< Oct 08 18:33:07 crc kubenswrapper[4750]: Exiting ovsdb-server (5) [ OK ] Oct 08 18:33:07 crc kubenswrapper[4750]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Oct 08 18:33:07 crc kubenswrapper[4750]: + source /usr/local/bin/container-scripts/functions Oct 08 18:33:07 crc kubenswrapper[4750]: ++ OVNBridge=br-int Oct 08 18:33:07 crc kubenswrapper[4750]: ++ OVNRemote=tcp:localhost:6642 Oct 08 18:33:07 crc kubenswrapper[4750]: ++ OVNEncapType=geneve Oct 08 18:33:07 crc kubenswrapper[4750]: ++ OVNAvailabilityZones= Oct 08 18:33:07 crc 
kubenswrapper[4750]: ++ EnableChassisAsGateway=true Oct 08 18:33:07 crc kubenswrapper[4750]: ++ PhysicalNetworks= Oct 08 18:33:07 crc kubenswrapper[4750]: ++ OVNHostName= Oct 08 18:33:07 crc kubenswrapper[4750]: ++ DB_FILE=/etc/openvswitch/conf.db Oct 08 18:33:07 crc kubenswrapper[4750]: ++ ovs_dir=/var/lib/openvswitch Oct 08 18:33:07 crc kubenswrapper[4750]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Oct 08 18:33:07 crc kubenswrapper[4750]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Oct 08 18:33:07 crc kubenswrapper[4750]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 08 18:33:07 crc kubenswrapper[4750]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 08 18:33:07 crc kubenswrapper[4750]: + sleep 0.5 Oct 08 18:33:07 crc kubenswrapper[4750]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 08 18:33:07 crc kubenswrapper[4750]: + sleep 0.5 Oct 08 18:33:07 crc kubenswrapper[4750]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 08 18:33:07 crc kubenswrapper[4750]: + sleep 0.5 Oct 08 18:33:07 crc kubenswrapper[4750]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 08 18:33:07 crc kubenswrapper[4750]: + sleep 0.5 Oct 08 18:33:07 crc kubenswrapper[4750]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 08 18:33:07 crc kubenswrapper[4750]: + cleanup_ovsdb_server_semaphore Oct 08 18:33:07 crc kubenswrapper[4750]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 08 18:33:07 crc kubenswrapper[4750]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Oct 08 18:33:07 crc kubenswrapper[4750]: > Oct 08 18:33:07 crc kubenswrapper[4750]: E1008 18:33:07.949465 4750 kuberuntime_container.go:691] "PreStop hook failed" err=< Oct 08 18:33:07 crc kubenswrapper[4750]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Oct 08 18:33:07 crc kubenswrapper[4750]: + source /usr/local/bin/container-scripts/functions Oct 08 18:33:07 crc kubenswrapper[4750]: ++ OVNBridge=br-int Oct 08 18:33:07 crc kubenswrapper[4750]: ++ OVNRemote=tcp:localhost:6642 Oct 08 18:33:07 crc kubenswrapper[4750]: ++ OVNEncapType=geneve Oct 08 18:33:07 crc kubenswrapper[4750]: ++ OVNAvailabilityZones= Oct 08 18:33:07 crc kubenswrapper[4750]: ++ EnableChassisAsGateway=true Oct 08 18:33:07 crc kubenswrapper[4750]: ++ PhysicalNetworks= Oct 08 18:33:07 crc kubenswrapper[4750]: ++ OVNHostName= Oct 08 18:33:07 crc kubenswrapper[4750]: ++ DB_FILE=/etc/openvswitch/conf.db Oct 08 18:33:07 crc kubenswrapper[4750]: ++ ovs_dir=/var/lib/openvswitch Oct 08 18:33:07 crc kubenswrapper[4750]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Oct 08 18:33:07 crc kubenswrapper[4750]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Oct 08 18:33:07 crc kubenswrapper[4750]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 08 18:33:07 crc kubenswrapper[4750]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 08 18:33:07 crc kubenswrapper[4750]: + sleep 0.5 Oct 08 18:33:07 crc kubenswrapper[4750]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 08 18:33:07 crc kubenswrapper[4750]: + sleep 0.5 Oct 08 18:33:07 crc kubenswrapper[4750]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 08 18:33:07 crc kubenswrapper[4750]: + sleep 0.5 Oct 08 18:33:07 crc kubenswrapper[4750]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 08 18:33:07 crc kubenswrapper[4750]: + sleep 0.5 Oct 08 18:33:07 crc kubenswrapper[4750]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 08 18:33:07 crc kubenswrapper[4750]: + cleanup_ovsdb_server_semaphore Oct 08 18:33:07 crc kubenswrapper[4750]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 08 18:33:07 crc kubenswrapper[4750]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Oct 08 18:33:07 crc kubenswrapper[4750]: > pod="openstack/ovn-controller-ovs-57vgx" podUID="e6709646-0141-474b-b73f-6f451e77f602" containerName="ovsdb-server" containerID="cri-o://afb689133a45aeb0be19ee70c244153e70840e64d2759842b263ebef6d8b33b2" Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.949496 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-57vgx" podUID="e6709646-0141-474b-b73f-6f451e77f602" containerName="ovsdb-server" containerID="cri-o://afb689133a45aeb0be19ee70c244153e70840e64d2759842b263ebef6d8b33b2" gracePeriod=28 Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.954747 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapid8db-account-delete-7phjk"] Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.958753 4750 generic.go:334] "Generic (PLEG): container finished" podID="be62333a-e650-4131-b0f1-c8c484539c7e" containerID="d2840db45a1d2b1e671a91bd14ec16767e2beb441096549c8e394b831dc54350" exitCode=0 Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.958776 4750 generic.go:334] "Generic (PLEG): container finished" 
podID="be62333a-e650-4131-b0f1-c8c484539c7e" containerID="3094942c3642c799d4db13b77a9964692bca3d841f52ece0573854c311f0f401" exitCode=0 Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.958785 4750 generic.go:334] "Generic (PLEG): container finished" podID="be62333a-e650-4131-b0f1-c8c484539c7e" containerID="3013b58e1d5ba309156a64f8003ce218400fb7a18be0cb766461f307776c2a76" exitCode=0 Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.958794 4750 generic.go:334] "Generic (PLEG): container finished" podID="be62333a-e650-4131-b0f1-c8c484539c7e" containerID="1d96354096952d01c21e9fe7d3f0bcc4a2f331e8ac84cd83d2b48f90d58cc965" exitCode=0 Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.958802 4750 generic.go:334] "Generic (PLEG): container finished" podID="be62333a-e650-4131-b0f1-c8c484539c7e" containerID="b8997ccfc894343c6e131268824155df3c9f2e6d480b351d3f97c9aa1cd88c3f" exitCode=0 Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.958808 4750 generic.go:334] "Generic (PLEG): container finished" podID="be62333a-e650-4131-b0f1-c8c484539c7e" containerID="26c529adb950717fbb00bdeba5f0a5ebfb00cbe7017189d59c51fb3fb5a903e5" exitCode=0 Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.958815 4750 generic.go:334] "Generic (PLEG): container finished" podID="be62333a-e650-4131-b0f1-c8c484539c7e" containerID="1a4ea5b2811f4515b0049bf0da3ae03b49c47b62fb6d2cddab227c8e93827963" exitCode=0 Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.958822 4750 generic.go:334] "Generic (PLEG): container finished" podID="be62333a-e650-4131-b0f1-c8c484539c7e" containerID="c068b85bb1ced1a9426fda1897141d98518bc162a709c23d2054cd4d7cce8209" exitCode=0 Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.958828 4750 generic.go:334] "Generic (PLEG): container finished" podID="be62333a-e650-4131-b0f1-c8c484539c7e" containerID="ec560a5ed96d75041eddd3e8cea8a27ed5d0b038772f918091c1f3453402ac4f" exitCode=0 Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.958834 4750 
generic.go:334] "Generic (PLEG): container finished" podID="be62333a-e650-4131-b0f1-c8c484539c7e" containerID="f20129167bf9bc7eee86dd0ea9d1df44c416086aba87c4cfefb4ac0bb12dca9c" exitCode=0 Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.958840 4750 generic.go:334] "Generic (PLEG): container finished" podID="be62333a-e650-4131-b0f1-c8c484539c7e" containerID="52f4f1b7fcee564a61f7c59907f9f37ce855f6b9ccb07d9b5d4dea7ac7e8c754" exitCode=0 Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.958848 4750 generic.go:334] "Generic (PLEG): container finished" podID="be62333a-e650-4131-b0f1-c8c484539c7e" containerID="db381222476cb51a6b70b126857ee5e5eff04de66df2645f6737aa6584b15305" exitCode=0 Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.958855 4750 generic.go:334] "Generic (PLEG): container finished" podID="be62333a-e650-4131-b0f1-c8c484539c7e" containerID="d6cdce1fc3c750a32568c75f5f41fd036fc0c6d012614ac48dd3b89a28bddb19" exitCode=0 Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.958909 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be62333a-e650-4131-b0f1-c8c484539c7e","Type":"ContainerDied","Data":"d2840db45a1d2b1e671a91bd14ec16767e2beb441096549c8e394b831dc54350"} Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.958934 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be62333a-e650-4131-b0f1-c8c484539c7e","Type":"ContainerDied","Data":"3094942c3642c799d4db13b77a9964692bca3d841f52ece0573854c311f0f401"} Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.958943 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be62333a-e650-4131-b0f1-c8c484539c7e","Type":"ContainerDied","Data":"3013b58e1d5ba309156a64f8003ce218400fb7a18be0cb766461f307776c2a76"} Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.958951 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"be62333a-e650-4131-b0f1-c8c484539c7e","Type":"ContainerDied","Data":"1d96354096952d01c21e9fe7d3f0bcc4a2f331e8ac84cd83d2b48f90d58cc965"} Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.958960 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be62333a-e650-4131-b0f1-c8c484539c7e","Type":"ContainerDied","Data":"b8997ccfc894343c6e131268824155df3c9f2e6d480b351d3f97c9aa1cd88c3f"} Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.958968 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be62333a-e650-4131-b0f1-c8c484539c7e","Type":"ContainerDied","Data":"26c529adb950717fbb00bdeba5f0a5ebfb00cbe7017189d59c51fb3fb5a903e5"} Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.958977 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be62333a-e650-4131-b0f1-c8c484539c7e","Type":"ContainerDied","Data":"1a4ea5b2811f4515b0049bf0da3ae03b49c47b62fb6d2cddab227c8e93827963"} Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.958987 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be62333a-e650-4131-b0f1-c8c484539c7e","Type":"ContainerDied","Data":"c068b85bb1ced1a9426fda1897141d98518bc162a709c23d2054cd4d7cce8209"} Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.958995 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be62333a-e650-4131-b0f1-c8c484539c7e","Type":"ContainerDied","Data":"ec560a5ed96d75041eddd3e8cea8a27ed5d0b038772f918091c1f3453402ac4f"} Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.959004 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be62333a-e650-4131-b0f1-c8c484539c7e","Type":"ContainerDied","Data":"f20129167bf9bc7eee86dd0ea9d1df44c416086aba87c4cfefb4ac0bb12dca9c"} Oct 08 18:33:07 crc 
kubenswrapper[4750]: I1008 18:33:07.959013 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be62333a-e650-4131-b0f1-c8c484539c7e","Type":"ContainerDied","Data":"52f4f1b7fcee564a61f7c59907f9f37ce855f6b9ccb07d9b5d4dea7ac7e8c754"} Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.959021 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be62333a-e650-4131-b0f1-c8c484539c7e","Type":"ContainerDied","Data":"db381222476cb51a6b70b126857ee5e5eff04de66df2645f6737aa6584b15305"} Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.959030 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be62333a-e650-4131-b0f1-c8c484539c7e","Type":"ContainerDied","Data":"d6cdce1fc3c750a32568c75f5f41fd036fc0c6d012614ac48dd3b89a28bddb19"} Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.965855 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-594d9fc688-28msd"] Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.966122 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-594d9fc688-28msd" podUID="6e0647ee-68b0-4b8b-9bf0-066dc7a274d0" containerName="barbican-api-log" containerID="cri-o://653d2c90232b23be484b8fc7b55378350753180183fba74f1dbd9d6ac1dde268" gracePeriod=30 Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.966239 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-594d9fc688-28msd" podUID="6e0647ee-68b0-4b8b-9bf0-066dc7a274d0" containerName="barbican-api" containerID="cri-o://2d2ea6e9b74498814dcf6fb556ced530c433e9440f72f77d7890312774245a65" gracePeriod=30 Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.971921 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="2feb2439-d911-4585-a5e1-671abcfa357d" 
containerName="galera" containerID="cri-o://131e51f9cac5dc8a9c0d425746e3b9ee7a9d53950bcb5870bac43328508a3398" gracePeriod=30 Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.980089 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.990227 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.990478 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="c4cbc20b-7898-4a47-99f6-80436897042c" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://3f09bf21e5c4d3c261d96fe4ae9e3aa7f61c0a08319b33de766658446000ec50" gracePeriod=30 Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.997603 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 18:33:07 crc kubenswrapper[4750]: I1008 18:33:07.997794 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="dba17973-3023-43ae-9b75-a8e1dc7f16cc" containerName="nova-scheduler-scheduler" containerID="cri-o://295b9725591cb856aaa6df5c90337486a2c9085eabdd2a6955b57c6bb2f78110" gracePeriod=30 Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.005521 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-kxxb9"] Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.012833 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-kxxb9"] Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.024324 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.028998 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 08 18:33:08 crc 
kubenswrapper[4750]: I1008 18:33:08.029197 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="038b3881-b266-4878-b395-87d7bf986446" containerName="nova-cell0-conductor-conductor" containerID="cri-o://7b92d9e428bd59e74196a828e532527c2bee6127a6c6018b44f016e271d5efb0" gracePeriod=30 Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.031672 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="5b8108eb-834c-44bd-9f39-70c348388ab6" containerName="rabbitmq" containerID="cri-o://521a7eb18f85bcac1d99fd7991c03287aa5ae554a583b99a06a05b681ce4be21" gracePeriod=604800 Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.037265 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mkxdr" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.038662 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9n56l"] Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.041133 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_590d851e-4648-48db-b385-aaa732f5c787/ovsdbserver-sb/0.log" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.041238 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.041396 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d4d96bb9-84nq2" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.042826 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.047743 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9n56l"] Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.052656 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_235930b1-672c-4fc6-bbb4-78204c591aee/ovsdbserver-nb/0.log" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.052730 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.057442 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-6ghsk"] Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.067618 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-6ghsk"] Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.106459 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/590d851e-4648-48db-b385-aaa732f5c787-ovsdb-rundir\") pod \"590d851e-4648-48db-b385-aaa732f5c787\" (UID: \"590d851e-4648-48db-b385-aaa732f5c787\") " Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.106484 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"590d851e-4648-48db-b385-aaa732f5c787\" (UID: \"590d851e-4648-48db-b385-aaa732f5c787\") " Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.106503 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfa3814f-a0f4-4d53-9c08-44d7b45dd662-dns-svc\") pod \"bfa3814f-a0f4-4d53-9c08-44d7b45dd662\" (UID: \"bfa3814f-a0f4-4d53-9c08-44d7b45dd662\") " Oct 08 18:33:08 crc 
kubenswrapper[4750]: I1008 18:33:08.106527 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bfa3814f-a0f4-4d53-9c08-44d7b45dd662-dns-swift-storage-0\") pod \"bfa3814f-a0f4-4d53-9c08-44d7b45dd662\" (UID: \"bfa3814f-a0f4-4d53-9c08-44d7b45dd662\") " Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.106579 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/235930b1-672c-4fc6-bbb4-78204c591aee-config\") pod \"235930b1-672c-4fc6-bbb4-78204c591aee\" (UID: \"235930b1-672c-4fc6-bbb4-78204c591aee\") " Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.106634 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/845a93a1-bf5f-4820-a580-e01d2ed59416-openstack-config-secret\") pod \"845a93a1-bf5f-4820-a580-e01d2ed59416\" (UID: \"845a93a1-bf5f-4820-a580-e01d2ed59416\") " Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.106663 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/cc808a1a-9703-4009-8d81-e555a8e25929-var-log-ovn\") pod \"cc808a1a-9703-4009-8d81-e555a8e25929\" (UID: \"cc808a1a-9703-4009-8d81-e555a8e25929\") " Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.106680 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bfa3814f-a0f4-4d53-9c08-44d7b45dd662-ovsdbserver-nb\") pod \"bfa3814f-a0f4-4d53-9c08-44d7b45dd662\" (UID: \"bfa3814f-a0f4-4d53-9c08-44d7b45dd662\") " Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.106697 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/235930b1-672c-4fc6-bbb4-78204c591aee-metrics-certs-tls-certs\") pod \"235930b1-672c-4fc6-bbb4-78204c591aee\" (UID: \"235930b1-672c-4fc6-bbb4-78204c591aee\") " Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.106717 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfa3814f-a0f4-4d53-9c08-44d7b45dd662-config\") pod \"bfa3814f-a0f4-4d53-9c08-44d7b45dd662\" (UID: \"bfa3814f-a0f4-4d53-9c08-44d7b45dd662\") " Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.106736 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfqlx\" (UniqueName: \"kubernetes.io/projected/cc808a1a-9703-4009-8d81-e555a8e25929-kube-api-access-pfqlx\") pod \"cc808a1a-9703-4009-8d81-e555a8e25929\" (UID: \"cc808a1a-9703-4009-8d81-e555a8e25929\") " Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.106748 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cc808a1a-9703-4009-8d81-e555a8e25929-var-run\") pod \"cc808a1a-9703-4009-8d81-e555a8e25929\" (UID: \"cc808a1a-9703-4009-8d81-e555a8e25929\") " Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.106782 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/590d851e-4648-48db-b385-aaa732f5c787-scripts\") pod \"590d851e-4648-48db-b385-aaa732f5c787\" (UID: \"590d851e-4648-48db-b385-aaa732f5c787\") " Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.106800 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/590d851e-4648-48db-b385-aaa732f5c787-combined-ca-bundle\") pod \"590d851e-4648-48db-b385-aaa732f5c787\" (UID: \"590d851e-4648-48db-b385-aaa732f5c787\") " Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.106823 4750 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/590d851e-4648-48db-b385-aaa732f5c787-metrics-certs-tls-certs\") pod \"590d851e-4648-48db-b385-aaa732f5c787\" (UID: \"590d851e-4648-48db-b385-aaa732f5c787\") " Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.106842 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bfa3814f-a0f4-4d53-9c08-44d7b45dd662-ovsdbserver-sb\") pod \"bfa3814f-a0f4-4d53-9c08-44d7b45dd662\" (UID: \"bfa3814f-a0f4-4d53-9c08-44d7b45dd662\") " Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.106861 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tqml\" (UniqueName: \"kubernetes.io/projected/bfa3814f-a0f4-4d53-9c08-44d7b45dd662-kube-api-access-8tqml\") pod \"bfa3814f-a0f4-4d53-9c08-44d7b45dd662\" (UID: \"bfa3814f-a0f4-4d53-9c08-44d7b45dd662\") " Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.106913 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hkkr\" (UniqueName: \"kubernetes.io/projected/590d851e-4648-48db-b385-aaa732f5c787-kube-api-access-6hkkr\") pod \"590d851e-4648-48db-b385-aaa732f5c787\" (UID: \"590d851e-4648-48db-b385-aaa732f5c787\") " Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.106931 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/235930b1-672c-4fc6-bbb4-78204c591aee-scripts\") pod \"235930b1-672c-4fc6-bbb4-78204c591aee\" (UID: \"235930b1-672c-4fc6-bbb4-78204c591aee\") " Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.106945 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jn85c\" (UniqueName: 
\"kubernetes.io/projected/235930b1-672c-4fc6-bbb4-78204c591aee-kube-api-access-jn85c\") pod \"235930b1-672c-4fc6-bbb4-78204c591aee\" (UID: \"235930b1-672c-4fc6-bbb4-78204c591aee\") " Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.106957 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/cc808a1a-9703-4009-8d81-e555a8e25929-var-run-ovn\") pod \"cc808a1a-9703-4009-8d81-e555a8e25929\" (UID: \"cc808a1a-9703-4009-8d81-e555a8e25929\") " Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.106988 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/235930b1-672c-4fc6-bbb4-78204c591aee-combined-ca-bundle\") pod \"235930b1-672c-4fc6-bbb4-78204c591aee\" (UID: \"235930b1-672c-4fc6-bbb4-78204c591aee\") " Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.107003 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/845a93a1-bf5f-4820-a580-e01d2ed59416-openstack-config\") pod \"845a93a1-bf5f-4820-a580-e01d2ed59416\" (UID: \"845a93a1-bf5f-4820-a580-e01d2ed59416\") " Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.107024 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/235930b1-672c-4fc6-bbb4-78204c591aee-ovsdbserver-nb-tls-certs\") pod \"235930b1-672c-4fc6-bbb4-78204c591aee\" (UID: \"235930b1-672c-4fc6-bbb4-78204c591aee\") " Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.107041 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cc808a1a-9703-4009-8d81-e555a8e25929-scripts\") pod \"cc808a1a-9703-4009-8d81-e555a8e25929\" (UID: \"cc808a1a-9703-4009-8d81-e555a8e25929\") " Oct 08 18:33:08 crc kubenswrapper[4750]: 
I1008 18:33:08.107061 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/590d851e-4648-48db-b385-aaa732f5c787-ovsdbserver-sb-tls-certs\") pod \"590d851e-4648-48db-b385-aaa732f5c787\" (UID: \"590d851e-4648-48db-b385-aaa732f5c787\") " Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.107083 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/235930b1-672c-4fc6-bbb4-78204c591aee-ovsdb-rundir\") pod \"235930b1-672c-4fc6-bbb4-78204c591aee\" (UID: \"235930b1-672c-4fc6-bbb4-78204c591aee\") " Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.107102 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"235930b1-672c-4fc6-bbb4-78204c591aee\" (UID: \"235930b1-672c-4fc6-bbb4-78204c591aee\") " Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.107137 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc808a1a-9703-4009-8d81-e555a8e25929-ovn-controller-tls-certs\") pod \"cc808a1a-9703-4009-8d81-e555a8e25929\" (UID: \"cc808a1a-9703-4009-8d81-e555a8e25929\") " Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.107151 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/590d851e-4648-48db-b385-aaa732f5c787-config\") pod \"590d851e-4648-48db-b385-aaa732f5c787\" (UID: \"590d851e-4648-48db-b385-aaa732f5c787\") " Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.107174 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc808a1a-9703-4009-8d81-e555a8e25929-combined-ca-bundle\") pod 
\"cc808a1a-9703-4009-8d81-e555a8e25929\" (UID: \"cc808a1a-9703-4009-8d81-e555a8e25929\") " Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.107213 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbfft\" (UniqueName: \"kubernetes.io/projected/845a93a1-bf5f-4820-a580-e01d2ed59416-kube-api-access-rbfft\") pod \"845a93a1-bf5f-4820-a580-e01d2ed59416\" (UID: \"845a93a1-bf5f-4820-a580-e01d2ed59416\") " Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.107230 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/845a93a1-bf5f-4820-a580-e01d2ed59416-combined-ca-bundle\") pod \"845a93a1-bf5f-4820-a580-e01d2ed59416\" (UID: \"845a93a1-bf5f-4820-a580-e01d2ed59416\") " Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.125219 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/590d851e-4648-48db-b385-aaa732f5c787-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "590d851e-4648-48db-b385-aaa732f5c787" (UID: "590d851e-4648-48db-b385-aaa732f5c787"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.127770 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cc808a1a-9703-4009-8d81-e555a8e25929-var-run" (OuterVolumeSpecName: "var-run") pod "cc808a1a-9703-4009-8d81-e555a8e25929" (UID: "cc808a1a-9703-4009-8d81-e555a8e25929"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.133538 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/590d851e-4648-48db-b385-aaa732f5c787-scripts" (OuterVolumeSpecName: "scripts") pod "590d851e-4648-48db-b385-aaa732f5c787" (UID: "590d851e-4648-48db-b385-aaa732f5c787"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.134589 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc808a1a-9703-4009-8d81-e555a8e25929-scripts" (OuterVolumeSpecName: "scripts") pod "cc808a1a-9703-4009-8d81-e555a8e25929" (UID: "cc808a1a-9703-4009-8d81-e555a8e25929"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.139805 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/235930b1-672c-4fc6-bbb4-78204c591aee-scripts" (OuterVolumeSpecName: "scripts") pod "235930b1-672c-4fc6-bbb4-78204c591aee" (UID: "235930b1-672c-4fc6-bbb4-78204c591aee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.144946 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cc808a1a-9703-4009-8d81-e555a8e25929-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "cc808a1a-9703-4009-8d81-e555a8e25929" (UID: "cc808a1a-9703-4009-8d81-e555a8e25929"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.147934 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/235930b1-672c-4fc6-bbb4-78204c591aee-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "235930b1-672c-4fc6-bbb4-78204c591aee" (UID: "235930b1-672c-4fc6-bbb4-78204c591aee"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.153713 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cc808a1a-9703-4009-8d81-e555a8e25929-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "cc808a1a-9703-4009-8d81-e555a8e25929" (UID: "cc808a1a-9703-4009-8d81-e555a8e25929"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.158325 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance4c02-account-delete-z4mhj"] Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.159057 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/590d851e-4648-48db-b385-aaa732f5c787-config" (OuterVolumeSpecName: "config") pod "590d851e-4648-48db-b385-aaa732f5c787" (UID: "590d851e-4648-48db-b385-aaa732f5c787"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.159578 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc808a1a-9703-4009-8d81-e555a8e25929-kube-api-access-pfqlx" (OuterVolumeSpecName: "kube-api-access-pfqlx") pod "cc808a1a-9703-4009-8d81-e555a8e25929" (UID: "cc808a1a-9703-4009-8d81-e555a8e25929"). InnerVolumeSpecName "kube-api-access-pfqlx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.159644 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/590d851e-4648-48db-b385-aaa732f5c787-kube-api-access-6hkkr" (OuterVolumeSpecName: "kube-api-access-6hkkr") pod "590d851e-4648-48db-b385-aaa732f5c787" (UID: "590d851e-4648-48db-b385-aaa732f5c787"). InnerVolumeSpecName "kube-api-access-6hkkr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.159988 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/235930b1-672c-4fc6-bbb4-78204c591aee-config" (OuterVolumeSpecName: "config") pod "235930b1-672c-4fc6-bbb4-78204c591aee" (UID: "235930b1-672c-4fc6-bbb4-78204c591aee"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.160825 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "590d851e-4648-48db-b385-aaa732f5c787" (UID: "590d851e-4648-48db-b385-aaa732f5c787"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.172396 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfa3814f-a0f4-4d53-9c08-44d7b45dd662-kube-api-access-8tqml" (OuterVolumeSpecName: "kube-api-access-8tqml") pod "bfa3814f-a0f4-4d53-9c08-44d7b45dd662" (UID: "bfa3814f-a0f4-4d53-9c08-44d7b45dd662"). InnerVolumeSpecName "kube-api-access-8tqml". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.173081 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/235930b1-672c-4fc6-bbb4-78204c591aee-kube-api-access-jn85c" (OuterVolumeSpecName: "kube-api-access-jn85c") pod "235930b1-672c-4fc6-bbb4-78204c591aee" (UID: "235930b1-672c-4fc6-bbb4-78204c591aee"). InnerVolumeSpecName "kube-api-access-jn85c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.173161 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/845a93a1-bf5f-4820-a580-e01d2ed59416-kube-api-access-rbfft" (OuterVolumeSpecName: "kube-api-access-rbfft") pod "845a93a1-bf5f-4820-a580-e01d2ed59416" (UID: "845a93a1-bf5f-4820-a580-e01d2ed59416"). InnerVolumeSpecName "kube-api-access-rbfft". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.194077 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "235930b1-672c-4fc6-bbb4-78204c591aee" (UID: "235930b1-672c-4fc6-bbb4-78204c591aee"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.209618 4750 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/cc808a1a-9703-4009-8d81-e555a8e25929-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.209659 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfqlx\" (UniqueName: \"kubernetes.io/projected/cc808a1a-9703-4009-8d81-e555a8e25929-kube-api-access-pfqlx\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.209672 4750 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cc808a1a-9703-4009-8d81-e555a8e25929-var-run\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.209682 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/590d851e-4648-48db-b385-aaa732f5c787-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.209693 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tqml\" (UniqueName: \"kubernetes.io/projected/bfa3814f-a0f4-4d53-9c08-44d7b45dd662-kube-api-access-8tqml\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.209707 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hkkr\" (UniqueName: \"kubernetes.io/projected/590d851e-4648-48db-b385-aaa732f5c787-kube-api-access-6hkkr\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.209718 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/235930b1-672c-4fc6-bbb4-78204c591aee-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.209730 4750 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jn85c\" (UniqueName: \"kubernetes.io/projected/235930b1-672c-4fc6-bbb4-78204c591aee-kube-api-access-jn85c\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.209741 4750 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/cc808a1a-9703-4009-8d81-e555a8e25929-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.209752 4750 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/235930b1-672c-4fc6-bbb4-78204c591aee-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.209762 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cc808a1a-9703-4009-8d81-e555a8e25929-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.209788 4750 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.209799 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/590d851e-4648-48db-b385-aaa732f5c787-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.209810 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbfft\" (UniqueName: \"kubernetes.io/projected/845a93a1-bf5f-4820-a580-e01d2ed59416-kube-api-access-rbfft\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.209822 4750 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/590d851e-4648-48db-b385-aaa732f5c787-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.209840 4750 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.209853 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/235930b1-672c-4fc6-bbb4-78204c591aee-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.258709 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/845a93a1-bf5f-4820-a580-e01d2ed59416-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "845a93a1-bf5f-4820-a580-e01d2ed59416" (UID: "845a93a1-bf5f-4820-a580-e01d2ed59416"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:08 crc kubenswrapper[4750]: E1008 18:33:08.314600 4750 secret.go:188] Couldn't get secret openstack/nova-cell1-conductor-config-data: secret "nova-cell1-conductor-config-data" not found Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.314618 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/845a93a1-bf5f-4820-a580-e01d2ed59416-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:08 crc kubenswrapper[4750]: E1008 18:33:08.314695 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af09729b-3284-4dcd-91a1-5763d28daaf5-config-data podName:af09729b-3284-4dcd-91a1-5763d28daaf5 nodeName:}" failed. No retries permitted until 2025-10-08 18:33:09.314674016 +0000 UTC m=+1345.227645029 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/af09729b-3284-4dcd-91a1-5763d28daaf5-config-data") pod "nova-cell1-conductor-0" (UID: "af09729b-3284-4dcd-91a1-5763d28daaf5") : secret "nova-cell1-conductor-config-data" not found Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.335609 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/590d851e-4648-48db-b385-aaa732f5c787-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "590d851e-4648-48db-b385-aaa732f5c787" (UID: "590d851e-4648-48db-b385-aaa732f5c787"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.337222 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfa3814f-a0f4-4d53-9c08-44d7b45dd662-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bfa3814f-a0f4-4d53-9c08-44d7b45dd662" (UID: "bfa3814f-a0f4-4d53-9c08-44d7b45dd662"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.342019 4750 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.345346 4750 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.349857 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfa3814f-a0f4-4d53-9c08-44d7b45dd662-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bfa3814f-a0f4-4d53-9c08-44d7b45dd662" (UID: "bfa3814f-a0f4-4d53-9c08-44d7b45dd662"). 
InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.385229 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfa3814f-a0f4-4d53-9c08-44d7b45dd662-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bfa3814f-a0f4-4d53-9c08-44d7b45dd662" (UID: "bfa3814f-a0f4-4d53-9c08-44d7b45dd662"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.387670 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/845a93a1-bf5f-4820-a580-e01d2ed59416-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "845a93a1-bf5f-4820-a580-e01d2ed59416" (UID: "845a93a1-bf5f-4820-a580-e01d2ed59416"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.416023 4750 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/845a93a1-bf5f-4820-a580-e01d2ed59416-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.416055 4750 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.416065 4750 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.416072 4750 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bfa3814f-a0f4-4d53-9c08-44d7b45dd662-dns-swift-storage-0\") on 
node \"crc\" DevicePath \"\"" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.416082 4750 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bfa3814f-a0f4-4d53-9c08-44d7b45dd662-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.416090 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/590d851e-4648-48db-b385-aaa732f5c787-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.416098 4750 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bfa3814f-a0f4-4d53-9c08-44d7b45dd662-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.422773 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/235930b1-672c-4fc6-bbb4-78204c591aee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "235930b1-672c-4fc6-bbb4-78204c591aee" (UID: "235930b1-672c-4fc6-bbb4-78204c591aee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.467232 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc808a1a-9703-4009-8d81-e555a8e25929-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cc808a1a-9703-4009-8d81-e555a8e25929" (UID: "cc808a1a-9703-4009-8d81-e555a8e25929"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.517690 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/235930b1-672c-4fc6-bbb4-78204c591aee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.517715 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc808a1a-9703-4009-8d81-e555a8e25929-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.521813 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-74b95c857c-677wg" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.605694 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican5ee7-account-delete-nlrfw"] Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.618004 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement9f1c-account-delete-wx96m"] Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.620616 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c218b865-c7d1-4f46-ad6d-8e102b6af491-run-httpd\") pod \"c218b865-c7d1-4f46-ad6d-8e102b6af491\" (UID: \"c218b865-c7d1-4f46-ad6d-8e102b6af491\") " Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.620724 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c218b865-c7d1-4f46-ad6d-8e102b6af491-public-tls-certs\") pod \"c218b865-c7d1-4f46-ad6d-8e102b6af491\" (UID: \"c218b865-c7d1-4f46-ad6d-8e102b6af491\") " Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.620786 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/c218b865-c7d1-4f46-ad6d-8e102b6af491-log-httpd\") pod \"c218b865-c7d1-4f46-ad6d-8e102b6af491\" (UID: \"c218b865-c7d1-4f46-ad6d-8e102b6af491\") " Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.621005 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c218b865-c7d1-4f46-ad6d-8e102b6af491-etc-swift\") pod \"c218b865-c7d1-4f46-ad6d-8e102b6af491\" (UID: \"c218b865-c7d1-4f46-ad6d-8e102b6af491\") " Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.621050 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c218b865-c7d1-4f46-ad6d-8e102b6af491-config-data\") pod \"c218b865-c7d1-4f46-ad6d-8e102b6af491\" (UID: \"c218b865-c7d1-4f46-ad6d-8e102b6af491\") " Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.621080 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqfql\" (UniqueName: \"kubernetes.io/projected/c218b865-c7d1-4f46-ad6d-8e102b6af491-kube-api-access-wqfql\") pod \"c218b865-c7d1-4f46-ad6d-8e102b6af491\" (UID: \"c218b865-c7d1-4f46-ad6d-8e102b6af491\") " Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.621108 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c218b865-c7d1-4f46-ad6d-8e102b6af491-combined-ca-bundle\") pod \"c218b865-c7d1-4f46-ad6d-8e102b6af491\" (UID: \"c218b865-c7d1-4f46-ad6d-8e102b6af491\") " Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.621161 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c218b865-c7d1-4f46-ad6d-8e102b6af491-internal-tls-certs\") pod \"c218b865-c7d1-4f46-ad6d-8e102b6af491\" (UID: \"c218b865-c7d1-4f46-ad6d-8e102b6af491\") " Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 
18:33:08.621347 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapid8db-account-delete-7phjk"] Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.623597 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c218b865-c7d1-4f46-ad6d-8e102b6af491-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c218b865-c7d1-4f46-ad6d-8e102b6af491" (UID: "c218b865-c7d1-4f46-ad6d-8e102b6af491"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.631840 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c218b865-c7d1-4f46-ad6d-8e102b6af491-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c218b865-c7d1-4f46-ad6d-8e102b6af491" (UID: "c218b865-c7d1-4f46-ad6d-8e102b6af491"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.643635 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder7a82-account-delete-rvqnc"] Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.650157 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c218b865-c7d1-4f46-ad6d-8e102b6af491-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "c218b865-c7d1-4f46-ad6d-8e102b6af491" (UID: "c218b865-c7d1-4f46-ad6d-8e102b6af491"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.660794 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c218b865-c7d1-4f46-ad6d-8e102b6af491-kube-api-access-wqfql" (OuterVolumeSpecName: "kube-api-access-wqfql") pod "c218b865-c7d1-4f46-ad6d-8e102b6af491" (UID: "c218b865-c7d1-4f46-ad6d-8e102b6af491"). InnerVolumeSpecName "kube-api-access-wqfql". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.678719 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron0c01-account-delete-rfw7g"] Oct 08 18:33:08 crc kubenswrapper[4750]: W1008 18:33:08.698167 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c9b7a5e_9efd_45f8_86bf_730f55c077fd.slice/crio-48e5b20f57420651b9fd8b00fc147ab6612791bccf43f704cb157d1592d1f311 WatchSource:0}: Error finding container 48e5b20f57420651b9fd8b00fc147ab6612791bccf43f704cb157d1592d1f311: Status 404 returned error can't find the container with id 48e5b20f57420651b9fd8b00fc147ab6612791bccf43f704cb157d1592d1f311 Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.703958 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfa3814f-a0f4-4d53-9c08-44d7b45dd662-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bfa3814f-a0f4-4d53-9c08-44d7b45dd662" (UID: "bfa3814f-a0f4-4d53-9c08-44d7b45dd662"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.724110 4750 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c218b865-c7d1-4f46-ad6d-8e102b6af491-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.724139 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqfql\" (UniqueName: \"kubernetes.io/projected/c218b865-c7d1-4f46-ad6d-8e102b6af491-kube-api-access-wqfql\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.724148 4750 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c218b865-c7d1-4f46-ad6d-8e102b6af491-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.724156 4750 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfa3814f-a0f4-4d53-9c08-44d7b45dd662-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.724164 4750 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c218b865-c7d1-4f46-ad6d-8e102b6af491-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.733695 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfa3814f-a0f4-4d53-9c08-44d7b45dd662-config" (OuterVolumeSpecName: "config") pod "bfa3814f-a0f4-4d53-9c08-44d7b45dd662" (UID: "bfa3814f-a0f4-4d53-9c08-44d7b45dd662"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.770352 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="099005c9-52af-4489-9103-d6c82b1c82b2" path="/var/lib/kubelet/pods/099005c9-52af-4489-9103-d6c82b1c82b2/volumes" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.777661 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="154374c5-fc90-40da-9ac7-a98f99aca0a1" path="/var/lib/kubelet/pods/154374c5-fc90-40da-9ac7-a98f99aca0a1/volumes" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.779129 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f613fe6-8980-4ded-8c2f-c4222c597cf1" path="/var/lib/kubelet/pods/1f613fe6-8980-4ded-8c2f-c4222c597cf1/volumes" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.783932 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="341402ad-8d92-4e75-825c-1849b5f99f4d" path="/var/lib/kubelet/pods/341402ad-8d92-4e75-825c-1849b5f99f4d/volumes" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.785376 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab07eac-5578-43a8-979b-d3dba99ce3ba" path="/var/lib/kubelet/pods/3ab07eac-5578-43a8-979b-d3dba99ce3ba/volumes" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.785884 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fff367e-d784-48f0-ad74-571e1587abbb" path="/var/lib/kubelet/pods/3fff367e-d784-48f0-ad74-571e1587abbb/volumes" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.786920 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41591e4c-def9-4152-8bec-7c47ed4367e8" path="/var/lib/kubelet/pods/41591e4c-def9-4152-8bec-7c47ed4367e8/volumes" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.787531 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5253a72b-fa57-4150-92f0-0d6172aca7f0" 
path="/var/lib/kubelet/pods/5253a72b-fa57-4150-92f0-0d6172aca7f0/volumes" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.788189 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65af30d1-7ae5-485a-85d4-271a4642c2cf" path="/var/lib/kubelet/pods/65af30d1-7ae5-485a-85d4-271a4642c2cf/volumes" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.788886 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65d622c4-c0ae-4cd3-bdab-101eb0783cc3" path="/var/lib/kubelet/pods/65d622c4-c0ae-4cd3-bdab-101eb0783cc3/volumes" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.790085 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c616741-8a43-456e-a249-aee7e4d3764f" path="/var/lib/kubelet/pods/6c616741-8a43-456e-a249-aee7e4d3764f/volumes" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.790807 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c3207de-78eb-41b2-a2be-163c9a3532af" path="/var/lib/kubelet/pods/7c3207de-78eb-41b2-a2be-163c9a3532af/volumes" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.791574 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bb0941b-83e9-4afb-87ed-a26326dfd400" path="/var/lib/kubelet/pods/8bb0941b-83e9-4afb-87ed-a26326dfd400/volumes" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.792528 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8da1090b-4086-4e84-b252-982a0f876031" path="/var/lib/kubelet/pods/8da1090b-4086-4e84-b252-982a0f876031/volumes" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.793362 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3a80f86-2304-4b69-9cc8-8ef39c25c999" path="/var/lib/kubelet/pods/c3a80f86-2304-4b69-9cc8-8ef39c25c999/volumes" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.794071 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce3df91c-8c7d-4adc-be58-34b388a95c93" 
path="/var/lib/kubelet/pods/ce3df91c-8c7d-4adc-be58-34b388a95c93/volumes" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.794661 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3678610-ef6c-49b3-a2e5-73d5431c1c4d" path="/var/lib/kubelet/pods/d3678610-ef6c-49b3-a2e5-73d5431c1c4d/volumes" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.795880 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e094d330-b345-4b12-bf02-7e8d55307fce" path="/var/lib/kubelet/pods/e094d330-b345-4b12-bf02-7e8d55307fce/volumes" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.796512 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e175b5f7-2b09-423e-8492-7802da4e1ec1" path="/var/lib/kubelet/pods/e175b5f7-2b09-423e-8492-7802da4e1ec1/volumes" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.797193 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc8a957d-821b-438f-9b31-aabc6a3672e0" path="/var/lib/kubelet/pods/fc8a957d-821b-438f-9b31-aabc6a3672e0/volumes" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.797960 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/845a93a1-bf5f-4820-a580-e01d2ed59416-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "845a93a1-bf5f-4820-a580-e01d2ed59416" (UID: "845a93a1-bf5f-4820-a580-e01d2ed59416"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.828189 4750 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/845a93a1-bf5f-4820-a580-e01d2ed59416-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.828227 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfa3814f-a0f4-4d53-9c08-44d7b45dd662-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:08 crc kubenswrapper[4750]: E1008 18:33:08.828720 4750 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 08 18:33:08 crc kubenswrapper[4750]: E1008 18:33:08.828783 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/43a52313-747b-40a7-a7e0-9e18f3c97c42-config-data podName:43a52313-747b-40a7-a7e0-9e18f3c97c42 nodeName:}" failed. No retries permitted until 2025-10-08 18:33:12.828761443 +0000 UTC m=+1348.741732536 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/43a52313-747b-40a7-a7e0-9e18f3c97c42-config-data") pod "rabbitmq-cell1-server-0" (UID: "43a52313-747b-40a7-a7e0-9e18f3c97c42") : configmap "rabbitmq-cell1-config-data" not found Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.848043 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/590d851e-4648-48db-b385-aaa732f5c787-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "590d851e-4648-48db-b385-aaa732f5c787" (UID: "590d851e-4648-48db-b385-aaa732f5c787"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.866803 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/235930b1-672c-4fc6-bbb4-78204c591aee-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "235930b1-672c-4fc6-bbb4-78204c591aee" (UID: "235930b1-672c-4fc6-bbb4-78204c591aee"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.904471 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/235930b1-672c-4fc6-bbb4-78204c591aee-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "235930b1-672c-4fc6-bbb4-78204c591aee" (UID: "235930b1-672c-4fc6-bbb4-78204c591aee"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.939155 4750 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/235930b1-672c-4fc6-bbb4-78204c591aee-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.939193 4750 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/590d851e-4648-48db-b385-aaa732f5c787-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.939205 4750 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/235930b1-672c-4fc6-bbb4-78204c591aee-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.949931 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/cc808a1a-9703-4009-8d81-e555a8e25929-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "cc808a1a-9703-4009-8d81-e555a8e25929" (UID: "cc808a1a-9703-4009-8d81-e555a8e25929"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.964160 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/590d851e-4648-48db-b385-aaa732f5c787-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "590d851e-4648-48db-b385-aaa732f5c787" (UID: "590d851e-4648-48db-b385-aaa732f5c787"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:08 crc kubenswrapper[4750]: I1008 18:33:08.993888 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c218b865-c7d1-4f46-ad6d-8e102b6af491-config-data" (OuterVolumeSpecName: "config-data") pod "c218b865-c7d1-4f46-ad6d-8e102b6af491" (UID: "c218b865-c7d1-4f46-ad6d-8e102b6af491"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.006676 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c218b865-c7d1-4f46-ad6d-8e102b6af491-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c218b865-c7d1-4f46-ad6d-8e102b6af491" (UID: "c218b865-c7d1-4f46-ad6d-8e102b6af491"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.011516 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c218b865-c7d1-4f46-ad6d-8e102b6af491-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c218b865-c7d1-4f46-ad6d-8e102b6af491" (UID: "c218b865-c7d1-4f46-ad6d-8e102b6af491"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.024691 4750 generic.go:334] "Generic (PLEG): container finished" podID="e6709646-0141-474b-b73f-6f451e77f602" containerID="afb689133a45aeb0be19ee70c244153e70840e64d2759842b263ebef6d8b33b2" exitCode=0 Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.038543 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.041760 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c218b865-c7d1-4f46-ad6d-8e102b6af491-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.041865 4750 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/590d851e-4648-48db-b385-aaa732f5c787-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.041877 4750 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc808a1a-9703-4009-8d81-e555a8e25929-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.041886 4750 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c218b865-c7d1-4f46-ad6d-8e102b6af491-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.041895 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c218b865-c7d1-4f46-ad6d-8e102b6af491-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.059605 4750 generic.go:334] "Generic (PLEG): container finished" podID="e027e860-d0c0-4b1b-b02b-c374d92ae115" containerID="0feb4ca16b6758c6207a2a94aa09e8a8463ace7881baffb89c3d90fa392320bd" exitCode=143 Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.064135 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c218b865-c7d1-4f46-ad6d-8e102b6af491-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c218b865-c7d1-4f46-ad6d-8e102b6af491" (UID: "c218b865-c7d1-4f46-ad6d-8e102b6af491"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.065375 4750 generic.go:334] "Generic (PLEG): container finished" podID="2aacea2e-e630-4280-8bed-b3b13b67f8ae" containerID="cf0d886f93ff577bb445d8aa80c9cd9e710f2e6e45c5f75d8333a920295edfa7" exitCode=143 Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.072823 4750 generic.go:334] "Generic (PLEG): container finished" podID="c4cbc20b-7898-4a47-99f6-80436897042c" containerID="3f09bf21e5c4d3c261d96fe4ae9e3aa7f61c0a08319b33de766658446000ec50" exitCode=0 Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.083525 4750 generic.go:334] "Generic (PLEG): container finished" podID="c218b865-c7d1-4f46-ad6d-8e102b6af491" containerID="726de4b1d2b400cc335f7c66a47cd7bc83547d6f298c85f9ac80978cd6e0ca61" exitCode=0 Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.083566 4750 generic.go:334] "Generic (PLEG): container finished" podID="c218b865-c7d1-4f46-ad6d-8e102b6af491" containerID="0d6c9db01902a53d96492abe2fa964b6a118c8bcd236fbfdaca40d28257dc94f" exitCode=0 Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.083654 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-74b95c857c-677wg" Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.100923 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_235930b1-672c-4fc6-bbb4-78204c591aee/ovsdbserver-nb/0.log" Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.101052 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.107168 4750 generic.go:334] "Generic (PLEG): container finished" podID="2feb2439-d911-4585-a5e1-671abcfa357d" containerID="131e51f9cac5dc8a9c0d425746e3b9ee7a9d53950bcb5870bac43328508a3398" exitCode=0 Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.133114 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_590d851e-4648-48db-b385-aaa732f5c787/ovsdbserver-sb/0.log" Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.133264 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.138560 4750 generic.go:334] "Generic (PLEG): container finished" podID="9ecf0d73-0ca5-4124-93fb-348f8769c2e2" containerID="f8c62ba9b4feac7a06f75a6ce1e507df508740e668c07ed3f788844322a15de7" exitCode=143 Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.139903 4750 generic.go:334] "Generic (PLEG): container finished" podID="28550569-4c3c-48cf-a621-eddec0919b51" containerID="3dac308349d8e3040066022cf2273a5c40cf9dfc274269fe30212fe125c4aad8" exitCode=143 Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.143168 4750 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c218b865-c7d1-4f46-ad6d-8e102b6af491-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.145513 4750 generic.go:334] "Generic (PLEG): container finished" podID="6e0647ee-68b0-4b8b-9bf0-066dc7a274d0" containerID="653d2c90232b23be484b8fc7b55378350753180183fba74f1dbd9d6ac1dde268" exitCode=143 Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.148265 4750 generic.go:334] "Generic (PLEG): container finished" podID="60379ea9-0750-4de0-9d3b-13af080eea8f" containerID="be50775e9b6d25d8067aa8220a05db2eb7a3fed17a5981e8b008927ca24a65a9" 
exitCode=0 Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.166485 4750 generic.go:334] "Generic (PLEG): container finished" podID="be62333a-e650-4131-b0f1-c8c484539c7e" containerID="0daa177c260aa7c024a5646c1d578e111a41bb913177c6089099d8076dcc1b13" exitCode=0 Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.176588 4750 generic.go:334] "Generic (PLEG): container finished" podID="b173b167-1fa4-45ec-98d0-16956f4b0b30" containerID="818a7ca04219bc24f3b1bb2f35ec42283d410173dbbfcfda72a5ebabfcf269d4" exitCode=0 Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.181211 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="af09729b-3284-4dcd-91a1-5763d28daaf5" containerName="nova-cell1-conductor-conductor" containerID="cri-o://c89b5ffde0edcb272b86a232392e60a596a9950b2acdc482f24207d4aea9c839" gracePeriod=30 Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.181445 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d4d96bb9-84nq2" Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.184847 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-mkxdr" Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.185681 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-57vgx" event={"ID":"e6709646-0141-474b-b73f-6f451e77f602","Type":"ContainerDied","Data":"afb689133a45aeb0be19ee70c244153e70840e64d2759842b263ebef6d8b33b2"} Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.185717 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement9f1c-account-delete-wx96m" event={"ID":"0c9b7a5e-9efd-45f8-86bf-730f55c077fd","Type":"ContainerStarted","Data":"48e5b20f57420651b9fd8b00fc147ab6612791bccf43f704cb157d1592d1f311"} Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.185731 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e027e860-d0c0-4b1b-b02b-c374d92ae115","Type":"ContainerDied","Data":"0feb4ca16b6758c6207a2a94aa09e8a8463ace7881baffb89c3d90fa392320bd"} Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.185749 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2aacea2e-e630-4280-8bed-b3b13b67f8ae","Type":"ContainerDied","Data":"cf0d886f93ff577bb445d8aa80c9cd9e710f2e6e45c5f75d8333a920295edfa7"} Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.185759 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"c4cbc20b-7898-4a47-99f6-80436897042c","Type":"ContainerDied","Data":"3f09bf21e5c4d3c261d96fe4ae9e3aa7f61c0a08319b33de766658446000ec50"} Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.185769 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-74b95c857c-677wg" event={"ID":"c218b865-c7d1-4f46-ad6d-8e102b6af491","Type":"ContainerDied","Data":"726de4b1d2b400cc335f7c66a47cd7bc83547d6f298c85f9ac80978cd6e0ca61"} Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.185781 4750 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/swift-proxy-74b95c857c-677wg" event={"ID":"c218b865-c7d1-4f46-ad6d-8e102b6af491","Type":"ContainerDied","Data":"0d6c9db01902a53d96492abe2fa964b6a118c8bcd236fbfdaca40d28257dc94f"} Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.185791 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-74b95c857c-677wg" event={"ID":"c218b865-c7d1-4f46-ad6d-8e102b6af491","Type":"ContainerDied","Data":"8a607a1ae0f7352862e8ea719dfdd3057a21089276a0355a348ce53dcaba80ef"} Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.185800 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"235930b1-672c-4fc6-bbb4-78204c591aee","Type":"ContainerDied","Data":"e3299d677ec95752fdeedd1f2933d796db4410437bbff6b75df7cd49984a5ba6"} Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.185811 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2feb2439-d911-4585-a5e1-671abcfa357d","Type":"ContainerDied","Data":"131e51f9cac5dc8a9c0d425746e3b9ee7a9d53950bcb5870bac43328508a3398"} Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.185821 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"590d851e-4648-48db-b385-aaa732f5c787","Type":"ContainerDied","Data":"b2f0fc66c91c7f6856410a2b65f196da6d53142b0da13deab6cd17b27c2b1da2"} Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.185833 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-689cf77786-nkzv6" event={"ID":"9ecf0d73-0ca5-4124-93fb-348f8769c2e2","Type":"ContainerDied","Data":"f8c62ba9b4feac7a06f75a6ce1e507df508740e668c07ed3f788844322a15de7"} Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.185846 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-587d5f7b59-ws4tc" 
event={"ID":"28550569-4c3c-48cf-a621-eddec0919b51","Type":"ContainerDied","Data":"3dac308349d8e3040066022cf2273a5c40cf9dfc274269fe30212fe125c4aad8"} Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.185856 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-594d9fc688-28msd" event={"ID":"6e0647ee-68b0-4b8b-9bf0-066dc7a274d0","Type":"ContainerDied","Data":"653d2c90232b23be484b8fc7b55378350753180183fba74f1dbd9d6ac1dde268"} Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.185866 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron0c01-account-delete-rfw7g" event={"ID":"af33a9f0-c575-46f6-a3cd-71391d454430","Type":"ContainerStarted","Data":"29cf139a21eea17f2e0c5a38a10b09b0a29ebbd85ea7b5109ef103d045beb659"} Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.185876 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"60379ea9-0750-4de0-9d3b-13af080eea8f","Type":"ContainerDied","Data":"be50775e9b6d25d8067aa8220a05db2eb7a3fed17a5981e8b008927ca24a65a9"} Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.185886 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder7a82-account-delete-rvqnc" event={"ID":"04d7f724-d53d-4412-bce7-cc6da81e45ac","Type":"ContainerStarted","Data":"f1fe6211f671a872c18c03a7ae507f2373dd61e4bc88fa79ba64d6506d82fd0e"} Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.185896 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be62333a-e650-4131-b0f1-c8c484539c7e","Type":"ContainerDied","Data":"0daa177c260aa7c024a5646c1d578e111a41bb913177c6089099d8076dcc1b13"} Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.185907 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance4c02-account-delete-z4mhj" 
event={"ID":"b173b167-1fa4-45ec-98d0-16956f4b0b30","Type":"ContainerDied","Data":"818a7ca04219bc24f3b1bb2f35ec42283d410173dbbfcfda72a5ebabfcf269d4"} Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.185919 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance4c02-account-delete-z4mhj" event={"ID":"b173b167-1fa4-45ec-98d0-16956f4b0b30","Type":"ContainerStarted","Data":"c40bacc677dec44ee1d4a5bd05de6adcc44b63265685561d7832ba861b58aa98"} Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.185930 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican5ee7-account-delete-nlrfw" event={"ID":"c751481b-5934-4262-84ff-106498a453e0","Type":"ContainerStarted","Data":"baf55c6163e79ccd097fdd653a613bee7d66e76de7dce406de8234909b311774"} Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.185940 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapid8db-account-delete-7phjk" event={"ID":"736a9e30-a3de-4e9f-9de7-52015e55443e","Type":"ContainerStarted","Data":"22627f58aaa5032d1b5fab055a1b6d9fe1303fe002bfeae8b3341420721db510"} Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.185956 4750 scope.go:117] "RemoveContainer" containerID="be3fa37eeeee66c52135106d0b32e1df0c8a601b9cfff61c6d2080884d388f7a" Oct 08 18:33:09 crc kubenswrapper[4750]: E1008 18:33:09.346187 4750 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 08 18:33:09 crc kubenswrapper[4750]: E1008 18:33:09.346269 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5b8108eb-834c-44bd-9f39-70c348388ab6-config-data podName:5b8108eb-834c-44bd-9f39-70c348388ab6 nodeName:}" failed. No retries permitted until 2025-10-08 18:33:13.346253103 +0000 UTC m=+1349.259224116 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/5b8108eb-834c-44bd-9f39-70c348388ab6-config-data") pod "rabbitmq-server-0" (UID: "5b8108eb-834c-44bd-9f39-70c348388ab6") : configmap "rabbitmq-config-data" not found Oct 08 18:33:09 crc kubenswrapper[4750]: E1008 18:33:09.346300 4750 secret.go:188] Couldn't get secret openstack/nova-cell1-conductor-config-data: secret "nova-cell1-conductor-config-data" not found Oct 08 18:33:09 crc kubenswrapper[4750]: E1008 18:33:09.346414 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af09729b-3284-4dcd-91a1-5763d28daaf5-config-data podName:af09729b-3284-4dcd-91a1-5763d28daaf5 nodeName:}" failed. No retries permitted until 2025-10-08 18:33:11.346393947 +0000 UTC m=+1347.259364960 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/af09729b-3284-4dcd-91a1-5763d28daaf5-config-data") pod "nova-cell1-conductor-0" (UID: "af09729b-3284-4dcd-91a1-5763d28daaf5") : secret "nova-cell1-conductor-config-data" not found Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.444562 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.470792 4750 scope.go:117] "RemoveContainer" containerID="726de4b1d2b400cc335f7c66a47cd7bc83547d6f298c85f9ac80978cd6e0ca61" Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.559180 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4cbc20b-7898-4a47-99f6-80436897042c-config-data\") pod \"c4cbc20b-7898-4a47-99f6-80436897042c\" (UID: \"c4cbc20b-7898-4a47-99f6-80436897042c\") " Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.559282 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsjz2\" (UniqueName: \"kubernetes.io/projected/c4cbc20b-7898-4a47-99f6-80436897042c-kube-api-access-vsjz2\") pod \"c4cbc20b-7898-4a47-99f6-80436897042c\" (UID: \"c4cbc20b-7898-4a47-99f6-80436897042c\") " Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.559329 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4cbc20b-7898-4a47-99f6-80436897042c-combined-ca-bundle\") pod \"c4cbc20b-7898-4a47-99f6-80436897042c\" (UID: \"c4cbc20b-7898-4a47-99f6-80436897042c\") " Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.559396 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4cbc20b-7898-4a47-99f6-80436897042c-vencrypt-tls-certs\") pod \"c4cbc20b-7898-4a47-99f6-80436897042c\" (UID: \"c4cbc20b-7898-4a47-99f6-80436897042c\") " Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.559433 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4cbc20b-7898-4a47-99f6-80436897042c-nova-novncproxy-tls-certs\") pod \"c4cbc20b-7898-4a47-99f6-80436897042c\" 
(UID: \"c4cbc20b-7898-4a47-99f6-80436897042c\") " Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.570717 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4cbc20b-7898-4a47-99f6-80436897042c-kube-api-access-vsjz2" (OuterVolumeSpecName: "kube-api-access-vsjz2") pod "c4cbc20b-7898-4a47-99f6-80436897042c" (UID: "c4cbc20b-7898-4a47-99f6-80436897042c"). InnerVolumeSpecName "kube-api-access-vsjz2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.662352 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsjz2\" (UniqueName: \"kubernetes.io/projected/c4cbc20b-7898-4a47-99f6-80436897042c-kube-api-access-vsjz2\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.663668 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4cbc20b-7898-4a47-99f6-80436897042c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c4cbc20b-7898-4a47-99f6-80436897042c" (UID: "c4cbc20b-7898-4a47-99f6-80436897042c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.686921 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4cbc20b-7898-4a47-99f6-80436897042c-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "c4cbc20b-7898-4a47-99f6-80436897042c" (UID: "c4cbc20b-7898-4a47-99f6-80436897042c"). InnerVolumeSpecName "nova-novncproxy-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.691693 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4cbc20b-7898-4a47-99f6-80436897042c-config-data" (OuterVolumeSpecName: "config-data") pod "c4cbc20b-7898-4a47-99f6-80436897042c" (UID: "c4cbc20b-7898-4a47-99f6-80436897042c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:09 crc kubenswrapper[4750]: I1008 18:33:09.731722 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4cbc20b-7898-4a47-99f6-80436897042c-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "c4cbc20b-7898-4a47-99f6-80436897042c" (UID: "c4cbc20b-7898-4a47-99f6-80436897042c"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:10 crc kubenswrapper[4750]: I1008 18:33:09.768167 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4cbc20b-7898-4a47-99f6-80436897042c-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:10 crc kubenswrapper[4750]: I1008 18:33:09.768187 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4cbc20b-7898-4a47-99f6-80436897042c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:10 crc kubenswrapper[4750]: I1008 18:33:09.768197 4750 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4cbc20b-7898-4a47-99f6-80436897042c-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:10 crc kubenswrapper[4750]: I1008 18:33:09.768206 4750 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4cbc20b-7898-4a47-99f6-80436897042c-nova-novncproxy-tls-certs\") on node 
\"crc\" DevicePath \"\"" Oct 08 18:33:10 crc kubenswrapper[4750]: E1008 18:33:09.869287 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c89b5ffde0edcb272b86a232392e60a596a9950b2acdc482f24207d4aea9c839" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 08 18:33:10 crc kubenswrapper[4750]: E1008 18:33:09.873820 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c89b5ffde0edcb272b86a232392e60a596a9950b2acdc482f24207d4aea9c839" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 08 18:33:10 crc kubenswrapper[4750]: E1008 18:33:09.884132 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c89b5ffde0edcb272b86a232392e60a596a9950b2acdc482f24207d4aea9c839" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 08 18:33:10 crc kubenswrapper[4750]: E1008 18:33:09.884230 4750 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="af09729b-3284-4dcd-91a1-5763d28daaf5" containerName="nova-cell1-conductor-conductor" Oct 08 18:33:10 crc kubenswrapper[4750]: I1008 18:33:09.981993 4750 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="5b8108eb-834c-44bd-9f39-70c348388ab6" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.103:5671: connect: connection refused" Oct 08 18:33:10 crc kubenswrapper[4750]: I1008 18:33:10.116623 4750 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 18:33:10 crc kubenswrapper[4750]: I1008 18:33:10.117247 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fe7385d5-3c78-4238-96be-78392eddee4b" containerName="ceilometer-central-agent" containerID="cri-o://225967b62a0a3c9db87ba69878ef83c98fd4d002b6b485a3ba2e44f7c9932962" gracePeriod=30 Oct 08 18:33:10 crc kubenswrapper[4750]: I1008 18:33:10.117655 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fe7385d5-3c78-4238-96be-78392eddee4b" containerName="proxy-httpd" containerID="cri-o://5eedb186a864de7ffc8ffb0c7d12aa7cbfd51fbb9c7ce15f42d2611d1dc2df3a" gracePeriod=30 Oct 08 18:33:10 crc kubenswrapper[4750]: I1008 18:33:10.117712 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fe7385d5-3c78-4238-96be-78392eddee4b" containerName="sg-core" containerID="cri-o://ef0cea034999dc054ed498ebdf790b29ef4af6c72ed138ea7f570a475564630d" gracePeriod=30 Oct 08 18:33:10 crc kubenswrapper[4750]: I1008 18:33:10.117745 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fe7385d5-3c78-4238-96be-78392eddee4b" containerName="ceilometer-notification-agent" containerID="cri-o://19ff8cd5ce4ea2ca5a48e87f8b09c6d1da6004603a55895cb9163299bb16a295" gracePeriod=30 Oct 08 18:33:10 crc kubenswrapper[4750]: I1008 18:33:10.152790 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 18:33:10 crc kubenswrapper[4750]: I1008 18:33:10.152980 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f" containerName="kube-state-metrics" containerID="cri-o://2108bb388209a941f9c3535c4f71606c4ec642e6c54810dace8a7237dd2d6c2e" gracePeriod=30 Oct 08 18:33:10 crc 
kubenswrapper[4750]: I1008 18:33:10.171266 4750 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="f1b5ad2e-1ee1-4955-99c2-8daed456b21c" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.182:8776/healthcheck\": read tcp 10.217.0.2:35532->10.217.0.182:8776: read: connection reset by peer" Oct 08 18:33:10 crc kubenswrapper[4750]: I1008 18:33:10.213179 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance4c02-account-delete-z4mhj" event={"ID":"b173b167-1fa4-45ec-98d0-16956f4b0b30","Type":"ContainerDied","Data":"c40bacc677dec44ee1d4a5bd05de6adcc44b63265685561d7832ba861b58aa98"} Oct 08 18:33:10 crc kubenswrapper[4750]: I1008 18:33:10.213213 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c40bacc677dec44ee1d4a5bd05de6adcc44b63265685561d7832ba861b58aa98" Oct 08 18:33:10 crc kubenswrapper[4750]: I1008 18:33:10.215118 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"c4cbc20b-7898-4a47-99f6-80436897042c","Type":"ContainerDied","Data":"eb6209054bb93711f8a78b8668dce20ee628e400ffe428c429f8a1d1ba06bff3"} Oct 08 18:33:10 crc kubenswrapper[4750]: I1008 18:33:10.215182 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 08 18:33:10 crc kubenswrapper[4750]: I1008 18:33:10.229077 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-788b97745d-6snpn" event={"ID":"ec1950dc-6caf-45f7-9b18-8c12db1b3f25","Type":"ContainerDied","Data":"f46e6c1f5784d1ac90a72cce013512954360b873208916edc44bc5574486392e"} Oct 08 18:33:10 crc kubenswrapper[4750]: I1008 18:33:10.229204 4750 generic.go:334] "Generic (PLEG): container finished" podID="ec1950dc-6caf-45f7-9b18-8c12db1b3f25" containerID="f46e6c1f5784d1ac90a72cce013512954360b873208916edc44bc5574486392e" exitCode=0 Oct 08 18:33:10 crc kubenswrapper[4750]: I1008 18:33:10.235837 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2feb2439-d911-4585-a5e1-671abcfa357d","Type":"ContainerDied","Data":"7c86a532bcb761c12dc472052ed8995adaafbe37a87f507cc02f082ef77bb416"} Oct 08 18:33:10 crc kubenswrapper[4750]: I1008 18:33:10.235869 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c86a532bcb761c12dc472052ed8995adaafbe37a87f507cc02f082ef77bb416" Oct 08 18:33:10 crc kubenswrapper[4750]: I1008 18:33:10.239387 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Oct 08 18:33:10 crc kubenswrapper[4750]: I1008 18:33:10.239676 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="ed7b2661-13bd-4ab4-a92d-7bc382cd257e" containerName="memcached" containerID="cri-o://c099682680f99d7d836a4647a357e3b0557699173c713817a786fb3c6c7a1d59" gracePeriod=30 Oct 08 18:33:10 crc kubenswrapper[4750]: I1008 18:33:10.306484 4750 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="899027b7-067b-4ce1-a8f1-deaee627aa51" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.174:9292/healthcheck\": read tcp 
10.217.0.2:36426->10.217.0.174:9292: read: connection reset by peer" Oct 08 18:33:10 crc kubenswrapper[4750]: I1008 18:33:10.306607 4750 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="899027b7-067b-4ce1-a8f1-deaee627aa51" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.174:9292/healthcheck\": read tcp 10.217.0.2:36424->10.217.0.174:9292: read: connection reset by peer" Oct 08 18:33:10 crc kubenswrapper[4750]: I1008 18:33:10.325607 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-d6q8c"] Oct 08 18:33:10 crc kubenswrapper[4750]: I1008 18:33:10.349015 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-d6q8c"] Oct 08 18:33:10 crc kubenswrapper[4750]: I1008 18:33:10.388816 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-w4fvs"] Oct 08 18:33:10 crc kubenswrapper[4750]: I1008 18:33:10.422098 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-548c7c66b4-b72bl"] Oct 08 18:33:10 crc kubenswrapper[4750]: I1008 18:33:10.422487 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-548c7c66b4-b72bl" podUID="2f22ab58-6189-4321-b660-ed992f6fb70f" containerName="keystone-api" containerID="cri-o://e4b287edf9036a666587e6c4e08dfb587495ff24b4fe7daeaedc5be9ebb8a5b2" gracePeriod=30 Oct 08 18:33:10 crc kubenswrapper[4750]: I1008 18:33:10.434751 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-w4fvs"] Oct 08 18:33:10 crc kubenswrapper[4750]: I1008 18:33:10.438866 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Oct 08 18:33:10 crc kubenswrapper[4750]: E1008 18:33:10.554007 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, 
stdout: , stderr: , exit code -1" containerID="6320fe2c14d57de0be701a1d75ea1e50bc53072f92cd6178aeb5a15ddadb3e94" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 18:33:10 crc kubenswrapper[4750]: E1008 18:33:10.554521 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of afb689133a45aeb0be19ee70c244153e70840e64d2759842b263ebef6d8b33b2 is running failed: container process not found" containerID="afb689133a45aeb0be19ee70c244153e70840e64d2759842b263ebef6d8b33b2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 18:33:10 crc kubenswrapper[4750]: E1008 18:33:10.567215 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of afb689133a45aeb0be19ee70c244153e70840e64d2759842b263ebef6d8b33b2 is running failed: container process not found" containerID="afb689133a45aeb0be19ee70c244153e70840e64d2759842b263ebef6d8b33b2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 18:33:10 crc kubenswrapper[4750]: E1008 18:33:10.571811 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6320fe2c14d57de0be701a1d75ea1e50bc53072f92cd6178aeb5a15ddadb3e94" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 18:33:10 crc kubenswrapper[4750]: E1008 18:33:10.574503 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of afb689133a45aeb0be19ee70c244153e70840e64d2759842b263ebef6d8b33b2 is running failed: container process not found" containerID="afb689133a45aeb0be19ee70c244153e70840e64d2759842b263ebef6d8b33b2" 
cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 18:33:10 crc kubenswrapper[4750]: E1008 18:33:10.577937 4750 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of afb689133a45aeb0be19ee70c244153e70840e64d2759842b263ebef6d8b33b2 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-57vgx" podUID="e6709646-0141-474b-b73f-6f451e77f602" containerName="ovsdb-server" Oct 08 18:33:10 crc kubenswrapper[4750]: I1008 18:33:10.578985 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-xrdld"] Oct 08 18:33:10 crc kubenswrapper[4750]: E1008 18:33:10.581076 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6320fe2c14d57de0be701a1d75ea1e50bc53072f92cd6178aeb5a15ddadb3e94" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 18:33:10 crc kubenswrapper[4750]: E1008 18:33:10.581152 4750 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-57vgx" podUID="e6709646-0141-474b-b73f-6f451e77f602" containerName="ovs-vswitchd" Oct 08 18:33:10 crc kubenswrapper[4750]: I1008 18:33:10.590504 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-xrdld"] Oct 08 18:33:10 crc kubenswrapper[4750]: E1008 18:33:10.597662 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c92dbcfbb2d09ff0c337921ca5e5f0d269b51741b6e9f5407d3bb68858e0b7bc" 
cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 08 18:33:10 crc kubenswrapper[4750]: E1008 18:33:10.599596 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c92dbcfbb2d09ff0c337921ca5e5f0d269b51741b6e9f5407d3bb68858e0b7bc" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 08 18:33:10 crc kubenswrapper[4750]: E1008 18:33:10.601202 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c92dbcfbb2d09ff0c337921ca5e5f0d269b51741b6e9f5407d3bb68858e0b7bc" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 08 18:33:10 crc kubenswrapper[4750]: E1008 18:33:10.601319 4750 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="230c02f8-af60-40d6-af19-adf730eec43f" containerName="ovn-northd" Oct 08 18:33:10 crc kubenswrapper[4750]: I1008 18:33:10.603248 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-2d53-account-create-n4mq6"] Oct 08 18:33:10 crc kubenswrapper[4750]: I1008 18:33:10.610626 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-2d53-account-create-n4mq6"] Oct 08 18:33:10 crc kubenswrapper[4750]: I1008 18:33:10.757154 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="cedba103-c6cd-4e9a-9c7c-80d90aaedb3b" containerName="galera" containerID="cri-o://238e771ab2028f72d92800deb7a9dbb1d35bb3e07296801734338f1c8ed278bd" gracePeriod=30 Oct 08 18:33:10 crc kubenswrapper[4750]: I1008 18:33:10.762082 4750 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="3ad0cb6f-78d3-48ea-943f-ef07e0b52886" path="/var/lib/kubelet/pods/3ad0cb6f-78d3-48ea-943f-ef07e0b52886/volumes" Oct 08 18:33:10 crc kubenswrapper[4750]: I1008 18:33:10.762914 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72b55b3b-50b6-4ef2-81d2-801582731466" path="/var/lib/kubelet/pods/72b55b3b-50b6-4ef2-81d2-801582731466/volumes" Oct 08 18:33:10 crc kubenswrapper[4750]: I1008 18:33:10.764240 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="845a93a1-bf5f-4820-a580-e01d2ed59416" path="/var/lib/kubelet/pods/845a93a1-bf5f-4820-a580-e01d2ed59416/volumes" Oct 08 18:33:10 crc kubenswrapper[4750]: I1008 18:33:10.765053 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c121e821-4fcc-4ea5-9844-3fb345b952e9" path="/var/lib/kubelet/pods/c121e821-4fcc-4ea5-9844-3fb345b952e9/volumes" Oct 08 18:33:10 crc kubenswrapper[4750]: I1008 18:33:10.765855 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e09de111-101a-4976-b327-2281bbc6b573" path="/var/lib/kubelet/pods/e09de111-101a-4976-b327-2281bbc6b573/volumes" Oct 08 18:33:10 crc kubenswrapper[4750]: I1008 18:33:10.937433 4750 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="43a52313-747b-40a7-a7e0-9e18f3c97c42" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.104:5671: connect: connection refused" Oct 08 18:33:10 crc kubenswrapper[4750]: I1008 18:33:10.963483 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.001762 4750 scope.go:117] "RemoveContainer" containerID="0d6c9db01902a53d96492abe2fa964b6a118c8bcd236fbfdaca40d28257dc94f" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.064134 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2feb2439-d911-4585-a5e1-671abcfa357d-galera-tls-certs\") pod \"2feb2439-d911-4585-a5e1-671abcfa357d\" (UID: \"2feb2439-d911-4585-a5e1-671abcfa357d\") " Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.064189 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"2feb2439-d911-4585-a5e1-671abcfa357d\" (UID: \"2feb2439-d911-4585-a5e1-671abcfa357d\") " Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.064227 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2feb2439-d911-4585-a5e1-671abcfa357d-combined-ca-bundle\") pod \"2feb2439-d911-4585-a5e1-671abcfa357d\" (UID: \"2feb2439-d911-4585-a5e1-671abcfa357d\") " Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.064390 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/2feb2439-d911-4585-a5e1-671abcfa357d-secrets\") pod \"2feb2439-d911-4585-a5e1-671abcfa357d\" (UID: \"2feb2439-d911-4585-a5e1-671abcfa357d\") " Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.064424 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsnsq\" (UniqueName: \"kubernetes.io/projected/2feb2439-d911-4585-a5e1-671abcfa357d-kube-api-access-vsnsq\") pod \"2feb2439-d911-4585-a5e1-671abcfa357d\" (UID: \"2feb2439-d911-4585-a5e1-671abcfa357d\") " Oct 08 18:33:11 
crc kubenswrapper[4750]: I1008 18:33:11.064459 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2feb2439-d911-4585-a5e1-671abcfa357d-operator-scripts\") pod \"2feb2439-d911-4585-a5e1-671abcfa357d\" (UID: \"2feb2439-d911-4585-a5e1-671abcfa357d\") " Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.064489 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2feb2439-d911-4585-a5e1-671abcfa357d-config-data-generated\") pod \"2feb2439-d911-4585-a5e1-671abcfa357d\" (UID: \"2feb2439-d911-4585-a5e1-671abcfa357d\") " Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.064510 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2feb2439-d911-4585-a5e1-671abcfa357d-config-data-default\") pod \"2feb2439-d911-4585-a5e1-671abcfa357d\" (UID: \"2feb2439-d911-4585-a5e1-671abcfa357d\") " Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.064640 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2feb2439-d911-4585-a5e1-671abcfa357d-kolla-config\") pod \"2feb2439-d911-4585-a5e1-671abcfa357d\" (UID: \"2feb2439-d911-4585-a5e1-671abcfa357d\") " Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.065812 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2feb2439-d911-4585-a5e1-671abcfa357d-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "2feb2439-d911-4585-a5e1-671abcfa357d" (UID: "2feb2439-d911-4585-a5e1-671abcfa357d"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.068677 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2feb2439-d911-4585-a5e1-671abcfa357d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2feb2439-d911-4585-a5e1-671abcfa357d" (UID: "2feb2439-d911-4585-a5e1-671abcfa357d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.073012 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance4c02-account-delete-z4mhj" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.075366 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2feb2439-d911-4585-a5e1-671abcfa357d-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "2feb2439-d911-4585-a5e1-671abcfa357d" (UID: "2feb2439-d911-4585-a5e1-671abcfa357d"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.075601 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2feb2439-d911-4585-a5e1-671abcfa357d-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "2feb2439-d911-4585-a5e1-671abcfa357d" (UID: "2feb2439-d911-4585-a5e1-671abcfa357d"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.087843 4750 scope.go:117] "RemoveContainer" containerID="726de4b1d2b400cc335f7c66a47cd7bc83547d6f298c85f9ac80978cd6e0ca61" Oct 08 18:33:11 crc kubenswrapper[4750]: E1008 18:33:11.088343 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"726de4b1d2b400cc335f7c66a47cd7bc83547d6f298c85f9ac80978cd6e0ca61\": container with ID starting with 726de4b1d2b400cc335f7c66a47cd7bc83547d6f298c85f9ac80978cd6e0ca61 not found: ID does not exist" containerID="726de4b1d2b400cc335f7c66a47cd7bc83547d6f298c85f9ac80978cd6e0ca61" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.088382 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"726de4b1d2b400cc335f7c66a47cd7bc83547d6f298c85f9ac80978cd6e0ca61"} err="failed to get container status \"726de4b1d2b400cc335f7c66a47cd7bc83547d6f298c85f9ac80978cd6e0ca61\": rpc error: code = NotFound desc = could not find container \"726de4b1d2b400cc335f7c66a47cd7bc83547d6f298c85f9ac80978cd6e0ca61\": container with ID starting with 726de4b1d2b400cc335f7c66a47cd7bc83547d6f298c85f9ac80978cd6e0ca61 not found: ID does not exist" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.088413 4750 scope.go:117] "RemoveContainer" containerID="0d6c9db01902a53d96492abe2fa964b6a118c8bcd236fbfdaca40d28257dc94f" Oct 08 18:33:11 crc kubenswrapper[4750]: E1008 18:33:11.089856 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d6c9db01902a53d96492abe2fa964b6a118c8bcd236fbfdaca40d28257dc94f\": container with ID starting with 0d6c9db01902a53d96492abe2fa964b6a118c8bcd236fbfdaca40d28257dc94f not found: ID does not exist" containerID="0d6c9db01902a53d96492abe2fa964b6a118c8bcd236fbfdaca40d28257dc94f" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.089890 
4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d6c9db01902a53d96492abe2fa964b6a118c8bcd236fbfdaca40d28257dc94f"} err="failed to get container status \"0d6c9db01902a53d96492abe2fa964b6a118c8bcd236fbfdaca40d28257dc94f\": rpc error: code = NotFound desc = could not find container \"0d6c9db01902a53d96492abe2fa964b6a118c8bcd236fbfdaca40d28257dc94f\": container with ID starting with 0d6c9db01902a53d96492abe2fa964b6a118c8bcd236fbfdaca40d28257dc94f not found: ID does not exist" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.089908 4750 scope.go:117] "RemoveContainer" containerID="726de4b1d2b400cc335f7c66a47cd7bc83547d6f298c85f9ac80978cd6e0ca61" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.090151 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"726de4b1d2b400cc335f7c66a47cd7bc83547d6f298c85f9ac80978cd6e0ca61"} err="failed to get container status \"726de4b1d2b400cc335f7c66a47cd7bc83547d6f298c85f9ac80978cd6e0ca61\": rpc error: code = NotFound desc = could not find container \"726de4b1d2b400cc335f7c66a47cd7bc83547d6f298c85f9ac80978cd6e0ca61\": container with ID starting with 726de4b1d2b400cc335f7c66a47cd7bc83547d6f298c85f9ac80978cd6e0ca61 not found: ID does not exist" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.090174 4750 scope.go:117] "RemoveContainer" containerID="0d6c9db01902a53d96492abe2fa964b6a118c8bcd236fbfdaca40d28257dc94f" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.090436 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d6c9db01902a53d96492abe2fa964b6a118c8bcd236fbfdaca40d28257dc94f"} err="failed to get container status \"0d6c9db01902a53d96492abe2fa964b6a118c8bcd236fbfdaca40d28257dc94f\": rpc error: code = NotFound desc = could not find container \"0d6c9db01902a53d96492abe2fa964b6a118c8bcd236fbfdaca40d28257dc94f\": container with ID starting with 
0d6c9db01902a53d96492abe2fa964b6a118c8bcd236fbfdaca40d28257dc94f not found: ID does not exist" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.090459 4750 scope.go:117] "RemoveContainer" containerID="d8bac502dbc380032817c12bba89e9f6cc302b44cbc72b0455a5c8be03b7c626" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.110941 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2feb2439-d911-4585-a5e1-671abcfa357d-secrets" (OuterVolumeSpecName: "secrets") pod "2feb2439-d911-4585-a5e1-671abcfa357d" (UID: "2feb2439-d911-4585-a5e1-671abcfa357d"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.131199 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2feb2439-d911-4585-a5e1-671abcfa357d-kube-api-access-vsnsq" (OuterVolumeSpecName: "kube-api-access-vsnsq") pod "2feb2439-d911-4585-a5e1-671abcfa357d" (UID: "2feb2439-d911-4585-a5e1-671abcfa357d"). InnerVolumeSpecName "kube-api-access-vsnsq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.165528 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-788b97745d-6snpn" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.165938 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "mysql-db") pod "2feb2439-d911-4585-a5e1-671abcfa357d" (UID: "2feb2439-d911-4585-a5e1-671abcfa357d"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.171686 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d4d96bb9-84nq2"] Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.175666 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d4d96bb9-84nq2"] Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.188324 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hglk4\" (UniqueName: \"kubernetes.io/projected/b173b167-1fa4-45ec-98d0-16956f4b0b30-kube-api-access-hglk4\") pod \"b173b167-1fa4-45ec-98d0-16956f4b0b30\" (UID: \"b173b167-1fa4-45ec-98d0-16956f4b0b30\") " Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.201635 4750 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/2feb2439-d911-4585-a5e1-671abcfa357d-secrets\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.201767 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsnsq\" (UniqueName: \"kubernetes.io/projected/2feb2439-d911-4585-a5e1-671abcfa357d-kube-api-access-vsnsq\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.201828 4750 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2feb2439-d911-4585-a5e1-671abcfa357d-operator-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.201928 4750 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2feb2439-d911-4585-a5e1-671abcfa357d-config-data-generated\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.201980 4750 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/2feb2439-d911-4585-a5e1-671abcfa357d-config-data-default\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.202029 4750 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2feb2439-d911-4585-a5e1-671abcfa357d-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.202102 4750 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.202126 4750 scope.go:117] "RemoveContainer" containerID="e8ed3cdde57c4534decef125e57e81c8e9ddb189aeb9195d0d45761ded957615" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.205168 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.205890 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.250931 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.279994 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b173b167-1fa4-45ec-98d0-16956f4b0b30-kube-api-access-hglk4" (OuterVolumeSpecName: "kube-api-access-hglk4") pod "b173b167-1fa4-45ec-98d0-16956f4b0b30" (UID: "b173b167-1fa4-45ec-98d0-16956f4b0b30"). InnerVolumeSpecName "kube-api-access-hglk4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.299269 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron0c01-account-delete-rfw7g" event={"ID":"af33a9f0-c575-46f6-a3cd-71391d454430","Type":"ContainerStarted","Data":"8709beaed4cb4cd5b7b1692fb0a59cc21d5d400ea8ee35f3d252f23f18bde1d6"} Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.299394 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron0c01-account-delete-rfw7g" podUID="af33a9f0-c575-46f6-a3cd-71391d454430" containerName="mariadb-account-delete" containerID="cri-o://8709beaed4cb4cd5b7b1692fb0a59cc21d5d400ea8ee35f3d252f23f18bde1d6" gracePeriod=30 Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.299706 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.303207 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-config-data\") pod \"f1b5ad2e-1ee1-4955-99c2-8daed456b21c\" (UID: \"f1b5ad2e-1ee1-4955-99c2-8daed456b21c\") " Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.303256 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-etc-machine-id\") pod \"f1b5ad2e-1ee1-4955-99c2-8daed456b21c\" (UID: \"f1b5ad2e-1ee1-4955-99c2-8daed456b21c\") " Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.303292 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec1950dc-6caf-45f7-9b18-8c12db1b3f25-public-tls-certs\") pod \"ec1950dc-6caf-45f7-9b18-8c12db1b3f25\" (UID: \"ec1950dc-6caf-45f7-9b18-8c12db1b3f25\") " Oct 08 18:33:11 crc kubenswrapper[4750]: 
I1008 18:33:11.303316 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-internal-tls-certs\") pod \"f1b5ad2e-1ee1-4955-99c2-8daed456b21c\" (UID: \"f1b5ad2e-1ee1-4955-99c2-8daed456b21c\") " Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.303347 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec1950dc-6caf-45f7-9b18-8c12db1b3f25-logs\") pod \"ec1950dc-6caf-45f7-9b18-8c12db1b3f25\" (UID: \"ec1950dc-6caf-45f7-9b18-8c12db1b3f25\") " Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.303366 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-config-data-custom\") pod \"f1b5ad2e-1ee1-4955-99c2-8daed456b21c\" (UID: \"f1b5ad2e-1ee1-4955-99c2-8daed456b21c\") " Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.303401 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5r2g\" (UniqueName: \"kubernetes.io/projected/ec1950dc-6caf-45f7-9b18-8c12db1b3f25-kube-api-access-c5r2g\") pod \"ec1950dc-6caf-45f7-9b18-8c12db1b3f25\" (UID: \"ec1950dc-6caf-45f7-9b18-8c12db1b3f25\") " Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.303444 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f-combined-ca-bundle\") pod \"ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f\" (UID: \"ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f\") " Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.303458 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ec1950dc-6caf-45f7-9b18-8c12db1b3f25-combined-ca-bundle\") pod \"ec1950dc-6caf-45f7-9b18-8c12db1b3f25\" (UID: \"ec1950dc-6caf-45f7-9b18-8c12db1b3f25\") " Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.303478 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-public-tls-certs\") pod \"f1b5ad2e-1ee1-4955-99c2-8daed456b21c\" (UID: \"f1b5ad2e-1ee1-4955-99c2-8daed456b21c\") " Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.303496 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbjzw\" (UniqueName: \"kubernetes.io/projected/ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f-kube-api-access-mbjzw\") pod \"ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f\" (UID: \"ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f\") " Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.303523 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-combined-ca-bundle\") pod \"f1b5ad2e-1ee1-4955-99c2-8daed456b21c\" (UID: \"f1b5ad2e-1ee1-4955-99c2-8daed456b21c\") " Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.303537 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-logs\") pod \"f1b5ad2e-1ee1-4955-99c2-8daed456b21c\" (UID: \"f1b5ad2e-1ee1-4955-99c2-8daed456b21c\") " Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.303619 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec1950dc-6caf-45f7-9b18-8c12db1b3f25-internal-tls-certs\") pod \"ec1950dc-6caf-45f7-9b18-8c12db1b3f25\" (UID: \"ec1950dc-6caf-45f7-9b18-8c12db1b3f25\") " Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 
18:33:11.303640 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45fjp\" (UniqueName: \"kubernetes.io/projected/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-kube-api-access-45fjp\") pod \"f1b5ad2e-1ee1-4955-99c2-8daed456b21c\" (UID: \"f1b5ad2e-1ee1-4955-99c2-8daed456b21c\") " Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.303712 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec1950dc-6caf-45f7-9b18-8c12db1b3f25-scripts\") pod \"ec1950dc-6caf-45f7-9b18-8c12db1b3f25\" (UID: \"ec1950dc-6caf-45f7-9b18-8c12db1b3f25\") " Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.303744 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f-kube-state-metrics-tls-config\") pod \"ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f\" (UID: \"ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f\") " Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.303766 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-scripts\") pod \"f1b5ad2e-1ee1-4955-99c2-8daed456b21c\" (UID: \"f1b5ad2e-1ee1-4955-99c2-8daed456b21c\") " Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.303781 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec1950dc-6caf-45f7-9b18-8c12db1b3f25-config-data\") pod \"ec1950dc-6caf-45f7-9b18-8c12db1b3f25\" (UID: \"ec1950dc-6caf-45f7-9b18-8c12db1b3f25\") " Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.303798 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f-kube-state-metrics-tls-certs\") pod \"ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f\" (UID: \"ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f\") " Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.304138 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hglk4\" (UniqueName: \"kubernetes.io/projected/b173b167-1fa4-45ec-98d0-16956f4b0b30-kube-api-access-hglk4\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.320041 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-scripts" (OuterVolumeSpecName: "scripts") pod "f1b5ad2e-1ee1-4955-99c2-8daed456b21c" (UID: "f1b5ad2e-1ee1-4955-99c2-8daed456b21c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.324936 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "f1b5ad2e-1ee1-4955-99c2-8daed456b21c" (UID: "f1b5ad2e-1ee1-4955-99c2-8daed456b21c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.325695 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-mkxdr"] Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.330328 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-logs" (OuterVolumeSpecName: "logs") pod "f1b5ad2e-1ee1-4955-99c2-8daed456b21c" (UID: "f1b5ad2e-1ee1-4955-99c2-8daed456b21c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.333924 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-mkxdr"] Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.341532 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-74b95c857c-677wg"] Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.348499 4750 generic.go:334] "Generic (PLEG): container finished" podID="fe7385d5-3c78-4238-96be-78392eddee4b" containerID="5eedb186a864de7ffc8ffb0c7d12aa7cbfd51fbb9c7ce15f42d2611d1dc2df3a" exitCode=0 Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.348523 4750 generic.go:334] "Generic (PLEG): container finished" podID="fe7385d5-3c78-4238-96be-78392eddee4b" containerID="ef0cea034999dc054ed498ebdf790b29ef4af6c72ed138ea7f570a475564630d" exitCode=2 Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.348529 4750 generic.go:334] "Generic (PLEG): container finished" podID="fe7385d5-3c78-4238-96be-78392eddee4b" containerID="225967b62a0a3c9db87ba69878ef83c98fd4d002b6b485a3ba2e44f7c9932962" exitCode=0 Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.348603 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe7385d5-3c78-4238-96be-78392eddee4b","Type":"ContainerDied","Data":"5eedb186a864de7ffc8ffb0c7d12aa7cbfd51fbb9c7ce15f42d2611d1dc2df3a"} Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.348627 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe7385d5-3c78-4238-96be-78392eddee4b","Type":"ContainerDied","Data":"ef0cea034999dc054ed498ebdf790b29ef4af6c72ed138ea7f570a475564630d"} Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.348639 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"fe7385d5-3c78-4238-96be-78392eddee4b","Type":"ContainerDied","Data":"225967b62a0a3c9db87ba69878ef83c98fd4d002b6b485a3ba2e44f7c9932962"} Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.348674 4750 scope.go:117] "RemoveContainer" containerID="e646c1e4d2aa9c18afad260eee7d15c40c5f3fc3f863380210aa6106334f6c32" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.352989 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec1950dc-6caf-45f7-9b18-8c12db1b3f25-logs" (OuterVolumeSpecName: "logs") pod "ec1950dc-6caf-45f7-9b18-8c12db1b3f25" (UID: "ec1950dc-6caf-45f7-9b18-8c12db1b3f25"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.353045 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-74b95c857c-677wg"] Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.359780 4750 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="e027e860-d0c0-4b1b-b02b-c374d92ae115" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": read tcp 10.217.0.2:55594->10.217.0.206:8775: read: connection reset by peer" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.359891 4750 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="e027e860-d0c0-4b1b-b02b-c374d92ae115" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": read tcp 10.217.0.2:55606->10.217.0.206:8775: read: connection reset by peer" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.363755 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.371158 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 08 18:33:11 crc 
kubenswrapper[4750]: I1008 18:33:11.375151 4750 generic.go:334] "Generic (PLEG): container finished" podID="ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f" containerID="2108bb388209a941f9c3535c4f71606c4ec642e6c54810dace8a7237dd2d6c2e" exitCode=2 Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.375197 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f","Type":"ContainerDied","Data":"2108bb388209a941f9c3535c4f71606c4ec642e6c54810dace8a7237dd2d6c2e"} Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.375222 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f","Type":"ContainerDied","Data":"831ad7f019599efb4d7fd03244f36bc634ee3d8556b35361f0d0cf6685fe1016"} Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.375261 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.377777 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.379598 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-kube-api-access-45fjp" (OuterVolumeSpecName: "kube-api-access-45fjp") pod "f1b5ad2e-1ee1-4955-99c2-8daed456b21c" (UID: "f1b5ad2e-1ee1-4955-99c2-8daed456b21c"). InnerVolumeSpecName "kube-api-access-45fjp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:33:11 crc kubenswrapper[4750]: E1008 18:33:11.387450 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="295b9725591cb856aaa6df5c90337486a2c9085eabdd2a6955b57c6bb2f78110" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.397370 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.402767 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.407810 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.407831 4750 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.407840 4750 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec1950dc-6caf-45f7-9b18-8c12db1b3f25-logs\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.407848 4750 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-logs\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.407856 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45fjp\" (UniqueName: 
\"kubernetes.io/projected/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-kube-api-access-45fjp\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:11 crc kubenswrapper[4750]: E1008 18:33:11.407932 4750 secret.go:188] Couldn't get secret openstack/nova-cell1-conductor-config-data: secret "nova-cell1-conductor-config-data" not found Oct 08 18:33:11 crc kubenswrapper[4750]: E1008 18:33:11.407987 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af09729b-3284-4dcd-91a1-5763d28daaf5-config-data podName:af09729b-3284-4dcd-91a1-5763d28daaf5 nodeName:}" failed. No retries permitted until 2025-10-08 18:33:15.407956051 +0000 UTC m=+1351.320927064 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/af09729b-3284-4dcd-91a1-5763d28daaf5-config-data") pod "nova-cell1-conductor-0" (UID: "af09729b-3284-4dcd-91a1-5763d28daaf5") : secret "nova-cell1-conductor-config-data" not found Oct 08 18:33:11 crc kubenswrapper[4750]: E1008 18:33:11.408561 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="295b9725591cb856aaa6df5c90337486a2c9085eabdd2a6955b57c6bb2f78110" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.425572 4750 generic.go:334] "Generic (PLEG): container finished" podID="a25ebe44-c330-48f8-9df7-5f8517cd96bd" containerID="e8cbb35efffa0dfc5317921463dc9d7e30007f7e471dd1e64c7dfbb694f8a5d9" exitCode=0 Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.425657 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a25ebe44-c330-48f8-9df7-5f8517cd96bd","Type":"ContainerDied","Data":"e8cbb35efffa0dfc5317921463dc9d7e30007f7e471dd1e64c7dfbb694f8a5d9"} Oct 08 18:33:11 crc kubenswrapper[4750]: E1008 18:33:11.425697 4750 
log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="295b9725591cb856aaa6df5c90337486a2c9085eabdd2a6955b57c6bb2f78110" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 08 18:33:11 crc kubenswrapper[4750]: E1008 18:33:11.425735 4750 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="dba17973-3023-43ae-9b75-a8e1dc7f16cc" containerName="nova-scheduler-scheduler" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.438725 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron0c01-account-delete-rfw7g" podStartSLOduration=6.438707342 podStartE2EDuration="6.438707342s" podCreationTimestamp="2025-10-08 18:33:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:33:11.387629134 +0000 UTC m=+1347.300600147" watchObservedRunningTime="2025-10-08 18:33:11.438707342 +0000 UTC m=+1347.351678375" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.454127 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f1b5ad2e-1ee1-4955-99c2-8daed456b21c" (UID: "f1b5ad2e-1ee1-4955-99c2-8daed456b21c"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.470575 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder7a82-account-delete-rvqnc" event={"ID":"04d7f724-d53d-4412-bce7-cc6da81e45ac","Type":"ContainerStarted","Data":"1f71946bf8bd0ed4f89225458c6b580b690387fad8980aaefcb0761c493ad1bc"} Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.470714 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder7a82-account-delete-rvqnc" podUID="04d7f724-d53d-4412-bce7-cc6da81e45ac" containerName="mariadb-account-delete" containerID="cri-o://1f71946bf8bd0ed4f89225458c6b580b690387fad8980aaefcb0761c493ad1bc" gracePeriod=30 Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.478249 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec1950dc-6caf-45f7-9b18-8c12db1b3f25-scripts" (OuterVolumeSpecName: "scripts") pod "ec1950dc-6caf-45f7-9b18-8c12db1b3f25" (UID: "ec1950dc-6caf-45f7-9b18-8c12db1b3f25"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.485928 4750 scope.go:117] "RemoveContainer" containerID="74257ca800c53f0c036fde4ead85c592a66b5ea7f444603ad7cbb66458ec4330" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.494042 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec1950dc-6caf-45f7-9b18-8c12db1b3f25-kube-api-access-c5r2g" (OuterVolumeSpecName: "kube-api-access-c5r2g") pod "ec1950dc-6caf-45f7-9b18-8c12db1b3f25" (UID: "ec1950dc-6caf-45f7-9b18-8c12db1b3f25"). InnerVolumeSpecName "kube-api-access-c5r2g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.513141 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f-kube-api-access-mbjzw" (OuterVolumeSpecName: "kube-api-access-mbjzw") pod "ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f" (UID: "ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f"). InnerVolumeSpecName "kube-api-access-mbjzw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.513480 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/899027b7-067b-4ce1-a8f1-deaee627aa51-combined-ca-bundle\") pod \"899027b7-067b-4ce1-a8f1-deaee627aa51\" (UID: \"899027b7-067b-4ce1-a8f1-deaee627aa51\") " Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.513528 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/899027b7-067b-4ce1-a8f1-deaee627aa51-logs\") pod \"899027b7-067b-4ce1-a8f1-deaee627aa51\" (UID: \"899027b7-067b-4ce1-a8f1-deaee627aa51\") " Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.513565 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/899027b7-067b-4ce1-a8f1-deaee627aa51-config-data\") pod \"899027b7-067b-4ce1-a8f1-deaee627aa51\" (UID: \"899027b7-067b-4ce1-a8f1-deaee627aa51\") " Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.513601 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/899027b7-067b-4ce1-a8f1-deaee627aa51-httpd-run\") pod \"899027b7-067b-4ce1-a8f1-deaee627aa51\" (UID: \"899027b7-067b-4ce1-a8f1-deaee627aa51\") " Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.513637 4750 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/899027b7-067b-4ce1-a8f1-deaee627aa51-internal-tls-certs\") pod \"899027b7-067b-4ce1-a8f1-deaee627aa51\" (UID: \"899027b7-067b-4ce1-a8f1-deaee627aa51\") " Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.513679 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/899027b7-067b-4ce1-a8f1-deaee627aa51-scripts\") pod \"899027b7-067b-4ce1-a8f1-deaee627aa51\" (UID: \"899027b7-067b-4ce1-a8f1-deaee627aa51\") " Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.513782 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxdmc\" (UniqueName: \"kubernetes.io/projected/899027b7-067b-4ce1-a8f1-deaee627aa51-kube-api-access-sxdmc\") pod \"899027b7-067b-4ce1-a8f1-deaee627aa51\" (UID: \"899027b7-067b-4ce1-a8f1-deaee627aa51\") " Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.513806 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"899027b7-067b-4ce1-a8f1-deaee627aa51\" (UID: \"899027b7-067b-4ce1-a8f1-deaee627aa51\") " Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.514966 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/899027b7-067b-4ce1-a8f1-deaee627aa51-logs" (OuterVolumeSpecName: "logs") pod "899027b7-067b-4ce1-a8f1-deaee627aa51" (UID: "899027b7-067b-4ce1-a8f1-deaee627aa51"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.515295 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/899027b7-067b-4ce1-a8f1-deaee627aa51-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "899027b7-067b-4ce1-a8f1-deaee627aa51" (UID: "899027b7-067b-4ce1-a8f1-deaee627aa51"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.521267 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder7a82-account-delete-rvqnc" podStartSLOduration=6.52125311 podStartE2EDuration="6.52125311s" podCreationTimestamp="2025-10-08 18:33:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:33:11.515118221 +0000 UTC m=+1347.428089234" watchObservedRunningTime="2025-10-08 18:33:11.52125311 +0000 UTC m=+1347.434224123" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.524674 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec1950dc-6caf-45f7-9b18-8c12db1b3f25-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.524722 4750 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.524733 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5r2g\" (UniqueName: \"kubernetes.io/projected/ec1950dc-6caf-45f7-9b18-8c12db1b3f25-kube-api-access-c5r2g\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.524743 4750 reconciler_common.go:293] "Volume detached for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/899027b7-067b-4ce1-a8f1-deaee627aa51-logs\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.524752 4750 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/899027b7-067b-4ce1-a8f1-deaee627aa51-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.524762 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbjzw\" (UniqueName: \"kubernetes.io/projected/ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f-kube-api-access-mbjzw\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.537665 4750 generic.go:334] "Generic (PLEG): container finished" podID="2aacea2e-e630-4280-8bed-b3b13b67f8ae" containerID="7167710fc93b29300ba5e867d0ed8d94f5a2083a6597592a2450ba7cc525c554" exitCode=0 Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.537779 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2aacea2e-e630-4280-8bed-b3b13b67f8ae","Type":"ContainerDied","Data":"7167710fc93b29300ba5e867d0ed8d94f5a2083a6597592a2450ba7cc525c554"} Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.581714 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "899027b7-067b-4ce1-a8f1-deaee627aa51" (UID: "899027b7-067b-4ce1-a8f1-deaee627aa51"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.584446 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/899027b7-067b-4ce1-a8f1-deaee627aa51-scripts" (OuterVolumeSpecName: "scripts") pod "899027b7-067b-4ce1-a8f1-deaee627aa51" (UID: "899027b7-067b-4ce1-a8f1-deaee627aa51"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.601997 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/899027b7-067b-4ce1-a8f1-deaee627aa51-kube-api-access-sxdmc" (OuterVolumeSpecName: "kube-api-access-sxdmc") pod "899027b7-067b-4ce1-a8f1-deaee627aa51" (UID: "899027b7-067b-4ce1-a8f1-deaee627aa51"). InnerVolumeSpecName "kube-api-access-sxdmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.610832 4750 generic.go:334] "Generic (PLEG): container finished" podID="899027b7-067b-4ce1-a8f1-deaee627aa51" containerID="3d510e42701bfd35cfca1c00a2144607ca13cee7addd057debd015b58c8b3227" exitCode=0 Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.610926 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"899027b7-067b-4ce1-a8f1-deaee627aa51","Type":"ContainerDied","Data":"3d510e42701bfd35cfca1c00a2144607ca13cee7addd057debd015b58c8b3227"} Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.611055 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.620728 4750 generic.go:334] "Generic (PLEG): container finished" podID="f1b5ad2e-1ee1-4955-99c2-8daed456b21c" containerID="e2474855cd1e381a74ca688d20c9d1a36e8ad10d4315f341b6812d9c8681f6e7" exitCode=0 Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.620780 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f1b5ad2e-1ee1-4955-99c2-8daed456b21c","Type":"ContainerDied","Data":"e2474855cd1e381a74ca688d20c9d1a36e8ad10d4315f341b6812d9c8681f6e7"} Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.620800 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f1b5ad2e-1ee1-4955-99c2-8daed456b21c","Type":"ContainerDied","Data":"ff15fd208575a05ef4944ca42fa9ef211d8a8fc43f95afbae8019c7bbc5ec593"} Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.620848 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.626839 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/899027b7-067b-4ce1-a8f1-deaee627aa51-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.626875 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxdmc\" (UniqueName: \"kubernetes.io/projected/899027b7-067b-4ce1-a8f1-deaee627aa51-kube-api-access-sxdmc\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.626914 4750 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.635812 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.636420 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-788b97745d-6snpn" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.636578 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-788b97745d-6snpn" event={"ID":"ec1950dc-6caf-45f7-9b18-8c12db1b3f25","Type":"ContainerDied","Data":"5d1d700aa26055d00df0ee9d9c502e69727e064e2128f09053507cb709ba3f90"} Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.636621 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance4c02-account-delete-z4mhj" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.808210 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2feb2439-d911-4585-a5e1-671abcfa357d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2feb2439-d911-4585-a5e1-671abcfa357d" (UID: "2feb2439-d911-4585-a5e1-671abcfa357d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.845494 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2feb2439-d911-4585-a5e1-671abcfa357d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.880181 4750 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.954992 4750 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.955950 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/899027b7-067b-4ce1-a8f1-deaee627aa51-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "899027b7-067b-4ce1-a8f1-deaee627aa51" (UID: "899027b7-067b-4ce1-a8f1-deaee627aa51"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.989682 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f" (UID: "ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f"). InnerVolumeSpecName "kube-state-metrics-tls-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:11 crc kubenswrapper[4750]: I1008 18:33:11.999590 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f" (UID: "ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.002934 4750 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.056990 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/899027b7-067b-4ce1-a8f1-deaee627aa51-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.057019 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.057029 4750 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.057040 4750 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.135784 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/2feb2439-d911-4585-a5e1-671abcfa357d-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "2feb2439-d911-4585-a5e1-671abcfa357d" (UID: "2feb2439-d911-4585-a5e1-671abcfa357d"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.160476 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec1950dc-6caf-45f7-9b18-8c12db1b3f25-config-data" (OuterVolumeSpecName: "config-data") pod "ec1950dc-6caf-45f7-9b18-8c12db1b3f25" (UID: "ec1950dc-6caf-45f7-9b18-8c12db1b3f25"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.160915 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec1950dc-6caf-45f7-9b18-8c12db1b3f25-config-data\") pod \"ec1950dc-6caf-45f7-9b18-8c12db1b3f25\" (UID: \"ec1950dc-6caf-45f7-9b18-8c12db1b3f25\") " Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.161591 4750 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2feb2439-d911-4585-a5e1-671abcfa357d-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:12 crc kubenswrapper[4750]: W1008 18:33:12.164324 4750 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/ec1950dc-6caf-45f7-9b18-8c12db1b3f25/volumes/kubernetes.io~secret/config-data Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.164399 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec1950dc-6caf-45f7-9b18-8c12db1b3f25-config-data" (OuterVolumeSpecName: "config-data") pod "ec1950dc-6caf-45f7-9b18-8c12db1b3f25" (UID: "ec1950dc-6caf-45f7-9b18-8c12db1b3f25"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.175790 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f1b5ad2e-1ee1-4955-99c2-8daed456b21c" (UID: "f1b5ad2e-1ee1-4955-99c2-8daed456b21c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.179381 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec1950dc-6caf-45f7-9b18-8c12db1b3f25-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec1950dc-6caf-45f7-9b18-8c12db1b3f25" (UID: "ec1950dc-6caf-45f7-9b18-8c12db1b3f25"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.222404 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f" (UID: "ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.244120 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/899027b7-067b-4ce1-a8f1-deaee627aa51-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "899027b7-067b-4ce1-a8f1-deaee627aa51" (UID: "899027b7-067b-4ce1-a8f1-deaee627aa51"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.245351 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f1b5ad2e-1ee1-4955-99c2-8daed456b21c" (UID: "f1b5ad2e-1ee1-4955-99c2-8daed456b21c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.257016 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-config-data" (OuterVolumeSpecName: "config-data") pod "f1b5ad2e-1ee1-4955-99c2-8daed456b21c" (UID: "f1b5ad2e-1ee1-4955-99c2-8daed456b21c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.266363 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec1950dc-6caf-45f7-9b18-8c12db1b3f25-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.266395 4750 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.266405 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.266414 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec1950dc-6caf-45f7-9b18-8c12db1b3f25-combined-ca-bundle\") on node \"crc\" DevicePath 
\"\"" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.266422 4750 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.266431 4750 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/899027b7-067b-4ce1-a8f1-deaee627aa51-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.266439 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.267137 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f1b5ad2e-1ee1-4955-99c2-8daed456b21c" (UID: "f1b5ad2e-1ee1-4955-99c2-8daed456b21c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.294757 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/899027b7-067b-4ce1-a8f1-deaee627aa51-config-data" (OuterVolumeSpecName: "config-data") pod "899027b7-067b-4ce1-a8f1-deaee627aa51" (UID: "899027b7-067b-4ce1-a8f1-deaee627aa51"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.300898 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec1950dc-6caf-45f7-9b18-8c12db1b3f25-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ec1950dc-6caf-45f7-9b18-8c12db1b3f25" (UID: "ec1950dc-6caf-45f7-9b18-8c12db1b3f25"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.303752 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec1950dc-6caf-45f7-9b18-8c12db1b3f25-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ec1950dc-6caf-45f7-9b18-8c12db1b3f25" (UID: "ec1950dc-6caf-45f7-9b18-8c12db1b3f25"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.368805 4750 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec1950dc-6caf-45f7-9b18-8c12db1b3f25-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.368842 4750 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1b5ad2e-1ee1-4955-99c2-8daed456b21c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.368852 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/899027b7-067b-4ce1-a8f1-deaee627aa51-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.368876 4750 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec1950dc-6caf-45f7-9b18-8c12db1b3f25-internal-tls-certs\") on node \"crc\" 
DevicePath \"\"" Oct 08 18:33:12 crc kubenswrapper[4750]: E1008 18:33:12.389664 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7b92d9e428bd59e74196a828e532527c2bee6127a6c6018b44f016e271d5efb0 is running failed: container process not found" containerID="7b92d9e428bd59e74196a828e532527c2bee6127a6c6018b44f016e271d5efb0" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 08 18:33:12 crc kubenswrapper[4750]: E1008 18:33:12.393690 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7b92d9e428bd59e74196a828e532527c2bee6127a6c6018b44f016e271d5efb0 is running failed: container process not found" containerID="7b92d9e428bd59e74196a828e532527c2bee6127a6c6018b44f016e271d5efb0" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 08 18:33:12 crc kubenswrapper[4750]: E1008 18:33:12.396117 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7b92d9e428bd59e74196a828e532527c2bee6127a6c6018b44f016e271d5efb0 is running failed: container process not found" containerID="7b92d9e428bd59e74196a828e532527c2bee6127a6c6018b44f016e271d5efb0" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 08 18:33:12 crc kubenswrapper[4750]: E1008 18:33:12.396172 4750 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7b92d9e428bd59e74196a828e532527c2bee6127a6c6018b44f016e271d5efb0 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="038b3881-b266-4878-b395-87d7bf986446" containerName="nova-cell0-conductor-conductor" Oct 08 18:33:12 crc kubenswrapper[4750]: E1008 18:33:12.512529 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: 
code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="238e771ab2028f72d92800deb7a9dbb1d35bb3e07296801734338f1c8ed278bd" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Oct 08 18:33:12 crc kubenswrapper[4750]: E1008 18:33:12.514720 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="238e771ab2028f72d92800deb7a9dbb1d35bb3e07296801734338f1c8ed278bd" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Oct 08 18:33:12 crc kubenswrapper[4750]: E1008 18:33:12.516018 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="238e771ab2028f72d92800deb7a9dbb1d35bb3e07296801734338f1c8ed278bd" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Oct 08 18:33:12 crc kubenswrapper[4750]: E1008 18:33:12.516047 4750 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="cedba103-c6cd-4e9a-9c7c-80d90aaedb3b" containerName="galera" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.649391 4750 generic.go:334] "Generic (PLEG): container finished" podID="fe7385d5-3c78-4238-96be-78392eddee4b" containerID="19ff8cd5ce4ea2ca5a48e87f8b09c6d1da6004603a55895cb9163299bb16a295" exitCode=0 Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.649453 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"fe7385d5-3c78-4238-96be-78392eddee4b","Type":"ContainerDied","Data":"19ff8cd5ce4ea2ca5a48e87f8b09c6d1da6004603a55895cb9163299bb16a295"} Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.649477 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe7385d5-3c78-4238-96be-78392eddee4b","Type":"ContainerDied","Data":"769d822cdc11463d5e21060d884a2f91d15ce758b94be8b5865d2a388641f9e6"} Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.649487 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="769d822cdc11463d5e21060d884a2f91d15ce758b94be8b5865d2a388641f9e6" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.652472 4750 generic.go:334] "Generic (PLEG): container finished" podID="9ecf0d73-0ca5-4124-93fb-348f8769c2e2" containerID="b88fc77d21b840724590332b8c3c0a46a81b3b88285cff5ea88ef4c54f55690c" exitCode=0 Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.652516 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-689cf77786-nkzv6" event={"ID":"9ecf0d73-0ca5-4124-93fb-348f8769c2e2","Type":"ContainerDied","Data":"b88fc77d21b840724590332b8c3c0a46a81b3b88285cff5ea88ef4c54f55690c"} Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.652535 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-689cf77786-nkzv6" event={"ID":"9ecf0d73-0ca5-4124-93fb-348f8769c2e2","Type":"ContainerDied","Data":"1b2e6a6e7dab77f2b8e9865e8ccf080c1670a2e60175da1776664031b1acd2b1"} Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.652543 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b2e6a6e7dab77f2b8e9865e8ccf080c1670a2e60175da1776664031b1acd2b1" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.655784 4750 generic.go:334] "Generic (PLEG): container finished" podID="e027e860-d0c0-4b1b-b02b-c374d92ae115" 
containerID="919a9750bdfbd9cb9a2ec586476afa3486a7c426aa6ae6693b2697ba44404b4a" exitCode=0 Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.655849 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e027e860-d0c0-4b1b-b02b-c374d92ae115","Type":"ContainerDied","Data":"919a9750bdfbd9cb9a2ec586476afa3486a7c426aa6ae6693b2697ba44404b4a"} Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.655870 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e027e860-d0c0-4b1b-b02b-c374d92ae115","Type":"ContainerDied","Data":"44ad1c9d1b67d5647706f1f61c9937d24a45b59055b837ad4020d8c43132aa71"} Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.655880 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44ad1c9d1b67d5647706f1f61c9937d24a45b59055b837ad4020d8c43132aa71" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.657692 4750 generic.go:334] "Generic (PLEG): container finished" podID="ed7b2661-13bd-4ab4-a92d-7bc382cd257e" containerID="c099682680f99d7d836a4647a357e3b0557699173c713817a786fb3c6c7a1d59" exitCode=0 Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.657799 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"ed7b2661-13bd-4ab4-a92d-7bc382cd257e","Type":"ContainerDied","Data":"c099682680f99d7d836a4647a357e3b0557699173c713817a786fb3c6c7a1d59"} Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.657828 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"ed7b2661-13bd-4ab4-a92d-7bc382cd257e","Type":"ContainerDied","Data":"126b5ab36bd9d3388c9838ad02869649f2369955b2465b5437361ef61c300edb"} Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.657839 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="126b5ab36bd9d3388c9838ad02869649f2369955b2465b5437361ef61c300edb" Oct 08 18:33:12 crc 
kubenswrapper[4750]: I1008 18:33:12.681888 4750 generic.go:334] "Generic (PLEG): container finished" podID="60379ea9-0750-4de0-9d3b-13af080eea8f" containerID="531a039dc690be8ef20b0b6f5062a698798a803816f610a232671935cec2a8cc" exitCode=0 Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.681926 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"60379ea9-0750-4de0-9d3b-13af080eea8f","Type":"ContainerDied","Data":"531a039dc690be8ef20b0b6f5062a698798a803816f610a232671935cec2a8cc"} Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.685243 4750 generic.go:334] "Generic (PLEG): container finished" podID="dba17973-3023-43ae-9b75-a8e1dc7f16cc" containerID="295b9725591cb856aaa6df5c90337486a2c9085eabdd2a6955b57c6bb2f78110" exitCode=0 Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.685323 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dba17973-3023-43ae-9b75-a8e1dc7f16cc","Type":"ContainerDied","Data":"295b9725591cb856aaa6df5c90337486a2c9085eabdd2a6955b57c6bb2f78110"} Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.685395 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dba17973-3023-43ae-9b75-a8e1dc7f16cc","Type":"ContainerDied","Data":"d6e654b894d5cded9abe8a9c678d1182bce2b85cc38e33fb9bcf87c625b94b36"} Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.685410 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6e654b894d5cded9abe8a9c678d1182bce2b85cc38e33fb9bcf87c625b94b36" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.688446 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"899027b7-067b-4ce1-a8f1-deaee627aa51","Type":"ContainerDied","Data":"a38418e1c7a283b12f176b73e0375e2e95ef5f7c49d26db91f928f456ea00d69"} Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.699166 4750 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement9f1c-account-delete-wx96m" event={"ID":"0c9b7a5e-9efd-45f8-86bf-730f55c077fd","Type":"ContainerStarted","Data":"1919a59ce533152efa0e8de4ea23ce87a69e201b70926676af382a45945309ca"} Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.699371 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement9f1c-account-delete-wx96m" podUID="0c9b7a5e-9efd-45f8-86bf-730f55c077fd" containerName="mariadb-account-delete" containerID="cri-o://1919a59ce533152efa0e8de4ea23ce87a69e201b70926676af382a45945309ca" gracePeriod=30 Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.705199 4750 generic.go:334] "Generic (PLEG): container finished" podID="28550569-4c3c-48cf-a621-eddec0919b51" containerID="1a151318fef849fd5d62d1d54d979f4d4b38f1111627d51da48a87b928387362" exitCode=0 Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.705267 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-587d5f7b59-ws4tc" event={"ID":"28550569-4c3c-48cf-a621-eddec0919b51","Type":"ContainerDied","Data":"1a151318fef849fd5d62d1d54d979f4d4b38f1111627d51da48a87b928387362"} Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.705331 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-587d5f7b59-ws4tc" event={"ID":"28550569-4c3c-48cf-a621-eddec0919b51","Type":"ContainerDied","Data":"308514aa8ce1ab4cdccf60c938d0902546a0771305aa9481bc5a2695b056de2f"} Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.705353 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="308514aa8ce1ab4cdccf60c938d0902546a0771305aa9481bc5a2695b056de2f" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.707596 4750 generic.go:334] "Generic (PLEG): container finished" podID="6e0647ee-68b0-4b8b-9bf0-066dc7a274d0" containerID="2d2ea6e9b74498814dcf6fb556ced530c433e9440f72f77d7890312774245a65" exitCode=0 Oct 08 
18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.707644 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-594d9fc688-28msd" event={"ID":"6e0647ee-68b0-4b8b-9bf0-066dc7a274d0","Type":"ContainerDied","Data":"2d2ea6e9b74498814dcf6fb556ced530c433e9440f72f77d7890312774245a65"} Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.707659 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-594d9fc688-28msd" event={"ID":"6e0647ee-68b0-4b8b-9bf0-066dc7a274d0","Type":"ContainerDied","Data":"117039dd02728f4fc08e14b3a5a43fb419d8337ae8cb024f43567ad225a8798b"} Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.707668 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="117039dd02728f4fc08e14b3a5a43fb419d8337ae8cb024f43567ad225a8798b" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.709924 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican5ee7-account-delete-nlrfw" event={"ID":"c751481b-5934-4262-84ff-106498a453e0","Type":"ContainerStarted","Data":"f1f08d4665e4ec13fe674259dfd46b54f449fab774a681cc30e1e5ab9bf6d093"} Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.710017 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican5ee7-account-delete-nlrfw" podUID="c751481b-5934-4262-84ff-106498a453e0" containerName="mariadb-account-delete" containerID="cri-o://f1f08d4665e4ec13fe674259dfd46b54f449fab774a681cc30e1e5ab9bf6d093" gracePeriod=30 Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.723502 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a25ebe44-c330-48f8-9df7-5f8517cd96bd","Type":"ContainerDied","Data":"b5d1eb65b8a7dbfa308c5f325abdf0c17ea06ea08ecbd28b4eab4670f53030dd"} Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.723537 4750 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="b5d1eb65b8a7dbfa308c5f325abdf0c17ea06ea08ecbd28b4eab4670f53030dd" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.726154 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2aacea2e-e630-4280-8bed-b3b13b67f8ae","Type":"ContainerDied","Data":"0a45b0ade489c99b0ed14a68dc0acee9a4911cb2935ce629816e714af5fe4f88"} Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.726175 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a45b0ade489c99b0ed14a68dc0acee9a4911cb2935ce629816e714af5fe4f88" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.727805 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement9f1c-account-delete-wx96m" podStartSLOduration=7.727781914 podStartE2EDuration="7.727781914s" podCreationTimestamp="2025-10-08 18:33:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:33:12.717284497 +0000 UTC m=+1348.630255530" watchObservedRunningTime="2025-10-08 18:33:12.727781914 +0000 UTC m=+1348.640752927" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.735951 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican5ee7-account-delete-nlrfw" podStartSLOduration=7.735932232 podStartE2EDuration="7.735932232s" podCreationTimestamp="2025-10-08 18:33:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:33:12.734608401 +0000 UTC m=+1348.647579434" watchObservedRunningTime="2025-10-08 18:33:12.735932232 +0000 UTC m=+1348.648903255" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.736467 4750 generic.go:334] "Generic (PLEG): container finished" podID="038b3881-b266-4878-b395-87d7bf986446" containerID="7b92d9e428bd59e74196a828e532527c2bee6127a6c6018b44f016e271d5efb0" exitCode=0 
Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.739382 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/novaapid8db-account-delete-7phjk" podUID="736a9e30-a3de-4e9f-9de7-52015e55443e" containerName="mariadb-account-delete" containerID="cri-o://dd3de844148f7944bc7fa8bfcc172b69d40730dfc54b697d389437312184ac0f" gracePeriod=30 Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.751677 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="235930b1-672c-4fc6-bbb4-78204c591aee" path="/var/lib/kubelet/pods/235930b1-672c-4fc6-bbb4-78204c591aee/volumes" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.756862 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="590d851e-4648-48db-b385-aaa732f5c787" path="/var/lib/kubelet/pods/590d851e-4648-48db-b385-aaa732f5c787/volumes" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.757656 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfa3814f-a0f4-4d53-9c08-44d7b45dd662" path="/var/lib/kubelet/pods/bfa3814f-a0f4-4d53-9c08-44d7b45dd662/volumes" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.758439 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c218b865-c7d1-4f46-ad6d-8e102b6af491" path="/var/lib/kubelet/pods/c218b865-c7d1-4f46-ad6d-8e102b6af491/volumes" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.758596 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/novaapid8db-account-delete-7phjk" podStartSLOduration=7.758539306 podStartE2EDuration="7.758539306s" podCreationTimestamp="2025-10-08 18:33:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 18:33:12.755804629 +0000 UTC m=+1348.668775672" watchObservedRunningTime="2025-10-08 18:33:12.758539306 +0000 UTC m=+1348.671510339" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 
18:33:12.764421 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4cbc20b-7898-4a47-99f6-80436897042c" path="/var/lib/kubelet/pods/c4cbc20b-7898-4a47-99f6-80436897042c/volumes" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.765152 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc808a1a-9703-4009-8d81-e555a8e25929" path="/var/lib/kubelet/pods/cc808a1a-9703-4009-8d81-e555a8e25929/volumes" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.781922 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"038b3881-b266-4878-b395-87d7bf986446","Type":"ContainerDied","Data":"7b92d9e428bd59e74196a828e532527c2bee6127a6c6018b44f016e271d5efb0"} Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.781968 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapid8db-account-delete-7phjk" event={"ID":"736a9e30-a3de-4e9f-9de7-52015e55443e","Type":"ContainerStarted","Data":"dd3de844148f7944bc7fa8bfcc172b69d40730dfc54b697d389437312184ac0f"} Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.807869 4750 scope.go:117] "RemoveContainer" containerID="3f09bf21e5c4d3c261d96fe4ae9e3aa7f61c0a08319b33de766658446000ec50" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.813100 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.885388 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a25ebe44-c330-48f8-9df7-5f8517cd96bd-public-tls-certs\") pod \"a25ebe44-c330-48f8-9df7-5f8517cd96bd\" (UID: \"a25ebe44-c330-48f8-9df7-5f8517cd96bd\") " Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.885640 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a25ebe44-c330-48f8-9df7-5f8517cd96bd-combined-ca-bundle\") pod \"a25ebe44-c330-48f8-9df7-5f8517cd96bd\" (UID: \"a25ebe44-c330-48f8-9df7-5f8517cd96bd\") " Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.885671 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"a25ebe44-c330-48f8-9df7-5f8517cd96bd\" (UID: \"a25ebe44-c330-48f8-9df7-5f8517cd96bd\") " Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.885695 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbrnf\" (UniqueName: \"kubernetes.io/projected/a25ebe44-c330-48f8-9df7-5f8517cd96bd-kube-api-access-vbrnf\") pod \"a25ebe44-c330-48f8-9df7-5f8517cd96bd\" (UID: \"a25ebe44-c330-48f8-9df7-5f8517cd96bd\") " Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.885716 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a25ebe44-c330-48f8-9df7-5f8517cd96bd-httpd-run\") pod \"a25ebe44-c330-48f8-9df7-5f8517cd96bd\" (UID: \"a25ebe44-c330-48f8-9df7-5f8517cd96bd\") " Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.885746 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a25ebe44-c330-48f8-9df7-5f8517cd96bd-logs\") pod \"a25ebe44-c330-48f8-9df7-5f8517cd96bd\" (UID: \"a25ebe44-c330-48f8-9df7-5f8517cd96bd\") " Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.885807 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a25ebe44-c330-48f8-9df7-5f8517cd96bd-scripts\") pod \"a25ebe44-c330-48f8-9df7-5f8517cd96bd\" (UID: \"a25ebe44-c330-48f8-9df7-5f8517cd96bd\") " Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.885830 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a25ebe44-c330-48f8-9df7-5f8517cd96bd-config-data\") pod \"a25ebe44-c330-48f8-9df7-5f8517cd96bd\" (UID: \"a25ebe44-c330-48f8-9df7-5f8517cd96bd\") " Oct 08 18:33:12 crc kubenswrapper[4750]: E1008 18:33:12.886116 4750 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 08 18:33:12 crc kubenswrapper[4750]: E1008 18:33:12.886166 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/43a52313-747b-40a7-a7e0-9e18f3c97c42-config-data podName:43a52313-747b-40a7-a7e0-9e18f3c97c42 nodeName:}" failed. No retries permitted until 2025-10-08 18:33:20.886153255 +0000 UTC m=+1356.799124268 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/43a52313-747b-40a7-a7e0-9e18f3c97c42-config-data") pod "rabbitmq-cell1-server-0" (UID: "43a52313-747b-40a7-a7e0-9e18f3c97c42") : configmap "rabbitmq-cell1-config-data" not found Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.888160 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a25ebe44-c330-48f8-9df7-5f8517cd96bd-logs" (OuterVolumeSpecName: "logs") pod "a25ebe44-c330-48f8-9df7-5f8517cd96bd" (UID: "a25ebe44-c330-48f8-9df7-5f8517cd96bd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.888384 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a25ebe44-c330-48f8-9df7-5f8517cd96bd-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a25ebe44-c330-48f8-9df7-5f8517cd96bd" (UID: "a25ebe44-c330-48f8-9df7-5f8517cd96bd"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.897354 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "a25ebe44-c330-48f8-9df7-5f8517cd96bd" (UID: "a25ebe44-c330-48f8-9df7-5f8517cd96bd"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.898214 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a25ebe44-c330-48f8-9df7-5f8517cd96bd-scripts" (OuterVolumeSpecName: "scripts") pod "a25ebe44-c330-48f8-9df7-5f8517cd96bd" (UID: "a25ebe44-c330-48f8-9df7-5f8517cd96bd"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.909010 4750 scope.go:117] "RemoveContainer" containerID="2108bb388209a941f9c3535c4f71606c4ec642e6c54810dace8a7237dd2d6c2e" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.924778 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.933733 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a25ebe44-c330-48f8-9df7-5f8517cd96bd-kube-api-access-vbrnf" (OuterVolumeSpecName: "kube-api-access-vbrnf") pod "a25ebe44-c330-48f8-9df7-5f8517cd96bd" (UID: "a25ebe44-c330-48f8-9df7-5f8517cd96bd"). InnerVolumeSpecName "kube-api-access-vbrnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.986710 4750 scope.go:117] "RemoveContainer" containerID="2108bb388209a941f9c3535c4f71606c4ec642e6c54810dace8a7237dd2d6c2e" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.988081 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aacea2e-e630-4280-8bed-b3b13b67f8ae-combined-ca-bundle\") pod \"2aacea2e-e630-4280-8bed-b3b13b67f8ae\" (UID: \"2aacea2e-e630-4280-8bed-b3b13b67f8ae\") " Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.988105 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2aacea2e-e630-4280-8bed-b3b13b67f8ae-config-data\") pod \"2aacea2e-e630-4280-8bed-b3b13b67f8ae\" (UID: \"2aacea2e-e630-4280-8bed-b3b13b67f8ae\") " Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.988160 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2aacea2e-e630-4280-8bed-b3b13b67f8ae-internal-tls-certs\") pod \"2aacea2e-e630-4280-8bed-b3b13b67f8ae\" (UID: \"2aacea2e-e630-4280-8bed-b3b13b67f8ae\") " Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.988187 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2aacea2e-e630-4280-8bed-b3b13b67f8ae-logs\") pod \"2aacea2e-e630-4280-8bed-b3b13b67f8ae\" (UID: \"2aacea2e-e630-4280-8bed-b3b13b67f8ae\") " Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.988216 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dp54q\" (UniqueName: \"kubernetes.io/projected/2aacea2e-e630-4280-8bed-b3b13b67f8ae-kube-api-access-dp54q\") pod \"2aacea2e-e630-4280-8bed-b3b13b67f8ae\" (UID: \"2aacea2e-e630-4280-8bed-b3b13b67f8ae\") " Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.988256 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2aacea2e-e630-4280-8bed-b3b13b67f8ae-public-tls-certs\") pod \"2aacea2e-e630-4280-8bed-b3b13b67f8ae\" (UID: \"2aacea2e-e630-4280-8bed-b3b13b67f8ae\") " Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.988485 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a25ebe44-c330-48f8-9df7-5f8517cd96bd-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.988507 4750 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.988517 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbrnf\" (UniqueName: \"kubernetes.io/projected/a25ebe44-c330-48f8-9df7-5f8517cd96bd-kube-api-access-vbrnf\") on node 
\"crc\" DevicePath \"\"" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.988526 4750 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a25ebe44-c330-48f8-9df7-5f8517cd96bd-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.988534 4750 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a25ebe44-c330-48f8-9df7-5f8517cd96bd-logs\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:12 crc kubenswrapper[4750]: I1008 18:33:12.999810 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2aacea2e-e630-4280-8bed-b3b13b67f8ae-logs" (OuterVolumeSpecName: "logs") pod "2aacea2e-e630-4280-8bed-b3b13b67f8ae" (UID: "2aacea2e-e630-4280-8bed-b3b13b67f8ae"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.009851 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a25ebe44-c330-48f8-9df7-5f8517cd96bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a25ebe44-c330-48f8-9df7-5f8517cd96bd" (UID: "a25ebe44-c330-48f8-9df7-5f8517cd96bd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.031275 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2aacea2e-e630-4280-8bed-b3b13b67f8ae-kube-api-access-dp54q" (OuterVolumeSpecName: "kube-api-access-dp54q") pod "2aacea2e-e630-4280-8bed-b3b13b67f8ae" (UID: "2aacea2e-e630-4280-8bed-b3b13b67f8ae"). InnerVolumeSpecName "kube-api-access-dp54q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: E1008 18:33:13.039708 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2108bb388209a941f9c3535c4f71606c4ec642e6c54810dace8a7237dd2d6c2e\": container with ID starting with 2108bb388209a941f9c3535c4f71606c4ec642e6c54810dace8a7237dd2d6c2e not found: ID does not exist" containerID="2108bb388209a941f9c3535c4f71606c4ec642e6c54810dace8a7237dd2d6c2e" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.039754 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2108bb388209a941f9c3535c4f71606c4ec642e6c54810dace8a7237dd2d6c2e"} err="failed to get container status \"2108bb388209a941f9c3535c4f71606c4ec642e6c54810dace8a7237dd2d6c2e\": rpc error: code = NotFound desc = could not find container \"2108bb388209a941f9c3535c4f71606c4ec642e6c54810dace8a7237dd2d6c2e\": container with ID starting with 2108bb388209a941f9c3535c4f71606c4ec642e6c54810dace8a7237dd2d6c2e not found: ID does not exist" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.039780 4750 scope.go:117] "RemoveContainer" containerID="3d510e42701bfd35cfca1c00a2144607ca13cee7addd057debd015b58c8b3227" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.043729 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.061664 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-587d5f7b59-ws4tc" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.076763 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.085515 4750 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.092017 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dp54q\" (UniqueName: \"kubernetes.io/projected/2aacea2e-e630-4280-8bed-b3b13b67f8ae-kube-api-access-dp54q\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.092045 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a25ebe44-c330-48f8-9df7-5f8517cd96bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.092057 4750 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.092067 4750 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2aacea2e-e630-4280-8bed-b3b13b67f8ae-logs\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.095474 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.098462 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.113370 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.117306 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.120158 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-594d9fc688-28msd" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.125290 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-689cf77786-nkzv6" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.139326 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a25ebe44-c330-48f8-9df7-5f8517cd96bd-config-data" (OuterVolumeSpecName: "config-data") pod "a25ebe44-c330-48f8-9df7-5f8517cd96bd" (UID: "a25ebe44-c330-48f8-9df7-5f8517cd96bd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.155877 4750 scope.go:117] "RemoveContainer" containerID="f9e9d1fec0d6d33d2df5f63b0549ef2ff909ed2e48e6665827f62dfd948e7ac9" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.172512 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.201541 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe7385d5-3c78-4238-96be-78392eddee4b-ceilometer-tls-certs\") pod \"fe7385d5-3c78-4238-96be-78392eddee4b\" (UID: \"fe7385d5-3c78-4238-96be-78392eddee4b\") " Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.201598 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e027e860-d0c0-4b1b-b02b-c374d92ae115-config-data\") pod \"e027e860-d0c0-4b1b-b02b-c374d92ae115\" (UID: \"e027e860-d0c0-4b1b-b02b-c374d92ae115\") " Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.201640 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe7385d5-3c78-4238-96be-78392eddee4b-sg-core-conf-yaml\") pod \"fe7385d5-3c78-4238-96be-78392eddee4b\" (UID: \"fe7385d5-3c78-4238-96be-78392eddee4b\") " Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.201672 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e027e860-d0c0-4b1b-b02b-c374d92ae115-logs\") pod \"e027e860-d0c0-4b1b-b02b-c374d92ae115\" (UID: \"e027e860-d0c0-4b1b-b02b-c374d92ae115\") " Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.201721 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed7b2661-13bd-4ab4-a92d-7bc382cd257e-memcached-tls-certs\") pod \"ed7b2661-13bd-4ab4-a92d-7bc382cd257e\" (UID: \"ed7b2661-13bd-4ab4-a92d-7bc382cd257e\") " Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.201778 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/fe7385d5-3c78-4238-96be-78392eddee4b-log-httpd\") pod \"fe7385d5-3c78-4238-96be-78392eddee4b\" (UID: \"fe7385d5-3c78-4238-96be-78392eddee4b\") " Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.201849 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28550569-4c3c-48cf-a621-eddec0919b51-config-data\") pod \"28550569-4c3c-48cf-a621-eddec0919b51\" (UID: \"28550569-4c3c-48cf-a621-eddec0919b51\") " Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.201866 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e027e860-d0c0-4b1b-b02b-c374d92ae115-combined-ca-bundle\") pod \"e027e860-d0c0-4b1b-b02b-c374d92ae115\" (UID: \"e027e860-d0c0-4b1b-b02b-c374d92ae115\") " Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.201889 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ed7b2661-13bd-4ab4-a92d-7bc382cd257e-config-data\") pod \"ed7b2661-13bd-4ab4-a92d-7bc382cd257e\" (UID: \"ed7b2661-13bd-4ab4-a92d-7bc382cd257e\") " Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.201906 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klvpg\" (UniqueName: \"kubernetes.io/projected/e027e860-d0c0-4b1b-b02b-c374d92ae115-kube-api-access-klvpg\") pod \"e027e860-d0c0-4b1b-b02b-c374d92ae115\" (UID: \"e027e860-d0c0-4b1b-b02b-c374d92ae115\") " Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.201951 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe7385d5-3c78-4238-96be-78392eddee4b-config-data\") pod \"fe7385d5-3c78-4238-96be-78392eddee4b\" (UID: \"fe7385d5-3c78-4238-96be-78392eddee4b\") " Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.202021 
4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe7385d5-3c78-4238-96be-78392eddee4b-scripts\") pod \"fe7385d5-3c78-4238-96be-78392eddee4b\" (UID: \"fe7385d5-3c78-4238-96be-78392eddee4b\") " Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.202053 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28550569-4c3c-48cf-a621-eddec0919b51-config-data-custom\") pod \"28550569-4c3c-48cf-a621-eddec0919b51\" (UID: \"28550569-4c3c-48cf-a621-eddec0919b51\") " Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.202089 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8b4s\" (UniqueName: \"kubernetes.io/projected/28550569-4c3c-48cf-a621-eddec0919b51-kube-api-access-l8b4s\") pod \"28550569-4c3c-48cf-a621-eddec0919b51\" (UID: \"28550569-4c3c-48cf-a621-eddec0919b51\") " Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.202113 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe7385d5-3c78-4238-96be-78392eddee4b-combined-ca-bundle\") pod \"fe7385d5-3c78-4238-96be-78392eddee4b\" (UID: \"fe7385d5-3c78-4238-96be-78392eddee4b\") " Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.202145 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e027e860-d0c0-4b1b-b02b-c374d92ae115-nova-metadata-tls-certs\") pod \"e027e860-d0c0-4b1b-b02b-c374d92ae115\" (UID: \"e027e860-d0c0-4b1b-b02b-c374d92ae115\") " Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.202438 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28550569-4c3c-48cf-a621-eddec0919b51-combined-ca-bundle\") pod 
\"28550569-4c3c-48cf-a621-eddec0919b51\" (UID: \"28550569-4c3c-48cf-a621-eddec0919b51\") " Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.202456 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28550569-4c3c-48cf-a621-eddec0919b51-logs\") pod \"28550569-4c3c-48cf-a621-eddec0919b51\" (UID: \"28550569-4c3c-48cf-a621-eddec0919b51\") " Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.202903 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe7385d5-3c78-4238-96be-78392eddee4b-run-httpd\") pod \"fe7385d5-3c78-4238-96be-78392eddee4b\" (UID: \"fe7385d5-3c78-4238-96be-78392eddee4b\") " Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.203199 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mfw4\" (UniqueName: \"kubernetes.io/projected/ed7b2661-13bd-4ab4-a92d-7bc382cd257e-kube-api-access-2mfw4\") pod \"ed7b2661-13bd-4ab4-a92d-7bc382cd257e\" (UID: \"ed7b2661-13bd-4ab4-a92d-7bc382cd257e\") " Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.203205 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.203345 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a25ebe44-c330-48f8-9df7-5f8517cd96bd-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a25ebe44-c330-48f8-9df7-5f8517cd96bd" (UID: "a25ebe44-c330-48f8-9df7-5f8517cd96bd"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.203395 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2aacea2e-e630-4280-8bed-b3b13b67f8ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2aacea2e-e630-4280-8bed-b3b13b67f8ae" (UID: "2aacea2e-e630-4280-8bed-b3b13b67f8ae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.203501 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed7b2661-13bd-4ab4-a92d-7bc382cd257e-combined-ca-bundle\") pod \"ed7b2661-13bd-4ab4-a92d-7bc382cd257e\" (UID: \"ed7b2661-13bd-4ab4-a92d-7bc382cd257e\") " Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.203541 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsnwh\" (UniqueName: \"kubernetes.io/projected/fe7385d5-3c78-4238-96be-78392eddee4b-kube-api-access-hsnwh\") pod \"fe7385d5-3c78-4238-96be-78392eddee4b\" (UID: \"fe7385d5-3c78-4238-96be-78392eddee4b\") " Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.203691 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ed7b2661-13bd-4ab4-a92d-7bc382cd257e-kolla-config\") pod \"ed7b2661-13bd-4ab4-a92d-7bc382cd257e\" (UID: \"ed7b2661-13bd-4ab4-a92d-7bc382cd257e\") " Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.205848 4750 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a25ebe44-c330-48f8-9df7-5f8517cd96bd-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.206312 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/2aacea2e-e630-4280-8bed-b3b13b67f8ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.206323 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a25ebe44-c330-48f8-9df7-5f8517cd96bd-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.209120 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed7b2661-13bd-4ab4-a92d-7bc382cd257e-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "ed7b2661-13bd-4ab4-a92d-7bc382cd257e" (UID: "ed7b2661-13bd-4ab4-a92d-7bc382cd257e"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.211882 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e027e860-d0c0-4b1b-b02b-c374d92ae115-logs" (OuterVolumeSpecName: "logs") pod "e027e860-d0c0-4b1b-b02b-c374d92ae115" (UID: "e027e860-d0c0-4b1b-b02b-c374d92ae115"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.212794 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.230709 4750 scope.go:117] "RemoveContainer" containerID="e2474855cd1e381a74ca688d20c9d1a36e8ad10d4315f341b6812d9c8681f6e7" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.230755 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28550569-4c3c-48cf-a621-eddec0919b51-logs" (OuterVolumeSpecName: "logs") pod "28550569-4c3c-48cf-a621-eddec0919b51" (UID: "28550569-4c3c-48cf-a621-eddec0919b51"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.231064 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe7385d5-3c78-4238-96be-78392eddee4b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fe7385d5-3c78-4238-96be-78392eddee4b" (UID: "fe7385d5-3c78-4238-96be-78392eddee4b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.231929 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe7385d5-3c78-4238-96be-78392eddee4b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fe7385d5-3c78-4238-96be-78392eddee4b" (UID: "fe7385d5-3c78-4238-96be-78392eddee4b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.234123 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe7385d5-3c78-4238-96be-78392eddee4b-scripts" (OuterVolumeSpecName: "scripts") pod "fe7385d5-3c78-4238-96be-78392eddee4b" (UID: "fe7385d5-3c78-4238-96be-78392eddee4b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.234188 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.234356 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed7b2661-13bd-4ab4-a92d-7bc382cd257e-config-data" (OuterVolumeSpecName: "config-data") pod "ed7b2661-13bd-4ab4-a92d-7bc382cd257e" (UID: "ed7b2661-13bd-4ab4-a92d-7bc382cd257e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.237163 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e027e860-d0c0-4b1b-b02b-c374d92ae115-kube-api-access-klvpg" (OuterVolumeSpecName: "kube-api-access-klvpg") pod "e027e860-d0c0-4b1b-b02b-c374d92ae115" (UID: "e027e860-d0c0-4b1b-b02b-c374d92ae115"). InnerVolumeSpecName "kube-api-access-klvpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.248058 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2aacea2e-e630-4280-8bed-b3b13b67f8ae-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2aacea2e-e630-4280-8bed-b3b13b67f8ae" (UID: "2aacea2e-e630-4280-8bed-b3b13b67f8ae"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.248224 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed7b2661-13bd-4ab4-a92d-7bc382cd257e-kube-api-access-2mfw4" (OuterVolumeSpecName: "kube-api-access-2mfw4") pod "ed7b2661-13bd-4ab4-a92d-7bc382cd257e" (UID: "ed7b2661-13bd-4ab4-a92d-7bc382cd257e"). InnerVolumeSpecName "kube-api-access-2mfw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.248385 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28550569-4c3c-48cf-a621-eddec0919b51-kube-api-access-l8b4s" (OuterVolumeSpecName: "kube-api-access-l8b4s") pod "28550569-4c3c-48cf-a621-eddec0919b51" (UID: "28550569-4c3c-48cf-a621-eddec0919b51"). InnerVolumeSpecName "kube-api-access-l8b4s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.249094 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28550569-4c3c-48cf-a621-eddec0919b51-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "28550569-4c3c-48cf-a621-eddec0919b51" (UID: "28550569-4c3c-48cf-a621-eddec0919b51"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.255758 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe7385d5-3c78-4238-96be-78392eddee4b-kube-api-access-hsnwh" (OuterVolumeSpecName: "kube-api-access-hsnwh") pod "fe7385d5-3c78-4238-96be-78392eddee4b" (UID: "fe7385d5-3c78-4238-96be-78392eddee4b"). InnerVolumeSpecName "kube-api-access-hsnwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.266831 4750 scope.go:117] "RemoveContainer" containerID="ea1294fdb6395353dc77235f2f999e3289f6e83d82e7aa71233fb0228a7f37d4" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.271895 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.280847 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2aacea2e-e630-4280-8bed-b3b13b67f8ae-config-data" (OuterVolumeSpecName: "config-data") pod "2aacea2e-e630-4280-8bed-b3b13b67f8ae" (UID: "2aacea2e-e630-4280-8bed-b3b13b67f8ae"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.301969 4750 scope.go:117] "RemoveContainer" containerID="e2474855cd1e381a74ca688d20c9d1a36e8ad10d4315f341b6812d9c8681f6e7" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.303564 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 18:33:13 crc kubenswrapper[4750]: E1008 18:33:13.302522 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2474855cd1e381a74ca688d20c9d1a36e8ad10d4315f341b6812d9c8681f6e7\": container with ID starting with e2474855cd1e381a74ca688d20c9d1a36e8ad10d4315f341b6812d9c8681f6e7 not found: ID does not exist" containerID="e2474855cd1e381a74ca688d20c9d1a36e8ad10d4315f341b6812d9c8681f6e7" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.308629 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2474855cd1e381a74ca688d20c9d1a36e8ad10d4315f341b6812d9c8681f6e7"} err="failed to get container status \"e2474855cd1e381a74ca688d20c9d1a36e8ad10d4315f341b6812d9c8681f6e7\": rpc error: code = NotFound desc = could not find container \"e2474855cd1e381a74ca688d20c9d1a36e8ad10d4315f341b6812d9c8681f6e7\": container with ID starting with e2474855cd1e381a74ca688d20c9d1a36e8ad10d4315f341b6812d9c8681f6e7 not found: ID does not exist" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.308733 4750 scope.go:117] "RemoveContainer" containerID="ea1294fdb6395353dc77235f2f999e3289f6e83d82e7aa71233fb0228a7f37d4" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.308490 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bvf6\" (UniqueName: \"kubernetes.io/projected/dba17973-3023-43ae-9b75-a8e1dc7f16cc-kube-api-access-5bvf6\") pod \"dba17973-3023-43ae-9b75-a8e1dc7f16cc\" (UID: \"dba17973-3023-43ae-9b75-a8e1dc7f16cc\") 
" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.309059 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e0647ee-68b0-4b8b-9bf0-066dc7a274d0-combined-ca-bundle\") pod \"6e0647ee-68b0-4b8b-9bf0-066dc7a274d0\" (UID: \"6e0647ee-68b0-4b8b-9bf0-066dc7a274d0\") " Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.309158 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ecf0d73-0ca5-4124-93fb-348f8769c2e2-logs\") pod \"9ecf0d73-0ca5-4124-93fb-348f8769c2e2\" (UID: \"9ecf0d73-0ca5-4124-93fb-348f8769c2e2\") " Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.309310 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60379ea9-0750-4de0-9d3b-13af080eea8f-combined-ca-bundle\") pod \"60379ea9-0750-4de0-9d3b-13af080eea8f\" (UID: \"60379ea9-0750-4de0-9d3b-13af080eea8f\") " Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.309594 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ecf0d73-0ca5-4124-93fb-348f8769c2e2-combined-ca-bundle\") pod \"9ecf0d73-0ca5-4124-93fb-348f8769c2e2\" (UID: \"9ecf0d73-0ca5-4124-93fb-348f8769c2e2\") " Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.309692 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxv7n\" (UniqueName: \"kubernetes.io/projected/6e0647ee-68b0-4b8b-9bf0-066dc7a274d0-kube-api-access-hxv7n\") pod \"6e0647ee-68b0-4b8b-9bf0-066dc7a274d0\" (UID: \"6e0647ee-68b0-4b8b-9bf0-066dc7a274d0\") " Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.309900 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6e0647ee-68b0-4b8b-9bf0-066dc7a274d0-config-data\") pod \"6e0647ee-68b0-4b8b-9bf0-066dc7a274d0\" (UID: \"6e0647ee-68b0-4b8b-9bf0-066dc7a274d0\") " Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.310022 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60379ea9-0750-4de0-9d3b-13af080eea8f-etc-machine-id\") pod \"60379ea9-0750-4de0-9d3b-13af080eea8f\" (UID: \"60379ea9-0750-4de0-9d3b-13af080eea8f\") " Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.310154 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60379ea9-0750-4de0-9d3b-13af080eea8f-config-data-custom\") pod \"60379ea9-0750-4de0-9d3b-13af080eea8f\" (UID: \"60379ea9-0750-4de0-9d3b-13af080eea8f\") " Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.310343 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60379ea9-0750-4de0-9d3b-13af080eea8f-scripts\") pod \"60379ea9-0750-4de0-9d3b-13af080eea8f\" (UID: \"60379ea9-0750-4de0-9d3b-13af080eea8f\") " Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.310482 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7hzp\" (UniqueName: \"kubernetes.io/projected/60379ea9-0750-4de0-9d3b-13af080eea8f-kube-api-access-l7hzp\") pod \"60379ea9-0750-4de0-9d3b-13af080eea8f\" (UID: \"60379ea9-0750-4de0-9d3b-13af080eea8f\") " Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.310611 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9ecf0d73-0ca5-4124-93fb-348f8769c2e2-config-data-custom\") pod \"9ecf0d73-0ca5-4124-93fb-348f8769c2e2\" (UID: \"9ecf0d73-0ca5-4124-93fb-348f8769c2e2\") " Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 
18:33:13.310763 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e0647ee-68b0-4b8b-9bf0-066dc7a274d0-config-data-custom\") pod \"6e0647ee-68b0-4b8b-9bf0-066dc7a274d0\" (UID: \"6e0647ee-68b0-4b8b-9bf0-066dc7a274d0\") " Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.311108 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/038b3881-b266-4878-b395-87d7bf986446-combined-ca-bundle\") pod \"038b3881-b266-4878-b395-87d7bf986446\" (UID: \"038b3881-b266-4878-b395-87d7bf986446\") " Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.311236 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ecf0d73-0ca5-4124-93fb-348f8769c2e2-config-data\") pod \"9ecf0d73-0ca5-4124-93fb-348f8769c2e2\" (UID: \"9ecf0d73-0ca5-4124-93fb-348f8769c2e2\") " Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.311336 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nh8n\" (UniqueName: \"kubernetes.io/projected/038b3881-b266-4878-b395-87d7bf986446-kube-api-access-8nh8n\") pod \"038b3881-b266-4878-b395-87d7bf986446\" (UID: \"038b3881-b266-4878-b395-87d7bf986446\") " Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.311429 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dba17973-3023-43ae-9b75-a8e1dc7f16cc-combined-ca-bundle\") pod \"dba17973-3023-43ae-9b75-a8e1dc7f16cc\" (UID: \"dba17973-3023-43ae-9b75-a8e1dc7f16cc\") " Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.311533 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e0647ee-68b0-4b8b-9bf0-066dc7a274d0-logs\") pod 
\"6e0647ee-68b0-4b8b-9bf0-066dc7a274d0\" (UID: \"6e0647ee-68b0-4b8b-9bf0-066dc7a274d0\") " Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.311671 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvgx9\" (UniqueName: \"kubernetes.io/projected/9ecf0d73-0ca5-4124-93fb-348f8769c2e2-kube-api-access-tvgx9\") pod \"9ecf0d73-0ca5-4124-93fb-348f8769c2e2\" (UID: \"9ecf0d73-0ca5-4124-93fb-348f8769c2e2\") " Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.311823 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e0647ee-68b0-4b8b-9bf0-066dc7a274d0-public-tls-certs\") pod \"6e0647ee-68b0-4b8b-9bf0-066dc7a274d0\" (UID: \"6e0647ee-68b0-4b8b-9bf0-066dc7a274d0\") " Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.311981 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e0647ee-68b0-4b8b-9bf0-066dc7a274d0-internal-tls-certs\") pod \"6e0647ee-68b0-4b8b-9bf0-066dc7a274d0\" (UID: \"6e0647ee-68b0-4b8b-9bf0-066dc7a274d0\") " Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.312535 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dba17973-3023-43ae-9b75-a8e1dc7f16cc-config-data\") pod \"dba17973-3023-43ae-9b75-a8e1dc7f16cc\" (UID: \"dba17973-3023-43ae-9b75-a8e1dc7f16cc\") " Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.312691 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/038b3881-b266-4878-b395-87d7bf986446-config-data\") pod \"038b3881-b266-4878-b395-87d7bf986446\" (UID: \"038b3881-b266-4878-b395-87d7bf986446\") " Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.312847 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60379ea9-0750-4de0-9d3b-13af080eea8f-config-data\") pod \"60379ea9-0750-4de0-9d3b-13af080eea8f\" (UID: \"60379ea9-0750-4de0-9d3b-13af080eea8f\") " Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.328610 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ecf0d73-0ca5-4124-93fb-348f8769c2e2-logs" (OuterVolumeSpecName: "logs") pod "9ecf0d73-0ca5-4124-93fb-348f8769c2e2" (UID: "9ecf0d73-0ca5-4124-93fb-348f8769c2e2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.331032 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e0647ee-68b0-4b8b-9bf0-066dc7a274d0-logs" (OuterVolumeSpecName: "logs") pod "6e0647ee-68b0-4b8b-9bf0-066dc7a274d0" (UID: "6e0647ee-68b0-4b8b-9bf0-066dc7a274d0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.340282 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2aacea2e-e630-4280-8bed-b3b13b67f8ae-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2aacea2e-e630-4280-8bed-b3b13b67f8ae" (UID: "2aacea2e-e630-4280-8bed-b3b13b67f8ae"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.346190 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60379ea9-0750-4de0-9d3b-13af080eea8f-scripts" (OuterVolumeSpecName: "scripts") pod "60379ea9-0750-4de0-9d3b-13af080eea8f" (UID: "60379ea9-0750-4de0-9d3b-13af080eea8f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.346306 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dba17973-3023-43ae-9b75-a8e1dc7f16cc-kube-api-access-5bvf6" (OuterVolumeSpecName: "kube-api-access-5bvf6") pod "dba17973-3023-43ae-9b75-a8e1dc7f16cc" (UID: "dba17973-3023-43ae-9b75-a8e1dc7f16cc"). InnerVolumeSpecName "kube-api-access-5bvf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.347400 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/60379ea9-0750-4de0-9d3b-13af080eea8f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "60379ea9-0750-4de0-9d3b-13af080eea8f" (UID: "60379ea9-0750-4de0-9d3b-13af080eea8f"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: E1008 18:33:13.348252 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea1294fdb6395353dc77235f2f999e3289f6e83d82e7aa71233fb0228a7f37d4\": container with ID starting with ea1294fdb6395353dc77235f2f999e3289f6e83d82e7aa71233fb0228a7f37d4 not found: ID does not exist" containerID="ea1294fdb6395353dc77235f2f999e3289f6e83d82e7aa71233fb0228a7f37d4" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.348294 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea1294fdb6395353dc77235f2f999e3289f6e83d82e7aa71233fb0228a7f37d4"} err="failed to get container status \"ea1294fdb6395353dc77235f2f999e3289f6e83d82e7aa71233fb0228a7f37d4\": rpc error: code = NotFound desc = could not find container \"ea1294fdb6395353dc77235f2f999e3289f6e83d82e7aa71233fb0228a7f37d4\": container with ID starting with ea1294fdb6395353dc77235f2f999e3289f6e83d82e7aa71233fb0228a7f37d4 not found: ID 
does not exist" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.348324 4750 scope.go:117] "RemoveContainer" containerID="f46e6c1f5784d1ac90a72cce013512954360b873208916edc44bc5574486392e" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.368916 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/038b3881-b266-4878-b395-87d7bf986446-kube-api-access-8nh8n" (OuterVolumeSpecName: "kube-api-access-8nh8n") pod "038b3881-b266-4878-b395-87d7bf986446" (UID: "038b3881-b266-4878-b395-87d7bf986446"). InnerVolumeSpecName "kube-api-access-8nh8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.373867 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-788b97745d-6snpn"] Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.375275 4750 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ed7b2661-13bd-4ab4-a92d-7bc382cd257e-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.375300 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bvf6\" (UniqueName: \"kubernetes.io/projected/dba17973-3023-43ae-9b75-a8e1dc7f16cc-kube-api-access-5bvf6\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.375310 4750 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ecf0d73-0ca5-4124-93fb-348f8769c2e2-logs\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.375331 4750 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e027e860-d0c0-4b1b-b02b-c374d92ae115-logs\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.375340 4750 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/fe7385d5-3c78-4238-96be-78392eddee4b-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.375349 4750 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2aacea2e-e630-4280-8bed-b3b13b67f8ae-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.375358 4750 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60379ea9-0750-4de0-9d3b-13af080eea8f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.375372 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ed7b2661-13bd-4ab4-a92d-7bc382cd257e-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.375380 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klvpg\" (UniqueName: \"kubernetes.io/projected/e027e860-d0c0-4b1b-b02b-c374d92ae115-kube-api-access-klvpg\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.375389 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60379ea9-0750-4de0-9d3b-13af080eea8f-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.375397 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe7385d5-3c78-4238-96be-78392eddee4b-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.375408 4750 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2aacea2e-e630-4280-8bed-b3b13b67f8ae-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc 
kubenswrapper[4750]: I1008 18:33:13.375416 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nh8n\" (UniqueName: \"kubernetes.io/projected/038b3881-b266-4878-b395-87d7bf986446-kube-api-access-8nh8n\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.375425 4750 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28550569-4c3c-48cf-a621-eddec0919b51-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.375434 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8b4s\" (UniqueName: \"kubernetes.io/projected/28550569-4c3c-48cf-a621-eddec0919b51-kube-api-access-l8b4s\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.375445 4750 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e0647ee-68b0-4b8b-9bf0-066dc7a274d0-logs\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.375452 4750 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28550569-4c3c-48cf-a621-eddec0919b51-logs\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.375461 4750 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe7385d5-3c78-4238-96be-78392eddee4b-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.375469 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mfw4\" (UniqueName: \"kubernetes.io/projected/ed7b2661-13bd-4ab4-a92d-7bc382cd257e-kube-api-access-2mfw4\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.375481 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/2aacea2e-e630-4280-8bed-b3b13b67f8ae-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.375490 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsnwh\" (UniqueName: \"kubernetes.io/projected/fe7385d5-3c78-4238-96be-78392eddee4b-kube-api-access-hsnwh\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.385336 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ecf0d73-0ca5-4124-93fb-348f8769c2e2-kube-api-access-tvgx9" (OuterVolumeSpecName: "kube-api-access-tvgx9") pod "9ecf0d73-0ca5-4124-93fb-348f8769c2e2" (UID: "9ecf0d73-0ca5-4124-93fb-348f8769c2e2"). InnerVolumeSpecName "kube-api-access-tvgx9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.396958 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-788b97745d-6snpn"] Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.398409 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ecf0d73-0ca5-4124-93fb-348f8769c2e2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9ecf0d73-0ca5-4124-93fb-348f8769c2e2" (UID: "9ecf0d73-0ca5-4124-93fb-348f8769c2e2"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.398446 4750 scope.go:117] "RemoveContainer" containerID="945094b64faf05a196a97d662a4c3e8b64a2ebe7a2311be55ef9549fcca90849" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.418787 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60379ea9-0750-4de0-9d3b-13af080eea8f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "60379ea9-0750-4de0-9d3b-13af080eea8f" (UID: "60379ea9-0750-4de0-9d3b-13af080eea8f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.453411 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60379ea9-0750-4de0-9d3b-13af080eea8f-kube-api-access-l7hzp" (OuterVolumeSpecName: "kube-api-access-l7hzp") pod "60379ea9-0750-4de0-9d3b-13af080eea8f" (UID: "60379ea9-0750-4de0-9d3b-13af080eea8f"). InnerVolumeSpecName "kube-api-access-l7hzp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.455695 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e0647ee-68b0-4b8b-9bf0-066dc7a274d0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6e0647ee-68b0-4b8b-9bf0-066dc7a274d0" (UID: "6e0647ee-68b0-4b8b-9bf0-066dc7a274d0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.455715 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e0647ee-68b0-4b8b-9bf0-066dc7a274d0-kube-api-access-hxv7n" (OuterVolumeSpecName: "kube-api-access-hxv7n") pod "6e0647ee-68b0-4b8b-9bf0-066dc7a274d0" (UID: "6e0647ee-68b0-4b8b-9bf0-066dc7a274d0"). InnerVolumeSpecName "kube-api-access-hxv7n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.455725 4750 scope.go:117] "RemoveContainer" containerID="3d510e42701bfd35cfca1c00a2144607ca13cee7addd057debd015b58c8b3227" Oct 08 18:33:13 crc kubenswrapper[4750]: E1008 18:33:13.458028 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d510e42701bfd35cfca1c00a2144607ca13cee7addd057debd015b58c8b3227\": container with ID starting with 3d510e42701bfd35cfca1c00a2144607ca13cee7addd057debd015b58c8b3227 not found: ID does not exist" containerID="3d510e42701bfd35cfca1c00a2144607ca13cee7addd057debd015b58c8b3227" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.458064 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d510e42701bfd35cfca1c00a2144607ca13cee7addd057debd015b58c8b3227"} err="failed to get container status \"3d510e42701bfd35cfca1c00a2144607ca13cee7addd057debd015b58c8b3227\": rpc error: code = NotFound desc = could not find container \"3d510e42701bfd35cfca1c00a2144607ca13cee7addd057debd015b58c8b3227\": container with ID starting with 3d510e42701bfd35cfca1c00a2144607ca13cee7addd057debd015b58c8b3227 not found: ID does not exist" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.458087 4750 scope.go:117] "RemoveContainer" containerID="f9e9d1fec0d6d33d2df5f63b0549ef2ff909ed2e48e6665827f62dfd948e7ac9" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.458172 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance4c02-account-delete-z4mhj"] Oct 08 18:33:13 crc kubenswrapper[4750]: E1008 18:33:13.458467 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9e9d1fec0d6d33d2df5f63b0549ef2ff909ed2e48e6665827f62dfd948e7ac9\": container with ID starting with f9e9d1fec0d6d33d2df5f63b0549ef2ff909ed2e48e6665827f62dfd948e7ac9 not 
found: ID does not exist" containerID="f9e9d1fec0d6d33d2df5f63b0549ef2ff909ed2e48e6665827f62dfd948e7ac9" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.458485 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9e9d1fec0d6d33d2df5f63b0549ef2ff909ed2e48e6665827f62dfd948e7ac9"} err="failed to get container status \"f9e9d1fec0d6d33d2df5f63b0549ef2ff909ed2e48e6665827f62dfd948e7ac9\": rpc error: code = NotFound desc = could not find container \"f9e9d1fec0d6d33d2df5f63b0549ef2ff909ed2e48e6665827f62dfd948e7ac9\": container with ID starting with f9e9d1fec0d6d33d2df5f63b0549ef2ff909ed2e48e6665827f62dfd948e7ac9 not found: ID does not exist" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.468044 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance4c02-account-delete-z4mhj"] Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.472836 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e027e860-d0c0-4b1b-b02b-c374d92ae115-config-data" (OuterVolumeSpecName: "config-data") pod "e027e860-d0c0-4b1b-b02b-c374d92ae115" (UID: "e027e860-d0c0-4b1b-b02b-c374d92ae115"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.482932 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxv7n\" (UniqueName: \"kubernetes.io/projected/6e0647ee-68b0-4b8b-9bf0-066dc7a274d0-kube-api-access-hxv7n\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.483047 4750 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60379ea9-0750-4de0-9d3b-13af080eea8f-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.483086 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7hzp\" (UniqueName: \"kubernetes.io/projected/60379ea9-0750-4de0-9d3b-13af080eea8f-kube-api-access-l7hzp\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.483099 4750 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9ecf0d73-0ca5-4124-93fb-348f8769c2e2-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.483111 4750 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e0647ee-68b0-4b8b-9bf0-066dc7a274d0-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.483346 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvgx9\" (UniqueName: \"kubernetes.io/projected/9ecf0d73-0ca5-4124-93fb-348f8769c2e2-kube-api-access-tvgx9\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.483362 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e027e860-d0c0-4b1b-b02b-c374d92ae115-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc 
kubenswrapper[4750]: E1008 18:33:13.483461 4750 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 08 18:33:13 crc kubenswrapper[4750]: E1008 18:33:13.483538 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5b8108eb-834c-44bd-9f39-70c348388ab6-config-data podName:5b8108eb-834c-44bd-9f39-70c348388ab6 nodeName:}" failed. No retries permitted until 2025-10-08 18:33:21.483518977 +0000 UTC m=+1357.396489990 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/5b8108eb-834c-44bd-9f39-70c348388ab6-config-data") pod "rabbitmq-server-0" (UID: "5b8108eb-834c-44bd-9f39-70c348388ab6") : configmap "rabbitmq-config-data" not found Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.518154 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.523413 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.569178 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e027e860-d0c0-4b1b-b02b-c374d92ae115-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e027e860-d0c0-4b1b-b02b-c374d92ae115" (UID: "e027e860-d0c0-4b1b-b02b-c374d92ae115"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.584474 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e027e860-d0c0-4b1b-b02b-c374d92ae115-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.598943 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dba17973-3023-43ae-9b75-a8e1dc7f16cc-config-data" (OuterVolumeSpecName: "config-data") pod "dba17973-3023-43ae-9b75-a8e1dc7f16cc" (UID: "dba17973-3023-43ae-9b75-a8e1dc7f16cc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.600445 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ecf0d73-0ca5-4124-93fb-348f8769c2e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ecf0d73-0ca5-4124-93fb-348f8769c2e2" (UID: "9ecf0d73-0ca5-4124-93fb-348f8769c2e2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.604458 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed7b2661-13bd-4ab4-a92d-7bc382cd257e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed7b2661-13bd-4ab4-a92d-7bc382cd257e" (UID: "ed7b2661-13bd-4ab4-a92d-7bc382cd257e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.617074 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe7385d5-3c78-4238-96be-78392eddee4b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fe7385d5-3c78-4238-96be-78392eddee4b" (UID: "fe7385d5-3c78-4238-96be-78392eddee4b"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.653111 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28550569-4c3c-48cf-a621-eddec0919b51-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "28550569-4c3c-48cf-a621-eddec0919b51" (UID: "28550569-4c3c-48cf-a621-eddec0919b51"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.665723 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e0647ee-68b0-4b8b-9bf0-066dc7a274d0-config-data" (OuterVolumeSpecName: "config-data") pod "6e0647ee-68b0-4b8b-9bf0-066dc7a274d0" (UID: "6e0647ee-68b0-4b8b-9bf0-066dc7a274d0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.672616 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/038b3881-b266-4878-b395-87d7bf986446-config-data" (OuterVolumeSpecName: "config-data") pod "038b3881-b266-4878-b395-87d7bf986446" (UID: "038b3881-b266-4878-b395-87d7bf986446"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.674295 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e0647ee-68b0-4b8b-9bf0-066dc7a274d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e0647ee-68b0-4b8b-9bf0-066dc7a274d0" (UID: "6e0647ee-68b0-4b8b-9bf0-066dc7a274d0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.674354 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/038b3881-b266-4878-b395-87d7bf986446-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "038b3881-b266-4878-b395-87d7bf986446" (UID: "038b3881-b266-4878-b395-87d7bf986446"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.675715 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e027e860-d0c0-4b1b-b02b-c374d92ae115-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "e027e860-d0c0-4b1b-b02b-c374d92ae115" (UID: "e027e860-d0c0-4b1b-b02b-c374d92ae115"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.686340 4750 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe7385d5-3c78-4238-96be-78392eddee4b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.686367 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e0647ee-68b0-4b8b-9bf0-066dc7a274d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.686376 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ecf0d73-0ca5-4124-93fb-348f8769c2e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.686384 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6e0647ee-68b0-4b8b-9bf0-066dc7a274d0-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.686843 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/038b3881-b266-4878-b395-87d7bf986446-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.686853 4750 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e027e860-d0c0-4b1b-b02b-c374d92ae115-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.686992 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28550569-4c3c-48cf-a621-eddec0919b51-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.687003 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dba17973-3023-43ae-9b75-a8e1dc7f16cc-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.687011 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/038b3881-b266-4878-b395-87d7bf986446-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.687020 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed7b2661-13bd-4ab4-a92d-7bc382cd257e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.692435 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ecf0d73-0ca5-4124-93fb-348f8769c2e2-config-data" (OuterVolumeSpecName: "config-data") pod 
"9ecf0d73-0ca5-4124-93fb-348f8769c2e2" (UID: "9ecf0d73-0ca5-4124-93fb-348f8769c2e2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.693682 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed7b2661-13bd-4ab4-a92d-7bc382cd257e-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "ed7b2661-13bd-4ab4-a92d-7bc382cd257e" (UID: "ed7b2661-13bd-4ab4-a92d-7bc382cd257e"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.709216 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e0647ee-68b0-4b8b-9bf0-066dc7a274d0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6e0647ee-68b0-4b8b-9bf0-066dc7a274d0" (UID: "6e0647ee-68b0-4b8b-9bf0-066dc7a274d0"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.732671 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60379ea9-0750-4de0-9d3b-13af080eea8f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60379ea9-0750-4de0-9d3b-13af080eea8f" (UID: "60379ea9-0750-4de0-9d3b-13af080eea8f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.732736 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dba17973-3023-43ae-9b75-a8e1dc7f16cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dba17973-3023-43ae-9b75-a8e1dc7f16cc" (UID: "dba17973-3023-43ae-9b75-a8e1dc7f16cc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.737965 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e0647ee-68b0-4b8b-9bf0-066dc7a274d0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6e0647ee-68b0-4b8b-9bf0-066dc7a274d0" (UID: "6e0647ee-68b0-4b8b-9bf0-066dc7a274d0"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.745771 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe7385d5-3c78-4238-96be-78392eddee4b-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "fe7385d5-3c78-4238-96be-78392eddee4b" (UID: "fe7385d5-3c78-4238-96be-78392eddee4b"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.754522 4750 generic.go:334] "Generic (PLEG): container finished" podID="cedba103-c6cd-4e9a-9c7c-80d90aaedb3b" containerID="238e771ab2028f72d92800deb7a9dbb1d35bb3e07296801734338f1c8ed278bd" exitCode=0 Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.754617 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b","Type":"ContainerDied","Data":"238e771ab2028f72d92800deb7a9dbb1d35bb3e07296801734338f1c8ed278bd"} Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.755936 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60379ea9-0750-4de0-9d3b-13af080eea8f-config-data" (OuterVolumeSpecName: "config-data") pod "60379ea9-0750-4de0-9d3b-13af080eea8f" (UID: "60379ea9-0750-4de0-9d3b-13af080eea8f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.757014 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"60379ea9-0750-4de0-9d3b-13af080eea8f","Type":"ContainerDied","Data":"9206ef57f28e2ee895a8e287047555fefbdb8ec79245d11843edf6b4acf69881"} Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.757067 4750 scope.go:117] "RemoveContainer" containerID="be50775e9b6d25d8067aa8220a05db2eb7a3fed17a5981e8b008927ca24a65a9" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.757241 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.758219 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28550569-4c3c-48cf-a621-eddec0919b51-config-data" (OuterVolumeSpecName: "config-data") pod "28550569-4c3c-48cf-a621-eddec0919b51" (UID: "28550569-4c3c-48cf-a621-eddec0919b51"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.767199 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"038b3881-b266-4878-b395-87d7bf986446","Type":"ContainerDied","Data":"58b8db3b53f1a400f73a597c0554b6ffbc1c1d9f9071d20e55d07f52500acfd4"} Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.767298 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.776413 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_230c02f8-af60-40d6-af19-adf730eec43f/ovn-northd/0.log" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.776455 4750 generic.go:334] "Generic (PLEG): container finished" podID="230c02f8-af60-40d6-af19-adf730eec43f" containerID="c92dbcfbb2d09ff0c337921ca5e5f0d269b51741b6e9f5407d3bb68858e0b7bc" exitCode=139 Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.776499 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"230c02f8-af60-40d6-af19-adf730eec43f","Type":"ContainerDied","Data":"c92dbcfbb2d09ff0c337921ca5e5f0d269b51741b6e9f5407d3bb68858e0b7bc"} Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.777449 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.777622 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.777645 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-594d9fc688-28msd" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.777765 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-689cf77786-nkzv6" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.777773 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.777786 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-587d5f7b59-ws4tc" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.777792 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.777823 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.777840 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.789587 4750 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e0647ee-68b0-4b8b-9bf0-066dc7a274d0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.789620 4750 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e0647ee-68b0-4b8b-9bf0-066dc7a274d0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.789632 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60379ea9-0750-4de0-9d3b-13af080eea8f-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.789644 4750 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe7385d5-3c78-4238-96be-78392eddee4b-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.789656 4750 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed7b2661-13bd-4ab4-a92d-7bc382cd257e-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc 
kubenswrapper[4750]: I1008 18:33:13.789667 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60379ea9-0750-4de0-9d3b-13af080eea8f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.789678 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28550569-4c3c-48cf-a621-eddec0919b51-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.791658 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ecf0d73-0ca5-4124-93fb-348f8769c2e2-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.791711 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dba17973-3023-43ae-9b75-a8e1dc7f16cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.833070 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe7385d5-3c78-4238-96be-78392eddee4b-config-data" (OuterVolumeSpecName: "config-data") pod "fe7385d5-3c78-4238-96be-78392eddee4b" (UID: "fe7385d5-3c78-4238-96be-78392eddee4b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.850491 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe7385d5-3c78-4238-96be-78392eddee4b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe7385d5-3c78-4238-96be-78392eddee4b" (UID: "fe7385d5-3c78-4238-96be-78392eddee4b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.893375 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe7385d5-3c78-4238-96be-78392eddee4b-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.893404 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe7385d5-3c78-4238-96be-78392eddee4b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:13 crc kubenswrapper[4750]: I1008 18:33:13.948187 4750 scope.go:117] "RemoveContainer" containerID="531a039dc690be8ef20b0b6f5062a698798a803816f610a232671935cec2a8cc" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.012354 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_230c02f8-af60-40d6-af19-adf730eec43f/ovn-northd/0.log" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.012672 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.020466 4750 scope.go:117] "RemoveContainer" containerID="7b92d9e428bd59e74196a828e532527c2bee6127a6c6018b44f016e271d5efb0" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.069246 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-587d5f7b59-ws4tc"] Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.076099 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-587d5f7b59-ws4tc"] Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.091493 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.095871 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.097862 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/230c02f8-af60-40d6-af19-adf730eec43f-config\") pod \"230c02f8-af60-40d6-af19-adf730eec43f\" (UID: \"230c02f8-af60-40d6-af19-adf730eec43f\") " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.097954 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/230c02f8-af60-40d6-af19-adf730eec43f-ovn-rundir\") pod \"230c02f8-af60-40d6-af19-adf730eec43f\" (UID: \"230c02f8-af60-40d6-af19-adf730eec43f\") " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.097996 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/230c02f8-af60-40d6-af19-adf730eec43f-combined-ca-bundle\") pod \"230c02f8-af60-40d6-af19-adf730eec43f\" (UID: \"230c02f8-af60-40d6-af19-adf730eec43f\") " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.098078 4750 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/230c02f8-af60-40d6-af19-adf730eec43f-scripts\") pod \"230c02f8-af60-40d6-af19-adf730eec43f\" (UID: \"230c02f8-af60-40d6-af19-adf730eec43f\") " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.098116 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/230c02f8-af60-40d6-af19-adf730eec43f-metrics-certs-tls-certs\") pod \"230c02f8-af60-40d6-af19-adf730eec43f\" (UID: \"230c02f8-af60-40d6-af19-adf730eec43f\") " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.098137 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlfz8\" (UniqueName: \"kubernetes.io/projected/230c02f8-af60-40d6-af19-adf730eec43f-kube-api-access-xlfz8\") pod \"230c02f8-af60-40d6-af19-adf730eec43f\" (UID: \"230c02f8-af60-40d6-af19-adf730eec43f\") " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.098165 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/230c02f8-af60-40d6-af19-adf730eec43f-ovn-northd-tls-certs\") pod \"230c02f8-af60-40d6-af19-adf730eec43f\" (UID: \"230c02f8-af60-40d6-af19-adf730eec43f\") " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.101456 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/230c02f8-af60-40d6-af19-adf730eec43f-config" (OuterVolumeSpecName: "config") pod "230c02f8-af60-40d6-af19-adf730eec43f" (UID: "230c02f8-af60-40d6-af19-adf730eec43f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.102985 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.103212 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/230c02f8-af60-40d6-af19-adf730eec43f-scripts" (OuterVolumeSpecName: "scripts") pod "230c02f8-af60-40d6-af19-adf730eec43f" (UID: "230c02f8-af60-40d6-af19-adf730eec43f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.103625 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/230c02f8-af60-40d6-af19-adf730eec43f-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "230c02f8-af60-40d6-af19-adf730eec43f" (UID: "230c02f8-af60-40d6-af19-adf730eec43f"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.105069 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.105823 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/230c02f8-af60-40d6-af19-adf730eec43f-kube-api-access-xlfz8" (OuterVolumeSpecName: "kube-api-access-xlfz8") pod "230c02f8-af60-40d6-af19-adf730eec43f" (UID: "230c02f8-af60-40d6-af19-adf730eec43f"). InnerVolumeSpecName "kube-api-access-xlfz8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.109954 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-594d9fc688-28msd"] Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.117287 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-594d9fc688-28msd"] Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.122429 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/230c02f8-af60-40d6-af19-adf730eec43f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "230c02f8-af60-40d6-af19-adf730eec43f" (UID: "230c02f8-af60-40d6-af19-adf730eec43f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.126757 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.130194 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.136721 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.140999 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.157992 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.163263 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.182640 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-689cf77786-nkzv6"] Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.200480 4750 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlfz8\" (UniqueName: \"kubernetes.io/projected/230c02f8-af60-40d6-af19-adf730eec43f-kube-api-access-xlfz8\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.200510 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/230c02f8-af60-40d6-af19-adf730eec43f-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.200522 4750 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/230c02f8-af60-40d6-af19-adf730eec43f-ovn-rundir\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.200532 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/230c02f8-af60-40d6-af19-adf730eec43f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.200541 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/230c02f8-af60-40d6-af19-adf730eec43f-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.210072 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-689cf77786-nkzv6"] Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.212717 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/230c02f8-af60-40d6-af19-adf730eec43f-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "230c02f8-af60-40d6-af19-adf730eec43f" (UID: "230c02f8-af60-40d6-af19-adf730eec43f"). InnerVolumeSpecName "ovn-northd-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.215291 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/230c02f8-af60-40d6-af19-adf730eec43f-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "230c02f8-af60-40d6-af19-adf730eec43f" (UID: "230c02f8-af60-40d6-af19-adf730eec43f"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.216633 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.225044 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.229540 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.233875 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.246715 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.258474 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.301851 4750 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/230c02f8-af60-40d6-af19-adf730eec43f-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.301877 4750 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/230c02f8-af60-40d6-af19-adf730eec43f-ovn-northd-tls-certs\") on node \"crc\" DevicePath 
\"\"" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.321171 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.403955 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"43a52313-747b-40a7-a7e0-9e18f3c97c42\" (UID: \"43a52313-747b-40a7-a7e0-9e18f3c97c42\") " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.404030 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/43a52313-747b-40a7-a7e0-9e18f3c97c42-server-conf\") pod \"43a52313-747b-40a7-a7e0-9e18f3c97c42\" (UID: \"43a52313-747b-40a7-a7e0-9e18f3c97c42\") " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.404055 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/43a52313-747b-40a7-a7e0-9e18f3c97c42-rabbitmq-erlang-cookie\") pod \"43a52313-747b-40a7-a7e0-9e18f3c97c42\" (UID: \"43a52313-747b-40a7-a7e0-9e18f3c97c42\") " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.404111 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/43a52313-747b-40a7-a7e0-9e18f3c97c42-pod-info\") pod \"43a52313-747b-40a7-a7e0-9e18f3c97c42\" (UID: \"43a52313-747b-40a7-a7e0-9e18f3c97c42\") " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.404137 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/43a52313-747b-40a7-a7e0-9e18f3c97c42-erlang-cookie-secret\") pod \"43a52313-747b-40a7-a7e0-9e18f3c97c42\" (UID: \"43a52313-747b-40a7-a7e0-9e18f3c97c42\") " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 
18:33:14.404171 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/43a52313-747b-40a7-a7e0-9e18f3c97c42-rabbitmq-plugins\") pod \"43a52313-747b-40a7-a7e0-9e18f3c97c42\" (UID: \"43a52313-747b-40a7-a7e0-9e18f3c97c42\") " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.404244 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/43a52313-747b-40a7-a7e0-9e18f3c97c42-plugins-conf\") pod \"43a52313-747b-40a7-a7e0-9e18f3c97c42\" (UID: \"43a52313-747b-40a7-a7e0-9e18f3c97c42\") " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.404261 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/43a52313-747b-40a7-a7e0-9e18f3c97c42-rabbitmq-tls\") pod \"43a52313-747b-40a7-a7e0-9e18f3c97c42\" (UID: \"43a52313-747b-40a7-a7e0-9e18f3c97c42\") " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.404275 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/43a52313-747b-40a7-a7e0-9e18f3c97c42-rabbitmq-confd\") pod \"43a52313-747b-40a7-a7e0-9e18f3c97c42\" (UID: \"43a52313-747b-40a7-a7e0-9e18f3c97c42\") " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.404301 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npbvh\" (UniqueName: \"kubernetes.io/projected/43a52313-747b-40a7-a7e0-9e18f3c97c42-kube-api-access-npbvh\") pod \"43a52313-747b-40a7-a7e0-9e18f3c97c42\" (UID: \"43a52313-747b-40a7-a7e0-9e18f3c97c42\") " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.404320 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/43a52313-747b-40a7-a7e0-9e18f3c97c42-config-data\") pod 
\"43a52313-747b-40a7-a7e0-9e18f3c97c42\" (UID: \"43a52313-747b-40a7-a7e0-9e18f3c97c42\") " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.408655 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43a52313-747b-40a7-a7e0-9e18f3c97c42-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "43a52313-747b-40a7-a7e0-9e18f3c97c42" (UID: "43a52313-747b-40a7-a7e0-9e18f3c97c42"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.409037 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43a52313-747b-40a7-a7e0-9e18f3c97c42-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "43a52313-747b-40a7-a7e0-9e18f3c97c42" (UID: "43a52313-747b-40a7-a7e0-9e18f3c97c42"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.410935 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "43a52313-747b-40a7-a7e0-9e18f3c97c42" (UID: "43a52313-747b-40a7-a7e0-9e18f3c97c42"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.412775 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43a52313-747b-40a7-a7e0-9e18f3c97c42-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "43a52313-747b-40a7-a7e0-9e18f3c97c42" (UID: "43a52313-747b-40a7-a7e0-9e18f3c97c42"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.412992 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43a52313-747b-40a7-a7e0-9e18f3c97c42-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "43a52313-747b-40a7-a7e0-9e18f3c97c42" (UID: "43a52313-747b-40a7-a7e0-9e18f3c97c42"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.415857 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43a52313-747b-40a7-a7e0-9e18f3c97c42-kube-api-access-npbvh" (OuterVolumeSpecName: "kube-api-access-npbvh") pod "43a52313-747b-40a7-a7e0-9e18f3c97c42" (UID: "43a52313-747b-40a7-a7e0-9e18f3c97c42"). InnerVolumeSpecName "kube-api-access-npbvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.418240 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/43a52313-747b-40a7-a7e0-9e18f3c97c42-pod-info" (OuterVolumeSpecName: "pod-info") pod "43a52313-747b-40a7-a7e0-9e18f3c97c42" (UID: "43a52313-747b-40a7-a7e0-9e18f3c97c42"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.427646 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43a52313-747b-40a7-a7e0-9e18f3c97c42-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "43a52313-747b-40a7-a7e0-9e18f3c97c42" (UID: "43a52313-747b-40a7-a7e0-9e18f3c97c42"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.437134 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43a52313-747b-40a7-a7e0-9e18f3c97c42-config-data" (OuterVolumeSpecName: "config-data") pod "43a52313-747b-40a7-a7e0-9e18f3c97c42" (UID: "43a52313-747b-40a7-a7e0-9e18f3c97c42"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.461076 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-548c7c66b4-b72bl" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.487376 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43a52313-747b-40a7-a7e0-9e18f3c97c42-server-conf" (OuterVolumeSpecName: "server-conf") pod "43a52313-747b-40a7-a7e0-9e18f3c97c42" (UID: "43a52313-747b-40a7-a7e0-9e18f3c97c42"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.505731 4750 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.505761 4750 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/43a52313-747b-40a7-a7e0-9e18f3c97c42-server-conf\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.505772 4750 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/43a52313-747b-40a7-a7e0-9e18f3c97c42-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.505781 4750 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/43a52313-747b-40a7-a7e0-9e18f3c97c42-pod-info\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.505789 4750 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/43a52313-747b-40a7-a7e0-9e18f3c97c42-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.505798 4750 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/43a52313-747b-40a7-a7e0-9e18f3c97c42-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.505806 4750 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/43a52313-747b-40a7-a7e0-9e18f3c97c42-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.505814 4750 
reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/43a52313-747b-40a7-a7e0-9e18f3c97c42-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.505822 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/43a52313-747b-40a7-a7e0-9e18f3c97c42-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.505829 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npbvh\" (UniqueName: \"kubernetes.io/projected/43a52313-747b-40a7-a7e0-9e18f3c97c42-kube-api-access-npbvh\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.512353 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43a52313-747b-40a7-a7e0-9e18f3c97c42-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "43a52313-747b-40a7-a7e0-9e18f3c97c42" (UID: "43a52313-747b-40a7-a7e0-9e18f3c97c42"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.539199 4750 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.607084 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f22ab58-6189-4321-b660-ed992f6fb70f-internal-tls-certs\") pod \"2f22ab58-6189-4321-b660-ed992f6fb70f\" (UID: \"2f22ab58-6189-4321-b660-ed992f6fb70f\") " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.607155 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2f22ab58-6189-4321-b660-ed992f6fb70f-credential-keys\") pod \"2f22ab58-6189-4321-b660-ed992f6fb70f\" (UID: \"2f22ab58-6189-4321-b660-ed992f6fb70f\") " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.607206 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f22ab58-6189-4321-b660-ed992f6fb70f-scripts\") pod \"2f22ab58-6189-4321-b660-ed992f6fb70f\" (UID: \"2f22ab58-6189-4321-b660-ed992f6fb70f\") " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.607237 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2f22ab58-6189-4321-b660-ed992f6fb70f-fernet-keys\") pod \"2f22ab58-6189-4321-b660-ed992f6fb70f\" (UID: \"2f22ab58-6189-4321-b660-ed992f6fb70f\") " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.607287 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v95ns\" (UniqueName: \"kubernetes.io/projected/2f22ab58-6189-4321-b660-ed992f6fb70f-kube-api-access-v95ns\") pod \"2f22ab58-6189-4321-b660-ed992f6fb70f\" 
(UID: \"2f22ab58-6189-4321-b660-ed992f6fb70f\") " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.607334 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f22ab58-6189-4321-b660-ed992f6fb70f-public-tls-certs\") pod \"2f22ab58-6189-4321-b660-ed992f6fb70f\" (UID: \"2f22ab58-6189-4321-b660-ed992f6fb70f\") " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.607374 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f22ab58-6189-4321-b660-ed992f6fb70f-config-data\") pod \"2f22ab58-6189-4321-b660-ed992f6fb70f\" (UID: \"2f22ab58-6189-4321-b660-ed992f6fb70f\") " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.607414 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f22ab58-6189-4321-b660-ed992f6fb70f-combined-ca-bundle\") pod \"2f22ab58-6189-4321-b660-ed992f6fb70f\" (UID: \"2f22ab58-6189-4321-b660-ed992f6fb70f\") " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.607799 4750 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/43a52313-747b-40a7-a7e0-9e18f3c97c42-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.607817 4750 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.614159 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.621692 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f22ab58-6189-4321-b660-ed992f6fb70f-scripts" (OuterVolumeSpecName: "scripts") pod "2f22ab58-6189-4321-b660-ed992f6fb70f" (UID: "2f22ab58-6189-4321-b660-ed992f6fb70f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.622515 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f22ab58-6189-4321-b660-ed992f6fb70f-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "2f22ab58-6189-4321-b660-ed992f6fb70f" (UID: "2f22ab58-6189-4321-b660-ed992f6fb70f"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.622805 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f22ab58-6189-4321-b660-ed992f6fb70f-kube-api-access-v95ns" (OuterVolumeSpecName: "kube-api-access-v95ns") pod "2f22ab58-6189-4321-b660-ed992f6fb70f" (UID: "2f22ab58-6189-4321-b660-ed992f6fb70f"). InnerVolumeSpecName "kube-api-access-v95ns". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.637609 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f22ab58-6189-4321-b660-ed992f6fb70f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2f22ab58-6189-4321-b660-ed992f6fb70f" (UID: "2f22ab58-6189-4321-b660-ed992f6fb70f"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.639352 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f22ab58-6189-4321-b660-ed992f6fb70f-config-data" (OuterVolumeSpecName: "config-data") pod "2f22ab58-6189-4321-b660-ed992f6fb70f" (UID: "2f22ab58-6189-4321-b660-ed992f6fb70f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.656137 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f22ab58-6189-4321-b660-ed992f6fb70f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f22ab58-6189-4321-b660-ed992f6fb70f" (UID: "2f22ab58-6189-4321-b660-ed992f6fb70f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.693357 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f22ab58-6189-4321-b660-ed992f6fb70f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2f22ab58-6189-4321-b660-ed992f6fb70f" (UID: "2f22ab58-6189-4321-b660-ed992f6fb70f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.699437 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f22ab58-6189-4321-b660-ed992f6fb70f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2f22ab58-6189-4321-b660-ed992f6fb70f" (UID: "2f22ab58-6189-4321-b660-ed992f6fb70f"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.709231 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b-config-data-default\") pod \"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b\" (UID: \"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b\") " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.709343 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b-config-data-generated\") pod \"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b\" (UID: \"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b\") " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.709427 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b-combined-ca-bundle\") pod \"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b\" (UID: \"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b\") " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.709503 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b-galera-tls-certs\") pod \"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b\" (UID: \"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b\") " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.709829 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "cedba103-c6cd-4e9a-9c7c-80d90aaedb3b" (UID: "cedba103-c6cd-4e9a-9c7c-80d90aaedb3b"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.709658 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b-operator-scripts\") pod \"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b\" (UID: \"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b\") " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.709958 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssp5w\" (UniqueName: \"kubernetes.io/projected/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b-kube-api-access-ssp5w\") pod \"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b\" (UID: \"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b\") " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.709997 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b\" (UID: \"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b\") " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.710183 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b-secrets\") pod \"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b\" (UID: \"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b\") " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.710233 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b-kolla-config\") pod \"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b\" (UID: \"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b\") " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.710470 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "cedba103-c6cd-4e9a-9c7c-80d90aaedb3b" (UID: "cedba103-c6cd-4e9a-9c7c-80d90aaedb3b"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.710827 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "cedba103-c6cd-4e9a-9c7c-80d90aaedb3b" (UID: "cedba103-c6cd-4e9a-9c7c-80d90aaedb3b"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.710943 4750 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2f22ab58-6189-4321-b660-ed992f6fb70f-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.710958 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f22ab58-6189-4321-b660-ed992f6fb70f-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.710967 4750 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2f22ab58-6189-4321-b660-ed992f6fb70f-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.710977 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v95ns\" (UniqueName: \"kubernetes.io/projected/2f22ab58-6189-4321-b660-ed992f6fb70f-kube-api-access-v95ns\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.710986 4750 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2f22ab58-6189-4321-b660-ed992f6fb70f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.710995 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f22ab58-6189-4321-b660-ed992f6fb70f-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.711003 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f22ab58-6189-4321-b660-ed992f6fb70f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.711011 4750 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.711019 4750 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b-config-data-default\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.711028 4750 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f22ab58-6189-4321-b660-ed992f6fb70f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.711037 4750 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b-config-data-generated\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.711031 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b-operator-scripts" (OuterVolumeSpecName: 
"operator-scripts") pod "cedba103-c6cd-4e9a-9c7c-80d90aaedb3b" (UID: "cedba103-c6cd-4e9a-9c7c-80d90aaedb3b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.713917 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b-secrets" (OuterVolumeSpecName: "secrets") pod "cedba103-c6cd-4e9a-9c7c-80d90aaedb3b" (UID: "cedba103-c6cd-4e9a-9c7c-80d90aaedb3b"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.714648 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b-kube-api-access-ssp5w" (OuterVolumeSpecName: "kube-api-access-ssp5w") pod "cedba103-c6cd-4e9a-9c7c-80d90aaedb3b" (UID: "cedba103-c6cd-4e9a-9c7c-80d90aaedb3b"). InnerVolumeSpecName "kube-api-access-ssp5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.721681 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "mysql-db") pod "cedba103-c6cd-4e9a-9c7c-80d90aaedb3b" (UID: "cedba103-c6cd-4e9a-9c7c-80d90aaedb3b"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.744462 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cedba103-c6cd-4e9a-9c7c-80d90aaedb3b" (UID: "cedba103-c6cd-4e9a-9c7c-80d90aaedb3b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.749643 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="038b3881-b266-4878-b395-87d7bf986446" path="/var/lib/kubelet/pods/038b3881-b266-4878-b395-87d7bf986446/volumes" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.752688 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28550569-4c3c-48cf-a621-eddec0919b51" path="/var/lib/kubelet/pods/28550569-4c3c-48cf-a621-eddec0919b51/volumes" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.753569 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2aacea2e-e630-4280-8bed-b3b13b67f8ae" path="/var/lib/kubelet/pods/2aacea2e-e630-4280-8bed-b3b13b67f8ae/volumes" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.754503 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2feb2439-d911-4585-a5e1-671abcfa357d" path="/var/lib/kubelet/pods/2feb2439-d911-4585-a5e1-671abcfa357d/volumes" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.755836 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60379ea9-0750-4de0-9d3b-13af080eea8f" path="/var/lib/kubelet/pods/60379ea9-0750-4de0-9d3b-13af080eea8f/volumes" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.756324 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "cedba103-c6cd-4e9a-9c7c-80d90aaedb3b" (UID: "cedba103-c6cd-4e9a-9c7c-80d90aaedb3b"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.756849 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e0647ee-68b0-4b8b-9bf0-066dc7a274d0" path="/var/lib/kubelet/pods/6e0647ee-68b0-4b8b-9bf0-066dc7a274d0/volumes" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.758157 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="899027b7-067b-4ce1-a8f1-deaee627aa51" path="/var/lib/kubelet/pods/899027b7-067b-4ce1-a8f1-deaee627aa51/volumes" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.758158 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.759151 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ecf0d73-0ca5-4124-93fb-348f8769c2e2" path="/var/lib/kubelet/pods/9ecf0d73-0ca5-4124-93fb-348f8769c2e2/volumes" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.760016 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a25ebe44-c330-48f8-9df7-5f8517cd96bd" path="/var/lib/kubelet/pods/a25ebe44-c330-48f8-9df7-5f8517cd96bd/volumes" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.761346 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f" path="/var/lib/kubelet/pods/ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f/volumes" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.762116 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b173b167-1fa4-45ec-98d0-16956f4b0b30" path="/var/lib/kubelet/pods/b173b167-1fa4-45ec-98d0-16956f4b0b30/volumes" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.762852 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dba17973-3023-43ae-9b75-a8e1dc7f16cc" path="/var/lib/kubelet/pods/dba17973-3023-43ae-9b75-a8e1dc7f16cc/volumes" Oct 08 18:33:14 
crc kubenswrapper[4750]: I1008 18:33:14.763997 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e027e860-d0c0-4b1b-b02b-c374d92ae115" path="/var/lib/kubelet/pods/e027e860-d0c0-4b1b-b02b-c374d92ae115/volumes" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.764739 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec1950dc-6caf-45f7-9b18-8c12db1b3f25" path="/var/lib/kubelet/pods/ec1950dc-6caf-45f7-9b18-8c12db1b3f25/volumes" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.765451 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed7b2661-13bd-4ab4-a92d-7bc382cd257e" path="/var/lib/kubelet/pods/ed7b2661-13bd-4ab4-a92d-7bc382cd257e/volumes" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.766616 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1b5ad2e-1ee1-4955-99c2-8daed456b21c" path="/var/lib/kubelet/pods/f1b5ad2e-1ee1-4955-99c2-8daed456b21c/volumes" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.767467 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe7385d5-3c78-4238-96be-78392eddee4b" path="/var/lib/kubelet/pods/fe7385d5-3c78-4238-96be-78392eddee4b/volumes" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.797481 4750 generic.go:334] "Generic (PLEG): container finished" podID="43a52313-747b-40a7-a7e0-9e18f3c97c42" containerID="0d4d831252f413ac3fb7facf9f1b774f0cfb6c7ac211742c0a01794b3473bbbd" exitCode=0 Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.797622 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.799850 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"43a52313-747b-40a7-a7e0-9e18f3c97c42","Type":"ContainerDied","Data":"0d4d831252f413ac3fb7facf9f1b774f0cfb6c7ac211742c0a01794b3473bbbd"} Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.799895 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"43a52313-747b-40a7-a7e0-9e18f3c97c42","Type":"ContainerDied","Data":"4d0c1296afba7cd7122b6bd1c17e5e3380f8de83260c07af67c846218575f097"} Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.799923 4750 scope.go:117] "RemoveContainer" containerID="0d4d831252f413ac3fb7facf9f1b774f0cfb6c7ac211742c0a01794b3473bbbd" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.805872 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_230c02f8-af60-40d6-af19-adf730eec43f/ovn-northd/0.log" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.805942 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"230c02f8-af60-40d6-af19-adf730eec43f","Type":"ContainerDied","Data":"99f52a6ea71a16bfea8afdd0ee3f47f3836723b2c976614726111d0b57bb101f"} Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.806035 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.812067 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5b8108eb-834c-44bd-9f39-70c348388ab6-rabbitmq-plugins\") pod \"5b8108eb-834c-44bd-9f39-70c348388ab6\" (UID: \"5b8108eb-834c-44bd-9f39-70c348388ab6\") " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.812250 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5b8108eb-834c-44bd-9f39-70c348388ab6-rabbitmq-erlang-cookie\") pod \"5b8108eb-834c-44bd-9f39-70c348388ab6\" (UID: \"5b8108eb-834c-44bd-9f39-70c348388ab6\") " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.812413 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b8108eb-834c-44bd-9f39-70c348388ab6-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "5b8108eb-834c-44bd-9f39-70c348388ab6" (UID: "5b8108eb-834c-44bd-9f39-70c348388ab6"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.812704 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b8108eb-834c-44bd-9f39-70c348388ab6-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "5b8108eb-834c-44bd-9f39-70c348388ab6" (UID: "5b8108eb-834c-44bd-9f39-70c348388ab6"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.812759 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5b8108eb-834c-44bd-9f39-70c348388ab6-config-data\") pod \"5b8108eb-834c-44bd-9f39-70c348388ab6\" (UID: \"5b8108eb-834c-44bd-9f39-70c348388ab6\") " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.812794 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5b8108eb-834c-44bd-9f39-70c348388ab6-erlang-cookie-secret\") pod \"5b8108eb-834c-44bd-9f39-70c348388ab6\" (UID: \"5b8108eb-834c-44bd-9f39-70c348388ab6\") " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.813221 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.813235 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5b8108eb-834c-44bd-9f39-70c348388ab6-rabbitmq-tls\") pod \"5b8108eb-834c-44bd-9f39-70c348388ab6\" (UID: \"5b8108eb-834c-44bd-9f39-70c348388ab6\") " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.813259 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5b8108eb-834c-44bd-9f39-70c348388ab6-server-conf\") pod \"5b8108eb-834c-44bd-9f39-70c348388ab6\" (UID: \"5b8108eb-834c-44bd-9f39-70c348388ab6\") " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.813162 4750 generic.go:334] "Generic (PLEG): container finished" podID="5b8108eb-834c-44bd-9f39-70c348388ab6" containerID="521a7eb18f85bcac1d99fd7991c03287aa5ae554a583b99a06a05b681ce4be21" exitCode=0 Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.813308 4750 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5b8108eb-834c-44bd-9f39-70c348388ab6-pod-info\") pod \"5b8108eb-834c-44bd-9f39-70c348388ab6\" (UID: \"5b8108eb-834c-44bd-9f39-70c348388ab6\") " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.813341 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twzdz\" (UniqueName: \"kubernetes.io/projected/5b8108eb-834c-44bd-9f39-70c348388ab6-kube-api-access-twzdz\") pod \"5b8108eb-834c-44bd-9f39-70c348388ab6\" (UID: \"5b8108eb-834c-44bd-9f39-70c348388ab6\") " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.813362 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"5b8108eb-834c-44bd-9f39-70c348388ab6\" (UID: \"5b8108eb-834c-44bd-9f39-70c348388ab6\") " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.813415 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5b8108eb-834c-44bd-9f39-70c348388ab6-rabbitmq-confd\") pod \"5b8108eb-834c-44bd-9f39-70c348388ab6\" (UID: \"5b8108eb-834c-44bd-9f39-70c348388ab6\") " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.813481 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5b8108eb-834c-44bd-9f39-70c348388ab6-plugins-conf\") pod \"5b8108eb-834c-44bd-9f39-70c348388ab6\" (UID: \"5b8108eb-834c-44bd-9f39-70c348388ab6\") " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.813188 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5b8108eb-834c-44bd-9f39-70c348388ab6","Type":"ContainerDied","Data":"521a7eb18f85bcac1d99fd7991c03287aa5ae554a583b99a06a05b681ce4be21"} Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 
18:33:14.813604 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5b8108eb-834c-44bd-9f39-70c348388ab6","Type":"ContainerDied","Data":"c6f0fed50a39031c63702b524b07cd549afac88921eb41415d16e3fcb1af8b96"} Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.814047 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssp5w\" (UniqueName: \"kubernetes.io/projected/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b-kube-api-access-ssp5w\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.814075 4750 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.814706 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b8108eb-834c-44bd-9f39-70c348388ab6-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "5b8108eb-834c-44bd-9f39-70c348388ab6" (UID: "5b8108eb-834c-44bd-9f39-70c348388ab6"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.816075 4750 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b-secrets\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.816096 4750 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5b8108eb-834c-44bd-9f39-70c348388ab6-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.816108 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.816118 4750 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5b8108eb-834c-44bd-9f39-70c348388ab6-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.816126 4750 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.816134 4750 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b-operator-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.817430 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b8108eb-834c-44bd-9f39-70c348388ab6-kube-api-access-twzdz" (OuterVolumeSpecName: "kube-api-access-twzdz") pod "5b8108eb-834c-44bd-9f39-70c348388ab6" (UID: 
"5b8108eb-834c-44bd-9f39-70c348388ab6"). InnerVolumeSpecName "kube-api-access-twzdz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.821349 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b8108eb-834c-44bd-9f39-70c348388ab6-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "5b8108eb-834c-44bd-9f39-70c348388ab6" (UID: "5b8108eb-834c-44bd-9f39-70c348388ab6"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.822886 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"cedba103-c6cd-4e9a-9c7c-80d90aaedb3b","Type":"ContainerDied","Data":"04bcda7d410b5c881619148884f3f448e73cc4d0d377e86bde6b8a44712dd0c0"} Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.822969 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.827485 4750 generic.go:334] "Generic (PLEG): container finished" podID="2f22ab58-6189-4321-b660-ed992f6fb70f" containerID="e4b287edf9036a666587e6c4e08dfb587495ff24b4fe7daeaedc5be9ebb8a5b2" exitCode=0 Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.827744 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-548c7c66b4-b72bl" event={"ID":"2f22ab58-6189-4321-b660-ed992f6fb70f","Type":"ContainerDied","Data":"e4b287edf9036a666587e6c4e08dfb587495ff24b4fe7daeaedc5be9ebb8a5b2"} Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.827770 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-548c7c66b4-b72bl" event={"ID":"2f22ab58-6189-4321-b660-ed992f6fb70f","Type":"ContainerDied","Data":"d16b394fd7bb0c54355f4e32ff41cd24ce068c245ebaa22449979758dc656a7f"} Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.827845 
4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-548c7c66b4-b72bl" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.833646 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/5b8108eb-834c-44bd-9f39-70c348388ab6-pod-info" (OuterVolumeSpecName: "pod-info") pod "5b8108eb-834c-44bd-9f39-70c348388ab6" (UID: "5b8108eb-834c-44bd-9f39-70c348388ab6"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.833689 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "5b8108eb-834c-44bd-9f39-70c348388ab6" (UID: "5b8108eb-834c-44bd-9f39-70c348388ab6"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.836394 4750 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.856335 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b8108eb-834c-44bd-9f39-70c348388ab6-config-data" (OuterVolumeSpecName: "config-data") pod "5b8108eb-834c-44bd-9f39-70c348388ab6" (UID: "5b8108eb-834c-44bd-9f39-70c348388ab6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:33:14 crc kubenswrapper[4750]: E1008 18:33:14.856486 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c89b5ffde0edcb272b86a232392e60a596a9950b2acdc482f24207d4aea9c839" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.859744 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b8108eb-834c-44bd-9f39-70c348388ab6-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "5b8108eb-834c-44bd-9f39-70c348388ab6" (UID: "5b8108eb-834c-44bd-9f39-70c348388ab6"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:14 crc kubenswrapper[4750]: E1008 18:33:14.859766 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c89b5ffde0edcb272b86a232392e60a596a9950b2acdc482f24207d4aea9c839" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.859894 4750 scope.go:117] "RemoveContainer" containerID="fb11e627f4eaf60a3b819ad8d9b435832ac804f82f9736be30305c3e47ac4565" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.860397 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Oct 08 18:33:14 crc kubenswrapper[4750]: E1008 18:33:14.864951 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c89b5ffde0edcb272b86a232392e60a596a9950b2acdc482f24207d4aea9c839" 
cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 08 18:33:14 crc kubenswrapper[4750]: E1008 18:33:14.865021 4750 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="af09729b-3284-4dcd-91a1-5763d28daaf5" containerName="nova-cell1-conductor-conductor" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.875921 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.887018 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.887269 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b8108eb-834c-44bd-9f39-70c348388ab6-server-conf" (OuterVolumeSpecName: "server-conf") pod "5b8108eb-834c-44bd-9f39-70c348388ab6" (UID: "5b8108eb-834c-44bd-9f39-70c348388ab6"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.890503 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.894847 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.902897 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.907383 4750 scope.go:117] "RemoveContainer" containerID="0d4d831252f413ac3fb7facf9f1b774f0cfb6c7ac211742c0a01794b3473bbbd" Oct 08 18:33:14 crc kubenswrapper[4750]: E1008 18:33:14.907784 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d4d831252f413ac3fb7facf9f1b774f0cfb6c7ac211742c0a01794b3473bbbd\": container with ID starting with 0d4d831252f413ac3fb7facf9f1b774f0cfb6c7ac211742c0a01794b3473bbbd not found: ID does not exist" containerID="0d4d831252f413ac3fb7facf9f1b774f0cfb6c7ac211742c0a01794b3473bbbd" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.907828 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d4d831252f413ac3fb7facf9f1b774f0cfb6c7ac211742c0a01794b3473bbbd"} err="failed to get container status \"0d4d831252f413ac3fb7facf9f1b774f0cfb6c7ac211742c0a01794b3473bbbd\": rpc error: code = NotFound desc = could not find container \"0d4d831252f413ac3fb7facf9f1b774f0cfb6c7ac211742c0a01794b3473bbbd\": container with ID starting with 0d4d831252f413ac3fb7facf9f1b774f0cfb6c7ac211742c0a01794b3473bbbd not found: ID does not exist" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.907855 4750 scope.go:117] "RemoveContainer" containerID="fb11e627f4eaf60a3b819ad8d9b435832ac804f82f9736be30305c3e47ac4565" Oct 08 18:33:14 crc kubenswrapper[4750]: E1008 18:33:14.908224 4750 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb11e627f4eaf60a3b819ad8d9b435832ac804f82f9736be30305c3e47ac4565\": container with ID starting with fb11e627f4eaf60a3b819ad8d9b435832ac804f82f9736be30305c3e47ac4565 not found: ID does not exist" containerID="fb11e627f4eaf60a3b819ad8d9b435832ac804f82f9736be30305c3e47ac4565" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.908256 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb11e627f4eaf60a3b819ad8d9b435832ac804f82f9736be30305c3e47ac4565"} err="failed to get container status \"fb11e627f4eaf60a3b819ad8d9b435832ac804f82f9736be30305c3e47ac4565\": rpc error: code = NotFound desc = could not find container \"fb11e627f4eaf60a3b819ad8d9b435832ac804f82f9736be30305c3e47ac4565\": container with ID starting with fb11e627f4eaf60a3b819ad8d9b435832ac804f82f9736be30305c3e47ac4565 not found: ID does not exist" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.908278 4750 scope.go:117] "RemoveContainer" containerID="36265d856955e91441b681c8365511d7edd515654902aa5545ca450e749f6e36" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.909020 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-548c7c66b4-b72bl"] Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.912293 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b8108eb-834c-44bd-9f39-70c348388ab6-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "5b8108eb-834c-44bd-9f39-70c348388ab6" (UID: "5b8108eb-834c-44bd-9f39-70c348388ab6"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.914233 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-548c7c66b4-b72bl"] Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.917070 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twzdz\" (UniqueName: \"kubernetes.io/projected/5b8108eb-834c-44bd-9f39-70c348388ab6-kube-api-access-twzdz\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.917110 4750 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.917120 4750 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5b8108eb-834c-44bd-9f39-70c348388ab6-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.917148 4750 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5b8108eb-834c-44bd-9f39-70c348388ab6-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.917158 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5b8108eb-834c-44bd-9f39-70c348388ab6-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.917168 4750 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5b8108eb-834c-44bd-9f39-70c348388ab6-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.917177 4750 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/5b8108eb-834c-44bd-9f39-70c348388ab6-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.917185 4750 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5b8108eb-834c-44bd-9f39-70c348388ab6-server-conf\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.917194 4750 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.917203 4750 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5b8108eb-834c-44bd-9f39-70c348388ab6-pod-info\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.931534 4750 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.936148 4750 scope.go:117] "RemoveContainer" containerID="c92dbcfbb2d09ff0c337921ca5e5f0d269b51741b6e9f5407d3bb68858e0b7bc" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.956966 4750 scope.go:117] "RemoveContainer" containerID="521a7eb18f85bcac1d99fd7991c03287aa5ae554a583b99a06a05b681ce4be21" Oct 08 18:33:14 crc kubenswrapper[4750]: I1008 18:33:14.975039 4750 scope.go:117] "RemoveContainer" containerID="40ab5b6bde85f8c7024b6c550a72f006a95cf43289623b823b0fcdb2413a3ef2" Oct 08 18:33:15 crc kubenswrapper[4750]: I1008 18:33:15.005425 4750 scope.go:117] "RemoveContainer" containerID="521a7eb18f85bcac1d99fd7991c03287aa5ae554a583b99a06a05b681ce4be21" Oct 08 18:33:15 crc kubenswrapper[4750]: E1008 18:33:15.005900 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"521a7eb18f85bcac1d99fd7991c03287aa5ae554a583b99a06a05b681ce4be21\": container with ID starting with 521a7eb18f85bcac1d99fd7991c03287aa5ae554a583b99a06a05b681ce4be21 not found: ID does not exist" containerID="521a7eb18f85bcac1d99fd7991c03287aa5ae554a583b99a06a05b681ce4be21" Oct 08 18:33:15 crc kubenswrapper[4750]: I1008 18:33:15.005943 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"521a7eb18f85bcac1d99fd7991c03287aa5ae554a583b99a06a05b681ce4be21"} err="failed to get container status \"521a7eb18f85bcac1d99fd7991c03287aa5ae554a583b99a06a05b681ce4be21\": rpc error: code = NotFound desc = could not find container \"521a7eb18f85bcac1d99fd7991c03287aa5ae554a583b99a06a05b681ce4be21\": container with ID starting with 521a7eb18f85bcac1d99fd7991c03287aa5ae554a583b99a06a05b681ce4be21 not found: ID does not exist" Oct 08 18:33:15 crc kubenswrapper[4750]: I1008 18:33:15.005971 4750 scope.go:117] "RemoveContainer" containerID="40ab5b6bde85f8c7024b6c550a72f006a95cf43289623b823b0fcdb2413a3ef2" Oct 08 18:33:15 crc kubenswrapper[4750]: E1008 18:33:15.006381 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40ab5b6bde85f8c7024b6c550a72f006a95cf43289623b823b0fcdb2413a3ef2\": container with ID starting with 40ab5b6bde85f8c7024b6c550a72f006a95cf43289623b823b0fcdb2413a3ef2 not found: ID does not exist" containerID="40ab5b6bde85f8c7024b6c550a72f006a95cf43289623b823b0fcdb2413a3ef2" Oct 08 18:33:15 crc kubenswrapper[4750]: I1008 18:33:15.006408 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40ab5b6bde85f8c7024b6c550a72f006a95cf43289623b823b0fcdb2413a3ef2"} err="failed to get container status \"40ab5b6bde85f8c7024b6c550a72f006a95cf43289623b823b0fcdb2413a3ef2\": rpc error: code = NotFound desc = could not find container \"40ab5b6bde85f8c7024b6c550a72f006a95cf43289623b823b0fcdb2413a3ef2\": 
container with ID starting with 40ab5b6bde85f8c7024b6c550a72f006a95cf43289623b823b0fcdb2413a3ef2 not found: ID does not exist" Oct 08 18:33:15 crc kubenswrapper[4750]: I1008 18:33:15.006430 4750 scope.go:117] "RemoveContainer" containerID="238e771ab2028f72d92800deb7a9dbb1d35bb3e07296801734338f1c8ed278bd" Oct 08 18:33:15 crc kubenswrapper[4750]: I1008 18:33:15.019423 4750 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:15 crc kubenswrapper[4750]: I1008 18:33:15.025469 4750 scope.go:117] "RemoveContainer" containerID="7486e6d02d87823ff44073ee25ac3d9ce6156c4080b1bc61d97efc65116f101a" Oct 08 18:33:15 crc kubenswrapper[4750]: I1008 18:33:15.044278 4750 scope.go:117] "RemoveContainer" containerID="e4b287edf9036a666587e6c4e08dfb587495ff24b4fe7daeaedc5be9ebb8a5b2" Oct 08 18:33:15 crc kubenswrapper[4750]: I1008 18:33:15.062914 4750 scope.go:117] "RemoveContainer" containerID="e4b287edf9036a666587e6c4e08dfb587495ff24b4fe7daeaedc5be9ebb8a5b2" Oct 08 18:33:15 crc kubenswrapper[4750]: E1008 18:33:15.063255 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4b287edf9036a666587e6c4e08dfb587495ff24b4fe7daeaedc5be9ebb8a5b2\": container with ID starting with e4b287edf9036a666587e6c4e08dfb587495ff24b4fe7daeaedc5be9ebb8a5b2 not found: ID does not exist" containerID="e4b287edf9036a666587e6c4e08dfb587495ff24b4fe7daeaedc5be9ebb8a5b2" Oct 08 18:33:15 crc kubenswrapper[4750]: I1008 18:33:15.063295 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4b287edf9036a666587e6c4e08dfb587495ff24b4fe7daeaedc5be9ebb8a5b2"} err="failed to get container status \"e4b287edf9036a666587e6c4e08dfb587495ff24b4fe7daeaedc5be9ebb8a5b2\": rpc error: code = NotFound desc = could not find container 
\"e4b287edf9036a666587e6c4e08dfb587495ff24b4fe7daeaedc5be9ebb8a5b2\": container with ID starting with e4b287edf9036a666587e6c4e08dfb587495ff24b4fe7daeaedc5be9ebb8a5b2 not found: ID does not exist" Oct 08 18:33:15 crc kubenswrapper[4750]: I1008 18:33:15.145413 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 18:33:15 crc kubenswrapper[4750]: I1008 18:33:15.149729 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 18:33:15 crc kubenswrapper[4750]: E1008 18:33:15.424344 4750 secret.go:188] Couldn't get secret openstack/nova-cell1-conductor-config-data: secret "nova-cell1-conductor-config-data" not found Oct 08 18:33:15 crc kubenswrapper[4750]: E1008 18:33:15.424396 4750 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af09729b-3284-4dcd-91a1-5763d28daaf5-config-data podName:af09729b-3284-4dcd-91a1-5763d28daaf5 nodeName:}" failed. No retries permitted until 2025-10-08 18:33:23.424383032 +0000 UTC m=+1359.337354045 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/af09729b-3284-4dcd-91a1-5763d28daaf5-config-data") pod "nova-cell1-conductor-0" (UID: "af09729b-3284-4dcd-91a1-5763d28daaf5") : secret "nova-cell1-conductor-config-data" not found Oct 08 18:33:15 crc kubenswrapper[4750]: E1008 18:33:15.484114 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of afb689133a45aeb0be19ee70c244153e70840e64d2759842b263ebef6d8b33b2 is running failed: container process not found" containerID="afb689133a45aeb0be19ee70c244153e70840e64d2759842b263ebef6d8b33b2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 18:33:15 crc kubenswrapper[4750]: E1008 18:33:15.488441 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6320fe2c14d57de0be701a1d75ea1e50bc53072f92cd6178aeb5a15ddadb3e94" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 18:33:15 crc kubenswrapper[4750]: E1008 18:33:15.488531 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of afb689133a45aeb0be19ee70c244153e70840e64d2759842b263ebef6d8b33b2 is running failed: container process not found" containerID="afb689133a45aeb0be19ee70c244153e70840e64d2759842b263ebef6d8b33b2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 18:33:15 crc kubenswrapper[4750]: E1008 18:33:15.492412 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6320fe2c14d57de0be701a1d75ea1e50bc53072f92cd6178aeb5a15ddadb3e94" 
cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 18:33:15 crc kubenswrapper[4750]: E1008 18:33:15.492466 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of afb689133a45aeb0be19ee70c244153e70840e64d2759842b263ebef6d8b33b2 is running failed: container process not found" containerID="afb689133a45aeb0be19ee70c244153e70840e64d2759842b263ebef6d8b33b2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 18:33:15 crc kubenswrapper[4750]: E1008 18:33:15.492538 4750 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of afb689133a45aeb0be19ee70c244153e70840e64d2759842b263ebef6d8b33b2 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-57vgx" podUID="e6709646-0141-474b-b73f-6f451e77f602" containerName="ovsdb-server" Oct 08 18:33:15 crc kubenswrapper[4750]: E1008 18:33:15.493439 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6320fe2c14d57de0be701a1d75ea1e50bc53072f92cd6178aeb5a15ddadb3e94" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 18:33:15 crc kubenswrapper[4750]: E1008 18:33:15.493475 4750 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-57vgx" podUID="e6709646-0141-474b-b73f-6f451e77f602" containerName="ovs-vswitchd" Oct 08 18:33:15 crc kubenswrapper[4750]: I1008 18:33:15.835527 4750 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f" 
containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.0.197:8081/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 08 18:33:15 crc kubenswrapper[4750]: I1008 18:33:15.934255 4750 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-8d8895f6c-zszml" podUID="7b751e62-8a05-413c-9f82-e9f28230e5ba" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.163:9696/\": dial tcp 10.217.0.163:9696: connect: connection refused" Oct 08 18:33:16 crc kubenswrapper[4750]: I1008 18:33:16.742489 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="230c02f8-af60-40d6-af19-adf730eec43f" path="/var/lib/kubelet/pods/230c02f8-af60-40d6-af19-adf730eec43f/volumes" Oct 08 18:33:16 crc kubenswrapper[4750]: I1008 18:33:16.743344 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f22ab58-6189-4321-b660-ed992f6fb70f" path="/var/lib/kubelet/pods/2f22ab58-6189-4321-b660-ed992f6fb70f/volumes" Oct 08 18:33:16 crc kubenswrapper[4750]: I1008 18:33:16.744029 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43a52313-747b-40a7-a7e0-9e18f3c97c42" path="/var/lib/kubelet/pods/43a52313-747b-40a7-a7e0-9e18f3c97c42/volumes" Oct 08 18:33:16 crc kubenswrapper[4750]: I1008 18:33:16.745203 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b8108eb-834c-44bd-9f39-70c348388ab6" path="/var/lib/kubelet/pods/5b8108eb-834c-44bd-9f39-70c348388ab6/volumes" Oct 08 18:33:16 crc kubenswrapper[4750]: I1008 18:33:16.745878 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cedba103-c6cd-4e9a-9c7c-80d90aaedb3b" path="/var/lib/kubelet/pods/cedba103-c6cd-4e9a-9c7c-80d90aaedb3b/volumes" Oct 08 18:33:16 crc kubenswrapper[4750]: I1008 18:33:16.797444 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 08 18:33:16 crc kubenswrapper[4750]: I1008 18:33:16.859021 4750 generic.go:334] "Generic (PLEG): container finished" podID="af09729b-3284-4dcd-91a1-5763d28daaf5" containerID="c89b5ffde0edcb272b86a232392e60a596a9950b2acdc482f24207d4aea9c839" exitCode=0 Oct 08 18:33:16 crc kubenswrapper[4750]: I1008 18:33:16.859072 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"af09729b-3284-4dcd-91a1-5763d28daaf5","Type":"ContainerDied","Data":"c89b5ffde0edcb272b86a232392e60a596a9950b2acdc482f24207d4aea9c839"} Oct 08 18:33:16 crc kubenswrapper[4750]: I1008 18:33:16.859093 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"af09729b-3284-4dcd-91a1-5763d28daaf5","Type":"ContainerDied","Data":"5ef135a41ae873ae93db09009b8373bbb16fc3f1f05dbfb10390bcd13e9c641b"} Oct 08 18:33:16 crc kubenswrapper[4750]: I1008 18:33:16.859109 4750 scope.go:117] "RemoveContainer" containerID="c89b5ffde0edcb272b86a232392e60a596a9950b2acdc482f24207d4aea9c839" Oct 08 18:33:16 crc kubenswrapper[4750]: I1008 18:33:16.859219 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 08 18:33:16 crc kubenswrapper[4750]: I1008 18:33:16.892961 4750 scope.go:117] "RemoveContainer" containerID="c89b5ffde0edcb272b86a232392e60a596a9950b2acdc482f24207d4aea9c839" Oct 08 18:33:16 crc kubenswrapper[4750]: E1008 18:33:16.893360 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c89b5ffde0edcb272b86a232392e60a596a9950b2acdc482f24207d4aea9c839\": container with ID starting with c89b5ffde0edcb272b86a232392e60a596a9950b2acdc482f24207d4aea9c839 not found: ID does not exist" containerID="c89b5ffde0edcb272b86a232392e60a596a9950b2acdc482f24207d4aea9c839" Oct 08 18:33:16 crc kubenswrapper[4750]: I1008 18:33:16.893388 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c89b5ffde0edcb272b86a232392e60a596a9950b2acdc482f24207d4aea9c839"} err="failed to get container status \"c89b5ffde0edcb272b86a232392e60a596a9950b2acdc482f24207d4aea9c839\": rpc error: code = NotFound desc = could not find container \"c89b5ffde0edcb272b86a232392e60a596a9950b2acdc482f24207d4aea9c839\": container with ID starting with c89b5ffde0edcb272b86a232392e60a596a9950b2acdc482f24207d4aea9c839 not found: ID does not exist" Oct 08 18:33:16 crc kubenswrapper[4750]: I1008 18:33:16.948346 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af09729b-3284-4dcd-91a1-5763d28daaf5-combined-ca-bundle\") pod \"af09729b-3284-4dcd-91a1-5763d28daaf5\" (UID: \"af09729b-3284-4dcd-91a1-5763d28daaf5\") " Oct 08 18:33:16 crc kubenswrapper[4750]: I1008 18:33:16.948416 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af09729b-3284-4dcd-91a1-5763d28daaf5-config-data\") pod \"af09729b-3284-4dcd-91a1-5763d28daaf5\" (UID: 
\"af09729b-3284-4dcd-91a1-5763d28daaf5\") " Oct 08 18:33:16 crc kubenswrapper[4750]: I1008 18:33:16.948583 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnpm7\" (UniqueName: \"kubernetes.io/projected/af09729b-3284-4dcd-91a1-5763d28daaf5-kube-api-access-qnpm7\") pod \"af09729b-3284-4dcd-91a1-5763d28daaf5\" (UID: \"af09729b-3284-4dcd-91a1-5763d28daaf5\") " Oct 08 18:33:16 crc kubenswrapper[4750]: I1008 18:33:16.953161 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af09729b-3284-4dcd-91a1-5763d28daaf5-kube-api-access-qnpm7" (OuterVolumeSpecName: "kube-api-access-qnpm7") pod "af09729b-3284-4dcd-91a1-5763d28daaf5" (UID: "af09729b-3284-4dcd-91a1-5763d28daaf5"). InnerVolumeSpecName "kube-api-access-qnpm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:33:16 crc kubenswrapper[4750]: I1008 18:33:16.972728 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af09729b-3284-4dcd-91a1-5763d28daaf5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af09729b-3284-4dcd-91a1-5763d28daaf5" (UID: "af09729b-3284-4dcd-91a1-5763d28daaf5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:16 crc kubenswrapper[4750]: I1008 18:33:16.975378 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af09729b-3284-4dcd-91a1-5763d28daaf5-config-data" (OuterVolumeSpecName: "config-data") pod "af09729b-3284-4dcd-91a1-5763d28daaf5" (UID: "af09729b-3284-4dcd-91a1-5763d28daaf5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:17 crc kubenswrapper[4750]: I1008 18:33:17.050845 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnpm7\" (UniqueName: \"kubernetes.io/projected/af09729b-3284-4dcd-91a1-5763d28daaf5-kube-api-access-qnpm7\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:17 crc kubenswrapper[4750]: I1008 18:33:17.050892 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af09729b-3284-4dcd-91a1-5763d28daaf5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:17 crc kubenswrapper[4750]: I1008 18:33:17.050912 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af09729b-3284-4dcd-91a1-5763d28daaf5-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:17 crc kubenswrapper[4750]: I1008 18:33:17.183198 4750 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-594d9fc688-28msd" podUID="6e0647ee-68b0-4b8b-9bf0-066dc7a274d0" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.159:9311/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 08 18:33:17 crc kubenswrapper[4750]: I1008 18:33:17.183255 4750 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-594d9fc688-28msd" podUID="6e0647ee-68b0-4b8b-9bf0-066dc7a274d0" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.159:9311/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 08 18:33:17 crc kubenswrapper[4750]: I1008 18:33:17.187342 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 08 18:33:17 crc kubenswrapper[4750]: I1008 18:33:17.192822 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell1-conductor-0"] Oct 08 18:33:18 crc kubenswrapper[4750]: I1008 18:33:18.745589 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af09729b-3284-4dcd-91a1-5763d28daaf5" path="/var/lib/kubelet/pods/af09729b-3284-4dcd-91a1-5763d28daaf5/volumes" Oct 08 18:33:20 crc kubenswrapper[4750]: E1008 18:33:20.481913 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of afb689133a45aeb0be19ee70c244153e70840e64d2759842b263ebef6d8b33b2 is running failed: container process not found" containerID="afb689133a45aeb0be19ee70c244153e70840e64d2759842b263ebef6d8b33b2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 18:33:20 crc kubenswrapper[4750]: E1008 18:33:20.482597 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6320fe2c14d57de0be701a1d75ea1e50bc53072f92cd6178aeb5a15ddadb3e94" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 18:33:20 crc kubenswrapper[4750]: E1008 18:33:20.483046 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of afb689133a45aeb0be19ee70c244153e70840e64d2759842b263ebef6d8b33b2 is running failed: container process not found" containerID="afb689133a45aeb0be19ee70c244153e70840e64d2759842b263ebef6d8b33b2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 18:33:20 crc kubenswrapper[4750]: E1008 18:33:20.483281 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of afb689133a45aeb0be19ee70c244153e70840e64d2759842b263ebef6d8b33b2 is running failed: container process not found" 
containerID="afb689133a45aeb0be19ee70c244153e70840e64d2759842b263ebef6d8b33b2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 18:33:20 crc kubenswrapper[4750]: E1008 18:33:20.483332 4750 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of afb689133a45aeb0be19ee70c244153e70840e64d2759842b263ebef6d8b33b2 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-57vgx" podUID="e6709646-0141-474b-b73f-6f451e77f602" containerName="ovsdb-server" Oct 08 18:33:20 crc kubenswrapper[4750]: E1008 18:33:20.483812 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6320fe2c14d57de0be701a1d75ea1e50bc53072f92cd6178aeb5a15ddadb3e94" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 18:33:20 crc kubenswrapper[4750]: E1008 18:33:20.485976 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6320fe2c14d57de0be701a1d75ea1e50bc53072f92cd6178aeb5a15ddadb3e94" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 18:33:20 crc kubenswrapper[4750]: E1008 18:33:20.486009 4750 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-57vgx" podUID="e6709646-0141-474b-b73f-6f451e77f602" containerName="ovs-vswitchd" Oct 08 18:33:25 crc kubenswrapper[4750]: E1008 18:33:25.481720 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
afb689133a45aeb0be19ee70c244153e70840e64d2759842b263ebef6d8b33b2 is running failed: container process not found" containerID="afb689133a45aeb0be19ee70c244153e70840e64d2759842b263ebef6d8b33b2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 18:33:25 crc kubenswrapper[4750]: E1008 18:33:25.482814 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6320fe2c14d57de0be701a1d75ea1e50bc53072f92cd6178aeb5a15ddadb3e94" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 18:33:25 crc kubenswrapper[4750]: E1008 18:33:25.484351 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of afb689133a45aeb0be19ee70c244153e70840e64d2759842b263ebef6d8b33b2 is running failed: container process not found" containerID="afb689133a45aeb0be19ee70c244153e70840e64d2759842b263ebef6d8b33b2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 18:33:25 crc kubenswrapper[4750]: E1008 18:33:25.484660 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of afb689133a45aeb0be19ee70c244153e70840e64d2759842b263ebef6d8b33b2 is running failed: container process not found" containerID="afb689133a45aeb0be19ee70c244153e70840e64d2759842b263ebef6d8b33b2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 18:33:25 crc kubenswrapper[4750]: E1008 18:33:25.484695 4750 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of afb689133a45aeb0be19ee70c244153e70840e64d2759842b263ebef6d8b33b2 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-57vgx" 
podUID="e6709646-0141-474b-b73f-6f451e77f602" containerName="ovsdb-server" Oct 08 18:33:25 crc kubenswrapper[4750]: E1008 18:33:25.485380 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6320fe2c14d57de0be701a1d75ea1e50bc53072f92cd6178aeb5a15ddadb3e94" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 18:33:25 crc kubenswrapper[4750]: E1008 18:33:25.486935 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6320fe2c14d57de0be701a1d75ea1e50bc53072f92cd6178aeb5a15ddadb3e94" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 18:33:25 crc kubenswrapper[4750]: E1008 18:33:25.486976 4750 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-57vgx" podUID="e6709646-0141-474b-b73f-6f451e77f602" containerName="ovs-vswitchd" Oct 08 18:33:29 crc kubenswrapper[4750]: I1008 18:33:29.707239 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 18:33:29 crc kubenswrapper[4750]: I1008 18:33:29.707614 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Oct 08 18:33:29 crc kubenswrapper[4750]: I1008 18:33:29.707661 4750 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-grddb" Oct 08 18:33:29 crc kubenswrapper[4750]: I1008 18:33:29.708281 4750 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dabcc140df6267bd6bfe6e96a507eb6f8fc953553e99e9c93846e177880aa4e8"} pod="openshift-machine-config-operator/machine-config-daemon-grddb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 18:33:29 crc kubenswrapper[4750]: I1008 18:33:29.708339 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" containerID="cri-o://dabcc140df6267bd6bfe6e96a507eb6f8fc953553e99e9c93846e177880aa4e8" gracePeriod=600 Oct 08 18:33:29 crc kubenswrapper[4750]: I1008 18:33:29.806799 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-8d8895f6c-zszml" Oct 08 18:33:29 crc kubenswrapper[4750]: I1008 18:33:29.952307 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7b751e62-8a05-413c-9f82-e9f28230e5ba-config\") pod \"7b751e62-8a05-413c-9f82-e9f28230e5ba\" (UID: \"7b751e62-8a05-413c-9f82-e9f28230e5ba\") " Oct 08 18:33:29 crc kubenswrapper[4750]: I1008 18:33:29.952382 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b751e62-8a05-413c-9f82-e9f28230e5ba-public-tls-certs\") pod \"7b751e62-8a05-413c-9f82-e9f28230e5ba\" (UID: \"7b751e62-8a05-413c-9f82-e9f28230e5ba\") " Oct 08 18:33:29 crc kubenswrapper[4750]: I1008 18:33:29.952411 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b751e62-8a05-413c-9f82-e9f28230e5ba-internal-tls-certs\") pod \"7b751e62-8a05-413c-9f82-e9f28230e5ba\" (UID: \"7b751e62-8a05-413c-9f82-e9f28230e5ba\") " Oct 08 18:33:29 crc kubenswrapper[4750]: I1008 18:33:29.952469 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b751e62-8a05-413c-9f82-e9f28230e5ba-combined-ca-bundle\") pod \"7b751e62-8a05-413c-9f82-e9f28230e5ba\" (UID: \"7b751e62-8a05-413c-9f82-e9f28230e5ba\") " Oct 08 18:33:29 crc kubenswrapper[4750]: I1008 18:33:29.952535 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b751e62-8a05-413c-9f82-e9f28230e5ba-ovndb-tls-certs\") pod \"7b751e62-8a05-413c-9f82-e9f28230e5ba\" (UID: \"7b751e62-8a05-413c-9f82-e9f28230e5ba\") " Oct 08 18:33:29 crc kubenswrapper[4750]: I1008 18:33:29.952602 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" 
(UniqueName: \"kubernetes.io/secret/7b751e62-8a05-413c-9f82-e9f28230e5ba-httpd-config\") pod \"7b751e62-8a05-413c-9f82-e9f28230e5ba\" (UID: \"7b751e62-8a05-413c-9f82-e9f28230e5ba\") " Oct 08 18:33:29 crc kubenswrapper[4750]: I1008 18:33:29.952639 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vm7dn\" (UniqueName: \"kubernetes.io/projected/7b751e62-8a05-413c-9f82-e9f28230e5ba-kube-api-access-vm7dn\") pod \"7b751e62-8a05-413c-9f82-e9f28230e5ba\" (UID: \"7b751e62-8a05-413c-9f82-e9f28230e5ba\") " Oct 08 18:33:29 crc kubenswrapper[4750]: I1008 18:33:29.958417 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b751e62-8a05-413c-9f82-e9f28230e5ba-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "7b751e62-8a05-413c-9f82-e9f28230e5ba" (UID: "7b751e62-8a05-413c-9f82-e9f28230e5ba"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:29 crc kubenswrapper[4750]: I1008 18:33:29.958666 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b751e62-8a05-413c-9f82-e9f28230e5ba-kube-api-access-vm7dn" (OuterVolumeSpecName: "kube-api-access-vm7dn") pod "7b751e62-8a05-413c-9f82-e9f28230e5ba" (UID: "7b751e62-8a05-413c-9f82-e9f28230e5ba"). InnerVolumeSpecName "kube-api-access-vm7dn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:33:29 crc kubenswrapper[4750]: I1008 18:33:29.988914 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b751e62-8a05-413c-9f82-e9f28230e5ba-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7b751e62-8a05-413c-9f82-e9f28230e5ba" (UID: "7b751e62-8a05-413c-9f82-e9f28230e5ba"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:29 crc kubenswrapper[4750]: I1008 18:33:29.993273 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b751e62-8a05-413c-9f82-e9f28230e5ba-config" (OuterVolumeSpecName: "config") pod "7b751e62-8a05-413c-9f82-e9f28230e5ba" (UID: "7b751e62-8a05-413c-9f82-e9f28230e5ba"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:29 crc kubenswrapper[4750]: I1008 18:33:29.997052 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b751e62-8a05-413c-9f82-e9f28230e5ba-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7b751e62-8a05-413c-9f82-e9f28230e5ba" (UID: "7b751e62-8a05-413c-9f82-e9f28230e5ba"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:30 crc kubenswrapper[4750]: I1008 18:33:30.010467 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b751e62-8a05-413c-9f82-e9f28230e5ba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b751e62-8a05-413c-9f82-e9f28230e5ba" (UID: "7b751e62-8a05-413c-9f82-e9f28230e5ba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:30 crc kubenswrapper[4750]: I1008 18:33:30.012487 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b751e62-8a05-413c-9f82-e9f28230e5ba-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "7b751e62-8a05-413c-9f82-e9f28230e5ba" (UID: "7b751e62-8a05-413c-9f82-e9f28230e5ba"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:33:30 crc kubenswrapper[4750]: I1008 18:33:30.023325 4750 generic.go:334] "Generic (PLEG): container finished" podID="7b751e62-8a05-413c-9f82-e9f28230e5ba" containerID="f50e4ae5919552806c39393795c8e3f0588d8c870ffd52952ea7b85a3e4511a9" exitCode=0 Oct 08 18:33:30 crc kubenswrapper[4750]: I1008 18:33:30.023425 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8d8895f6c-zszml" Oct 08 18:33:30 crc kubenswrapper[4750]: I1008 18:33:30.023646 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8d8895f6c-zszml" event={"ID":"7b751e62-8a05-413c-9f82-e9f28230e5ba","Type":"ContainerDied","Data":"f50e4ae5919552806c39393795c8e3f0588d8c870ffd52952ea7b85a3e4511a9"} Oct 08 18:33:30 crc kubenswrapper[4750]: I1008 18:33:30.023689 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8d8895f6c-zszml" event={"ID":"7b751e62-8a05-413c-9f82-e9f28230e5ba","Type":"ContainerDied","Data":"d3aee10f5eb2213bed14729101e10990fa8a38feb2dfe34d59f150028a3cb3ae"} Oct 08 18:33:30 crc kubenswrapper[4750]: I1008 18:33:30.023706 4750 scope.go:117] "RemoveContainer" containerID="c354d8f2f57d9ac9743cc4e26bc6b9e3f99a80b563c28a39f8bf835593a34cc2" Oct 08 18:33:30 crc kubenswrapper[4750]: I1008 18:33:30.028005 4750 generic.go:334] "Generic (PLEG): container finished" podID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerID="dabcc140df6267bd6bfe6e96a507eb6f8fc953553e99e9c93846e177880aa4e8" exitCode=0 Oct 08 18:33:30 crc kubenswrapper[4750]: I1008 18:33:30.028030 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerDied","Data":"dabcc140df6267bd6bfe6e96a507eb6f8fc953553e99e9c93846e177880aa4e8"} Oct 08 18:33:30 crc kubenswrapper[4750]: I1008 18:33:30.045703 4750 scope.go:117] "RemoveContainer" 
containerID="f50e4ae5919552806c39393795c8e3f0588d8c870ffd52952ea7b85a3e4511a9" Oct 08 18:33:30 crc kubenswrapper[4750]: I1008 18:33:30.057760 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b751e62-8a05-413c-9f82-e9f28230e5ba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:30 crc kubenswrapper[4750]: I1008 18:33:30.057791 4750 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b751e62-8a05-413c-9f82-e9f28230e5ba-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:30 crc kubenswrapper[4750]: I1008 18:33:30.057800 4750 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7b751e62-8a05-413c-9f82-e9f28230e5ba-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:30 crc kubenswrapper[4750]: I1008 18:33:30.057813 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vm7dn\" (UniqueName: \"kubernetes.io/projected/7b751e62-8a05-413c-9f82-e9f28230e5ba-kube-api-access-vm7dn\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:30 crc kubenswrapper[4750]: I1008 18:33:30.057826 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/7b751e62-8a05-413c-9f82-e9f28230e5ba-config\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:30 crc kubenswrapper[4750]: I1008 18:33:30.057838 4750 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b751e62-8a05-413c-9f82-e9f28230e5ba-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:30 crc kubenswrapper[4750]: I1008 18:33:30.057848 4750 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b751e62-8a05-413c-9f82-e9f28230e5ba-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:30 crc kubenswrapper[4750]: I1008 
18:33:30.066735 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-8d8895f6c-zszml"] Oct 08 18:33:30 crc kubenswrapper[4750]: I1008 18:33:30.067262 4750 scope.go:117] "RemoveContainer" containerID="c354d8f2f57d9ac9743cc4e26bc6b9e3f99a80b563c28a39f8bf835593a34cc2" Oct 08 18:33:30 crc kubenswrapper[4750]: E1008 18:33:30.067768 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c354d8f2f57d9ac9743cc4e26bc6b9e3f99a80b563c28a39f8bf835593a34cc2\": container with ID starting with c354d8f2f57d9ac9743cc4e26bc6b9e3f99a80b563c28a39f8bf835593a34cc2 not found: ID does not exist" containerID="c354d8f2f57d9ac9743cc4e26bc6b9e3f99a80b563c28a39f8bf835593a34cc2" Oct 08 18:33:30 crc kubenswrapper[4750]: I1008 18:33:30.067815 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c354d8f2f57d9ac9743cc4e26bc6b9e3f99a80b563c28a39f8bf835593a34cc2"} err="failed to get container status \"c354d8f2f57d9ac9743cc4e26bc6b9e3f99a80b563c28a39f8bf835593a34cc2\": rpc error: code = NotFound desc = could not find container \"c354d8f2f57d9ac9743cc4e26bc6b9e3f99a80b563c28a39f8bf835593a34cc2\": container with ID starting with c354d8f2f57d9ac9743cc4e26bc6b9e3f99a80b563c28a39f8bf835593a34cc2 not found: ID does not exist" Oct 08 18:33:30 crc kubenswrapper[4750]: I1008 18:33:30.067846 4750 scope.go:117] "RemoveContainer" containerID="f50e4ae5919552806c39393795c8e3f0588d8c870ffd52952ea7b85a3e4511a9" Oct 08 18:33:30 crc kubenswrapper[4750]: E1008 18:33:30.068222 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f50e4ae5919552806c39393795c8e3f0588d8c870ffd52952ea7b85a3e4511a9\": container with ID starting with f50e4ae5919552806c39393795c8e3f0588d8c870ffd52952ea7b85a3e4511a9 not found: ID does not exist" containerID="f50e4ae5919552806c39393795c8e3f0588d8c870ffd52952ea7b85a3e4511a9" Oct 08 
18:33:30 crc kubenswrapper[4750]: I1008 18:33:30.068251 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f50e4ae5919552806c39393795c8e3f0588d8c870ffd52952ea7b85a3e4511a9"} err="failed to get container status \"f50e4ae5919552806c39393795c8e3f0588d8c870ffd52952ea7b85a3e4511a9\": rpc error: code = NotFound desc = could not find container \"f50e4ae5919552806c39393795c8e3f0588d8c870ffd52952ea7b85a3e4511a9\": container with ID starting with f50e4ae5919552806c39393795c8e3f0588d8c870ffd52952ea7b85a3e4511a9 not found: ID does not exist" Oct 08 18:33:30 crc kubenswrapper[4750]: I1008 18:33:30.068272 4750 scope.go:117] "RemoveContainer" containerID="a27f8518311deef574465704a4c93c21d7cb4e76fec24d95f01a5d7c9febd08d" Oct 08 18:33:30 crc kubenswrapper[4750]: I1008 18:33:30.069901 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-8d8895f6c-zszml"] Oct 08 18:33:30 crc kubenswrapper[4750]: E1008 18:33:30.482225 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of afb689133a45aeb0be19ee70c244153e70840e64d2759842b263ebef6d8b33b2 is running failed: container process not found" containerID="afb689133a45aeb0be19ee70c244153e70840e64d2759842b263ebef6d8b33b2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 18:33:30 crc kubenswrapper[4750]: E1008 18:33:30.483416 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of afb689133a45aeb0be19ee70c244153e70840e64d2759842b263ebef6d8b33b2 is running failed: container process not found" containerID="afb689133a45aeb0be19ee70c244153e70840e64d2759842b263ebef6d8b33b2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 18:33:30 crc kubenswrapper[4750]: E1008 18:33:30.483879 4750 log.go:32] "ExecSync cmd from runtime service failed" 
err="rpc error: code = NotFound desc = container is not created or running: checking if PID of afb689133a45aeb0be19ee70c244153e70840e64d2759842b263ebef6d8b33b2 is running failed: container process not found" containerID="afb689133a45aeb0be19ee70c244153e70840e64d2759842b263ebef6d8b33b2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 18:33:30 crc kubenswrapper[4750]: E1008 18:33:30.484030 4750 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of afb689133a45aeb0be19ee70c244153e70840e64d2759842b263ebef6d8b33b2 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-57vgx" podUID="e6709646-0141-474b-b73f-6f451e77f602" containerName="ovsdb-server" Oct 08 18:33:30 crc kubenswrapper[4750]: E1008 18:33:30.485858 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6320fe2c14d57de0be701a1d75ea1e50bc53072f92cd6178aeb5a15ddadb3e94" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 18:33:30 crc kubenswrapper[4750]: E1008 18:33:30.487923 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6320fe2c14d57de0be701a1d75ea1e50bc53072f92cd6178aeb5a15ddadb3e94" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 18:33:30 crc kubenswrapper[4750]: E1008 18:33:30.489810 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6320fe2c14d57de0be701a1d75ea1e50bc53072f92cd6178aeb5a15ddadb3e94" 
cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 18:33:30 crc kubenswrapper[4750]: E1008 18:33:30.489931 4750 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-57vgx" podUID="e6709646-0141-474b-b73f-6f451e77f602" containerName="ovs-vswitchd" Oct 08 18:33:30 crc kubenswrapper[4750]: I1008 18:33:30.747360 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b751e62-8a05-413c-9f82-e9f28230e5ba" path="/var/lib/kubelet/pods/7b751e62-8a05-413c-9f82-e9f28230e5ba/volumes" Oct 08 18:33:31 crc kubenswrapper[4750]: I1008 18:33:31.039040 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerStarted","Data":"aaa1f85033e435ca45d798ce82eaed87393be7930e3bf3fb0fa59144d843aac6"} Oct 08 18:33:35 crc kubenswrapper[4750]: E1008 18:33:35.481949 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of afb689133a45aeb0be19ee70c244153e70840e64d2759842b263ebef6d8b33b2 is running failed: container process not found" containerID="afb689133a45aeb0be19ee70c244153e70840e64d2759842b263ebef6d8b33b2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 18:33:35 crc kubenswrapper[4750]: E1008 18:33:35.482806 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of afb689133a45aeb0be19ee70c244153e70840e64d2759842b263ebef6d8b33b2 is running failed: container process not found" containerID="afb689133a45aeb0be19ee70c244153e70840e64d2759842b263ebef6d8b33b2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 18:33:35 crc 
kubenswrapper[4750]: E1008 18:33:35.483095 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of afb689133a45aeb0be19ee70c244153e70840e64d2759842b263ebef6d8b33b2 is running failed: container process not found" containerID="afb689133a45aeb0be19ee70c244153e70840e64d2759842b263ebef6d8b33b2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 18:33:35 crc kubenswrapper[4750]: E1008 18:33:35.483145 4750 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of afb689133a45aeb0be19ee70c244153e70840e64d2759842b263ebef6d8b33b2 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-57vgx" podUID="e6709646-0141-474b-b73f-6f451e77f602" containerName="ovsdb-server" Oct 08 18:33:35 crc kubenswrapper[4750]: E1008 18:33:35.483234 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6320fe2c14d57de0be701a1d75ea1e50bc53072f92cd6178aeb5a15ddadb3e94" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 18:33:35 crc kubenswrapper[4750]: E1008 18:33:35.485508 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6320fe2c14d57de0be701a1d75ea1e50bc53072f92cd6178aeb5a15ddadb3e94" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 18:33:35 crc kubenswrapper[4750]: E1008 18:33:35.487794 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="6320fe2c14d57de0be701a1d75ea1e50bc53072f92cd6178aeb5a15ddadb3e94" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 18:33:35 crc kubenswrapper[4750]: E1008 18:33:35.487861 4750 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-57vgx" podUID="e6709646-0141-474b-b73f-6f451e77f602" containerName="ovs-vswitchd" Oct 08 18:33:36 crc kubenswrapper[4750]: I1008 18:33:36.097840 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-57vgx_e6709646-0141-474b-b73f-6f451e77f602/ovs-vswitchd/0.log" Oct 08 18:33:36 crc kubenswrapper[4750]: I1008 18:33:36.099161 4750 generic.go:334] "Generic (PLEG): container finished" podID="e6709646-0141-474b-b73f-6f451e77f602" containerID="6320fe2c14d57de0be701a1d75ea1e50bc53072f92cd6178aeb5a15ddadb3e94" exitCode=137 Oct 08 18:33:36 crc kubenswrapper[4750]: I1008 18:33:36.099205 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-57vgx" event={"ID":"e6709646-0141-474b-b73f-6f451e77f602","Type":"ContainerDied","Data":"6320fe2c14d57de0be701a1d75ea1e50bc53072f92cd6178aeb5a15ddadb3e94"} Oct 08 18:33:36 crc kubenswrapper[4750]: I1008 18:33:36.368804 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-57vgx_e6709646-0141-474b-b73f-6f451e77f602/ovs-vswitchd/0.log" Oct 08 18:33:36 crc kubenswrapper[4750]: I1008 18:33:36.369691 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-57vgx" Oct 08 18:33:36 crc kubenswrapper[4750]: I1008 18:33:36.486670 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/e6709646-0141-474b-b73f-6f451e77f602-var-log\") pod \"e6709646-0141-474b-b73f-6f451e77f602\" (UID: \"e6709646-0141-474b-b73f-6f451e77f602\") " Oct 08 18:33:36 crc kubenswrapper[4750]: I1008 18:33:36.486743 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvpf8\" (UniqueName: \"kubernetes.io/projected/e6709646-0141-474b-b73f-6f451e77f602-kube-api-access-zvpf8\") pod \"e6709646-0141-474b-b73f-6f451e77f602\" (UID: \"e6709646-0141-474b-b73f-6f451e77f602\") " Oct 08 18:33:36 crc kubenswrapper[4750]: I1008 18:33:36.486767 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e6709646-0141-474b-b73f-6f451e77f602-var-run\") pod \"e6709646-0141-474b-b73f-6f451e77f602\" (UID: \"e6709646-0141-474b-b73f-6f451e77f602\") " Oct 08 18:33:36 crc kubenswrapper[4750]: I1008 18:33:36.486840 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/e6709646-0141-474b-b73f-6f451e77f602-etc-ovs\") pod \"e6709646-0141-474b-b73f-6f451e77f602\" (UID: \"e6709646-0141-474b-b73f-6f451e77f602\") " Oct 08 18:33:36 crc kubenswrapper[4750]: I1008 18:33:36.486892 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/e6709646-0141-474b-b73f-6f451e77f602-var-lib\") pod \"e6709646-0141-474b-b73f-6f451e77f602\" (UID: \"e6709646-0141-474b-b73f-6f451e77f602\") " Oct 08 18:33:36 crc kubenswrapper[4750]: I1008 18:33:36.486967 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/e6709646-0141-474b-b73f-6f451e77f602-scripts\") pod \"e6709646-0141-474b-b73f-6f451e77f602\" (UID: \"e6709646-0141-474b-b73f-6f451e77f602\") " Oct 08 18:33:36 crc kubenswrapper[4750]: I1008 18:33:36.487999 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e6709646-0141-474b-b73f-6f451e77f602-var-run" (OuterVolumeSpecName: "var-run") pod "e6709646-0141-474b-b73f-6f451e77f602" (UID: "e6709646-0141-474b-b73f-6f451e77f602"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 18:33:36 crc kubenswrapper[4750]: I1008 18:33:36.488118 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e6709646-0141-474b-b73f-6f451e77f602-var-log" (OuterVolumeSpecName: "var-log") pod "e6709646-0141-474b-b73f-6f451e77f602" (UID: "e6709646-0141-474b-b73f-6f451e77f602"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 18:33:36 crc kubenswrapper[4750]: I1008 18:33:36.488589 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6709646-0141-474b-b73f-6f451e77f602-scripts" (OuterVolumeSpecName: "scripts") pod "e6709646-0141-474b-b73f-6f451e77f602" (UID: "e6709646-0141-474b-b73f-6f451e77f602"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:33:36 crc kubenswrapper[4750]: I1008 18:33:36.488696 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e6709646-0141-474b-b73f-6f451e77f602-var-lib" (OuterVolumeSpecName: "var-lib") pod "e6709646-0141-474b-b73f-6f451e77f602" (UID: "e6709646-0141-474b-b73f-6f451e77f602"). InnerVolumeSpecName "var-lib". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 18:33:36 crc kubenswrapper[4750]: I1008 18:33:36.488800 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e6709646-0141-474b-b73f-6f451e77f602-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "e6709646-0141-474b-b73f-6f451e77f602" (UID: "e6709646-0141-474b-b73f-6f451e77f602"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 18:33:36 crc kubenswrapper[4750]: I1008 18:33:36.494776 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6709646-0141-474b-b73f-6f451e77f602-kube-api-access-zvpf8" (OuterVolumeSpecName: "kube-api-access-zvpf8") pod "e6709646-0141-474b-b73f-6f451e77f602" (UID: "e6709646-0141-474b-b73f-6f451e77f602"). InnerVolumeSpecName "kube-api-access-zvpf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:33:36 crc kubenswrapper[4750]: I1008 18:33:36.589783 4750 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/e6709646-0141-474b-b73f-6f451e77f602-var-log\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:36 crc kubenswrapper[4750]: I1008 18:33:36.589822 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvpf8\" (UniqueName: \"kubernetes.io/projected/e6709646-0141-474b-b73f-6f451e77f602-kube-api-access-zvpf8\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:36 crc kubenswrapper[4750]: I1008 18:33:36.589836 4750 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e6709646-0141-474b-b73f-6f451e77f602-var-run\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:36 crc kubenswrapper[4750]: I1008 18:33:36.589847 4750 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/e6709646-0141-474b-b73f-6f451e77f602-etc-ovs\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:36 
crc kubenswrapper[4750]: I1008 18:33:36.589857 4750 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/e6709646-0141-474b-b73f-6f451e77f602-var-lib\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:36 crc kubenswrapper[4750]: I1008 18:33:36.589868 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e6709646-0141-474b-b73f-6f451e77f602-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:37 crc kubenswrapper[4750]: I1008 18:33:37.113060 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-57vgx_e6709646-0141-474b-b73f-6f451e77f602/ovs-vswitchd/0.log" Oct 08 18:33:37 crc kubenswrapper[4750]: I1008 18:33:37.114332 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-57vgx" event={"ID":"e6709646-0141-474b-b73f-6f451e77f602","Type":"ContainerDied","Data":"f34e50a0f4606db14bbae944ca5a2cd7ca2937bb8ebda711d0061b299aca8ac6"} Oct 08 18:33:37 crc kubenswrapper[4750]: I1008 18:33:37.114408 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-57vgx" Oct 08 18:33:37 crc kubenswrapper[4750]: I1008 18:33:37.114404 4750 scope.go:117] "RemoveContainer" containerID="6320fe2c14d57de0be701a1d75ea1e50bc53072f92cd6178aeb5a15ddadb3e94" Oct 08 18:33:37 crc kubenswrapper[4750]: I1008 18:33:37.124205 4750 generic.go:334] "Generic (PLEG): container finished" podID="be62333a-e650-4131-b0f1-c8c484539c7e" containerID="427cf714364cd21ce5dde409e29dd3aa65f33832204dfbf8ce289255b5e834c0" exitCode=137 Oct 08 18:33:37 crc kubenswrapper[4750]: I1008 18:33:37.124257 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be62333a-e650-4131-b0f1-c8c484539c7e","Type":"ContainerDied","Data":"427cf714364cd21ce5dde409e29dd3aa65f33832204dfbf8ce289255b5e834c0"} Oct 08 18:33:37 crc kubenswrapper[4750]: I1008 18:33:37.169017 4750 scope.go:117] "RemoveContainer" containerID="afb689133a45aeb0be19ee70c244153e70840e64d2759842b263ebef6d8b33b2" Oct 08 18:33:37 crc kubenswrapper[4750]: I1008 18:33:37.174512 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-57vgx"] Oct 08 18:33:37 crc kubenswrapper[4750]: I1008 18:33:37.183714 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-57vgx"] Oct 08 18:33:37 crc kubenswrapper[4750]: I1008 18:33:37.195804 4750 scope.go:117] "RemoveContainer" containerID="3fb6163d60b7b44978ac474d176e064c6f7849a97dbf8bf4f4109430670dad8a" Oct 08 18:33:37 crc kubenswrapper[4750]: I1008 18:33:37.549489 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 08 18:33:37 crc kubenswrapper[4750]: I1008 18:33:37.705474 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/be62333a-e650-4131-b0f1-c8c484539c7e-etc-swift\") pod \"be62333a-e650-4131-b0f1-c8c484539c7e\" (UID: \"be62333a-e650-4131-b0f1-c8c484539c7e\") " Oct 08 18:33:37 crc kubenswrapper[4750]: I1008 18:33:37.705537 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/be62333a-e650-4131-b0f1-c8c484539c7e-lock\") pod \"be62333a-e650-4131-b0f1-c8c484539c7e\" (UID: \"be62333a-e650-4131-b0f1-c8c484539c7e\") " Oct 08 18:33:37 crc kubenswrapper[4750]: I1008 18:33:37.705701 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/be62333a-e650-4131-b0f1-c8c484539c7e-cache\") pod \"be62333a-e650-4131-b0f1-c8c484539c7e\" (UID: \"be62333a-e650-4131-b0f1-c8c484539c7e\") " Oct 08 18:33:37 crc kubenswrapper[4750]: I1008 18:33:37.705718 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"be62333a-e650-4131-b0f1-c8c484539c7e\" (UID: \"be62333a-e650-4131-b0f1-c8c484539c7e\") " Oct 08 18:33:37 crc kubenswrapper[4750]: I1008 18:33:37.705738 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkl8g\" (UniqueName: \"kubernetes.io/projected/be62333a-e650-4131-b0f1-c8c484539c7e-kube-api-access-kkl8g\") pod \"be62333a-e650-4131-b0f1-c8c484539c7e\" (UID: \"be62333a-e650-4131-b0f1-c8c484539c7e\") " Oct 08 18:33:37 crc kubenswrapper[4750]: I1008 18:33:37.706168 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be62333a-e650-4131-b0f1-c8c484539c7e-cache" (OuterVolumeSpecName: "cache") pod 
"be62333a-e650-4131-b0f1-c8c484539c7e" (UID: "be62333a-e650-4131-b0f1-c8c484539c7e"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:33:37 crc kubenswrapper[4750]: I1008 18:33:37.706616 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be62333a-e650-4131-b0f1-c8c484539c7e-lock" (OuterVolumeSpecName: "lock") pod "be62333a-e650-4131-b0f1-c8c484539c7e" (UID: "be62333a-e650-4131-b0f1-c8c484539c7e"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:33:37 crc kubenswrapper[4750]: I1008 18:33:37.709914 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be62333a-e650-4131-b0f1-c8c484539c7e-kube-api-access-kkl8g" (OuterVolumeSpecName: "kube-api-access-kkl8g") pod "be62333a-e650-4131-b0f1-c8c484539c7e" (UID: "be62333a-e650-4131-b0f1-c8c484539c7e"). InnerVolumeSpecName "kube-api-access-kkl8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:33:37 crc kubenswrapper[4750]: I1008 18:33:37.710436 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "swift") pod "be62333a-e650-4131-b0f1-c8c484539c7e" (UID: "be62333a-e650-4131-b0f1-c8c484539c7e"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 18:33:37 crc kubenswrapper[4750]: I1008 18:33:37.711232 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be62333a-e650-4131-b0f1-c8c484539c7e-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "be62333a-e650-4131-b0f1-c8c484539c7e" (UID: "be62333a-e650-4131-b0f1-c8c484539c7e"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:33:37 crc kubenswrapper[4750]: I1008 18:33:37.807410 4750 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/be62333a-e650-4131-b0f1-c8c484539c7e-cache\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:37 crc kubenswrapper[4750]: I1008 18:33:37.807470 4750 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Oct 08 18:33:37 crc kubenswrapper[4750]: I1008 18:33:37.807483 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkl8g\" (UniqueName: \"kubernetes.io/projected/be62333a-e650-4131-b0f1-c8c484539c7e-kube-api-access-kkl8g\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:37 crc kubenswrapper[4750]: I1008 18:33:37.807494 4750 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/be62333a-e650-4131-b0f1-c8c484539c7e-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:37 crc kubenswrapper[4750]: I1008 18:33:37.807503 4750 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/be62333a-e650-4131-b0f1-c8c484539c7e-lock\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:37 crc kubenswrapper[4750]: I1008 18:33:37.821222 4750 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Oct 08 18:33:37 crc kubenswrapper[4750]: I1008 18:33:37.908630 4750 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:37 crc kubenswrapper[4750]: I1008 18:33:37.994464 4750 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" 
cgroupName=["kubepods","besteffort","pod7c3207de-78eb-41b2-a2be-163c9a3532af"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod7c3207de-78eb-41b2-a2be-163c9a3532af] : Timed out while waiting for systemd to remove kubepods-besteffort-pod7c3207de_78eb_41b2_a2be_163c9a3532af.slice" Oct 08 18:33:38 crc kubenswrapper[4750]: I1008 18:33:38.150245 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be62333a-e650-4131-b0f1-c8c484539c7e","Type":"ContainerDied","Data":"7c788d277c81526723ebafd7ec3acb36bcb8389596e6fee1747442ab8452d06a"} Oct 08 18:33:38 crc kubenswrapper[4750]: I1008 18:33:38.150328 4750 scope.go:117] "RemoveContainer" containerID="427cf714364cd21ce5dde409e29dd3aa65f33832204dfbf8ce289255b5e834c0" Oct 08 18:33:38 crc kubenswrapper[4750]: I1008 18:33:38.150417 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 08 18:33:38 crc kubenswrapper[4750]: I1008 18:33:38.190686 4750 scope.go:117] "RemoveContainer" containerID="d2840db45a1d2b1e671a91bd14ec16767e2beb441096549c8e394b831dc54350" Oct 08 18:33:38 crc kubenswrapper[4750]: I1008 18:33:38.196210 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Oct 08 18:33:38 crc kubenswrapper[4750]: I1008 18:33:38.211320 4750 scope.go:117] "RemoveContainer" containerID="3094942c3642c799d4db13b77a9964692bca3d841f52ece0573854c311f0f401" Oct 08 18:33:38 crc kubenswrapper[4750]: I1008 18:33:38.211375 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Oct 08 18:33:38 crc kubenswrapper[4750]: I1008 18:33:38.244290 4750 scope.go:117] "RemoveContainer" containerID="3013b58e1d5ba309156a64f8003ce218400fb7a18be0cb766461f307776c2a76" Oct 08 18:33:38 crc kubenswrapper[4750]: I1008 18:33:38.272215 4750 scope.go:117] "RemoveContainer" containerID="1d96354096952d01c21e9fe7d3f0bcc4a2f331e8ac84cd83d2b48f90d58cc965" Oct 08 18:33:38 crc 
kubenswrapper[4750]: I1008 18:33:38.303331 4750 scope.go:117] "RemoveContainer" containerID="b8997ccfc894343c6e131268824155df3c9f2e6d480b351d3f97c9aa1cd88c3f" Oct 08 18:33:38 crc kubenswrapper[4750]: I1008 18:33:38.330247 4750 scope.go:117] "RemoveContainer" containerID="26c529adb950717fbb00bdeba5f0a5ebfb00cbe7017189d59c51fb3fb5a903e5" Oct 08 18:33:38 crc kubenswrapper[4750]: I1008 18:33:38.361399 4750 scope.go:117] "RemoveContainer" containerID="1a4ea5b2811f4515b0049bf0da3ae03b49c47b62fb6d2cddab227c8e93827963" Oct 08 18:33:38 crc kubenswrapper[4750]: I1008 18:33:38.389368 4750 scope.go:117] "RemoveContainer" containerID="c068b85bb1ced1a9426fda1897141d98518bc162a709c23d2054cd4d7cce8209" Oct 08 18:33:38 crc kubenswrapper[4750]: I1008 18:33:38.420509 4750 scope.go:117] "RemoveContainer" containerID="ec560a5ed96d75041eddd3e8cea8a27ed5d0b038772f918091c1f3453402ac4f" Oct 08 18:33:38 crc kubenswrapper[4750]: I1008 18:33:38.452538 4750 scope.go:117] "RemoveContainer" containerID="0daa177c260aa7c024a5646c1d578e111a41bb913177c6089099d8076dcc1b13" Oct 08 18:33:38 crc kubenswrapper[4750]: I1008 18:33:38.482447 4750 scope.go:117] "RemoveContainer" containerID="f20129167bf9bc7eee86dd0ea9d1df44c416086aba87c4cfefb4ac0bb12dca9c" Oct 08 18:33:38 crc kubenswrapper[4750]: I1008 18:33:38.511686 4750 scope.go:117] "RemoveContainer" containerID="52f4f1b7fcee564a61f7c59907f9f37ce855f6b9ccb07d9b5d4dea7ac7e8c754" Oct 08 18:33:38 crc kubenswrapper[4750]: I1008 18:33:38.533442 4750 scope.go:117] "RemoveContainer" containerID="db381222476cb51a6b70b126857ee5e5eff04de66df2645f6737aa6584b15305" Oct 08 18:33:38 crc kubenswrapper[4750]: I1008 18:33:38.565678 4750 scope.go:117] "RemoveContainer" containerID="d6cdce1fc3c750a32568c75f5f41fd036fc0c6d012614ac48dd3b89a28bddb19" Oct 08 18:33:38 crc kubenswrapper[4750]: I1008 18:33:38.744328 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" 
path="/var/lib/kubelet/pods/be62333a-e650-4131-b0f1-c8c484539c7e/volumes" Oct 08 18:33:38 crc kubenswrapper[4750]: I1008 18:33:38.746233 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6709646-0141-474b-b73f-6f451e77f602" path="/var/lib/kubelet/pods/e6709646-0141-474b-b73f-6f451e77f602/volumes" Oct 08 18:33:41 crc kubenswrapper[4750]: I1008 18:33:41.690342 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron0c01-account-delete-rfw7g" Oct 08 18:33:41 crc kubenswrapper[4750]: I1008 18:33:41.786502 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder7a82-account-delete-rvqnc" Oct 08 18:33:41 crc kubenswrapper[4750]: I1008 18:33:41.865262 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85hr9\" (UniqueName: \"kubernetes.io/projected/af33a9f0-c575-46f6-a3cd-71391d454430-kube-api-access-85hr9\") pod \"af33a9f0-c575-46f6-a3cd-71391d454430\" (UID: \"af33a9f0-c575-46f6-a3cd-71391d454430\") " Oct 08 18:33:41 crc kubenswrapper[4750]: I1008 18:33:41.872805 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af33a9f0-c575-46f6-a3cd-71391d454430-kube-api-access-85hr9" (OuterVolumeSpecName: "kube-api-access-85hr9") pod "af33a9f0-c575-46f6-a3cd-71391d454430" (UID: "af33a9f0-c575-46f6-a3cd-71391d454430"). InnerVolumeSpecName "kube-api-access-85hr9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:33:41 crc kubenswrapper[4750]: I1008 18:33:41.966426 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-658ll\" (UniqueName: \"kubernetes.io/projected/04d7f724-d53d-4412-bce7-cc6da81e45ac-kube-api-access-658ll\") pod \"04d7f724-d53d-4412-bce7-cc6da81e45ac\" (UID: \"04d7f724-d53d-4412-bce7-cc6da81e45ac\") " Oct 08 18:33:41 crc kubenswrapper[4750]: I1008 18:33:41.966746 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85hr9\" (UniqueName: \"kubernetes.io/projected/af33a9f0-c575-46f6-a3cd-71391d454430-kube-api-access-85hr9\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:41 crc kubenswrapper[4750]: I1008 18:33:41.969251 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04d7f724-d53d-4412-bce7-cc6da81e45ac-kube-api-access-658ll" (OuterVolumeSpecName: "kube-api-access-658ll") pod "04d7f724-d53d-4412-bce7-cc6da81e45ac" (UID: "04d7f724-d53d-4412-bce7-cc6da81e45ac"). InnerVolumeSpecName "kube-api-access-658ll". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:33:42 crc kubenswrapper[4750]: I1008 18:33:42.067866 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-658ll\" (UniqueName: \"kubernetes.io/projected/04d7f724-d53d-4412-bce7-cc6da81e45ac-kube-api-access-658ll\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:42 crc kubenswrapper[4750]: I1008 18:33:42.198924 4750 generic.go:334] "Generic (PLEG): container finished" podID="04d7f724-d53d-4412-bce7-cc6da81e45ac" containerID="1f71946bf8bd0ed4f89225458c6b580b690387fad8980aaefcb0761c493ad1bc" exitCode=137 Oct 08 18:33:42 crc kubenswrapper[4750]: I1008 18:33:42.198976 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder7a82-account-delete-rvqnc" Oct 08 18:33:42 crc kubenswrapper[4750]: I1008 18:33:42.198965 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder7a82-account-delete-rvqnc" event={"ID":"04d7f724-d53d-4412-bce7-cc6da81e45ac","Type":"ContainerDied","Data":"1f71946bf8bd0ed4f89225458c6b580b690387fad8980aaefcb0761c493ad1bc"} Oct 08 18:33:42 crc kubenswrapper[4750]: I1008 18:33:42.199112 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder7a82-account-delete-rvqnc" event={"ID":"04d7f724-d53d-4412-bce7-cc6da81e45ac","Type":"ContainerDied","Data":"f1fe6211f671a872c18c03a7ae507f2373dd61e4bc88fa79ba64d6506d82fd0e"} Oct 08 18:33:42 crc kubenswrapper[4750]: I1008 18:33:42.199141 4750 scope.go:117] "RemoveContainer" containerID="1f71946bf8bd0ed4f89225458c6b580b690387fad8980aaefcb0761c493ad1bc" Oct 08 18:33:42 crc kubenswrapper[4750]: I1008 18:33:42.201162 4750 generic.go:334] "Generic (PLEG): container finished" podID="af33a9f0-c575-46f6-a3cd-71391d454430" containerID="8709beaed4cb4cd5b7b1692fb0a59cc21d5d400ea8ee35f3d252f23f18bde1d6" exitCode=137 Oct 08 18:33:42 crc kubenswrapper[4750]: I1008 18:33:42.201224 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron0c01-account-delete-rfw7g" event={"ID":"af33a9f0-c575-46f6-a3cd-71391d454430","Type":"ContainerDied","Data":"8709beaed4cb4cd5b7b1692fb0a59cc21d5d400ea8ee35f3d252f23f18bde1d6"} Oct 08 18:33:42 crc kubenswrapper[4750]: I1008 18:33:42.201254 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron0c01-account-delete-rfw7g" event={"ID":"af33a9f0-c575-46f6-a3cd-71391d454430","Type":"ContainerDied","Data":"29cf139a21eea17f2e0c5a38a10b09b0a29ebbd85ea7b5109ef103d045beb659"} Oct 08 18:33:42 crc kubenswrapper[4750]: I1008 18:33:42.201255 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron0c01-account-delete-rfw7g" Oct 08 18:33:42 crc kubenswrapper[4750]: I1008 18:33:42.226497 4750 scope.go:117] "RemoveContainer" containerID="1f71946bf8bd0ed4f89225458c6b580b690387fad8980aaefcb0761c493ad1bc" Oct 08 18:33:42 crc kubenswrapper[4750]: E1008 18:33:42.227010 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f71946bf8bd0ed4f89225458c6b580b690387fad8980aaefcb0761c493ad1bc\": container with ID starting with 1f71946bf8bd0ed4f89225458c6b580b690387fad8980aaefcb0761c493ad1bc not found: ID does not exist" containerID="1f71946bf8bd0ed4f89225458c6b580b690387fad8980aaefcb0761c493ad1bc" Oct 08 18:33:42 crc kubenswrapper[4750]: I1008 18:33:42.227050 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f71946bf8bd0ed4f89225458c6b580b690387fad8980aaefcb0761c493ad1bc"} err="failed to get container status \"1f71946bf8bd0ed4f89225458c6b580b690387fad8980aaefcb0761c493ad1bc\": rpc error: code = NotFound desc = could not find container \"1f71946bf8bd0ed4f89225458c6b580b690387fad8980aaefcb0761c493ad1bc\": container with ID starting with 1f71946bf8bd0ed4f89225458c6b580b690387fad8980aaefcb0761c493ad1bc not found: ID does not exist" Oct 08 18:33:42 crc kubenswrapper[4750]: I1008 18:33:42.227073 4750 scope.go:117] "RemoveContainer" containerID="8709beaed4cb4cd5b7b1692fb0a59cc21d5d400ea8ee35f3d252f23f18bde1d6" Oct 08 18:33:42 crc kubenswrapper[4750]: I1008 18:33:42.229085 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder7a82-account-delete-rvqnc"] Oct 08 18:33:42 crc kubenswrapper[4750]: I1008 18:33:42.237457 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder7a82-account-delete-rvqnc"] Oct 08 18:33:42 crc kubenswrapper[4750]: I1008 18:33:42.241896 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron0c01-account-delete-rfw7g"] Oct 08 
18:33:42 crc kubenswrapper[4750]: I1008 18:33:42.246844 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron0c01-account-delete-rfw7g"] Oct 08 18:33:42 crc kubenswrapper[4750]: I1008 18:33:42.248401 4750 scope.go:117] "RemoveContainer" containerID="8709beaed4cb4cd5b7b1692fb0a59cc21d5d400ea8ee35f3d252f23f18bde1d6" Oct 08 18:33:42 crc kubenswrapper[4750]: E1008 18:33:42.248878 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8709beaed4cb4cd5b7b1692fb0a59cc21d5d400ea8ee35f3d252f23f18bde1d6\": container with ID starting with 8709beaed4cb4cd5b7b1692fb0a59cc21d5d400ea8ee35f3d252f23f18bde1d6 not found: ID does not exist" containerID="8709beaed4cb4cd5b7b1692fb0a59cc21d5d400ea8ee35f3d252f23f18bde1d6" Oct 08 18:33:42 crc kubenswrapper[4750]: I1008 18:33:42.248915 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8709beaed4cb4cd5b7b1692fb0a59cc21d5d400ea8ee35f3d252f23f18bde1d6"} err="failed to get container status \"8709beaed4cb4cd5b7b1692fb0a59cc21d5d400ea8ee35f3d252f23f18bde1d6\": rpc error: code = NotFound desc = could not find container \"8709beaed4cb4cd5b7b1692fb0a59cc21d5d400ea8ee35f3d252f23f18bde1d6\": container with ID starting with 8709beaed4cb4cd5b7b1692fb0a59cc21d5d400ea8ee35f3d252f23f18bde1d6 not found: ID does not exist" Oct 08 18:33:42 crc kubenswrapper[4750]: I1008 18:33:42.746102 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04d7f724-d53d-4412-bce7-cc6da81e45ac" path="/var/lib/kubelet/pods/04d7f724-d53d-4412-bce7-cc6da81e45ac/volumes" Oct 08 18:33:42 crc kubenswrapper[4750]: I1008 18:33:42.747158 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af33a9f0-c575-46f6-a3cd-71391d454430" path="/var/lib/kubelet/pods/af33a9f0-c575-46f6-a3cd-71391d454430/volumes" Oct 08 18:33:43 crc kubenswrapper[4750]: I1008 18:33:43.125136 4750 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/placement9f1c-account-delete-wx96m" Oct 08 18:33:43 crc kubenswrapper[4750]: I1008 18:33:43.132193 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican5ee7-account-delete-nlrfw" Oct 08 18:33:43 crc kubenswrapper[4750]: I1008 18:33:43.152481 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapid8db-account-delete-7phjk" Oct 08 18:33:43 crc kubenswrapper[4750]: I1008 18:33:43.209752 4750 generic.go:334] "Generic (PLEG): container finished" podID="c751481b-5934-4262-84ff-106498a453e0" containerID="f1f08d4665e4ec13fe674259dfd46b54f449fab774a681cc30e1e5ab9bf6d093" exitCode=137 Oct 08 18:33:43 crc kubenswrapper[4750]: I1008 18:33:43.209822 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican5ee7-account-delete-nlrfw" Oct 08 18:33:43 crc kubenswrapper[4750]: I1008 18:33:43.209827 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican5ee7-account-delete-nlrfw" event={"ID":"c751481b-5934-4262-84ff-106498a453e0","Type":"ContainerDied","Data":"f1f08d4665e4ec13fe674259dfd46b54f449fab774a681cc30e1e5ab9bf6d093"} Oct 08 18:33:43 crc kubenswrapper[4750]: I1008 18:33:43.210113 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican5ee7-account-delete-nlrfw" event={"ID":"c751481b-5934-4262-84ff-106498a453e0","Type":"ContainerDied","Data":"baf55c6163e79ccd097fdd653a613bee7d66e76de7dce406de8234909b311774"} Oct 08 18:33:43 crc kubenswrapper[4750]: I1008 18:33:43.210191 4750 scope.go:117] "RemoveContainer" containerID="f1f08d4665e4ec13fe674259dfd46b54f449fab774a681cc30e1e5ab9bf6d093" Oct 08 18:33:43 crc kubenswrapper[4750]: I1008 18:33:43.212256 4750 generic.go:334] "Generic (PLEG): container finished" podID="736a9e30-a3de-4e9f-9de7-52015e55443e" containerID="dd3de844148f7944bc7fa8bfcc172b69d40730dfc54b697d389437312184ac0f" 
exitCode=137 Oct 08 18:33:43 crc kubenswrapper[4750]: I1008 18:33:43.212289 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapid8db-account-delete-7phjk" Oct 08 18:33:43 crc kubenswrapper[4750]: I1008 18:33:43.212329 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapid8db-account-delete-7phjk" event={"ID":"736a9e30-a3de-4e9f-9de7-52015e55443e","Type":"ContainerDied","Data":"dd3de844148f7944bc7fa8bfcc172b69d40730dfc54b697d389437312184ac0f"} Oct 08 18:33:43 crc kubenswrapper[4750]: I1008 18:33:43.212356 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapid8db-account-delete-7phjk" event={"ID":"736a9e30-a3de-4e9f-9de7-52015e55443e","Type":"ContainerDied","Data":"22627f58aaa5032d1b5fab055a1b6d9fe1303fe002bfeae8b3341420721db510"} Oct 08 18:33:43 crc kubenswrapper[4750]: I1008 18:33:43.214151 4750 generic.go:334] "Generic (PLEG): container finished" podID="0c9b7a5e-9efd-45f8-86bf-730f55c077fd" containerID="1919a59ce533152efa0e8de4ea23ce87a69e201b70926676af382a45945309ca" exitCode=137 Oct 08 18:33:43 crc kubenswrapper[4750]: I1008 18:33:43.214193 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement9f1c-account-delete-wx96m" event={"ID":"0c9b7a5e-9efd-45f8-86bf-730f55c077fd","Type":"ContainerDied","Data":"1919a59ce533152efa0e8de4ea23ce87a69e201b70926676af382a45945309ca"} Oct 08 18:33:43 crc kubenswrapper[4750]: I1008 18:33:43.214254 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement9f1c-account-delete-wx96m" event={"ID":"0c9b7a5e-9efd-45f8-86bf-730f55c077fd","Type":"ContainerDied","Data":"48e5b20f57420651b9fd8b00fc147ab6612791bccf43f704cb157d1592d1f311"} Oct 08 18:33:43 crc kubenswrapper[4750]: I1008 18:33:43.214379 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement9f1c-account-delete-wx96m" Oct 08 18:33:43 crc kubenswrapper[4750]: I1008 18:33:43.227622 4750 scope.go:117] "RemoveContainer" containerID="f1f08d4665e4ec13fe674259dfd46b54f449fab774a681cc30e1e5ab9bf6d093" Oct 08 18:33:43 crc kubenswrapper[4750]: E1008 18:33:43.228347 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1f08d4665e4ec13fe674259dfd46b54f449fab774a681cc30e1e5ab9bf6d093\": container with ID starting with f1f08d4665e4ec13fe674259dfd46b54f449fab774a681cc30e1e5ab9bf6d093 not found: ID does not exist" containerID="f1f08d4665e4ec13fe674259dfd46b54f449fab774a681cc30e1e5ab9bf6d093" Oct 08 18:33:43 crc kubenswrapper[4750]: I1008 18:33:43.228392 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1f08d4665e4ec13fe674259dfd46b54f449fab774a681cc30e1e5ab9bf6d093"} err="failed to get container status \"f1f08d4665e4ec13fe674259dfd46b54f449fab774a681cc30e1e5ab9bf6d093\": rpc error: code = NotFound desc = could not find container \"f1f08d4665e4ec13fe674259dfd46b54f449fab774a681cc30e1e5ab9bf6d093\": container with ID starting with f1f08d4665e4ec13fe674259dfd46b54f449fab774a681cc30e1e5ab9bf6d093 not found: ID does not exist" Oct 08 18:33:43 crc kubenswrapper[4750]: I1008 18:33:43.228438 4750 scope.go:117] "RemoveContainer" containerID="dd3de844148f7944bc7fa8bfcc172b69d40730dfc54b697d389437312184ac0f" Oct 08 18:33:43 crc kubenswrapper[4750]: I1008 18:33:43.243609 4750 scope.go:117] "RemoveContainer" containerID="dd3de844148f7944bc7fa8bfcc172b69d40730dfc54b697d389437312184ac0f" Oct 08 18:33:43 crc kubenswrapper[4750]: E1008 18:33:43.244190 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd3de844148f7944bc7fa8bfcc172b69d40730dfc54b697d389437312184ac0f\": container with ID starting with 
dd3de844148f7944bc7fa8bfcc172b69d40730dfc54b697d389437312184ac0f not found: ID does not exist" containerID="dd3de844148f7944bc7fa8bfcc172b69d40730dfc54b697d389437312184ac0f" Oct 08 18:33:43 crc kubenswrapper[4750]: I1008 18:33:43.244229 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd3de844148f7944bc7fa8bfcc172b69d40730dfc54b697d389437312184ac0f"} err="failed to get container status \"dd3de844148f7944bc7fa8bfcc172b69d40730dfc54b697d389437312184ac0f\": rpc error: code = NotFound desc = could not find container \"dd3de844148f7944bc7fa8bfcc172b69d40730dfc54b697d389437312184ac0f\": container with ID starting with dd3de844148f7944bc7fa8bfcc172b69d40730dfc54b697d389437312184ac0f not found: ID does not exist" Oct 08 18:33:43 crc kubenswrapper[4750]: I1008 18:33:43.244257 4750 scope.go:117] "RemoveContainer" containerID="1919a59ce533152efa0e8de4ea23ce87a69e201b70926676af382a45945309ca" Oct 08 18:33:43 crc kubenswrapper[4750]: I1008 18:33:43.262236 4750 scope.go:117] "RemoveContainer" containerID="1919a59ce533152efa0e8de4ea23ce87a69e201b70926676af382a45945309ca" Oct 08 18:33:43 crc kubenswrapper[4750]: E1008 18:33:43.262601 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1919a59ce533152efa0e8de4ea23ce87a69e201b70926676af382a45945309ca\": container with ID starting with 1919a59ce533152efa0e8de4ea23ce87a69e201b70926676af382a45945309ca not found: ID does not exist" containerID="1919a59ce533152efa0e8de4ea23ce87a69e201b70926676af382a45945309ca" Oct 08 18:33:43 crc kubenswrapper[4750]: I1008 18:33:43.262631 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1919a59ce533152efa0e8de4ea23ce87a69e201b70926676af382a45945309ca"} err="failed to get container status \"1919a59ce533152efa0e8de4ea23ce87a69e201b70926676af382a45945309ca\": rpc error: code = NotFound desc = could not find container 
\"1919a59ce533152efa0e8de4ea23ce87a69e201b70926676af382a45945309ca\": container with ID starting with 1919a59ce533152efa0e8de4ea23ce87a69e201b70926676af382a45945309ca not found: ID does not exist" Oct 08 18:33:43 crc kubenswrapper[4750]: I1008 18:33:43.281435 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59xbx\" (UniqueName: \"kubernetes.io/projected/0c9b7a5e-9efd-45f8-86bf-730f55c077fd-kube-api-access-59xbx\") pod \"0c9b7a5e-9efd-45f8-86bf-730f55c077fd\" (UID: \"0c9b7a5e-9efd-45f8-86bf-730f55c077fd\") " Oct 08 18:33:43 crc kubenswrapper[4750]: I1008 18:33:43.281513 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grgss\" (UniqueName: \"kubernetes.io/projected/736a9e30-a3de-4e9f-9de7-52015e55443e-kube-api-access-grgss\") pod \"736a9e30-a3de-4e9f-9de7-52015e55443e\" (UID: \"736a9e30-a3de-4e9f-9de7-52015e55443e\") " Oct 08 18:33:43 crc kubenswrapper[4750]: I1008 18:33:43.281630 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2wmq\" (UniqueName: \"kubernetes.io/projected/c751481b-5934-4262-84ff-106498a453e0-kube-api-access-f2wmq\") pod \"c751481b-5934-4262-84ff-106498a453e0\" (UID: \"c751481b-5934-4262-84ff-106498a453e0\") " Oct 08 18:33:43 crc kubenswrapper[4750]: I1008 18:33:43.285958 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/736a9e30-a3de-4e9f-9de7-52015e55443e-kube-api-access-grgss" (OuterVolumeSpecName: "kube-api-access-grgss") pod "736a9e30-a3de-4e9f-9de7-52015e55443e" (UID: "736a9e30-a3de-4e9f-9de7-52015e55443e"). InnerVolumeSpecName "kube-api-access-grgss". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:33:43 crc kubenswrapper[4750]: I1008 18:33:43.286032 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c751481b-5934-4262-84ff-106498a453e0-kube-api-access-f2wmq" (OuterVolumeSpecName: "kube-api-access-f2wmq") pod "c751481b-5934-4262-84ff-106498a453e0" (UID: "c751481b-5934-4262-84ff-106498a453e0"). InnerVolumeSpecName "kube-api-access-f2wmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:33:43 crc kubenswrapper[4750]: I1008 18:33:43.286076 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c9b7a5e-9efd-45f8-86bf-730f55c077fd-kube-api-access-59xbx" (OuterVolumeSpecName: "kube-api-access-59xbx") pod "0c9b7a5e-9efd-45f8-86bf-730f55c077fd" (UID: "0c9b7a5e-9efd-45f8-86bf-730f55c077fd"). InnerVolumeSpecName "kube-api-access-59xbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:33:43 crc kubenswrapper[4750]: I1008 18:33:43.383855 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2wmq\" (UniqueName: \"kubernetes.io/projected/c751481b-5934-4262-84ff-106498a453e0-kube-api-access-f2wmq\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:43 crc kubenswrapper[4750]: I1008 18:33:43.384284 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59xbx\" (UniqueName: \"kubernetes.io/projected/0c9b7a5e-9efd-45f8-86bf-730f55c077fd-kube-api-access-59xbx\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:43 crc kubenswrapper[4750]: I1008 18:33:43.384442 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grgss\" (UniqueName: \"kubernetes.io/projected/736a9e30-a3de-4e9f-9de7-52015e55443e-kube-api-access-grgss\") on node \"crc\" DevicePath \"\"" Oct 08 18:33:43 crc kubenswrapper[4750]: I1008 18:33:43.542411 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican5ee7-account-delete-nlrfw"] 
Oct 08 18:33:43 crc kubenswrapper[4750]: I1008 18:33:43.550924 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican5ee7-account-delete-nlrfw"] Oct 08 18:33:43 crc kubenswrapper[4750]: I1008 18:33:43.559079 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement9f1c-account-delete-wx96m"] Oct 08 18:33:43 crc kubenswrapper[4750]: I1008 18:33:43.565839 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement9f1c-account-delete-wx96m"] Oct 08 18:33:43 crc kubenswrapper[4750]: I1008 18:33:43.569995 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapid8db-account-delete-7phjk"] Oct 08 18:33:43 crc kubenswrapper[4750]: I1008 18:33:43.573789 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novaapid8db-account-delete-7phjk"] Oct 08 18:33:44 crc kubenswrapper[4750]: I1008 18:33:44.752356 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c9b7a5e-9efd-45f8-86bf-730f55c077fd" path="/var/lib/kubelet/pods/0c9b7a5e-9efd-45f8-86bf-730f55c077fd/volumes" Oct 08 18:33:44 crc kubenswrapper[4750]: I1008 18:33:44.753060 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="736a9e30-a3de-4e9f-9de7-52015e55443e" path="/var/lib/kubelet/pods/736a9e30-a3de-4e9f-9de7-52015e55443e/volumes" Oct 08 18:33:44 crc kubenswrapper[4750]: I1008 18:33:44.753480 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c751481b-5934-4262-84ff-106498a453e0" path="/var/lib/kubelet/pods/c751481b-5934-4262-84ff-106498a453e0/volumes" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.079658 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ps28w"] Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.080762 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="account-reaper" Oct 08 18:33:57 crc 
kubenswrapper[4750]: I1008 18:33:57.080777 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="account-reaper" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.080787 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ecf0d73-0ca5-4124-93fb-348f8769c2e2" containerName="barbican-keystone-listener" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.080795 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ecf0d73-0ca5-4124-93fb-348f8769c2e2" containerName="barbican-keystone-listener" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.080813 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b8108eb-834c-44bd-9f39-70c348388ab6" containerName="setup-container" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.080820 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b8108eb-834c-44bd-9f39-70c348388ab6" containerName="setup-container" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.080830 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed7b2661-13bd-4ab4-a92d-7bc382cd257e" containerName="memcached" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.080838 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed7b2661-13bd-4ab4-a92d-7bc382cd257e" containerName="memcached" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.080850 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dba17973-3023-43ae-9b75-a8e1dc7f16cc" containerName="nova-scheduler-scheduler" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.080857 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="dba17973-3023-43ae-9b75-a8e1dc7f16cc" containerName="nova-scheduler-scheduler" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.080867 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cedba103-c6cd-4e9a-9c7c-80d90aaedb3b" containerName="mysql-bootstrap" Oct 08 
18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.080874 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="cedba103-c6cd-4e9a-9c7c-80d90aaedb3b" containerName="mysql-bootstrap" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.080885 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="object-replicator" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.080893 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="object-replicator" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.080909 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfa3814f-a0f4-4d53-9c08-44d7b45dd662" containerName="dnsmasq-dns" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.080916 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfa3814f-a0f4-4d53-9c08-44d7b45dd662" containerName="dnsmasq-dns" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.080927 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc808a1a-9703-4009-8d81-e555a8e25929" containerName="ovn-controller" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.080933 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc808a1a-9703-4009-8d81-e555a8e25929" containerName="ovn-controller" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.080944 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="object-server" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.080951 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="object-server" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.080962 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c9b7a5e-9efd-45f8-86bf-730f55c077fd" containerName="mariadb-account-delete" Oct 08 18:33:57 crc 
kubenswrapper[4750]: I1008 18:33:57.080969 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c9b7a5e-9efd-45f8-86bf-730f55c077fd" containerName="mariadb-account-delete" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.080981 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c218b865-c7d1-4f46-ad6d-8e102b6af491" containerName="proxy-httpd" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.080988 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="c218b865-c7d1-4f46-ad6d-8e102b6af491" containerName="proxy-httpd" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.080998 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1b5ad2e-1ee1-4955-99c2-8daed456b21c" containerName="cinder-api-log" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081004 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1b5ad2e-1ee1-4955-99c2-8daed456b21c" containerName="cinder-api-log" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081019 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af09729b-3284-4dcd-91a1-5763d28daaf5" containerName="nova-cell1-conductor-conductor" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081027 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="af09729b-3284-4dcd-91a1-5763d28daaf5" containerName="nova-cell1-conductor-conductor" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081036 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="590d851e-4648-48db-b385-aaa732f5c787" containerName="ovsdbserver-sb" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081045 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="590d851e-4648-48db-b385-aaa732f5c787" containerName="ovsdbserver-sb" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081061 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="235930b1-672c-4fc6-bbb4-78204c591aee" 
containerName="openstack-network-exporter" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081068 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="235930b1-672c-4fc6-bbb4-78204c591aee" containerName="openstack-network-exporter" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081081 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b173b167-1fa4-45ec-98d0-16956f4b0b30" containerName="mariadb-account-delete" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081088 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="b173b167-1fa4-45ec-98d0-16956f4b0b30" containerName="mariadb-account-delete" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081097 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2feb2439-d911-4585-a5e1-671abcfa357d" containerName="galera" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081104 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="2feb2439-d911-4585-a5e1-671abcfa357d" containerName="galera" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081113 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c218b865-c7d1-4f46-ad6d-8e102b6af491" containerName="proxy-server" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081120 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="c218b865-c7d1-4f46-ad6d-8e102b6af491" containerName="proxy-server" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081127 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="038b3881-b266-4878-b395-87d7bf986446" containerName="nova-cell0-conductor-conductor" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081133 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="038b3881-b266-4878-b395-87d7bf986446" containerName="nova-cell0-conductor-conductor" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081146 4750 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="fe7385d5-3c78-4238-96be-78392eddee4b" containerName="sg-core" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081153 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe7385d5-3c78-4238-96be-78392eddee4b" containerName="sg-core" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081162 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c3207de-78eb-41b2-a2be-163c9a3532af" containerName="openstack-network-exporter" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081168 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c3207de-78eb-41b2-a2be-163c9a3532af" containerName="openstack-network-exporter" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081175 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="container-server" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081182 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="container-server" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081194 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="235930b1-672c-4fc6-bbb4-78204c591aee" containerName="ovsdbserver-nb" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081201 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="235930b1-672c-4fc6-bbb4-78204c591aee" containerName="ovsdbserver-nb" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081211 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="swift-recon-cron" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081217 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="swift-recon-cron" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081226 4750 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2f22ab58-6189-4321-b660-ed992f6fb70f" containerName="keystone-api" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081232 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f22ab58-6189-4321-b660-ed992f6fb70f" containerName="keystone-api" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081244 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b751e62-8a05-413c-9f82-e9f28230e5ba" containerName="neutron-httpd" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081253 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b751e62-8a05-413c-9f82-e9f28230e5ba" containerName="neutron-httpd" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081269 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43a52313-747b-40a7-a7e0-9e18f3c97c42" containerName="setup-container" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081278 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="43a52313-747b-40a7-a7e0-9e18f3c97c42" containerName="setup-container" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081302 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="object-updater" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081310 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="object-updater" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081320 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e0647ee-68b0-4b8b-9bf0-066dc7a274d0" containerName="barbican-api-log" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081328 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e0647ee-68b0-4b8b-9bf0-066dc7a274d0" containerName="barbican-api-log" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081337 4750 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5b8108eb-834c-44bd-9f39-70c348388ab6" containerName="rabbitmq" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081344 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b8108eb-834c-44bd-9f39-70c348388ab6" containerName="rabbitmq" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081352 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="object-expirer" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081359 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="object-expirer" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081369 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe7385d5-3c78-4238-96be-78392eddee4b" containerName="ceilometer-central-agent" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081376 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe7385d5-3c78-4238-96be-78392eddee4b" containerName="ceilometer-central-agent" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081389 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43a52313-747b-40a7-a7e0-9e18f3c97c42" containerName="rabbitmq" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081396 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="43a52313-747b-40a7-a7e0-9e18f3c97c42" containerName="rabbitmq" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081404 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="736a9e30-a3de-4e9f-9de7-52015e55443e" containerName="mariadb-account-delete" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081410 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="736a9e30-a3de-4e9f-9de7-52015e55443e" containerName="mariadb-account-delete" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081420 4750 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="account-replicator" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081426 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="account-replicator" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081441 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe7385d5-3c78-4238-96be-78392eddee4b" containerName="ceilometer-notification-agent" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081448 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe7385d5-3c78-4238-96be-78392eddee4b" containerName="ceilometer-notification-agent" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081458 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04d7f724-d53d-4412-bce7-cc6da81e45ac" containerName="mariadb-account-delete" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081466 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="04d7f724-d53d-4412-bce7-cc6da81e45ac" containerName="mariadb-account-delete" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081475 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec1950dc-6caf-45f7-9b18-8c12db1b3f25" containerName="placement-log" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081483 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec1950dc-6caf-45f7-9b18-8c12db1b3f25" containerName="placement-log" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081497 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="object-auditor" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081504 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="object-auditor" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081512 4750 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="c4cbc20b-7898-4a47-99f6-80436897042c" containerName="nova-cell1-novncproxy-novncproxy" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081519 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4cbc20b-7898-4a47-99f6-80436897042c" containerName="nova-cell1-novncproxy-novncproxy" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081529 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6709646-0141-474b-b73f-6f451e77f602" containerName="ovs-vswitchd" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081536 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6709646-0141-474b-b73f-6f451e77f602" containerName="ovs-vswitchd" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081544 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="account-server" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081568 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="account-server" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081575 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e027e860-d0c0-4b1b-b02b-c374d92ae115" containerName="nova-metadata-log" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081583 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="e027e860-d0c0-4b1b-b02b-c374d92ae115" containerName="nova-metadata-log" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081594 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2feb2439-d911-4585-a5e1-671abcfa357d" containerName="mysql-bootstrap" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081601 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="2feb2439-d911-4585-a5e1-671abcfa357d" containerName="mysql-bootstrap" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081610 4750 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="container-updater" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081617 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="container-updater" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081627 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c751481b-5934-4262-84ff-106498a453e0" containerName="mariadb-account-delete" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081634 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="c751481b-5934-4262-84ff-106498a453e0" containerName="mariadb-account-delete" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081648 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="account-auditor" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081655 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="account-auditor" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081667 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aacea2e-e630-4280-8bed-b3b13b67f8ae" containerName="nova-api-log" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081674 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aacea2e-e630-4280-8bed-b3b13b67f8ae" containerName="nova-api-log" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081684 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60379ea9-0750-4de0-9d3b-13af080eea8f" containerName="cinder-scheduler" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081691 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="60379ea9-0750-4de0-9d3b-13af080eea8f" containerName="cinder-scheduler" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081699 4750 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af33a9f0-c575-46f6-a3cd-71391d454430" containerName="mariadb-account-delete" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081706 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="af33a9f0-c575-46f6-a3cd-71391d454430" containerName="mariadb-account-delete" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081716 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e0647ee-68b0-4b8b-9bf0-066dc7a274d0" containerName="barbican-api" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081722 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e0647ee-68b0-4b8b-9bf0-066dc7a274d0" containerName="barbican-api" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081731 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="230c02f8-af60-40d6-af19-adf730eec43f" containerName="ovn-northd" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081738 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="230c02f8-af60-40d6-af19-adf730eec43f" containerName="ovn-northd" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081749 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfa3814f-a0f4-4d53-9c08-44d7b45dd662" containerName="init" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081755 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfa3814f-a0f4-4d53-9c08-44d7b45dd662" containerName="init" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081767 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e027e860-d0c0-4b1b-b02b-c374d92ae115" containerName="nova-metadata-metadata" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081774 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="e027e860-d0c0-4b1b-b02b-c374d92ae115" containerName="nova-metadata-metadata" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081786 4750 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="230c02f8-af60-40d6-af19-adf730eec43f" containerName="openstack-network-exporter" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081793 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="230c02f8-af60-40d6-af19-adf730eec43f" containerName="openstack-network-exporter" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081807 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="container-auditor" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081814 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="container-auditor" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081824 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="rsync" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081831 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="rsync" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081842 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a25ebe44-c330-48f8-9df7-5f8517cd96bd" containerName="glance-log" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081849 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="a25ebe44-c330-48f8-9df7-5f8517cd96bd" containerName="glance-log" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081859 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28550569-4c3c-48cf-a621-eddec0919b51" containerName="barbican-worker" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081866 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="28550569-4c3c-48cf-a621-eddec0919b51" containerName="barbican-worker" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081879 4750 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="cedba103-c6cd-4e9a-9c7c-80d90aaedb3b" containerName="galera" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081887 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="cedba103-c6cd-4e9a-9c7c-80d90aaedb3b" containerName="galera" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081896 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f" containerName="kube-state-metrics" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081904 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f" containerName="kube-state-metrics" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081917 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b751e62-8a05-413c-9f82-e9f28230e5ba" containerName="neutron-api" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081924 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b751e62-8a05-413c-9f82-e9f28230e5ba" containerName="neutron-api" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081933 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a25ebe44-c330-48f8-9df7-5f8517cd96bd" containerName="glance-httpd" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081940 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="a25ebe44-c330-48f8-9df7-5f8517cd96bd" containerName="glance-httpd" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081952 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="899027b7-067b-4ce1-a8f1-deaee627aa51" containerName="glance-log" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081959 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="899027b7-067b-4ce1-a8f1-deaee627aa51" containerName="glance-log" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081973 4750 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="590d851e-4648-48db-b385-aaa732f5c787" containerName="openstack-network-exporter" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081980 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="590d851e-4648-48db-b385-aaa732f5c787" containerName="openstack-network-exporter" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.081992 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec1950dc-6caf-45f7-9b18-8c12db1b3f25" containerName="placement-api" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.081998 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec1950dc-6caf-45f7-9b18-8c12db1b3f25" containerName="placement-api" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.082009 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="899027b7-067b-4ce1-a8f1-deaee627aa51" containerName="glance-httpd" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082016 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="899027b7-067b-4ce1-a8f1-deaee627aa51" containerName="glance-httpd" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.082025 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1b5ad2e-1ee1-4955-99c2-8daed456b21c" containerName="cinder-api" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082032 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1b5ad2e-1ee1-4955-99c2-8daed456b21c" containerName="cinder-api" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.082043 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe7385d5-3c78-4238-96be-78392eddee4b" containerName="proxy-httpd" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082050 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe7385d5-3c78-4238-96be-78392eddee4b" containerName="proxy-httpd" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.082063 4750 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e6709646-0141-474b-b73f-6f451e77f602" containerName="ovsdb-server-init" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082071 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6709646-0141-474b-b73f-6f451e77f602" containerName="ovsdb-server-init" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.082082 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6709646-0141-474b-b73f-6f451e77f602" containerName="ovsdb-server" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082088 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6709646-0141-474b-b73f-6f451e77f602" containerName="ovsdb-server" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.082096 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="container-replicator" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082103 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="container-replicator" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.082113 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28550569-4c3c-48cf-a621-eddec0919b51" containerName="barbican-worker-log" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082120 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="28550569-4c3c-48cf-a621-eddec0919b51" containerName="barbican-worker-log" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.082131 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60379ea9-0750-4de0-9d3b-13af080eea8f" containerName="probe" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082137 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="60379ea9-0750-4de0-9d3b-13af080eea8f" containerName="probe" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.082151 4750 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9ecf0d73-0ca5-4124-93fb-348f8769c2e2" containerName="barbican-keystone-listener-log" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082158 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ecf0d73-0ca5-4124-93fb-348f8769c2e2" containerName="barbican-keystone-listener-log" Oct 08 18:33:57 crc kubenswrapper[4750]: E1008 18:33:57.082166 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aacea2e-e630-4280-8bed-b3b13b67f8ae" containerName="nova-api-api" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082172 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aacea2e-e630-4280-8bed-b3b13b67f8ae" containerName="nova-api-api" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082339 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="590d851e-4648-48db-b385-aaa732f5c787" containerName="ovsdbserver-sb" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082353 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="590d851e-4648-48db-b385-aaa732f5c787" containerName="openstack-network-exporter" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082364 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="230c02f8-af60-40d6-af19-adf730eec43f" containerName="openstack-network-exporter" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082372 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b751e62-8a05-413c-9f82-e9f28230e5ba" containerName="neutron-api" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082380 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="038b3881-b266-4878-b395-87d7bf986446" containerName="nova-cell0-conductor-conductor" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082387 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="af33a9f0-c575-46f6-a3cd-71391d454430" containerName="mariadb-account-delete" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 
18:33:57.082401 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="object-replicator" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082412 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="736a9e30-a3de-4e9f-9de7-52015e55443e" containerName="mariadb-account-delete" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082423 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="43a52313-747b-40a7-a7e0-9e18f3c97c42" containerName="rabbitmq" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082432 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="swift-recon-cron" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082444 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="b173b167-1fa4-45ec-98d0-16956f4b0b30" containerName="mariadb-account-delete" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082456 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b8108eb-834c-44bd-9f39-70c348388ab6" containerName="rabbitmq" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082467 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec1950dc-6caf-45f7-9b18-8c12db1b3f25" containerName="placement-log" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082474 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="a25ebe44-c330-48f8-9df7-5f8517cd96bd" containerName="glance-log" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082488 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="rsync" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082501 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ecf0d73-0ca5-4124-93fb-348f8769c2e2" containerName="barbican-keystone-listener-log" Oct 08 18:33:57 
crc kubenswrapper[4750]: I1008 18:33:57.082513 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="account-replicator" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082523 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe7385d5-3c78-4238-96be-78392eddee4b" containerName="proxy-httpd" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082533 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="28550569-4c3c-48cf-a621-eddec0919b51" containerName="barbican-worker" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082568 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="account-auditor" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082577 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="cedba103-c6cd-4e9a-9c7c-80d90aaedb3b" containerName="galera" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082587 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="c218b865-c7d1-4f46-ad6d-8e102b6af491" containerName="proxy-httpd" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082597 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="container-auditor" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082610 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe7385d5-3c78-4238-96be-78392eddee4b" containerName="sg-core" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082628 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="account-server" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082640 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="04d7f724-d53d-4412-bce7-cc6da81e45ac" containerName="mariadb-account-delete" 
Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082651 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="60379ea9-0750-4de0-9d3b-13af080eea8f" containerName="probe" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082662 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e0647ee-68b0-4b8b-9bf0-066dc7a274d0" containerName="barbican-api" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082676 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="a25ebe44-c330-48f8-9df7-5f8517cd96bd" containerName="glance-httpd" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082689 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="60379ea9-0750-4de0-9d3b-13af080eea8f" containerName="cinder-scheduler" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082702 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="e027e860-d0c0-4b1b-b02b-c374d92ae115" containerName="nova-metadata-metadata" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082714 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="e027e860-d0c0-4b1b-b02b-c374d92ae115" containerName="nova-metadata-log" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082725 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="container-updater" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082734 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="28550569-4c3c-48cf-a621-eddec0919b51" containerName="barbican-worker-log" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082745 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e0647ee-68b0-4b8b-9bf0-066dc7a274d0" containerName="barbican-api-log" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082755 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4cbc20b-7898-4a47-99f6-80436897042c" 
containerName="nova-cell1-novncproxy-novncproxy" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082764 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="dba17973-3023-43ae-9b75-a8e1dc7f16cc" containerName="nova-scheduler-scheduler" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082775 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f22ab58-6189-4321-b660-ed992f6fb70f" containerName="keystone-api" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082786 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="object-updater" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082799 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b751e62-8a05-413c-9f82-e9f28230e5ba" containerName="neutron-httpd" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082810 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="899027b7-067b-4ce1-a8f1-deaee627aa51" containerName="glance-httpd" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082817 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="2aacea2e-e630-4280-8bed-b3b13b67f8ae" containerName="nova-api-log" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082824 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="object-auditor" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082833 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1b5ad2e-1ee1-4955-99c2-8daed456b21c" containerName="cinder-api-log" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082843 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="2aacea2e-e630-4280-8bed-b3b13b67f8ae" containerName="nova-api-api" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082853 4750 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="fe7385d5-3c78-4238-96be-78392eddee4b" containerName="ceilometer-notification-agent" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082863 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe7385d5-3c78-4238-96be-78392eddee4b" containerName="ceilometer-central-agent" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082872 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc808a1a-9703-4009-8d81-e555a8e25929" containerName="ovn-controller" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082882 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1b5ad2e-1ee1-4955-99c2-8daed456b21c" containerName="cinder-api" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082892 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="235930b1-672c-4fc6-bbb4-78204c591aee" containerName="openstack-network-exporter" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082900 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec1950dc-6caf-45f7-9b18-8c12db1b3f25" containerName="placement-api" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082911 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="230c02f8-af60-40d6-af19-adf730eec43f" containerName="ovn-northd" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082917 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="container-replicator" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082926 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac899792-9b55-4ae1-b9f2-24d8bb4ebb2f" containerName="kube-state-metrics" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082935 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="235930b1-672c-4fc6-bbb4-78204c591aee" containerName="ovsdbserver-nb" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082946 
4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6709646-0141-474b-b73f-6f451e77f602" containerName="ovs-vswitchd" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082954 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ecf0d73-0ca5-4124-93fb-348f8769c2e2" containerName="barbican-keystone-listener" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082966 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6709646-0141-474b-b73f-6f451e77f602" containerName="ovsdb-server" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082973 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c3207de-78eb-41b2-a2be-163c9a3532af" containerName="openstack-network-exporter" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082980 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="2feb2439-d911-4585-a5e1-671abcfa357d" containerName="galera" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082987 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="c751481b-5934-4262-84ff-106498a453e0" containerName="mariadb-account-delete" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.082999 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed7b2661-13bd-4ab4-a92d-7bc382cd257e" containerName="memcached" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.083010 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="object-server" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.083022 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="899027b7-067b-4ce1-a8f1-deaee627aa51" containerName="glance-log" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.083032 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="account-reaper" Oct 08 18:33:57 crc 
kubenswrapper[4750]: I1008 18:33:57.083041 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="af09729b-3284-4dcd-91a1-5763d28daaf5" containerName="nova-cell1-conductor-conductor" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.083052 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="container-server" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.083060 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="be62333a-e650-4131-b0f1-c8c484539c7e" containerName="object-expirer" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.083068 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="c218b865-c7d1-4f46-ad6d-8e102b6af491" containerName="proxy-server" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.083079 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfa3814f-a0f4-4d53-9c08-44d7b45dd662" containerName="dnsmasq-dns" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.083088 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c9b7a5e-9efd-45f8-86bf-730f55c077fd" containerName="mariadb-account-delete" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.084301 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ps28w" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.088676 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ps28w"] Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.117471 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1462ba5-9e66-445d-a21a-dbae02c153cd-catalog-content\") pod \"community-operators-ps28w\" (UID: \"e1462ba5-9e66-445d-a21a-dbae02c153cd\") " pod="openshift-marketplace/community-operators-ps28w" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.117524 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85qfk\" (UniqueName: \"kubernetes.io/projected/e1462ba5-9e66-445d-a21a-dbae02c153cd-kube-api-access-85qfk\") pod \"community-operators-ps28w\" (UID: \"e1462ba5-9e66-445d-a21a-dbae02c153cd\") " pod="openshift-marketplace/community-operators-ps28w" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.117564 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1462ba5-9e66-445d-a21a-dbae02c153cd-utilities\") pod \"community-operators-ps28w\" (UID: \"e1462ba5-9e66-445d-a21a-dbae02c153cd\") " pod="openshift-marketplace/community-operators-ps28w" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.219153 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1462ba5-9e66-445d-a21a-dbae02c153cd-catalog-content\") pod \"community-operators-ps28w\" (UID: \"e1462ba5-9e66-445d-a21a-dbae02c153cd\") " pod="openshift-marketplace/community-operators-ps28w" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.219226 4750 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-85qfk\" (UniqueName: \"kubernetes.io/projected/e1462ba5-9e66-445d-a21a-dbae02c153cd-kube-api-access-85qfk\") pod \"community-operators-ps28w\" (UID: \"e1462ba5-9e66-445d-a21a-dbae02c153cd\") " pod="openshift-marketplace/community-operators-ps28w" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.219254 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1462ba5-9e66-445d-a21a-dbae02c153cd-utilities\") pod \"community-operators-ps28w\" (UID: \"e1462ba5-9e66-445d-a21a-dbae02c153cd\") " pod="openshift-marketplace/community-operators-ps28w" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.219776 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1462ba5-9e66-445d-a21a-dbae02c153cd-catalog-content\") pod \"community-operators-ps28w\" (UID: \"e1462ba5-9e66-445d-a21a-dbae02c153cd\") " pod="openshift-marketplace/community-operators-ps28w" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.219881 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1462ba5-9e66-445d-a21a-dbae02c153cd-utilities\") pod \"community-operators-ps28w\" (UID: \"e1462ba5-9e66-445d-a21a-dbae02c153cd\") " pod="openshift-marketplace/community-operators-ps28w" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.238463 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85qfk\" (UniqueName: \"kubernetes.io/projected/e1462ba5-9e66-445d-a21a-dbae02c153cd-kube-api-access-85qfk\") pod \"community-operators-ps28w\" (UID: \"e1462ba5-9e66-445d-a21a-dbae02c153cd\") " pod="openshift-marketplace/community-operators-ps28w" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.401749 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ps28w" Oct 08 18:33:57 crc kubenswrapper[4750]: I1008 18:33:57.868653 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ps28w"] Oct 08 18:33:58 crc kubenswrapper[4750]: I1008 18:33:58.441588 4750 generic.go:334] "Generic (PLEG): container finished" podID="e1462ba5-9e66-445d-a21a-dbae02c153cd" containerID="c0466ce76d83d9e3191047f84717972a8af5481b81caef3376f635851b0cbb01" exitCode=0 Oct 08 18:33:58 crc kubenswrapper[4750]: I1008 18:33:58.441652 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ps28w" event={"ID":"e1462ba5-9e66-445d-a21a-dbae02c153cd","Type":"ContainerDied","Data":"c0466ce76d83d9e3191047f84717972a8af5481b81caef3376f635851b0cbb01"} Oct 08 18:33:58 crc kubenswrapper[4750]: I1008 18:33:58.441742 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ps28w" event={"ID":"e1462ba5-9e66-445d-a21a-dbae02c153cd","Type":"ContainerStarted","Data":"b8c8f6beb256fc4a2f5496cb28a3ca94b77bf541c746241a752bc0bf821f4c9a"} Oct 08 18:33:58 crc kubenswrapper[4750]: I1008 18:33:58.445787 4750 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 18:34:00 crc kubenswrapper[4750]: I1008 18:34:00.464318 4750 generic.go:334] "Generic (PLEG): container finished" podID="e1462ba5-9e66-445d-a21a-dbae02c153cd" containerID="63762b6c1180054b8a2febf74127001472bd08b1618b795d39e2effe967da6cc" exitCode=0 Oct 08 18:34:00 crc kubenswrapper[4750]: I1008 18:34:00.464395 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ps28w" event={"ID":"e1462ba5-9e66-445d-a21a-dbae02c153cd","Type":"ContainerDied","Data":"63762b6c1180054b8a2febf74127001472bd08b1618b795d39e2effe967da6cc"} Oct 08 18:34:01 crc kubenswrapper[4750]: I1008 18:34:01.474660 4750 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-ps28w" event={"ID":"e1462ba5-9e66-445d-a21a-dbae02c153cd","Type":"ContainerStarted","Data":"363261d4f2a83dd248d5b25a73fec1ef22415bec149cd9daebe1a50c80d90516"} Oct 08 18:34:01 crc kubenswrapper[4750]: I1008 18:34:01.494425 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ps28w" podStartSLOduration=1.905227523 podStartE2EDuration="4.494405817s" podCreationTimestamp="2025-10-08 18:33:57 +0000 UTC" firstStartedPulling="2025-10-08 18:33:58.443468716 +0000 UTC m=+1394.356439769" lastFinishedPulling="2025-10-08 18:34:01.03264704 +0000 UTC m=+1396.945618063" observedRunningTime="2025-10-08 18:34:01.490894914 +0000 UTC m=+1397.403865927" watchObservedRunningTime="2025-10-08 18:34:01.494405817 +0000 UTC m=+1397.407376830" Oct 08 18:34:07 crc kubenswrapper[4750]: I1008 18:34:07.402049 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ps28w" Oct 08 18:34:07 crc kubenswrapper[4750]: I1008 18:34:07.403813 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ps28w" Oct 08 18:34:07 crc kubenswrapper[4750]: I1008 18:34:07.466627 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ps28w" Oct 08 18:34:07 crc kubenswrapper[4750]: I1008 18:34:07.575837 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ps28w" Oct 08 18:34:07 crc kubenswrapper[4750]: I1008 18:34:07.706261 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ps28w"] Oct 08 18:34:09 crc kubenswrapper[4750]: I1008 18:34:09.546670 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ps28w" 
podUID="e1462ba5-9e66-445d-a21a-dbae02c153cd" containerName="registry-server" containerID="cri-o://363261d4f2a83dd248d5b25a73fec1ef22415bec149cd9daebe1a50c80d90516" gracePeriod=2 Oct 08 18:34:09 crc kubenswrapper[4750]: I1008 18:34:09.999426 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ps28w" Oct 08 18:34:10 crc kubenswrapper[4750]: I1008 18:34:10.008425 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85qfk\" (UniqueName: \"kubernetes.io/projected/e1462ba5-9e66-445d-a21a-dbae02c153cd-kube-api-access-85qfk\") pod \"e1462ba5-9e66-445d-a21a-dbae02c153cd\" (UID: \"e1462ba5-9e66-445d-a21a-dbae02c153cd\") " Oct 08 18:34:10 crc kubenswrapper[4750]: I1008 18:34:10.008462 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1462ba5-9e66-445d-a21a-dbae02c153cd-utilities\") pod \"e1462ba5-9e66-445d-a21a-dbae02c153cd\" (UID: \"e1462ba5-9e66-445d-a21a-dbae02c153cd\") " Oct 08 18:34:10 crc kubenswrapper[4750]: I1008 18:34:10.008480 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1462ba5-9e66-445d-a21a-dbae02c153cd-catalog-content\") pod \"e1462ba5-9e66-445d-a21a-dbae02c153cd\" (UID: \"e1462ba5-9e66-445d-a21a-dbae02c153cd\") " Oct 08 18:34:10 crc kubenswrapper[4750]: I1008 18:34:10.009279 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1462ba5-9e66-445d-a21a-dbae02c153cd-utilities" (OuterVolumeSpecName: "utilities") pod "e1462ba5-9e66-445d-a21a-dbae02c153cd" (UID: "e1462ba5-9e66-445d-a21a-dbae02c153cd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:34:10 crc kubenswrapper[4750]: I1008 18:34:10.014645 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1462ba5-9e66-445d-a21a-dbae02c153cd-kube-api-access-85qfk" (OuterVolumeSpecName: "kube-api-access-85qfk") pod "e1462ba5-9e66-445d-a21a-dbae02c153cd" (UID: "e1462ba5-9e66-445d-a21a-dbae02c153cd"). InnerVolumeSpecName "kube-api-access-85qfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:34:10 crc kubenswrapper[4750]: I1008 18:34:10.061256 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1462ba5-9e66-445d-a21a-dbae02c153cd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e1462ba5-9e66-445d-a21a-dbae02c153cd" (UID: "e1462ba5-9e66-445d-a21a-dbae02c153cd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:34:10 crc kubenswrapper[4750]: I1008 18:34:10.109904 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85qfk\" (UniqueName: \"kubernetes.io/projected/e1462ba5-9e66-445d-a21a-dbae02c153cd-kube-api-access-85qfk\") on node \"crc\" DevicePath \"\"" Oct 08 18:34:10 crc kubenswrapper[4750]: I1008 18:34:10.109951 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1462ba5-9e66-445d-a21a-dbae02c153cd-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 18:34:10 crc kubenswrapper[4750]: I1008 18:34:10.109968 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1462ba5-9e66-445d-a21a-dbae02c153cd-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 18:34:10 crc kubenswrapper[4750]: I1008 18:34:10.556423 4750 generic.go:334] "Generic (PLEG): container finished" podID="e1462ba5-9e66-445d-a21a-dbae02c153cd" 
containerID="363261d4f2a83dd248d5b25a73fec1ef22415bec149cd9daebe1a50c80d90516" exitCode=0 Oct 08 18:34:10 crc kubenswrapper[4750]: I1008 18:34:10.556462 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ps28w" event={"ID":"e1462ba5-9e66-445d-a21a-dbae02c153cd","Type":"ContainerDied","Data":"363261d4f2a83dd248d5b25a73fec1ef22415bec149cd9daebe1a50c80d90516"} Oct 08 18:34:10 crc kubenswrapper[4750]: I1008 18:34:10.556500 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ps28w" Oct 08 18:34:10 crc kubenswrapper[4750]: I1008 18:34:10.556520 4750 scope.go:117] "RemoveContainer" containerID="363261d4f2a83dd248d5b25a73fec1ef22415bec149cd9daebe1a50c80d90516" Oct 08 18:34:10 crc kubenswrapper[4750]: I1008 18:34:10.556510 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ps28w" event={"ID":"e1462ba5-9e66-445d-a21a-dbae02c153cd","Type":"ContainerDied","Data":"b8c8f6beb256fc4a2f5496cb28a3ca94b77bf541c746241a752bc0bf821f4c9a"} Oct 08 18:34:10 crc kubenswrapper[4750]: I1008 18:34:10.598703 4750 scope.go:117] "RemoveContainer" containerID="63762b6c1180054b8a2febf74127001472bd08b1618b795d39e2effe967da6cc" Oct 08 18:34:10 crc kubenswrapper[4750]: I1008 18:34:10.603499 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ps28w"] Oct 08 18:34:10 crc kubenswrapper[4750]: I1008 18:34:10.612162 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ps28w"] Oct 08 18:34:10 crc kubenswrapper[4750]: I1008 18:34:10.637004 4750 scope.go:117] "RemoveContainer" containerID="c0466ce76d83d9e3191047f84717972a8af5481b81caef3376f635851b0cbb01" Oct 08 18:34:10 crc kubenswrapper[4750]: I1008 18:34:10.660411 4750 scope.go:117] "RemoveContainer" containerID="363261d4f2a83dd248d5b25a73fec1ef22415bec149cd9daebe1a50c80d90516" Oct 08 
18:34:10 crc kubenswrapper[4750]: E1008 18:34:10.661017 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"363261d4f2a83dd248d5b25a73fec1ef22415bec149cd9daebe1a50c80d90516\": container with ID starting with 363261d4f2a83dd248d5b25a73fec1ef22415bec149cd9daebe1a50c80d90516 not found: ID does not exist" containerID="363261d4f2a83dd248d5b25a73fec1ef22415bec149cd9daebe1a50c80d90516" Oct 08 18:34:10 crc kubenswrapper[4750]: I1008 18:34:10.661061 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"363261d4f2a83dd248d5b25a73fec1ef22415bec149cd9daebe1a50c80d90516"} err="failed to get container status \"363261d4f2a83dd248d5b25a73fec1ef22415bec149cd9daebe1a50c80d90516\": rpc error: code = NotFound desc = could not find container \"363261d4f2a83dd248d5b25a73fec1ef22415bec149cd9daebe1a50c80d90516\": container with ID starting with 363261d4f2a83dd248d5b25a73fec1ef22415bec149cd9daebe1a50c80d90516 not found: ID does not exist" Oct 08 18:34:10 crc kubenswrapper[4750]: I1008 18:34:10.661086 4750 scope.go:117] "RemoveContainer" containerID="63762b6c1180054b8a2febf74127001472bd08b1618b795d39e2effe967da6cc" Oct 08 18:34:10 crc kubenswrapper[4750]: E1008 18:34:10.661448 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63762b6c1180054b8a2febf74127001472bd08b1618b795d39e2effe967da6cc\": container with ID starting with 63762b6c1180054b8a2febf74127001472bd08b1618b795d39e2effe967da6cc not found: ID does not exist" containerID="63762b6c1180054b8a2febf74127001472bd08b1618b795d39e2effe967da6cc" Oct 08 18:34:10 crc kubenswrapper[4750]: I1008 18:34:10.661613 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63762b6c1180054b8a2febf74127001472bd08b1618b795d39e2effe967da6cc"} err="failed to get container status 
\"63762b6c1180054b8a2febf74127001472bd08b1618b795d39e2effe967da6cc\": rpc error: code = NotFound desc = could not find container \"63762b6c1180054b8a2febf74127001472bd08b1618b795d39e2effe967da6cc\": container with ID starting with 63762b6c1180054b8a2febf74127001472bd08b1618b795d39e2effe967da6cc not found: ID does not exist" Oct 08 18:34:10 crc kubenswrapper[4750]: I1008 18:34:10.661735 4750 scope.go:117] "RemoveContainer" containerID="c0466ce76d83d9e3191047f84717972a8af5481b81caef3376f635851b0cbb01" Oct 08 18:34:10 crc kubenswrapper[4750]: E1008 18:34:10.662192 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0466ce76d83d9e3191047f84717972a8af5481b81caef3376f635851b0cbb01\": container with ID starting with c0466ce76d83d9e3191047f84717972a8af5481b81caef3376f635851b0cbb01 not found: ID does not exist" containerID="c0466ce76d83d9e3191047f84717972a8af5481b81caef3376f635851b0cbb01" Oct 08 18:34:10 crc kubenswrapper[4750]: I1008 18:34:10.662211 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0466ce76d83d9e3191047f84717972a8af5481b81caef3376f635851b0cbb01"} err="failed to get container status \"c0466ce76d83d9e3191047f84717972a8af5481b81caef3376f635851b0cbb01\": rpc error: code = NotFound desc = could not find container \"c0466ce76d83d9e3191047f84717972a8af5481b81caef3376f635851b0cbb01\": container with ID starting with c0466ce76d83d9e3191047f84717972a8af5481b81caef3376f635851b0cbb01 not found: ID does not exist" Oct 08 18:34:10 crc kubenswrapper[4750]: I1008 18:34:10.744141 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1462ba5-9e66-445d-a21a-dbae02c153cd" path="/var/lib/kubelet/pods/e1462ba5-9e66-445d-a21a-dbae02c153cd/volumes" Oct 08 18:34:58 crc kubenswrapper[4750]: I1008 18:34:58.997195 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-d5x5j"] Oct 08 18:34:59 
crc kubenswrapper[4750]: E1008 18:34:58.998285 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1462ba5-9e66-445d-a21a-dbae02c153cd" containerName="extract-content" Oct 08 18:34:59 crc kubenswrapper[4750]: I1008 18:34:58.998309 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1462ba5-9e66-445d-a21a-dbae02c153cd" containerName="extract-content" Oct 08 18:34:59 crc kubenswrapper[4750]: E1008 18:34:58.998335 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1462ba5-9e66-445d-a21a-dbae02c153cd" containerName="extract-utilities" Oct 08 18:34:59 crc kubenswrapper[4750]: I1008 18:34:58.998347 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1462ba5-9e66-445d-a21a-dbae02c153cd" containerName="extract-utilities" Oct 08 18:34:59 crc kubenswrapper[4750]: E1008 18:34:58.998361 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1462ba5-9e66-445d-a21a-dbae02c153cd" containerName="registry-server" Oct 08 18:34:59 crc kubenswrapper[4750]: I1008 18:34:58.998374 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1462ba5-9e66-445d-a21a-dbae02c153cd" containerName="registry-server" Oct 08 18:34:59 crc kubenswrapper[4750]: I1008 18:34:58.998747 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1462ba5-9e66-445d-a21a-dbae02c153cd" containerName="registry-server" Oct 08 18:34:59 crc kubenswrapper[4750]: I1008 18:34:59.000212 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d5x5j" Oct 08 18:34:59 crc kubenswrapper[4750]: I1008 18:34:59.007538 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1d09659-6633-4368-a3e3-632907c3cea7-utilities\") pod \"certified-operators-d5x5j\" (UID: \"a1d09659-6633-4368-a3e3-632907c3cea7\") " pod="openshift-marketplace/certified-operators-d5x5j" Oct 08 18:34:59 crc kubenswrapper[4750]: I1008 18:34:59.007623 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1d09659-6633-4368-a3e3-632907c3cea7-catalog-content\") pod \"certified-operators-d5x5j\" (UID: \"a1d09659-6633-4368-a3e3-632907c3cea7\") " pod="openshift-marketplace/certified-operators-d5x5j" Oct 08 18:34:59 crc kubenswrapper[4750]: I1008 18:34:59.007904 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x88sw\" (UniqueName: \"kubernetes.io/projected/a1d09659-6633-4368-a3e3-632907c3cea7-kube-api-access-x88sw\") pod \"certified-operators-d5x5j\" (UID: \"a1d09659-6633-4368-a3e3-632907c3cea7\") " pod="openshift-marketplace/certified-operators-d5x5j" Oct 08 18:34:59 crc kubenswrapper[4750]: I1008 18:34:59.040071 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d5x5j"] Oct 08 18:34:59 crc kubenswrapper[4750]: I1008 18:34:59.072046 4750 scope.go:117] "RemoveContainer" containerID="a7d155d2c3ef20e11a19f3b07465d37d9143db5dbca744ebcde32b355f9f1e8b" Oct 08 18:34:59 crc kubenswrapper[4750]: I1008 18:34:59.111402 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1d09659-6633-4368-a3e3-632907c3cea7-utilities\") pod \"certified-operators-d5x5j\" (UID: 
\"a1d09659-6633-4368-a3e3-632907c3cea7\") " pod="openshift-marketplace/certified-operators-d5x5j" Oct 08 18:34:59 crc kubenswrapper[4750]: I1008 18:34:59.111458 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1d09659-6633-4368-a3e3-632907c3cea7-catalog-content\") pod \"certified-operators-d5x5j\" (UID: \"a1d09659-6633-4368-a3e3-632907c3cea7\") " pod="openshift-marketplace/certified-operators-d5x5j" Oct 08 18:34:59 crc kubenswrapper[4750]: I1008 18:34:59.111513 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x88sw\" (UniqueName: \"kubernetes.io/projected/a1d09659-6633-4368-a3e3-632907c3cea7-kube-api-access-x88sw\") pod \"certified-operators-d5x5j\" (UID: \"a1d09659-6633-4368-a3e3-632907c3cea7\") " pod="openshift-marketplace/certified-operators-d5x5j" Oct 08 18:34:59 crc kubenswrapper[4750]: I1008 18:34:59.112328 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1d09659-6633-4368-a3e3-632907c3cea7-utilities\") pod \"certified-operators-d5x5j\" (UID: \"a1d09659-6633-4368-a3e3-632907c3cea7\") " pod="openshift-marketplace/certified-operators-d5x5j" Oct 08 18:34:59 crc kubenswrapper[4750]: I1008 18:34:59.112564 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1d09659-6633-4368-a3e3-632907c3cea7-catalog-content\") pod \"certified-operators-d5x5j\" (UID: \"a1d09659-6633-4368-a3e3-632907c3cea7\") " pod="openshift-marketplace/certified-operators-d5x5j" Oct 08 18:34:59 crc kubenswrapper[4750]: I1008 18:34:59.123071 4750 scope.go:117] "RemoveContainer" containerID="131e51f9cac5dc8a9c0d425746e3b9ee7a9d53950bcb5870bac43328508a3398" Oct 08 18:34:59 crc kubenswrapper[4750]: I1008 18:34:59.144463 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-x88sw\" (UniqueName: \"kubernetes.io/projected/a1d09659-6633-4368-a3e3-632907c3cea7-kube-api-access-x88sw\") pod \"certified-operators-d5x5j\" (UID: \"a1d09659-6633-4368-a3e3-632907c3cea7\") " pod="openshift-marketplace/certified-operators-d5x5j" Oct 08 18:34:59 crc kubenswrapper[4750]: I1008 18:34:59.147571 4750 scope.go:117] "RemoveContainer" containerID="76705f4c8d0d963f55e04b2fb8a57f860f108ccd02fe957f52712f5bcd141748" Oct 08 18:34:59 crc kubenswrapper[4750]: I1008 18:34:59.188270 4750 scope.go:117] "RemoveContainer" containerID="44eb121f5a68d0a97c42240fc68cce4713a870335e4ddf0f8679f9aa32654ec7" Oct 08 18:34:59 crc kubenswrapper[4750]: I1008 18:34:59.221506 4750 scope.go:117] "RemoveContainer" containerID="076a806e0646bdccd9c8e5d177a4fb67634db132f346e97d11108bf680223bae" Oct 08 18:34:59 crc kubenswrapper[4750]: I1008 18:34:59.249789 4750 scope.go:117] "RemoveContainer" containerID="1498b7259b258a7e9a9da15f304cde6936a66b35578b17491b3ab09c9237c4c5" Oct 08 18:34:59 crc kubenswrapper[4750]: I1008 18:34:59.269133 4750 scope.go:117] "RemoveContainer" containerID="48a37bdd5977b748b1a035850954139e3743138dd89b2a624758f9b8e7aaa270" Oct 08 18:34:59 crc kubenswrapper[4750]: I1008 18:34:59.290532 4750 scope.go:117] "RemoveContainer" containerID="b1122e7597d4ce226d03941ae4d26bd429a007b8def572c79b06e0ed78ba9d3d" Oct 08 18:34:59 crc kubenswrapper[4750]: I1008 18:34:59.315314 4750 scope.go:117] "RemoveContainer" containerID="207cabb8cb0acf23e4bf4e62948733f47bedb3c82673a966db5aa8e4ed65d14b" Oct 08 18:34:59 crc kubenswrapper[4750]: I1008 18:34:59.337301 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d5x5j" Oct 08 18:34:59 crc kubenswrapper[4750]: I1008 18:34:59.349011 4750 scope.go:117] "RemoveContainer" containerID="c099682680f99d7d836a4647a357e3b0557699173c713817a786fb3c6c7a1d59" Oct 08 18:34:59 crc kubenswrapper[4750]: I1008 18:34:59.382472 4750 scope.go:117] "RemoveContainer" containerID="ded8dda6ca3cc7f7f9717d2c3a2bb3364d14652de92ec33f826a73f6b5aead1d" Oct 08 18:34:59 crc kubenswrapper[4750]: I1008 18:34:59.832978 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d5x5j"] Oct 08 18:35:00 crc kubenswrapper[4750]: I1008 18:35:00.133114 4750 generic.go:334] "Generic (PLEG): container finished" podID="a1d09659-6633-4368-a3e3-632907c3cea7" containerID="e956f060ab76b51ebfc804ad23904c470d6bfd900f5007d9d734a7f7b93eee48" exitCode=0 Oct 08 18:35:00 crc kubenswrapper[4750]: I1008 18:35:00.133192 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d5x5j" event={"ID":"a1d09659-6633-4368-a3e3-632907c3cea7","Type":"ContainerDied","Data":"e956f060ab76b51ebfc804ad23904c470d6bfd900f5007d9d734a7f7b93eee48"} Oct 08 18:35:00 crc kubenswrapper[4750]: I1008 18:35:00.134874 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d5x5j" event={"ID":"a1d09659-6633-4368-a3e3-632907c3cea7","Type":"ContainerStarted","Data":"8953dc22aeb90588085845485e66dcb651285ef0547ebc2ad33eb020642a2078"} Oct 08 18:35:02 crc kubenswrapper[4750]: I1008 18:35:02.155893 4750 generic.go:334] "Generic (PLEG): container finished" podID="a1d09659-6633-4368-a3e3-632907c3cea7" containerID="930f4ee37ca9e2c7230b0deb801c2b053ea21ac90ad8db64b65f72565e7063c5" exitCode=0 Oct 08 18:35:02 crc kubenswrapper[4750]: I1008 18:35:02.155987 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d5x5j" 
event={"ID":"a1d09659-6633-4368-a3e3-632907c3cea7","Type":"ContainerDied","Data":"930f4ee37ca9e2c7230b0deb801c2b053ea21ac90ad8db64b65f72565e7063c5"} Oct 08 18:35:03 crc kubenswrapper[4750]: I1008 18:35:03.172245 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d5x5j" event={"ID":"a1d09659-6633-4368-a3e3-632907c3cea7","Type":"ContainerStarted","Data":"9b6d3baade4d697986ccb54ab13de425a1d7618a889c5fe089c48e3e0f9cdae5"} Oct 08 18:35:03 crc kubenswrapper[4750]: I1008 18:35:03.201936 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-d5x5j" podStartSLOduration=2.632881288 podStartE2EDuration="5.201902689s" podCreationTimestamp="2025-10-08 18:34:58 +0000 UTC" firstStartedPulling="2025-10-08 18:35:00.134299076 +0000 UTC m=+1456.047270129" lastFinishedPulling="2025-10-08 18:35:02.703320517 +0000 UTC m=+1458.616291530" observedRunningTime="2025-10-08 18:35:03.195604847 +0000 UTC m=+1459.108575920" watchObservedRunningTime="2025-10-08 18:35:03.201902689 +0000 UTC m=+1459.114873772" Oct 08 18:35:09 crc kubenswrapper[4750]: I1008 18:35:09.337525 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-d5x5j" Oct 08 18:35:09 crc kubenswrapper[4750]: I1008 18:35:09.338286 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-d5x5j" Oct 08 18:35:09 crc kubenswrapper[4750]: I1008 18:35:09.419000 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-d5x5j" Oct 08 18:35:10 crc kubenswrapper[4750]: I1008 18:35:10.295021 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-d5x5j" Oct 08 18:35:10 crc kubenswrapper[4750]: I1008 18:35:10.339463 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-d5x5j"] Oct 08 18:35:12 crc kubenswrapper[4750]: I1008 18:35:12.260603 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-d5x5j" podUID="a1d09659-6633-4368-a3e3-632907c3cea7" containerName="registry-server" containerID="cri-o://9b6d3baade4d697986ccb54ab13de425a1d7618a889c5fe089c48e3e0f9cdae5" gracePeriod=2 Oct 08 18:35:12 crc kubenswrapper[4750]: I1008 18:35:12.727337 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d5x5j" Oct 08 18:35:12 crc kubenswrapper[4750]: I1008 18:35:12.916792 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x88sw\" (UniqueName: \"kubernetes.io/projected/a1d09659-6633-4368-a3e3-632907c3cea7-kube-api-access-x88sw\") pod \"a1d09659-6633-4368-a3e3-632907c3cea7\" (UID: \"a1d09659-6633-4368-a3e3-632907c3cea7\") " Oct 08 18:35:12 crc kubenswrapper[4750]: I1008 18:35:12.916888 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1d09659-6633-4368-a3e3-632907c3cea7-utilities\") pod \"a1d09659-6633-4368-a3e3-632907c3cea7\" (UID: \"a1d09659-6633-4368-a3e3-632907c3cea7\") " Oct 08 18:35:12 crc kubenswrapper[4750]: I1008 18:35:12.916975 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1d09659-6633-4368-a3e3-632907c3cea7-catalog-content\") pod \"a1d09659-6633-4368-a3e3-632907c3cea7\" (UID: \"a1d09659-6633-4368-a3e3-632907c3cea7\") " Oct 08 18:35:12 crc kubenswrapper[4750]: I1008 18:35:12.917810 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1d09659-6633-4368-a3e3-632907c3cea7-utilities" (OuterVolumeSpecName: "utilities") pod "a1d09659-6633-4368-a3e3-632907c3cea7" (UID: 
"a1d09659-6633-4368-a3e3-632907c3cea7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:35:12 crc kubenswrapper[4750]: I1008 18:35:12.924637 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1d09659-6633-4368-a3e3-632907c3cea7-kube-api-access-x88sw" (OuterVolumeSpecName: "kube-api-access-x88sw") pod "a1d09659-6633-4368-a3e3-632907c3cea7" (UID: "a1d09659-6633-4368-a3e3-632907c3cea7"). InnerVolumeSpecName "kube-api-access-x88sw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:35:12 crc kubenswrapper[4750]: I1008 18:35:12.960204 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1d09659-6633-4368-a3e3-632907c3cea7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a1d09659-6633-4368-a3e3-632907c3cea7" (UID: "a1d09659-6633-4368-a3e3-632907c3cea7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:35:13 crc kubenswrapper[4750]: I1008 18:35:13.018023 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1d09659-6633-4368-a3e3-632907c3cea7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 18:35:13 crc kubenswrapper[4750]: I1008 18:35:13.018052 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x88sw\" (UniqueName: \"kubernetes.io/projected/a1d09659-6633-4368-a3e3-632907c3cea7-kube-api-access-x88sw\") on node \"crc\" DevicePath \"\"" Oct 08 18:35:13 crc kubenswrapper[4750]: I1008 18:35:13.018064 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1d09659-6633-4368-a3e3-632907c3cea7-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 18:35:13 crc kubenswrapper[4750]: I1008 18:35:13.274584 4750 generic.go:334] "Generic (PLEG): container finished" 
podID="a1d09659-6633-4368-a3e3-632907c3cea7" containerID="9b6d3baade4d697986ccb54ab13de425a1d7618a889c5fe089c48e3e0f9cdae5" exitCode=0 Oct 08 18:35:13 crc kubenswrapper[4750]: I1008 18:35:13.274628 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d5x5j" event={"ID":"a1d09659-6633-4368-a3e3-632907c3cea7","Type":"ContainerDied","Data":"9b6d3baade4d697986ccb54ab13de425a1d7618a889c5fe089c48e3e0f9cdae5"} Oct 08 18:35:13 crc kubenswrapper[4750]: I1008 18:35:13.274660 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d5x5j" event={"ID":"a1d09659-6633-4368-a3e3-632907c3cea7","Type":"ContainerDied","Data":"8953dc22aeb90588085845485e66dcb651285ef0547ebc2ad33eb020642a2078"} Oct 08 18:35:13 crc kubenswrapper[4750]: I1008 18:35:13.274673 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d5x5j" Oct 08 18:35:13 crc kubenswrapper[4750]: I1008 18:35:13.274695 4750 scope.go:117] "RemoveContainer" containerID="9b6d3baade4d697986ccb54ab13de425a1d7618a889c5fe089c48e3e0f9cdae5" Oct 08 18:35:13 crc kubenswrapper[4750]: I1008 18:35:13.306787 4750 scope.go:117] "RemoveContainer" containerID="930f4ee37ca9e2c7230b0deb801c2b053ea21ac90ad8db64b65f72565e7063c5" Oct 08 18:35:13 crc kubenswrapper[4750]: I1008 18:35:13.312131 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d5x5j"] Oct 08 18:35:13 crc kubenswrapper[4750]: I1008 18:35:13.320662 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-d5x5j"] Oct 08 18:35:13 crc kubenswrapper[4750]: I1008 18:35:13.334505 4750 scope.go:117] "RemoveContainer" containerID="e956f060ab76b51ebfc804ad23904c470d6bfd900f5007d9d734a7f7b93eee48" Oct 08 18:35:13 crc kubenswrapper[4750]: I1008 18:35:13.362861 4750 scope.go:117] "RemoveContainer" 
containerID="9b6d3baade4d697986ccb54ab13de425a1d7618a889c5fe089c48e3e0f9cdae5" Oct 08 18:35:13 crc kubenswrapper[4750]: E1008 18:35:13.363393 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b6d3baade4d697986ccb54ab13de425a1d7618a889c5fe089c48e3e0f9cdae5\": container with ID starting with 9b6d3baade4d697986ccb54ab13de425a1d7618a889c5fe089c48e3e0f9cdae5 not found: ID does not exist" containerID="9b6d3baade4d697986ccb54ab13de425a1d7618a889c5fe089c48e3e0f9cdae5" Oct 08 18:35:13 crc kubenswrapper[4750]: I1008 18:35:13.363422 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b6d3baade4d697986ccb54ab13de425a1d7618a889c5fe089c48e3e0f9cdae5"} err="failed to get container status \"9b6d3baade4d697986ccb54ab13de425a1d7618a889c5fe089c48e3e0f9cdae5\": rpc error: code = NotFound desc = could not find container \"9b6d3baade4d697986ccb54ab13de425a1d7618a889c5fe089c48e3e0f9cdae5\": container with ID starting with 9b6d3baade4d697986ccb54ab13de425a1d7618a889c5fe089c48e3e0f9cdae5 not found: ID does not exist" Oct 08 18:35:13 crc kubenswrapper[4750]: I1008 18:35:13.363443 4750 scope.go:117] "RemoveContainer" containerID="930f4ee37ca9e2c7230b0deb801c2b053ea21ac90ad8db64b65f72565e7063c5" Oct 08 18:35:13 crc kubenswrapper[4750]: E1008 18:35:13.363672 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"930f4ee37ca9e2c7230b0deb801c2b053ea21ac90ad8db64b65f72565e7063c5\": container with ID starting with 930f4ee37ca9e2c7230b0deb801c2b053ea21ac90ad8db64b65f72565e7063c5 not found: ID does not exist" containerID="930f4ee37ca9e2c7230b0deb801c2b053ea21ac90ad8db64b65f72565e7063c5" Oct 08 18:35:13 crc kubenswrapper[4750]: I1008 18:35:13.363692 4750 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"930f4ee37ca9e2c7230b0deb801c2b053ea21ac90ad8db64b65f72565e7063c5"} err="failed to get container status \"930f4ee37ca9e2c7230b0deb801c2b053ea21ac90ad8db64b65f72565e7063c5\": rpc error: code = NotFound desc = could not find container \"930f4ee37ca9e2c7230b0deb801c2b053ea21ac90ad8db64b65f72565e7063c5\": container with ID starting with 930f4ee37ca9e2c7230b0deb801c2b053ea21ac90ad8db64b65f72565e7063c5 not found: ID does not exist" Oct 08 18:35:13 crc kubenswrapper[4750]: I1008 18:35:13.363705 4750 scope.go:117] "RemoveContainer" containerID="e956f060ab76b51ebfc804ad23904c470d6bfd900f5007d9d734a7f7b93eee48" Oct 08 18:35:13 crc kubenswrapper[4750]: E1008 18:35:13.363947 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e956f060ab76b51ebfc804ad23904c470d6bfd900f5007d9d734a7f7b93eee48\": container with ID starting with e956f060ab76b51ebfc804ad23904c470d6bfd900f5007d9d734a7f7b93eee48 not found: ID does not exist" containerID="e956f060ab76b51ebfc804ad23904c470d6bfd900f5007d9d734a7f7b93eee48" Oct 08 18:35:13 crc kubenswrapper[4750]: I1008 18:35:13.363970 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e956f060ab76b51ebfc804ad23904c470d6bfd900f5007d9d734a7f7b93eee48"} err="failed to get container status \"e956f060ab76b51ebfc804ad23904c470d6bfd900f5007d9d734a7f7b93eee48\": rpc error: code = NotFound desc = could not find container \"e956f060ab76b51ebfc804ad23904c470d6bfd900f5007d9d734a7f7b93eee48\": container with ID starting with e956f060ab76b51ebfc804ad23904c470d6bfd900f5007d9d734a7f7b93eee48 not found: ID does not exist" Oct 08 18:35:14 crc kubenswrapper[4750]: I1008 18:35:14.750626 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1d09659-6633-4368-a3e3-632907c3cea7" path="/var/lib/kubelet/pods/a1d09659-6633-4368-a3e3-632907c3cea7/volumes" Oct 08 18:35:59 crc kubenswrapper[4750]: I1008 
18:35:59.706716 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 18:35:59 crc kubenswrapper[4750]: I1008 18:35:59.707256 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 18:35:59 crc kubenswrapper[4750]: I1008 18:35:59.710251 4750 scope.go:117] "RemoveContainer" containerID="ca4029d61ca8c3a69a52b031c6cf5be834790b73d98c1a84ffbda4221e421b3e" Oct 08 18:35:59 crc kubenswrapper[4750]: I1008 18:35:59.737782 4750 scope.go:117] "RemoveContainer" containerID="cf4c2892fadc655bf219082514f3fe07d61bd9425e6e92254e3b9651f1597404" Oct 08 18:35:59 crc kubenswrapper[4750]: I1008 18:35:59.756613 4750 scope.go:117] "RemoveContainer" containerID="c8bcb751be31666c14a48d5c24330a20b0c43fc84ee99554962325b0311ad93b" Oct 08 18:35:59 crc kubenswrapper[4750]: I1008 18:35:59.814440 4750 scope.go:117] "RemoveContainer" containerID="1a4f284766309de7c3571dfcdbf004e019a0d6c456fedfed07930a33157eef86" Oct 08 18:35:59 crc kubenswrapper[4750]: I1008 18:35:59.830365 4750 scope.go:117] "RemoveContainer" containerID="6d2d81f330918bc137f967d02746373a6aaad1b372fab751954c626da8c09f77" Oct 08 18:35:59 crc kubenswrapper[4750]: I1008 18:35:59.861631 4750 scope.go:117] "RemoveContainer" containerID="9b81f0da23f03118e4fbf3e11325fa3d29005ce8f6200d3a78e18a1d7bbc7cf8" Oct 08 18:35:59 crc kubenswrapper[4750]: I1008 18:35:59.906946 4750 scope.go:117] "RemoveContainer" containerID="6ba31a99af2f3e04fa9f91b5c6229c6f4e9c35bdad3860c3aa4d9c28c11045a0" Oct 08 18:35:59 crc kubenswrapper[4750]: I1008 
18:35:59.948405 4750 scope.go:117] "RemoveContainer" containerID="69359e54f0c54f8a5c8865533bf2f4afa5a118f2e82303c1851cef48e5f58c91" Oct 08 18:35:59 crc kubenswrapper[4750]: I1008 18:35:59.994660 4750 scope.go:117] "RemoveContainer" containerID="30b8df0973d91b6cf5ac25e5502db9dbe44d6285631679c52bbd4ed5d43e219d" Oct 08 18:36:00 crc kubenswrapper[4750]: I1008 18:36:00.036830 4750 scope.go:117] "RemoveContainer" containerID="0a43919de404773932dbdd0d928ccfb9e008e4049e2135f7c1de0c5a8fb787e1" Oct 08 18:36:00 crc kubenswrapper[4750]: I1008 18:36:00.063703 4750 scope.go:117] "RemoveContainer" containerID="bc3334f37fde9a4e7df22df7f3d4e9f1274959c985f56c2a8e8a61845e708099" Oct 08 18:36:00 crc kubenswrapper[4750]: I1008 18:36:00.078067 4750 scope.go:117] "RemoveContainer" containerID="a053f897eeff89050ab907acb6fc0ae927f0d046b3f849e913ee782c384e7d81" Oct 08 18:36:00 crc kubenswrapper[4750]: I1008 18:36:00.093688 4750 scope.go:117] "RemoveContainer" containerID="f096f491816489e7c0702cbcc047978c74c8d6f41e3e5ccd8ce038ae33450c80" Oct 08 18:36:00 crc kubenswrapper[4750]: I1008 18:36:00.116952 4750 scope.go:117] "RemoveContainer" containerID="ca1124b65f9b238ccebf542a0599c167610983e694ed8e3367e1a9718e1ed196" Oct 08 18:36:00 crc kubenswrapper[4750]: I1008 18:36:00.146589 4750 scope.go:117] "RemoveContainer" containerID="ff70ec89fcd284727c94a8b206245c57250c99089e493a89641fa9ae7e6a9a5a" Oct 08 18:36:00 crc kubenswrapper[4750]: I1008 18:36:00.162861 4750 scope.go:117] "RemoveContainer" containerID="e0eb306813cd83380792baf93d28642815dbbaa7ce0d625844aac67c665b8844" Oct 08 18:36:29 crc kubenswrapper[4750]: I1008 18:36:29.707211 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 18:36:29 crc kubenswrapper[4750]: I1008 18:36:29.708857 4750 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 18:36:59 crc kubenswrapper[4750]: I1008 18:36:59.707617 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 18:36:59 crc kubenswrapper[4750]: I1008 18:36:59.708230 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 18:36:59 crc kubenswrapper[4750]: I1008 18:36:59.708290 4750 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-grddb" Oct 08 18:36:59 crc kubenswrapper[4750]: I1008 18:36:59.709077 4750 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"aaa1f85033e435ca45d798ce82eaed87393be7930e3bf3fb0fa59144d843aac6"} pod="openshift-machine-config-operator/machine-config-daemon-grddb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 18:36:59 crc kubenswrapper[4750]: I1008 18:36:59.709140 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" 
containerName="machine-config-daemon" containerID="cri-o://aaa1f85033e435ca45d798ce82eaed87393be7930e3bf3fb0fa59144d843aac6" gracePeriod=600 Oct 08 18:36:59 crc kubenswrapper[4750]: E1008 18:36:59.827109 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 18:37:00 crc kubenswrapper[4750]: I1008 18:37:00.250901 4750 generic.go:334] "Generic (PLEG): container finished" podID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerID="aaa1f85033e435ca45d798ce82eaed87393be7930e3bf3fb0fa59144d843aac6" exitCode=0 Oct 08 18:37:00 crc kubenswrapper[4750]: I1008 18:37:00.250946 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerDied","Data":"aaa1f85033e435ca45d798ce82eaed87393be7930e3bf3fb0fa59144d843aac6"} Oct 08 18:37:00 crc kubenswrapper[4750]: I1008 18:37:00.250979 4750 scope.go:117] "RemoveContainer" containerID="dabcc140df6267bd6bfe6e96a507eb6f8fc953553e99e9c93846e177880aa4e8" Oct 08 18:37:00 crc kubenswrapper[4750]: I1008 18:37:00.251482 4750 scope.go:117] "RemoveContainer" containerID="aaa1f85033e435ca45d798ce82eaed87393be7930e3bf3fb0fa59144d843aac6" Oct 08 18:37:00 crc kubenswrapper[4750]: E1008 18:37:00.251806 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 18:37:00 crc kubenswrapper[4750]: I1008 18:37:00.394007 4750 scope.go:117] "RemoveContainer" containerID="8e086cb3ed64079b1841a6cb5b0743cfbcec3a0c2ccd2a64235e1bcc59f94726" Oct 08 18:37:00 crc kubenswrapper[4750]: I1008 18:37:00.432590 4750 scope.go:117] "RemoveContainer" containerID="653d2c90232b23be484b8fc7b55378350753180183fba74f1dbd9d6ac1dde268" Oct 08 18:37:00 crc kubenswrapper[4750]: I1008 18:37:00.453697 4750 scope.go:117] "RemoveContainer" containerID="2d2ea6e9b74498814dcf6fb556ced530c433e9440f72f77d7890312774245a65" Oct 08 18:37:00 crc kubenswrapper[4750]: I1008 18:37:00.477886 4750 scope.go:117] "RemoveContainer" containerID="1a151318fef849fd5d62d1d54d979f4d4b38f1111627d51da48a87b928387362" Oct 08 18:37:00 crc kubenswrapper[4750]: I1008 18:37:00.508315 4750 scope.go:117] "RemoveContainer" containerID="4a872e638239e5f6665265383ac7fe690b7080fde4880ee3d5dcdc9e1bf48f36" Oct 08 18:37:00 crc kubenswrapper[4750]: I1008 18:37:00.528661 4750 scope.go:117] "RemoveContainer" containerID="502688bbe286affcf30f98fc226d9553cd6032ff06bd5cff82113e535ec1397d" Oct 08 18:37:00 crc kubenswrapper[4750]: I1008 18:37:00.563582 4750 scope.go:117] "RemoveContainer" containerID="615f108be73c8808e6f727e994a17ccb8fea35ef950e043806543c45a2c404b7" Oct 08 18:37:00 crc kubenswrapper[4750]: I1008 18:37:00.598001 4750 scope.go:117] "RemoveContainer" containerID="b88fc77d21b840724590332b8c3c0a46a81b3b88285cff5ea88ef4c54f55690c" Oct 08 18:37:00 crc kubenswrapper[4750]: I1008 18:37:00.612783 4750 scope.go:117] "RemoveContainer" containerID="e8cbb35efffa0dfc5317921463dc9d7e30007f7e471dd1e64c7dfbb694f8a5d9" Oct 08 18:37:00 crc kubenswrapper[4750]: I1008 18:37:00.632923 4750 scope.go:117] "RemoveContainer" containerID="0cbdb9ccec21a587535b2790946f321f9a1473089f9904fa3e1601f209baa312" Oct 08 18:37:00 crc kubenswrapper[4750]: I1008 18:37:00.646461 4750 scope.go:117] 
"RemoveContainer" containerID="7ea0afff5bdb9f582859dc596ceccf1118ad346f839baedefa4367e2be82fa4a" Oct 08 18:37:00 crc kubenswrapper[4750]: I1008 18:37:00.670798 4750 scope.go:117] "RemoveContainer" containerID="3a4d37b7d84777e50274ee7ecd6a495b1deb334ff5445b8593c34b3c4c6874f5" Oct 08 18:37:00 crc kubenswrapper[4750]: I1008 18:37:00.686387 4750 scope.go:117] "RemoveContainer" containerID="ea10ae3da743eb1f3f0c357c0ff737b1ceb0a0e3e2a2d97d9e6932ea21abcefc" Oct 08 18:37:00 crc kubenswrapper[4750]: I1008 18:37:00.721811 4750 scope.go:117] "RemoveContainer" containerID="f8c62ba9b4feac7a06f75a6ce1e507df508740e668c07ed3f788844322a15de7" Oct 08 18:37:00 crc kubenswrapper[4750]: I1008 18:37:00.743458 4750 scope.go:117] "RemoveContainer" containerID="e3598e5f4e24bd3d7dfa1cc40e2b04f8ead19e68ba96e9840955fd5275a178f5" Oct 08 18:37:00 crc kubenswrapper[4750]: I1008 18:37:00.759572 4750 scope.go:117] "RemoveContainer" containerID="3dac308349d8e3040066022cf2273a5c40cf9dfc274269fe30212fe125c4aad8" Oct 08 18:37:13 crc kubenswrapper[4750]: I1008 18:37:13.734330 4750 scope.go:117] "RemoveContainer" containerID="aaa1f85033e435ca45d798ce82eaed87393be7930e3bf3fb0fa59144d843aac6" Oct 08 18:37:13 crc kubenswrapper[4750]: E1008 18:37:13.735400 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 18:37:27 crc kubenswrapper[4750]: I1008 18:37:27.734520 4750 scope.go:117] "RemoveContainer" containerID="aaa1f85033e435ca45d798ce82eaed87393be7930e3bf3fb0fa59144d843aac6" Oct 08 18:37:27 crc kubenswrapper[4750]: E1008 18:37:27.735187 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 18:37:38 crc kubenswrapper[4750]: I1008 18:37:38.733964 4750 scope.go:117] "RemoveContainer" containerID="aaa1f85033e435ca45d798ce82eaed87393be7930e3bf3fb0fa59144d843aac6" Oct 08 18:37:38 crc kubenswrapper[4750]: E1008 18:37:38.734855 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 18:37:51 crc kubenswrapper[4750]: I1008 18:37:51.734700 4750 scope.go:117] "RemoveContainer" containerID="aaa1f85033e435ca45d798ce82eaed87393be7930e3bf3fb0fa59144d843aac6" Oct 08 18:37:51 crc kubenswrapper[4750]: E1008 18:37:51.735428 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 18:38:00 crc kubenswrapper[4750]: I1008 18:38:00.910775 4750 scope.go:117] "RemoveContainer" containerID="b4f13f7200451000f175c62b160ddb55de7cf3bd3d9beaaa3673047fa05bb9c7" Oct 08 18:38:00 crc kubenswrapper[4750]: I1008 18:38:00.928200 4750 scope.go:117] "RemoveContainer" 
containerID="0ba5b8a05e8f46a8ac74a1a7956f2c4cec61b825654ab0658e1c4de2d90cdd8b" Oct 08 18:38:00 crc kubenswrapper[4750]: I1008 18:38:00.961653 4750 scope.go:117] "RemoveContainer" containerID="2c9770274063e17c68b83e18ee0724852e8b71a48e0e550558f209c61ad073ee" Oct 08 18:38:00 crc kubenswrapper[4750]: I1008 18:38:00.978599 4750 scope.go:117] "RemoveContainer" containerID="1b2e074006d40f2b6f4feb94e1154de720671b362e1e247d9fcda194a8920fbe" Oct 08 18:38:06 crc kubenswrapper[4750]: I1008 18:38:06.733769 4750 scope.go:117] "RemoveContainer" containerID="aaa1f85033e435ca45d798ce82eaed87393be7930e3bf3fb0fa59144d843aac6" Oct 08 18:38:06 crc kubenswrapper[4750]: E1008 18:38:06.734838 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 18:38:19 crc kubenswrapper[4750]: I1008 18:38:19.734575 4750 scope.go:117] "RemoveContainer" containerID="aaa1f85033e435ca45d798ce82eaed87393be7930e3bf3fb0fa59144d843aac6" Oct 08 18:38:19 crc kubenswrapper[4750]: E1008 18:38:19.735462 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 18:38:34 crc kubenswrapper[4750]: I1008 18:38:34.751846 4750 scope.go:117] "RemoveContainer" containerID="aaa1f85033e435ca45d798ce82eaed87393be7930e3bf3fb0fa59144d843aac6" Oct 08 18:38:34 crc 
kubenswrapper[4750]: E1008 18:38:34.754053 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 18:38:45 crc kubenswrapper[4750]: I1008 18:38:45.374633 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-t2j2l"] Oct 08 18:38:45 crc kubenswrapper[4750]: E1008 18:38:45.376595 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1d09659-6633-4368-a3e3-632907c3cea7" containerName="extract-content" Oct 08 18:38:45 crc kubenswrapper[4750]: I1008 18:38:45.376683 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1d09659-6633-4368-a3e3-632907c3cea7" containerName="extract-content" Oct 08 18:38:45 crc kubenswrapper[4750]: E1008 18:38:45.376753 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1d09659-6633-4368-a3e3-632907c3cea7" containerName="extract-utilities" Oct 08 18:38:45 crc kubenswrapper[4750]: I1008 18:38:45.376808 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1d09659-6633-4368-a3e3-632907c3cea7" containerName="extract-utilities" Oct 08 18:38:45 crc kubenswrapper[4750]: E1008 18:38:45.376872 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1d09659-6633-4368-a3e3-632907c3cea7" containerName="registry-server" Oct 08 18:38:45 crc kubenswrapper[4750]: I1008 18:38:45.376930 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1d09659-6633-4368-a3e3-632907c3cea7" containerName="registry-server" Oct 08 18:38:45 crc kubenswrapper[4750]: I1008 18:38:45.377111 4750 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a1d09659-6633-4368-a3e3-632907c3cea7" containerName="registry-server" Oct 08 18:38:45 crc kubenswrapper[4750]: I1008 18:38:45.378370 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t2j2l" Oct 08 18:38:45 crc kubenswrapper[4750]: I1008 18:38:45.387412 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t2j2l"] Oct 08 18:38:45 crc kubenswrapper[4750]: I1008 18:38:45.456447 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cdb0877-bb91-4ea7-b1eb-79dc06b53592-catalog-content\") pod \"redhat-operators-t2j2l\" (UID: \"9cdb0877-bb91-4ea7-b1eb-79dc06b53592\") " pod="openshift-marketplace/redhat-operators-t2j2l" Oct 08 18:38:45 crc kubenswrapper[4750]: I1008 18:38:45.456532 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wp4h\" (UniqueName: \"kubernetes.io/projected/9cdb0877-bb91-4ea7-b1eb-79dc06b53592-kube-api-access-4wp4h\") pod \"redhat-operators-t2j2l\" (UID: \"9cdb0877-bb91-4ea7-b1eb-79dc06b53592\") " pod="openshift-marketplace/redhat-operators-t2j2l" Oct 08 18:38:45 crc kubenswrapper[4750]: I1008 18:38:45.456650 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cdb0877-bb91-4ea7-b1eb-79dc06b53592-utilities\") pod \"redhat-operators-t2j2l\" (UID: \"9cdb0877-bb91-4ea7-b1eb-79dc06b53592\") " pod="openshift-marketplace/redhat-operators-t2j2l" Oct 08 18:38:45 crc kubenswrapper[4750]: I1008 18:38:45.557223 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cdb0877-bb91-4ea7-b1eb-79dc06b53592-catalog-content\") pod \"redhat-operators-t2j2l\" (UID: 
\"9cdb0877-bb91-4ea7-b1eb-79dc06b53592\") " pod="openshift-marketplace/redhat-operators-t2j2l" Oct 08 18:38:45 crc kubenswrapper[4750]: I1008 18:38:45.557265 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wp4h\" (UniqueName: \"kubernetes.io/projected/9cdb0877-bb91-4ea7-b1eb-79dc06b53592-kube-api-access-4wp4h\") pod \"redhat-operators-t2j2l\" (UID: \"9cdb0877-bb91-4ea7-b1eb-79dc06b53592\") " pod="openshift-marketplace/redhat-operators-t2j2l" Oct 08 18:38:45 crc kubenswrapper[4750]: I1008 18:38:45.557318 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cdb0877-bb91-4ea7-b1eb-79dc06b53592-utilities\") pod \"redhat-operators-t2j2l\" (UID: \"9cdb0877-bb91-4ea7-b1eb-79dc06b53592\") " pod="openshift-marketplace/redhat-operators-t2j2l" Oct 08 18:38:45 crc kubenswrapper[4750]: I1008 18:38:45.557946 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cdb0877-bb91-4ea7-b1eb-79dc06b53592-utilities\") pod \"redhat-operators-t2j2l\" (UID: \"9cdb0877-bb91-4ea7-b1eb-79dc06b53592\") " pod="openshift-marketplace/redhat-operators-t2j2l" Oct 08 18:38:45 crc kubenswrapper[4750]: I1008 18:38:45.558014 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cdb0877-bb91-4ea7-b1eb-79dc06b53592-catalog-content\") pod \"redhat-operators-t2j2l\" (UID: \"9cdb0877-bb91-4ea7-b1eb-79dc06b53592\") " pod="openshift-marketplace/redhat-operators-t2j2l" Oct 08 18:38:45 crc kubenswrapper[4750]: I1008 18:38:45.579199 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wp4h\" (UniqueName: \"kubernetes.io/projected/9cdb0877-bb91-4ea7-b1eb-79dc06b53592-kube-api-access-4wp4h\") pod \"redhat-operators-t2j2l\" (UID: \"9cdb0877-bb91-4ea7-b1eb-79dc06b53592\") " 
pod="openshift-marketplace/redhat-operators-t2j2l" Oct 08 18:38:45 crc kubenswrapper[4750]: I1008 18:38:45.697328 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t2j2l" Oct 08 18:38:46 crc kubenswrapper[4750]: I1008 18:38:46.135714 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t2j2l"] Oct 08 18:38:47 crc kubenswrapper[4750]: I1008 18:38:47.102715 4750 generic.go:334] "Generic (PLEG): container finished" podID="9cdb0877-bb91-4ea7-b1eb-79dc06b53592" containerID="4b1149b9050f265d82dd46261ed30f44bf46576f22da1577e817202dd7c800f1" exitCode=0 Oct 08 18:38:47 crc kubenswrapper[4750]: I1008 18:38:47.102827 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t2j2l" event={"ID":"9cdb0877-bb91-4ea7-b1eb-79dc06b53592","Type":"ContainerDied","Data":"4b1149b9050f265d82dd46261ed30f44bf46576f22da1577e817202dd7c800f1"} Oct 08 18:38:47 crc kubenswrapper[4750]: I1008 18:38:47.103016 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t2j2l" event={"ID":"9cdb0877-bb91-4ea7-b1eb-79dc06b53592","Type":"ContainerStarted","Data":"b5e73ae3767a431674ce6fa5836c1eb86e3847a60db39e86dafe2fac16cfe16f"} Oct 08 18:38:47 crc kubenswrapper[4750]: I1008 18:38:47.733911 4750 scope.go:117] "RemoveContainer" containerID="aaa1f85033e435ca45d798ce82eaed87393be7930e3bf3fb0fa59144d843aac6" Oct 08 18:38:47 crc kubenswrapper[4750]: E1008 18:38:47.734117 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 18:38:47 crc 
kubenswrapper[4750]: I1008 18:38:47.965206 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-27vpr"] Oct 08 18:38:47 crc kubenswrapper[4750]: I1008 18:38:47.969990 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-27vpr" Oct 08 18:38:47 crc kubenswrapper[4750]: I1008 18:38:47.986128 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-27vpr"] Oct 08 18:38:48 crc kubenswrapper[4750]: I1008 18:38:48.093803 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d60974c8-e497-4ef5-8293-81ae0f64217d-utilities\") pod \"redhat-marketplace-27vpr\" (UID: \"d60974c8-e497-4ef5-8293-81ae0f64217d\") " pod="openshift-marketplace/redhat-marketplace-27vpr" Oct 08 18:38:48 crc kubenswrapper[4750]: I1008 18:38:48.094256 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d60974c8-e497-4ef5-8293-81ae0f64217d-catalog-content\") pod \"redhat-marketplace-27vpr\" (UID: \"d60974c8-e497-4ef5-8293-81ae0f64217d\") " pod="openshift-marketplace/redhat-marketplace-27vpr" Oct 08 18:38:48 crc kubenswrapper[4750]: I1008 18:38:48.094423 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7jvh\" (UniqueName: \"kubernetes.io/projected/d60974c8-e497-4ef5-8293-81ae0f64217d-kube-api-access-p7jvh\") pod \"redhat-marketplace-27vpr\" (UID: \"d60974c8-e497-4ef5-8293-81ae0f64217d\") " pod="openshift-marketplace/redhat-marketplace-27vpr" Oct 08 18:38:48 crc kubenswrapper[4750]: I1008 18:38:48.111904 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t2j2l" 
event={"ID":"9cdb0877-bb91-4ea7-b1eb-79dc06b53592","Type":"ContainerStarted","Data":"4a8a1f1dc02a9c37e18ecc20f3012ace2a6f7fae44786daeedeec7bcdf43260d"} Oct 08 18:38:48 crc kubenswrapper[4750]: I1008 18:38:48.195889 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7jvh\" (UniqueName: \"kubernetes.io/projected/d60974c8-e497-4ef5-8293-81ae0f64217d-kube-api-access-p7jvh\") pod \"redhat-marketplace-27vpr\" (UID: \"d60974c8-e497-4ef5-8293-81ae0f64217d\") " pod="openshift-marketplace/redhat-marketplace-27vpr" Oct 08 18:38:48 crc kubenswrapper[4750]: I1008 18:38:48.196307 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d60974c8-e497-4ef5-8293-81ae0f64217d-utilities\") pod \"redhat-marketplace-27vpr\" (UID: \"d60974c8-e497-4ef5-8293-81ae0f64217d\") " pod="openshift-marketplace/redhat-marketplace-27vpr" Oct 08 18:38:48 crc kubenswrapper[4750]: I1008 18:38:48.196445 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d60974c8-e497-4ef5-8293-81ae0f64217d-catalog-content\") pod \"redhat-marketplace-27vpr\" (UID: \"d60974c8-e497-4ef5-8293-81ae0f64217d\") " pod="openshift-marketplace/redhat-marketplace-27vpr" Oct 08 18:38:48 crc kubenswrapper[4750]: I1008 18:38:48.196836 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d60974c8-e497-4ef5-8293-81ae0f64217d-utilities\") pod \"redhat-marketplace-27vpr\" (UID: \"d60974c8-e497-4ef5-8293-81ae0f64217d\") " pod="openshift-marketplace/redhat-marketplace-27vpr" Oct 08 18:38:48 crc kubenswrapper[4750]: I1008 18:38:48.196886 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d60974c8-e497-4ef5-8293-81ae0f64217d-catalog-content\") pod \"redhat-marketplace-27vpr\" 
(UID: \"d60974c8-e497-4ef5-8293-81ae0f64217d\") " pod="openshift-marketplace/redhat-marketplace-27vpr" Oct 08 18:38:48 crc kubenswrapper[4750]: I1008 18:38:48.217103 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7jvh\" (UniqueName: \"kubernetes.io/projected/d60974c8-e497-4ef5-8293-81ae0f64217d-kube-api-access-p7jvh\") pod \"redhat-marketplace-27vpr\" (UID: \"d60974c8-e497-4ef5-8293-81ae0f64217d\") " pod="openshift-marketplace/redhat-marketplace-27vpr" Oct 08 18:38:48 crc kubenswrapper[4750]: I1008 18:38:48.295566 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-27vpr" Oct 08 18:38:48 crc kubenswrapper[4750]: I1008 18:38:48.748830 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-27vpr"] Oct 08 18:38:48 crc kubenswrapper[4750]: W1008 18:38:48.749779 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd60974c8_e497_4ef5_8293_81ae0f64217d.slice/crio-b56203eb6f07c97c32d8fcf8deec51b2ecf78bd9c47657ecc5816e1a38342cb2 WatchSource:0}: Error finding container b56203eb6f07c97c32d8fcf8deec51b2ecf78bd9c47657ecc5816e1a38342cb2: Status 404 returned error can't find the container with id b56203eb6f07c97c32d8fcf8deec51b2ecf78bd9c47657ecc5816e1a38342cb2 Oct 08 18:38:49 crc kubenswrapper[4750]: I1008 18:38:49.119923 4750 generic.go:334] "Generic (PLEG): container finished" podID="9cdb0877-bb91-4ea7-b1eb-79dc06b53592" containerID="4a8a1f1dc02a9c37e18ecc20f3012ace2a6f7fae44786daeedeec7bcdf43260d" exitCode=0 Oct 08 18:38:49 crc kubenswrapper[4750]: I1008 18:38:49.119985 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t2j2l" event={"ID":"9cdb0877-bb91-4ea7-b1eb-79dc06b53592","Type":"ContainerDied","Data":"4a8a1f1dc02a9c37e18ecc20f3012ace2a6f7fae44786daeedeec7bcdf43260d"} Oct 08 18:38:49 crc 
kubenswrapper[4750]: I1008 18:38:49.121790 4750 generic.go:334] "Generic (PLEG): container finished" podID="d60974c8-e497-4ef5-8293-81ae0f64217d" containerID="07227c9e1f4dc47ecddaaefcc9e39b779e40e4ed51eb95c34273aed2e79563a2" exitCode=0 Oct 08 18:38:49 crc kubenswrapper[4750]: I1008 18:38:49.121830 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27vpr" event={"ID":"d60974c8-e497-4ef5-8293-81ae0f64217d","Type":"ContainerDied","Data":"07227c9e1f4dc47ecddaaefcc9e39b779e40e4ed51eb95c34273aed2e79563a2"} Oct 08 18:38:49 crc kubenswrapper[4750]: I1008 18:38:49.121852 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27vpr" event={"ID":"d60974c8-e497-4ef5-8293-81ae0f64217d","Type":"ContainerStarted","Data":"b56203eb6f07c97c32d8fcf8deec51b2ecf78bd9c47657ecc5816e1a38342cb2"} Oct 08 18:38:50 crc kubenswrapper[4750]: I1008 18:38:50.130150 4750 generic.go:334] "Generic (PLEG): container finished" podID="d60974c8-e497-4ef5-8293-81ae0f64217d" containerID="7e2a1fceb3f242057c2127091d828fb36433b1a7d82728a0104afe725f09bcf0" exitCode=0 Oct 08 18:38:50 crc kubenswrapper[4750]: I1008 18:38:50.130221 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27vpr" event={"ID":"d60974c8-e497-4ef5-8293-81ae0f64217d","Type":"ContainerDied","Data":"7e2a1fceb3f242057c2127091d828fb36433b1a7d82728a0104afe725f09bcf0"} Oct 08 18:38:50 crc kubenswrapper[4750]: I1008 18:38:50.134858 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t2j2l" event={"ID":"9cdb0877-bb91-4ea7-b1eb-79dc06b53592","Type":"ContainerStarted","Data":"1330f06ba7f78d097933d9897c6c0c61f8280bc286c88c76dde759ac0c81eda5"} Oct 08 18:38:50 crc kubenswrapper[4750]: I1008 18:38:50.169924 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-t2j2l" 
podStartSLOduration=2.7377695429999998 podStartE2EDuration="5.169905096s" podCreationTimestamp="2025-10-08 18:38:45 +0000 UTC" firstStartedPulling="2025-10-08 18:38:47.104825537 +0000 UTC m=+1683.017796550" lastFinishedPulling="2025-10-08 18:38:49.53696108 +0000 UTC m=+1685.449932103" observedRunningTime="2025-10-08 18:38:50.163066729 +0000 UTC m=+1686.076037772" watchObservedRunningTime="2025-10-08 18:38:50.169905096 +0000 UTC m=+1686.082876109" Oct 08 18:38:51 crc kubenswrapper[4750]: I1008 18:38:51.143018 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27vpr" event={"ID":"d60974c8-e497-4ef5-8293-81ae0f64217d","Type":"ContainerStarted","Data":"951a63e07ca49261783f8c87c2c8b8ad70fa78f72fb0e30fe624814d0ccb5323"} Oct 08 18:38:51 crc kubenswrapper[4750]: I1008 18:38:51.167605 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-27vpr" podStartSLOduration=2.598761691 podStartE2EDuration="4.167580957s" podCreationTimestamp="2025-10-08 18:38:47 +0000 UTC" firstStartedPulling="2025-10-08 18:38:49.123699743 +0000 UTC m=+1685.036670796" lastFinishedPulling="2025-10-08 18:38:50.692519049 +0000 UTC m=+1686.605490062" observedRunningTime="2025-10-08 18:38:51.160320111 +0000 UTC m=+1687.073291124" watchObservedRunningTime="2025-10-08 18:38:51.167580957 +0000 UTC m=+1687.080552010" Oct 08 18:38:55 crc kubenswrapper[4750]: I1008 18:38:55.698025 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-t2j2l" Oct 08 18:38:55 crc kubenswrapper[4750]: I1008 18:38:55.698349 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-t2j2l" Oct 08 18:38:55 crc kubenswrapper[4750]: I1008 18:38:55.751507 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-t2j2l" Oct 08 18:38:56 crc 
kubenswrapper[4750]: I1008 18:38:56.238263 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-t2j2l" Oct 08 18:38:56 crc kubenswrapper[4750]: I1008 18:38:56.355413 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t2j2l"] Oct 08 18:38:58 crc kubenswrapper[4750]: I1008 18:38:58.214083 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-t2j2l" podUID="9cdb0877-bb91-4ea7-b1eb-79dc06b53592" containerName="registry-server" containerID="cri-o://1330f06ba7f78d097933d9897c6c0c61f8280bc286c88c76dde759ac0c81eda5" gracePeriod=2 Oct 08 18:38:58 crc kubenswrapper[4750]: I1008 18:38:58.296986 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-27vpr" Oct 08 18:38:58 crc kubenswrapper[4750]: I1008 18:38:58.297059 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-27vpr" Oct 08 18:38:58 crc kubenswrapper[4750]: I1008 18:38:58.364226 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-27vpr" Oct 08 18:38:59 crc kubenswrapper[4750]: I1008 18:38:59.310592 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-27vpr" Oct 08 18:38:59 crc kubenswrapper[4750]: I1008 18:38:59.761483 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-27vpr"] Oct 08 18:39:01 crc kubenswrapper[4750]: I1008 18:39:01.072357 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t2j2l" Oct 08 18:39:01 crc kubenswrapper[4750]: I1008 18:39:01.084505 4750 scope.go:117] "RemoveContainer" containerID="5eedb186a864de7ffc8ffb0c7d12aa7cbfd51fbb9c7ce15f42d2611d1dc2df3a" Oct 08 18:39:01 crc kubenswrapper[4750]: I1008 18:39:01.112334 4750 scope.go:117] "RemoveContainer" containerID="ef0cea034999dc054ed498ebdf790b29ef4af6c72ed138ea7f570a475564630d" Oct 08 18:39:01 crc kubenswrapper[4750]: I1008 18:39:01.132030 4750 scope.go:117] "RemoveContainer" containerID="e214212500f39b2cfd5d5ceb30ff46551ecb5483a36e38847a82604576ad8136" Oct 08 18:39:01 crc kubenswrapper[4750]: I1008 18:39:01.177893 4750 scope.go:117] "RemoveContainer" containerID="19ff8cd5ce4ea2ca5a48e87f8b09c6d1da6004603a55895cb9163299bb16a295" Oct 08 18:39:01 crc kubenswrapper[4750]: I1008 18:39:01.193211 4750 scope.go:117] "RemoveContainer" containerID="cf0d886f93ff577bb445d8aa80c9cd9e710f2e6e45c5f75d8333a920295edfa7" Oct 08 18:39:01 crc kubenswrapper[4750]: I1008 18:39:01.197979 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cdb0877-bb91-4ea7-b1eb-79dc06b53592-utilities\") pod \"9cdb0877-bb91-4ea7-b1eb-79dc06b53592\" (UID: \"9cdb0877-bb91-4ea7-b1eb-79dc06b53592\") " Oct 08 18:39:01 crc kubenswrapper[4750]: I1008 18:39:01.198062 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cdb0877-bb91-4ea7-b1eb-79dc06b53592-catalog-content\") pod \"9cdb0877-bb91-4ea7-b1eb-79dc06b53592\" (UID: \"9cdb0877-bb91-4ea7-b1eb-79dc06b53592\") " Oct 08 18:39:01 crc kubenswrapper[4750]: I1008 18:39:01.198142 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wp4h\" (UniqueName: \"kubernetes.io/projected/9cdb0877-bb91-4ea7-b1eb-79dc06b53592-kube-api-access-4wp4h\") pod \"9cdb0877-bb91-4ea7-b1eb-79dc06b53592\" (UID: 
\"9cdb0877-bb91-4ea7-b1eb-79dc06b53592\") " Oct 08 18:39:01 crc kubenswrapper[4750]: I1008 18:39:01.198966 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cdb0877-bb91-4ea7-b1eb-79dc06b53592-utilities" (OuterVolumeSpecName: "utilities") pod "9cdb0877-bb91-4ea7-b1eb-79dc06b53592" (UID: "9cdb0877-bb91-4ea7-b1eb-79dc06b53592"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:39:01 crc kubenswrapper[4750]: I1008 18:39:01.203493 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cdb0877-bb91-4ea7-b1eb-79dc06b53592-kube-api-access-4wp4h" (OuterVolumeSpecName: "kube-api-access-4wp4h") pod "9cdb0877-bb91-4ea7-b1eb-79dc06b53592" (UID: "9cdb0877-bb91-4ea7-b1eb-79dc06b53592"). InnerVolumeSpecName "kube-api-access-4wp4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:39:01 crc kubenswrapper[4750]: I1008 18:39:01.212620 4750 scope.go:117] "RemoveContainer" containerID="225967b62a0a3c9db87ba69878ef83c98fd4d002b6b485a3ba2e44f7c9932962" Oct 08 18:39:01 crc kubenswrapper[4750]: I1008 18:39:01.241600 4750 scope.go:117] "RemoveContainer" containerID="7167710fc93b29300ba5e867d0ed8d94f5a2083a6597592a2450ba7cc525c554" Oct 08 18:39:01 crc kubenswrapper[4750]: I1008 18:39:01.247030 4750 generic.go:334] "Generic (PLEG): container finished" podID="9cdb0877-bb91-4ea7-b1eb-79dc06b53592" containerID="1330f06ba7f78d097933d9897c6c0c61f8280bc286c88c76dde759ac0c81eda5" exitCode=0 Oct 08 18:39:01 crc kubenswrapper[4750]: I1008 18:39:01.247077 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t2j2l" event={"ID":"9cdb0877-bb91-4ea7-b1eb-79dc06b53592","Type":"ContainerDied","Data":"1330f06ba7f78d097933d9897c6c0c61f8280bc286c88c76dde759ac0c81eda5"} Oct 08 18:39:01 crc kubenswrapper[4750]: I1008 18:39:01.247114 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t2j2l" Oct 08 18:39:01 crc kubenswrapper[4750]: I1008 18:39:01.247142 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t2j2l" event={"ID":"9cdb0877-bb91-4ea7-b1eb-79dc06b53592","Type":"ContainerDied","Data":"b5e73ae3767a431674ce6fa5836c1eb86e3847a60db39e86dafe2fac16cfe16f"} Oct 08 18:39:01 crc kubenswrapper[4750]: I1008 18:39:01.247173 4750 scope.go:117] "RemoveContainer" containerID="1330f06ba7f78d097933d9897c6c0c61f8280bc286c88c76dde759ac0c81eda5" Oct 08 18:39:01 crc kubenswrapper[4750]: I1008 18:39:01.247211 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-27vpr" podUID="d60974c8-e497-4ef5-8293-81ae0f64217d" containerName="registry-server" containerID="cri-o://951a63e07ca49261783f8c87c2c8b8ad70fa78f72fb0e30fe624814d0ccb5323" gracePeriod=2 Oct 08 18:39:01 crc kubenswrapper[4750]: I1008 18:39:01.262593 4750 scope.go:117] "RemoveContainer" containerID="295b9725591cb856aaa6df5c90337486a2c9085eabdd2a6955b57c6bb2f78110" Oct 08 18:39:01 crc kubenswrapper[4750]: I1008 18:39:01.270807 4750 scope.go:117] "RemoveContainer" containerID="4a8a1f1dc02a9c37e18ecc20f3012ace2a6f7fae44786daeedeec7bcdf43260d" Oct 08 18:39:01 crc kubenswrapper[4750]: I1008 18:39:01.283965 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cdb0877-bb91-4ea7-b1eb-79dc06b53592-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9cdb0877-bb91-4ea7-b1eb-79dc06b53592" (UID: "9cdb0877-bb91-4ea7-b1eb-79dc06b53592"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:39:01 crc kubenswrapper[4750]: I1008 18:39:01.283976 4750 scope.go:117] "RemoveContainer" containerID="919a9750bdfbd9cb9a2ec586476afa3486a7c426aa6ae6693b2697ba44404b4a" Oct 08 18:39:01 crc kubenswrapper[4750]: I1008 18:39:01.297752 4750 scope.go:117] "RemoveContainer" containerID="4b1149b9050f265d82dd46261ed30f44bf46576f22da1577e817202dd7c800f1" Oct 08 18:39:01 crc kubenswrapper[4750]: I1008 18:39:01.299160 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cdb0877-bb91-4ea7-b1eb-79dc06b53592-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 18:39:01 crc kubenswrapper[4750]: I1008 18:39:01.299181 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cdb0877-bb91-4ea7-b1eb-79dc06b53592-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 18:39:01 crc kubenswrapper[4750]: I1008 18:39:01.299194 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wp4h\" (UniqueName: \"kubernetes.io/projected/9cdb0877-bb91-4ea7-b1eb-79dc06b53592-kube-api-access-4wp4h\") on node \"crc\" DevicePath \"\"" Oct 08 18:39:01 crc kubenswrapper[4750]: I1008 18:39:01.367085 4750 scope.go:117] "RemoveContainer" containerID="0feb4ca16b6758c6207a2a94aa09e8a8463ace7881baffb89c3d90fa392320bd" Oct 08 18:39:01 crc kubenswrapper[4750]: I1008 18:39:01.374689 4750 scope.go:117] "RemoveContainer" containerID="1330f06ba7f78d097933d9897c6c0c61f8280bc286c88c76dde759ac0c81eda5" Oct 08 18:39:01 crc kubenswrapper[4750]: E1008 18:39:01.375108 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1330f06ba7f78d097933d9897c6c0c61f8280bc286c88c76dde759ac0c81eda5\": container with ID starting with 1330f06ba7f78d097933d9897c6c0c61f8280bc286c88c76dde759ac0c81eda5 not found: ID does not exist" 
containerID="1330f06ba7f78d097933d9897c6c0c61f8280bc286c88c76dde759ac0c81eda5" Oct 08 18:39:01 crc kubenswrapper[4750]: I1008 18:39:01.375147 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1330f06ba7f78d097933d9897c6c0c61f8280bc286c88c76dde759ac0c81eda5"} err="failed to get container status \"1330f06ba7f78d097933d9897c6c0c61f8280bc286c88c76dde759ac0c81eda5\": rpc error: code = NotFound desc = could not find container \"1330f06ba7f78d097933d9897c6c0c61f8280bc286c88c76dde759ac0c81eda5\": container with ID starting with 1330f06ba7f78d097933d9897c6c0c61f8280bc286c88c76dde759ac0c81eda5 not found: ID does not exist" Oct 08 18:39:01 crc kubenswrapper[4750]: I1008 18:39:01.375174 4750 scope.go:117] "RemoveContainer" containerID="4a8a1f1dc02a9c37e18ecc20f3012ace2a6f7fae44786daeedeec7bcdf43260d" Oct 08 18:39:01 crc kubenswrapper[4750]: E1008 18:39:01.375468 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a8a1f1dc02a9c37e18ecc20f3012ace2a6f7fae44786daeedeec7bcdf43260d\": container with ID starting with 4a8a1f1dc02a9c37e18ecc20f3012ace2a6f7fae44786daeedeec7bcdf43260d not found: ID does not exist" containerID="4a8a1f1dc02a9c37e18ecc20f3012ace2a6f7fae44786daeedeec7bcdf43260d" Oct 08 18:39:01 crc kubenswrapper[4750]: I1008 18:39:01.375508 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a8a1f1dc02a9c37e18ecc20f3012ace2a6f7fae44786daeedeec7bcdf43260d"} err="failed to get container status \"4a8a1f1dc02a9c37e18ecc20f3012ace2a6f7fae44786daeedeec7bcdf43260d\": rpc error: code = NotFound desc = could not find container \"4a8a1f1dc02a9c37e18ecc20f3012ace2a6f7fae44786daeedeec7bcdf43260d\": container with ID starting with 4a8a1f1dc02a9c37e18ecc20f3012ace2a6f7fae44786daeedeec7bcdf43260d not found: ID does not exist" Oct 08 18:39:01 crc kubenswrapper[4750]: I1008 18:39:01.375536 4750 scope.go:117] 
"RemoveContainer" containerID="4b1149b9050f265d82dd46261ed30f44bf46576f22da1577e817202dd7c800f1" Oct 08 18:39:01 crc kubenswrapper[4750]: E1008 18:39:01.376040 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b1149b9050f265d82dd46261ed30f44bf46576f22da1577e817202dd7c800f1\": container with ID starting with 4b1149b9050f265d82dd46261ed30f44bf46576f22da1577e817202dd7c800f1 not found: ID does not exist" containerID="4b1149b9050f265d82dd46261ed30f44bf46576f22da1577e817202dd7c800f1" Oct 08 18:39:01 crc kubenswrapper[4750]: I1008 18:39:01.376080 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b1149b9050f265d82dd46261ed30f44bf46576f22da1577e817202dd7c800f1"} err="failed to get container status \"4b1149b9050f265d82dd46261ed30f44bf46576f22da1577e817202dd7c800f1\": rpc error: code = NotFound desc = could not find container \"4b1149b9050f265d82dd46261ed30f44bf46576f22da1577e817202dd7c800f1\": container with ID starting with 4b1149b9050f265d82dd46261ed30f44bf46576f22da1577e817202dd7c800f1 not found: ID does not exist" Oct 08 18:39:01 crc kubenswrapper[4750]: I1008 18:39:01.582427 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t2j2l"] Oct 08 18:39:01 crc kubenswrapper[4750]: I1008 18:39:01.591621 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-t2j2l"] Oct 08 18:39:01 crc kubenswrapper[4750]: I1008 18:39:01.673696 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-27vpr" Oct 08 18:39:01 crc kubenswrapper[4750]: I1008 18:39:01.734507 4750 scope.go:117] "RemoveContainer" containerID="aaa1f85033e435ca45d798ce82eaed87393be7930e3bf3fb0fa59144d843aac6" Oct 08 18:39:01 crc kubenswrapper[4750]: E1008 18:39:01.734893 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 18:39:01 crc kubenswrapper[4750]: I1008 18:39:01.805054 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d60974c8-e497-4ef5-8293-81ae0f64217d-utilities\") pod \"d60974c8-e497-4ef5-8293-81ae0f64217d\" (UID: \"d60974c8-e497-4ef5-8293-81ae0f64217d\") " Oct 08 18:39:01 crc kubenswrapper[4750]: I1008 18:39:01.805124 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7jvh\" (UniqueName: \"kubernetes.io/projected/d60974c8-e497-4ef5-8293-81ae0f64217d-kube-api-access-p7jvh\") pod \"d60974c8-e497-4ef5-8293-81ae0f64217d\" (UID: \"d60974c8-e497-4ef5-8293-81ae0f64217d\") " Oct 08 18:39:01 crc kubenswrapper[4750]: I1008 18:39:01.805218 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d60974c8-e497-4ef5-8293-81ae0f64217d-catalog-content\") pod \"d60974c8-e497-4ef5-8293-81ae0f64217d\" (UID: \"d60974c8-e497-4ef5-8293-81ae0f64217d\") " Oct 08 18:39:01 crc kubenswrapper[4750]: I1008 18:39:01.806038 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/d60974c8-e497-4ef5-8293-81ae0f64217d-utilities" (OuterVolumeSpecName: "utilities") pod "d60974c8-e497-4ef5-8293-81ae0f64217d" (UID: "d60974c8-e497-4ef5-8293-81ae0f64217d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:39:01 crc kubenswrapper[4750]: I1008 18:39:01.809631 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d60974c8-e497-4ef5-8293-81ae0f64217d-kube-api-access-p7jvh" (OuterVolumeSpecName: "kube-api-access-p7jvh") pod "d60974c8-e497-4ef5-8293-81ae0f64217d" (UID: "d60974c8-e497-4ef5-8293-81ae0f64217d"). InnerVolumeSpecName "kube-api-access-p7jvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:39:01 crc kubenswrapper[4750]: I1008 18:39:01.819033 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d60974c8-e497-4ef5-8293-81ae0f64217d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d60974c8-e497-4ef5-8293-81ae0f64217d" (UID: "d60974c8-e497-4ef5-8293-81ae0f64217d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:39:01 crc kubenswrapper[4750]: I1008 18:39:01.906857 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d60974c8-e497-4ef5-8293-81ae0f64217d-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 18:39:01 crc kubenswrapper[4750]: I1008 18:39:01.907073 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7jvh\" (UniqueName: \"kubernetes.io/projected/d60974c8-e497-4ef5-8293-81ae0f64217d-kube-api-access-p7jvh\") on node \"crc\" DevicePath \"\"" Oct 08 18:39:01 crc kubenswrapper[4750]: I1008 18:39:01.907086 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d60974c8-e497-4ef5-8293-81ae0f64217d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 18:39:02 crc kubenswrapper[4750]: I1008 18:39:02.271735 4750 generic.go:334] "Generic (PLEG): container finished" podID="d60974c8-e497-4ef5-8293-81ae0f64217d" containerID="951a63e07ca49261783f8c87c2c8b8ad70fa78f72fb0e30fe624814d0ccb5323" exitCode=0 Oct 08 18:39:02 crc kubenswrapper[4750]: I1008 18:39:02.271806 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-27vpr" Oct 08 18:39:02 crc kubenswrapper[4750]: I1008 18:39:02.271821 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27vpr" event={"ID":"d60974c8-e497-4ef5-8293-81ae0f64217d","Type":"ContainerDied","Data":"951a63e07ca49261783f8c87c2c8b8ad70fa78f72fb0e30fe624814d0ccb5323"} Oct 08 18:39:02 crc kubenswrapper[4750]: I1008 18:39:02.271862 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27vpr" event={"ID":"d60974c8-e497-4ef5-8293-81ae0f64217d","Type":"ContainerDied","Data":"b56203eb6f07c97c32d8fcf8deec51b2ecf78bd9c47657ecc5816e1a38342cb2"} Oct 08 18:39:02 crc kubenswrapper[4750]: I1008 18:39:02.271906 4750 scope.go:117] "RemoveContainer" containerID="951a63e07ca49261783f8c87c2c8b8ad70fa78f72fb0e30fe624814d0ccb5323" Oct 08 18:39:02 crc kubenswrapper[4750]: I1008 18:39:02.301889 4750 scope.go:117] "RemoveContainer" containerID="7e2a1fceb3f242057c2127091d828fb36433b1a7d82728a0104afe725f09bcf0" Oct 08 18:39:02 crc kubenswrapper[4750]: I1008 18:39:02.309561 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-27vpr"] Oct 08 18:39:02 crc kubenswrapper[4750]: I1008 18:39:02.314616 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-27vpr"] Oct 08 18:39:02 crc kubenswrapper[4750]: I1008 18:39:02.342067 4750 scope.go:117] "RemoveContainer" containerID="07227c9e1f4dc47ecddaaefcc9e39b779e40e4ed51eb95c34273aed2e79563a2" Oct 08 18:39:02 crc kubenswrapper[4750]: I1008 18:39:02.369248 4750 scope.go:117] "RemoveContainer" containerID="951a63e07ca49261783f8c87c2c8b8ad70fa78f72fb0e30fe624814d0ccb5323" Oct 08 18:39:02 crc kubenswrapper[4750]: E1008 18:39:02.369758 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"951a63e07ca49261783f8c87c2c8b8ad70fa78f72fb0e30fe624814d0ccb5323\": container with ID starting with 951a63e07ca49261783f8c87c2c8b8ad70fa78f72fb0e30fe624814d0ccb5323 not found: ID does not exist" containerID="951a63e07ca49261783f8c87c2c8b8ad70fa78f72fb0e30fe624814d0ccb5323" Oct 08 18:39:02 crc kubenswrapper[4750]: I1008 18:39:02.369790 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"951a63e07ca49261783f8c87c2c8b8ad70fa78f72fb0e30fe624814d0ccb5323"} err="failed to get container status \"951a63e07ca49261783f8c87c2c8b8ad70fa78f72fb0e30fe624814d0ccb5323\": rpc error: code = NotFound desc = could not find container \"951a63e07ca49261783f8c87c2c8b8ad70fa78f72fb0e30fe624814d0ccb5323\": container with ID starting with 951a63e07ca49261783f8c87c2c8b8ad70fa78f72fb0e30fe624814d0ccb5323 not found: ID does not exist" Oct 08 18:39:02 crc kubenswrapper[4750]: I1008 18:39:02.369811 4750 scope.go:117] "RemoveContainer" containerID="7e2a1fceb3f242057c2127091d828fb36433b1a7d82728a0104afe725f09bcf0" Oct 08 18:39:02 crc kubenswrapper[4750]: E1008 18:39:02.370267 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e2a1fceb3f242057c2127091d828fb36433b1a7d82728a0104afe725f09bcf0\": container with ID starting with 7e2a1fceb3f242057c2127091d828fb36433b1a7d82728a0104afe725f09bcf0 not found: ID does not exist" containerID="7e2a1fceb3f242057c2127091d828fb36433b1a7d82728a0104afe725f09bcf0" Oct 08 18:39:02 crc kubenswrapper[4750]: I1008 18:39:02.370326 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e2a1fceb3f242057c2127091d828fb36433b1a7d82728a0104afe725f09bcf0"} err="failed to get container status \"7e2a1fceb3f242057c2127091d828fb36433b1a7d82728a0104afe725f09bcf0\": rpc error: code = NotFound desc = could not find container \"7e2a1fceb3f242057c2127091d828fb36433b1a7d82728a0104afe725f09bcf0\": container with ID 
starting with 7e2a1fceb3f242057c2127091d828fb36433b1a7d82728a0104afe725f09bcf0 not found: ID does not exist" Oct 08 18:39:02 crc kubenswrapper[4750]: I1008 18:39:02.370356 4750 scope.go:117] "RemoveContainer" containerID="07227c9e1f4dc47ecddaaefcc9e39b779e40e4ed51eb95c34273aed2e79563a2" Oct 08 18:39:02 crc kubenswrapper[4750]: E1008 18:39:02.370783 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07227c9e1f4dc47ecddaaefcc9e39b779e40e4ed51eb95c34273aed2e79563a2\": container with ID starting with 07227c9e1f4dc47ecddaaefcc9e39b779e40e4ed51eb95c34273aed2e79563a2 not found: ID does not exist" containerID="07227c9e1f4dc47ecddaaefcc9e39b779e40e4ed51eb95c34273aed2e79563a2" Oct 08 18:39:02 crc kubenswrapper[4750]: I1008 18:39:02.370809 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07227c9e1f4dc47ecddaaefcc9e39b779e40e4ed51eb95c34273aed2e79563a2"} err="failed to get container status \"07227c9e1f4dc47ecddaaefcc9e39b779e40e4ed51eb95c34273aed2e79563a2\": rpc error: code = NotFound desc = could not find container \"07227c9e1f4dc47ecddaaefcc9e39b779e40e4ed51eb95c34273aed2e79563a2\": container with ID starting with 07227c9e1f4dc47ecddaaefcc9e39b779e40e4ed51eb95c34273aed2e79563a2 not found: ID does not exist" Oct 08 18:39:02 crc kubenswrapper[4750]: I1008 18:39:02.743138 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cdb0877-bb91-4ea7-b1eb-79dc06b53592" path="/var/lib/kubelet/pods/9cdb0877-bb91-4ea7-b1eb-79dc06b53592/volumes" Oct 08 18:39:02 crc kubenswrapper[4750]: I1008 18:39:02.743836 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d60974c8-e497-4ef5-8293-81ae0f64217d" path="/var/lib/kubelet/pods/d60974c8-e497-4ef5-8293-81ae0f64217d/volumes" Oct 08 18:39:12 crc kubenswrapper[4750]: I1008 18:39:12.734298 4750 scope.go:117] "RemoveContainer" 
containerID="aaa1f85033e435ca45d798ce82eaed87393be7930e3bf3fb0fa59144d843aac6" Oct 08 18:39:12 crc kubenswrapper[4750]: E1008 18:39:12.736387 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 18:39:23 crc kubenswrapper[4750]: I1008 18:39:23.734016 4750 scope.go:117] "RemoveContainer" containerID="aaa1f85033e435ca45d798ce82eaed87393be7930e3bf3fb0fa59144d843aac6" Oct 08 18:39:23 crc kubenswrapper[4750]: E1008 18:39:23.735501 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 18:39:37 crc kubenswrapper[4750]: I1008 18:39:37.735124 4750 scope.go:117] "RemoveContainer" containerID="aaa1f85033e435ca45d798ce82eaed87393be7930e3bf3fb0fa59144d843aac6" Oct 08 18:39:37 crc kubenswrapper[4750]: E1008 18:39:37.736215 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 18:39:50 crc kubenswrapper[4750]: I1008 18:39:50.734087 4750 scope.go:117] 
"RemoveContainer" containerID="aaa1f85033e435ca45d798ce82eaed87393be7930e3bf3fb0fa59144d843aac6" Oct 08 18:39:50 crc kubenswrapper[4750]: E1008 18:39:50.734709 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 18:40:01 crc kubenswrapper[4750]: I1008 18:40:01.452363 4750 scope.go:117] "RemoveContainer" containerID="818a7ca04219bc24f3b1bb2f35ec42283d410173dbbfcfda72a5ebabfcf269d4" Oct 08 18:40:01 crc kubenswrapper[4750]: I1008 18:40:01.733916 4750 scope.go:117] "RemoveContainer" containerID="aaa1f85033e435ca45d798ce82eaed87393be7930e3bf3fb0fa59144d843aac6" Oct 08 18:40:01 crc kubenswrapper[4750]: E1008 18:40:01.734377 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 18:40:14 crc kubenswrapper[4750]: I1008 18:40:14.738003 4750 scope.go:117] "RemoveContainer" containerID="aaa1f85033e435ca45d798ce82eaed87393be7930e3bf3fb0fa59144d843aac6" Oct 08 18:40:14 crc kubenswrapper[4750]: E1008 18:40:14.739470 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 18:40:29 crc kubenswrapper[4750]: I1008 18:40:29.734468 4750 scope.go:117] "RemoveContainer" containerID="aaa1f85033e435ca45d798ce82eaed87393be7930e3bf3fb0fa59144d843aac6" Oct 08 18:40:29 crc kubenswrapper[4750]: E1008 18:40:29.735397 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 18:40:42 crc kubenswrapper[4750]: I1008 18:40:42.733945 4750 scope.go:117] "RemoveContainer" containerID="aaa1f85033e435ca45d798ce82eaed87393be7930e3bf3fb0fa59144d843aac6" Oct 08 18:40:42 crc kubenswrapper[4750]: E1008 18:40:42.734530 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 18:40:54 crc kubenswrapper[4750]: I1008 18:40:54.743042 4750 scope.go:117] "RemoveContainer" containerID="aaa1f85033e435ca45d798ce82eaed87393be7930e3bf3fb0fa59144d843aac6" Oct 08 18:40:54 crc kubenswrapper[4750]: E1008 18:40:54.744406 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 18:41:09 crc kubenswrapper[4750]: I1008 18:41:09.734637 4750 scope.go:117] "RemoveContainer" containerID="aaa1f85033e435ca45d798ce82eaed87393be7930e3bf3fb0fa59144d843aac6" Oct 08 18:41:09 crc kubenswrapper[4750]: E1008 18:41:09.735703 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 18:41:23 crc kubenswrapper[4750]: I1008 18:41:23.735114 4750 scope.go:117] "RemoveContainer" containerID="aaa1f85033e435ca45d798ce82eaed87393be7930e3bf3fb0fa59144d843aac6" Oct 08 18:41:23 crc kubenswrapper[4750]: E1008 18:41:23.735789 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 18:41:34 crc kubenswrapper[4750]: I1008 18:41:34.741440 4750 scope.go:117] "RemoveContainer" containerID="aaa1f85033e435ca45d798ce82eaed87393be7930e3bf3fb0fa59144d843aac6" Oct 08 18:41:34 crc kubenswrapper[4750]: E1008 18:41:34.742503 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 18:41:48 crc kubenswrapper[4750]: I1008 18:41:48.734686 4750 scope.go:117] "RemoveContainer" containerID="aaa1f85033e435ca45d798ce82eaed87393be7930e3bf3fb0fa59144d843aac6" Oct 08 18:41:48 crc kubenswrapper[4750]: E1008 18:41:48.737289 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 18:41:59 crc kubenswrapper[4750]: I1008 18:41:59.734740 4750 scope.go:117] "RemoveContainer" containerID="aaa1f85033e435ca45d798ce82eaed87393be7930e3bf3fb0fa59144d843aac6" Oct 08 18:42:00 crc kubenswrapper[4750]: I1008 18:42:00.677111 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerStarted","Data":"1ab0ef4e5c2c6e8e89a6d6f1b1b2a4bf0c5bfd39f05fb8de38a6473bf97d0d5b"} Oct 08 18:43:59 crc kubenswrapper[4750]: I1008 18:43:59.707328 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 18:43:59 crc kubenswrapper[4750]: I1008 18:43:59.707971 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" 
podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 18:44:24 crc kubenswrapper[4750]: I1008 18:44:24.314239 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-h7wlw"] Oct 08 18:44:24 crc kubenswrapper[4750]: E1008 18:44:24.315185 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cdb0877-bb91-4ea7-b1eb-79dc06b53592" containerName="extract-content" Oct 08 18:44:24 crc kubenswrapper[4750]: I1008 18:44:24.315202 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cdb0877-bb91-4ea7-b1eb-79dc06b53592" containerName="extract-content" Oct 08 18:44:24 crc kubenswrapper[4750]: E1008 18:44:24.315219 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d60974c8-e497-4ef5-8293-81ae0f64217d" containerName="extract-utilities" Oct 08 18:44:24 crc kubenswrapper[4750]: I1008 18:44:24.315227 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="d60974c8-e497-4ef5-8293-81ae0f64217d" containerName="extract-utilities" Oct 08 18:44:24 crc kubenswrapper[4750]: E1008 18:44:24.315240 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d60974c8-e497-4ef5-8293-81ae0f64217d" containerName="registry-server" Oct 08 18:44:24 crc kubenswrapper[4750]: I1008 18:44:24.315248 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="d60974c8-e497-4ef5-8293-81ae0f64217d" containerName="registry-server" Oct 08 18:44:24 crc kubenswrapper[4750]: E1008 18:44:24.315264 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d60974c8-e497-4ef5-8293-81ae0f64217d" containerName="extract-content" Oct 08 18:44:24 crc kubenswrapper[4750]: I1008 18:44:24.315271 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="d60974c8-e497-4ef5-8293-81ae0f64217d" containerName="extract-content" Oct 08 18:44:24 crc 
kubenswrapper[4750]: E1008 18:44:24.315284 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cdb0877-bb91-4ea7-b1eb-79dc06b53592" containerName="extract-utilities" Oct 08 18:44:24 crc kubenswrapper[4750]: I1008 18:44:24.315293 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cdb0877-bb91-4ea7-b1eb-79dc06b53592" containerName="extract-utilities" Oct 08 18:44:24 crc kubenswrapper[4750]: E1008 18:44:24.315313 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cdb0877-bb91-4ea7-b1eb-79dc06b53592" containerName="registry-server" Oct 08 18:44:24 crc kubenswrapper[4750]: I1008 18:44:24.315319 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cdb0877-bb91-4ea7-b1eb-79dc06b53592" containerName="registry-server" Oct 08 18:44:24 crc kubenswrapper[4750]: I1008 18:44:24.315476 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="d60974c8-e497-4ef5-8293-81ae0f64217d" containerName="registry-server" Oct 08 18:44:24 crc kubenswrapper[4750]: I1008 18:44:24.315503 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cdb0877-bb91-4ea7-b1eb-79dc06b53592" containerName="registry-server" Oct 08 18:44:24 crc kubenswrapper[4750]: I1008 18:44:24.316744 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h7wlw" Oct 08 18:44:24 crc kubenswrapper[4750]: I1008 18:44:24.326446 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h7wlw"] Oct 08 18:44:24 crc kubenswrapper[4750]: I1008 18:44:24.449831 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c39ab49d-02ea-4750-94a4-581ee934d81b-utilities\") pod \"community-operators-h7wlw\" (UID: \"c39ab49d-02ea-4750-94a4-581ee934d81b\") " pod="openshift-marketplace/community-operators-h7wlw" Oct 08 18:44:24 crc kubenswrapper[4750]: I1008 18:44:24.449899 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk6q8\" (UniqueName: \"kubernetes.io/projected/c39ab49d-02ea-4750-94a4-581ee934d81b-kube-api-access-jk6q8\") pod \"community-operators-h7wlw\" (UID: \"c39ab49d-02ea-4750-94a4-581ee934d81b\") " pod="openshift-marketplace/community-operators-h7wlw" Oct 08 18:44:24 crc kubenswrapper[4750]: I1008 18:44:24.449933 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c39ab49d-02ea-4750-94a4-581ee934d81b-catalog-content\") pod \"community-operators-h7wlw\" (UID: \"c39ab49d-02ea-4750-94a4-581ee934d81b\") " pod="openshift-marketplace/community-operators-h7wlw" Oct 08 18:44:24 crc kubenswrapper[4750]: I1008 18:44:24.551111 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c39ab49d-02ea-4750-94a4-581ee934d81b-catalog-content\") pod \"community-operators-h7wlw\" (UID: \"c39ab49d-02ea-4750-94a4-581ee934d81b\") " pod="openshift-marketplace/community-operators-h7wlw" Oct 08 18:44:24 crc kubenswrapper[4750]: I1008 18:44:24.551211 4750 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c39ab49d-02ea-4750-94a4-581ee934d81b-utilities\") pod \"community-operators-h7wlw\" (UID: \"c39ab49d-02ea-4750-94a4-581ee934d81b\") " pod="openshift-marketplace/community-operators-h7wlw" Oct 08 18:44:24 crc kubenswrapper[4750]: I1008 18:44:24.551260 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk6q8\" (UniqueName: \"kubernetes.io/projected/c39ab49d-02ea-4750-94a4-581ee934d81b-kube-api-access-jk6q8\") pod \"community-operators-h7wlw\" (UID: \"c39ab49d-02ea-4750-94a4-581ee934d81b\") " pod="openshift-marketplace/community-operators-h7wlw" Oct 08 18:44:24 crc kubenswrapper[4750]: I1008 18:44:24.551646 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c39ab49d-02ea-4750-94a4-581ee934d81b-catalog-content\") pod \"community-operators-h7wlw\" (UID: \"c39ab49d-02ea-4750-94a4-581ee934d81b\") " pod="openshift-marketplace/community-operators-h7wlw" Oct 08 18:44:24 crc kubenswrapper[4750]: I1008 18:44:24.551735 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c39ab49d-02ea-4750-94a4-581ee934d81b-utilities\") pod \"community-operators-h7wlw\" (UID: \"c39ab49d-02ea-4750-94a4-581ee934d81b\") " pod="openshift-marketplace/community-operators-h7wlw" Oct 08 18:44:24 crc kubenswrapper[4750]: I1008 18:44:24.570495 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk6q8\" (UniqueName: \"kubernetes.io/projected/c39ab49d-02ea-4750-94a4-581ee934d81b-kube-api-access-jk6q8\") pod \"community-operators-h7wlw\" (UID: \"c39ab49d-02ea-4750-94a4-581ee934d81b\") " pod="openshift-marketplace/community-operators-h7wlw" Oct 08 18:44:24 crc kubenswrapper[4750]: I1008 18:44:24.660427 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h7wlw" Oct 08 18:44:25 crc kubenswrapper[4750]: I1008 18:44:25.161790 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h7wlw"] Oct 08 18:44:25 crc kubenswrapper[4750]: I1008 18:44:25.800235 4750 generic.go:334] "Generic (PLEG): container finished" podID="c39ab49d-02ea-4750-94a4-581ee934d81b" containerID="bfdba72dec51264b62bbd59a948000edc75962cfe6263134e1e475cb1aa786c5" exitCode=0 Oct 08 18:44:25 crc kubenswrapper[4750]: I1008 18:44:25.800305 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h7wlw" event={"ID":"c39ab49d-02ea-4750-94a4-581ee934d81b","Type":"ContainerDied","Data":"bfdba72dec51264b62bbd59a948000edc75962cfe6263134e1e475cb1aa786c5"} Oct 08 18:44:25 crc kubenswrapper[4750]: I1008 18:44:25.800641 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h7wlw" event={"ID":"c39ab49d-02ea-4750-94a4-581ee934d81b","Type":"ContainerStarted","Data":"9bc86c87596ac69a633608ec8a82997b8f118cb06d3160605cf7b33e39adcbf0"} Oct 08 18:44:25 crc kubenswrapper[4750]: I1008 18:44:25.802041 4750 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 18:44:27 crc kubenswrapper[4750]: I1008 18:44:27.813817 4750 generic.go:334] "Generic (PLEG): container finished" podID="c39ab49d-02ea-4750-94a4-581ee934d81b" containerID="887a682812f5bffa4ee86dec7193239c029a1a099f4c8fbaea02daa54c7badf5" exitCode=0 Oct 08 18:44:27 crc kubenswrapper[4750]: I1008 18:44:27.814733 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h7wlw" event={"ID":"c39ab49d-02ea-4750-94a4-581ee934d81b","Type":"ContainerDied","Data":"887a682812f5bffa4ee86dec7193239c029a1a099f4c8fbaea02daa54c7badf5"} Oct 08 18:44:28 crc kubenswrapper[4750]: I1008 18:44:28.826746 4750 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-h7wlw" event={"ID":"c39ab49d-02ea-4750-94a4-581ee934d81b","Type":"ContainerStarted","Data":"37df49d55b5c195d10c2e9dc2afee4653e8e48681c79ff876bee09ec2257bdca"} Oct 08 18:44:28 crc kubenswrapper[4750]: I1008 18:44:28.857778 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-h7wlw" podStartSLOduration=2.382189687 podStartE2EDuration="4.857755582s" podCreationTimestamp="2025-10-08 18:44:24 +0000 UTC" firstStartedPulling="2025-10-08 18:44:25.80177061 +0000 UTC m=+2021.714741633" lastFinishedPulling="2025-10-08 18:44:28.277336525 +0000 UTC m=+2024.190307528" observedRunningTime="2025-10-08 18:44:28.849802397 +0000 UTC m=+2024.762773410" watchObservedRunningTime="2025-10-08 18:44:28.857755582 +0000 UTC m=+2024.770726595" Oct 08 18:44:29 crc kubenswrapper[4750]: I1008 18:44:29.707153 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 18:44:29 crc kubenswrapper[4750]: I1008 18:44:29.707245 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 18:44:34 crc kubenswrapper[4750]: I1008 18:44:34.660898 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-h7wlw" Oct 08 18:44:34 crc kubenswrapper[4750]: I1008 18:44:34.661247 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-h7wlw" Oct 08 18:44:34 
crc kubenswrapper[4750]: I1008 18:44:34.710176 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-h7wlw" Oct 08 18:44:34 crc kubenswrapper[4750]: I1008 18:44:34.909258 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-h7wlw" Oct 08 18:44:34 crc kubenswrapper[4750]: I1008 18:44:34.949664 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h7wlw"] Oct 08 18:44:36 crc kubenswrapper[4750]: I1008 18:44:36.883800 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-h7wlw" podUID="c39ab49d-02ea-4750-94a4-581ee934d81b" containerName="registry-server" containerID="cri-o://37df49d55b5c195d10c2e9dc2afee4653e8e48681c79ff876bee09ec2257bdca" gracePeriod=2 Oct 08 18:44:37 crc kubenswrapper[4750]: I1008 18:44:37.280079 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h7wlw" Oct 08 18:44:37 crc kubenswrapper[4750]: I1008 18:44:37.332119 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c39ab49d-02ea-4750-94a4-581ee934d81b-catalog-content\") pod \"c39ab49d-02ea-4750-94a4-581ee934d81b\" (UID: \"c39ab49d-02ea-4750-94a4-581ee934d81b\") " Oct 08 18:44:37 crc kubenswrapper[4750]: I1008 18:44:37.332449 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jk6q8\" (UniqueName: \"kubernetes.io/projected/c39ab49d-02ea-4750-94a4-581ee934d81b-kube-api-access-jk6q8\") pod \"c39ab49d-02ea-4750-94a4-581ee934d81b\" (UID: \"c39ab49d-02ea-4750-94a4-581ee934d81b\") " Oct 08 18:44:37 crc kubenswrapper[4750]: I1008 18:44:37.332519 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c39ab49d-02ea-4750-94a4-581ee934d81b-utilities\") pod \"c39ab49d-02ea-4750-94a4-581ee934d81b\" (UID: \"c39ab49d-02ea-4750-94a4-581ee934d81b\") " Oct 08 18:44:37 crc kubenswrapper[4750]: I1008 18:44:37.333617 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c39ab49d-02ea-4750-94a4-581ee934d81b-utilities" (OuterVolumeSpecName: "utilities") pod "c39ab49d-02ea-4750-94a4-581ee934d81b" (UID: "c39ab49d-02ea-4750-94a4-581ee934d81b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:44:37 crc kubenswrapper[4750]: I1008 18:44:37.339911 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c39ab49d-02ea-4750-94a4-581ee934d81b-kube-api-access-jk6q8" (OuterVolumeSpecName: "kube-api-access-jk6q8") pod "c39ab49d-02ea-4750-94a4-581ee934d81b" (UID: "c39ab49d-02ea-4750-94a4-581ee934d81b"). InnerVolumeSpecName "kube-api-access-jk6q8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:44:37 crc kubenswrapper[4750]: I1008 18:44:37.388632 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c39ab49d-02ea-4750-94a4-581ee934d81b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c39ab49d-02ea-4750-94a4-581ee934d81b" (UID: "c39ab49d-02ea-4750-94a4-581ee934d81b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:44:37 crc kubenswrapper[4750]: I1008 18:44:37.434187 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c39ab49d-02ea-4750-94a4-581ee934d81b-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 18:44:37 crc kubenswrapper[4750]: I1008 18:44:37.434221 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c39ab49d-02ea-4750-94a4-581ee934d81b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 18:44:37 crc kubenswrapper[4750]: I1008 18:44:37.434231 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jk6q8\" (UniqueName: \"kubernetes.io/projected/c39ab49d-02ea-4750-94a4-581ee934d81b-kube-api-access-jk6q8\") on node \"crc\" DevicePath \"\"" Oct 08 18:44:37 crc kubenswrapper[4750]: I1008 18:44:37.902442 4750 generic.go:334] "Generic (PLEG): container finished" podID="c39ab49d-02ea-4750-94a4-581ee934d81b" containerID="37df49d55b5c195d10c2e9dc2afee4653e8e48681c79ff876bee09ec2257bdca" exitCode=0 Oct 08 18:44:37 crc kubenswrapper[4750]: I1008 18:44:37.902521 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h7wlw" event={"ID":"c39ab49d-02ea-4750-94a4-581ee934d81b","Type":"ContainerDied","Data":"37df49d55b5c195d10c2e9dc2afee4653e8e48681c79ff876bee09ec2257bdca"} Oct 08 18:44:37 crc kubenswrapper[4750]: I1008 18:44:37.902597 4750 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-h7wlw" event={"ID":"c39ab49d-02ea-4750-94a4-581ee934d81b","Type":"ContainerDied","Data":"9bc86c87596ac69a633608ec8a82997b8f118cb06d3160605cf7b33e39adcbf0"} Oct 08 18:44:37 crc kubenswrapper[4750]: I1008 18:44:37.902630 4750 scope.go:117] "RemoveContainer" containerID="37df49d55b5c195d10c2e9dc2afee4653e8e48681c79ff876bee09ec2257bdca" Oct 08 18:44:37 crc kubenswrapper[4750]: I1008 18:44:37.902797 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h7wlw" Oct 08 18:44:37 crc kubenswrapper[4750]: I1008 18:44:37.941150 4750 scope.go:117] "RemoveContainer" containerID="887a682812f5bffa4ee86dec7193239c029a1a099f4c8fbaea02daa54c7badf5" Oct 08 18:44:37 crc kubenswrapper[4750]: I1008 18:44:37.947170 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h7wlw"] Oct 08 18:44:37 crc kubenswrapper[4750]: I1008 18:44:37.952983 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-h7wlw"] Oct 08 18:44:37 crc kubenswrapper[4750]: I1008 18:44:37.966135 4750 scope.go:117] "RemoveContainer" containerID="bfdba72dec51264b62bbd59a948000edc75962cfe6263134e1e475cb1aa786c5" Oct 08 18:44:37 crc kubenswrapper[4750]: I1008 18:44:37.988340 4750 scope.go:117] "RemoveContainer" containerID="37df49d55b5c195d10c2e9dc2afee4653e8e48681c79ff876bee09ec2257bdca" Oct 08 18:44:37 crc kubenswrapper[4750]: E1008 18:44:37.988727 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37df49d55b5c195d10c2e9dc2afee4653e8e48681c79ff876bee09ec2257bdca\": container with ID starting with 37df49d55b5c195d10c2e9dc2afee4653e8e48681c79ff876bee09ec2257bdca not found: ID does not exist" containerID="37df49d55b5c195d10c2e9dc2afee4653e8e48681c79ff876bee09ec2257bdca" Oct 08 18:44:37 crc kubenswrapper[4750]: I1008 
18:44:37.988761 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37df49d55b5c195d10c2e9dc2afee4653e8e48681c79ff876bee09ec2257bdca"} err="failed to get container status \"37df49d55b5c195d10c2e9dc2afee4653e8e48681c79ff876bee09ec2257bdca\": rpc error: code = NotFound desc = could not find container \"37df49d55b5c195d10c2e9dc2afee4653e8e48681c79ff876bee09ec2257bdca\": container with ID starting with 37df49d55b5c195d10c2e9dc2afee4653e8e48681c79ff876bee09ec2257bdca not found: ID does not exist" Oct 08 18:44:37 crc kubenswrapper[4750]: I1008 18:44:37.988782 4750 scope.go:117] "RemoveContainer" containerID="887a682812f5bffa4ee86dec7193239c029a1a099f4c8fbaea02daa54c7badf5" Oct 08 18:44:37 crc kubenswrapper[4750]: E1008 18:44:37.989017 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"887a682812f5bffa4ee86dec7193239c029a1a099f4c8fbaea02daa54c7badf5\": container with ID starting with 887a682812f5bffa4ee86dec7193239c029a1a099f4c8fbaea02daa54c7badf5 not found: ID does not exist" containerID="887a682812f5bffa4ee86dec7193239c029a1a099f4c8fbaea02daa54c7badf5" Oct 08 18:44:37 crc kubenswrapper[4750]: I1008 18:44:37.989068 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"887a682812f5bffa4ee86dec7193239c029a1a099f4c8fbaea02daa54c7badf5"} err="failed to get container status \"887a682812f5bffa4ee86dec7193239c029a1a099f4c8fbaea02daa54c7badf5\": rpc error: code = NotFound desc = could not find container \"887a682812f5bffa4ee86dec7193239c029a1a099f4c8fbaea02daa54c7badf5\": container with ID starting with 887a682812f5bffa4ee86dec7193239c029a1a099f4c8fbaea02daa54c7badf5 not found: ID does not exist" Oct 08 18:44:37 crc kubenswrapper[4750]: I1008 18:44:37.989096 4750 scope.go:117] "RemoveContainer" containerID="bfdba72dec51264b62bbd59a948000edc75962cfe6263134e1e475cb1aa786c5" Oct 08 18:44:37 crc 
kubenswrapper[4750]: E1008 18:44:37.989356 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfdba72dec51264b62bbd59a948000edc75962cfe6263134e1e475cb1aa786c5\": container with ID starting with bfdba72dec51264b62bbd59a948000edc75962cfe6263134e1e475cb1aa786c5 not found: ID does not exist" containerID="bfdba72dec51264b62bbd59a948000edc75962cfe6263134e1e475cb1aa786c5" Oct 08 18:44:37 crc kubenswrapper[4750]: I1008 18:44:37.989376 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfdba72dec51264b62bbd59a948000edc75962cfe6263134e1e475cb1aa786c5"} err="failed to get container status \"bfdba72dec51264b62bbd59a948000edc75962cfe6263134e1e475cb1aa786c5\": rpc error: code = NotFound desc = could not find container \"bfdba72dec51264b62bbd59a948000edc75962cfe6263134e1e475cb1aa786c5\": container with ID starting with bfdba72dec51264b62bbd59a948000edc75962cfe6263134e1e475cb1aa786c5 not found: ID does not exist" Oct 08 18:44:38 crc kubenswrapper[4750]: I1008 18:44:38.746767 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c39ab49d-02ea-4750-94a4-581ee934d81b" path="/var/lib/kubelet/pods/c39ab49d-02ea-4750-94a4-581ee934d81b/volumes" Oct 08 18:44:59 crc kubenswrapper[4750]: I1008 18:44:59.706913 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 18:44:59 crc kubenswrapper[4750]: I1008 18:44:59.707578 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Oct 08 18:44:59 crc kubenswrapper[4750]: I1008 18:44:59.707632 4750 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-grddb" Oct 08 18:44:59 crc kubenswrapper[4750]: I1008 18:44:59.708343 4750 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1ab0ef4e5c2c6e8e89a6d6f1b1b2a4bf0c5bfd39f05fb8de38a6473bf97d0d5b"} pod="openshift-machine-config-operator/machine-config-daemon-grddb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 18:44:59 crc kubenswrapper[4750]: I1008 18:44:59.708507 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" containerID="cri-o://1ab0ef4e5c2c6e8e89a6d6f1b1b2a4bf0c5bfd39f05fb8de38a6473bf97d0d5b" gracePeriod=600 Oct 08 18:45:00 crc kubenswrapper[4750]: I1008 18:45:00.094089 4750 generic.go:334] "Generic (PLEG): container finished" podID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerID="1ab0ef4e5c2c6e8e89a6d6f1b1b2a4bf0c5bfd39f05fb8de38a6473bf97d0d5b" exitCode=0 Oct 08 18:45:00 crc kubenswrapper[4750]: I1008 18:45:00.094133 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerDied","Data":"1ab0ef4e5c2c6e8e89a6d6f1b1b2a4bf0c5bfd39f05fb8de38a6473bf97d0d5b"} Oct 08 18:45:00 crc kubenswrapper[4750]: I1008 18:45:00.094192 4750 scope.go:117] "RemoveContainer" containerID="aaa1f85033e435ca45d798ce82eaed87393be7930e3bf3fb0fa59144d843aac6" Oct 08 18:45:00 crc kubenswrapper[4750]: I1008 18:45:00.147143 4750 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29332485-9g9tq"] Oct 08 18:45:00 crc kubenswrapper[4750]: E1008 18:45:00.147505 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c39ab49d-02ea-4750-94a4-581ee934d81b" containerName="extract-content" Oct 08 18:45:00 crc kubenswrapper[4750]: I1008 18:45:00.147523 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="c39ab49d-02ea-4750-94a4-581ee934d81b" containerName="extract-content" Oct 08 18:45:00 crc kubenswrapper[4750]: E1008 18:45:00.147542 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c39ab49d-02ea-4750-94a4-581ee934d81b" containerName="extract-utilities" Oct 08 18:45:00 crc kubenswrapper[4750]: I1008 18:45:00.147567 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="c39ab49d-02ea-4750-94a4-581ee934d81b" containerName="extract-utilities" Oct 08 18:45:00 crc kubenswrapper[4750]: E1008 18:45:00.147585 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c39ab49d-02ea-4750-94a4-581ee934d81b" containerName="registry-server" Oct 08 18:45:00 crc kubenswrapper[4750]: I1008 18:45:00.147593 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="c39ab49d-02ea-4750-94a4-581ee934d81b" containerName="registry-server" Oct 08 18:45:00 crc kubenswrapper[4750]: I1008 18:45:00.147806 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="c39ab49d-02ea-4750-94a4-581ee934d81b" containerName="registry-server" Oct 08 18:45:00 crc kubenswrapper[4750]: I1008 18:45:00.149099 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332485-9g9tq" Oct 08 18:45:00 crc kubenswrapper[4750]: I1008 18:45:00.151298 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 08 18:45:00 crc kubenswrapper[4750]: I1008 18:45:00.151304 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 08 18:45:00 crc kubenswrapper[4750]: I1008 18:45:00.158062 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332485-9g9tq"] Oct 08 18:45:00 crc kubenswrapper[4750]: I1008 18:45:00.305644 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6df48bfd-4f65-4ed8-92ef-0a3c4c4f6be8-secret-volume\") pod \"collect-profiles-29332485-9g9tq\" (UID: \"6df48bfd-4f65-4ed8-92ef-0a3c4c4f6be8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332485-9g9tq" Oct 08 18:45:00 crc kubenswrapper[4750]: I1008 18:45:00.305720 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52sfq\" (UniqueName: \"kubernetes.io/projected/6df48bfd-4f65-4ed8-92ef-0a3c4c4f6be8-kube-api-access-52sfq\") pod \"collect-profiles-29332485-9g9tq\" (UID: \"6df48bfd-4f65-4ed8-92ef-0a3c4c4f6be8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332485-9g9tq" Oct 08 18:45:00 crc kubenswrapper[4750]: I1008 18:45:00.305751 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6df48bfd-4f65-4ed8-92ef-0a3c4c4f6be8-config-volume\") pod \"collect-profiles-29332485-9g9tq\" (UID: \"6df48bfd-4f65-4ed8-92ef-0a3c4c4f6be8\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29332485-9g9tq" Oct 08 18:45:00 crc kubenswrapper[4750]: I1008 18:45:00.406658 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6df48bfd-4f65-4ed8-92ef-0a3c4c4f6be8-config-volume\") pod \"collect-profiles-29332485-9g9tq\" (UID: \"6df48bfd-4f65-4ed8-92ef-0a3c4c4f6be8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332485-9g9tq" Oct 08 18:45:00 crc kubenswrapper[4750]: I1008 18:45:00.406746 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6df48bfd-4f65-4ed8-92ef-0a3c4c4f6be8-secret-volume\") pod \"collect-profiles-29332485-9g9tq\" (UID: \"6df48bfd-4f65-4ed8-92ef-0a3c4c4f6be8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332485-9g9tq" Oct 08 18:45:00 crc kubenswrapper[4750]: I1008 18:45:00.406799 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52sfq\" (UniqueName: \"kubernetes.io/projected/6df48bfd-4f65-4ed8-92ef-0a3c4c4f6be8-kube-api-access-52sfq\") pod \"collect-profiles-29332485-9g9tq\" (UID: \"6df48bfd-4f65-4ed8-92ef-0a3c4c4f6be8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332485-9g9tq" Oct 08 18:45:00 crc kubenswrapper[4750]: I1008 18:45:00.407511 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6df48bfd-4f65-4ed8-92ef-0a3c4c4f6be8-config-volume\") pod \"collect-profiles-29332485-9g9tq\" (UID: \"6df48bfd-4f65-4ed8-92ef-0a3c4c4f6be8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332485-9g9tq" Oct 08 18:45:00 crc kubenswrapper[4750]: I1008 18:45:00.413268 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/6df48bfd-4f65-4ed8-92ef-0a3c4c4f6be8-secret-volume\") pod \"collect-profiles-29332485-9g9tq\" (UID: \"6df48bfd-4f65-4ed8-92ef-0a3c4c4f6be8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332485-9g9tq" Oct 08 18:45:00 crc kubenswrapper[4750]: I1008 18:45:00.422959 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52sfq\" (UniqueName: \"kubernetes.io/projected/6df48bfd-4f65-4ed8-92ef-0a3c4c4f6be8-kube-api-access-52sfq\") pod \"collect-profiles-29332485-9g9tq\" (UID: \"6df48bfd-4f65-4ed8-92ef-0a3c4c4f6be8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332485-9g9tq" Oct 08 18:45:00 crc kubenswrapper[4750]: I1008 18:45:00.518815 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332485-9g9tq" Oct 08 18:45:00 crc kubenswrapper[4750]: I1008 18:45:00.940302 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332485-9g9tq"] Oct 08 18:45:01 crc kubenswrapper[4750]: I1008 18:45:01.106721 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerStarted","Data":"774c3fdb5318675acb98c76123aa59a32a454398260e24c10294ce6aad824331"} Oct 08 18:45:01 crc kubenswrapper[4750]: I1008 18:45:01.110012 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332485-9g9tq" event={"ID":"6df48bfd-4f65-4ed8-92ef-0a3c4c4f6be8","Type":"ContainerStarted","Data":"e8fcf50615f9b0e23cd6c0966a87cfe97c6a31cb6b767963ee77633c6fe9ffe7"} Oct 08 18:45:02 crc kubenswrapper[4750]: I1008 18:45:02.122073 4750 generic.go:334] "Generic (PLEG): container finished" podID="6df48bfd-4f65-4ed8-92ef-0a3c4c4f6be8" 
containerID="84069d1173f634d8410ba0bb1baabff7ff57b42e929616e3aed04a6ebaf6aaf4" exitCode=0 Oct 08 18:45:02 crc kubenswrapper[4750]: I1008 18:45:02.122273 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332485-9g9tq" event={"ID":"6df48bfd-4f65-4ed8-92ef-0a3c4c4f6be8","Type":"ContainerDied","Data":"84069d1173f634d8410ba0bb1baabff7ff57b42e929616e3aed04a6ebaf6aaf4"} Oct 08 18:45:03 crc kubenswrapper[4750]: I1008 18:45:03.448537 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332485-9g9tq" Oct 08 18:45:03 crc kubenswrapper[4750]: I1008 18:45:03.550591 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52sfq\" (UniqueName: \"kubernetes.io/projected/6df48bfd-4f65-4ed8-92ef-0a3c4c4f6be8-kube-api-access-52sfq\") pod \"6df48bfd-4f65-4ed8-92ef-0a3c4c4f6be8\" (UID: \"6df48bfd-4f65-4ed8-92ef-0a3c4c4f6be8\") " Oct 08 18:45:03 crc kubenswrapper[4750]: I1008 18:45:03.550738 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6df48bfd-4f65-4ed8-92ef-0a3c4c4f6be8-config-volume\") pod \"6df48bfd-4f65-4ed8-92ef-0a3c4c4f6be8\" (UID: \"6df48bfd-4f65-4ed8-92ef-0a3c4c4f6be8\") " Oct 08 18:45:03 crc kubenswrapper[4750]: I1008 18:45:03.550777 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6df48bfd-4f65-4ed8-92ef-0a3c4c4f6be8-secret-volume\") pod \"6df48bfd-4f65-4ed8-92ef-0a3c4c4f6be8\" (UID: \"6df48bfd-4f65-4ed8-92ef-0a3c4c4f6be8\") " Oct 08 18:45:03 crc kubenswrapper[4750]: I1008 18:45:03.551467 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6df48bfd-4f65-4ed8-92ef-0a3c4c4f6be8-config-volume" (OuterVolumeSpecName: "config-volume") pod 
"6df48bfd-4f65-4ed8-92ef-0a3c4c4f6be8" (UID: "6df48bfd-4f65-4ed8-92ef-0a3c4c4f6be8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 18:45:03 crc kubenswrapper[4750]: I1008 18:45:03.556654 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6df48bfd-4f65-4ed8-92ef-0a3c4c4f6be8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6df48bfd-4f65-4ed8-92ef-0a3c4c4f6be8" (UID: "6df48bfd-4f65-4ed8-92ef-0a3c4c4f6be8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 18:45:03 crc kubenswrapper[4750]: I1008 18:45:03.556695 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6df48bfd-4f65-4ed8-92ef-0a3c4c4f6be8-kube-api-access-52sfq" (OuterVolumeSpecName: "kube-api-access-52sfq") pod "6df48bfd-4f65-4ed8-92ef-0a3c4c4f6be8" (UID: "6df48bfd-4f65-4ed8-92ef-0a3c4c4f6be8"). InnerVolumeSpecName "kube-api-access-52sfq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:45:03 crc kubenswrapper[4750]: I1008 18:45:03.652064 4750 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6df48bfd-4f65-4ed8-92ef-0a3c4c4f6be8-config-volume\") on node \"crc\" DevicePath \"\"" Oct 08 18:45:03 crc kubenswrapper[4750]: I1008 18:45:03.652405 4750 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6df48bfd-4f65-4ed8-92ef-0a3c4c4f6be8-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 08 18:45:03 crc kubenswrapper[4750]: I1008 18:45:03.652419 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52sfq\" (UniqueName: \"kubernetes.io/projected/6df48bfd-4f65-4ed8-92ef-0a3c4c4f6be8-kube-api-access-52sfq\") on node \"crc\" DevicePath \"\"" Oct 08 18:45:04 crc kubenswrapper[4750]: I1008 18:45:04.136686 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332485-9g9tq" event={"ID":"6df48bfd-4f65-4ed8-92ef-0a3c4c4f6be8","Type":"ContainerDied","Data":"e8fcf50615f9b0e23cd6c0966a87cfe97c6a31cb6b767963ee77633c6fe9ffe7"} Oct 08 18:45:04 crc kubenswrapper[4750]: I1008 18:45:04.136765 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332485-9g9tq" Oct 08 18:45:04 crc kubenswrapper[4750]: I1008 18:45:04.136779 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8fcf50615f9b0e23cd6c0966a87cfe97c6a31cb6b767963ee77633c6fe9ffe7" Oct 08 18:45:04 crc kubenswrapper[4750]: I1008 18:45:04.529215 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332440-n7qpc"] Oct 08 18:45:04 crc kubenswrapper[4750]: I1008 18:45:04.535428 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332440-n7qpc"] Oct 08 18:45:04 crc kubenswrapper[4750]: I1008 18:45:04.744101 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4227b4db-94b2-4e5f-a231-aa0b6bbe685c" path="/var/lib/kubelet/pods/4227b4db-94b2-4e5f-a231-aa0b6bbe685c/volumes" Oct 08 18:45:27 crc kubenswrapper[4750]: I1008 18:45:27.372405 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2nbkd"] Oct 08 18:45:27 crc kubenswrapper[4750]: E1008 18:45:27.373229 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6df48bfd-4f65-4ed8-92ef-0a3c4c4f6be8" containerName="collect-profiles" Oct 08 18:45:27 crc kubenswrapper[4750]: I1008 18:45:27.373240 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="6df48bfd-4f65-4ed8-92ef-0a3c4c4f6be8" containerName="collect-profiles" Oct 08 18:45:27 crc kubenswrapper[4750]: I1008 18:45:27.373369 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="6df48bfd-4f65-4ed8-92ef-0a3c4c4f6be8" containerName="collect-profiles" Oct 08 18:45:27 crc kubenswrapper[4750]: I1008 18:45:27.374309 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2nbkd" Oct 08 18:45:27 crc kubenswrapper[4750]: I1008 18:45:27.398381 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2nbkd"] Oct 08 18:45:27 crc kubenswrapper[4750]: I1008 18:45:27.534792 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d70e9f10-ea85-44b1-9758-4084182949b6-catalog-content\") pod \"certified-operators-2nbkd\" (UID: \"d70e9f10-ea85-44b1-9758-4084182949b6\") " pod="openshift-marketplace/certified-operators-2nbkd" Oct 08 18:45:27 crc kubenswrapper[4750]: I1008 18:45:27.534908 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d70e9f10-ea85-44b1-9758-4084182949b6-utilities\") pod \"certified-operators-2nbkd\" (UID: \"d70e9f10-ea85-44b1-9758-4084182949b6\") " pod="openshift-marketplace/certified-operators-2nbkd" Oct 08 18:45:27 crc kubenswrapper[4750]: I1008 18:45:27.534989 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zwpv\" (UniqueName: \"kubernetes.io/projected/d70e9f10-ea85-44b1-9758-4084182949b6-kube-api-access-4zwpv\") pod \"certified-operators-2nbkd\" (UID: \"d70e9f10-ea85-44b1-9758-4084182949b6\") " pod="openshift-marketplace/certified-operators-2nbkd" Oct 08 18:45:27 crc kubenswrapper[4750]: I1008 18:45:27.636298 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d70e9f10-ea85-44b1-9758-4084182949b6-utilities\") pod \"certified-operators-2nbkd\" (UID: \"d70e9f10-ea85-44b1-9758-4084182949b6\") " pod="openshift-marketplace/certified-operators-2nbkd" Oct 08 18:45:27 crc kubenswrapper[4750]: I1008 18:45:27.636411 4750 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4zwpv\" (UniqueName: \"kubernetes.io/projected/d70e9f10-ea85-44b1-9758-4084182949b6-kube-api-access-4zwpv\") pod \"certified-operators-2nbkd\" (UID: \"d70e9f10-ea85-44b1-9758-4084182949b6\") " pod="openshift-marketplace/certified-operators-2nbkd" Oct 08 18:45:27 crc kubenswrapper[4750]: I1008 18:45:27.636473 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d70e9f10-ea85-44b1-9758-4084182949b6-catalog-content\") pod \"certified-operators-2nbkd\" (UID: \"d70e9f10-ea85-44b1-9758-4084182949b6\") " pod="openshift-marketplace/certified-operators-2nbkd" Oct 08 18:45:27 crc kubenswrapper[4750]: I1008 18:45:27.636886 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d70e9f10-ea85-44b1-9758-4084182949b6-utilities\") pod \"certified-operators-2nbkd\" (UID: \"d70e9f10-ea85-44b1-9758-4084182949b6\") " pod="openshift-marketplace/certified-operators-2nbkd" Oct 08 18:45:27 crc kubenswrapper[4750]: I1008 18:45:27.636925 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d70e9f10-ea85-44b1-9758-4084182949b6-catalog-content\") pod \"certified-operators-2nbkd\" (UID: \"d70e9f10-ea85-44b1-9758-4084182949b6\") " pod="openshift-marketplace/certified-operators-2nbkd" Oct 08 18:45:27 crc kubenswrapper[4750]: I1008 18:45:27.659667 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zwpv\" (UniqueName: \"kubernetes.io/projected/d70e9f10-ea85-44b1-9758-4084182949b6-kube-api-access-4zwpv\") pod \"certified-operators-2nbkd\" (UID: \"d70e9f10-ea85-44b1-9758-4084182949b6\") " pod="openshift-marketplace/certified-operators-2nbkd" Oct 08 18:45:27 crc kubenswrapper[4750]: I1008 18:45:27.709871 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2nbkd" Oct 08 18:45:28 crc kubenswrapper[4750]: I1008 18:45:28.183839 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2nbkd"] Oct 08 18:45:28 crc kubenswrapper[4750]: I1008 18:45:28.329312 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2nbkd" event={"ID":"d70e9f10-ea85-44b1-9758-4084182949b6","Type":"ContainerStarted","Data":"a5cb3210b6f6b40e5ee44a25b6d9c19c5accb63c9bbd0b42394df7208473ed99"} Oct 08 18:45:29 crc kubenswrapper[4750]: I1008 18:45:29.340155 4750 generic.go:334] "Generic (PLEG): container finished" podID="d70e9f10-ea85-44b1-9758-4084182949b6" containerID="4453059c1e48f03dfd676c08696d0eb73d31286108a7d10c712a2e77c9ebf0a8" exitCode=0 Oct 08 18:45:29 crc kubenswrapper[4750]: I1008 18:45:29.340212 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2nbkd" event={"ID":"d70e9f10-ea85-44b1-9758-4084182949b6","Type":"ContainerDied","Data":"4453059c1e48f03dfd676c08696d0eb73d31286108a7d10c712a2e77c9ebf0a8"} Oct 08 18:45:30 crc kubenswrapper[4750]: I1008 18:45:30.349161 4750 generic.go:334] "Generic (PLEG): container finished" podID="d70e9f10-ea85-44b1-9758-4084182949b6" containerID="614bde01b78c9db205bfc2eeb5dc35c9514e9378270206cb569062c4cf646a33" exitCode=0 Oct 08 18:45:30 crc kubenswrapper[4750]: I1008 18:45:30.349351 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2nbkd" event={"ID":"d70e9f10-ea85-44b1-9758-4084182949b6","Type":"ContainerDied","Data":"614bde01b78c9db205bfc2eeb5dc35c9514e9378270206cb569062c4cf646a33"} Oct 08 18:45:31 crc kubenswrapper[4750]: I1008 18:45:31.358230 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2nbkd" 
event={"ID":"d70e9f10-ea85-44b1-9758-4084182949b6","Type":"ContainerStarted","Data":"e4cfb409d44a535cdac9ba3d44a20b564115bc4eb33c7115c30bd8e810a29ffb"} Oct 08 18:45:31 crc kubenswrapper[4750]: I1008 18:45:31.390754 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2nbkd" podStartSLOduration=2.945750724 podStartE2EDuration="4.390728929s" podCreationTimestamp="2025-10-08 18:45:27 +0000 UTC" firstStartedPulling="2025-10-08 18:45:29.342183169 +0000 UTC m=+2085.255154212" lastFinishedPulling="2025-10-08 18:45:30.787161404 +0000 UTC m=+2086.700132417" observedRunningTime="2025-10-08 18:45:31.3817854 +0000 UTC m=+2087.294756423" watchObservedRunningTime="2025-10-08 18:45:31.390728929 +0000 UTC m=+2087.303699982" Oct 08 18:45:37 crc kubenswrapper[4750]: I1008 18:45:37.710877 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2nbkd" Oct 08 18:45:37 crc kubenswrapper[4750]: I1008 18:45:37.711848 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2nbkd" Oct 08 18:45:37 crc kubenswrapper[4750]: I1008 18:45:37.781280 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2nbkd" Oct 08 18:45:38 crc kubenswrapper[4750]: I1008 18:45:38.462293 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2nbkd" Oct 08 18:45:38 crc kubenswrapper[4750]: I1008 18:45:38.514498 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2nbkd"] Oct 08 18:45:40 crc kubenswrapper[4750]: I1008 18:45:40.431980 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2nbkd" podUID="d70e9f10-ea85-44b1-9758-4084182949b6" containerName="registry-server" 
containerID="cri-o://e4cfb409d44a535cdac9ba3d44a20b564115bc4eb33c7115c30bd8e810a29ffb" gracePeriod=2 Oct 08 18:45:40 crc kubenswrapper[4750]: I1008 18:45:40.869672 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2nbkd" Oct 08 18:45:41 crc kubenswrapper[4750]: I1008 18:45:41.032720 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d70e9f10-ea85-44b1-9758-4084182949b6-catalog-content\") pod \"d70e9f10-ea85-44b1-9758-4084182949b6\" (UID: \"d70e9f10-ea85-44b1-9758-4084182949b6\") " Oct 08 18:45:41 crc kubenswrapper[4750]: I1008 18:45:41.032813 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zwpv\" (UniqueName: \"kubernetes.io/projected/d70e9f10-ea85-44b1-9758-4084182949b6-kube-api-access-4zwpv\") pod \"d70e9f10-ea85-44b1-9758-4084182949b6\" (UID: \"d70e9f10-ea85-44b1-9758-4084182949b6\") " Oct 08 18:45:41 crc kubenswrapper[4750]: I1008 18:45:41.032922 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d70e9f10-ea85-44b1-9758-4084182949b6-utilities\") pod \"d70e9f10-ea85-44b1-9758-4084182949b6\" (UID: \"d70e9f10-ea85-44b1-9758-4084182949b6\") " Oct 08 18:45:41 crc kubenswrapper[4750]: I1008 18:45:41.034729 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d70e9f10-ea85-44b1-9758-4084182949b6-utilities" (OuterVolumeSpecName: "utilities") pod "d70e9f10-ea85-44b1-9758-4084182949b6" (UID: "d70e9f10-ea85-44b1-9758-4084182949b6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:45:41 crc kubenswrapper[4750]: I1008 18:45:41.052924 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d70e9f10-ea85-44b1-9758-4084182949b6-kube-api-access-4zwpv" (OuterVolumeSpecName: "kube-api-access-4zwpv") pod "d70e9f10-ea85-44b1-9758-4084182949b6" (UID: "d70e9f10-ea85-44b1-9758-4084182949b6"). InnerVolumeSpecName "kube-api-access-4zwpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:45:41 crc kubenswrapper[4750]: I1008 18:45:41.134779 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d70e9f10-ea85-44b1-9758-4084182949b6-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 18:45:41 crc kubenswrapper[4750]: I1008 18:45:41.134812 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zwpv\" (UniqueName: \"kubernetes.io/projected/d70e9f10-ea85-44b1-9758-4084182949b6-kube-api-access-4zwpv\") on node \"crc\" DevicePath \"\"" Oct 08 18:45:41 crc kubenswrapper[4750]: I1008 18:45:41.442680 4750 generic.go:334] "Generic (PLEG): container finished" podID="d70e9f10-ea85-44b1-9758-4084182949b6" containerID="e4cfb409d44a535cdac9ba3d44a20b564115bc4eb33c7115c30bd8e810a29ffb" exitCode=0 Oct 08 18:45:41 crc kubenswrapper[4750]: I1008 18:45:41.442729 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2nbkd" event={"ID":"d70e9f10-ea85-44b1-9758-4084182949b6","Type":"ContainerDied","Data":"e4cfb409d44a535cdac9ba3d44a20b564115bc4eb33c7115c30bd8e810a29ffb"} Oct 08 18:45:41 crc kubenswrapper[4750]: I1008 18:45:41.442761 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2nbkd" event={"ID":"d70e9f10-ea85-44b1-9758-4084182949b6","Type":"ContainerDied","Data":"a5cb3210b6f6b40e5ee44a25b6d9c19c5accb63c9bbd0b42394df7208473ed99"} Oct 08 18:45:41 crc kubenswrapper[4750]: 
I1008 18:45:41.442783 4750 scope.go:117] "RemoveContainer" containerID="e4cfb409d44a535cdac9ba3d44a20b564115bc4eb33c7115c30bd8e810a29ffb" Oct 08 18:45:41 crc kubenswrapper[4750]: I1008 18:45:41.442923 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2nbkd" Oct 08 18:45:41 crc kubenswrapper[4750]: I1008 18:45:41.463109 4750 scope.go:117] "RemoveContainer" containerID="614bde01b78c9db205bfc2eeb5dc35c9514e9378270206cb569062c4cf646a33" Oct 08 18:45:41 crc kubenswrapper[4750]: I1008 18:45:41.492473 4750 scope.go:117] "RemoveContainer" containerID="4453059c1e48f03dfd676c08696d0eb73d31286108a7d10c712a2e77c9ebf0a8" Oct 08 18:45:41 crc kubenswrapper[4750]: I1008 18:45:41.517144 4750 scope.go:117] "RemoveContainer" containerID="e4cfb409d44a535cdac9ba3d44a20b564115bc4eb33c7115c30bd8e810a29ffb" Oct 08 18:45:41 crc kubenswrapper[4750]: E1008 18:45:41.517752 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4cfb409d44a535cdac9ba3d44a20b564115bc4eb33c7115c30bd8e810a29ffb\": container with ID starting with e4cfb409d44a535cdac9ba3d44a20b564115bc4eb33c7115c30bd8e810a29ffb not found: ID does not exist" containerID="e4cfb409d44a535cdac9ba3d44a20b564115bc4eb33c7115c30bd8e810a29ffb" Oct 08 18:45:41 crc kubenswrapper[4750]: I1008 18:45:41.517804 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4cfb409d44a535cdac9ba3d44a20b564115bc4eb33c7115c30bd8e810a29ffb"} err="failed to get container status \"e4cfb409d44a535cdac9ba3d44a20b564115bc4eb33c7115c30bd8e810a29ffb\": rpc error: code = NotFound desc = could not find container \"e4cfb409d44a535cdac9ba3d44a20b564115bc4eb33c7115c30bd8e810a29ffb\": container with ID starting with e4cfb409d44a535cdac9ba3d44a20b564115bc4eb33c7115c30bd8e810a29ffb not found: ID does not exist" Oct 08 18:45:41 crc kubenswrapper[4750]: I1008 18:45:41.517827 4750 
scope.go:117] "RemoveContainer" containerID="614bde01b78c9db205bfc2eeb5dc35c9514e9378270206cb569062c4cf646a33" Oct 08 18:45:41 crc kubenswrapper[4750]: E1008 18:45:41.518109 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"614bde01b78c9db205bfc2eeb5dc35c9514e9378270206cb569062c4cf646a33\": container with ID starting with 614bde01b78c9db205bfc2eeb5dc35c9514e9378270206cb569062c4cf646a33 not found: ID does not exist" containerID="614bde01b78c9db205bfc2eeb5dc35c9514e9378270206cb569062c4cf646a33" Oct 08 18:45:41 crc kubenswrapper[4750]: I1008 18:45:41.518155 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"614bde01b78c9db205bfc2eeb5dc35c9514e9378270206cb569062c4cf646a33"} err="failed to get container status \"614bde01b78c9db205bfc2eeb5dc35c9514e9378270206cb569062c4cf646a33\": rpc error: code = NotFound desc = could not find container \"614bde01b78c9db205bfc2eeb5dc35c9514e9378270206cb569062c4cf646a33\": container with ID starting with 614bde01b78c9db205bfc2eeb5dc35c9514e9378270206cb569062c4cf646a33 not found: ID does not exist" Oct 08 18:45:41 crc kubenswrapper[4750]: I1008 18:45:41.518169 4750 scope.go:117] "RemoveContainer" containerID="4453059c1e48f03dfd676c08696d0eb73d31286108a7d10c712a2e77c9ebf0a8" Oct 08 18:45:41 crc kubenswrapper[4750]: E1008 18:45:41.518488 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4453059c1e48f03dfd676c08696d0eb73d31286108a7d10c712a2e77c9ebf0a8\": container with ID starting with 4453059c1e48f03dfd676c08696d0eb73d31286108a7d10c712a2e77c9ebf0a8 not found: ID does not exist" containerID="4453059c1e48f03dfd676c08696d0eb73d31286108a7d10c712a2e77c9ebf0a8" Oct 08 18:45:41 crc kubenswrapper[4750]: I1008 18:45:41.518707 4750 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4453059c1e48f03dfd676c08696d0eb73d31286108a7d10c712a2e77c9ebf0a8"} err="failed to get container status \"4453059c1e48f03dfd676c08696d0eb73d31286108a7d10c712a2e77c9ebf0a8\": rpc error: code = NotFound desc = could not find container \"4453059c1e48f03dfd676c08696d0eb73d31286108a7d10c712a2e77c9ebf0a8\": container with ID starting with 4453059c1e48f03dfd676c08696d0eb73d31286108a7d10c712a2e77c9ebf0a8 not found: ID does not exist" Oct 08 18:45:41 crc kubenswrapper[4750]: I1008 18:45:41.917927 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d70e9f10-ea85-44b1-9758-4084182949b6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d70e9f10-ea85-44b1-9758-4084182949b6" (UID: "d70e9f10-ea85-44b1-9758-4084182949b6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:45:41 crc kubenswrapper[4750]: I1008 18:45:41.945032 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d70e9f10-ea85-44b1-9758-4084182949b6-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 18:45:42 crc kubenswrapper[4750]: I1008 18:45:42.077305 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2nbkd"] Oct 08 18:45:42 crc kubenswrapper[4750]: I1008 18:45:42.085047 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2nbkd"] Oct 08 18:45:42 crc kubenswrapper[4750]: I1008 18:45:42.750808 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d70e9f10-ea85-44b1-9758-4084182949b6" path="/var/lib/kubelet/pods/d70e9f10-ea85-44b1-9758-4084182949b6/volumes" Oct 08 18:46:01 crc kubenswrapper[4750]: I1008 18:46:01.581822 4750 scope.go:117] "RemoveContainer" containerID="6b18ab44745452e81dd04f4fa6f42e21f155140ec9d7b446a428b84b4d6af79e" Oct 08 18:47:29 crc kubenswrapper[4750]: I1008 
18:47:29.707601 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 18:47:29 crc kubenswrapper[4750]: I1008 18:47:29.708211 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 18:47:59 crc kubenswrapper[4750]: I1008 18:47:59.708050 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 18:47:59 crc kubenswrapper[4750]: I1008 18:47:59.708572 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 18:48:29 crc kubenswrapper[4750]: I1008 18:48:29.707283 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 18:48:29 crc kubenswrapper[4750]: I1008 18:48:29.707930 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" 
podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 18:48:29 crc kubenswrapper[4750]: I1008 18:48:29.707990 4750 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-grddb" Oct 08 18:48:29 crc kubenswrapper[4750]: I1008 18:48:29.708720 4750 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"774c3fdb5318675acb98c76123aa59a32a454398260e24c10294ce6aad824331"} pod="openshift-machine-config-operator/machine-config-daemon-grddb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 18:48:29 crc kubenswrapper[4750]: I1008 18:48:29.708778 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" containerID="cri-o://774c3fdb5318675acb98c76123aa59a32a454398260e24c10294ce6aad824331" gracePeriod=600 Oct 08 18:48:29 crc kubenswrapper[4750]: E1008 18:48:29.833417 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 18:48:30 crc kubenswrapper[4750]: I1008 18:48:30.801866 4750 generic.go:334] "Generic (PLEG): container finished" podID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerID="774c3fdb5318675acb98c76123aa59a32a454398260e24c10294ce6aad824331" exitCode=0 Oct 08 
18:48:30 crc kubenswrapper[4750]: I1008 18:48:30.801964 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerDied","Data":"774c3fdb5318675acb98c76123aa59a32a454398260e24c10294ce6aad824331"} Oct 08 18:48:30 crc kubenswrapper[4750]: I1008 18:48:30.802222 4750 scope.go:117] "RemoveContainer" containerID="1ab0ef4e5c2c6e8e89a6d6f1b1b2a4bf0c5bfd39f05fb8de38a6473bf97d0d5b" Oct 08 18:48:30 crc kubenswrapper[4750]: I1008 18:48:30.802748 4750 scope.go:117] "RemoveContainer" containerID="774c3fdb5318675acb98c76123aa59a32a454398260e24c10294ce6aad824331" Oct 08 18:48:30 crc kubenswrapper[4750]: E1008 18:48:30.802971 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 18:48:44 crc kubenswrapper[4750]: I1008 18:48:44.746211 4750 scope.go:117] "RemoveContainer" containerID="774c3fdb5318675acb98c76123aa59a32a454398260e24c10294ce6aad824331" Oct 08 18:48:44 crc kubenswrapper[4750]: E1008 18:48:44.747183 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 18:48:59 crc kubenswrapper[4750]: I1008 18:48:59.734580 4750 scope.go:117] "RemoveContainer" 
containerID="774c3fdb5318675acb98c76123aa59a32a454398260e24c10294ce6aad824331" Oct 08 18:48:59 crc kubenswrapper[4750]: E1008 18:48:59.735357 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 18:49:03 crc kubenswrapper[4750]: I1008 18:49:03.598049 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xj852"] Oct 08 18:49:03 crc kubenswrapper[4750]: E1008 18:49:03.598673 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d70e9f10-ea85-44b1-9758-4084182949b6" containerName="extract-content" Oct 08 18:49:03 crc kubenswrapper[4750]: I1008 18:49:03.598690 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="d70e9f10-ea85-44b1-9758-4084182949b6" containerName="extract-content" Oct 08 18:49:03 crc kubenswrapper[4750]: E1008 18:49:03.598711 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d70e9f10-ea85-44b1-9758-4084182949b6" containerName="registry-server" Oct 08 18:49:03 crc kubenswrapper[4750]: I1008 18:49:03.598718 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="d70e9f10-ea85-44b1-9758-4084182949b6" containerName="registry-server" Oct 08 18:49:03 crc kubenswrapper[4750]: E1008 18:49:03.598734 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d70e9f10-ea85-44b1-9758-4084182949b6" containerName="extract-utilities" Oct 08 18:49:03 crc kubenswrapper[4750]: I1008 18:49:03.598742 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="d70e9f10-ea85-44b1-9758-4084182949b6" containerName="extract-utilities" Oct 08 18:49:03 crc kubenswrapper[4750]: I1008 18:49:03.598920 
4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="d70e9f10-ea85-44b1-9758-4084182949b6" containerName="registry-server" Oct 08 18:49:03 crc kubenswrapper[4750]: I1008 18:49:03.600166 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xj852" Oct 08 18:49:03 crc kubenswrapper[4750]: I1008 18:49:03.628114 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xj852"] Oct 08 18:49:03 crc kubenswrapper[4750]: I1008 18:49:03.799494 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbb7h\" (UniqueName: \"kubernetes.io/projected/0885f459-e94d-47d5-ad13-0494430a15ad-kube-api-access-dbb7h\") pod \"redhat-marketplace-xj852\" (UID: \"0885f459-e94d-47d5-ad13-0494430a15ad\") " pod="openshift-marketplace/redhat-marketplace-xj852" Oct 08 18:49:03 crc kubenswrapper[4750]: I1008 18:49:03.799577 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0885f459-e94d-47d5-ad13-0494430a15ad-utilities\") pod \"redhat-marketplace-xj852\" (UID: \"0885f459-e94d-47d5-ad13-0494430a15ad\") " pod="openshift-marketplace/redhat-marketplace-xj852" Oct 08 18:49:03 crc kubenswrapper[4750]: I1008 18:49:03.799670 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0885f459-e94d-47d5-ad13-0494430a15ad-catalog-content\") pod \"redhat-marketplace-xj852\" (UID: \"0885f459-e94d-47d5-ad13-0494430a15ad\") " pod="openshift-marketplace/redhat-marketplace-xj852" Oct 08 18:49:03 crc kubenswrapper[4750]: I1008 18:49:03.900671 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/0885f459-e94d-47d5-ad13-0494430a15ad-catalog-content\") pod \"redhat-marketplace-xj852\" (UID: \"0885f459-e94d-47d5-ad13-0494430a15ad\") " pod="openshift-marketplace/redhat-marketplace-xj852" Oct 08 18:49:03 crc kubenswrapper[4750]: I1008 18:49:03.900753 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbb7h\" (UniqueName: \"kubernetes.io/projected/0885f459-e94d-47d5-ad13-0494430a15ad-kube-api-access-dbb7h\") pod \"redhat-marketplace-xj852\" (UID: \"0885f459-e94d-47d5-ad13-0494430a15ad\") " pod="openshift-marketplace/redhat-marketplace-xj852" Oct 08 18:49:03 crc kubenswrapper[4750]: I1008 18:49:03.900796 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0885f459-e94d-47d5-ad13-0494430a15ad-utilities\") pod \"redhat-marketplace-xj852\" (UID: \"0885f459-e94d-47d5-ad13-0494430a15ad\") " pod="openshift-marketplace/redhat-marketplace-xj852" Oct 08 18:49:03 crc kubenswrapper[4750]: I1008 18:49:03.901185 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0885f459-e94d-47d5-ad13-0494430a15ad-catalog-content\") pod \"redhat-marketplace-xj852\" (UID: \"0885f459-e94d-47d5-ad13-0494430a15ad\") " pod="openshift-marketplace/redhat-marketplace-xj852" Oct 08 18:49:03 crc kubenswrapper[4750]: I1008 18:49:03.901303 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0885f459-e94d-47d5-ad13-0494430a15ad-utilities\") pod \"redhat-marketplace-xj852\" (UID: \"0885f459-e94d-47d5-ad13-0494430a15ad\") " pod="openshift-marketplace/redhat-marketplace-xj852" Oct 08 18:49:03 crc kubenswrapper[4750]: I1008 18:49:03.926565 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbb7h\" (UniqueName: 
\"kubernetes.io/projected/0885f459-e94d-47d5-ad13-0494430a15ad-kube-api-access-dbb7h\") pod \"redhat-marketplace-xj852\" (UID: \"0885f459-e94d-47d5-ad13-0494430a15ad\") " pod="openshift-marketplace/redhat-marketplace-xj852" Oct 08 18:49:03 crc kubenswrapper[4750]: I1008 18:49:03.939727 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xj852" Oct 08 18:49:04 crc kubenswrapper[4750]: I1008 18:49:04.358540 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xj852"] Oct 08 18:49:05 crc kubenswrapper[4750]: I1008 18:49:05.071210 4750 generic.go:334] "Generic (PLEG): container finished" podID="0885f459-e94d-47d5-ad13-0494430a15ad" containerID="c53f596891cf4295c2d357035dd714bf80264d1f66d6b16088ee16fc531f51df" exitCode=0 Oct 08 18:49:05 crc kubenswrapper[4750]: I1008 18:49:05.071325 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xj852" event={"ID":"0885f459-e94d-47d5-ad13-0494430a15ad","Type":"ContainerDied","Data":"c53f596891cf4295c2d357035dd714bf80264d1f66d6b16088ee16fc531f51df"} Oct 08 18:49:05 crc kubenswrapper[4750]: I1008 18:49:05.071537 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xj852" event={"ID":"0885f459-e94d-47d5-ad13-0494430a15ad","Type":"ContainerStarted","Data":"71c85ef1187bda98c514a6bd7507391578769411dbad9dd10cb1e1ca65530b11"} Oct 08 18:49:05 crc kubenswrapper[4750]: I1008 18:49:05.391104 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-z54jv"] Oct 08 18:49:05 crc kubenswrapper[4750]: I1008 18:49:05.392529 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z54jv" Oct 08 18:49:05 crc kubenswrapper[4750]: I1008 18:49:05.396296 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z54jv"] Oct 08 18:49:05 crc kubenswrapper[4750]: I1008 18:49:05.520753 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11617628-2ff6-4464-91f0-d0198019356e-utilities\") pod \"redhat-operators-z54jv\" (UID: \"11617628-2ff6-4464-91f0-d0198019356e\") " pod="openshift-marketplace/redhat-operators-z54jv" Oct 08 18:49:05 crc kubenswrapper[4750]: I1008 18:49:05.521128 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf7tg\" (UniqueName: \"kubernetes.io/projected/11617628-2ff6-4464-91f0-d0198019356e-kube-api-access-sf7tg\") pod \"redhat-operators-z54jv\" (UID: \"11617628-2ff6-4464-91f0-d0198019356e\") " pod="openshift-marketplace/redhat-operators-z54jv" Oct 08 18:49:05 crc kubenswrapper[4750]: I1008 18:49:05.521300 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11617628-2ff6-4464-91f0-d0198019356e-catalog-content\") pod \"redhat-operators-z54jv\" (UID: \"11617628-2ff6-4464-91f0-d0198019356e\") " pod="openshift-marketplace/redhat-operators-z54jv" Oct 08 18:49:05 crc kubenswrapper[4750]: I1008 18:49:05.622374 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sf7tg\" (UniqueName: \"kubernetes.io/projected/11617628-2ff6-4464-91f0-d0198019356e-kube-api-access-sf7tg\") pod \"redhat-operators-z54jv\" (UID: \"11617628-2ff6-4464-91f0-d0198019356e\") " pod="openshift-marketplace/redhat-operators-z54jv" Oct 08 18:49:05 crc kubenswrapper[4750]: I1008 18:49:05.622467 4750 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11617628-2ff6-4464-91f0-d0198019356e-catalog-content\") pod \"redhat-operators-z54jv\" (UID: \"11617628-2ff6-4464-91f0-d0198019356e\") " pod="openshift-marketplace/redhat-operators-z54jv" Oct 08 18:49:05 crc kubenswrapper[4750]: I1008 18:49:05.622577 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11617628-2ff6-4464-91f0-d0198019356e-utilities\") pod \"redhat-operators-z54jv\" (UID: \"11617628-2ff6-4464-91f0-d0198019356e\") " pod="openshift-marketplace/redhat-operators-z54jv" Oct 08 18:49:05 crc kubenswrapper[4750]: I1008 18:49:05.623051 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11617628-2ff6-4464-91f0-d0198019356e-catalog-content\") pod \"redhat-operators-z54jv\" (UID: \"11617628-2ff6-4464-91f0-d0198019356e\") " pod="openshift-marketplace/redhat-operators-z54jv" Oct 08 18:49:05 crc kubenswrapper[4750]: I1008 18:49:05.623082 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11617628-2ff6-4464-91f0-d0198019356e-utilities\") pod \"redhat-operators-z54jv\" (UID: \"11617628-2ff6-4464-91f0-d0198019356e\") " pod="openshift-marketplace/redhat-operators-z54jv" Oct 08 18:49:05 crc kubenswrapper[4750]: I1008 18:49:05.651125 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf7tg\" (UniqueName: \"kubernetes.io/projected/11617628-2ff6-4464-91f0-d0198019356e-kube-api-access-sf7tg\") pod \"redhat-operators-z54jv\" (UID: \"11617628-2ff6-4464-91f0-d0198019356e\") " pod="openshift-marketplace/redhat-operators-z54jv" Oct 08 18:49:05 crc kubenswrapper[4750]: I1008 18:49:05.706417 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z54jv" Oct 08 18:49:06 crc kubenswrapper[4750]: I1008 18:49:06.079890 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xj852" event={"ID":"0885f459-e94d-47d5-ad13-0494430a15ad","Type":"ContainerStarted","Data":"cf222178043b9e9976a0f7b1c7a87e8d18f95ba90add4bfeacc0ae19497f763b"} Oct 08 18:49:06 crc kubenswrapper[4750]: I1008 18:49:06.134985 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z54jv"] Oct 08 18:49:06 crc kubenswrapper[4750]: W1008 18:49:06.135758 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11617628_2ff6_4464_91f0_d0198019356e.slice/crio-6bcff43f60f9fb6b7d0930e75191832585fb2a36e4fdb5dc132bde0a1b9a5861 WatchSource:0}: Error finding container 6bcff43f60f9fb6b7d0930e75191832585fb2a36e4fdb5dc132bde0a1b9a5861: Status 404 returned error can't find the container with id 6bcff43f60f9fb6b7d0930e75191832585fb2a36e4fdb5dc132bde0a1b9a5861 Oct 08 18:49:07 crc kubenswrapper[4750]: I1008 18:49:07.089819 4750 generic.go:334] "Generic (PLEG): container finished" podID="0885f459-e94d-47d5-ad13-0494430a15ad" containerID="cf222178043b9e9976a0f7b1c7a87e8d18f95ba90add4bfeacc0ae19497f763b" exitCode=0 Oct 08 18:49:07 crc kubenswrapper[4750]: I1008 18:49:07.089890 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xj852" event={"ID":"0885f459-e94d-47d5-ad13-0494430a15ad","Type":"ContainerDied","Data":"cf222178043b9e9976a0f7b1c7a87e8d18f95ba90add4bfeacc0ae19497f763b"} Oct 08 18:49:07 crc kubenswrapper[4750]: I1008 18:49:07.091729 4750 generic.go:334] "Generic (PLEG): container finished" podID="11617628-2ff6-4464-91f0-d0198019356e" containerID="cd142ffa5e91e2d63c3a695040fb0beb66d51faf6205e248717e719e7e9779ca" exitCode=0 Oct 08 18:49:07 crc kubenswrapper[4750]: I1008 
18:49:07.091752 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z54jv" event={"ID":"11617628-2ff6-4464-91f0-d0198019356e","Type":"ContainerDied","Data":"cd142ffa5e91e2d63c3a695040fb0beb66d51faf6205e248717e719e7e9779ca"} Oct 08 18:49:07 crc kubenswrapper[4750]: I1008 18:49:07.091766 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z54jv" event={"ID":"11617628-2ff6-4464-91f0-d0198019356e","Type":"ContainerStarted","Data":"6bcff43f60f9fb6b7d0930e75191832585fb2a36e4fdb5dc132bde0a1b9a5861"} Oct 08 18:49:08 crc kubenswrapper[4750]: I1008 18:49:08.099060 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z54jv" event={"ID":"11617628-2ff6-4464-91f0-d0198019356e","Type":"ContainerStarted","Data":"ff140df1724bf21133848d71bb843b9caf5954bcec522beef0e2da0b1320fcb3"} Oct 08 18:49:08 crc kubenswrapper[4750]: I1008 18:49:08.101167 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xj852" event={"ID":"0885f459-e94d-47d5-ad13-0494430a15ad","Type":"ContainerStarted","Data":"0f311a02d7eb66c16046ca2fb960109b33108fb7b7c5c80b7fc7f47e1ecb22f1"} Oct 08 18:49:08 crc kubenswrapper[4750]: I1008 18:49:08.135864 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xj852" podStartSLOduration=2.566979038 podStartE2EDuration="5.13584646s" podCreationTimestamp="2025-10-08 18:49:03 +0000 UTC" firstStartedPulling="2025-10-08 18:49:05.072932854 +0000 UTC m=+2300.985903867" lastFinishedPulling="2025-10-08 18:49:07.641800226 +0000 UTC m=+2303.554771289" observedRunningTime="2025-10-08 18:49:08.135131053 +0000 UTC m=+2304.048102086" watchObservedRunningTime="2025-10-08 18:49:08.13584646 +0000 UTC m=+2304.048817483" Oct 08 18:49:09 crc kubenswrapper[4750]: I1008 18:49:09.111432 4750 generic.go:334] "Generic (PLEG): container finished" 
podID="11617628-2ff6-4464-91f0-d0198019356e" containerID="ff140df1724bf21133848d71bb843b9caf5954bcec522beef0e2da0b1320fcb3" exitCode=0 Oct 08 18:49:09 crc kubenswrapper[4750]: I1008 18:49:09.111484 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z54jv" event={"ID":"11617628-2ff6-4464-91f0-d0198019356e","Type":"ContainerDied","Data":"ff140df1724bf21133848d71bb843b9caf5954bcec522beef0e2da0b1320fcb3"} Oct 08 18:49:10 crc kubenswrapper[4750]: I1008 18:49:10.124065 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z54jv" event={"ID":"11617628-2ff6-4464-91f0-d0198019356e","Type":"ContainerStarted","Data":"9893ee7dcff6f1417a4ab94873b52735e67ac6ad33dd195bf901e7690880c8f3"} Oct 08 18:49:10 crc kubenswrapper[4750]: I1008 18:49:10.147742 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-z54jv" podStartSLOduration=2.709581476 podStartE2EDuration="5.147721934s" podCreationTimestamp="2025-10-08 18:49:05 +0000 UTC" firstStartedPulling="2025-10-08 18:49:07.093933852 +0000 UTC m=+2303.006904895" lastFinishedPulling="2025-10-08 18:49:09.53207434 +0000 UTC m=+2305.445045353" observedRunningTime="2025-10-08 18:49:10.145559632 +0000 UTC m=+2306.058530655" watchObservedRunningTime="2025-10-08 18:49:10.147721934 +0000 UTC m=+2306.060692947" Oct 08 18:49:13 crc kubenswrapper[4750]: I1008 18:49:13.940469 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xj852" Oct 08 18:49:13 crc kubenswrapper[4750]: I1008 18:49:13.940946 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xj852" Oct 08 18:49:13 crc kubenswrapper[4750]: I1008 18:49:13.984907 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xj852" Oct 08 18:49:14 crc 
kubenswrapper[4750]: I1008 18:49:14.199841 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xj852" Oct 08 18:49:14 crc kubenswrapper[4750]: I1008 18:49:14.738739 4750 scope.go:117] "RemoveContainer" containerID="774c3fdb5318675acb98c76123aa59a32a454398260e24c10294ce6aad824331" Oct 08 18:49:14 crc kubenswrapper[4750]: E1008 18:49:14.739380 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 18:49:14 crc kubenswrapper[4750]: I1008 18:49:14.780804 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xj852"] Oct 08 18:49:15 crc kubenswrapper[4750]: I1008 18:49:15.707269 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-z54jv" Oct 08 18:49:15 crc kubenswrapper[4750]: I1008 18:49:15.707339 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-z54jv" Oct 08 18:49:15 crc kubenswrapper[4750]: I1008 18:49:15.759882 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-z54jv" Oct 08 18:49:16 crc kubenswrapper[4750]: I1008 18:49:16.171110 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xj852" podUID="0885f459-e94d-47d5-ad13-0494430a15ad" containerName="registry-server" containerID="cri-o://0f311a02d7eb66c16046ca2fb960109b33108fb7b7c5c80b7fc7f47e1ecb22f1" gracePeriod=2 Oct 08 18:49:16 crc kubenswrapper[4750]: I1008 
18:49:16.234669 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-z54jv" Oct 08 18:49:16 crc kubenswrapper[4750]: I1008 18:49:16.579582 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xj852" Oct 08 18:49:16 crc kubenswrapper[4750]: I1008 18:49:16.774196 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0885f459-e94d-47d5-ad13-0494430a15ad-utilities\") pod \"0885f459-e94d-47d5-ad13-0494430a15ad\" (UID: \"0885f459-e94d-47d5-ad13-0494430a15ad\") " Oct 08 18:49:16 crc kubenswrapper[4750]: I1008 18:49:16.774339 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbb7h\" (UniqueName: \"kubernetes.io/projected/0885f459-e94d-47d5-ad13-0494430a15ad-kube-api-access-dbb7h\") pod \"0885f459-e94d-47d5-ad13-0494430a15ad\" (UID: \"0885f459-e94d-47d5-ad13-0494430a15ad\") " Oct 08 18:49:16 crc kubenswrapper[4750]: I1008 18:49:16.774401 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0885f459-e94d-47d5-ad13-0494430a15ad-catalog-content\") pod \"0885f459-e94d-47d5-ad13-0494430a15ad\" (UID: \"0885f459-e94d-47d5-ad13-0494430a15ad\") " Oct 08 18:49:16 crc kubenswrapper[4750]: I1008 18:49:16.775104 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0885f459-e94d-47d5-ad13-0494430a15ad-utilities" (OuterVolumeSpecName: "utilities") pod "0885f459-e94d-47d5-ad13-0494430a15ad" (UID: "0885f459-e94d-47d5-ad13-0494430a15ad"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:49:16 crc kubenswrapper[4750]: I1008 18:49:16.780723 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0885f459-e94d-47d5-ad13-0494430a15ad-kube-api-access-dbb7h" (OuterVolumeSpecName: "kube-api-access-dbb7h") pod "0885f459-e94d-47d5-ad13-0494430a15ad" (UID: "0885f459-e94d-47d5-ad13-0494430a15ad"). InnerVolumeSpecName "kube-api-access-dbb7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:49:16 crc kubenswrapper[4750]: I1008 18:49:16.787607 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0885f459-e94d-47d5-ad13-0494430a15ad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0885f459-e94d-47d5-ad13-0494430a15ad" (UID: "0885f459-e94d-47d5-ad13-0494430a15ad"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:49:16 crc kubenswrapper[4750]: I1008 18:49:16.875635 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0885f459-e94d-47d5-ad13-0494430a15ad-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 18:49:16 crc kubenswrapper[4750]: I1008 18:49:16.875695 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbb7h\" (UniqueName: \"kubernetes.io/projected/0885f459-e94d-47d5-ad13-0494430a15ad-kube-api-access-dbb7h\") on node \"crc\" DevicePath \"\"" Oct 08 18:49:16 crc kubenswrapper[4750]: I1008 18:49:16.875711 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0885f459-e94d-47d5-ad13-0494430a15ad-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 18:49:17 crc kubenswrapper[4750]: I1008 18:49:17.183134 4750 generic.go:334] "Generic (PLEG): container finished" podID="0885f459-e94d-47d5-ad13-0494430a15ad" 
containerID="0f311a02d7eb66c16046ca2fb960109b33108fb7b7c5c80b7fc7f47e1ecb22f1" exitCode=0 Oct 08 18:49:17 crc kubenswrapper[4750]: I1008 18:49:17.183203 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xj852" Oct 08 18:49:17 crc kubenswrapper[4750]: I1008 18:49:17.183223 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xj852" event={"ID":"0885f459-e94d-47d5-ad13-0494430a15ad","Type":"ContainerDied","Data":"0f311a02d7eb66c16046ca2fb960109b33108fb7b7c5c80b7fc7f47e1ecb22f1"} Oct 08 18:49:17 crc kubenswrapper[4750]: I1008 18:49:17.183293 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xj852" event={"ID":"0885f459-e94d-47d5-ad13-0494430a15ad","Type":"ContainerDied","Data":"71c85ef1187bda98c514a6bd7507391578769411dbad9dd10cb1e1ca65530b11"} Oct 08 18:49:17 crc kubenswrapper[4750]: I1008 18:49:17.183329 4750 scope.go:117] "RemoveContainer" containerID="0f311a02d7eb66c16046ca2fb960109b33108fb7b7c5c80b7fc7f47e1ecb22f1" Oct 08 18:49:17 crc kubenswrapper[4750]: I1008 18:49:17.222027 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xj852"] Oct 08 18:49:17 crc kubenswrapper[4750]: I1008 18:49:17.223202 4750 scope.go:117] "RemoveContainer" containerID="cf222178043b9e9976a0f7b1c7a87e8d18f95ba90add4bfeacc0ae19497f763b" Oct 08 18:49:17 crc kubenswrapper[4750]: I1008 18:49:17.226810 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xj852"] Oct 08 18:49:17 crc kubenswrapper[4750]: I1008 18:49:17.253851 4750 scope.go:117] "RemoveContainer" containerID="c53f596891cf4295c2d357035dd714bf80264d1f66d6b16088ee16fc531f51df" Oct 08 18:49:17 crc kubenswrapper[4750]: I1008 18:49:17.299345 4750 scope.go:117] "RemoveContainer" containerID="0f311a02d7eb66c16046ca2fb960109b33108fb7b7c5c80b7fc7f47e1ecb22f1" Oct 08 
18:49:17 crc kubenswrapper[4750]: E1008 18:49:17.299797 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f311a02d7eb66c16046ca2fb960109b33108fb7b7c5c80b7fc7f47e1ecb22f1\": container with ID starting with 0f311a02d7eb66c16046ca2fb960109b33108fb7b7c5c80b7fc7f47e1ecb22f1 not found: ID does not exist" containerID="0f311a02d7eb66c16046ca2fb960109b33108fb7b7c5c80b7fc7f47e1ecb22f1" Oct 08 18:49:17 crc kubenswrapper[4750]: I1008 18:49:17.299840 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f311a02d7eb66c16046ca2fb960109b33108fb7b7c5c80b7fc7f47e1ecb22f1"} err="failed to get container status \"0f311a02d7eb66c16046ca2fb960109b33108fb7b7c5c80b7fc7f47e1ecb22f1\": rpc error: code = NotFound desc = could not find container \"0f311a02d7eb66c16046ca2fb960109b33108fb7b7c5c80b7fc7f47e1ecb22f1\": container with ID starting with 0f311a02d7eb66c16046ca2fb960109b33108fb7b7c5c80b7fc7f47e1ecb22f1 not found: ID does not exist" Oct 08 18:49:17 crc kubenswrapper[4750]: I1008 18:49:17.299866 4750 scope.go:117] "RemoveContainer" containerID="cf222178043b9e9976a0f7b1c7a87e8d18f95ba90add4bfeacc0ae19497f763b" Oct 08 18:49:17 crc kubenswrapper[4750]: E1008 18:49:17.300198 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf222178043b9e9976a0f7b1c7a87e8d18f95ba90add4bfeacc0ae19497f763b\": container with ID starting with cf222178043b9e9976a0f7b1c7a87e8d18f95ba90add4bfeacc0ae19497f763b not found: ID does not exist" containerID="cf222178043b9e9976a0f7b1c7a87e8d18f95ba90add4bfeacc0ae19497f763b" Oct 08 18:49:17 crc kubenswrapper[4750]: I1008 18:49:17.300259 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf222178043b9e9976a0f7b1c7a87e8d18f95ba90add4bfeacc0ae19497f763b"} err="failed to get container status 
\"cf222178043b9e9976a0f7b1c7a87e8d18f95ba90add4bfeacc0ae19497f763b\": rpc error: code = NotFound desc = could not find container \"cf222178043b9e9976a0f7b1c7a87e8d18f95ba90add4bfeacc0ae19497f763b\": container with ID starting with cf222178043b9e9976a0f7b1c7a87e8d18f95ba90add4bfeacc0ae19497f763b not found: ID does not exist" Oct 08 18:49:17 crc kubenswrapper[4750]: I1008 18:49:17.300299 4750 scope.go:117] "RemoveContainer" containerID="c53f596891cf4295c2d357035dd714bf80264d1f66d6b16088ee16fc531f51df" Oct 08 18:49:17 crc kubenswrapper[4750]: E1008 18:49:17.300919 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c53f596891cf4295c2d357035dd714bf80264d1f66d6b16088ee16fc531f51df\": container with ID starting with c53f596891cf4295c2d357035dd714bf80264d1f66d6b16088ee16fc531f51df not found: ID does not exist" containerID="c53f596891cf4295c2d357035dd714bf80264d1f66d6b16088ee16fc531f51df" Oct 08 18:49:17 crc kubenswrapper[4750]: I1008 18:49:17.300946 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c53f596891cf4295c2d357035dd714bf80264d1f66d6b16088ee16fc531f51df"} err="failed to get container status \"c53f596891cf4295c2d357035dd714bf80264d1f66d6b16088ee16fc531f51df\": rpc error: code = NotFound desc = could not find container \"c53f596891cf4295c2d357035dd714bf80264d1f66d6b16088ee16fc531f51df\": container with ID starting with c53f596891cf4295c2d357035dd714bf80264d1f66d6b16088ee16fc531f51df not found: ID does not exist" Oct 08 18:49:17 crc kubenswrapper[4750]: I1008 18:49:17.584110 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z54jv"] Oct 08 18:49:18 crc kubenswrapper[4750]: I1008 18:49:18.194437 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-z54jv" podUID="11617628-2ff6-4464-91f0-d0198019356e" containerName="registry-server" 
containerID="cri-o://9893ee7dcff6f1417a4ab94873b52735e67ac6ad33dd195bf901e7690880c8f3" gracePeriod=2 Oct 08 18:49:18 crc kubenswrapper[4750]: I1008 18:49:18.555247 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z54jv" Oct 08 18:49:18 crc kubenswrapper[4750]: I1008 18:49:18.700036 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sf7tg\" (UniqueName: \"kubernetes.io/projected/11617628-2ff6-4464-91f0-d0198019356e-kube-api-access-sf7tg\") pod \"11617628-2ff6-4464-91f0-d0198019356e\" (UID: \"11617628-2ff6-4464-91f0-d0198019356e\") " Oct 08 18:49:18 crc kubenswrapper[4750]: I1008 18:49:18.700163 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11617628-2ff6-4464-91f0-d0198019356e-utilities\") pod \"11617628-2ff6-4464-91f0-d0198019356e\" (UID: \"11617628-2ff6-4464-91f0-d0198019356e\") " Oct 08 18:49:18 crc kubenswrapper[4750]: I1008 18:49:18.700217 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11617628-2ff6-4464-91f0-d0198019356e-catalog-content\") pod \"11617628-2ff6-4464-91f0-d0198019356e\" (UID: \"11617628-2ff6-4464-91f0-d0198019356e\") " Oct 08 18:49:18 crc kubenswrapper[4750]: I1008 18:49:18.701140 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11617628-2ff6-4464-91f0-d0198019356e-utilities" (OuterVolumeSpecName: "utilities") pod "11617628-2ff6-4464-91f0-d0198019356e" (UID: "11617628-2ff6-4464-91f0-d0198019356e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:49:18 crc kubenswrapper[4750]: I1008 18:49:18.703757 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11617628-2ff6-4464-91f0-d0198019356e-kube-api-access-sf7tg" (OuterVolumeSpecName: "kube-api-access-sf7tg") pod "11617628-2ff6-4464-91f0-d0198019356e" (UID: "11617628-2ff6-4464-91f0-d0198019356e"). InnerVolumeSpecName "kube-api-access-sf7tg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:49:18 crc kubenswrapper[4750]: I1008 18:49:18.742484 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0885f459-e94d-47d5-ad13-0494430a15ad" path="/var/lib/kubelet/pods/0885f459-e94d-47d5-ad13-0494430a15ad/volumes" Oct 08 18:49:18 crc kubenswrapper[4750]: I1008 18:49:18.802118 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sf7tg\" (UniqueName: \"kubernetes.io/projected/11617628-2ff6-4464-91f0-d0198019356e-kube-api-access-sf7tg\") on node \"crc\" DevicePath \"\"" Oct 08 18:49:18 crc kubenswrapper[4750]: I1008 18:49:18.802163 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11617628-2ff6-4464-91f0-d0198019356e-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 18:49:19 crc kubenswrapper[4750]: I1008 18:49:19.207254 4750 generic.go:334] "Generic (PLEG): container finished" podID="11617628-2ff6-4464-91f0-d0198019356e" containerID="9893ee7dcff6f1417a4ab94873b52735e67ac6ad33dd195bf901e7690880c8f3" exitCode=0 Oct 08 18:49:19 crc kubenswrapper[4750]: I1008 18:49:19.207321 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z54jv" event={"ID":"11617628-2ff6-4464-91f0-d0198019356e","Type":"ContainerDied","Data":"9893ee7dcff6f1417a4ab94873b52735e67ac6ad33dd195bf901e7690880c8f3"} Oct 08 18:49:19 crc kubenswrapper[4750]: I1008 18:49:19.207568 4750 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-z54jv" event={"ID":"11617628-2ff6-4464-91f0-d0198019356e","Type":"ContainerDied","Data":"6bcff43f60f9fb6b7d0930e75191832585fb2a36e4fdb5dc132bde0a1b9a5861"} Oct 08 18:49:19 crc kubenswrapper[4750]: I1008 18:49:19.207390 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z54jv" Oct 08 18:49:19 crc kubenswrapper[4750]: I1008 18:49:19.207589 4750 scope.go:117] "RemoveContainer" containerID="9893ee7dcff6f1417a4ab94873b52735e67ac6ad33dd195bf901e7690880c8f3" Oct 08 18:49:19 crc kubenswrapper[4750]: I1008 18:49:19.234051 4750 scope.go:117] "RemoveContainer" containerID="ff140df1724bf21133848d71bb843b9caf5954bcec522beef0e2da0b1320fcb3" Oct 08 18:49:19 crc kubenswrapper[4750]: I1008 18:49:19.262519 4750 scope.go:117] "RemoveContainer" containerID="cd142ffa5e91e2d63c3a695040fb0beb66d51faf6205e248717e719e7e9779ca" Oct 08 18:49:19 crc kubenswrapper[4750]: I1008 18:49:19.284410 4750 scope.go:117] "RemoveContainer" containerID="9893ee7dcff6f1417a4ab94873b52735e67ac6ad33dd195bf901e7690880c8f3" Oct 08 18:49:19 crc kubenswrapper[4750]: E1008 18:49:19.284970 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9893ee7dcff6f1417a4ab94873b52735e67ac6ad33dd195bf901e7690880c8f3\": container with ID starting with 9893ee7dcff6f1417a4ab94873b52735e67ac6ad33dd195bf901e7690880c8f3 not found: ID does not exist" containerID="9893ee7dcff6f1417a4ab94873b52735e67ac6ad33dd195bf901e7690880c8f3" Oct 08 18:49:19 crc kubenswrapper[4750]: I1008 18:49:19.285017 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9893ee7dcff6f1417a4ab94873b52735e67ac6ad33dd195bf901e7690880c8f3"} err="failed to get container status \"9893ee7dcff6f1417a4ab94873b52735e67ac6ad33dd195bf901e7690880c8f3\": rpc error: code = NotFound desc = could not find container 
\"9893ee7dcff6f1417a4ab94873b52735e67ac6ad33dd195bf901e7690880c8f3\": container with ID starting with 9893ee7dcff6f1417a4ab94873b52735e67ac6ad33dd195bf901e7690880c8f3 not found: ID does not exist" Oct 08 18:49:19 crc kubenswrapper[4750]: I1008 18:49:19.285048 4750 scope.go:117] "RemoveContainer" containerID="ff140df1724bf21133848d71bb843b9caf5954bcec522beef0e2da0b1320fcb3" Oct 08 18:49:19 crc kubenswrapper[4750]: E1008 18:49:19.285409 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff140df1724bf21133848d71bb843b9caf5954bcec522beef0e2da0b1320fcb3\": container with ID starting with ff140df1724bf21133848d71bb843b9caf5954bcec522beef0e2da0b1320fcb3 not found: ID does not exist" containerID="ff140df1724bf21133848d71bb843b9caf5954bcec522beef0e2da0b1320fcb3" Oct 08 18:49:19 crc kubenswrapper[4750]: I1008 18:49:19.285441 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff140df1724bf21133848d71bb843b9caf5954bcec522beef0e2da0b1320fcb3"} err="failed to get container status \"ff140df1724bf21133848d71bb843b9caf5954bcec522beef0e2da0b1320fcb3\": rpc error: code = NotFound desc = could not find container \"ff140df1724bf21133848d71bb843b9caf5954bcec522beef0e2da0b1320fcb3\": container with ID starting with ff140df1724bf21133848d71bb843b9caf5954bcec522beef0e2da0b1320fcb3 not found: ID does not exist" Oct 08 18:49:19 crc kubenswrapper[4750]: I1008 18:49:19.285464 4750 scope.go:117] "RemoveContainer" containerID="cd142ffa5e91e2d63c3a695040fb0beb66d51faf6205e248717e719e7e9779ca" Oct 08 18:49:19 crc kubenswrapper[4750]: E1008 18:49:19.285955 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd142ffa5e91e2d63c3a695040fb0beb66d51faf6205e248717e719e7e9779ca\": container with ID starting with cd142ffa5e91e2d63c3a695040fb0beb66d51faf6205e248717e719e7e9779ca not found: ID does not exist" 
containerID="cd142ffa5e91e2d63c3a695040fb0beb66d51faf6205e248717e719e7e9779ca" Oct 08 18:49:19 crc kubenswrapper[4750]: I1008 18:49:19.285977 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd142ffa5e91e2d63c3a695040fb0beb66d51faf6205e248717e719e7e9779ca"} err="failed to get container status \"cd142ffa5e91e2d63c3a695040fb0beb66d51faf6205e248717e719e7e9779ca\": rpc error: code = NotFound desc = could not find container \"cd142ffa5e91e2d63c3a695040fb0beb66d51faf6205e248717e719e7e9779ca\": container with ID starting with cd142ffa5e91e2d63c3a695040fb0beb66d51faf6205e248717e719e7e9779ca not found: ID does not exist" Oct 08 18:49:19 crc kubenswrapper[4750]: I1008 18:49:19.870527 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11617628-2ff6-4464-91f0-d0198019356e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "11617628-2ff6-4464-91f0-d0198019356e" (UID: "11617628-2ff6-4464-91f0-d0198019356e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:49:19 crc kubenswrapper[4750]: I1008 18:49:19.919762 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11617628-2ff6-4464-91f0-d0198019356e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 18:49:20 crc kubenswrapper[4750]: I1008 18:49:20.152796 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z54jv"] Oct 08 18:49:20 crc kubenswrapper[4750]: I1008 18:49:20.158745 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-z54jv"] Oct 08 18:49:20 crc kubenswrapper[4750]: I1008 18:49:20.743766 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11617628-2ff6-4464-91f0-d0198019356e" path="/var/lib/kubelet/pods/11617628-2ff6-4464-91f0-d0198019356e/volumes" Oct 08 18:49:29 crc kubenswrapper[4750]: I1008 18:49:29.734423 4750 scope.go:117] "RemoveContainer" containerID="774c3fdb5318675acb98c76123aa59a32a454398260e24c10294ce6aad824331" Oct 08 18:49:29 crc kubenswrapper[4750]: E1008 18:49:29.735203 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 18:49:44 crc kubenswrapper[4750]: I1008 18:49:44.740491 4750 scope.go:117] "RemoveContainer" containerID="774c3fdb5318675acb98c76123aa59a32a454398260e24c10294ce6aad824331" Oct 08 18:49:44 crc kubenswrapper[4750]: E1008 18:49:44.741423 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 18:49:55 crc kubenswrapper[4750]: I1008 18:49:55.735095 4750 scope.go:117] "RemoveContainer" containerID="774c3fdb5318675acb98c76123aa59a32a454398260e24c10294ce6aad824331" Oct 08 18:49:55 crc kubenswrapper[4750]: E1008 18:49:55.735856 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 18:50:06 crc kubenswrapper[4750]: I1008 18:50:06.734491 4750 scope.go:117] "RemoveContainer" containerID="774c3fdb5318675acb98c76123aa59a32a454398260e24c10294ce6aad824331" Oct 08 18:50:06 crc kubenswrapper[4750]: E1008 18:50:06.735381 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 18:50:20 crc kubenswrapper[4750]: I1008 18:50:20.733960 4750 scope.go:117] "RemoveContainer" containerID="774c3fdb5318675acb98c76123aa59a32a454398260e24c10294ce6aad824331" Oct 08 18:50:20 crc kubenswrapper[4750]: E1008 18:50:20.734785 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 18:50:32 crc kubenswrapper[4750]: I1008 18:50:32.734171 4750 scope.go:117] "RemoveContainer" containerID="774c3fdb5318675acb98c76123aa59a32a454398260e24c10294ce6aad824331" Oct 08 18:50:32 crc kubenswrapper[4750]: E1008 18:50:32.735147 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 18:50:43 crc kubenswrapper[4750]: I1008 18:50:43.734854 4750 scope.go:117] "RemoveContainer" containerID="774c3fdb5318675acb98c76123aa59a32a454398260e24c10294ce6aad824331" Oct 08 18:50:43 crc kubenswrapper[4750]: E1008 18:50:43.735620 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 18:50:57 crc kubenswrapper[4750]: I1008 18:50:57.734447 4750 scope.go:117] "RemoveContainer" containerID="774c3fdb5318675acb98c76123aa59a32a454398260e24c10294ce6aad824331" Oct 08 18:50:57 crc kubenswrapper[4750]: E1008 18:50:57.735229 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 18:51:12 crc kubenswrapper[4750]: I1008 18:51:12.734109 4750 scope.go:117] "RemoveContainer" containerID="774c3fdb5318675acb98c76123aa59a32a454398260e24c10294ce6aad824331" Oct 08 18:51:12 crc kubenswrapper[4750]: E1008 18:51:12.734875 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 18:51:27 crc kubenswrapper[4750]: I1008 18:51:27.734453 4750 scope.go:117] "RemoveContainer" containerID="774c3fdb5318675acb98c76123aa59a32a454398260e24c10294ce6aad824331" Oct 08 18:51:27 crc kubenswrapper[4750]: E1008 18:51:27.735286 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 18:51:40 crc kubenswrapper[4750]: I1008 18:51:40.735382 4750 scope.go:117] "RemoveContainer" containerID="774c3fdb5318675acb98c76123aa59a32a454398260e24c10294ce6aad824331" Oct 08 18:51:40 crc kubenswrapper[4750]: E1008 18:51:40.736523 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 18:51:54 crc kubenswrapper[4750]: I1008 18:51:54.745303 4750 scope.go:117] "RemoveContainer" containerID="774c3fdb5318675acb98c76123aa59a32a454398260e24c10294ce6aad824331" Oct 08 18:51:54 crc kubenswrapper[4750]: E1008 18:51:54.746274 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 18:52:08 crc kubenswrapper[4750]: I1008 18:52:08.734756 4750 scope.go:117] "RemoveContainer" containerID="774c3fdb5318675acb98c76123aa59a32a454398260e24c10294ce6aad824331" Oct 08 18:52:08 crc kubenswrapper[4750]: E1008 18:52:08.735543 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 18:52:23 crc kubenswrapper[4750]: I1008 18:52:23.734195 4750 scope.go:117] "RemoveContainer" containerID="774c3fdb5318675acb98c76123aa59a32a454398260e24c10294ce6aad824331" Oct 08 18:52:23 crc kubenswrapper[4750]: E1008 18:52:23.735067 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 18:52:34 crc kubenswrapper[4750]: I1008 18:52:34.739955 4750 scope.go:117] "RemoveContainer" containerID="774c3fdb5318675acb98c76123aa59a32a454398260e24c10294ce6aad824331" Oct 08 18:52:34 crc kubenswrapper[4750]: E1008 18:52:34.740788 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 18:52:45 crc kubenswrapper[4750]: I1008 18:52:45.734387 4750 scope.go:117] "RemoveContainer" containerID="774c3fdb5318675acb98c76123aa59a32a454398260e24c10294ce6aad824331" Oct 08 18:52:45 crc kubenswrapper[4750]: E1008 18:52:45.735138 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 18:52:59 crc kubenswrapper[4750]: I1008 18:52:59.733670 4750 scope.go:117] "RemoveContainer" containerID="774c3fdb5318675acb98c76123aa59a32a454398260e24c10294ce6aad824331" Oct 08 18:52:59 crc kubenswrapper[4750]: E1008 18:52:59.734335 4750 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 18:53:14 crc kubenswrapper[4750]: I1008 18:53:14.737596 4750 scope.go:117] "RemoveContainer" containerID="774c3fdb5318675acb98c76123aa59a32a454398260e24c10294ce6aad824331" Oct 08 18:53:14 crc kubenswrapper[4750]: E1008 18:53:14.738383 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 18:53:27 crc kubenswrapper[4750]: I1008 18:53:27.735404 4750 scope.go:117] "RemoveContainer" containerID="774c3fdb5318675acb98c76123aa59a32a454398260e24c10294ce6aad824331" Oct 08 18:53:27 crc kubenswrapper[4750]: E1008 18:53:27.736838 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 18:53:38 crc kubenswrapper[4750]: I1008 18:53:38.734195 4750 scope.go:117] "RemoveContainer" containerID="774c3fdb5318675acb98c76123aa59a32a454398260e24c10294ce6aad824331" Oct 08 18:53:39 crc kubenswrapper[4750]: I1008 18:53:39.276039 4750 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerStarted","Data":"0e54b2391b337111e0ac2d5ea10feb3ef7df04ab9b58e7f0d07862a6a559a504"} Oct 08 18:54:33 crc kubenswrapper[4750]: I1008 18:54:33.540135 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rp2pl"] Oct 08 18:54:33 crc kubenswrapper[4750]: E1008 18:54:33.540975 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0885f459-e94d-47d5-ad13-0494430a15ad" containerName="registry-server" Oct 08 18:54:33 crc kubenswrapper[4750]: I1008 18:54:33.540988 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="0885f459-e94d-47d5-ad13-0494430a15ad" containerName="registry-server" Oct 08 18:54:33 crc kubenswrapper[4750]: E1008 18:54:33.541017 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0885f459-e94d-47d5-ad13-0494430a15ad" containerName="extract-utilities" Oct 08 18:54:33 crc kubenswrapper[4750]: I1008 18:54:33.541025 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="0885f459-e94d-47d5-ad13-0494430a15ad" containerName="extract-utilities" Oct 08 18:54:33 crc kubenswrapper[4750]: E1008 18:54:33.541033 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11617628-2ff6-4464-91f0-d0198019356e" containerName="extract-utilities" Oct 08 18:54:33 crc kubenswrapper[4750]: I1008 18:54:33.541039 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="11617628-2ff6-4464-91f0-d0198019356e" containerName="extract-utilities" Oct 08 18:54:33 crc kubenswrapper[4750]: E1008 18:54:33.541049 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0885f459-e94d-47d5-ad13-0494430a15ad" containerName="extract-content" Oct 08 18:54:33 crc kubenswrapper[4750]: I1008 18:54:33.541054 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="0885f459-e94d-47d5-ad13-0494430a15ad" 
containerName="extract-content" Oct 08 18:54:33 crc kubenswrapper[4750]: E1008 18:54:33.541066 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11617628-2ff6-4464-91f0-d0198019356e" containerName="extract-content" Oct 08 18:54:33 crc kubenswrapper[4750]: I1008 18:54:33.541072 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="11617628-2ff6-4464-91f0-d0198019356e" containerName="extract-content" Oct 08 18:54:33 crc kubenswrapper[4750]: E1008 18:54:33.541079 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11617628-2ff6-4464-91f0-d0198019356e" containerName="registry-server" Oct 08 18:54:33 crc kubenswrapper[4750]: I1008 18:54:33.541085 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="11617628-2ff6-4464-91f0-d0198019356e" containerName="registry-server" Oct 08 18:54:33 crc kubenswrapper[4750]: I1008 18:54:33.541202 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="11617628-2ff6-4464-91f0-d0198019356e" containerName="registry-server" Oct 08 18:54:33 crc kubenswrapper[4750]: I1008 18:54:33.541214 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="0885f459-e94d-47d5-ad13-0494430a15ad" containerName="registry-server" Oct 08 18:54:33 crc kubenswrapper[4750]: I1008 18:54:33.542140 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rp2pl" Oct 08 18:54:33 crc kubenswrapper[4750]: I1008 18:54:33.557669 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rp2pl"] Oct 08 18:54:33 crc kubenswrapper[4750]: I1008 18:54:33.664620 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dd623e1-c5f0-4818-bc1f-1466dbe91c32-catalog-content\") pod \"community-operators-rp2pl\" (UID: \"5dd623e1-c5f0-4818-bc1f-1466dbe91c32\") " pod="openshift-marketplace/community-operators-rp2pl" Oct 08 18:54:33 crc kubenswrapper[4750]: I1008 18:54:33.664709 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sml7\" (UniqueName: \"kubernetes.io/projected/5dd623e1-c5f0-4818-bc1f-1466dbe91c32-kube-api-access-4sml7\") pod \"community-operators-rp2pl\" (UID: \"5dd623e1-c5f0-4818-bc1f-1466dbe91c32\") " pod="openshift-marketplace/community-operators-rp2pl" Oct 08 18:54:33 crc kubenswrapper[4750]: I1008 18:54:33.664772 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dd623e1-c5f0-4818-bc1f-1466dbe91c32-utilities\") pod \"community-operators-rp2pl\" (UID: \"5dd623e1-c5f0-4818-bc1f-1466dbe91c32\") " pod="openshift-marketplace/community-operators-rp2pl" Oct 08 18:54:33 crc kubenswrapper[4750]: I1008 18:54:33.765902 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dd623e1-c5f0-4818-bc1f-1466dbe91c32-utilities\") pod \"community-operators-rp2pl\" (UID: \"5dd623e1-c5f0-4818-bc1f-1466dbe91c32\") " pod="openshift-marketplace/community-operators-rp2pl" Oct 08 18:54:33 crc kubenswrapper[4750]: I1008 18:54:33.766202 4750 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dd623e1-c5f0-4818-bc1f-1466dbe91c32-catalog-content\") pod \"community-operators-rp2pl\" (UID: \"5dd623e1-c5f0-4818-bc1f-1466dbe91c32\") " pod="openshift-marketplace/community-operators-rp2pl" Oct 08 18:54:33 crc kubenswrapper[4750]: I1008 18:54:33.766336 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sml7\" (UniqueName: \"kubernetes.io/projected/5dd623e1-c5f0-4818-bc1f-1466dbe91c32-kube-api-access-4sml7\") pod \"community-operators-rp2pl\" (UID: \"5dd623e1-c5f0-4818-bc1f-1466dbe91c32\") " pod="openshift-marketplace/community-operators-rp2pl" Oct 08 18:54:33 crc kubenswrapper[4750]: I1008 18:54:33.766835 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dd623e1-c5f0-4818-bc1f-1466dbe91c32-utilities\") pod \"community-operators-rp2pl\" (UID: \"5dd623e1-c5f0-4818-bc1f-1466dbe91c32\") " pod="openshift-marketplace/community-operators-rp2pl" Oct 08 18:54:33 crc kubenswrapper[4750]: I1008 18:54:33.766872 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dd623e1-c5f0-4818-bc1f-1466dbe91c32-catalog-content\") pod \"community-operators-rp2pl\" (UID: \"5dd623e1-c5f0-4818-bc1f-1466dbe91c32\") " pod="openshift-marketplace/community-operators-rp2pl" Oct 08 18:54:33 crc kubenswrapper[4750]: I1008 18:54:33.787493 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sml7\" (UniqueName: \"kubernetes.io/projected/5dd623e1-c5f0-4818-bc1f-1466dbe91c32-kube-api-access-4sml7\") pod \"community-operators-rp2pl\" (UID: \"5dd623e1-c5f0-4818-bc1f-1466dbe91c32\") " pod="openshift-marketplace/community-operators-rp2pl" Oct 08 18:54:33 crc kubenswrapper[4750]: I1008 18:54:33.859640 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rp2pl" Oct 08 18:54:34 crc kubenswrapper[4750]: I1008 18:54:34.333432 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rp2pl"] Oct 08 18:54:34 crc kubenswrapper[4750]: W1008 18:54:34.342110 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5dd623e1_c5f0_4818_bc1f_1466dbe91c32.slice/crio-27f6f5762d6886d7e291413fbb470cf1367c75f6aa56477d4668061c87fb5c50 WatchSource:0}: Error finding container 27f6f5762d6886d7e291413fbb470cf1367c75f6aa56477d4668061c87fb5c50: Status 404 returned error can't find the container with id 27f6f5762d6886d7e291413fbb470cf1367c75f6aa56477d4668061c87fb5c50 Oct 08 18:54:34 crc kubenswrapper[4750]: I1008 18:54:34.702018 4750 generic.go:334] "Generic (PLEG): container finished" podID="5dd623e1-c5f0-4818-bc1f-1466dbe91c32" containerID="61c652276d65fea97c126968f194e7c4f5829921f98e63e11f1ece0bf356ab29" exitCode=0 Oct 08 18:54:34 crc kubenswrapper[4750]: I1008 18:54:34.702114 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rp2pl" event={"ID":"5dd623e1-c5f0-4818-bc1f-1466dbe91c32","Type":"ContainerDied","Data":"61c652276d65fea97c126968f194e7c4f5829921f98e63e11f1ece0bf356ab29"} Oct 08 18:54:34 crc kubenswrapper[4750]: I1008 18:54:34.702419 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rp2pl" event={"ID":"5dd623e1-c5f0-4818-bc1f-1466dbe91c32","Type":"ContainerStarted","Data":"27f6f5762d6886d7e291413fbb470cf1367c75f6aa56477d4668061c87fb5c50"} Oct 08 18:54:34 crc kubenswrapper[4750]: I1008 18:54:34.706011 4750 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 18:54:35 crc kubenswrapper[4750]: I1008 18:54:35.711934 4750 generic.go:334] "Generic (PLEG): container finished" 
podID="5dd623e1-c5f0-4818-bc1f-1466dbe91c32" containerID="49b5582873715a20f8a22755008c558de3f57c3fd157410a8d5a0190948e3138" exitCode=0 Oct 08 18:54:35 crc kubenswrapper[4750]: I1008 18:54:35.711986 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rp2pl" event={"ID":"5dd623e1-c5f0-4818-bc1f-1466dbe91c32","Type":"ContainerDied","Data":"49b5582873715a20f8a22755008c558de3f57c3fd157410a8d5a0190948e3138"} Oct 08 18:54:36 crc kubenswrapper[4750]: I1008 18:54:36.721307 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rp2pl" event={"ID":"5dd623e1-c5f0-4818-bc1f-1466dbe91c32","Type":"ContainerStarted","Data":"58f7025e6f58c2b770a6e6cb146423a43db2c00eab25aa8f29a97aade8c6af65"} Oct 08 18:54:36 crc kubenswrapper[4750]: I1008 18:54:36.750485 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rp2pl" podStartSLOduration=2.3453318960000002 podStartE2EDuration="3.750459424s" podCreationTimestamp="2025-10-08 18:54:33 +0000 UTC" firstStartedPulling="2025-10-08 18:54:34.705815335 +0000 UTC m=+2630.618786348" lastFinishedPulling="2025-10-08 18:54:36.110942863 +0000 UTC m=+2632.023913876" observedRunningTime="2025-10-08 18:54:36.739057285 +0000 UTC m=+2632.652028298" watchObservedRunningTime="2025-10-08 18:54:36.750459424 +0000 UTC m=+2632.663430467" Oct 08 18:54:43 crc kubenswrapper[4750]: I1008 18:54:43.860309 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rp2pl" Oct 08 18:54:43 crc kubenswrapper[4750]: I1008 18:54:43.860940 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rp2pl" Oct 08 18:54:43 crc kubenswrapper[4750]: I1008 18:54:43.910881 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rp2pl" Oct 08 
18:54:44 crc kubenswrapper[4750]: I1008 18:54:44.849057 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rp2pl" Oct 08 18:54:44 crc kubenswrapper[4750]: I1008 18:54:44.901020 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rp2pl"] Oct 08 18:54:46 crc kubenswrapper[4750]: I1008 18:54:46.810908 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rp2pl" podUID="5dd623e1-c5f0-4818-bc1f-1466dbe91c32" containerName="registry-server" containerID="cri-o://58f7025e6f58c2b770a6e6cb146423a43db2c00eab25aa8f29a97aade8c6af65" gracePeriod=2 Oct 08 18:54:47 crc kubenswrapper[4750]: I1008 18:54:47.253333 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rp2pl" Oct 08 18:54:47 crc kubenswrapper[4750]: I1008 18:54:47.373007 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dd623e1-c5f0-4818-bc1f-1466dbe91c32-utilities\") pod \"5dd623e1-c5f0-4818-bc1f-1466dbe91c32\" (UID: \"5dd623e1-c5f0-4818-bc1f-1466dbe91c32\") " Oct 08 18:54:47 crc kubenswrapper[4750]: I1008 18:54:47.373101 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4sml7\" (UniqueName: \"kubernetes.io/projected/5dd623e1-c5f0-4818-bc1f-1466dbe91c32-kube-api-access-4sml7\") pod \"5dd623e1-c5f0-4818-bc1f-1466dbe91c32\" (UID: \"5dd623e1-c5f0-4818-bc1f-1466dbe91c32\") " Oct 08 18:54:47 crc kubenswrapper[4750]: I1008 18:54:47.373144 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dd623e1-c5f0-4818-bc1f-1466dbe91c32-catalog-content\") pod \"5dd623e1-c5f0-4818-bc1f-1466dbe91c32\" (UID: \"5dd623e1-c5f0-4818-bc1f-1466dbe91c32\") " Oct 
08 18:54:47 crc kubenswrapper[4750]: I1008 18:54:47.374007 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dd623e1-c5f0-4818-bc1f-1466dbe91c32-utilities" (OuterVolumeSpecName: "utilities") pod "5dd623e1-c5f0-4818-bc1f-1466dbe91c32" (UID: "5dd623e1-c5f0-4818-bc1f-1466dbe91c32"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:54:47 crc kubenswrapper[4750]: I1008 18:54:47.378298 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dd623e1-c5f0-4818-bc1f-1466dbe91c32-kube-api-access-4sml7" (OuterVolumeSpecName: "kube-api-access-4sml7") pod "5dd623e1-c5f0-4818-bc1f-1466dbe91c32" (UID: "5dd623e1-c5f0-4818-bc1f-1466dbe91c32"). InnerVolumeSpecName "kube-api-access-4sml7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:54:47 crc kubenswrapper[4750]: I1008 18:54:47.423837 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dd623e1-c5f0-4818-bc1f-1466dbe91c32-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5dd623e1-c5f0-4818-bc1f-1466dbe91c32" (UID: "5dd623e1-c5f0-4818-bc1f-1466dbe91c32"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:54:47 crc kubenswrapper[4750]: I1008 18:54:47.474895 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4sml7\" (UniqueName: \"kubernetes.io/projected/5dd623e1-c5f0-4818-bc1f-1466dbe91c32-kube-api-access-4sml7\") on node \"crc\" DevicePath \"\"" Oct 08 18:54:47 crc kubenswrapper[4750]: I1008 18:54:47.474934 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dd623e1-c5f0-4818-bc1f-1466dbe91c32-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 18:54:47 crc kubenswrapper[4750]: I1008 18:54:47.474944 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dd623e1-c5f0-4818-bc1f-1466dbe91c32-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 18:54:47 crc kubenswrapper[4750]: I1008 18:54:47.819260 4750 generic.go:334] "Generic (PLEG): container finished" podID="5dd623e1-c5f0-4818-bc1f-1466dbe91c32" containerID="58f7025e6f58c2b770a6e6cb146423a43db2c00eab25aa8f29a97aade8c6af65" exitCode=0 Oct 08 18:54:47 crc kubenswrapper[4750]: I1008 18:54:47.819301 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rp2pl" event={"ID":"5dd623e1-c5f0-4818-bc1f-1466dbe91c32","Type":"ContainerDied","Data":"58f7025e6f58c2b770a6e6cb146423a43db2c00eab25aa8f29a97aade8c6af65"} Oct 08 18:54:47 crc kubenswrapper[4750]: I1008 18:54:47.819325 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rp2pl" event={"ID":"5dd623e1-c5f0-4818-bc1f-1466dbe91c32","Type":"ContainerDied","Data":"27f6f5762d6886d7e291413fbb470cf1367c75f6aa56477d4668061c87fb5c50"} Oct 08 18:54:47 crc kubenswrapper[4750]: I1008 18:54:47.819342 4750 scope.go:117] "RemoveContainer" containerID="58f7025e6f58c2b770a6e6cb146423a43db2c00eab25aa8f29a97aade8c6af65" Oct 08 18:54:47 crc kubenswrapper[4750]: I1008 
18:54:47.819447 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rp2pl" Oct 08 18:54:47 crc kubenswrapper[4750]: I1008 18:54:47.845395 4750 scope.go:117] "RemoveContainer" containerID="49b5582873715a20f8a22755008c558de3f57c3fd157410a8d5a0190948e3138" Oct 08 18:54:47 crc kubenswrapper[4750]: I1008 18:54:47.855911 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rp2pl"] Oct 08 18:54:47 crc kubenswrapper[4750]: I1008 18:54:47.861190 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rp2pl"] Oct 08 18:54:47 crc kubenswrapper[4750]: I1008 18:54:47.862991 4750 scope.go:117] "RemoveContainer" containerID="61c652276d65fea97c126968f194e7c4f5829921f98e63e11f1ece0bf356ab29" Oct 08 18:54:47 crc kubenswrapper[4750]: I1008 18:54:47.884997 4750 scope.go:117] "RemoveContainer" containerID="58f7025e6f58c2b770a6e6cb146423a43db2c00eab25aa8f29a97aade8c6af65" Oct 08 18:54:47 crc kubenswrapper[4750]: E1008 18:54:47.885490 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58f7025e6f58c2b770a6e6cb146423a43db2c00eab25aa8f29a97aade8c6af65\": container with ID starting with 58f7025e6f58c2b770a6e6cb146423a43db2c00eab25aa8f29a97aade8c6af65 not found: ID does not exist" containerID="58f7025e6f58c2b770a6e6cb146423a43db2c00eab25aa8f29a97aade8c6af65" Oct 08 18:54:47 crc kubenswrapper[4750]: I1008 18:54:47.885590 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58f7025e6f58c2b770a6e6cb146423a43db2c00eab25aa8f29a97aade8c6af65"} err="failed to get container status \"58f7025e6f58c2b770a6e6cb146423a43db2c00eab25aa8f29a97aade8c6af65\": rpc error: code = NotFound desc = could not find container \"58f7025e6f58c2b770a6e6cb146423a43db2c00eab25aa8f29a97aade8c6af65\": container with ID starting with 
58f7025e6f58c2b770a6e6cb146423a43db2c00eab25aa8f29a97aade8c6af65 not found: ID does not exist" Oct 08 18:54:47 crc kubenswrapper[4750]: I1008 18:54:47.885620 4750 scope.go:117] "RemoveContainer" containerID="49b5582873715a20f8a22755008c558de3f57c3fd157410a8d5a0190948e3138" Oct 08 18:54:47 crc kubenswrapper[4750]: E1008 18:54:47.886013 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49b5582873715a20f8a22755008c558de3f57c3fd157410a8d5a0190948e3138\": container with ID starting with 49b5582873715a20f8a22755008c558de3f57c3fd157410a8d5a0190948e3138 not found: ID does not exist" containerID="49b5582873715a20f8a22755008c558de3f57c3fd157410a8d5a0190948e3138" Oct 08 18:54:47 crc kubenswrapper[4750]: I1008 18:54:47.886162 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49b5582873715a20f8a22755008c558de3f57c3fd157410a8d5a0190948e3138"} err="failed to get container status \"49b5582873715a20f8a22755008c558de3f57c3fd157410a8d5a0190948e3138\": rpc error: code = NotFound desc = could not find container \"49b5582873715a20f8a22755008c558de3f57c3fd157410a8d5a0190948e3138\": container with ID starting with 49b5582873715a20f8a22755008c558de3f57c3fd157410a8d5a0190948e3138 not found: ID does not exist" Oct 08 18:54:47 crc kubenswrapper[4750]: I1008 18:54:47.886187 4750 scope.go:117] "RemoveContainer" containerID="61c652276d65fea97c126968f194e7c4f5829921f98e63e11f1ece0bf356ab29" Oct 08 18:54:47 crc kubenswrapper[4750]: E1008 18:54:47.886453 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61c652276d65fea97c126968f194e7c4f5829921f98e63e11f1ece0bf356ab29\": container with ID starting with 61c652276d65fea97c126968f194e7c4f5829921f98e63e11f1ece0bf356ab29 not found: ID does not exist" containerID="61c652276d65fea97c126968f194e7c4f5829921f98e63e11f1ece0bf356ab29" Oct 08 18:54:47 crc 
kubenswrapper[4750]: I1008 18:54:47.886473 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61c652276d65fea97c126968f194e7c4f5829921f98e63e11f1ece0bf356ab29"} err="failed to get container status \"61c652276d65fea97c126968f194e7c4f5829921f98e63e11f1ece0bf356ab29\": rpc error: code = NotFound desc = could not find container \"61c652276d65fea97c126968f194e7c4f5829921f98e63e11f1ece0bf356ab29\": container with ID starting with 61c652276d65fea97c126968f194e7c4f5829921f98e63e11f1ece0bf356ab29 not found: ID does not exist" Oct 08 18:54:48 crc kubenswrapper[4750]: I1008 18:54:48.751350 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dd623e1-c5f0-4818-bc1f-1466dbe91c32" path="/var/lib/kubelet/pods/5dd623e1-c5f0-4818-bc1f-1466dbe91c32/volumes" Oct 08 18:55:59 crc kubenswrapper[4750]: I1008 18:55:59.707305 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 18:55:59 crc kubenswrapper[4750]: I1008 18:55:59.708222 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 18:56:29 crc kubenswrapper[4750]: I1008 18:56:29.707424 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 18:56:29 crc kubenswrapper[4750]: I1008 18:56:29.707970 4750 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 18:56:43 crc kubenswrapper[4750]: I1008 18:56:43.346515 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gkpjq"] Oct 08 18:56:43 crc kubenswrapper[4750]: E1008 18:56:43.347324 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dd623e1-c5f0-4818-bc1f-1466dbe91c32" containerName="extract-utilities" Oct 08 18:56:43 crc kubenswrapper[4750]: I1008 18:56:43.347335 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dd623e1-c5f0-4818-bc1f-1466dbe91c32" containerName="extract-utilities" Oct 08 18:56:43 crc kubenswrapper[4750]: E1008 18:56:43.347349 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dd623e1-c5f0-4818-bc1f-1466dbe91c32" containerName="registry-server" Oct 08 18:56:43 crc kubenswrapper[4750]: I1008 18:56:43.347355 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dd623e1-c5f0-4818-bc1f-1466dbe91c32" containerName="registry-server" Oct 08 18:56:43 crc kubenswrapper[4750]: E1008 18:56:43.347362 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dd623e1-c5f0-4818-bc1f-1466dbe91c32" containerName="extract-content" Oct 08 18:56:43 crc kubenswrapper[4750]: I1008 18:56:43.347368 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dd623e1-c5f0-4818-bc1f-1466dbe91c32" containerName="extract-content" Oct 08 18:56:43 crc kubenswrapper[4750]: I1008 18:56:43.347508 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dd623e1-c5f0-4818-bc1f-1466dbe91c32" containerName="registry-server" Oct 08 18:56:43 crc kubenswrapper[4750]: I1008 18:56:43.348610 4750 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/certified-operators-gkpjq" Oct 08 18:56:43 crc kubenswrapper[4750]: I1008 18:56:43.358825 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gkpjq"] Oct 08 18:56:43 crc kubenswrapper[4750]: I1008 18:56:43.504959 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwcch\" (UniqueName: \"kubernetes.io/projected/7727d47a-ef27-45bd-9dff-e2d395d96928-kube-api-access-hwcch\") pod \"certified-operators-gkpjq\" (UID: \"7727d47a-ef27-45bd-9dff-e2d395d96928\") " pod="openshift-marketplace/certified-operators-gkpjq" Oct 08 18:56:43 crc kubenswrapper[4750]: I1008 18:56:43.505011 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7727d47a-ef27-45bd-9dff-e2d395d96928-utilities\") pod \"certified-operators-gkpjq\" (UID: \"7727d47a-ef27-45bd-9dff-e2d395d96928\") " pod="openshift-marketplace/certified-operators-gkpjq" Oct 08 18:56:43 crc kubenswrapper[4750]: I1008 18:56:43.505091 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7727d47a-ef27-45bd-9dff-e2d395d96928-catalog-content\") pod \"certified-operators-gkpjq\" (UID: \"7727d47a-ef27-45bd-9dff-e2d395d96928\") " pod="openshift-marketplace/certified-operators-gkpjq" Oct 08 18:56:43 crc kubenswrapper[4750]: I1008 18:56:43.606592 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwcch\" (UniqueName: \"kubernetes.io/projected/7727d47a-ef27-45bd-9dff-e2d395d96928-kube-api-access-hwcch\") pod \"certified-operators-gkpjq\" (UID: \"7727d47a-ef27-45bd-9dff-e2d395d96928\") " pod="openshift-marketplace/certified-operators-gkpjq" Oct 08 18:56:43 crc kubenswrapper[4750]: I1008 18:56:43.606645 4750 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7727d47a-ef27-45bd-9dff-e2d395d96928-utilities\") pod \"certified-operators-gkpjq\" (UID: \"7727d47a-ef27-45bd-9dff-e2d395d96928\") " pod="openshift-marketplace/certified-operators-gkpjq" Oct 08 18:56:43 crc kubenswrapper[4750]: I1008 18:56:43.606732 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7727d47a-ef27-45bd-9dff-e2d395d96928-catalog-content\") pod \"certified-operators-gkpjq\" (UID: \"7727d47a-ef27-45bd-9dff-e2d395d96928\") " pod="openshift-marketplace/certified-operators-gkpjq" Oct 08 18:56:43 crc kubenswrapper[4750]: I1008 18:56:43.607304 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7727d47a-ef27-45bd-9dff-e2d395d96928-utilities\") pod \"certified-operators-gkpjq\" (UID: \"7727d47a-ef27-45bd-9dff-e2d395d96928\") " pod="openshift-marketplace/certified-operators-gkpjq" Oct 08 18:56:43 crc kubenswrapper[4750]: I1008 18:56:43.607305 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7727d47a-ef27-45bd-9dff-e2d395d96928-catalog-content\") pod \"certified-operators-gkpjq\" (UID: \"7727d47a-ef27-45bd-9dff-e2d395d96928\") " pod="openshift-marketplace/certified-operators-gkpjq" Oct 08 18:56:43 crc kubenswrapper[4750]: I1008 18:56:43.631478 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwcch\" (UniqueName: \"kubernetes.io/projected/7727d47a-ef27-45bd-9dff-e2d395d96928-kube-api-access-hwcch\") pod \"certified-operators-gkpjq\" (UID: \"7727d47a-ef27-45bd-9dff-e2d395d96928\") " pod="openshift-marketplace/certified-operators-gkpjq" Oct 08 18:56:43 crc kubenswrapper[4750]: I1008 18:56:43.672173 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gkpjq" Oct 08 18:56:43 crc kubenswrapper[4750]: I1008 18:56:43.931753 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gkpjq"] Oct 08 18:56:44 crc kubenswrapper[4750]: I1008 18:56:44.700208 4750 generic.go:334] "Generic (PLEG): container finished" podID="7727d47a-ef27-45bd-9dff-e2d395d96928" containerID="db601949022b61e4fafe5ee693c4a2c1997ba760e5a2476bdd6e567a3948cc3a" exitCode=0 Oct 08 18:56:44 crc kubenswrapper[4750]: I1008 18:56:44.700315 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gkpjq" event={"ID":"7727d47a-ef27-45bd-9dff-e2d395d96928","Type":"ContainerDied","Data":"db601949022b61e4fafe5ee693c4a2c1997ba760e5a2476bdd6e567a3948cc3a"} Oct 08 18:56:44 crc kubenswrapper[4750]: I1008 18:56:44.700483 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gkpjq" event={"ID":"7727d47a-ef27-45bd-9dff-e2d395d96928","Type":"ContainerStarted","Data":"37f1ef642feeb3131d71392bac611a87bd570cee87d7190f14c7f405eb735236"} Oct 08 18:56:46 crc kubenswrapper[4750]: I1008 18:56:46.715445 4750 generic.go:334] "Generic (PLEG): container finished" podID="7727d47a-ef27-45bd-9dff-e2d395d96928" containerID="0112918b6e07e336b96aef839e6f48a6708126607c5ee27f68505ae0dacf809d" exitCode=0 Oct 08 18:56:46 crc kubenswrapper[4750]: I1008 18:56:46.715517 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gkpjq" event={"ID":"7727d47a-ef27-45bd-9dff-e2d395d96928","Type":"ContainerDied","Data":"0112918b6e07e336b96aef839e6f48a6708126607c5ee27f68505ae0dacf809d"} Oct 08 18:56:47 crc kubenswrapper[4750]: I1008 18:56:47.727868 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gkpjq" 
event={"ID":"7727d47a-ef27-45bd-9dff-e2d395d96928","Type":"ContainerStarted","Data":"5a9f3420a8acf370c08a5680a7e16b206b9b1bc85da34bdcb2c976ecad89db57"} Oct 08 18:56:47 crc kubenswrapper[4750]: I1008 18:56:47.750249 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gkpjq" podStartSLOduration=2.07476123 podStartE2EDuration="4.750234594s" podCreationTimestamp="2025-10-08 18:56:43 +0000 UTC" firstStartedPulling="2025-10-08 18:56:44.701851708 +0000 UTC m=+2760.614822711" lastFinishedPulling="2025-10-08 18:56:47.377325062 +0000 UTC m=+2763.290296075" observedRunningTime="2025-10-08 18:56:47.744497525 +0000 UTC m=+2763.657468558" watchObservedRunningTime="2025-10-08 18:56:47.750234594 +0000 UTC m=+2763.663205607" Oct 08 18:56:53 crc kubenswrapper[4750]: I1008 18:56:53.672692 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gkpjq" Oct 08 18:56:53 crc kubenswrapper[4750]: I1008 18:56:53.673843 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gkpjq" Oct 08 18:56:53 crc kubenswrapper[4750]: I1008 18:56:53.766066 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gkpjq" Oct 08 18:56:53 crc kubenswrapper[4750]: I1008 18:56:53.817434 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gkpjq" Oct 08 18:56:53 crc kubenswrapper[4750]: I1008 18:56:53.999262 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gkpjq"] Oct 08 18:56:55 crc kubenswrapper[4750]: I1008 18:56:55.787282 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gkpjq" podUID="7727d47a-ef27-45bd-9dff-e2d395d96928" containerName="registry-server" 
containerID="cri-o://5a9f3420a8acf370c08a5680a7e16b206b9b1bc85da34bdcb2c976ecad89db57" gracePeriod=2 Oct 08 18:56:56 crc kubenswrapper[4750]: I1008 18:56:56.171747 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gkpjq" Oct 08 18:56:56 crc kubenswrapper[4750]: I1008 18:56:56.279167 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7727d47a-ef27-45bd-9dff-e2d395d96928-utilities\") pod \"7727d47a-ef27-45bd-9dff-e2d395d96928\" (UID: \"7727d47a-ef27-45bd-9dff-e2d395d96928\") " Oct 08 18:56:56 crc kubenswrapper[4750]: I1008 18:56:56.279271 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7727d47a-ef27-45bd-9dff-e2d395d96928-catalog-content\") pod \"7727d47a-ef27-45bd-9dff-e2d395d96928\" (UID: \"7727d47a-ef27-45bd-9dff-e2d395d96928\") " Oct 08 18:56:56 crc kubenswrapper[4750]: I1008 18:56:56.279307 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwcch\" (UniqueName: \"kubernetes.io/projected/7727d47a-ef27-45bd-9dff-e2d395d96928-kube-api-access-hwcch\") pod \"7727d47a-ef27-45bd-9dff-e2d395d96928\" (UID: \"7727d47a-ef27-45bd-9dff-e2d395d96928\") " Oct 08 18:56:56 crc kubenswrapper[4750]: I1008 18:56:56.280888 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7727d47a-ef27-45bd-9dff-e2d395d96928-utilities" (OuterVolumeSpecName: "utilities") pod "7727d47a-ef27-45bd-9dff-e2d395d96928" (UID: "7727d47a-ef27-45bd-9dff-e2d395d96928"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:56:56 crc kubenswrapper[4750]: I1008 18:56:56.284901 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7727d47a-ef27-45bd-9dff-e2d395d96928-kube-api-access-hwcch" (OuterVolumeSpecName: "kube-api-access-hwcch") pod "7727d47a-ef27-45bd-9dff-e2d395d96928" (UID: "7727d47a-ef27-45bd-9dff-e2d395d96928"). InnerVolumeSpecName "kube-api-access-hwcch". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:56:56 crc kubenswrapper[4750]: I1008 18:56:56.331336 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7727d47a-ef27-45bd-9dff-e2d395d96928-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7727d47a-ef27-45bd-9dff-e2d395d96928" (UID: "7727d47a-ef27-45bd-9dff-e2d395d96928"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:56:56 crc kubenswrapper[4750]: I1008 18:56:56.381415 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwcch\" (UniqueName: \"kubernetes.io/projected/7727d47a-ef27-45bd-9dff-e2d395d96928-kube-api-access-hwcch\") on node \"crc\" DevicePath \"\"" Oct 08 18:56:56 crc kubenswrapper[4750]: I1008 18:56:56.381447 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7727d47a-ef27-45bd-9dff-e2d395d96928-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 18:56:56 crc kubenswrapper[4750]: I1008 18:56:56.381457 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7727d47a-ef27-45bd-9dff-e2d395d96928-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 18:56:56 crc kubenswrapper[4750]: I1008 18:56:56.798306 4750 generic.go:334] "Generic (PLEG): container finished" podID="7727d47a-ef27-45bd-9dff-e2d395d96928" 
containerID="5a9f3420a8acf370c08a5680a7e16b206b9b1bc85da34bdcb2c976ecad89db57" exitCode=0 Oct 08 18:56:56 crc kubenswrapper[4750]: I1008 18:56:56.798358 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gkpjq" event={"ID":"7727d47a-ef27-45bd-9dff-e2d395d96928","Type":"ContainerDied","Data":"5a9f3420a8acf370c08a5680a7e16b206b9b1bc85da34bdcb2c976ecad89db57"} Oct 08 18:56:56 crc kubenswrapper[4750]: I1008 18:56:56.798379 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gkpjq" Oct 08 18:56:56 crc kubenswrapper[4750]: I1008 18:56:56.798393 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gkpjq" event={"ID":"7727d47a-ef27-45bd-9dff-e2d395d96928","Type":"ContainerDied","Data":"37f1ef642feeb3131d71392bac611a87bd570cee87d7190f14c7f405eb735236"} Oct 08 18:56:56 crc kubenswrapper[4750]: I1008 18:56:56.798416 4750 scope.go:117] "RemoveContainer" containerID="5a9f3420a8acf370c08a5680a7e16b206b9b1bc85da34bdcb2c976ecad89db57" Oct 08 18:56:56 crc kubenswrapper[4750]: I1008 18:56:56.823351 4750 scope.go:117] "RemoveContainer" containerID="0112918b6e07e336b96aef839e6f48a6708126607c5ee27f68505ae0dacf809d" Oct 08 18:56:56 crc kubenswrapper[4750]: I1008 18:56:56.832374 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gkpjq"] Oct 08 18:56:56 crc kubenswrapper[4750]: I1008 18:56:56.841766 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gkpjq"] Oct 08 18:56:56 crc kubenswrapper[4750]: I1008 18:56:56.850272 4750 scope.go:117] "RemoveContainer" containerID="db601949022b61e4fafe5ee693c4a2c1997ba760e5a2476bdd6e567a3948cc3a" Oct 08 18:56:56 crc kubenswrapper[4750]: I1008 18:56:56.865533 4750 scope.go:117] "RemoveContainer" containerID="5a9f3420a8acf370c08a5680a7e16b206b9b1bc85da34bdcb2c976ecad89db57" Oct 08 
18:56:56 crc kubenswrapper[4750]: E1008 18:56:56.865983 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a9f3420a8acf370c08a5680a7e16b206b9b1bc85da34bdcb2c976ecad89db57\": container with ID starting with 5a9f3420a8acf370c08a5680a7e16b206b9b1bc85da34bdcb2c976ecad89db57 not found: ID does not exist" containerID="5a9f3420a8acf370c08a5680a7e16b206b9b1bc85da34bdcb2c976ecad89db57" Oct 08 18:56:56 crc kubenswrapper[4750]: I1008 18:56:56.866029 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a9f3420a8acf370c08a5680a7e16b206b9b1bc85da34bdcb2c976ecad89db57"} err="failed to get container status \"5a9f3420a8acf370c08a5680a7e16b206b9b1bc85da34bdcb2c976ecad89db57\": rpc error: code = NotFound desc = could not find container \"5a9f3420a8acf370c08a5680a7e16b206b9b1bc85da34bdcb2c976ecad89db57\": container with ID starting with 5a9f3420a8acf370c08a5680a7e16b206b9b1bc85da34bdcb2c976ecad89db57 not found: ID does not exist" Oct 08 18:56:56 crc kubenswrapper[4750]: I1008 18:56:56.866057 4750 scope.go:117] "RemoveContainer" containerID="0112918b6e07e336b96aef839e6f48a6708126607c5ee27f68505ae0dacf809d" Oct 08 18:56:56 crc kubenswrapper[4750]: E1008 18:56:56.866497 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0112918b6e07e336b96aef839e6f48a6708126607c5ee27f68505ae0dacf809d\": container with ID starting with 0112918b6e07e336b96aef839e6f48a6708126607c5ee27f68505ae0dacf809d not found: ID does not exist" containerID="0112918b6e07e336b96aef839e6f48a6708126607c5ee27f68505ae0dacf809d" Oct 08 18:56:56 crc kubenswrapper[4750]: I1008 18:56:56.866540 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0112918b6e07e336b96aef839e6f48a6708126607c5ee27f68505ae0dacf809d"} err="failed to get container status 
\"0112918b6e07e336b96aef839e6f48a6708126607c5ee27f68505ae0dacf809d\": rpc error: code = NotFound desc = could not find container \"0112918b6e07e336b96aef839e6f48a6708126607c5ee27f68505ae0dacf809d\": container with ID starting with 0112918b6e07e336b96aef839e6f48a6708126607c5ee27f68505ae0dacf809d not found: ID does not exist" Oct 08 18:56:56 crc kubenswrapper[4750]: I1008 18:56:56.866585 4750 scope.go:117] "RemoveContainer" containerID="db601949022b61e4fafe5ee693c4a2c1997ba760e5a2476bdd6e567a3948cc3a" Oct 08 18:56:56 crc kubenswrapper[4750]: E1008 18:56:56.867654 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db601949022b61e4fafe5ee693c4a2c1997ba760e5a2476bdd6e567a3948cc3a\": container with ID starting with db601949022b61e4fafe5ee693c4a2c1997ba760e5a2476bdd6e567a3948cc3a not found: ID does not exist" containerID="db601949022b61e4fafe5ee693c4a2c1997ba760e5a2476bdd6e567a3948cc3a" Oct 08 18:56:56 crc kubenswrapper[4750]: I1008 18:56:56.867688 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db601949022b61e4fafe5ee693c4a2c1997ba760e5a2476bdd6e567a3948cc3a"} err="failed to get container status \"db601949022b61e4fafe5ee693c4a2c1997ba760e5a2476bdd6e567a3948cc3a\": rpc error: code = NotFound desc = could not find container \"db601949022b61e4fafe5ee693c4a2c1997ba760e5a2476bdd6e567a3948cc3a\": container with ID starting with db601949022b61e4fafe5ee693c4a2c1997ba760e5a2476bdd6e567a3948cc3a not found: ID does not exist" Oct 08 18:56:58 crc kubenswrapper[4750]: I1008 18:56:58.747054 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7727d47a-ef27-45bd-9dff-e2d395d96928" path="/var/lib/kubelet/pods/7727d47a-ef27-45bd-9dff-e2d395d96928/volumes" Oct 08 18:56:59 crc kubenswrapper[4750]: I1008 18:56:59.706584 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 18:56:59 crc kubenswrapper[4750]: I1008 18:56:59.706646 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 18:56:59 crc kubenswrapper[4750]: I1008 18:56:59.706687 4750 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-grddb" Oct 08 18:56:59 crc kubenswrapper[4750]: I1008 18:56:59.707171 4750 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0e54b2391b337111e0ac2d5ea10feb3ef7df04ab9b58e7f0d07862a6a559a504"} pod="openshift-machine-config-operator/machine-config-daemon-grddb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 18:56:59 crc kubenswrapper[4750]: I1008 18:56:59.707227 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" containerID="cri-o://0e54b2391b337111e0ac2d5ea10feb3ef7df04ab9b58e7f0d07862a6a559a504" gracePeriod=600 Oct 08 18:57:00 crc kubenswrapper[4750]: I1008 18:57:00.831155 4750 generic.go:334] "Generic (PLEG): container finished" podID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerID="0e54b2391b337111e0ac2d5ea10feb3ef7df04ab9b58e7f0d07862a6a559a504" exitCode=0 Oct 08 18:57:00 crc kubenswrapper[4750]: I1008 18:57:00.831620 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerDied","Data":"0e54b2391b337111e0ac2d5ea10feb3ef7df04ab9b58e7f0d07862a6a559a504"} Oct 08 18:57:00 crc kubenswrapper[4750]: I1008 18:57:00.831735 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerStarted","Data":"436e6cbb69c758f2d6ea6ab45cdf55c57954b7c7c767a43627a47a0e76aa0fea"} Oct 08 18:57:00 crc kubenswrapper[4750]: I1008 18:57:00.831759 4750 scope.go:117] "RemoveContainer" containerID="774c3fdb5318675acb98c76123aa59a32a454398260e24c10294ce6aad824331" Oct 08 18:58:59 crc kubenswrapper[4750]: I1008 18:58:59.706785 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 18:58:59 crc kubenswrapper[4750]: I1008 18:58:59.707300 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 18:59:06 crc kubenswrapper[4750]: I1008 18:59:06.084598 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-s8g5g"] Oct 08 18:59:06 crc kubenswrapper[4750]: E1008 18:59:06.085287 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7727d47a-ef27-45bd-9dff-e2d395d96928" containerName="extract-utilities" Oct 08 18:59:06 crc kubenswrapper[4750]: I1008 18:59:06.085301 4750 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7727d47a-ef27-45bd-9dff-e2d395d96928" containerName="extract-utilities" Oct 08 18:59:06 crc kubenswrapper[4750]: E1008 18:59:06.085323 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7727d47a-ef27-45bd-9dff-e2d395d96928" containerName="registry-server" Oct 08 18:59:06 crc kubenswrapper[4750]: I1008 18:59:06.085330 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="7727d47a-ef27-45bd-9dff-e2d395d96928" containerName="registry-server" Oct 08 18:59:06 crc kubenswrapper[4750]: E1008 18:59:06.085348 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7727d47a-ef27-45bd-9dff-e2d395d96928" containerName="extract-content" Oct 08 18:59:06 crc kubenswrapper[4750]: I1008 18:59:06.085354 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="7727d47a-ef27-45bd-9dff-e2d395d96928" containerName="extract-content" Oct 08 18:59:06 crc kubenswrapper[4750]: I1008 18:59:06.085475 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="7727d47a-ef27-45bd-9dff-e2d395d96928" containerName="registry-server" Oct 08 18:59:06 crc kubenswrapper[4750]: I1008 18:59:06.086611 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s8g5g" Oct 08 18:59:06 crc kubenswrapper[4750]: I1008 18:59:06.138324 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s8g5g"] Oct 08 18:59:06 crc kubenswrapper[4750]: I1008 18:59:06.211520 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wp88\" (UniqueName: \"kubernetes.io/projected/bb93704e-716a-4fcb-bd76-c774cf4f5bbc-kube-api-access-9wp88\") pod \"redhat-marketplace-s8g5g\" (UID: \"bb93704e-716a-4fcb-bd76-c774cf4f5bbc\") " pod="openshift-marketplace/redhat-marketplace-s8g5g" Oct 08 18:59:06 crc kubenswrapper[4750]: I1008 18:59:06.211592 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb93704e-716a-4fcb-bd76-c774cf4f5bbc-catalog-content\") pod \"redhat-marketplace-s8g5g\" (UID: \"bb93704e-716a-4fcb-bd76-c774cf4f5bbc\") " pod="openshift-marketplace/redhat-marketplace-s8g5g" Oct 08 18:59:06 crc kubenswrapper[4750]: I1008 18:59:06.211789 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb93704e-716a-4fcb-bd76-c774cf4f5bbc-utilities\") pod \"redhat-marketplace-s8g5g\" (UID: \"bb93704e-716a-4fcb-bd76-c774cf4f5bbc\") " pod="openshift-marketplace/redhat-marketplace-s8g5g" Oct 08 18:59:06 crc kubenswrapper[4750]: I1008 18:59:06.290944 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-p7cgs"] Oct 08 18:59:06 crc kubenswrapper[4750]: I1008 18:59:06.292728 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p7cgs" Oct 08 18:59:06 crc kubenswrapper[4750]: I1008 18:59:06.302466 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p7cgs"] Oct 08 18:59:06 crc kubenswrapper[4750]: I1008 18:59:06.313305 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wp88\" (UniqueName: \"kubernetes.io/projected/bb93704e-716a-4fcb-bd76-c774cf4f5bbc-kube-api-access-9wp88\") pod \"redhat-marketplace-s8g5g\" (UID: \"bb93704e-716a-4fcb-bd76-c774cf4f5bbc\") " pod="openshift-marketplace/redhat-marketplace-s8g5g" Oct 08 18:59:06 crc kubenswrapper[4750]: I1008 18:59:06.313352 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb93704e-716a-4fcb-bd76-c774cf4f5bbc-catalog-content\") pod \"redhat-marketplace-s8g5g\" (UID: \"bb93704e-716a-4fcb-bd76-c774cf4f5bbc\") " pod="openshift-marketplace/redhat-marketplace-s8g5g" Oct 08 18:59:06 crc kubenswrapper[4750]: I1008 18:59:06.313425 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb93704e-716a-4fcb-bd76-c774cf4f5bbc-utilities\") pod \"redhat-marketplace-s8g5g\" (UID: \"bb93704e-716a-4fcb-bd76-c774cf4f5bbc\") " pod="openshift-marketplace/redhat-marketplace-s8g5g" Oct 08 18:59:06 crc kubenswrapper[4750]: I1008 18:59:06.314010 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb93704e-716a-4fcb-bd76-c774cf4f5bbc-utilities\") pod \"redhat-marketplace-s8g5g\" (UID: \"bb93704e-716a-4fcb-bd76-c774cf4f5bbc\") " pod="openshift-marketplace/redhat-marketplace-s8g5g" Oct 08 18:59:06 crc kubenswrapper[4750]: I1008 18:59:06.314562 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/bb93704e-716a-4fcb-bd76-c774cf4f5bbc-catalog-content\") pod \"redhat-marketplace-s8g5g\" (UID: \"bb93704e-716a-4fcb-bd76-c774cf4f5bbc\") " pod="openshift-marketplace/redhat-marketplace-s8g5g" Oct 08 18:59:06 crc kubenswrapper[4750]: I1008 18:59:06.336633 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wp88\" (UniqueName: \"kubernetes.io/projected/bb93704e-716a-4fcb-bd76-c774cf4f5bbc-kube-api-access-9wp88\") pod \"redhat-marketplace-s8g5g\" (UID: \"bb93704e-716a-4fcb-bd76-c774cf4f5bbc\") " pod="openshift-marketplace/redhat-marketplace-s8g5g" Oct 08 18:59:06 crc kubenswrapper[4750]: I1008 18:59:06.408060 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s8g5g" Oct 08 18:59:06 crc kubenswrapper[4750]: I1008 18:59:06.415762 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4082c546-c63e-4711-a789-f4695bf3000d-catalog-content\") pod \"redhat-operators-p7cgs\" (UID: \"4082c546-c63e-4711-a789-f4695bf3000d\") " pod="openshift-marketplace/redhat-operators-p7cgs" Oct 08 18:59:06 crc kubenswrapper[4750]: I1008 18:59:06.415813 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4082c546-c63e-4711-a789-f4695bf3000d-utilities\") pod \"redhat-operators-p7cgs\" (UID: \"4082c546-c63e-4711-a789-f4695bf3000d\") " pod="openshift-marketplace/redhat-operators-p7cgs" Oct 08 18:59:06 crc kubenswrapper[4750]: I1008 18:59:06.415848 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svk4b\" (UniqueName: \"kubernetes.io/projected/4082c546-c63e-4711-a789-f4695bf3000d-kube-api-access-svk4b\") pod \"redhat-operators-p7cgs\" (UID: \"4082c546-c63e-4711-a789-f4695bf3000d\") " 
pod="openshift-marketplace/redhat-operators-p7cgs" Oct 08 18:59:06 crc kubenswrapper[4750]: I1008 18:59:06.516743 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4082c546-c63e-4711-a789-f4695bf3000d-catalog-content\") pod \"redhat-operators-p7cgs\" (UID: \"4082c546-c63e-4711-a789-f4695bf3000d\") " pod="openshift-marketplace/redhat-operators-p7cgs" Oct 08 18:59:06 crc kubenswrapper[4750]: I1008 18:59:06.517053 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4082c546-c63e-4711-a789-f4695bf3000d-utilities\") pod \"redhat-operators-p7cgs\" (UID: \"4082c546-c63e-4711-a789-f4695bf3000d\") " pod="openshift-marketplace/redhat-operators-p7cgs" Oct 08 18:59:06 crc kubenswrapper[4750]: I1008 18:59:06.517107 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svk4b\" (UniqueName: \"kubernetes.io/projected/4082c546-c63e-4711-a789-f4695bf3000d-kube-api-access-svk4b\") pod \"redhat-operators-p7cgs\" (UID: \"4082c546-c63e-4711-a789-f4695bf3000d\") " pod="openshift-marketplace/redhat-operators-p7cgs" Oct 08 18:59:06 crc kubenswrapper[4750]: I1008 18:59:06.517667 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4082c546-c63e-4711-a789-f4695bf3000d-catalog-content\") pod \"redhat-operators-p7cgs\" (UID: \"4082c546-c63e-4711-a789-f4695bf3000d\") " pod="openshift-marketplace/redhat-operators-p7cgs" Oct 08 18:59:06 crc kubenswrapper[4750]: I1008 18:59:06.517876 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4082c546-c63e-4711-a789-f4695bf3000d-utilities\") pod \"redhat-operators-p7cgs\" (UID: \"4082c546-c63e-4711-a789-f4695bf3000d\") " pod="openshift-marketplace/redhat-operators-p7cgs" Oct 08 18:59:06 crc 
kubenswrapper[4750]: I1008 18:59:06.538737 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svk4b\" (UniqueName: \"kubernetes.io/projected/4082c546-c63e-4711-a789-f4695bf3000d-kube-api-access-svk4b\") pod \"redhat-operators-p7cgs\" (UID: \"4082c546-c63e-4711-a789-f4695bf3000d\") " pod="openshift-marketplace/redhat-operators-p7cgs" Oct 08 18:59:06 crc kubenswrapper[4750]: I1008 18:59:06.608623 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p7cgs" Oct 08 18:59:06 crc kubenswrapper[4750]: I1008 18:59:06.838831 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s8g5g"] Oct 08 18:59:07 crc kubenswrapper[4750]: I1008 18:59:07.051335 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p7cgs"] Oct 08 18:59:07 crc kubenswrapper[4750]: W1008 18:59:07.101473 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4082c546_c63e_4711_a789_f4695bf3000d.slice/crio-852caa8cea880badf69803d12ce459cf5695d49415e9621de47882bdd0f916c4 WatchSource:0}: Error finding container 852caa8cea880badf69803d12ce459cf5695d49415e9621de47882bdd0f916c4: Status 404 returned error can't find the container with id 852caa8cea880badf69803d12ce459cf5695d49415e9621de47882bdd0f916c4 Oct 08 18:59:07 crc kubenswrapper[4750]: I1008 18:59:07.749009 4750 generic.go:334] "Generic (PLEG): container finished" podID="bb93704e-716a-4fcb-bd76-c774cf4f5bbc" containerID="9d8b9afa4557a89b0dd8ddd0f38d081a71f1956731fbcff9f5cda8fc028005db" exitCode=0 Oct 08 18:59:07 crc kubenswrapper[4750]: I1008 18:59:07.749125 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s8g5g" 
event={"ID":"bb93704e-716a-4fcb-bd76-c774cf4f5bbc","Type":"ContainerDied","Data":"9d8b9afa4557a89b0dd8ddd0f38d081a71f1956731fbcff9f5cda8fc028005db"} Oct 08 18:59:07 crc kubenswrapper[4750]: I1008 18:59:07.749158 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s8g5g" event={"ID":"bb93704e-716a-4fcb-bd76-c774cf4f5bbc","Type":"ContainerStarted","Data":"2ddfb93b483313705e44d2c39c01f3c549065067b9add09eac9400b77c80da1e"} Oct 08 18:59:07 crc kubenswrapper[4750]: I1008 18:59:07.750687 4750 generic.go:334] "Generic (PLEG): container finished" podID="4082c546-c63e-4711-a789-f4695bf3000d" containerID="ba507c460f7fe678d9be64def267ca2900af82d761eff57234f407707de63856" exitCode=0 Oct 08 18:59:07 crc kubenswrapper[4750]: I1008 18:59:07.750726 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p7cgs" event={"ID":"4082c546-c63e-4711-a789-f4695bf3000d","Type":"ContainerDied","Data":"ba507c460f7fe678d9be64def267ca2900af82d761eff57234f407707de63856"} Oct 08 18:59:07 crc kubenswrapper[4750]: I1008 18:59:07.750747 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p7cgs" event={"ID":"4082c546-c63e-4711-a789-f4695bf3000d","Type":"ContainerStarted","Data":"852caa8cea880badf69803d12ce459cf5695d49415e9621de47882bdd0f916c4"} Oct 08 18:59:08 crc kubenswrapper[4750]: I1008 18:59:08.760036 4750 generic.go:334] "Generic (PLEG): container finished" podID="bb93704e-716a-4fcb-bd76-c774cf4f5bbc" containerID="57167de4593942d2629e5af796a32afe64e2419e515576fe6b5c892dd4ee788b" exitCode=0 Oct 08 18:59:08 crc kubenswrapper[4750]: I1008 18:59:08.760285 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s8g5g" event={"ID":"bb93704e-716a-4fcb-bd76-c774cf4f5bbc","Type":"ContainerDied","Data":"57167de4593942d2629e5af796a32afe64e2419e515576fe6b5c892dd4ee788b"} Oct 08 18:59:08 crc kubenswrapper[4750]: I1008 
18:59:08.763101 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p7cgs" event={"ID":"4082c546-c63e-4711-a789-f4695bf3000d","Type":"ContainerStarted","Data":"e7302f049c311e73b2cfe1f6c3650a86dab3d8e4086207e43ab81efef1bdd6d5"} Oct 08 18:59:09 crc kubenswrapper[4750]: I1008 18:59:09.771731 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s8g5g" event={"ID":"bb93704e-716a-4fcb-bd76-c774cf4f5bbc","Type":"ContainerStarted","Data":"f6118b32ff9661428fd2d4870590b094583842b05407d170d6e9ac3e710b3eaa"} Oct 08 18:59:09 crc kubenswrapper[4750]: I1008 18:59:09.775592 4750 generic.go:334] "Generic (PLEG): container finished" podID="4082c546-c63e-4711-a789-f4695bf3000d" containerID="e7302f049c311e73b2cfe1f6c3650a86dab3d8e4086207e43ab81efef1bdd6d5" exitCode=0 Oct 08 18:59:09 crc kubenswrapper[4750]: I1008 18:59:09.775621 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p7cgs" event={"ID":"4082c546-c63e-4711-a789-f4695bf3000d","Type":"ContainerDied","Data":"e7302f049c311e73b2cfe1f6c3650a86dab3d8e4086207e43ab81efef1bdd6d5"} Oct 08 18:59:09 crc kubenswrapper[4750]: I1008 18:59:09.793926 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-s8g5g" podStartSLOduration=2.210160594 podStartE2EDuration="3.793908579s" podCreationTimestamp="2025-10-08 18:59:06 +0000 UTC" firstStartedPulling="2025-10-08 18:59:07.750903266 +0000 UTC m=+2903.663874279" lastFinishedPulling="2025-10-08 18:59:09.334651251 +0000 UTC m=+2905.247622264" observedRunningTime="2025-10-08 18:59:09.78942939 +0000 UTC m=+2905.702400423" watchObservedRunningTime="2025-10-08 18:59:09.793908579 +0000 UTC m=+2905.706879592" Oct 08 18:59:10 crc kubenswrapper[4750]: I1008 18:59:10.784040 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p7cgs" 
event={"ID":"4082c546-c63e-4711-a789-f4695bf3000d","Type":"ContainerStarted","Data":"d743e06b7b5b6acaf7e248c2c4d4b92c7fd2ba2c5398bcdaa3e687fd475e7eeb"} Oct 08 18:59:10 crc kubenswrapper[4750]: I1008 18:59:10.805601 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-p7cgs" podStartSLOduration=2.338788549 podStartE2EDuration="4.805582255s" podCreationTimestamp="2025-10-08 18:59:06 +0000 UTC" firstStartedPulling="2025-10-08 18:59:07.751843158 +0000 UTC m=+2903.664814171" lastFinishedPulling="2025-10-08 18:59:10.218636864 +0000 UTC m=+2906.131607877" observedRunningTime="2025-10-08 18:59:10.798040471 +0000 UTC m=+2906.711011484" watchObservedRunningTime="2025-10-08 18:59:10.805582255 +0000 UTC m=+2906.718553278" Oct 08 18:59:16 crc kubenswrapper[4750]: I1008 18:59:16.409004 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-s8g5g" Oct 08 18:59:16 crc kubenswrapper[4750]: I1008 18:59:16.409525 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-s8g5g" Oct 08 18:59:16 crc kubenswrapper[4750]: I1008 18:59:16.456752 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-s8g5g" Oct 08 18:59:16 crc kubenswrapper[4750]: I1008 18:59:16.609832 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-p7cgs" Oct 08 18:59:16 crc kubenswrapper[4750]: I1008 18:59:16.609900 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-p7cgs" Oct 08 18:59:16 crc kubenswrapper[4750]: I1008 18:59:16.681178 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-p7cgs" Oct 08 18:59:16 crc kubenswrapper[4750]: I1008 18:59:16.874778 4750 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-p7cgs" Oct 08 18:59:16 crc kubenswrapper[4750]: I1008 18:59:16.884785 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-s8g5g" Oct 08 18:59:18 crc kubenswrapper[4750]: I1008 18:59:18.674271 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p7cgs"] Oct 08 18:59:18 crc kubenswrapper[4750]: I1008 18:59:18.847034 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-p7cgs" podUID="4082c546-c63e-4711-a789-f4695bf3000d" containerName="registry-server" containerID="cri-o://d743e06b7b5b6acaf7e248c2c4d4b92c7fd2ba2c5398bcdaa3e687fd475e7eeb" gracePeriod=2 Oct 08 18:59:19 crc kubenswrapper[4750]: I1008 18:59:19.214259 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p7cgs" Oct 08 18:59:19 crc kubenswrapper[4750]: I1008 18:59:19.275096 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s8g5g"] Oct 08 18:59:19 crc kubenswrapper[4750]: I1008 18:59:19.275331 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-s8g5g" podUID="bb93704e-716a-4fcb-bd76-c774cf4f5bbc" containerName="registry-server" containerID="cri-o://f6118b32ff9661428fd2d4870590b094583842b05407d170d6e9ac3e710b3eaa" gracePeriod=2 Oct 08 18:59:19 crc kubenswrapper[4750]: I1008 18:59:19.408681 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svk4b\" (UniqueName: \"kubernetes.io/projected/4082c546-c63e-4711-a789-f4695bf3000d-kube-api-access-svk4b\") pod \"4082c546-c63e-4711-a789-f4695bf3000d\" (UID: \"4082c546-c63e-4711-a789-f4695bf3000d\") " Oct 08 18:59:19 crc kubenswrapper[4750]: I1008 18:59:19.408884 
4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4082c546-c63e-4711-a789-f4695bf3000d-utilities\") pod \"4082c546-c63e-4711-a789-f4695bf3000d\" (UID: \"4082c546-c63e-4711-a789-f4695bf3000d\") " Oct 08 18:59:19 crc kubenswrapper[4750]: I1008 18:59:19.408940 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4082c546-c63e-4711-a789-f4695bf3000d-catalog-content\") pod \"4082c546-c63e-4711-a789-f4695bf3000d\" (UID: \"4082c546-c63e-4711-a789-f4695bf3000d\") " Oct 08 18:59:19 crc kubenswrapper[4750]: I1008 18:59:19.410168 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4082c546-c63e-4711-a789-f4695bf3000d-utilities" (OuterVolumeSpecName: "utilities") pod "4082c546-c63e-4711-a789-f4695bf3000d" (UID: "4082c546-c63e-4711-a789-f4695bf3000d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:59:19 crc kubenswrapper[4750]: I1008 18:59:19.416074 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4082c546-c63e-4711-a789-f4695bf3000d-kube-api-access-svk4b" (OuterVolumeSpecName: "kube-api-access-svk4b") pod "4082c546-c63e-4711-a789-f4695bf3000d" (UID: "4082c546-c63e-4711-a789-f4695bf3000d"). InnerVolumeSpecName "kube-api-access-svk4b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:59:19 crc kubenswrapper[4750]: I1008 18:59:19.511585 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4082c546-c63e-4711-a789-f4695bf3000d-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 18:59:19 crc kubenswrapper[4750]: I1008 18:59:19.511646 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svk4b\" (UniqueName: \"kubernetes.io/projected/4082c546-c63e-4711-a789-f4695bf3000d-kube-api-access-svk4b\") on node \"crc\" DevicePath \"\"" Oct 08 18:59:19 crc kubenswrapper[4750]: I1008 18:59:19.513993 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4082c546-c63e-4711-a789-f4695bf3000d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4082c546-c63e-4711-a789-f4695bf3000d" (UID: "4082c546-c63e-4711-a789-f4695bf3000d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:59:19 crc kubenswrapper[4750]: I1008 18:59:19.603700 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s8g5g" Oct 08 18:59:19 crc kubenswrapper[4750]: I1008 18:59:19.613433 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4082c546-c63e-4711-a789-f4695bf3000d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 18:59:19 crc kubenswrapper[4750]: I1008 18:59:19.714142 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb93704e-716a-4fcb-bd76-c774cf4f5bbc-utilities\") pod \"bb93704e-716a-4fcb-bd76-c774cf4f5bbc\" (UID: \"bb93704e-716a-4fcb-bd76-c774cf4f5bbc\") " Oct 08 18:59:19 crc kubenswrapper[4750]: I1008 18:59:19.714207 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb93704e-716a-4fcb-bd76-c774cf4f5bbc-catalog-content\") pod \"bb93704e-716a-4fcb-bd76-c774cf4f5bbc\" (UID: \"bb93704e-716a-4fcb-bd76-c774cf4f5bbc\") " Oct 08 18:59:19 crc kubenswrapper[4750]: I1008 18:59:19.714379 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wp88\" (UniqueName: \"kubernetes.io/projected/bb93704e-716a-4fcb-bd76-c774cf4f5bbc-kube-api-access-9wp88\") pod \"bb93704e-716a-4fcb-bd76-c774cf4f5bbc\" (UID: \"bb93704e-716a-4fcb-bd76-c774cf4f5bbc\") " Oct 08 18:59:19 crc kubenswrapper[4750]: I1008 18:59:19.715343 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb93704e-716a-4fcb-bd76-c774cf4f5bbc-utilities" (OuterVolumeSpecName: "utilities") pod "bb93704e-716a-4fcb-bd76-c774cf4f5bbc" (UID: "bb93704e-716a-4fcb-bd76-c774cf4f5bbc"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:59:19 crc kubenswrapper[4750]: I1008 18:59:19.717453 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb93704e-716a-4fcb-bd76-c774cf4f5bbc-kube-api-access-9wp88" (OuterVolumeSpecName: "kube-api-access-9wp88") pod "bb93704e-716a-4fcb-bd76-c774cf4f5bbc" (UID: "bb93704e-716a-4fcb-bd76-c774cf4f5bbc"). InnerVolumeSpecName "kube-api-access-9wp88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 18:59:19 crc kubenswrapper[4750]: I1008 18:59:19.730868 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb93704e-716a-4fcb-bd76-c774cf4f5bbc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bb93704e-716a-4fcb-bd76-c774cf4f5bbc" (UID: "bb93704e-716a-4fcb-bd76-c774cf4f5bbc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 18:59:19 crc kubenswrapper[4750]: I1008 18:59:19.815536 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wp88\" (UniqueName: \"kubernetes.io/projected/bb93704e-716a-4fcb-bd76-c774cf4f5bbc-kube-api-access-9wp88\") on node \"crc\" DevicePath \"\"" Oct 08 18:59:19 crc kubenswrapper[4750]: I1008 18:59:19.815580 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb93704e-716a-4fcb-bd76-c774cf4f5bbc-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 18:59:19 crc kubenswrapper[4750]: I1008 18:59:19.815601 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb93704e-716a-4fcb-bd76-c774cf4f5bbc-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 18:59:19 crc kubenswrapper[4750]: I1008 18:59:19.857708 4750 generic.go:334] "Generic (PLEG): container finished" podID="bb93704e-716a-4fcb-bd76-c774cf4f5bbc" 
containerID="f6118b32ff9661428fd2d4870590b094583842b05407d170d6e9ac3e710b3eaa" exitCode=0 Oct 08 18:59:19 crc kubenswrapper[4750]: I1008 18:59:19.857803 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s8g5g" Oct 08 18:59:19 crc kubenswrapper[4750]: I1008 18:59:19.857798 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s8g5g" event={"ID":"bb93704e-716a-4fcb-bd76-c774cf4f5bbc","Type":"ContainerDied","Data":"f6118b32ff9661428fd2d4870590b094583842b05407d170d6e9ac3e710b3eaa"} Oct 08 18:59:19 crc kubenswrapper[4750]: I1008 18:59:19.858106 4750 scope.go:117] "RemoveContainer" containerID="f6118b32ff9661428fd2d4870590b094583842b05407d170d6e9ac3e710b3eaa" Oct 08 18:59:19 crc kubenswrapper[4750]: I1008 18:59:19.867748 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s8g5g" event={"ID":"bb93704e-716a-4fcb-bd76-c774cf4f5bbc","Type":"ContainerDied","Data":"2ddfb93b483313705e44d2c39c01f3c549065067b9add09eac9400b77c80da1e"} Oct 08 18:59:19 crc kubenswrapper[4750]: I1008 18:59:19.870835 4750 generic.go:334] "Generic (PLEG): container finished" podID="4082c546-c63e-4711-a789-f4695bf3000d" containerID="d743e06b7b5b6acaf7e248c2c4d4b92c7fd2ba2c5398bcdaa3e687fd475e7eeb" exitCode=0 Oct 08 18:59:19 crc kubenswrapper[4750]: I1008 18:59:19.870886 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p7cgs" event={"ID":"4082c546-c63e-4711-a789-f4695bf3000d","Type":"ContainerDied","Data":"d743e06b7b5b6acaf7e248c2c4d4b92c7fd2ba2c5398bcdaa3e687fd475e7eeb"} Oct 08 18:59:19 crc kubenswrapper[4750]: I1008 18:59:19.870910 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p7cgs" Oct 08 18:59:19 crc kubenswrapper[4750]: I1008 18:59:19.870917 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p7cgs" event={"ID":"4082c546-c63e-4711-a789-f4695bf3000d","Type":"ContainerDied","Data":"852caa8cea880badf69803d12ce459cf5695d49415e9621de47882bdd0f916c4"} Oct 08 18:59:19 crc kubenswrapper[4750]: I1008 18:59:19.875595 4750 scope.go:117] "RemoveContainer" containerID="57167de4593942d2629e5af796a32afe64e2419e515576fe6b5c892dd4ee788b" Oct 08 18:59:19 crc kubenswrapper[4750]: I1008 18:59:19.898937 4750 scope.go:117] "RemoveContainer" containerID="9d8b9afa4557a89b0dd8ddd0f38d081a71f1956731fbcff9f5cda8fc028005db" Oct 08 18:59:19 crc kubenswrapper[4750]: I1008 18:59:19.922103 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s8g5g"] Oct 08 18:59:19 crc kubenswrapper[4750]: I1008 18:59:19.933336 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-s8g5g"] Oct 08 18:59:19 crc kubenswrapper[4750]: I1008 18:59:19.938859 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p7cgs"] Oct 08 18:59:19 crc kubenswrapper[4750]: I1008 18:59:19.939146 4750 scope.go:117] "RemoveContainer" containerID="f6118b32ff9661428fd2d4870590b094583842b05407d170d6e9ac3e710b3eaa" Oct 08 18:59:19 crc kubenswrapper[4750]: E1008 18:59:19.939587 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6118b32ff9661428fd2d4870590b094583842b05407d170d6e9ac3e710b3eaa\": container with ID starting with f6118b32ff9661428fd2d4870590b094583842b05407d170d6e9ac3e710b3eaa not found: ID does not exist" containerID="f6118b32ff9661428fd2d4870590b094583842b05407d170d6e9ac3e710b3eaa" Oct 08 18:59:19 crc kubenswrapper[4750]: I1008 18:59:19.939618 4750 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6118b32ff9661428fd2d4870590b094583842b05407d170d6e9ac3e710b3eaa"} err="failed to get container status \"f6118b32ff9661428fd2d4870590b094583842b05407d170d6e9ac3e710b3eaa\": rpc error: code = NotFound desc = could not find container \"f6118b32ff9661428fd2d4870590b094583842b05407d170d6e9ac3e710b3eaa\": container with ID starting with f6118b32ff9661428fd2d4870590b094583842b05407d170d6e9ac3e710b3eaa not found: ID does not exist" Oct 08 18:59:19 crc kubenswrapper[4750]: I1008 18:59:19.939640 4750 scope.go:117] "RemoveContainer" containerID="57167de4593942d2629e5af796a32afe64e2419e515576fe6b5c892dd4ee788b" Oct 08 18:59:19 crc kubenswrapper[4750]: E1008 18:59:19.939978 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57167de4593942d2629e5af796a32afe64e2419e515576fe6b5c892dd4ee788b\": container with ID starting with 57167de4593942d2629e5af796a32afe64e2419e515576fe6b5c892dd4ee788b not found: ID does not exist" containerID="57167de4593942d2629e5af796a32afe64e2419e515576fe6b5c892dd4ee788b" Oct 08 18:59:19 crc kubenswrapper[4750]: I1008 18:59:19.940010 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57167de4593942d2629e5af796a32afe64e2419e515576fe6b5c892dd4ee788b"} err="failed to get container status \"57167de4593942d2629e5af796a32afe64e2419e515576fe6b5c892dd4ee788b\": rpc error: code = NotFound desc = could not find container \"57167de4593942d2629e5af796a32afe64e2419e515576fe6b5c892dd4ee788b\": container with ID starting with 57167de4593942d2629e5af796a32afe64e2419e515576fe6b5c892dd4ee788b not found: ID does not exist" Oct 08 18:59:19 crc kubenswrapper[4750]: I1008 18:59:19.940029 4750 scope.go:117] "RemoveContainer" containerID="9d8b9afa4557a89b0dd8ddd0f38d081a71f1956731fbcff9f5cda8fc028005db" Oct 08 18:59:19 crc kubenswrapper[4750]: E1008 
18:59:19.940262 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d8b9afa4557a89b0dd8ddd0f38d081a71f1956731fbcff9f5cda8fc028005db\": container with ID starting with 9d8b9afa4557a89b0dd8ddd0f38d081a71f1956731fbcff9f5cda8fc028005db not found: ID does not exist" containerID="9d8b9afa4557a89b0dd8ddd0f38d081a71f1956731fbcff9f5cda8fc028005db" Oct 08 18:59:19 crc kubenswrapper[4750]: I1008 18:59:19.940283 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d8b9afa4557a89b0dd8ddd0f38d081a71f1956731fbcff9f5cda8fc028005db"} err="failed to get container status \"9d8b9afa4557a89b0dd8ddd0f38d081a71f1956731fbcff9f5cda8fc028005db\": rpc error: code = NotFound desc = could not find container \"9d8b9afa4557a89b0dd8ddd0f38d081a71f1956731fbcff9f5cda8fc028005db\": container with ID starting with 9d8b9afa4557a89b0dd8ddd0f38d081a71f1956731fbcff9f5cda8fc028005db not found: ID does not exist" Oct 08 18:59:19 crc kubenswrapper[4750]: I1008 18:59:19.940297 4750 scope.go:117] "RemoveContainer" containerID="d743e06b7b5b6acaf7e248c2c4d4b92c7fd2ba2c5398bcdaa3e687fd475e7eeb" Oct 08 18:59:19 crc kubenswrapper[4750]: I1008 18:59:19.943584 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-p7cgs"] Oct 08 18:59:19 crc kubenswrapper[4750]: I1008 18:59:19.953050 4750 scope.go:117] "RemoveContainer" containerID="e7302f049c311e73b2cfe1f6c3650a86dab3d8e4086207e43ab81efef1bdd6d5" Oct 08 18:59:19 crc kubenswrapper[4750]: I1008 18:59:19.968991 4750 scope.go:117] "RemoveContainer" containerID="ba507c460f7fe678d9be64def267ca2900af82d761eff57234f407707de63856" Oct 08 18:59:20 crc kubenswrapper[4750]: I1008 18:59:20.009205 4750 scope.go:117] "RemoveContainer" containerID="d743e06b7b5b6acaf7e248c2c4d4b92c7fd2ba2c5398bcdaa3e687fd475e7eeb" Oct 08 18:59:20 crc kubenswrapper[4750]: E1008 18:59:20.009604 4750 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d743e06b7b5b6acaf7e248c2c4d4b92c7fd2ba2c5398bcdaa3e687fd475e7eeb\": container with ID starting with d743e06b7b5b6acaf7e248c2c4d4b92c7fd2ba2c5398bcdaa3e687fd475e7eeb not found: ID does not exist" containerID="d743e06b7b5b6acaf7e248c2c4d4b92c7fd2ba2c5398bcdaa3e687fd475e7eeb" Oct 08 18:59:20 crc kubenswrapper[4750]: I1008 18:59:20.009637 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d743e06b7b5b6acaf7e248c2c4d4b92c7fd2ba2c5398bcdaa3e687fd475e7eeb"} err="failed to get container status \"d743e06b7b5b6acaf7e248c2c4d4b92c7fd2ba2c5398bcdaa3e687fd475e7eeb\": rpc error: code = NotFound desc = could not find container \"d743e06b7b5b6acaf7e248c2c4d4b92c7fd2ba2c5398bcdaa3e687fd475e7eeb\": container with ID starting with d743e06b7b5b6acaf7e248c2c4d4b92c7fd2ba2c5398bcdaa3e687fd475e7eeb not found: ID does not exist" Oct 08 18:59:20 crc kubenswrapper[4750]: I1008 18:59:20.009657 4750 scope.go:117] "RemoveContainer" containerID="e7302f049c311e73b2cfe1f6c3650a86dab3d8e4086207e43ab81efef1bdd6d5" Oct 08 18:59:20 crc kubenswrapper[4750]: E1008 18:59:20.009905 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7302f049c311e73b2cfe1f6c3650a86dab3d8e4086207e43ab81efef1bdd6d5\": container with ID starting with e7302f049c311e73b2cfe1f6c3650a86dab3d8e4086207e43ab81efef1bdd6d5 not found: ID does not exist" containerID="e7302f049c311e73b2cfe1f6c3650a86dab3d8e4086207e43ab81efef1bdd6d5" Oct 08 18:59:20 crc kubenswrapper[4750]: I1008 18:59:20.009986 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7302f049c311e73b2cfe1f6c3650a86dab3d8e4086207e43ab81efef1bdd6d5"} err="failed to get container status \"e7302f049c311e73b2cfe1f6c3650a86dab3d8e4086207e43ab81efef1bdd6d5\": rpc error: code = NotFound desc = could not find container 
\"e7302f049c311e73b2cfe1f6c3650a86dab3d8e4086207e43ab81efef1bdd6d5\": container with ID starting with e7302f049c311e73b2cfe1f6c3650a86dab3d8e4086207e43ab81efef1bdd6d5 not found: ID does not exist" Oct 08 18:59:20 crc kubenswrapper[4750]: I1008 18:59:20.010020 4750 scope.go:117] "RemoveContainer" containerID="ba507c460f7fe678d9be64def267ca2900af82d761eff57234f407707de63856" Oct 08 18:59:20 crc kubenswrapper[4750]: E1008 18:59:20.010671 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba507c460f7fe678d9be64def267ca2900af82d761eff57234f407707de63856\": container with ID starting with ba507c460f7fe678d9be64def267ca2900af82d761eff57234f407707de63856 not found: ID does not exist" containerID="ba507c460f7fe678d9be64def267ca2900af82d761eff57234f407707de63856" Oct 08 18:59:20 crc kubenswrapper[4750]: I1008 18:59:20.010701 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba507c460f7fe678d9be64def267ca2900af82d761eff57234f407707de63856"} err="failed to get container status \"ba507c460f7fe678d9be64def267ca2900af82d761eff57234f407707de63856\": rpc error: code = NotFound desc = could not find container \"ba507c460f7fe678d9be64def267ca2900af82d761eff57234f407707de63856\": container with ID starting with ba507c460f7fe678d9be64def267ca2900af82d761eff57234f407707de63856 not found: ID does not exist" Oct 08 18:59:20 crc kubenswrapper[4750]: I1008 18:59:20.742844 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4082c546-c63e-4711-a789-f4695bf3000d" path="/var/lib/kubelet/pods/4082c546-c63e-4711-a789-f4695bf3000d/volumes" Oct 08 18:59:20 crc kubenswrapper[4750]: I1008 18:59:20.743566 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb93704e-716a-4fcb-bd76-c774cf4f5bbc" path="/var/lib/kubelet/pods/bb93704e-716a-4fcb-bd76-c774cf4f5bbc/volumes" Oct 08 18:59:29 crc kubenswrapper[4750]: I1008 18:59:29.707629 
4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 18:59:29 crc kubenswrapper[4750]: I1008 18:59:29.708310 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 18:59:59 crc kubenswrapper[4750]: I1008 18:59:59.707307 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 18:59:59 crc kubenswrapper[4750]: I1008 18:59:59.707849 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 18:59:59 crc kubenswrapper[4750]: I1008 18:59:59.707900 4750 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-grddb" Oct 08 18:59:59 crc kubenswrapper[4750]: I1008 18:59:59.708569 4750 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"436e6cbb69c758f2d6ea6ab45cdf55c57954b7c7c767a43627a47a0e76aa0fea"} pod="openshift-machine-config-operator/machine-config-daemon-grddb" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 18:59:59 crc kubenswrapper[4750]: I1008 18:59:59.708656 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" containerID="cri-o://436e6cbb69c758f2d6ea6ab45cdf55c57954b7c7c767a43627a47a0e76aa0fea" gracePeriod=600 Oct 08 18:59:59 crc kubenswrapper[4750]: E1008 18:59:59.829108 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:00:00 crc kubenswrapper[4750]: I1008 19:00:00.181372 4750 generic.go:334] "Generic (PLEG): container finished" podID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerID="436e6cbb69c758f2d6ea6ab45cdf55c57954b7c7c767a43627a47a0e76aa0fea" exitCode=0 Oct 08 19:00:00 crc kubenswrapper[4750]: I1008 19:00:00.181434 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerDied","Data":"436e6cbb69c758f2d6ea6ab45cdf55c57954b7c7c767a43627a47a0e76aa0fea"} Oct 08 19:00:00 crc kubenswrapper[4750]: I1008 19:00:00.181817 4750 scope.go:117] "RemoveContainer" containerID="0e54b2391b337111e0ac2d5ea10feb3ef7df04ab9b58e7f0d07862a6a559a504" Oct 08 19:00:00 crc kubenswrapper[4750]: I1008 19:00:00.182443 4750 scope.go:117] "RemoveContainer" containerID="436e6cbb69c758f2d6ea6ab45cdf55c57954b7c7c767a43627a47a0e76aa0fea" Oct 08 19:00:00 crc kubenswrapper[4750]: E1008 19:00:00.182700 4750 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:00:00 crc kubenswrapper[4750]: I1008 19:00:00.185297 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332500-qrbpq"] Oct 08 19:00:00 crc kubenswrapper[4750]: E1008 19:00:00.185636 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb93704e-716a-4fcb-bd76-c774cf4f5bbc" containerName="registry-server" Oct 08 19:00:00 crc kubenswrapper[4750]: I1008 19:00:00.185653 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb93704e-716a-4fcb-bd76-c774cf4f5bbc" containerName="registry-server" Oct 08 19:00:00 crc kubenswrapper[4750]: E1008 19:00:00.185664 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4082c546-c63e-4711-a789-f4695bf3000d" containerName="registry-server" Oct 08 19:00:00 crc kubenswrapper[4750]: I1008 19:00:00.185670 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="4082c546-c63e-4711-a789-f4695bf3000d" containerName="registry-server" Oct 08 19:00:00 crc kubenswrapper[4750]: E1008 19:00:00.185683 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb93704e-716a-4fcb-bd76-c774cf4f5bbc" containerName="extract-utilities" Oct 08 19:00:00 crc kubenswrapper[4750]: I1008 19:00:00.185690 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb93704e-716a-4fcb-bd76-c774cf4f5bbc" containerName="extract-utilities" Oct 08 19:00:00 crc kubenswrapper[4750]: E1008 19:00:00.185703 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4082c546-c63e-4711-a789-f4695bf3000d" 
containerName="extract-utilities" Oct 08 19:00:00 crc kubenswrapper[4750]: I1008 19:00:00.185709 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="4082c546-c63e-4711-a789-f4695bf3000d" containerName="extract-utilities" Oct 08 19:00:00 crc kubenswrapper[4750]: E1008 19:00:00.185738 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb93704e-716a-4fcb-bd76-c774cf4f5bbc" containerName="extract-content" Oct 08 19:00:00 crc kubenswrapper[4750]: I1008 19:00:00.185746 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb93704e-716a-4fcb-bd76-c774cf4f5bbc" containerName="extract-content" Oct 08 19:00:00 crc kubenswrapper[4750]: E1008 19:00:00.185755 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4082c546-c63e-4711-a789-f4695bf3000d" containerName="extract-content" Oct 08 19:00:00 crc kubenswrapper[4750]: I1008 19:00:00.185760 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="4082c546-c63e-4711-a789-f4695bf3000d" containerName="extract-content" Oct 08 19:00:00 crc kubenswrapper[4750]: I1008 19:00:00.185928 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb93704e-716a-4fcb-bd76-c774cf4f5bbc" containerName="registry-server" Oct 08 19:00:00 crc kubenswrapper[4750]: I1008 19:00:00.185940 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="4082c546-c63e-4711-a789-f4695bf3000d" containerName="registry-server" Oct 08 19:00:00 crc kubenswrapper[4750]: I1008 19:00:00.186474 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332500-qrbpq" Oct 08 19:00:00 crc kubenswrapper[4750]: I1008 19:00:00.188687 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8befc693-d921-4d73-88f0-67fde14ff01a-config-volume\") pod \"collect-profiles-29332500-qrbpq\" (UID: \"8befc693-d921-4d73-88f0-67fde14ff01a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332500-qrbpq" Oct 08 19:00:00 crc kubenswrapper[4750]: I1008 19:00:00.188774 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lsdd\" (UniqueName: \"kubernetes.io/projected/8befc693-d921-4d73-88f0-67fde14ff01a-kube-api-access-9lsdd\") pod \"collect-profiles-29332500-qrbpq\" (UID: \"8befc693-d921-4d73-88f0-67fde14ff01a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332500-qrbpq" Oct 08 19:00:00 crc kubenswrapper[4750]: I1008 19:00:00.188793 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8befc693-d921-4d73-88f0-67fde14ff01a-secret-volume\") pod \"collect-profiles-29332500-qrbpq\" (UID: \"8befc693-d921-4d73-88f0-67fde14ff01a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332500-qrbpq" Oct 08 19:00:00 crc kubenswrapper[4750]: I1008 19:00:00.189082 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 08 19:00:00 crc kubenswrapper[4750]: I1008 19:00:00.189490 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 08 19:00:00 crc kubenswrapper[4750]: I1008 19:00:00.195572 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29332500-qrbpq"] Oct 08 19:00:00 crc kubenswrapper[4750]: I1008 19:00:00.290454 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lsdd\" (UniqueName: \"kubernetes.io/projected/8befc693-d921-4d73-88f0-67fde14ff01a-kube-api-access-9lsdd\") pod \"collect-profiles-29332500-qrbpq\" (UID: \"8befc693-d921-4d73-88f0-67fde14ff01a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332500-qrbpq" Oct 08 19:00:00 crc kubenswrapper[4750]: I1008 19:00:00.290770 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8befc693-d921-4d73-88f0-67fde14ff01a-secret-volume\") pod \"collect-profiles-29332500-qrbpq\" (UID: \"8befc693-d921-4d73-88f0-67fde14ff01a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332500-qrbpq" Oct 08 19:00:00 crc kubenswrapper[4750]: I1008 19:00:00.290923 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8befc693-d921-4d73-88f0-67fde14ff01a-config-volume\") pod \"collect-profiles-29332500-qrbpq\" (UID: \"8befc693-d921-4d73-88f0-67fde14ff01a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332500-qrbpq" Oct 08 19:00:00 crc kubenswrapper[4750]: I1008 19:00:00.291702 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8befc693-d921-4d73-88f0-67fde14ff01a-config-volume\") pod \"collect-profiles-29332500-qrbpq\" (UID: \"8befc693-d921-4d73-88f0-67fde14ff01a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332500-qrbpq" Oct 08 19:00:00 crc kubenswrapper[4750]: I1008 19:00:00.297281 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/8befc693-d921-4d73-88f0-67fde14ff01a-secret-volume\") pod \"collect-profiles-29332500-qrbpq\" (UID: \"8befc693-d921-4d73-88f0-67fde14ff01a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332500-qrbpq" Oct 08 19:00:00 crc kubenswrapper[4750]: I1008 19:00:00.309599 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lsdd\" (UniqueName: \"kubernetes.io/projected/8befc693-d921-4d73-88f0-67fde14ff01a-kube-api-access-9lsdd\") pod \"collect-profiles-29332500-qrbpq\" (UID: \"8befc693-d921-4d73-88f0-67fde14ff01a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332500-qrbpq" Oct 08 19:00:00 crc kubenswrapper[4750]: I1008 19:00:00.513354 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332500-qrbpq" Oct 08 19:00:00 crc kubenswrapper[4750]: I1008 19:00:00.939059 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332500-qrbpq"] Oct 08 19:00:01 crc kubenswrapper[4750]: I1008 19:00:01.189542 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332500-qrbpq" event={"ID":"8befc693-d921-4d73-88f0-67fde14ff01a","Type":"ContainerStarted","Data":"53fc4a2a8d58658dfdfbc5b400f01bbbe6f45ba1c383a52dc5a6b732549c1561"} Oct 08 19:00:01 crc kubenswrapper[4750]: I1008 19:00:01.189624 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332500-qrbpq" event={"ID":"8befc693-d921-4d73-88f0-67fde14ff01a","Type":"ContainerStarted","Data":"a11afc6e6c86ec4c829bb4156e466dd969ff298f07b4d910b6224f927d5c56e0"} Oct 08 19:00:01 crc kubenswrapper[4750]: I1008 19:00:01.214131 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29332500-qrbpq" 
podStartSLOduration=1.214112497 podStartE2EDuration="1.214112497s" podCreationTimestamp="2025-10-08 19:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:00:01.206248484 +0000 UTC m=+2957.119219517" watchObservedRunningTime="2025-10-08 19:00:01.214112497 +0000 UTC m=+2957.127083510" Oct 08 19:00:02 crc kubenswrapper[4750]: I1008 19:00:02.219450 4750 generic.go:334] "Generic (PLEG): container finished" podID="8befc693-d921-4d73-88f0-67fde14ff01a" containerID="53fc4a2a8d58658dfdfbc5b400f01bbbe6f45ba1c383a52dc5a6b732549c1561" exitCode=0 Oct 08 19:00:02 crc kubenswrapper[4750]: I1008 19:00:02.219517 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332500-qrbpq" event={"ID":"8befc693-d921-4d73-88f0-67fde14ff01a","Type":"ContainerDied","Data":"53fc4a2a8d58658dfdfbc5b400f01bbbe6f45ba1c383a52dc5a6b732549c1561"} Oct 08 19:00:03 crc kubenswrapper[4750]: I1008 19:00:03.451026 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332500-qrbpq" Oct 08 19:00:03 crc kubenswrapper[4750]: I1008 19:00:03.634629 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lsdd\" (UniqueName: \"kubernetes.io/projected/8befc693-d921-4d73-88f0-67fde14ff01a-kube-api-access-9lsdd\") pod \"8befc693-d921-4d73-88f0-67fde14ff01a\" (UID: \"8befc693-d921-4d73-88f0-67fde14ff01a\") " Oct 08 19:00:03 crc kubenswrapper[4750]: I1008 19:00:03.634736 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8befc693-d921-4d73-88f0-67fde14ff01a-secret-volume\") pod \"8befc693-d921-4d73-88f0-67fde14ff01a\" (UID: \"8befc693-d921-4d73-88f0-67fde14ff01a\") " Oct 08 19:00:03 crc kubenswrapper[4750]: I1008 19:00:03.634765 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8befc693-d921-4d73-88f0-67fde14ff01a-config-volume\") pod \"8befc693-d921-4d73-88f0-67fde14ff01a\" (UID: \"8befc693-d921-4d73-88f0-67fde14ff01a\") " Oct 08 19:00:03 crc kubenswrapper[4750]: I1008 19:00:03.635602 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8befc693-d921-4d73-88f0-67fde14ff01a-config-volume" (OuterVolumeSpecName: "config-volume") pod "8befc693-d921-4d73-88f0-67fde14ff01a" (UID: "8befc693-d921-4d73-88f0-67fde14ff01a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:00:03 crc kubenswrapper[4750]: I1008 19:00:03.639669 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8befc693-d921-4d73-88f0-67fde14ff01a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8befc693-d921-4d73-88f0-67fde14ff01a" (UID: "8befc693-d921-4d73-88f0-67fde14ff01a"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 19:00:03 crc kubenswrapper[4750]: I1008 19:00:03.643187 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8befc693-d921-4d73-88f0-67fde14ff01a-kube-api-access-9lsdd" (OuterVolumeSpecName: "kube-api-access-9lsdd") pod "8befc693-d921-4d73-88f0-67fde14ff01a" (UID: "8befc693-d921-4d73-88f0-67fde14ff01a"). InnerVolumeSpecName "kube-api-access-9lsdd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 19:00:03 crc kubenswrapper[4750]: I1008 19:00:03.736443 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lsdd\" (UniqueName: \"kubernetes.io/projected/8befc693-d921-4d73-88f0-67fde14ff01a-kube-api-access-9lsdd\") on node \"crc\" DevicePath \"\""
Oct 08 19:00:03 crc kubenswrapper[4750]: I1008 19:00:03.736479 4750 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8befc693-d921-4d73-88f0-67fde14ff01a-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 08 19:00:03 crc kubenswrapper[4750]: I1008 19:00:03.736488 4750 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8befc693-d921-4d73-88f0-67fde14ff01a-config-volume\") on node \"crc\" DevicePath \"\""
Oct 08 19:00:04 crc kubenswrapper[4750]: I1008 19:00:04.236778 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332500-qrbpq" event={"ID":"8befc693-d921-4d73-88f0-67fde14ff01a","Type":"ContainerDied","Data":"a11afc6e6c86ec4c829bb4156e466dd969ff298f07b4d910b6224f927d5c56e0"}
Oct 08 19:00:04 crc kubenswrapper[4750]: I1008 19:00:04.236817 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a11afc6e6c86ec4c829bb4156e466dd969ff298f07b4d910b6224f927d5c56e0"
Oct 08 19:00:04 crc kubenswrapper[4750]: I1008 19:00:04.236842 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332500-qrbpq"
Oct 08 19:00:04 crc kubenswrapper[4750]: I1008 19:00:04.279498 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332455-spsrl"]
Oct 08 19:00:04 crc kubenswrapper[4750]: I1008 19:00:04.285730 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332455-spsrl"]
Oct 08 19:00:04 crc kubenswrapper[4750]: I1008 19:00:04.744016 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eefe351d-f07f-47f7-97aa-4733bfd0f000" path="/var/lib/kubelet/pods/eefe351d-f07f-47f7-97aa-4733bfd0f000/volumes"
Oct 08 19:00:13 crc kubenswrapper[4750]: I1008 19:00:13.734180 4750 scope.go:117] "RemoveContainer" containerID="436e6cbb69c758f2d6ea6ab45cdf55c57954b7c7c767a43627a47a0e76aa0fea"
Oct 08 19:00:13 crc kubenswrapper[4750]: E1008 19:00:13.735073 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4"
Oct 08 19:00:27 crc kubenswrapper[4750]: I1008 19:00:27.733905 4750 scope.go:117] "RemoveContainer" containerID="436e6cbb69c758f2d6ea6ab45cdf55c57954b7c7c767a43627a47a0e76aa0fea"
Oct 08 19:00:27 crc kubenswrapper[4750]: E1008 19:00:27.734515 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4"
Oct 08 19:00:38 crc kubenswrapper[4750]: I1008 19:00:38.734105 4750 scope.go:117] "RemoveContainer" containerID="436e6cbb69c758f2d6ea6ab45cdf55c57954b7c7c767a43627a47a0e76aa0fea"
Oct 08 19:00:38 crc kubenswrapper[4750]: E1008 19:00:38.734932 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4"
Oct 08 19:00:51 crc kubenswrapper[4750]: I1008 19:00:51.734018 4750 scope.go:117] "RemoveContainer" containerID="436e6cbb69c758f2d6ea6ab45cdf55c57954b7c7c767a43627a47a0e76aa0fea"
Oct 08 19:00:51 crc kubenswrapper[4750]: E1008 19:00:51.734766 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4"
Oct 08 19:01:01 crc kubenswrapper[4750]: I1008 19:01:01.903798 4750 scope.go:117] "RemoveContainer" containerID="c074cde0c4c91a5fbf7c0332e7757099609651e02b1398cf79e1baeb9bb74d82"
Oct 08 19:01:04 crc kubenswrapper[4750]: I1008 19:01:04.738675 4750 scope.go:117] "RemoveContainer" containerID="436e6cbb69c758f2d6ea6ab45cdf55c57954b7c7c767a43627a47a0e76aa0fea"
Oct 08 19:01:04 crc kubenswrapper[4750]: E1008 19:01:04.739144 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4"
Oct 08 19:01:16 crc kubenswrapper[4750]: I1008 19:01:16.736365 4750 scope.go:117] "RemoveContainer" containerID="436e6cbb69c758f2d6ea6ab45cdf55c57954b7c7c767a43627a47a0e76aa0fea"
Oct 08 19:01:16 crc kubenswrapper[4750]: E1008 19:01:16.737744 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4"
Oct 08 19:01:31 crc kubenswrapper[4750]: I1008 19:01:31.733923 4750 scope.go:117] "RemoveContainer" containerID="436e6cbb69c758f2d6ea6ab45cdf55c57954b7c7c767a43627a47a0e76aa0fea"
Oct 08 19:01:31 crc kubenswrapper[4750]: E1008 19:01:31.734565 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4"
Oct 08 19:01:43 crc kubenswrapper[4750]: I1008 19:01:43.735266 4750 scope.go:117] "RemoveContainer" containerID="436e6cbb69c758f2d6ea6ab45cdf55c57954b7c7c767a43627a47a0e76aa0fea"
Oct 08 19:01:43 crc kubenswrapper[4750]: E1008 19:01:43.736302 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4"
Oct 08 19:01:57 crc kubenswrapper[4750]: I1008 19:01:57.733752 4750 scope.go:117] "RemoveContainer" containerID="436e6cbb69c758f2d6ea6ab45cdf55c57954b7c7c767a43627a47a0e76aa0fea"
Oct 08 19:01:57 crc kubenswrapper[4750]: E1008 19:01:57.734580 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4"
Oct 08 19:02:09 crc kubenswrapper[4750]: I1008 19:02:09.734089 4750 scope.go:117] "RemoveContainer" containerID="436e6cbb69c758f2d6ea6ab45cdf55c57954b7c7c767a43627a47a0e76aa0fea"
Oct 08 19:02:09 crc kubenswrapper[4750]: E1008 19:02:09.734754 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4"
Oct 08 19:02:24 crc kubenswrapper[4750]: I1008 19:02:24.739638 4750 scope.go:117] "RemoveContainer" containerID="436e6cbb69c758f2d6ea6ab45cdf55c57954b7c7c767a43627a47a0e76aa0fea"
Oct 08 19:02:24 crc kubenswrapper[4750]: E1008 19:02:24.740396 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4"
Oct 08 19:02:39 crc kubenswrapper[4750]: I1008 19:02:39.735855 4750 scope.go:117] "RemoveContainer" containerID="436e6cbb69c758f2d6ea6ab45cdf55c57954b7c7c767a43627a47a0e76aa0fea"
Oct 08 19:02:39 crc kubenswrapper[4750]: E1008 19:02:39.737378 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4"
Oct 08 19:02:50 crc kubenswrapper[4750]: I1008 19:02:50.734515 4750 scope.go:117] "RemoveContainer" containerID="436e6cbb69c758f2d6ea6ab45cdf55c57954b7c7c767a43627a47a0e76aa0fea"
Oct 08 19:02:50 crc kubenswrapper[4750]: E1008 19:02:50.735330 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4"
Oct 08 19:03:05 crc kubenswrapper[4750]: I1008 19:03:05.734249 4750 scope.go:117] "RemoveContainer" containerID="436e6cbb69c758f2d6ea6ab45cdf55c57954b7c7c767a43627a47a0e76aa0fea"
Oct 08 19:03:05 crc kubenswrapper[4750]: E1008 19:03:05.735296 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4"
Oct 08 19:03:17 crc kubenswrapper[4750]: I1008 19:03:17.734080 4750 scope.go:117] "RemoveContainer" containerID="436e6cbb69c758f2d6ea6ab45cdf55c57954b7c7c767a43627a47a0e76aa0fea"
Oct 08 19:03:17 crc kubenswrapper[4750]: E1008 19:03:17.734825 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4"
Oct 08 19:03:30 crc kubenswrapper[4750]: I1008 19:03:30.733868 4750 scope.go:117] "RemoveContainer" containerID="436e6cbb69c758f2d6ea6ab45cdf55c57954b7c7c767a43627a47a0e76aa0fea"
Oct 08 19:03:30 crc kubenswrapper[4750]: E1008 19:03:30.734737 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4"
Oct 08 19:03:42 crc kubenswrapper[4750]: I1008 19:03:42.735310 4750 scope.go:117] "RemoveContainer" containerID="436e6cbb69c758f2d6ea6ab45cdf55c57954b7c7c767a43627a47a0e76aa0fea"
Oct 08 19:03:42 crc kubenswrapper[4750]: E1008 19:03:42.736067 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4"
Oct 08 19:03:55 crc kubenswrapper[4750]: I1008 19:03:55.733982 4750 scope.go:117] "RemoveContainer" containerID="436e6cbb69c758f2d6ea6ab45cdf55c57954b7c7c767a43627a47a0e76aa0fea"
Oct 08 19:03:55 crc kubenswrapper[4750]: E1008 19:03:55.734769 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4"
Oct 08 19:04:10 crc kubenswrapper[4750]: I1008 19:04:10.734125 4750 scope.go:117] "RemoveContainer" containerID="436e6cbb69c758f2d6ea6ab45cdf55c57954b7c7c767a43627a47a0e76aa0fea"
Oct 08 19:04:10 crc kubenswrapper[4750]: E1008 19:04:10.734884 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4"
Oct 08 19:04:25 crc kubenswrapper[4750]: I1008 19:04:25.734970 4750 scope.go:117] "RemoveContainer" containerID="436e6cbb69c758f2d6ea6ab45cdf55c57954b7c7c767a43627a47a0e76aa0fea"
Oct 08 19:04:25 crc kubenswrapper[4750]: E1008 19:04:25.735955 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4"
Oct 08 19:04:37 crc kubenswrapper[4750]: I1008 19:04:37.734843 4750 scope.go:117] "RemoveContainer" containerID="436e6cbb69c758f2d6ea6ab45cdf55c57954b7c7c767a43627a47a0e76aa0fea"
Oct 08 19:04:37 crc kubenswrapper[4750]: E1008 19:04:37.735699 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4"
Oct 08 19:04:49 crc kubenswrapper[4750]: I1008 19:04:49.733979 4750 scope.go:117] "RemoveContainer" containerID="436e6cbb69c758f2d6ea6ab45cdf55c57954b7c7c767a43627a47a0e76aa0fea"
Oct 08 19:04:49 crc kubenswrapper[4750]: E1008 19:04:49.734702 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4"
Oct 08 19:05:03 crc kubenswrapper[4750]: I1008 19:05:03.735539 4750 scope.go:117] "RemoveContainer" containerID="436e6cbb69c758f2d6ea6ab45cdf55c57954b7c7c767a43627a47a0e76aa0fea"
Oct 08 19:05:04 crc kubenswrapper[4750]: I1008 19:05:04.436652 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerStarted","Data":"8614657f63c02aa8d66cea0f65177fa4035c6abef7acecdb30121d5573860c4f"}
Oct 08 19:05:10 crc kubenswrapper[4750]: I1008 19:05:10.005305 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4hs2s"]
Oct 08 19:05:10 crc kubenswrapper[4750]: E1008 19:05:10.006147 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8befc693-d921-4d73-88f0-67fde14ff01a" containerName="collect-profiles"
Oct 08 19:05:10 crc kubenswrapper[4750]: I1008 19:05:10.006160 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="8befc693-d921-4d73-88f0-67fde14ff01a" containerName="collect-profiles"
Oct 08 19:05:10 crc kubenswrapper[4750]: I1008 19:05:10.006302 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="8befc693-d921-4d73-88f0-67fde14ff01a" containerName="collect-profiles"
Oct 08 19:05:10 crc kubenswrapper[4750]: I1008 19:05:10.007313 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4hs2s"
Oct 08 19:05:10 crc kubenswrapper[4750]: I1008 19:05:10.030961 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4hs2s"]
Oct 08 19:05:10 crc kubenswrapper[4750]: I1008 19:05:10.163013 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b1f9507-99c3-44f4-8459-a925dae2eda6-catalog-content\") pod \"community-operators-4hs2s\" (UID: \"6b1f9507-99c3-44f4-8459-a925dae2eda6\") " pod="openshift-marketplace/community-operators-4hs2s"
Oct 08 19:05:10 crc kubenswrapper[4750]: I1008 19:05:10.163076 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b1f9507-99c3-44f4-8459-a925dae2eda6-utilities\") pod \"community-operators-4hs2s\" (UID: \"6b1f9507-99c3-44f4-8459-a925dae2eda6\") " pod="openshift-marketplace/community-operators-4hs2s"
Oct 08 19:05:10 crc kubenswrapper[4750]: I1008 19:05:10.163101 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xg26\" (UniqueName: \"kubernetes.io/projected/6b1f9507-99c3-44f4-8459-a925dae2eda6-kube-api-access-7xg26\") pod \"community-operators-4hs2s\" (UID: \"6b1f9507-99c3-44f4-8459-a925dae2eda6\") " pod="openshift-marketplace/community-operators-4hs2s"
Oct 08 19:05:10 crc kubenswrapper[4750]: I1008 19:05:10.265574 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b1f9507-99c3-44f4-8459-a925dae2eda6-utilities\") pod \"community-operators-4hs2s\" (UID: \"6b1f9507-99c3-44f4-8459-a925dae2eda6\") " pod="openshift-marketplace/community-operators-4hs2s"
Oct 08 19:05:10 crc kubenswrapper[4750]: I1008 19:05:10.265644 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xg26\" (UniqueName: \"kubernetes.io/projected/6b1f9507-99c3-44f4-8459-a925dae2eda6-kube-api-access-7xg26\") pod \"community-operators-4hs2s\" (UID: \"6b1f9507-99c3-44f4-8459-a925dae2eda6\") " pod="openshift-marketplace/community-operators-4hs2s"
Oct 08 19:05:10 crc kubenswrapper[4750]: I1008 19:05:10.265786 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b1f9507-99c3-44f4-8459-a925dae2eda6-catalog-content\") pod \"community-operators-4hs2s\" (UID: \"6b1f9507-99c3-44f4-8459-a925dae2eda6\") " pod="openshift-marketplace/community-operators-4hs2s"
Oct 08 19:05:10 crc kubenswrapper[4750]: I1008 19:05:10.266504 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b1f9507-99c3-44f4-8459-a925dae2eda6-catalog-content\") pod \"community-operators-4hs2s\" (UID: \"6b1f9507-99c3-44f4-8459-a925dae2eda6\") " pod="openshift-marketplace/community-operators-4hs2s"
Oct 08 19:05:10 crc kubenswrapper[4750]: I1008 19:05:10.267294 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b1f9507-99c3-44f4-8459-a925dae2eda6-utilities\") pod \"community-operators-4hs2s\" (UID: \"6b1f9507-99c3-44f4-8459-a925dae2eda6\") " pod="openshift-marketplace/community-operators-4hs2s"
Oct 08 19:05:10 crc kubenswrapper[4750]: I1008 19:05:10.287351 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xg26\" (UniqueName: \"kubernetes.io/projected/6b1f9507-99c3-44f4-8459-a925dae2eda6-kube-api-access-7xg26\") pod \"community-operators-4hs2s\" (UID: \"6b1f9507-99c3-44f4-8459-a925dae2eda6\") " pod="openshift-marketplace/community-operators-4hs2s"
Oct 08 19:05:10 crc kubenswrapper[4750]: I1008 19:05:10.325886 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4hs2s"
Oct 08 19:05:10 crc kubenswrapper[4750]: I1008 19:05:10.832023 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4hs2s"]
Oct 08 19:05:11 crc kubenswrapper[4750]: I1008 19:05:11.514155 4750 generic.go:334] "Generic (PLEG): container finished" podID="6b1f9507-99c3-44f4-8459-a925dae2eda6" containerID="8a1aa0c3336ec19635b49c81fdc3614150b88cafaaab0bc678d22429a9036e35" exitCode=0
Oct 08 19:05:11 crc kubenswrapper[4750]: I1008 19:05:11.514783 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4hs2s" event={"ID":"6b1f9507-99c3-44f4-8459-a925dae2eda6","Type":"ContainerDied","Data":"8a1aa0c3336ec19635b49c81fdc3614150b88cafaaab0bc678d22429a9036e35"}
Oct 08 19:05:11 crc kubenswrapper[4750]: I1008 19:05:11.514973 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4hs2s" event={"ID":"6b1f9507-99c3-44f4-8459-a925dae2eda6","Type":"ContainerStarted","Data":"003dac10b8ee0386f9992b0ff5e4f355bd9c6eea94eebb2898e46675a7d715f1"}
Oct 08 19:05:11 crc kubenswrapper[4750]: I1008 19:05:11.518893 4750 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 08 19:05:13 crc kubenswrapper[4750]: I1008 19:05:13.534154 4750 generic.go:334] "Generic (PLEG): container finished" podID="6b1f9507-99c3-44f4-8459-a925dae2eda6" containerID="fe16832e65e51485c4dbce3184b83b9681061c5701f6a71658bf19bc16c512f1" exitCode=0
Oct 08 19:05:13 crc kubenswrapper[4750]: I1008 19:05:13.534216 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4hs2s" event={"ID":"6b1f9507-99c3-44f4-8459-a925dae2eda6","Type":"ContainerDied","Data":"fe16832e65e51485c4dbce3184b83b9681061c5701f6a71658bf19bc16c512f1"}
Oct 08 19:05:14 crc kubenswrapper[4750]: I1008 19:05:14.543524 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4hs2s" event={"ID":"6b1f9507-99c3-44f4-8459-a925dae2eda6","Type":"ContainerStarted","Data":"0351137d6cb39191db2ba7c1e3fc726cb25a1c093edb8d735076975732d23b3e"}
Oct 08 19:05:20 crc kubenswrapper[4750]: I1008 19:05:20.326095 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4hs2s"
Oct 08 19:05:20 crc kubenswrapper[4750]: I1008 19:05:20.326616 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4hs2s"
Oct 08 19:05:20 crc kubenswrapper[4750]: I1008 19:05:20.369062 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4hs2s"
Oct 08 19:05:20 crc kubenswrapper[4750]: I1008 19:05:20.385341 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4hs2s" podStartSLOduration=8.903022517 podStartE2EDuration="11.385317981s" podCreationTimestamp="2025-10-08 19:05:09 +0000 UTC" firstStartedPulling="2025-10-08 19:05:11.518400415 +0000 UTC m=+3267.431371448" lastFinishedPulling="2025-10-08 19:05:14.000695889 +0000 UTC m=+3269.913666912" observedRunningTime="2025-10-08 19:05:14.567504475 +0000 UTC m=+3270.480475488" watchObservedRunningTime="2025-10-08 19:05:20.385317981 +0000 UTC m=+3276.298289004"
Oct 08 19:05:20 crc kubenswrapper[4750]: I1008 19:05:20.663451 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4hs2s"
Oct 08 19:05:20 crc kubenswrapper[4750]: I1008 19:05:20.723372 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4hs2s"]
Oct 08 19:05:22 crc kubenswrapper[4750]: I1008 19:05:22.611350 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4hs2s" podUID="6b1f9507-99c3-44f4-8459-a925dae2eda6" containerName="registry-server" containerID="cri-o://0351137d6cb39191db2ba7c1e3fc726cb25a1c093edb8d735076975732d23b3e" gracePeriod=2
Oct 08 19:05:22 crc kubenswrapper[4750]: I1008 19:05:22.997292 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4hs2s"
Oct 08 19:05:23 crc kubenswrapper[4750]: I1008 19:05:23.174991 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b1f9507-99c3-44f4-8459-a925dae2eda6-utilities\") pod \"6b1f9507-99c3-44f4-8459-a925dae2eda6\" (UID: \"6b1f9507-99c3-44f4-8459-a925dae2eda6\") "
Oct 08 19:05:23 crc kubenswrapper[4750]: I1008 19:05:23.175165 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xg26\" (UniqueName: \"kubernetes.io/projected/6b1f9507-99c3-44f4-8459-a925dae2eda6-kube-api-access-7xg26\") pod \"6b1f9507-99c3-44f4-8459-a925dae2eda6\" (UID: \"6b1f9507-99c3-44f4-8459-a925dae2eda6\") "
Oct 08 19:05:23 crc kubenswrapper[4750]: I1008 19:05:23.175205 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b1f9507-99c3-44f4-8459-a925dae2eda6-catalog-content\") pod \"6b1f9507-99c3-44f4-8459-a925dae2eda6\" (UID: \"6b1f9507-99c3-44f4-8459-a925dae2eda6\") "
Oct 08 19:05:23 crc kubenswrapper[4750]: I1008 19:05:23.176722 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b1f9507-99c3-44f4-8459-a925dae2eda6-utilities" (OuterVolumeSpecName: "utilities") pod "6b1f9507-99c3-44f4-8459-a925dae2eda6" (UID: "6b1f9507-99c3-44f4-8459-a925dae2eda6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 19:05:23 crc kubenswrapper[4750]: I1008 19:05:23.189762 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b1f9507-99c3-44f4-8459-a925dae2eda6-kube-api-access-7xg26" (OuterVolumeSpecName: "kube-api-access-7xg26") pod "6b1f9507-99c3-44f4-8459-a925dae2eda6" (UID: "6b1f9507-99c3-44f4-8459-a925dae2eda6"). InnerVolumeSpecName "kube-api-access-7xg26". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 19:05:23 crc kubenswrapper[4750]: I1008 19:05:23.235045 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b1f9507-99c3-44f4-8459-a925dae2eda6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6b1f9507-99c3-44f4-8459-a925dae2eda6" (UID: "6b1f9507-99c3-44f4-8459-a925dae2eda6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 19:05:23 crc kubenswrapper[4750]: I1008 19:05:23.277220 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b1f9507-99c3-44f4-8459-a925dae2eda6-utilities\") on node \"crc\" DevicePath \"\""
Oct 08 19:05:23 crc kubenswrapper[4750]: I1008 19:05:23.277263 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xg26\" (UniqueName: \"kubernetes.io/projected/6b1f9507-99c3-44f4-8459-a925dae2eda6-kube-api-access-7xg26\") on node \"crc\" DevicePath \"\""
Oct 08 19:05:23 crc kubenswrapper[4750]: I1008 19:05:23.277276 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b1f9507-99c3-44f4-8459-a925dae2eda6-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 08 19:05:23 crc kubenswrapper[4750]: I1008 19:05:23.624842 4750 generic.go:334] "Generic (PLEG): container finished" podID="6b1f9507-99c3-44f4-8459-a925dae2eda6" containerID="0351137d6cb39191db2ba7c1e3fc726cb25a1c093edb8d735076975732d23b3e" exitCode=0
Oct 08 19:05:23 crc kubenswrapper[4750]: I1008 19:05:23.624917 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4hs2s"
Oct 08 19:05:23 crc kubenswrapper[4750]: I1008 19:05:23.624940 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4hs2s" event={"ID":"6b1f9507-99c3-44f4-8459-a925dae2eda6","Type":"ContainerDied","Data":"0351137d6cb39191db2ba7c1e3fc726cb25a1c093edb8d735076975732d23b3e"}
Oct 08 19:05:23 crc kubenswrapper[4750]: I1008 19:05:23.625412 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4hs2s" event={"ID":"6b1f9507-99c3-44f4-8459-a925dae2eda6","Type":"ContainerDied","Data":"003dac10b8ee0386f9992b0ff5e4f355bd9c6eea94eebb2898e46675a7d715f1"}
Oct 08 19:05:23 crc kubenswrapper[4750]: I1008 19:05:23.625433 4750 scope.go:117] "RemoveContainer" containerID="0351137d6cb39191db2ba7c1e3fc726cb25a1c093edb8d735076975732d23b3e"
Oct 08 19:05:23 crc kubenswrapper[4750]: I1008 19:05:23.661190 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4hs2s"]
Oct 08 19:05:23 crc kubenswrapper[4750]: I1008 19:05:23.669374 4750 scope.go:117] "RemoveContainer" containerID="fe16832e65e51485c4dbce3184b83b9681061c5701f6a71658bf19bc16c512f1"
Oct 08 19:05:23 crc kubenswrapper[4750]: I1008 19:05:23.671689 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4hs2s"]
Oct 08 19:05:23 crc kubenswrapper[4750]: I1008 19:05:23.702123 4750 scope.go:117] "RemoveContainer" containerID="8a1aa0c3336ec19635b49c81fdc3614150b88cafaaab0bc678d22429a9036e35"
Oct 08 19:05:23 crc kubenswrapper[4750]: I1008 19:05:23.734923 4750 scope.go:117] "RemoveContainer" containerID="0351137d6cb39191db2ba7c1e3fc726cb25a1c093edb8d735076975732d23b3e"
Oct 08 19:05:23 crc kubenswrapper[4750]: E1008 19:05:23.735522 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0351137d6cb39191db2ba7c1e3fc726cb25a1c093edb8d735076975732d23b3e\": container with ID starting with 0351137d6cb39191db2ba7c1e3fc726cb25a1c093edb8d735076975732d23b3e not found: ID does not exist" containerID="0351137d6cb39191db2ba7c1e3fc726cb25a1c093edb8d735076975732d23b3e"
Oct 08 19:05:23 crc kubenswrapper[4750]: I1008 19:05:23.735578 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0351137d6cb39191db2ba7c1e3fc726cb25a1c093edb8d735076975732d23b3e"} err="failed to get container status \"0351137d6cb39191db2ba7c1e3fc726cb25a1c093edb8d735076975732d23b3e\": rpc error: code = NotFound desc = could not find container \"0351137d6cb39191db2ba7c1e3fc726cb25a1c093edb8d735076975732d23b3e\": container with ID starting with 0351137d6cb39191db2ba7c1e3fc726cb25a1c093edb8d735076975732d23b3e not found: ID does not exist"
Oct 08 19:05:23 crc kubenswrapper[4750]: I1008 19:05:23.735605 4750 scope.go:117] "RemoveContainer" containerID="fe16832e65e51485c4dbce3184b83b9681061c5701f6a71658bf19bc16c512f1"
Oct 08 19:05:23 crc kubenswrapper[4750]: E1008 19:05:23.736358 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe16832e65e51485c4dbce3184b83b9681061c5701f6a71658bf19bc16c512f1\": container with ID starting with fe16832e65e51485c4dbce3184b83b9681061c5701f6a71658bf19bc16c512f1 not found: ID does not exist" containerID="fe16832e65e51485c4dbce3184b83b9681061c5701f6a71658bf19bc16c512f1"
Oct 08 19:05:23 crc kubenswrapper[4750]: I1008 19:05:23.736388 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe16832e65e51485c4dbce3184b83b9681061c5701f6a71658bf19bc16c512f1"} err="failed to get container status \"fe16832e65e51485c4dbce3184b83b9681061c5701f6a71658bf19bc16c512f1\": rpc error: code = NotFound desc = could not find container \"fe16832e65e51485c4dbce3184b83b9681061c5701f6a71658bf19bc16c512f1\": container with ID starting with fe16832e65e51485c4dbce3184b83b9681061c5701f6a71658bf19bc16c512f1 not found: ID does not exist"
Oct 08 19:05:23 crc kubenswrapper[4750]: I1008 19:05:23.736406 4750 scope.go:117] "RemoveContainer" containerID="8a1aa0c3336ec19635b49c81fdc3614150b88cafaaab0bc678d22429a9036e35"
Oct 08 19:05:23 crc kubenswrapper[4750]: E1008 19:05:23.736691 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a1aa0c3336ec19635b49c81fdc3614150b88cafaaab0bc678d22429a9036e35\": container with ID starting with 8a1aa0c3336ec19635b49c81fdc3614150b88cafaaab0bc678d22429a9036e35 not found: ID does not exist" containerID="8a1aa0c3336ec19635b49c81fdc3614150b88cafaaab0bc678d22429a9036e35"
Oct 08 19:05:23 crc kubenswrapper[4750]: I1008 19:05:23.736718 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a1aa0c3336ec19635b49c81fdc3614150b88cafaaab0bc678d22429a9036e35"} err="failed to get container status \"8a1aa0c3336ec19635b49c81fdc3614150b88cafaaab0bc678d22429a9036e35\": rpc error: code = NotFound desc = could not find container \"8a1aa0c3336ec19635b49c81fdc3614150b88cafaaab0bc678d22429a9036e35\": container with ID starting with 8a1aa0c3336ec19635b49c81fdc3614150b88cafaaab0bc678d22429a9036e35 not found: ID does not exist"
Oct 08 19:05:24 crc kubenswrapper[4750]: I1008 19:05:24.743470 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b1f9507-99c3-44f4-8459-a925dae2eda6" path="/var/lib/kubelet/pods/6b1f9507-99c3-44f4-8459-a925dae2eda6/volumes"
Oct 08 19:06:59 crc kubenswrapper[4750]: I1008 19:06:59.222345 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hf72g"]
Oct 08 19:06:59
crc kubenswrapper[4750]: E1008 19:06:59.223429 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b1f9507-99c3-44f4-8459-a925dae2eda6" containerName="extract-content" Oct 08 19:06:59 crc kubenswrapper[4750]: I1008 19:06:59.223443 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b1f9507-99c3-44f4-8459-a925dae2eda6" containerName="extract-content" Oct 08 19:06:59 crc kubenswrapper[4750]: E1008 19:06:59.223458 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b1f9507-99c3-44f4-8459-a925dae2eda6" containerName="registry-server" Oct 08 19:06:59 crc kubenswrapper[4750]: I1008 19:06:59.223464 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b1f9507-99c3-44f4-8459-a925dae2eda6" containerName="registry-server" Oct 08 19:06:59 crc kubenswrapper[4750]: E1008 19:06:59.223479 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b1f9507-99c3-44f4-8459-a925dae2eda6" containerName="extract-utilities" Oct 08 19:06:59 crc kubenswrapper[4750]: I1008 19:06:59.223485 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b1f9507-99c3-44f4-8459-a925dae2eda6" containerName="extract-utilities" Oct 08 19:06:59 crc kubenswrapper[4750]: I1008 19:06:59.223778 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b1f9507-99c3-44f4-8459-a925dae2eda6" containerName="registry-server" Oct 08 19:06:59 crc kubenswrapper[4750]: I1008 19:06:59.225085 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hf72g" Oct 08 19:06:59 crc kubenswrapper[4750]: I1008 19:06:59.247288 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hf72g"] Oct 08 19:06:59 crc kubenswrapper[4750]: I1008 19:06:59.342062 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be318c07-cf31-48f4-8916-65d15c919e26-utilities\") pod \"certified-operators-hf72g\" (UID: \"be318c07-cf31-48f4-8916-65d15c919e26\") " pod="openshift-marketplace/certified-operators-hf72g" Oct 08 19:06:59 crc kubenswrapper[4750]: I1008 19:06:59.342145 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pbmb\" (UniqueName: \"kubernetes.io/projected/be318c07-cf31-48f4-8916-65d15c919e26-kube-api-access-2pbmb\") pod \"certified-operators-hf72g\" (UID: \"be318c07-cf31-48f4-8916-65d15c919e26\") " pod="openshift-marketplace/certified-operators-hf72g" Oct 08 19:06:59 crc kubenswrapper[4750]: I1008 19:06:59.342188 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be318c07-cf31-48f4-8916-65d15c919e26-catalog-content\") pod \"certified-operators-hf72g\" (UID: \"be318c07-cf31-48f4-8916-65d15c919e26\") " pod="openshift-marketplace/certified-operators-hf72g" Oct 08 19:06:59 crc kubenswrapper[4750]: I1008 19:06:59.444202 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pbmb\" (UniqueName: \"kubernetes.io/projected/be318c07-cf31-48f4-8916-65d15c919e26-kube-api-access-2pbmb\") pod \"certified-operators-hf72g\" (UID: \"be318c07-cf31-48f4-8916-65d15c919e26\") " pod="openshift-marketplace/certified-operators-hf72g" Oct 08 19:06:59 crc kubenswrapper[4750]: I1008 19:06:59.444283 4750 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be318c07-cf31-48f4-8916-65d15c919e26-catalog-content\") pod \"certified-operators-hf72g\" (UID: \"be318c07-cf31-48f4-8916-65d15c919e26\") " pod="openshift-marketplace/certified-operators-hf72g" Oct 08 19:06:59 crc kubenswrapper[4750]: I1008 19:06:59.444370 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be318c07-cf31-48f4-8916-65d15c919e26-utilities\") pod \"certified-operators-hf72g\" (UID: \"be318c07-cf31-48f4-8916-65d15c919e26\") " pod="openshift-marketplace/certified-operators-hf72g" Oct 08 19:06:59 crc kubenswrapper[4750]: I1008 19:06:59.445007 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be318c07-cf31-48f4-8916-65d15c919e26-utilities\") pod \"certified-operators-hf72g\" (UID: \"be318c07-cf31-48f4-8916-65d15c919e26\") " pod="openshift-marketplace/certified-operators-hf72g" Oct 08 19:06:59 crc kubenswrapper[4750]: I1008 19:06:59.445597 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be318c07-cf31-48f4-8916-65d15c919e26-catalog-content\") pod \"certified-operators-hf72g\" (UID: \"be318c07-cf31-48f4-8916-65d15c919e26\") " pod="openshift-marketplace/certified-operators-hf72g" Oct 08 19:06:59 crc kubenswrapper[4750]: I1008 19:06:59.469692 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pbmb\" (UniqueName: \"kubernetes.io/projected/be318c07-cf31-48f4-8916-65d15c919e26-kube-api-access-2pbmb\") pod \"certified-operators-hf72g\" (UID: \"be318c07-cf31-48f4-8916-65d15c919e26\") " pod="openshift-marketplace/certified-operators-hf72g" Oct 08 19:06:59 crc kubenswrapper[4750]: I1008 19:06:59.550583 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hf72g" Oct 08 19:07:00 crc kubenswrapper[4750]: I1008 19:07:00.056514 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hf72g"] Oct 08 19:07:00 crc kubenswrapper[4750]: I1008 19:07:00.394295 4750 generic.go:334] "Generic (PLEG): container finished" podID="be318c07-cf31-48f4-8916-65d15c919e26" containerID="fb64039e8fe3f0b72f0687f1e875f1fa27e07e08afcd1e2e29f7bc46893e9fe2" exitCode=0 Oct 08 19:07:00 crc kubenswrapper[4750]: I1008 19:07:00.394356 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hf72g" event={"ID":"be318c07-cf31-48f4-8916-65d15c919e26","Type":"ContainerDied","Data":"fb64039e8fe3f0b72f0687f1e875f1fa27e07e08afcd1e2e29f7bc46893e9fe2"} Oct 08 19:07:00 crc kubenswrapper[4750]: I1008 19:07:00.394398 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hf72g" event={"ID":"be318c07-cf31-48f4-8916-65d15c919e26","Type":"ContainerStarted","Data":"288b8739955f6614415449ca328e33ef19415cf4958287fc5ee8fb3f605e3d74"} Oct 08 19:07:01 crc kubenswrapper[4750]: I1008 19:07:01.406577 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hf72g" event={"ID":"be318c07-cf31-48f4-8916-65d15c919e26","Type":"ContainerStarted","Data":"67f9526b8c8304753a341d8cbaba997009ef30080ddd52a4a1021be309e54baa"} Oct 08 19:07:02 crc kubenswrapper[4750]: I1008 19:07:02.425448 4750 generic.go:334] "Generic (PLEG): container finished" podID="be318c07-cf31-48f4-8916-65d15c919e26" containerID="67f9526b8c8304753a341d8cbaba997009ef30080ddd52a4a1021be309e54baa" exitCode=0 Oct 08 19:07:02 crc kubenswrapper[4750]: I1008 19:07:02.425849 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hf72g" 
event={"ID":"be318c07-cf31-48f4-8916-65d15c919e26","Type":"ContainerDied","Data":"67f9526b8c8304753a341d8cbaba997009ef30080ddd52a4a1021be309e54baa"} Oct 08 19:07:03 crc kubenswrapper[4750]: I1008 19:07:03.435954 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hf72g" event={"ID":"be318c07-cf31-48f4-8916-65d15c919e26","Type":"ContainerStarted","Data":"93e32744b4c410da50516e5b1aa6a122a87f17511581b809d653225e70b9ad7f"} Oct 08 19:07:03 crc kubenswrapper[4750]: I1008 19:07:03.463890 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hf72g" podStartSLOduration=2.012207498 podStartE2EDuration="4.46387599s" podCreationTimestamp="2025-10-08 19:06:59 +0000 UTC" firstStartedPulling="2025-10-08 19:07:00.396027917 +0000 UTC m=+3376.308998930" lastFinishedPulling="2025-10-08 19:07:02.847696349 +0000 UTC m=+3378.760667422" observedRunningTime="2025-10-08 19:07:03.462064865 +0000 UTC m=+3379.375035878" watchObservedRunningTime="2025-10-08 19:07:03.46387599 +0000 UTC m=+3379.376847003" Oct 08 19:07:09 crc kubenswrapper[4750]: I1008 19:07:09.551229 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hf72g" Oct 08 19:07:09 crc kubenswrapper[4750]: I1008 19:07:09.551842 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hf72g" Oct 08 19:07:09 crc kubenswrapper[4750]: I1008 19:07:09.603248 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hf72g" Oct 08 19:07:10 crc kubenswrapper[4750]: I1008 19:07:10.537312 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hf72g" Oct 08 19:07:10 crc kubenswrapper[4750]: I1008 19:07:10.589925 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-hf72g"] Oct 08 19:07:12 crc kubenswrapper[4750]: I1008 19:07:12.505840 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hf72g" podUID="be318c07-cf31-48f4-8916-65d15c919e26" containerName="registry-server" containerID="cri-o://93e32744b4c410da50516e5b1aa6a122a87f17511581b809d653225e70b9ad7f" gracePeriod=2 Oct 08 19:07:12 crc kubenswrapper[4750]: I1008 19:07:12.873297 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hf72g" Oct 08 19:07:12 crc kubenswrapper[4750]: I1008 19:07:12.947160 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be318c07-cf31-48f4-8916-65d15c919e26-utilities\") pod \"be318c07-cf31-48f4-8916-65d15c919e26\" (UID: \"be318c07-cf31-48f4-8916-65d15c919e26\") " Oct 08 19:07:12 crc kubenswrapper[4750]: I1008 19:07:12.947506 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pbmb\" (UniqueName: \"kubernetes.io/projected/be318c07-cf31-48f4-8916-65d15c919e26-kube-api-access-2pbmb\") pod \"be318c07-cf31-48f4-8916-65d15c919e26\" (UID: \"be318c07-cf31-48f4-8916-65d15c919e26\") " Oct 08 19:07:12 crc kubenswrapper[4750]: I1008 19:07:12.947531 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be318c07-cf31-48f4-8916-65d15c919e26-catalog-content\") pod \"be318c07-cf31-48f4-8916-65d15c919e26\" (UID: \"be318c07-cf31-48f4-8916-65d15c919e26\") " Oct 08 19:07:12 crc kubenswrapper[4750]: I1008 19:07:12.948259 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be318c07-cf31-48f4-8916-65d15c919e26-utilities" (OuterVolumeSpecName: "utilities") pod "be318c07-cf31-48f4-8916-65d15c919e26" (UID: 
"be318c07-cf31-48f4-8916-65d15c919e26"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:07:12 crc kubenswrapper[4750]: I1008 19:07:12.953999 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be318c07-cf31-48f4-8916-65d15c919e26-kube-api-access-2pbmb" (OuterVolumeSpecName: "kube-api-access-2pbmb") pod "be318c07-cf31-48f4-8916-65d15c919e26" (UID: "be318c07-cf31-48f4-8916-65d15c919e26"). InnerVolumeSpecName "kube-api-access-2pbmb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:07:12 crc kubenswrapper[4750]: I1008 19:07:12.995335 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be318c07-cf31-48f4-8916-65d15c919e26-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "be318c07-cf31-48f4-8916-65d15c919e26" (UID: "be318c07-cf31-48f4-8916-65d15c919e26"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:07:13 crc kubenswrapper[4750]: I1008 19:07:13.049302 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be318c07-cf31-48f4-8916-65d15c919e26-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 19:07:13 crc kubenswrapper[4750]: I1008 19:07:13.049337 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pbmb\" (UniqueName: \"kubernetes.io/projected/be318c07-cf31-48f4-8916-65d15c919e26-kube-api-access-2pbmb\") on node \"crc\" DevicePath \"\"" Oct 08 19:07:13 crc kubenswrapper[4750]: I1008 19:07:13.049373 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be318c07-cf31-48f4-8916-65d15c919e26-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 19:07:13 crc kubenswrapper[4750]: I1008 19:07:13.521889 4750 generic.go:334] "Generic (PLEG): container finished" 
podID="be318c07-cf31-48f4-8916-65d15c919e26" containerID="93e32744b4c410da50516e5b1aa6a122a87f17511581b809d653225e70b9ad7f" exitCode=0 Oct 08 19:07:13 crc kubenswrapper[4750]: I1008 19:07:13.521951 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hf72g" event={"ID":"be318c07-cf31-48f4-8916-65d15c919e26","Type":"ContainerDied","Data":"93e32744b4c410da50516e5b1aa6a122a87f17511581b809d653225e70b9ad7f"} Oct 08 19:07:13 crc kubenswrapper[4750]: I1008 19:07:13.521992 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hf72g" event={"ID":"be318c07-cf31-48f4-8916-65d15c919e26","Type":"ContainerDied","Data":"288b8739955f6614415449ca328e33ef19415cf4958287fc5ee8fb3f605e3d74"} Oct 08 19:07:13 crc kubenswrapper[4750]: I1008 19:07:13.522001 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hf72g" Oct 08 19:07:13 crc kubenswrapper[4750]: I1008 19:07:13.522022 4750 scope.go:117] "RemoveContainer" containerID="93e32744b4c410da50516e5b1aa6a122a87f17511581b809d653225e70b9ad7f" Oct 08 19:07:13 crc kubenswrapper[4750]: I1008 19:07:13.551423 4750 scope.go:117] "RemoveContainer" containerID="67f9526b8c8304753a341d8cbaba997009ef30080ddd52a4a1021be309e54baa" Oct 08 19:07:13 crc kubenswrapper[4750]: I1008 19:07:13.577586 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hf72g"] Oct 08 19:07:13 crc kubenswrapper[4750]: I1008 19:07:13.592065 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hf72g"] Oct 08 19:07:13 crc kubenswrapper[4750]: I1008 19:07:13.593651 4750 scope.go:117] "RemoveContainer" containerID="fb64039e8fe3f0b72f0687f1e875f1fa27e07e08afcd1e2e29f7bc46893e9fe2" Oct 08 19:07:13 crc kubenswrapper[4750]: I1008 19:07:13.611506 4750 scope.go:117] "RemoveContainer" 
containerID="93e32744b4c410da50516e5b1aa6a122a87f17511581b809d653225e70b9ad7f" Oct 08 19:07:13 crc kubenswrapper[4750]: E1008 19:07:13.612362 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93e32744b4c410da50516e5b1aa6a122a87f17511581b809d653225e70b9ad7f\": container with ID starting with 93e32744b4c410da50516e5b1aa6a122a87f17511581b809d653225e70b9ad7f not found: ID does not exist" containerID="93e32744b4c410da50516e5b1aa6a122a87f17511581b809d653225e70b9ad7f" Oct 08 19:07:13 crc kubenswrapper[4750]: I1008 19:07:13.612387 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93e32744b4c410da50516e5b1aa6a122a87f17511581b809d653225e70b9ad7f"} err="failed to get container status \"93e32744b4c410da50516e5b1aa6a122a87f17511581b809d653225e70b9ad7f\": rpc error: code = NotFound desc = could not find container \"93e32744b4c410da50516e5b1aa6a122a87f17511581b809d653225e70b9ad7f\": container with ID starting with 93e32744b4c410da50516e5b1aa6a122a87f17511581b809d653225e70b9ad7f not found: ID does not exist" Oct 08 19:07:13 crc kubenswrapper[4750]: I1008 19:07:13.612410 4750 scope.go:117] "RemoveContainer" containerID="67f9526b8c8304753a341d8cbaba997009ef30080ddd52a4a1021be309e54baa" Oct 08 19:07:13 crc kubenswrapper[4750]: E1008 19:07:13.612850 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67f9526b8c8304753a341d8cbaba997009ef30080ddd52a4a1021be309e54baa\": container with ID starting with 67f9526b8c8304753a341d8cbaba997009ef30080ddd52a4a1021be309e54baa not found: ID does not exist" containerID="67f9526b8c8304753a341d8cbaba997009ef30080ddd52a4a1021be309e54baa" Oct 08 19:07:13 crc kubenswrapper[4750]: I1008 19:07:13.612869 4750 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"67f9526b8c8304753a341d8cbaba997009ef30080ddd52a4a1021be309e54baa"} err="failed to get container status \"67f9526b8c8304753a341d8cbaba997009ef30080ddd52a4a1021be309e54baa\": rpc error: code = NotFound desc = could not find container \"67f9526b8c8304753a341d8cbaba997009ef30080ddd52a4a1021be309e54baa\": container with ID starting with 67f9526b8c8304753a341d8cbaba997009ef30080ddd52a4a1021be309e54baa not found: ID does not exist" Oct 08 19:07:13 crc kubenswrapper[4750]: I1008 19:07:13.612884 4750 scope.go:117] "RemoveContainer" containerID="fb64039e8fe3f0b72f0687f1e875f1fa27e07e08afcd1e2e29f7bc46893e9fe2" Oct 08 19:07:13 crc kubenswrapper[4750]: E1008 19:07:13.613093 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb64039e8fe3f0b72f0687f1e875f1fa27e07e08afcd1e2e29f7bc46893e9fe2\": container with ID starting with fb64039e8fe3f0b72f0687f1e875f1fa27e07e08afcd1e2e29f7bc46893e9fe2 not found: ID does not exist" containerID="fb64039e8fe3f0b72f0687f1e875f1fa27e07e08afcd1e2e29f7bc46893e9fe2" Oct 08 19:07:13 crc kubenswrapper[4750]: I1008 19:07:13.613117 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb64039e8fe3f0b72f0687f1e875f1fa27e07e08afcd1e2e29f7bc46893e9fe2"} err="failed to get container status \"fb64039e8fe3f0b72f0687f1e875f1fa27e07e08afcd1e2e29f7bc46893e9fe2\": rpc error: code = NotFound desc = could not find container \"fb64039e8fe3f0b72f0687f1e875f1fa27e07e08afcd1e2e29f7bc46893e9fe2\": container with ID starting with fb64039e8fe3f0b72f0687f1e875f1fa27e07e08afcd1e2e29f7bc46893e9fe2 not found: ID does not exist" Oct 08 19:07:14 crc kubenswrapper[4750]: I1008 19:07:14.742124 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be318c07-cf31-48f4-8916-65d15c919e26" path="/var/lib/kubelet/pods/be318c07-cf31-48f4-8916-65d15c919e26/volumes" Oct 08 19:07:29 crc kubenswrapper[4750]: I1008 
19:07:29.707393 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 19:07:29 crc kubenswrapper[4750]: I1008 19:07:29.707952 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 19:07:59 crc kubenswrapper[4750]: I1008 19:07:59.706802 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 19:07:59 crc kubenswrapper[4750]: I1008 19:07:59.707655 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 19:08:29 crc kubenswrapper[4750]: I1008 19:08:29.707319 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 19:08:29 crc kubenswrapper[4750]: I1008 19:08:29.707868 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" 
podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 19:08:29 crc kubenswrapper[4750]: I1008 19:08:29.707914 4750 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-grddb" Oct 08 19:08:29 crc kubenswrapper[4750]: I1008 19:08:29.708536 4750 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8614657f63c02aa8d66cea0f65177fa4035c6abef7acecdb30121d5573860c4f"} pod="openshift-machine-config-operator/machine-config-daemon-grddb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 19:08:29 crc kubenswrapper[4750]: I1008 19:08:29.708619 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" containerID="cri-o://8614657f63c02aa8d66cea0f65177fa4035c6abef7acecdb30121d5573860c4f" gracePeriod=600 Oct 08 19:08:30 crc kubenswrapper[4750]: I1008 19:08:30.175420 4750 generic.go:334] "Generic (PLEG): container finished" podID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerID="8614657f63c02aa8d66cea0f65177fa4035c6abef7acecdb30121d5573860c4f" exitCode=0 Oct 08 19:08:30 crc kubenswrapper[4750]: I1008 19:08:30.175477 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerDied","Data":"8614657f63c02aa8d66cea0f65177fa4035c6abef7acecdb30121d5573860c4f"} Oct 08 19:08:30 crc kubenswrapper[4750]: I1008 19:08:30.175877 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerStarted","Data":"4778468c7d5eb2d32f1ce6d0e884666388775f4c788e8eea1669c69aa0822658"} Oct 08 19:08:30 crc kubenswrapper[4750]: I1008 19:08:30.175896 4750 scope.go:117] "RemoveContainer" containerID="436e6cbb69c758f2d6ea6ab45cdf55c57954b7c7c767a43627a47a0e76aa0fea" Oct 08 19:10:59 crc kubenswrapper[4750]: I1008 19:10:59.707626 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 19:10:59 crc kubenswrapper[4750]: I1008 19:10:59.708540 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 19:11:29 crc kubenswrapper[4750]: I1008 19:11:29.707115 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 19:11:29 crc kubenswrapper[4750]: I1008 19:11:29.709251 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 19:11:59 crc kubenswrapper[4750]: I1008 19:11:59.706931 4750 patch_prober.go:28] interesting 
pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 19:11:59 crc kubenswrapper[4750]: I1008 19:11:59.707848 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 19:11:59 crc kubenswrapper[4750]: I1008 19:11:59.707921 4750 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-grddb" Oct 08 19:11:59 crc kubenswrapper[4750]: I1008 19:11:59.708743 4750 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4778468c7d5eb2d32f1ce6d0e884666388775f4c788e8eea1669c69aa0822658"} pod="openshift-machine-config-operator/machine-config-daemon-grddb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 19:11:59 crc kubenswrapper[4750]: I1008 19:11:59.708810 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" containerID="cri-o://4778468c7d5eb2d32f1ce6d0e884666388775f4c788e8eea1669c69aa0822658" gracePeriod=600 Oct 08 19:11:59 crc kubenswrapper[4750]: E1008 19:11:59.831194 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:11:59 crc kubenswrapper[4750]: I1008 19:11:59.990828 4750 generic.go:334] "Generic (PLEG): container finished" podID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerID="4778468c7d5eb2d32f1ce6d0e884666388775f4c788e8eea1669c69aa0822658" exitCode=0 Oct 08 19:11:59 crc kubenswrapper[4750]: I1008 19:11:59.990899 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerDied","Data":"4778468c7d5eb2d32f1ce6d0e884666388775f4c788e8eea1669c69aa0822658"} Oct 08 19:11:59 crc kubenswrapper[4750]: I1008 19:11:59.990977 4750 scope.go:117] "RemoveContainer" containerID="8614657f63c02aa8d66cea0f65177fa4035c6abef7acecdb30121d5573860c4f" Oct 08 19:11:59 crc kubenswrapper[4750]: I1008 19:11:59.991866 4750 scope.go:117] "RemoveContainer" containerID="4778468c7d5eb2d32f1ce6d0e884666388775f4c788e8eea1669c69aa0822658" Oct 08 19:11:59 crc kubenswrapper[4750]: E1008 19:11:59.992176 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:12:14 crc kubenswrapper[4750]: I1008 19:12:14.738686 4750 scope.go:117] "RemoveContainer" containerID="4778468c7d5eb2d32f1ce6d0e884666388775f4c788e8eea1669c69aa0822658" Oct 08 19:12:14 crc kubenswrapper[4750]: E1008 19:12:14.739614 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:12:29 crc kubenswrapper[4750]: I1008 19:12:29.734691 4750 scope.go:117] "RemoveContainer" containerID="4778468c7d5eb2d32f1ce6d0e884666388775f4c788e8eea1669c69aa0822658" Oct 08 19:12:29 crc kubenswrapper[4750]: E1008 19:12:29.735648 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:12:43 crc kubenswrapper[4750]: I1008 19:12:43.735091 4750 scope.go:117] "RemoveContainer" containerID="4778468c7d5eb2d32f1ce6d0e884666388775f4c788e8eea1669c69aa0822658" Oct 08 19:12:43 crc kubenswrapper[4750]: E1008 19:12:43.735942 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:12:57 crc kubenswrapper[4750]: I1008 19:12:57.735012 4750 scope.go:117] "RemoveContainer" containerID="4778468c7d5eb2d32f1ce6d0e884666388775f4c788e8eea1669c69aa0822658" Oct 08 19:12:57 crc kubenswrapper[4750]: E1008 19:12:57.736024 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:13:08 crc kubenswrapper[4750]: I1008 19:13:08.735153 4750 scope.go:117] "RemoveContainer" containerID="4778468c7d5eb2d32f1ce6d0e884666388775f4c788e8eea1669c69aa0822658" Oct 08 19:13:08 crc kubenswrapper[4750]: E1008 19:13:08.736108 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:13:20 crc kubenswrapper[4750]: I1008 19:13:20.734193 4750 scope.go:117] "RemoveContainer" containerID="4778468c7d5eb2d32f1ce6d0e884666388775f4c788e8eea1669c69aa0822658" Oct 08 19:13:20 crc kubenswrapper[4750]: E1008 19:13:20.734937 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:13:34 crc kubenswrapper[4750]: I1008 19:13:34.740182 4750 scope.go:117] "RemoveContainer" containerID="4778468c7d5eb2d32f1ce6d0e884666388775f4c788e8eea1669c69aa0822658" Oct 08 19:13:34 crc kubenswrapper[4750]: E1008 19:13:34.740923 4750 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:13:49 crc kubenswrapper[4750]: I1008 19:13:49.734695 4750 scope.go:117] "RemoveContainer" containerID="4778468c7d5eb2d32f1ce6d0e884666388775f4c788e8eea1669c69aa0822658" Oct 08 19:13:49 crc kubenswrapper[4750]: E1008 19:13:49.735923 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:14:03 crc kubenswrapper[4750]: I1008 19:14:03.734093 4750 scope.go:117] "RemoveContainer" containerID="4778468c7d5eb2d32f1ce6d0e884666388775f4c788e8eea1669c69aa0822658" Oct 08 19:14:03 crc kubenswrapper[4750]: E1008 19:14:03.734817 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:14:16 crc kubenswrapper[4750]: I1008 19:14:16.735484 4750 scope.go:117] "RemoveContainer" containerID="4778468c7d5eb2d32f1ce6d0e884666388775f4c788e8eea1669c69aa0822658" Oct 08 19:14:16 crc kubenswrapper[4750]: E1008 19:14:16.736669 4750 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:14:17 crc kubenswrapper[4750]: I1008 19:14:17.546121 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mwgbz"] Oct 08 19:14:17 crc kubenswrapper[4750]: E1008 19:14:17.546507 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be318c07-cf31-48f4-8916-65d15c919e26" containerName="extract-content" Oct 08 19:14:17 crc kubenswrapper[4750]: I1008 19:14:17.546524 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="be318c07-cf31-48f4-8916-65d15c919e26" containerName="extract-content" Oct 08 19:14:17 crc kubenswrapper[4750]: E1008 19:14:17.546547 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be318c07-cf31-48f4-8916-65d15c919e26" containerName="extract-utilities" Oct 08 19:14:17 crc kubenswrapper[4750]: I1008 19:14:17.546567 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="be318c07-cf31-48f4-8916-65d15c919e26" containerName="extract-utilities" Oct 08 19:14:17 crc kubenswrapper[4750]: E1008 19:14:17.546588 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be318c07-cf31-48f4-8916-65d15c919e26" containerName="registry-server" Oct 08 19:14:17 crc kubenswrapper[4750]: I1008 19:14:17.546624 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="be318c07-cf31-48f4-8916-65d15c919e26" containerName="registry-server" Oct 08 19:14:17 crc kubenswrapper[4750]: I1008 19:14:17.546771 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="be318c07-cf31-48f4-8916-65d15c919e26" containerName="registry-server" Oct 08 19:14:17 crc 
kubenswrapper[4750]: I1008 19:14:17.547960 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mwgbz" Oct 08 19:14:17 crc kubenswrapper[4750]: I1008 19:14:17.562349 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mwgbz"] Oct 08 19:14:17 crc kubenswrapper[4750]: I1008 19:14:17.696904 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxcmv\" (UniqueName: \"kubernetes.io/projected/3c088cc2-322c-48ee-96d8-2e13ea092343-kube-api-access-dxcmv\") pod \"redhat-marketplace-mwgbz\" (UID: \"3c088cc2-322c-48ee-96d8-2e13ea092343\") " pod="openshift-marketplace/redhat-marketplace-mwgbz" Oct 08 19:14:17 crc kubenswrapper[4750]: I1008 19:14:17.697365 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c088cc2-322c-48ee-96d8-2e13ea092343-utilities\") pod \"redhat-marketplace-mwgbz\" (UID: \"3c088cc2-322c-48ee-96d8-2e13ea092343\") " pod="openshift-marketplace/redhat-marketplace-mwgbz" Oct 08 19:14:17 crc kubenswrapper[4750]: I1008 19:14:17.708082 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c088cc2-322c-48ee-96d8-2e13ea092343-catalog-content\") pod \"redhat-marketplace-mwgbz\" (UID: \"3c088cc2-322c-48ee-96d8-2e13ea092343\") " pod="openshift-marketplace/redhat-marketplace-mwgbz" Oct 08 19:14:17 crc kubenswrapper[4750]: I1008 19:14:17.816595 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxcmv\" (UniqueName: \"kubernetes.io/projected/3c088cc2-322c-48ee-96d8-2e13ea092343-kube-api-access-dxcmv\") pod \"redhat-marketplace-mwgbz\" (UID: \"3c088cc2-322c-48ee-96d8-2e13ea092343\") " pod="openshift-marketplace/redhat-marketplace-mwgbz" Oct 08 
19:14:17 crc kubenswrapper[4750]: I1008 19:14:17.816652 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c088cc2-322c-48ee-96d8-2e13ea092343-utilities\") pod \"redhat-marketplace-mwgbz\" (UID: \"3c088cc2-322c-48ee-96d8-2e13ea092343\") " pod="openshift-marketplace/redhat-marketplace-mwgbz" Oct 08 19:14:17 crc kubenswrapper[4750]: I1008 19:14:17.816692 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c088cc2-322c-48ee-96d8-2e13ea092343-catalog-content\") pod \"redhat-marketplace-mwgbz\" (UID: \"3c088cc2-322c-48ee-96d8-2e13ea092343\") " pod="openshift-marketplace/redhat-marketplace-mwgbz" Oct 08 19:14:17 crc kubenswrapper[4750]: I1008 19:14:17.818251 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c088cc2-322c-48ee-96d8-2e13ea092343-utilities\") pod \"redhat-marketplace-mwgbz\" (UID: \"3c088cc2-322c-48ee-96d8-2e13ea092343\") " pod="openshift-marketplace/redhat-marketplace-mwgbz" Oct 08 19:14:17 crc kubenswrapper[4750]: I1008 19:14:17.818742 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c088cc2-322c-48ee-96d8-2e13ea092343-catalog-content\") pod \"redhat-marketplace-mwgbz\" (UID: \"3c088cc2-322c-48ee-96d8-2e13ea092343\") " pod="openshift-marketplace/redhat-marketplace-mwgbz" Oct 08 19:14:17 crc kubenswrapper[4750]: I1008 19:14:17.844027 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxcmv\" (UniqueName: \"kubernetes.io/projected/3c088cc2-322c-48ee-96d8-2e13ea092343-kube-api-access-dxcmv\") pod \"redhat-marketplace-mwgbz\" (UID: \"3c088cc2-322c-48ee-96d8-2e13ea092343\") " pod="openshift-marketplace/redhat-marketplace-mwgbz" Oct 08 19:14:17 crc kubenswrapper[4750]: I1008 19:14:17.874127 4750 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mwgbz" Oct 08 19:14:18 crc kubenswrapper[4750]: I1008 19:14:18.406361 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mwgbz"] Oct 08 19:14:19 crc kubenswrapper[4750]: I1008 19:14:19.170068 4750 generic.go:334] "Generic (PLEG): container finished" podID="3c088cc2-322c-48ee-96d8-2e13ea092343" containerID="65d6a4b0ab11a4261f489410f11dc93f0732985641c8c5aacdfec69bce19a220" exitCode=0 Oct 08 19:14:19 crc kubenswrapper[4750]: I1008 19:14:19.170287 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mwgbz" event={"ID":"3c088cc2-322c-48ee-96d8-2e13ea092343","Type":"ContainerDied","Data":"65d6a4b0ab11a4261f489410f11dc93f0732985641c8c5aacdfec69bce19a220"} Oct 08 19:14:19 crc kubenswrapper[4750]: I1008 19:14:19.170519 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mwgbz" event={"ID":"3c088cc2-322c-48ee-96d8-2e13ea092343","Type":"ContainerStarted","Data":"ba7e96c8e69c826f8095c58c6d1973522a5f9632e067b4869544de06d2316927"} Oct 08 19:14:19 crc kubenswrapper[4750]: I1008 19:14:19.172288 4750 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 19:14:20 crc kubenswrapper[4750]: I1008 19:14:20.180071 4750 generic.go:334] "Generic (PLEG): container finished" podID="3c088cc2-322c-48ee-96d8-2e13ea092343" containerID="c2da52f129fa7219e426e50f363d235699405f26d65d263114b678d644436c11" exitCode=0 Oct 08 19:14:20 crc kubenswrapper[4750]: I1008 19:14:20.180156 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mwgbz" event={"ID":"3c088cc2-322c-48ee-96d8-2e13ea092343","Type":"ContainerDied","Data":"c2da52f129fa7219e426e50f363d235699405f26d65d263114b678d644436c11"} Oct 08 19:14:21 crc kubenswrapper[4750]: I1008 19:14:21.190763 
4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mwgbz" event={"ID":"3c088cc2-322c-48ee-96d8-2e13ea092343","Type":"ContainerStarted","Data":"d6258329b7e1ed59d614f50d67dd0cea4955c8d7bac972e1e35a3004ab6fa1fb"} Oct 08 19:14:21 crc kubenswrapper[4750]: I1008 19:14:21.212811 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mwgbz" podStartSLOduration=2.774634825 podStartE2EDuration="4.21279214s" podCreationTimestamp="2025-10-08 19:14:17 +0000 UTC" firstStartedPulling="2025-10-08 19:14:19.172017479 +0000 UTC m=+3815.084988492" lastFinishedPulling="2025-10-08 19:14:20.610174794 +0000 UTC m=+3816.523145807" observedRunningTime="2025-10-08 19:14:21.211870307 +0000 UTC m=+3817.124841330" watchObservedRunningTime="2025-10-08 19:14:21.21279214 +0000 UTC m=+3817.125763163" Oct 08 19:14:21 crc kubenswrapper[4750]: I1008 19:14:21.539190 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-d2czd"] Oct 08 19:14:21 crc kubenswrapper[4750]: I1008 19:14:21.541810 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d2czd" Oct 08 19:14:21 crc kubenswrapper[4750]: I1008 19:14:21.560898 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d2czd"] Oct 08 19:14:21 crc kubenswrapper[4750]: I1008 19:14:21.589981 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9gfr\" (UniqueName: \"kubernetes.io/projected/7babf26e-72ab-4cf8-af50-76478cfab7ba-kube-api-access-k9gfr\") pod \"redhat-operators-d2czd\" (UID: \"7babf26e-72ab-4cf8-af50-76478cfab7ba\") " pod="openshift-marketplace/redhat-operators-d2czd" Oct 08 19:14:21 crc kubenswrapper[4750]: I1008 19:14:21.590035 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7babf26e-72ab-4cf8-af50-76478cfab7ba-catalog-content\") pod \"redhat-operators-d2czd\" (UID: \"7babf26e-72ab-4cf8-af50-76478cfab7ba\") " pod="openshift-marketplace/redhat-operators-d2czd" Oct 08 19:14:21 crc kubenswrapper[4750]: I1008 19:14:21.590148 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7babf26e-72ab-4cf8-af50-76478cfab7ba-utilities\") pod \"redhat-operators-d2czd\" (UID: \"7babf26e-72ab-4cf8-af50-76478cfab7ba\") " pod="openshift-marketplace/redhat-operators-d2czd" Oct 08 19:14:21 crc kubenswrapper[4750]: I1008 19:14:21.691447 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9gfr\" (UniqueName: \"kubernetes.io/projected/7babf26e-72ab-4cf8-af50-76478cfab7ba-kube-api-access-k9gfr\") pod \"redhat-operators-d2czd\" (UID: \"7babf26e-72ab-4cf8-af50-76478cfab7ba\") " pod="openshift-marketplace/redhat-operators-d2czd" Oct 08 19:14:21 crc kubenswrapper[4750]: I1008 19:14:21.691527 4750 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7babf26e-72ab-4cf8-af50-76478cfab7ba-catalog-content\") pod \"redhat-operators-d2czd\" (UID: \"7babf26e-72ab-4cf8-af50-76478cfab7ba\") " pod="openshift-marketplace/redhat-operators-d2czd" Oct 08 19:14:21 crc kubenswrapper[4750]: I1008 19:14:21.691681 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7babf26e-72ab-4cf8-af50-76478cfab7ba-utilities\") pod \"redhat-operators-d2czd\" (UID: \"7babf26e-72ab-4cf8-af50-76478cfab7ba\") " pod="openshift-marketplace/redhat-operators-d2czd" Oct 08 19:14:21 crc kubenswrapper[4750]: I1008 19:14:21.692289 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7babf26e-72ab-4cf8-af50-76478cfab7ba-utilities\") pod \"redhat-operators-d2czd\" (UID: \"7babf26e-72ab-4cf8-af50-76478cfab7ba\") " pod="openshift-marketplace/redhat-operators-d2czd" Oct 08 19:14:21 crc kubenswrapper[4750]: I1008 19:14:21.692435 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7babf26e-72ab-4cf8-af50-76478cfab7ba-catalog-content\") pod \"redhat-operators-d2czd\" (UID: \"7babf26e-72ab-4cf8-af50-76478cfab7ba\") " pod="openshift-marketplace/redhat-operators-d2czd" Oct 08 19:14:21 crc kubenswrapper[4750]: I1008 19:14:21.717524 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9gfr\" (UniqueName: \"kubernetes.io/projected/7babf26e-72ab-4cf8-af50-76478cfab7ba-kube-api-access-k9gfr\") pod \"redhat-operators-d2czd\" (UID: \"7babf26e-72ab-4cf8-af50-76478cfab7ba\") " pod="openshift-marketplace/redhat-operators-d2czd" Oct 08 19:14:21 crc kubenswrapper[4750]: I1008 19:14:21.889031 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d2czd" Oct 08 19:14:22 crc kubenswrapper[4750]: I1008 19:14:22.130981 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d2czd"] Oct 08 19:14:22 crc kubenswrapper[4750]: W1008 19:14:22.138459 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7babf26e_72ab_4cf8_af50_76478cfab7ba.slice/crio-862dedd8f66db5548abedb26e9e27d9b34666e7e733d31aa88c513243ee63651 WatchSource:0}: Error finding container 862dedd8f66db5548abedb26e9e27d9b34666e7e733d31aa88c513243ee63651: Status 404 returned error can't find the container with id 862dedd8f66db5548abedb26e9e27d9b34666e7e733d31aa88c513243ee63651 Oct 08 19:14:22 crc kubenswrapper[4750]: I1008 19:14:22.202128 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d2czd" event={"ID":"7babf26e-72ab-4cf8-af50-76478cfab7ba","Type":"ContainerStarted","Data":"862dedd8f66db5548abedb26e9e27d9b34666e7e733d31aa88c513243ee63651"} Oct 08 19:14:23 crc kubenswrapper[4750]: I1008 19:14:23.210831 4750 generic.go:334] "Generic (PLEG): container finished" podID="7babf26e-72ab-4cf8-af50-76478cfab7ba" containerID="1c9863a63028d4e2669a56769715c8d79473665cc865fb64befcdd3e92f3f83d" exitCode=0 Oct 08 19:14:23 crc kubenswrapper[4750]: I1008 19:14:23.210945 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d2czd" event={"ID":"7babf26e-72ab-4cf8-af50-76478cfab7ba","Type":"ContainerDied","Data":"1c9863a63028d4e2669a56769715c8d79473665cc865fb64befcdd3e92f3f83d"} Oct 08 19:14:24 crc kubenswrapper[4750]: I1008 19:14:24.224011 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d2czd" 
event={"ID":"7babf26e-72ab-4cf8-af50-76478cfab7ba","Type":"ContainerStarted","Data":"0e9ae509520ae422c1218345a0ac802372ada7893ffffd719c9f93fc04699a4b"} Oct 08 19:14:25 crc kubenswrapper[4750]: I1008 19:14:25.238406 4750 generic.go:334] "Generic (PLEG): container finished" podID="7babf26e-72ab-4cf8-af50-76478cfab7ba" containerID="0e9ae509520ae422c1218345a0ac802372ada7893ffffd719c9f93fc04699a4b" exitCode=0 Oct 08 19:14:25 crc kubenswrapper[4750]: I1008 19:14:25.238522 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d2czd" event={"ID":"7babf26e-72ab-4cf8-af50-76478cfab7ba","Type":"ContainerDied","Data":"0e9ae509520ae422c1218345a0ac802372ada7893ffffd719c9f93fc04699a4b"} Oct 08 19:14:26 crc kubenswrapper[4750]: I1008 19:14:26.252960 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d2czd" event={"ID":"7babf26e-72ab-4cf8-af50-76478cfab7ba","Type":"ContainerStarted","Data":"5462fa9aa27387da9689bee0631e40e14a97c769b8825a30a427f585fe1a4739"} Oct 08 19:14:26 crc kubenswrapper[4750]: I1008 19:14:26.281674 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-d2czd" podStartSLOduration=2.823894269 podStartE2EDuration="5.281647914s" podCreationTimestamp="2025-10-08 19:14:21 +0000 UTC" firstStartedPulling="2025-10-08 19:14:23.21264241 +0000 UTC m=+3819.125613423" lastFinishedPulling="2025-10-08 19:14:25.670396055 +0000 UTC m=+3821.583367068" observedRunningTime="2025-10-08 19:14:26.276088467 +0000 UTC m=+3822.189059500" watchObservedRunningTime="2025-10-08 19:14:26.281647914 +0000 UTC m=+3822.194618927" Oct 08 19:14:27 crc kubenswrapper[4750]: I1008 19:14:27.734858 4750 scope.go:117] "RemoveContainer" containerID="4778468c7d5eb2d32f1ce6d0e884666388775f4c788e8eea1669c69aa0822658" Oct 08 19:14:27 crc kubenswrapper[4750]: E1008 19:14:27.735149 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:14:27 crc kubenswrapper[4750]: I1008 19:14:27.875237 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mwgbz" Oct 08 19:14:27 crc kubenswrapper[4750]: I1008 19:14:27.875309 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mwgbz" Oct 08 19:14:27 crc kubenswrapper[4750]: I1008 19:14:27.942963 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mwgbz" Oct 08 19:14:28 crc kubenswrapper[4750]: I1008 19:14:28.307518 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mwgbz" Oct 08 19:14:29 crc kubenswrapper[4750]: I1008 19:14:29.927059 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mwgbz"] Oct 08 19:14:30 crc kubenswrapper[4750]: I1008 19:14:30.282112 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mwgbz" podUID="3c088cc2-322c-48ee-96d8-2e13ea092343" containerName="registry-server" containerID="cri-o://d6258329b7e1ed59d614f50d67dd0cea4955c8d7bac972e1e35a3004ab6fa1fb" gracePeriod=2 Oct 08 19:14:31 crc kubenswrapper[4750]: I1008 19:14:31.295833 4750 generic.go:334] "Generic (PLEG): container finished" podID="3c088cc2-322c-48ee-96d8-2e13ea092343" containerID="d6258329b7e1ed59d614f50d67dd0cea4955c8d7bac972e1e35a3004ab6fa1fb" exitCode=0 Oct 08 19:14:31 crc kubenswrapper[4750]: I1008 19:14:31.295916 4750 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mwgbz" event={"ID":"3c088cc2-322c-48ee-96d8-2e13ea092343","Type":"ContainerDied","Data":"d6258329b7e1ed59d614f50d67dd0cea4955c8d7bac972e1e35a3004ab6fa1fb"} Oct 08 19:14:31 crc kubenswrapper[4750]: I1008 19:14:31.451187 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mwgbz" Oct 08 19:14:31 crc kubenswrapper[4750]: I1008 19:14:31.544605 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxcmv\" (UniqueName: \"kubernetes.io/projected/3c088cc2-322c-48ee-96d8-2e13ea092343-kube-api-access-dxcmv\") pod \"3c088cc2-322c-48ee-96d8-2e13ea092343\" (UID: \"3c088cc2-322c-48ee-96d8-2e13ea092343\") " Oct 08 19:14:31 crc kubenswrapper[4750]: I1008 19:14:31.544661 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c088cc2-322c-48ee-96d8-2e13ea092343-utilities\") pod \"3c088cc2-322c-48ee-96d8-2e13ea092343\" (UID: \"3c088cc2-322c-48ee-96d8-2e13ea092343\") " Oct 08 19:14:31 crc kubenswrapper[4750]: I1008 19:14:31.544755 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c088cc2-322c-48ee-96d8-2e13ea092343-catalog-content\") pod \"3c088cc2-322c-48ee-96d8-2e13ea092343\" (UID: \"3c088cc2-322c-48ee-96d8-2e13ea092343\") " Oct 08 19:14:31 crc kubenswrapper[4750]: I1008 19:14:31.545427 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c088cc2-322c-48ee-96d8-2e13ea092343-utilities" (OuterVolumeSpecName: "utilities") pod "3c088cc2-322c-48ee-96d8-2e13ea092343" (UID: "3c088cc2-322c-48ee-96d8-2e13ea092343"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:14:31 crc kubenswrapper[4750]: I1008 19:14:31.550731 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c088cc2-322c-48ee-96d8-2e13ea092343-kube-api-access-dxcmv" (OuterVolumeSpecName: "kube-api-access-dxcmv") pod "3c088cc2-322c-48ee-96d8-2e13ea092343" (UID: "3c088cc2-322c-48ee-96d8-2e13ea092343"). InnerVolumeSpecName "kube-api-access-dxcmv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:14:31 crc kubenswrapper[4750]: I1008 19:14:31.559428 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c088cc2-322c-48ee-96d8-2e13ea092343-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3c088cc2-322c-48ee-96d8-2e13ea092343" (UID: "3c088cc2-322c-48ee-96d8-2e13ea092343"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:14:31 crc kubenswrapper[4750]: I1008 19:14:31.646052 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c088cc2-322c-48ee-96d8-2e13ea092343-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 19:14:31 crc kubenswrapper[4750]: I1008 19:14:31.646086 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxcmv\" (UniqueName: \"kubernetes.io/projected/3c088cc2-322c-48ee-96d8-2e13ea092343-kube-api-access-dxcmv\") on node \"crc\" DevicePath \"\"" Oct 08 19:14:31 crc kubenswrapper[4750]: I1008 19:14:31.646100 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c088cc2-322c-48ee-96d8-2e13ea092343-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 19:14:31 crc kubenswrapper[4750]: I1008 19:14:31.889679 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-d2czd" Oct 08 19:14:31 crc 
kubenswrapper[4750]: I1008 19:14:31.889990 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-d2czd" Oct 08 19:14:31 crc kubenswrapper[4750]: I1008 19:14:31.943188 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-d2czd" Oct 08 19:14:32 crc kubenswrapper[4750]: I1008 19:14:32.306024 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mwgbz" event={"ID":"3c088cc2-322c-48ee-96d8-2e13ea092343","Type":"ContainerDied","Data":"ba7e96c8e69c826f8095c58c6d1973522a5f9632e067b4869544de06d2316927"} Oct 08 19:14:32 crc kubenswrapper[4750]: I1008 19:14:32.306050 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mwgbz" Oct 08 19:14:32 crc kubenswrapper[4750]: I1008 19:14:32.306087 4750 scope.go:117] "RemoveContainer" containerID="d6258329b7e1ed59d614f50d67dd0cea4955c8d7bac972e1e35a3004ab6fa1fb" Oct 08 19:14:32 crc kubenswrapper[4750]: I1008 19:14:32.322603 4750 scope.go:117] "RemoveContainer" containerID="c2da52f129fa7219e426e50f363d235699405f26d65d263114b678d644436c11" Oct 08 19:14:32 crc kubenswrapper[4750]: I1008 19:14:32.349822 4750 scope.go:117] "RemoveContainer" containerID="65d6a4b0ab11a4261f489410f11dc93f0732985641c8c5aacdfec69bce19a220" Oct 08 19:14:32 crc kubenswrapper[4750]: I1008 19:14:32.351725 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mwgbz"] Oct 08 19:14:32 crc kubenswrapper[4750]: I1008 19:14:32.360496 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mwgbz"] Oct 08 19:14:32 crc kubenswrapper[4750]: I1008 19:14:32.380401 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-d2czd" Oct 08 19:14:32 crc kubenswrapper[4750]: I1008 
19:14:32.742057 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c088cc2-322c-48ee-96d8-2e13ea092343" path="/var/lib/kubelet/pods/3c088cc2-322c-48ee-96d8-2e13ea092343/volumes"
Oct 08 19:14:34 crc kubenswrapper[4750]: I1008 19:14:34.125957 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d2czd"]
Oct 08 19:14:34 crc kubenswrapper[4750]: I1008 19:14:34.322102 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-d2czd" podUID="7babf26e-72ab-4cf8-af50-76478cfab7ba" containerName="registry-server" containerID="cri-o://5462fa9aa27387da9689bee0631e40e14a97c769b8825a30a427f585fe1a4739" gracePeriod=2
Oct 08 19:14:34 crc kubenswrapper[4750]: I1008 19:14:34.728302 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d2czd"
Oct 08 19:14:34 crc kubenswrapper[4750]: I1008 19:14:34.893602 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7babf26e-72ab-4cf8-af50-76478cfab7ba-catalog-content\") pod \"7babf26e-72ab-4cf8-af50-76478cfab7ba\" (UID: \"7babf26e-72ab-4cf8-af50-76478cfab7ba\") "
Oct 08 19:14:34 crc kubenswrapper[4750]: I1008 19:14:34.893887 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9gfr\" (UniqueName: \"kubernetes.io/projected/7babf26e-72ab-4cf8-af50-76478cfab7ba-kube-api-access-k9gfr\") pod \"7babf26e-72ab-4cf8-af50-76478cfab7ba\" (UID: \"7babf26e-72ab-4cf8-af50-76478cfab7ba\") "
Oct 08 19:14:34 crc kubenswrapper[4750]: I1008 19:14:34.893969 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7babf26e-72ab-4cf8-af50-76478cfab7ba-utilities\") pod \"7babf26e-72ab-4cf8-af50-76478cfab7ba\" (UID: \"7babf26e-72ab-4cf8-af50-76478cfab7ba\") "
Oct 08 19:14:34 crc kubenswrapper[4750]: I1008 19:14:34.895006 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7babf26e-72ab-4cf8-af50-76478cfab7ba-utilities" (OuterVolumeSpecName: "utilities") pod "7babf26e-72ab-4cf8-af50-76478cfab7ba" (UID: "7babf26e-72ab-4cf8-af50-76478cfab7ba"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 19:14:34 crc kubenswrapper[4750]: I1008 19:14:34.906897 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7babf26e-72ab-4cf8-af50-76478cfab7ba-kube-api-access-k9gfr" (OuterVolumeSpecName: "kube-api-access-k9gfr") pod "7babf26e-72ab-4cf8-af50-76478cfab7ba" (UID: "7babf26e-72ab-4cf8-af50-76478cfab7ba"). InnerVolumeSpecName "kube-api-access-k9gfr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 19:14:34 crc kubenswrapper[4750]: I1008 19:14:34.996266 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9gfr\" (UniqueName: \"kubernetes.io/projected/7babf26e-72ab-4cf8-af50-76478cfab7ba-kube-api-access-k9gfr\") on node \"crc\" DevicePath \"\""
Oct 08 19:14:34 crc kubenswrapper[4750]: I1008 19:14:34.996325 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7babf26e-72ab-4cf8-af50-76478cfab7ba-utilities\") on node \"crc\" DevicePath \"\""
Oct 08 19:14:35 crc kubenswrapper[4750]: I1008 19:14:35.334995 4750 generic.go:334] "Generic (PLEG): container finished" podID="7babf26e-72ab-4cf8-af50-76478cfab7ba" containerID="5462fa9aa27387da9689bee0631e40e14a97c769b8825a30a427f585fe1a4739" exitCode=0
Oct 08 19:14:35 crc kubenswrapper[4750]: I1008 19:14:35.335069 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d2czd" event={"ID":"7babf26e-72ab-4cf8-af50-76478cfab7ba","Type":"ContainerDied","Data":"5462fa9aa27387da9689bee0631e40e14a97c769b8825a30a427f585fe1a4739"}
Oct 08 19:14:35 crc kubenswrapper[4750]: I1008 19:14:35.335087 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d2czd"
Oct 08 19:14:35 crc kubenswrapper[4750]: I1008 19:14:35.335116 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d2czd" event={"ID":"7babf26e-72ab-4cf8-af50-76478cfab7ba","Type":"ContainerDied","Data":"862dedd8f66db5548abedb26e9e27d9b34666e7e733d31aa88c513243ee63651"}
Oct 08 19:14:35 crc kubenswrapper[4750]: I1008 19:14:35.335150 4750 scope.go:117] "RemoveContainer" containerID="5462fa9aa27387da9689bee0631e40e14a97c769b8825a30a427f585fe1a4739"
Oct 08 19:14:35 crc kubenswrapper[4750]: I1008 19:14:35.362096 4750 scope.go:117] "RemoveContainer" containerID="0e9ae509520ae422c1218345a0ac802372ada7893ffffd719c9f93fc04699a4b"
Oct 08 19:14:35 crc kubenswrapper[4750]: I1008 19:14:35.388112 4750 scope.go:117] "RemoveContainer" containerID="1c9863a63028d4e2669a56769715c8d79473665cc865fb64befcdd3e92f3f83d"
Oct 08 19:14:35 crc kubenswrapper[4750]: I1008 19:14:35.412533 4750 scope.go:117] "RemoveContainer" containerID="5462fa9aa27387da9689bee0631e40e14a97c769b8825a30a427f585fe1a4739"
Oct 08 19:14:35 crc kubenswrapper[4750]: E1008 19:14:35.413308 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5462fa9aa27387da9689bee0631e40e14a97c769b8825a30a427f585fe1a4739\": container with ID starting with 5462fa9aa27387da9689bee0631e40e14a97c769b8825a30a427f585fe1a4739 not found: ID does not exist" containerID="5462fa9aa27387da9689bee0631e40e14a97c769b8825a30a427f585fe1a4739"
Oct 08 19:14:35 crc kubenswrapper[4750]: I1008 19:14:35.413398 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5462fa9aa27387da9689bee0631e40e14a97c769b8825a30a427f585fe1a4739"} err="failed to get container status \"5462fa9aa27387da9689bee0631e40e14a97c769b8825a30a427f585fe1a4739\": rpc error: code = NotFound desc = could not find container \"5462fa9aa27387da9689bee0631e40e14a97c769b8825a30a427f585fe1a4739\": container with ID starting with 5462fa9aa27387da9689bee0631e40e14a97c769b8825a30a427f585fe1a4739 not found: ID does not exist"
Oct 08 19:14:35 crc kubenswrapper[4750]: I1008 19:14:35.413442 4750 scope.go:117] "RemoveContainer" containerID="0e9ae509520ae422c1218345a0ac802372ada7893ffffd719c9f93fc04699a4b"
Oct 08 19:14:35 crc kubenswrapper[4750]: E1008 19:14:35.414016 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e9ae509520ae422c1218345a0ac802372ada7893ffffd719c9f93fc04699a4b\": container with ID starting with 0e9ae509520ae422c1218345a0ac802372ada7893ffffd719c9f93fc04699a4b not found: ID does not exist" containerID="0e9ae509520ae422c1218345a0ac802372ada7893ffffd719c9f93fc04699a4b"
Oct 08 19:14:35 crc kubenswrapper[4750]: I1008 19:14:35.414052 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e9ae509520ae422c1218345a0ac802372ada7893ffffd719c9f93fc04699a4b"} err="failed to get container status \"0e9ae509520ae422c1218345a0ac802372ada7893ffffd719c9f93fc04699a4b\": rpc error: code = NotFound desc = could not find container \"0e9ae509520ae422c1218345a0ac802372ada7893ffffd719c9f93fc04699a4b\": container with ID starting with 0e9ae509520ae422c1218345a0ac802372ada7893ffffd719c9f93fc04699a4b not found: ID does not exist"
Oct 08 19:14:35 crc kubenswrapper[4750]: I1008 19:14:35.414074 4750 scope.go:117] "RemoveContainer" containerID="1c9863a63028d4e2669a56769715c8d79473665cc865fb64befcdd3e92f3f83d"
Oct 08 19:14:35 crc kubenswrapper[4750]: E1008 19:14:35.415351 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c9863a63028d4e2669a56769715c8d79473665cc865fb64befcdd3e92f3f83d\": container with ID starting with 1c9863a63028d4e2669a56769715c8d79473665cc865fb64befcdd3e92f3f83d not found: ID does not exist" containerID="1c9863a63028d4e2669a56769715c8d79473665cc865fb64befcdd3e92f3f83d"
Oct 08 19:14:35 crc kubenswrapper[4750]: I1008 19:14:35.415412 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c9863a63028d4e2669a56769715c8d79473665cc865fb64befcdd3e92f3f83d"} err="failed to get container status \"1c9863a63028d4e2669a56769715c8d79473665cc865fb64befcdd3e92f3f83d\": rpc error: code = NotFound desc = could not find container \"1c9863a63028d4e2669a56769715c8d79473665cc865fb64befcdd3e92f3f83d\": container with ID starting with 1c9863a63028d4e2669a56769715c8d79473665cc865fb64befcdd3e92f3f83d not found: ID does not exist"
Oct 08 19:14:35 crc kubenswrapper[4750]: I1008 19:14:35.735400 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7babf26e-72ab-4cf8-af50-76478cfab7ba-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7babf26e-72ab-4cf8-af50-76478cfab7ba" (UID: "7babf26e-72ab-4cf8-af50-76478cfab7ba"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 19:14:35 crc kubenswrapper[4750]: I1008 19:14:35.820483 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7babf26e-72ab-4cf8-af50-76478cfab7ba-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 08 19:14:35 crc kubenswrapper[4750]: I1008 19:14:35.988019 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d2czd"]
Oct 08 19:14:35 crc kubenswrapper[4750]: I1008 19:14:35.998037 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-d2czd"]
Oct 08 19:14:36 crc kubenswrapper[4750]: I1008 19:14:36.745797 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7babf26e-72ab-4cf8-af50-76478cfab7ba" path="/var/lib/kubelet/pods/7babf26e-72ab-4cf8-af50-76478cfab7ba/volumes"
Oct 08 19:14:38 crc kubenswrapper[4750]: I1008 19:14:38.734440 4750 scope.go:117] "RemoveContainer" containerID="4778468c7d5eb2d32f1ce6d0e884666388775f4c788e8eea1669c69aa0822658"
Oct 08 19:14:38 crc kubenswrapper[4750]: E1008 19:14:38.735393 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4"
Oct 08 19:14:53 crc kubenswrapper[4750]: I1008 19:14:53.734899 4750 scope.go:117] "RemoveContainer" containerID="4778468c7d5eb2d32f1ce6d0e884666388775f4c788e8eea1669c69aa0822658"
Oct 08 19:14:53 crc kubenswrapper[4750]: E1008 19:14:53.735791 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4"
Oct 08 19:15:00 crc kubenswrapper[4750]: I1008 19:15:00.173375 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332515-t26kr"]
Oct 08 19:15:00 crc kubenswrapper[4750]: E1008 19:15:00.174626 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c088cc2-322c-48ee-96d8-2e13ea092343" containerName="extract-content"
Oct 08 19:15:00 crc kubenswrapper[4750]: I1008 19:15:00.174643 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c088cc2-322c-48ee-96d8-2e13ea092343" containerName="extract-content"
Oct 08 19:15:00 crc kubenswrapper[4750]: E1008 19:15:00.174661 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7babf26e-72ab-4cf8-af50-76478cfab7ba" containerName="extract-content"
Oct 08 19:15:00 crc kubenswrapper[4750]: I1008 19:15:00.174668 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="7babf26e-72ab-4cf8-af50-76478cfab7ba" containerName="extract-content"
Oct 08 19:15:00 crc kubenswrapper[4750]: E1008 19:15:00.174686 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c088cc2-322c-48ee-96d8-2e13ea092343" containerName="registry-server"
Oct 08 19:15:00 crc kubenswrapper[4750]: I1008 19:15:00.174692 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c088cc2-322c-48ee-96d8-2e13ea092343" containerName="registry-server"
Oct 08 19:15:00 crc kubenswrapper[4750]: E1008 19:15:00.174702 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c088cc2-322c-48ee-96d8-2e13ea092343" containerName="extract-utilities"
Oct 08 19:15:00 crc kubenswrapper[4750]: I1008 19:15:00.174709 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c088cc2-322c-48ee-96d8-2e13ea092343" containerName="extract-utilities"
Oct 08 19:15:00 crc kubenswrapper[4750]: E1008 19:15:00.174729 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7babf26e-72ab-4cf8-af50-76478cfab7ba" containerName="registry-server"
Oct 08 19:15:00 crc kubenswrapper[4750]: I1008 19:15:00.174735 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="7babf26e-72ab-4cf8-af50-76478cfab7ba" containerName="registry-server"
Oct 08 19:15:00 crc kubenswrapper[4750]: E1008 19:15:00.174756 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7babf26e-72ab-4cf8-af50-76478cfab7ba" containerName="extract-utilities"
Oct 08 19:15:00 crc kubenswrapper[4750]: I1008 19:15:00.174762 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="7babf26e-72ab-4cf8-af50-76478cfab7ba" containerName="extract-utilities"
Oct 08 19:15:00 crc kubenswrapper[4750]: I1008 19:15:00.174910 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c088cc2-322c-48ee-96d8-2e13ea092343" containerName="registry-server"
Oct 08 19:15:00 crc kubenswrapper[4750]: I1008 19:15:00.174929 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="7babf26e-72ab-4cf8-af50-76478cfab7ba" containerName="registry-server"
Oct 08 19:15:00 crc kubenswrapper[4750]: I1008 19:15:00.175504 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332515-t26kr"
Oct 08 19:15:00 crc kubenswrapper[4750]: I1008 19:15:00.179390 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 08 19:15:00 crc kubenswrapper[4750]: I1008 19:15:00.179480 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 08 19:15:00 crc kubenswrapper[4750]: I1008 19:15:00.180028 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332515-t26kr"]
Oct 08 19:15:00 crc kubenswrapper[4750]: I1008 19:15:00.318949 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a453be50-65d1-4bc8-a677-42456dc355d5-config-volume\") pod \"collect-profiles-29332515-t26kr\" (UID: \"a453be50-65d1-4bc8-a677-42456dc355d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332515-t26kr"
Oct 08 19:15:00 crc kubenswrapper[4750]: I1008 19:15:00.319250 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a453be50-65d1-4bc8-a677-42456dc355d5-secret-volume\") pod \"collect-profiles-29332515-t26kr\" (UID: \"a453be50-65d1-4bc8-a677-42456dc355d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332515-t26kr"
Oct 08 19:15:00 crc kubenswrapper[4750]: I1008 19:15:00.319416 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt4fs\" (UniqueName: \"kubernetes.io/projected/a453be50-65d1-4bc8-a677-42456dc355d5-kube-api-access-kt4fs\") pod \"collect-profiles-29332515-t26kr\" (UID: \"a453be50-65d1-4bc8-a677-42456dc355d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332515-t26kr"
Oct 08 19:15:00 crc kubenswrapper[4750]: I1008 19:15:00.420927 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a453be50-65d1-4bc8-a677-42456dc355d5-secret-volume\") pod \"collect-profiles-29332515-t26kr\" (UID: \"a453be50-65d1-4bc8-a677-42456dc355d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332515-t26kr"
Oct 08 19:15:00 crc kubenswrapper[4750]: I1008 19:15:00.421052 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt4fs\" (UniqueName: \"kubernetes.io/projected/a453be50-65d1-4bc8-a677-42456dc355d5-kube-api-access-kt4fs\") pod \"collect-profiles-29332515-t26kr\" (UID: \"a453be50-65d1-4bc8-a677-42456dc355d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332515-t26kr"
Oct 08 19:15:00 crc kubenswrapper[4750]: I1008 19:15:00.421112 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a453be50-65d1-4bc8-a677-42456dc355d5-config-volume\") pod \"collect-profiles-29332515-t26kr\" (UID: \"a453be50-65d1-4bc8-a677-42456dc355d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332515-t26kr"
Oct 08 19:15:00 crc kubenswrapper[4750]: I1008 19:15:00.422209 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a453be50-65d1-4bc8-a677-42456dc355d5-config-volume\") pod \"collect-profiles-29332515-t26kr\" (UID: \"a453be50-65d1-4bc8-a677-42456dc355d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332515-t26kr"
Oct 08 19:15:00 crc kubenswrapper[4750]: I1008 19:15:00.427706 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a453be50-65d1-4bc8-a677-42456dc355d5-secret-volume\") pod \"collect-profiles-29332515-t26kr\" (UID: \"a453be50-65d1-4bc8-a677-42456dc355d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332515-t26kr"
Oct 08 19:15:00 crc kubenswrapper[4750]: I1008 19:15:00.439166 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt4fs\" (UniqueName: \"kubernetes.io/projected/a453be50-65d1-4bc8-a677-42456dc355d5-kube-api-access-kt4fs\") pod \"collect-profiles-29332515-t26kr\" (UID: \"a453be50-65d1-4bc8-a677-42456dc355d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332515-t26kr"
Oct 08 19:15:00 crc kubenswrapper[4750]: I1008 19:15:00.502938 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332515-t26kr"
Oct 08 19:15:00 crc kubenswrapper[4750]: I1008 19:15:00.956215 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332515-t26kr"]
Oct 08 19:15:01 crc kubenswrapper[4750]: I1008 19:15:01.582969 4750 generic.go:334] "Generic (PLEG): container finished" podID="a453be50-65d1-4bc8-a677-42456dc355d5" containerID="cf0812dd8debcf4c3eab99a904a0e6fbf5573f1be9f45fd5d5ee2781a9548443" exitCode=0
Oct 08 19:15:01 crc kubenswrapper[4750]: I1008 19:15:01.583487 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332515-t26kr" event={"ID":"a453be50-65d1-4bc8-a677-42456dc355d5","Type":"ContainerDied","Data":"cf0812dd8debcf4c3eab99a904a0e6fbf5573f1be9f45fd5d5ee2781a9548443"}
Oct 08 19:15:01 crc kubenswrapper[4750]: I1008 19:15:01.583604 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332515-t26kr" event={"ID":"a453be50-65d1-4bc8-a677-42456dc355d5","Type":"ContainerStarted","Data":"0525840f0686a6c1fab735d9a987c1f6cde8a50de3ba3c8badf70044626c01a8"}
Oct 08 19:15:02 crc kubenswrapper[4750]: I1008 19:15:02.846538 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332515-t26kr"
Oct 08 19:15:02 crc kubenswrapper[4750]: I1008 19:15:02.864260 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a453be50-65d1-4bc8-a677-42456dc355d5-config-volume\") pod \"a453be50-65d1-4bc8-a677-42456dc355d5\" (UID: \"a453be50-65d1-4bc8-a677-42456dc355d5\") "
Oct 08 19:15:02 crc kubenswrapper[4750]: I1008 19:15:02.864413 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a453be50-65d1-4bc8-a677-42456dc355d5-secret-volume\") pod \"a453be50-65d1-4bc8-a677-42456dc355d5\" (UID: \"a453be50-65d1-4bc8-a677-42456dc355d5\") "
Oct 08 19:15:02 crc kubenswrapper[4750]: I1008 19:15:02.864456 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kt4fs\" (UniqueName: \"kubernetes.io/projected/a453be50-65d1-4bc8-a677-42456dc355d5-kube-api-access-kt4fs\") pod \"a453be50-65d1-4bc8-a677-42456dc355d5\" (UID: \"a453be50-65d1-4bc8-a677-42456dc355d5\") "
Oct 08 19:15:02 crc kubenswrapper[4750]: I1008 19:15:02.866443 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a453be50-65d1-4bc8-a677-42456dc355d5-config-volume" (OuterVolumeSpecName: "config-volume") pod "a453be50-65d1-4bc8-a677-42456dc355d5" (UID: "a453be50-65d1-4bc8-a677-42456dc355d5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 19:15:02 crc kubenswrapper[4750]: I1008 19:15:02.870510 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a453be50-65d1-4bc8-a677-42456dc355d5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a453be50-65d1-4bc8-a677-42456dc355d5" (UID: "a453be50-65d1-4bc8-a677-42456dc355d5"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 19:15:02 crc kubenswrapper[4750]: I1008 19:15:02.871083 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a453be50-65d1-4bc8-a677-42456dc355d5-kube-api-access-kt4fs" (OuterVolumeSpecName: "kube-api-access-kt4fs") pod "a453be50-65d1-4bc8-a677-42456dc355d5" (UID: "a453be50-65d1-4bc8-a677-42456dc355d5"). InnerVolumeSpecName "kube-api-access-kt4fs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 19:15:02 crc kubenswrapper[4750]: I1008 19:15:02.965938 4750 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a453be50-65d1-4bc8-a677-42456dc355d5-config-volume\") on node \"crc\" DevicePath \"\""
Oct 08 19:15:02 crc kubenswrapper[4750]: I1008 19:15:02.965972 4750 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a453be50-65d1-4bc8-a677-42456dc355d5-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 08 19:15:02 crc kubenswrapper[4750]: I1008 19:15:02.965982 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kt4fs\" (UniqueName: \"kubernetes.io/projected/a453be50-65d1-4bc8-a677-42456dc355d5-kube-api-access-kt4fs\") on node \"crc\" DevicePath \"\""
Oct 08 19:15:03 crc kubenswrapper[4750]: I1008 19:15:03.602391 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332515-t26kr" event={"ID":"a453be50-65d1-4bc8-a677-42456dc355d5","Type":"ContainerDied","Data":"0525840f0686a6c1fab735d9a987c1f6cde8a50de3ba3c8badf70044626c01a8"}
Oct 08 19:15:03 crc kubenswrapper[4750]: I1008 19:15:03.602459 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0525840f0686a6c1fab735d9a987c1f6cde8a50de3ba3c8badf70044626c01a8"
Oct 08 19:15:03 crc kubenswrapper[4750]: I1008 19:15:03.602518 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332515-t26kr"
Oct 08 19:15:03 crc kubenswrapper[4750]: I1008 19:15:03.929324 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332470-q9tpl"]
Oct 08 19:15:03 crc kubenswrapper[4750]: I1008 19:15:03.938068 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332470-q9tpl"]
Oct 08 19:15:04 crc kubenswrapper[4750]: I1008 19:15:04.742495 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4c9a37b-ca76-4dad-bbb1-adb67557d216" path="/var/lib/kubelet/pods/a4c9a37b-ca76-4dad-bbb1-adb67557d216/volumes"
Oct 08 19:15:06 crc kubenswrapper[4750]: I1008 19:15:06.734209 4750 scope.go:117] "RemoveContainer" containerID="4778468c7d5eb2d32f1ce6d0e884666388775f4c788e8eea1669c69aa0822658"
Oct 08 19:15:06 crc kubenswrapper[4750]: E1008 19:15:06.734457 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4"
Oct 08 19:15:18 crc kubenswrapper[4750]: I1008 19:15:18.734766 4750 scope.go:117] "RemoveContainer" containerID="4778468c7d5eb2d32f1ce6d0e884666388775f4c788e8eea1669c69aa0822658"
Oct 08 19:15:18 crc kubenswrapper[4750]: E1008 19:15:18.736194 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4"
Oct 08 19:15:33 crc kubenswrapper[4750]: I1008 19:15:33.734863 4750 scope.go:117] "RemoveContainer" containerID="4778468c7d5eb2d32f1ce6d0e884666388775f4c788e8eea1669c69aa0822658"
Oct 08 19:15:33 crc kubenswrapper[4750]: E1008 19:15:33.735963 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4"
Oct 08 19:15:48 crc kubenswrapper[4750]: I1008 19:15:48.735593 4750 scope.go:117] "RemoveContainer" containerID="4778468c7d5eb2d32f1ce6d0e884666388775f4c788e8eea1669c69aa0822658"
Oct 08 19:15:48 crc kubenswrapper[4750]: E1008 19:15:48.737237 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4"
Oct 08 19:15:51 crc kubenswrapper[4750]: I1008 19:15:51.508442 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fjvtw"]
Oct 08 19:15:51 crc kubenswrapper[4750]: E1008 19:15:51.509251 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a453be50-65d1-4bc8-a677-42456dc355d5" containerName="collect-profiles"
Oct 08 19:15:51 crc kubenswrapper[4750]: I1008 19:15:51.509266 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="a453be50-65d1-4bc8-a677-42456dc355d5" containerName="collect-profiles"
Oct 08 19:15:51 crc kubenswrapper[4750]: I1008 19:15:51.509440 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="a453be50-65d1-4bc8-a677-42456dc355d5" containerName="collect-profiles"
Oct 08 19:15:51 crc kubenswrapper[4750]: I1008 19:15:51.510614 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fjvtw"
Oct 08 19:15:51 crc kubenswrapper[4750]: I1008 19:15:51.522218 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fjvtw"]
Oct 08 19:15:51 crc kubenswrapper[4750]: I1008 19:15:51.690569 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnfq6\" (UniqueName: \"kubernetes.io/projected/bdc46603-728e-4fcc-875f-a9a517aa8f70-kube-api-access-jnfq6\") pod \"community-operators-fjvtw\" (UID: \"bdc46603-728e-4fcc-875f-a9a517aa8f70\") " pod="openshift-marketplace/community-operators-fjvtw"
Oct 08 19:15:51 crc kubenswrapper[4750]: I1008 19:15:51.690685 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdc46603-728e-4fcc-875f-a9a517aa8f70-utilities\") pod \"community-operators-fjvtw\" (UID: \"bdc46603-728e-4fcc-875f-a9a517aa8f70\") " pod="openshift-marketplace/community-operators-fjvtw"
Oct 08 19:15:51 crc kubenswrapper[4750]: I1008 19:15:51.690718 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdc46603-728e-4fcc-875f-a9a517aa8f70-catalog-content\") pod \"community-operators-fjvtw\" (UID: \"bdc46603-728e-4fcc-875f-a9a517aa8f70\") " pod="openshift-marketplace/community-operators-fjvtw"
Oct 08 19:15:51 crc kubenswrapper[4750]: I1008 19:15:51.791815 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnfq6\" (UniqueName: \"kubernetes.io/projected/bdc46603-728e-4fcc-875f-a9a517aa8f70-kube-api-access-jnfq6\") pod \"community-operators-fjvtw\" (UID: \"bdc46603-728e-4fcc-875f-a9a517aa8f70\") " pod="openshift-marketplace/community-operators-fjvtw"
Oct 08 19:15:51 crc kubenswrapper[4750]: I1008 19:15:51.791899 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdc46603-728e-4fcc-875f-a9a517aa8f70-utilities\") pod \"community-operators-fjvtw\" (UID: \"bdc46603-728e-4fcc-875f-a9a517aa8f70\") " pod="openshift-marketplace/community-operators-fjvtw"
Oct 08 19:15:51 crc kubenswrapper[4750]: I1008 19:15:51.791920 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdc46603-728e-4fcc-875f-a9a517aa8f70-catalog-content\") pod \"community-operators-fjvtw\" (UID: \"bdc46603-728e-4fcc-875f-a9a517aa8f70\") " pod="openshift-marketplace/community-operators-fjvtw"
Oct 08 19:15:51 crc kubenswrapper[4750]: I1008 19:15:51.792465 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdc46603-728e-4fcc-875f-a9a517aa8f70-utilities\") pod \"community-operators-fjvtw\" (UID: \"bdc46603-728e-4fcc-875f-a9a517aa8f70\") " pod="openshift-marketplace/community-operators-fjvtw"
Oct 08 19:15:51 crc kubenswrapper[4750]: I1008 19:15:51.792478 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdc46603-728e-4fcc-875f-a9a517aa8f70-catalog-content\") pod \"community-operators-fjvtw\" (UID: \"bdc46603-728e-4fcc-875f-a9a517aa8f70\") " pod="openshift-marketplace/community-operators-fjvtw"
Oct 08 19:15:51 crc kubenswrapper[4750]: I1008 19:15:51.816831 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnfq6\" (UniqueName: \"kubernetes.io/projected/bdc46603-728e-4fcc-875f-a9a517aa8f70-kube-api-access-jnfq6\") pod \"community-operators-fjvtw\" (UID: \"bdc46603-728e-4fcc-875f-a9a517aa8f70\") " pod="openshift-marketplace/community-operators-fjvtw"
Oct 08 19:15:51 crc kubenswrapper[4750]: I1008 19:15:51.832522 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fjvtw"
Oct 08 19:15:52 crc kubenswrapper[4750]: I1008 19:15:52.171912 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fjvtw"]
Oct 08 19:15:53 crc kubenswrapper[4750]: I1008 19:15:53.034192 4750 generic.go:334] "Generic (PLEG): container finished" podID="bdc46603-728e-4fcc-875f-a9a517aa8f70" containerID="3ded2618772db4fab79db14d9acd16801e0eb3e22362e304a345614e46fed726" exitCode=0
Oct 08 19:15:53 crc kubenswrapper[4750]: I1008 19:15:53.034261 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fjvtw" event={"ID":"bdc46603-728e-4fcc-875f-a9a517aa8f70","Type":"ContainerDied","Data":"3ded2618772db4fab79db14d9acd16801e0eb3e22362e304a345614e46fed726"}
Oct 08 19:15:53 crc kubenswrapper[4750]: I1008 19:15:53.034355 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fjvtw" event={"ID":"bdc46603-728e-4fcc-875f-a9a517aa8f70","Type":"ContainerStarted","Data":"46c19f54df5660704abe25109707b5b4feb66b4dc22e812438abb842d7f551ae"}
Oct 08 19:15:54 crc kubenswrapper[4750]: I1008 19:15:54.044751 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fjvtw" event={"ID":"bdc46603-728e-4fcc-875f-a9a517aa8f70","Type":"ContainerStarted","Data":"fb87de4f3b9c7a31921150a36508a713a3d27efdda11184b5597e502a91bb6aa"}
Oct 08 19:15:55 crc kubenswrapper[4750]: I1008 19:15:55.060198 4750 generic.go:334] "Generic (PLEG): container finished" podID="bdc46603-728e-4fcc-875f-a9a517aa8f70" containerID="fb87de4f3b9c7a31921150a36508a713a3d27efdda11184b5597e502a91bb6aa" exitCode=0
Oct 08 19:15:55 crc kubenswrapper[4750]: I1008 19:15:55.060488 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fjvtw" event={"ID":"bdc46603-728e-4fcc-875f-a9a517aa8f70","Type":"ContainerDied","Data":"fb87de4f3b9c7a31921150a36508a713a3d27efdda11184b5597e502a91bb6aa"}
Oct 08 19:15:56 crc kubenswrapper[4750]: I1008 19:15:56.080724 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fjvtw" event={"ID":"bdc46603-728e-4fcc-875f-a9a517aa8f70","Type":"ContainerStarted","Data":"1ab3a6079d1bc683b3646c61d11857487955fb907fb8bcaf5da6537a640ee2c6"}
Oct 08 19:15:56 crc kubenswrapper[4750]: I1008 19:15:56.108651 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fjvtw" podStartSLOduration=2.590895127 podStartE2EDuration="5.108628052s" podCreationTimestamp="2025-10-08 19:15:51 +0000 UTC" firstStartedPulling="2025-10-08 19:15:53.035786044 +0000 UTC m=+3908.948757057" lastFinishedPulling="2025-10-08 19:15:55.553518969 +0000 UTC m=+3911.466489982" observedRunningTime="2025-10-08 19:15:56.108124089 +0000 UTC m=+3912.021095092" watchObservedRunningTime="2025-10-08 19:15:56.108628052 +0000 UTC m=+3912.021599065"
Oct 08 19:16:01 crc kubenswrapper[4750]: I1008 19:16:01.833379 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fjvtw"
Oct 08 19:16:01 crc kubenswrapper[4750]: I1008 19:16:01.834332 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fjvtw"
Oct 08 19:16:01 crc kubenswrapper[4750]: I1008 19:16:01.888783 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fjvtw"
Oct 08 19:16:02 crc kubenswrapper[4750]: I1008 19:16:02.175273 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fjvtw"
Oct 08 19:16:02 crc kubenswrapper[4750]: I1008 19:16:02.225415 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fjvtw"]
Oct 08 19:16:02 crc kubenswrapper[4750]: I1008 19:16:02.232376 4750 scope.go:117] "RemoveContainer" containerID="1d894b37c5a500471f396c1cbbe6a415d7623a544a977aa562061f0f862047d7"
Oct 08 19:16:03 crc kubenswrapper[4750]: I1008 19:16:03.734173 4750 scope.go:117] "RemoveContainer" containerID="4778468c7d5eb2d32f1ce6d0e884666388775f4c788e8eea1669c69aa0822658"
Oct 08 19:16:03 crc kubenswrapper[4750]: E1008 19:16:03.734926 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4"
Oct 08 19:16:04 crc kubenswrapper[4750]: I1008 19:16:04.146112 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fjvtw" podUID="bdc46603-728e-4fcc-875f-a9a517aa8f70" containerName="registry-server" containerID="cri-o://1ab3a6079d1bc683b3646c61d11857487955fb907fb8bcaf5da6537a640ee2c6" gracePeriod=2
Oct 08 19:16:05 crc kubenswrapper[4750]: I1008 19:16:05.060151 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fjvtw"
Oct 08 19:16:05 crc kubenswrapper[4750]: I1008 19:16:05.155397 4750 generic.go:334] "Generic (PLEG): container finished" podID="bdc46603-728e-4fcc-875f-a9a517aa8f70" containerID="1ab3a6079d1bc683b3646c61d11857487955fb907fb8bcaf5da6537a640ee2c6" exitCode=0
Oct 08 19:16:05 crc kubenswrapper[4750]: I1008 19:16:05.155447 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fjvtw" event={"ID":"bdc46603-728e-4fcc-875f-a9a517aa8f70","Type":"ContainerDied","Data":"1ab3a6079d1bc683b3646c61d11857487955fb907fb8bcaf5da6537a640ee2c6"}
Oct 08 19:16:05 crc kubenswrapper[4750]: I1008 19:16:05.155480 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fjvtw" event={"ID":"bdc46603-728e-4fcc-875f-a9a517aa8f70","Type":"ContainerDied","Data":"46c19f54df5660704abe25109707b5b4feb66b4dc22e812438abb842d7f551ae"}
Oct 08 19:16:05 crc kubenswrapper[4750]: I1008 19:16:05.155501 4750 scope.go:117] "RemoveContainer" containerID="1ab3a6079d1bc683b3646c61d11857487955fb907fb8bcaf5da6537a640ee2c6"
Oct 08 19:16:05 crc kubenswrapper[4750]: I1008 19:16:05.155663 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fjvtw" Oct 08 19:16:05 crc kubenswrapper[4750]: I1008 19:16:05.178985 4750 scope.go:117] "RemoveContainer" containerID="fb87de4f3b9c7a31921150a36508a713a3d27efdda11184b5597e502a91bb6aa" Oct 08 19:16:05 crc kubenswrapper[4750]: I1008 19:16:05.200829 4750 scope.go:117] "RemoveContainer" containerID="3ded2618772db4fab79db14d9acd16801e0eb3e22362e304a345614e46fed726" Oct 08 19:16:05 crc kubenswrapper[4750]: I1008 19:16:05.201192 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnfq6\" (UniqueName: \"kubernetes.io/projected/bdc46603-728e-4fcc-875f-a9a517aa8f70-kube-api-access-jnfq6\") pod \"bdc46603-728e-4fcc-875f-a9a517aa8f70\" (UID: \"bdc46603-728e-4fcc-875f-a9a517aa8f70\") " Oct 08 19:16:05 crc kubenswrapper[4750]: I1008 19:16:05.201264 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdc46603-728e-4fcc-875f-a9a517aa8f70-utilities\") pod \"bdc46603-728e-4fcc-875f-a9a517aa8f70\" (UID: \"bdc46603-728e-4fcc-875f-a9a517aa8f70\") " Oct 08 19:16:05 crc kubenswrapper[4750]: I1008 19:16:05.201418 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdc46603-728e-4fcc-875f-a9a517aa8f70-catalog-content\") pod \"bdc46603-728e-4fcc-875f-a9a517aa8f70\" (UID: \"bdc46603-728e-4fcc-875f-a9a517aa8f70\") " Oct 08 19:16:05 crc kubenswrapper[4750]: I1008 19:16:05.202522 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdc46603-728e-4fcc-875f-a9a517aa8f70-utilities" (OuterVolumeSpecName: "utilities") pod "bdc46603-728e-4fcc-875f-a9a517aa8f70" (UID: "bdc46603-728e-4fcc-875f-a9a517aa8f70"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:16:05 crc kubenswrapper[4750]: I1008 19:16:05.206770 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdc46603-728e-4fcc-875f-a9a517aa8f70-kube-api-access-jnfq6" (OuterVolumeSpecName: "kube-api-access-jnfq6") pod "bdc46603-728e-4fcc-875f-a9a517aa8f70" (UID: "bdc46603-728e-4fcc-875f-a9a517aa8f70"). InnerVolumeSpecName "kube-api-access-jnfq6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:16:05 crc kubenswrapper[4750]: I1008 19:16:05.249922 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdc46603-728e-4fcc-875f-a9a517aa8f70-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bdc46603-728e-4fcc-875f-a9a517aa8f70" (UID: "bdc46603-728e-4fcc-875f-a9a517aa8f70"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:16:05 crc kubenswrapper[4750]: I1008 19:16:05.252237 4750 scope.go:117] "RemoveContainer" containerID="1ab3a6079d1bc683b3646c61d11857487955fb907fb8bcaf5da6537a640ee2c6" Oct 08 19:16:05 crc kubenswrapper[4750]: E1008 19:16:05.252713 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ab3a6079d1bc683b3646c61d11857487955fb907fb8bcaf5da6537a640ee2c6\": container with ID starting with 1ab3a6079d1bc683b3646c61d11857487955fb907fb8bcaf5da6537a640ee2c6 not found: ID does not exist" containerID="1ab3a6079d1bc683b3646c61d11857487955fb907fb8bcaf5da6537a640ee2c6" Oct 08 19:16:05 crc kubenswrapper[4750]: I1008 19:16:05.252839 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ab3a6079d1bc683b3646c61d11857487955fb907fb8bcaf5da6537a640ee2c6"} err="failed to get container status \"1ab3a6079d1bc683b3646c61d11857487955fb907fb8bcaf5da6537a640ee2c6\": rpc error: code = NotFound desc = could not find 
container \"1ab3a6079d1bc683b3646c61d11857487955fb907fb8bcaf5da6537a640ee2c6\": container with ID starting with 1ab3a6079d1bc683b3646c61d11857487955fb907fb8bcaf5da6537a640ee2c6 not found: ID does not exist" Oct 08 19:16:05 crc kubenswrapper[4750]: I1008 19:16:05.252928 4750 scope.go:117] "RemoveContainer" containerID="fb87de4f3b9c7a31921150a36508a713a3d27efdda11184b5597e502a91bb6aa" Oct 08 19:16:05 crc kubenswrapper[4750]: E1008 19:16:05.254032 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb87de4f3b9c7a31921150a36508a713a3d27efdda11184b5597e502a91bb6aa\": container with ID starting with fb87de4f3b9c7a31921150a36508a713a3d27efdda11184b5597e502a91bb6aa not found: ID does not exist" containerID="fb87de4f3b9c7a31921150a36508a713a3d27efdda11184b5597e502a91bb6aa" Oct 08 19:16:05 crc kubenswrapper[4750]: I1008 19:16:05.254084 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb87de4f3b9c7a31921150a36508a713a3d27efdda11184b5597e502a91bb6aa"} err="failed to get container status \"fb87de4f3b9c7a31921150a36508a713a3d27efdda11184b5597e502a91bb6aa\": rpc error: code = NotFound desc = could not find container \"fb87de4f3b9c7a31921150a36508a713a3d27efdda11184b5597e502a91bb6aa\": container with ID starting with fb87de4f3b9c7a31921150a36508a713a3d27efdda11184b5597e502a91bb6aa not found: ID does not exist" Oct 08 19:16:05 crc kubenswrapper[4750]: I1008 19:16:05.254111 4750 scope.go:117] "RemoveContainer" containerID="3ded2618772db4fab79db14d9acd16801e0eb3e22362e304a345614e46fed726" Oct 08 19:16:05 crc kubenswrapper[4750]: E1008 19:16:05.254348 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ded2618772db4fab79db14d9acd16801e0eb3e22362e304a345614e46fed726\": container with ID starting with 3ded2618772db4fab79db14d9acd16801e0eb3e22362e304a345614e46fed726 not found: ID does 
not exist" containerID="3ded2618772db4fab79db14d9acd16801e0eb3e22362e304a345614e46fed726" Oct 08 19:16:05 crc kubenswrapper[4750]: I1008 19:16:05.254427 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ded2618772db4fab79db14d9acd16801e0eb3e22362e304a345614e46fed726"} err="failed to get container status \"3ded2618772db4fab79db14d9acd16801e0eb3e22362e304a345614e46fed726\": rpc error: code = NotFound desc = could not find container \"3ded2618772db4fab79db14d9acd16801e0eb3e22362e304a345614e46fed726\": container with ID starting with 3ded2618772db4fab79db14d9acd16801e0eb3e22362e304a345614e46fed726 not found: ID does not exist" Oct 08 19:16:05 crc kubenswrapper[4750]: I1008 19:16:05.303597 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdc46603-728e-4fcc-875f-a9a517aa8f70-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 19:16:05 crc kubenswrapper[4750]: I1008 19:16:05.303647 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnfq6\" (UniqueName: \"kubernetes.io/projected/bdc46603-728e-4fcc-875f-a9a517aa8f70-kube-api-access-jnfq6\") on node \"crc\" DevicePath \"\"" Oct 08 19:16:05 crc kubenswrapper[4750]: I1008 19:16:05.303667 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdc46603-728e-4fcc-875f-a9a517aa8f70-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 19:16:05 crc kubenswrapper[4750]: I1008 19:16:05.491755 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fjvtw"] Oct 08 19:16:05 crc kubenswrapper[4750]: I1008 19:16:05.497434 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fjvtw"] Oct 08 19:16:06 crc kubenswrapper[4750]: I1008 19:16:06.746392 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="bdc46603-728e-4fcc-875f-a9a517aa8f70" path="/var/lib/kubelet/pods/bdc46603-728e-4fcc-875f-a9a517aa8f70/volumes" Oct 08 19:16:14 crc kubenswrapper[4750]: I1008 19:16:14.740502 4750 scope.go:117] "RemoveContainer" containerID="4778468c7d5eb2d32f1ce6d0e884666388775f4c788e8eea1669c69aa0822658" Oct 08 19:16:14 crc kubenswrapper[4750]: E1008 19:16:14.741698 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:16:25 crc kubenswrapper[4750]: I1008 19:16:25.734807 4750 scope.go:117] "RemoveContainer" containerID="4778468c7d5eb2d32f1ce6d0e884666388775f4c788e8eea1669c69aa0822658" Oct 08 19:16:25 crc kubenswrapper[4750]: E1008 19:16:25.738613 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:16:36 crc kubenswrapper[4750]: I1008 19:16:36.734012 4750 scope.go:117] "RemoveContainer" containerID="4778468c7d5eb2d32f1ce6d0e884666388775f4c788e8eea1669c69aa0822658" Oct 08 19:16:36 crc kubenswrapper[4750]: E1008 19:16:36.734869 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:16:48 crc kubenswrapper[4750]: I1008 19:16:48.735694 4750 scope.go:117] "RemoveContainer" containerID="4778468c7d5eb2d32f1ce6d0e884666388775f4c788e8eea1669c69aa0822658" Oct 08 19:16:48 crc kubenswrapper[4750]: E1008 19:16:48.737358 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:17:00 crc kubenswrapper[4750]: I1008 19:17:00.735026 4750 scope.go:117] "RemoveContainer" containerID="4778468c7d5eb2d32f1ce6d0e884666388775f4c788e8eea1669c69aa0822658" Oct 08 19:17:01 crc kubenswrapper[4750]: I1008 19:17:01.681742 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerStarted","Data":"0f70824c7542206dbd343a708ccfe13ae9ad076d7419088df7d1967e3c430ad3"} Oct 08 19:17:06 crc kubenswrapper[4750]: I1008 19:17:06.122675 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cnqmk"] Oct 08 19:17:06 crc kubenswrapper[4750]: E1008 19:17:06.127156 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdc46603-728e-4fcc-875f-a9a517aa8f70" containerName="extract-content" Oct 08 19:17:06 crc kubenswrapper[4750]: I1008 19:17:06.127190 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdc46603-728e-4fcc-875f-a9a517aa8f70" containerName="extract-content" Oct 08 19:17:06 crc 
kubenswrapper[4750]: E1008 19:17:06.127220 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdc46603-728e-4fcc-875f-a9a517aa8f70" containerName="extract-utilities" Oct 08 19:17:06 crc kubenswrapper[4750]: I1008 19:17:06.127227 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdc46603-728e-4fcc-875f-a9a517aa8f70" containerName="extract-utilities" Oct 08 19:17:06 crc kubenswrapper[4750]: E1008 19:17:06.127246 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdc46603-728e-4fcc-875f-a9a517aa8f70" containerName="registry-server" Oct 08 19:17:06 crc kubenswrapper[4750]: I1008 19:17:06.127253 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdc46603-728e-4fcc-875f-a9a517aa8f70" containerName="registry-server" Oct 08 19:17:06 crc kubenswrapper[4750]: I1008 19:17:06.127425 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdc46603-728e-4fcc-875f-a9a517aa8f70" containerName="registry-server" Oct 08 19:17:06 crc kubenswrapper[4750]: I1008 19:17:06.128804 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cnqmk" Oct 08 19:17:06 crc kubenswrapper[4750]: I1008 19:17:06.143339 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cnqmk"] Oct 08 19:17:06 crc kubenswrapper[4750]: I1008 19:17:06.208381 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73092d2a-6a7a-4cae-85f4-bf36cb592f4d-catalog-content\") pod \"certified-operators-cnqmk\" (UID: \"73092d2a-6a7a-4cae-85f4-bf36cb592f4d\") " pod="openshift-marketplace/certified-operators-cnqmk" Oct 08 19:17:06 crc kubenswrapper[4750]: I1008 19:17:06.208589 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smxww\" (UniqueName: \"kubernetes.io/projected/73092d2a-6a7a-4cae-85f4-bf36cb592f4d-kube-api-access-smxww\") pod \"certified-operators-cnqmk\" (UID: \"73092d2a-6a7a-4cae-85f4-bf36cb592f4d\") " pod="openshift-marketplace/certified-operators-cnqmk" Oct 08 19:17:06 crc kubenswrapper[4750]: I1008 19:17:06.208650 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73092d2a-6a7a-4cae-85f4-bf36cb592f4d-utilities\") pod \"certified-operators-cnqmk\" (UID: \"73092d2a-6a7a-4cae-85f4-bf36cb592f4d\") " pod="openshift-marketplace/certified-operators-cnqmk" Oct 08 19:17:06 crc kubenswrapper[4750]: I1008 19:17:06.310334 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smxww\" (UniqueName: \"kubernetes.io/projected/73092d2a-6a7a-4cae-85f4-bf36cb592f4d-kube-api-access-smxww\") pod \"certified-operators-cnqmk\" (UID: \"73092d2a-6a7a-4cae-85f4-bf36cb592f4d\") " pod="openshift-marketplace/certified-operators-cnqmk" Oct 08 19:17:06 crc kubenswrapper[4750]: I1008 19:17:06.310417 4750 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73092d2a-6a7a-4cae-85f4-bf36cb592f4d-utilities\") pod \"certified-operators-cnqmk\" (UID: \"73092d2a-6a7a-4cae-85f4-bf36cb592f4d\") " pod="openshift-marketplace/certified-operators-cnqmk" Oct 08 19:17:06 crc kubenswrapper[4750]: I1008 19:17:06.310496 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73092d2a-6a7a-4cae-85f4-bf36cb592f4d-catalog-content\") pod \"certified-operators-cnqmk\" (UID: \"73092d2a-6a7a-4cae-85f4-bf36cb592f4d\") " pod="openshift-marketplace/certified-operators-cnqmk" Oct 08 19:17:06 crc kubenswrapper[4750]: I1008 19:17:06.311081 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73092d2a-6a7a-4cae-85f4-bf36cb592f4d-utilities\") pod \"certified-operators-cnqmk\" (UID: \"73092d2a-6a7a-4cae-85f4-bf36cb592f4d\") " pod="openshift-marketplace/certified-operators-cnqmk" Oct 08 19:17:06 crc kubenswrapper[4750]: I1008 19:17:06.311194 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73092d2a-6a7a-4cae-85f4-bf36cb592f4d-catalog-content\") pod \"certified-operators-cnqmk\" (UID: \"73092d2a-6a7a-4cae-85f4-bf36cb592f4d\") " pod="openshift-marketplace/certified-operators-cnqmk" Oct 08 19:17:06 crc kubenswrapper[4750]: I1008 19:17:06.333267 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smxww\" (UniqueName: \"kubernetes.io/projected/73092d2a-6a7a-4cae-85f4-bf36cb592f4d-kube-api-access-smxww\") pod \"certified-operators-cnqmk\" (UID: \"73092d2a-6a7a-4cae-85f4-bf36cb592f4d\") " pod="openshift-marketplace/certified-operators-cnqmk" Oct 08 19:17:06 crc kubenswrapper[4750]: I1008 19:17:06.452824 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cnqmk" Oct 08 19:17:06 crc kubenswrapper[4750]: I1008 19:17:06.768428 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cnqmk"] Oct 08 19:17:07 crc kubenswrapper[4750]: I1008 19:17:07.756174 4750 generic.go:334] "Generic (PLEG): container finished" podID="73092d2a-6a7a-4cae-85f4-bf36cb592f4d" containerID="7228304521a329c1db109ea5afee7c24de75db9ae6216de4c94006faec7052f3" exitCode=0 Oct 08 19:17:07 crc kubenswrapper[4750]: I1008 19:17:07.756256 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cnqmk" event={"ID":"73092d2a-6a7a-4cae-85f4-bf36cb592f4d","Type":"ContainerDied","Data":"7228304521a329c1db109ea5afee7c24de75db9ae6216de4c94006faec7052f3"} Oct 08 19:17:07 crc kubenswrapper[4750]: I1008 19:17:07.756324 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cnqmk" event={"ID":"73092d2a-6a7a-4cae-85f4-bf36cb592f4d","Type":"ContainerStarted","Data":"48ac3fd95f945d1ade0d7659c7cbe62d2d19303a816448bd0cfc16f81067b4fc"} Oct 08 19:17:08 crc kubenswrapper[4750]: I1008 19:17:08.766389 4750 generic.go:334] "Generic (PLEG): container finished" podID="73092d2a-6a7a-4cae-85f4-bf36cb592f4d" containerID="a890d4437df11aa8e7b62e057acf96a38c74d37932f485ec9bdb94b258ca2d98" exitCode=0 Oct 08 19:17:08 crc kubenswrapper[4750]: I1008 19:17:08.766519 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cnqmk" event={"ID":"73092d2a-6a7a-4cae-85f4-bf36cb592f4d","Type":"ContainerDied","Data":"a890d4437df11aa8e7b62e057acf96a38c74d37932f485ec9bdb94b258ca2d98"} Oct 08 19:17:09 crc kubenswrapper[4750]: I1008 19:17:09.776103 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cnqmk" 
event={"ID":"73092d2a-6a7a-4cae-85f4-bf36cb592f4d","Type":"ContainerStarted","Data":"03d804816c7b726b66a674a4b692cefbe1b75b22c557fca80600bcdd156b60ac"} Oct 08 19:17:16 crc kubenswrapper[4750]: I1008 19:17:16.453008 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cnqmk" Oct 08 19:17:16 crc kubenswrapper[4750]: I1008 19:17:16.455015 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cnqmk" Oct 08 19:17:16 crc kubenswrapper[4750]: I1008 19:17:16.496600 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cnqmk" Oct 08 19:17:16 crc kubenswrapper[4750]: I1008 19:17:16.519490 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cnqmk" podStartSLOduration=9.119558069 podStartE2EDuration="10.519468611s" podCreationTimestamp="2025-10-08 19:17:06 +0000 UTC" firstStartedPulling="2025-10-08 19:17:07.764669775 +0000 UTC m=+3983.677640828" lastFinishedPulling="2025-10-08 19:17:09.164580357 +0000 UTC m=+3985.077551370" observedRunningTime="2025-10-08 19:17:09.802973654 +0000 UTC m=+3985.715944697" watchObservedRunningTime="2025-10-08 19:17:16.519468611 +0000 UTC m=+3992.432439634" Oct 08 19:17:16 crc kubenswrapper[4750]: I1008 19:17:16.867965 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cnqmk" Oct 08 19:17:16 crc kubenswrapper[4750]: I1008 19:17:16.918823 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cnqmk"] Oct 08 19:17:18 crc kubenswrapper[4750]: I1008 19:17:18.842184 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cnqmk" podUID="73092d2a-6a7a-4cae-85f4-bf36cb592f4d" containerName="registry-server" 
containerID="cri-o://03d804816c7b726b66a674a4b692cefbe1b75b22c557fca80600bcdd156b60ac" gracePeriod=2 Oct 08 19:17:19 crc kubenswrapper[4750]: I1008 19:17:19.235177 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cnqmk" Oct 08 19:17:19 crc kubenswrapper[4750]: I1008 19:17:19.331035 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smxww\" (UniqueName: \"kubernetes.io/projected/73092d2a-6a7a-4cae-85f4-bf36cb592f4d-kube-api-access-smxww\") pod \"73092d2a-6a7a-4cae-85f4-bf36cb592f4d\" (UID: \"73092d2a-6a7a-4cae-85f4-bf36cb592f4d\") " Oct 08 19:17:19 crc kubenswrapper[4750]: I1008 19:17:19.331120 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73092d2a-6a7a-4cae-85f4-bf36cb592f4d-utilities\") pod \"73092d2a-6a7a-4cae-85f4-bf36cb592f4d\" (UID: \"73092d2a-6a7a-4cae-85f4-bf36cb592f4d\") " Oct 08 19:17:19 crc kubenswrapper[4750]: I1008 19:17:19.331194 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73092d2a-6a7a-4cae-85f4-bf36cb592f4d-catalog-content\") pod \"73092d2a-6a7a-4cae-85f4-bf36cb592f4d\" (UID: \"73092d2a-6a7a-4cae-85f4-bf36cb592f4d\") " Oct 08 19:17:19 crc kubenswrapper[4750]: I1008 19:17:19.332464 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73092d2a-6a7a-4cae-85f4-bf36cb592f4d-utilities" (OuterVolumeSpecName: "utilities") pod "73092d2a-6a7a-4cae-85f4-bf36cb592f4d" (UID: "73092d2a-6a7a-4cae-85f4-bf36cb592f4d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:17:19 crc kubenswrapper[4750]: I1008 19:17:19.337667 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73092d2a-6a7a-4cae-85f4-bf36cb592f4d-kube-api-access-smxww" (OuterVolumeSpecName: "kube-api-access-smxww") pod "73092d2a-6a7a-4cae-85f4-bf36cb592f4d" (UID: "73092d2a-6a7a-4cae-85f4-bf36cb592f4d"). InnerVolumeSpecName "kube-api-access-smxww". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:17:19 crc kubenswrapper[4750]: I1008 19:17:19.374959 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73092d2a-6a7a-4cae-85f4-bf36cb592f4d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "73092d2a-6a7a-4cae-85f4-bf36cb592f4d" (UID: "73092d2a-6a7a-4cae-85f4-bf36cb592f4d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:17:19 crc kubenswrapper[4750]: I1008 19:17:19.433328 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smxww\" (UniqueName: \"kubernetes.io/projected/73092d2a-6a7a-4cae-85f4-bf36cb592f4d-kube-api-access-smxww\") on node \"crc\" DevicePath \"\"" Oct 08 19:17:19 crc kubenswrapper[4750]: I1008 19:17:19.433358 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73092d2a-6a7a-4cae-85f4-bf36cb592f4d-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 19:17:19 crc kubenswrapper[4750]: I1008 19:17:19.433368 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73092d2a-6a7a-4cae-85f4-bf36cb592f4d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 19:17:19 crc kubenswrapper[4750]: I1008 19:17:19.850110 4750 generic.go:334] "Generic (PLEG): container finished" podID="73092d2a-6a7a-4cae-85f4-bf36cb592f4d" 
containerID="03d804816c7b726b66a674a4b692cefbe1b75b22c557fca80600bcdd156b60ac" exitCode=0 Oct 08 19:17:19 crc kubenswrapper[4750]: I1008 19:17:19.850153 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cnqmk" event={"ID":"73092d2a-6a7a-4cae-85f4-bf36cb592f4d","Type":"ContainerDied","Data":"03d804816c7b726b66a674a4b692cefbe1b75b22c557fca80600bcdd156b60ac"} Oct 08 19:17:19 crc kubenswrapper[4750]: I1008 19:17:19.850184 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cnqmk" event={"ID":"73092d2a-6a7a-4cae-85f4-bf36cb592f4d","Type":"ContainerDied","Data":"48ac3fd95f945d1ade0d7659c7cbe62d2d19303a816448bd0cfc16f81067b4fc"} Oct 08 19:17:19 crc kubenswrapper[4750]: I1008 19:17:19.850208 4750 scope.go:117] "RemoveContainer" containerID="03d804816c7b726b66a674a4b692cefbe1b75b22c557fca80600bcdd156b60ac" Oct 08 19:17:19 crc kubenswrapper[4750]: I1008 19:17:19.850329 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cnqmk" Oct 08 19:17:19 crc kubenswrapper[4750]: I1008 19:17:19.868505 4750 scope.go:117] "RemoveContainer" containerID="a890d4437df11aa8e7b62e057acf96a38c74d37932f485ec9bdb94b258ca2d98" Oct 08 19:17:19 crc kubenswrapper[4750]: I1008 19:17:19.896725 4750 scope.go:117] "RemoveContainer" containerID="7228304521a329c1db109ea5afee7c24de75db9ae6216de4c94006faec7052f3" Oct 08 19:17:19 crc kubenswrapper[4750]: I1008 19:17:19.898917 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cnqmk"] Oct 08 19:17:19 crc kubenswrapper[4750]: I1008 19:17:19.903388 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cnqmk"] Oct 08 19:17:19 crc kubenswrapper[4750]: I1008 19:17:19.916470 4750 scope.go:117] "RemoveContainer" containerID="03d804816c7b726b66a674a4b692cefbe1b75b22c557fca80600bcdd156b60ac" Oct 08 19:17:19 crc kubenswrapper[4750]: E1008 19:17:19.917066 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03d804816c7b726b66a674a4b692cefbe1b75b22c557fca80600bcdd156b60ac\": container with ID starting with 03d804816c7b726b66a674a4b692cefbe1b75b22c557fca80600bcdd156b60ac not found: ID does not exist" containerID="03d804816c7b726b66a674a4b692cefbe1b75b22c557fca80600bcdd156b60ac" Oct 08 19:17:19 crc kubenswrapper[4750]: I1008 19:17:19.917144 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03d804816c7b726b66a674a4b692cefbe1b75b22c557fca80600bcdd156b60ac"} err="failed to get container status \"03d804816c7b726b66a674a4b692cefbe1b75b22c557fca80600bcdd156b60ac\": rpc error: code = NotFound desc = could not find container \"03d804816c7b726b66a674a4b692cefbe1b75b22c557fca80600bcdd156b60ac\": container with ID starting with 03d804816c7b726b66a674a4b692cefbe1b75b22c557fca80600bcdd156b60ac not 
found: ID does not exist" Oct 08 19:17:19 crc kubenswrapper[4750]: I1008 19:17:19.917169 4750 scope.go:117] "RemoveContainer" containerID="a890d4437df11aa8e7b62e057acf96a38c74d37932f485ec9bdb94b258ca2d98" Oct 08 19:17:19 crc kubenswrapper[4750]: E1008 19:17:19.917795 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a890d4437df11aa8e7b62e057acf96a38c74d37932f485ec9bdb94b258ca2d98\": container with ID starting with a890d4437df11aa8e7b62e057acf96a38c74d37932f485ec9bdb94b258ca2d98 not found: ID does not exist" containerID="a890d4437df11aa8e7b62e057acf96a38c74d37932f485ec9bdb94b258ca2d98" Oct 08 19:17:19 crc kubenswrapper[4750]: I1008 19:17:19.917823 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a890d4437df11aa8e7b62e057acf96a38c74d37932f485ec9bdb94b258ca2d98"} err="failed to get container status \"a890d4437df11aa8e7b62e057acf96a38c74d37932f485ec9bdb94b258ca2d98\": rpc error: code = NotFound desc = could not find container \"a890d4437df11aa8e7b62e057acf96a38c74d37932f485ec9bdb94b258ca2d98\": container with ID starting with a890d4437df11aa8e7b62e057acf96a38c74d37932f485ec9bdb94b258ca2d98 not found: ID does not exist" Oct 08 19:17:19 crc kubenswrapper[4750]: I1008 19:17:19.917861 4750 scope.go:117] "RemoveContainer" containerID="7228304521a329c1db109ea5afee7c24de75db9ae6216de4c94006faec7052f3" Oct 08 19:17:19 crc kubenswrapper[4750]: E1008 19:17:19.918196 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7228304521a329c1db109ea5afee7c24de75db9ae6216de4c94006faec7052f3\": container with ID starting with 7228304521a329c1db109ea5afee7c24de75db9ae6216de4c94006faec7052f3 not found: ID does not exist" containerID="7228304521a329c1db109ea5afee7c24de75db9ae6216de4c94006faec7052f3" Oct 08 19:17:19 crc kubenswrapper[4750]: I1008 19:17:19.918221 4750 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7228304521a329c1db109ea5afee7c24de75db9ae6216de4c94006faec7052f3"} err="failed to get container status \"7228304521a329c1db109ea5afee7c24de75db9ae6216de4c94006faec7052f3\": rpc error: code = NotFound desc = could not find container \"7228304521a329c1db109ea5afee7c24de75db9ae6216de4c94006faec7052f3\": container with ID starting with 7228304521a329c1db109ea5afee7c24de75db9ae6216de4c94006faec7052f3 not found: ID does not exist" Oct 08 19:17:20 crc kubenswrapper[4750]: I1008 19:17:20.745657 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73092d2a-6a7a-4cae-85f4-bf36cb592f4d" path="/var/lib/kubelet/pods/73092d2a-6a7a-4cae-85f4-bf36cb592f4d/volumes" Oct 08 19:19:29 crc kubenswrapper[4750]: I1008 19:19:29.707279 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 19:19:29 crc kubenswrapper[4750]: I1008 19:19:29.708035 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 19:19:59 crc kubenswrapper[4750]: I1008 19:19:59.706877 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 19:19:59 crc kubenswrapper[4750]: I1008 19:19:59.707706 4750 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 19:20:29 crc kubenswrapper[4750]: I1008 19:20:29.707062 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 19:20:29 crc kubenswrapper[4750]: I1008 19:20:29.708772 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 19:20:29 crc kubenswrapper[4750]: I1008 19:20:29.708867 4750 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-grddb" Oct 08 19:20:29 crc kubenswrapper[4750]: I1008 19:20:29.710048 4750 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0f70824c7542206dbd343a708ccfe13ae9ad076d7419088df7d1967e3c430ad3"} pod="openshift-machine-config-operator/machine-config-daemon-grddb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 19:20:29 crc kubenswrapper[4750]: I1008 19:20:29.710167 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" 
containerID="cri-o://0f70824c7542206dbd343a708ccfe13ae9ad076d7419088df7d1967e3c430ad3" gracePeriod=600 Oct 08 19:20:30 crc kubenswrapper[4750]: I1008 19:20:30.550530 4750 generic.go:334] "Generic (PLEG): container finished" podID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerID="0f70824c7542206dbd343a708ccfe13ae9ad076d7419088df7d1967e3c430ad3" exitCode=0 Oct 08 19:20:30 crc kubenswrapper[4750]: I1008 19:20:30.550736 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerDied","Data":"0f70824c7542206dbd343a708ccfe13ae9ad076d7419088df7d1967e3c430ad3"} Oct 08 19:20:30 crc kubenswrapper[4750]: I1008 19:20:30.550896 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerStarted","Data":"2d501006b0e8fb04661a88c708dbd92c92878af845524f77124aac3cc6ff30bc"} Oct 08 19:20:30 crc kubenswrapper[4750]: I1008 19:20:30.550914 4750 scope.go:117] "RemoveContainer" containerID="4778468c7d5eb2d32f1ce6d0e884666388775f4c788e8eea1669c69aa0822658" Oct 08 19:22:59 crc kubenswrapper[4750]: I1008 19:22:59.707458 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 19:22:59 crc kubenswrapper[4750]: I1008 19:22:59.708199 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 19:23:29 crc kubenswrapper[4750]: 
I1008 19:23:29.707571 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 19:23:29 crc kubenswrapper[4750]: I1008 19:23:29.708587 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 19:23:59 crc kubenswrapper[4750]: I1008 19:23:59.706866 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 19:23:59 crc kubenswrapper[4750]: I1008 19:23:59.707934 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 19:23:59 crc kubenswrapper[4750]: I1008 19:23:59.708015 4750 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-grddb" Oct 08 19:23:59 crc kubenswrapper[4750]: I1008 19:23:59.709353 4750 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2d501006b0e8fb04661a88c708dbd92c92878af845524f77124aac3cc6ff30bc"} 
pod="openshift-machine-config-operator/machine-config-daemon-grddb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 19:23:59 crc kubenswrapper[4750]: I1008 19:23:59.709460 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" containerID="cri-o://2d501006b0e8fb04661a88c708dbd92c92878af845524f77124aac3cc6ff30bc" gracePeriod=600 Oct 08 19:23:59 crc kubenswrapper[4750]: E1008 19:23:59.837965 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:24:00 crc kubenswrapper[4750]: I1008 19:24:00.358520 4750 generic.go:334] "Generic (PLEG): container finished" podID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerID="2d501006b0e8fb04661a88c708dbd92c92878af845524f77124aac3cc6ff30bc" exitCode=0 Oct 08 19:24:00 crc kubenswrapper[4750]: I1008 19:24:00.358594 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerDied","Data":"2d501006b0e8fb04661a88c708dbd92c92878af845524f77124aac3cc6ff30bc"} Oct 08 19:24:00 crc kubenswrapper[4750]: I1008 19:24:00.358645 4750 scope.go:117] "RemoveContainer" containerID="0f70824c7542206dbd343a708ccfe13ae9ad076d7419088df7d1967e3c430ad3" Oct 08 19:24:00 crc kubenswrapper[4750]: I1008 19:24:00.359706 4750 scope.go:117] "RemoveContainer" containerID="2d501006b0e8fb04661a88c708dbd92c92878af845524f77124aac3cc6ff30bc" Oct 
08 19:24:00 crc kubenswrapper[4750]: E1008 19:24:00.360162 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:24:12 crc kubenswrapper[4750]: I1008 19:24:12.735744 4750 scope.go:117] "RemoveContainer" containerID="2d501006b0e8fb04661a88c708dbd92c92878af845524f77124aac3cc6ff30bc" Oct 08 19:24:12 crc kubenswrapper[4750]: E1008 19:24:12.737269 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:24:25 crc kubenswrapper[4750]: I1008 19:24:25.735618 4750 scope.go:117] "RemoveContainer" containerID="2d501006b0e8fb04661a88c708dbd92c92878af845524f77124aac3cc6ff30bc" Oct 08 19:24:25 crc kubenswrapper[4750]: E1008 19:24:25.736911 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:24:36 crc kubenswrapper[4750]: I1008 19:24:36.735137 4750 scope.go:117] "RemoveContainer" 
containerID="2d501006b0e8fb04661a88c708dbd92c92878af845524f77124aac3cc6ff30bc" Oct 08 19:24:36 crc kubenswrapper[4750]: E1008 19:24:36.736862 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:24:51 crc kubenswrapper[4750]: I1008 19:24:51.734586 4750 scope.go:117] "RemoveContainer" containerID="2d501006b0e8fb04661a88c708dbd92c92878af845524f77124aac3cc6ff30bc" Oct 08 19:24:51 crc kubenswrapper[4750]: E1008 19:24:51.735307 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:24:54 crc kubenswrapper[4750]: I1008 19:24:54.657220 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4n7l4"] Oct 08 19:24:54 crc kubenswrapper[4750]: E1008 19:24:54.658192 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73092d2a-6a7a-4cae-85f4-bf36cb592f4d" containerName="registry-server" Oct 08 19:24:54 crc kubenswrapper[4750]: I1008 19:24:54.658207 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="73092d2a-6a7a-4cae-85f4-bf36cb592f4d" containerName="registry-server" Oct 08 19:24:54 crc kubenswrapper[4750]: E1008 19:24:54.658232 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73092d2a-6a7a-4cae-85f4-bf36cb592f4d" 
containerName="extract-utilities" Oct 08 19:24:54 crc kubenswrapper[4750]: I1008 19:24:54.658241 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="73092d2a-6a7a-4cae-85f4-bf36cb592f4d" containerName="extract-utilities" Oct 08 19:24:54 crc kubenswrapper[4750]: E1008 19:24:54.658253 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73092d2a-6a7a-4cae-85f4-bf36cb592f4d" containerName="extract-content" Oct 08 19:24:54 crc kubenswrapper[4750]: I1008 19:24:54.658260 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="73092d2a-6a7a-4cae-85f4-bf36cb592f4d" containerName="extract-content" Oct 08 19:24:54 crc kubenswrapper[4750]: I1008 19:24:54.658396 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="73092d2a-6a7a-4cae-85f4-bf36cb592f4d" containerName="registry-server" Oct 08 19:24:54 crc kubenswrapper[4750]: I1008 19:24:54.659730 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4n7l4" Oct 08 19:24:54 crc kubenswrapper[4750]: I1008 19:24:54.680853 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4n7l4"] Oct 08 19:24:54 crc kubenswrapper[4750]: I1008 19:24:54.772277 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4cdbedb-7f69-4647-be75-d07f953493e6-catalog-content\") pod \"redhat-marketplace-4n7l4\" (UID: \"e4cdbedb-7f69-4647-be75-d07f953493e6\") " pod="openshift-marketplace/redhat-marketplace-4n7l4" Oct 08 19:24:54 crc kubenswrapper[4750]: I1008 19:24:54.772479 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4cdbedb-7f69-4647-be75-d07f953493e6-utilities\") pod \"redhat-marketplace-4n7l4\" (UID: \"e4cdbedb-7f69-4647-be75-d07f953493e6\") " 
pod="openshift-marketplace/redhat-marketplace-4n7l4" Oct 08 19:24:54 crc kubenswrapper[4750]: I1008 19:24:54.772617 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjhzx\" (UniqueName: \"kubernetes.io/projected/e4cdbedb-7f69-4647-be75-d07f953493e6-kube-api-access-mjhzx\") pod \"redhat-marketplace-4n7l4\" (UID: \"e4cdbedb-7f69-4647-be75-d07f953493e6\") " pod="openshift-marketplace/redhat-marketplace-4n7l4" Oct 08 19:24:54 crc kubenswrapper[4750]: I1008 19:24:54.863705 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6rqpp"] Oct 08 19:24:54 crc kubenswrapper[4750]: I1008 19:24:54.865727 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6rqpp" Oct 08 19:24:54 crc kubenswrapper[4750]: I1008 19:24:54.873964 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4cdbedb-7f69-4647-be75-d07f953493e6-utilities\") pod \"redhat-marketplace-4n7l4\" (UID: \"e4cdbedb-7f69-4647-be75-d07f953493e6\") " pod="openshift-marketplace/redhat-marketplace-4n7l4" Oct 08 19:24:54 crc kubenswrapper[4750]: I1008 19:24:54.874056 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjhzx\" (UniqueName: \"kubernetes.io/projected/e4cdbedb-7f69-4647-be75-d07f953493e6-kube-api-access-mjhzx\") pod \"redhat-marketplace-4n7l4\" (UID: \"e4cdbedb-7f69-4647-be75-d07f953493e6\") " pod="openshift-marketplace/redhat-marketplace-4n7l4" Oct 08 19:24:54 crc kubenswrapper[4750]: I1008 19:24:54.874123 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4cdbedb-7f69-4647-be75-d07f953493e6-catalog-content\") pod \"redhat-marketplace-4n7l4\" (UID: \"e4cdbedb-7f69-4647-be75-d07f953493e6\") " 
pod="openshift-marketplace/redhat-marketplace-4n7l4" Oct 08 19:24:54 crc kubenswrapper[4750]: I1008 19:24:54.874817 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4cdbedb-7f69-4647-be75-d07f953493e6-utilities\") pod \"redhat-marketplace-4n7l4\" (UID: \"e4cdbedb-7f69-4647-be75-d07f953493e6\") " pod="openshift-marketplace/redhat-marketplace-4n7l4" Oct 08 19:24:54 crc kubenswrapper[4750]: I1008 19:24:54.875176 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4cdbedb-7f69-4647-be75-d07f953493e6-catalog-content\") pod \"redhat-marketplace-4n7l4\" (UID: \"e4cdbedb-7f69-4647-be75-d07f953493e6\") " pod="openshift-marketplace/redhat-marketplace-4n7l4" Oct 08 19:24:54 crc kubenswrapper[4750]: I1008 19:24:54.885386 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6rqpp"] Oct 08 19:24:54 crc kubenswrapper[4750]: I1008 19:24:54.908249 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjhzx\" (UniqueName: \"kubernetes.io/projected/e4cdbedb-7f69-4647-be75-d07f953493e6-kube-api-access-mjhzx\") pod \"redhat-marketplace-4n7l4\" (UID: \"e4cdbedb-7f69-4647-be75-d07f953493e6\") " pod="openshift-marketplace/redhat-marketplace-4n7l4" Oct 08 19:24:54 crc kubenswrapper[4750]: I1008 19:24:54.975817 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b93be46-f5c7-4fda-b8b7-6c8038de0308-utilities\") pod \"redhat-operators-6rqpp\" (UID: \"6b93be46-f5c7-4fda-b8b7-6c8038de0308\") " pod="openshift-marketplace/redhat-operators-6rqpp" Oct 08 19:24:54 crc kubenswrapper[4750]: I1008 19:24:54.975891 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/6b93be46-f5c7-4fda-b8b7-6c8038de0308-catalog-content\") pod \"redhat-operators-6rqpp\" (UID: \"6b93be46-f5c7-4fda-b8b7-6c8038de0308\") " pod="openshift-marketplace/redhat-operators-6rqpp" Oct 08 19:24:54 crc kubenswrapper[4750]: I1008 19:24:54.975925 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b47b\" (UniqueName: \"kubernetes.io/projected/6b93be46-f5c7-4fda-b8b7-6c8038de0308-kube-api-access-6b47b\") pod \"redhat-operators-6rqpp\" (UID: \"6b93be46-f5c7-4fda-b8b7-6c8038de0308\") " pod="openshift-marketplace/redhat-operators-6rqpp" Oct 08 19:24:54 crc kubenswrapper[4750]: I1008 19:24:54.998438 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4n7l4" Oct 08 19:24:55 crc kubenswrapper[4750]: I1008 19:24:55.077758 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b93be46-f5c7-4fda-b8b7-6c8038de0308-utilities\") pod \"redhat-operators-6rqpp\" (UID: \"6b93be46-f5c7-4fda-b8b7-6c8038de0308\") " pod="openshift-marketplace/redhat-operators-6rqpp" Oct 08 19:24:55 crc kubenswrapper[4750]: I1008 19:24:55.077852 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b93be46-f5c7-4fda-b8b7-6c8038de0308-catalog-content\") pod \"redhat-operators-6rqpp\" (UID: \"6b93be46-f5c7-4fda-b8b7-6c8038de0308\") " pod="openshift-marketplace/redhat-operators-6rqpp" Oct 08 19:24:55 crc kubenswrapper[4750]: I1008 19:24:55.077887 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6b47b\" (UniqueName: \"kubernetes.io/projected/6b93be46-f5c7-4fda-b8b7-6c8038de0308-kube-api-access-6b47b\") pod \"redhat-operators-6rqpp\" (UID: \"6b93be46-f5c7-4fda-b8b7-6c8038de0308\") " 
pod="openshift-marketplace/redhat-operators-6rqpp" Oct 08 19:24:55 crc kubenswrapper[4750]: I1008 19:24:55.078750 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b93be46-f5c7-4fda-b8b7-6c8038de0308-utilities\") pod \"redhat-operators-6rqpp\" (UID: \"6b93be46-f5c7-4fda-b8b7-6c8038de0308\") " pod="openshift-marketplace/redhat-operators-6rqpp" Oct 08 19:24:55 crc kubenswrapper[4750]: I1008 19:24:55.078801 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b93be46-f5c7-4fda-b8b7-6c8038de0308-catalog-content\") pod \"redhat-operators-6rqpp\" (UID: \"6b93be46-f5c7-4fda-b8b7-6c8038de0308\") " pod="openshift-marketplace/redhat-operators-6rqpp" Oct 08 19:24:55 crc kubenswrapper[4750]: I1008 19:24:55.100428 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b47b\" (UniqueName: \"kubernetes.io/projected/6b93be46-f5c7-4fda-b8b7-6c8038de0308-kube-api-access-6b47b\") pod \"redhat-operators-6rqpp\" (UID: \"6b93be46-f5c7-4fda-b8b7-6c8038de0308\") " pod="openshift-marketplace/redhat-operators-6rqpp" Oct 08 19:24:55 crc kubenswrapper[4750]: I1008 19:24:55.188881 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6rqpp" Oct 08 19:24:55 crc kubenswrapper[4750]: I1008 19:24:55.494145 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4n7l4"] Oct 08 19:24:55 crc kubenswrapper[4750]: I1008 19:24:55.664782 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6rqpp"] Oct 08 19:24:55 crc kubenswrapper[4750]: W1008 19:24:55.669081 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b93be46_f5c7_4fda_b8b7_6c8038de0308.slice/crio-9f261bd14a655f4102ba89cdc799ab5ee24f2804a6f5b60d6e1249bcf92d1fc9 WatchSource:0}: Error finding container 9f261bd14a655f4102ba89cdc799ab5ee24f2804a6f5b60d6e1249bcf92d1fc9: Status 404 returned error can't find the container with id 9f261bd14a655f4102ba89cdc799ab5ee24f2804a6f5b60d6e1249bcf92d1fc9 Oct 08 19:24:55 crc kubenswrapper[4750]: I1008 19:24:55.994934 4750 generic.go:334] "Generic (PLEG): container finished" podID="6b93be46-f5c7-4fda-b8b7-6c8038de0308" containerID="8396ad2f368a497365aea1763c9497d454cc828a77e18626ab8ab0733a2a1f24" exitCode=0 Oct 08 19:24:55 crc kubenswrapper[4750]: I1008 19:24:55.995481 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6rqpp" event={"ID":"6b93be46-f5c7-4fda-b8b7-6c8038de0308","Type":"ContainerDied","Data":"8396ad2f368a497365aea1763c9497d454cc828a77e18626ab8ab0733a2a1f24"} Oct 08 19:24:55 crc kubenswrapper[4750]: I1008 19:24:55.995537 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6rqpp" event={"ID":"6b93be46-f5c7-4fda-b8b7-6c8038de0308","Type":"ContainerStarted","Data":"9f261bd14a655f4102ba89cdc799ab5ee24f2804a6f5b60d6e1249bcf92d1fc9"} Oct 08 19:24:55 crc kubenswrapper[4750]: I1008 19:24:55.999751 4750 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Oct 08 19:24:56 crc kubenswrapper[4750]: I1008 19:24:56.000003 4750 generic.go:334] "Generic (PLEG): container finished" podID="e4cdbedb-7f69-4647-be75-d07f953493e6" containerID="eced70840c9b508bf67dfdb5455ed40b25783994937b940becb1fa8cbbb3d6ba" exitCode=0 Oct 08 19:24:56 crc kubenswrapper[4750]: I1008 19:24:56.000068 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4n7l4" event={"ID":"e4cdbedb-7f69-4647-be75-d07f953493e6","Type":"ContainerDied","Data":"eced70840c9b508bf67dfdb5455ed40b25783994937b940becb1fa8cbbb3d6ba"} Oct 08 19:24:56 crc kubenswrapper[4750]: I1008 19:24:56.000107 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4n7l4" event={"ID":"e4cdbedb-7f69-4647-be75-d07f953493e6","Type":"ContainerStarted","Data":"b0f0d6c81d3ccd19225584a111fe935b58da400f1e5f02545397d48902191152"} Oct 08 19:24:58 crc kubenswrapper[4750]: I1008 19:24:58.025341 4750 generic.go:334] "Generic (PLEG): container finished" podID="e4cdbedb-7f69-4647-be75-d07f953493e6" containerID="2f6a8b7c2c72af27109d03eedcac0237997acaf3c49160c6668289d118d9a861" exitCode=0 Oct 08 19:24:58 crc kubenswrapper[4750]: I1008 19:24:58.025438 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4n7l4" event={"ID":"e4cdbedb-7f69-4647-be75-d07f953493e6","Type":"ContainerDied","Data":"2f6a8b7c2c72af27109d03eedcac0237997acaf3c49160c6668289d118d9a861"} Oct 08 19:24:58 crc kubenswrapper[4750]: I1008 19:24:58.028463 4750 generic.go:334] "Generic (PLEG): container finished" podID="6b93be46-f5c7-4fda-b8b7-6c8038de0308" containerID="ddb010a216f7f6efa4993dde429f0862ca1445b120ee1f58ae5f566328c6b13e" exitCode=0 Oct 08 19:24:58 crc kubenswrapper[4750]: I1008 19:24:58.028570 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6rqpp" 
event={"ID":"6b93be46-f5c7-4fda-b8b7-6c8038de0308","Type":"ContainerDied","Data":"ddb010a216f7f6efa4993dde429f0862ca1445b120ee1f58ae5f566328c6b13e"} Oct 08 19:25:00 crc kubenswrapper[4750]: I1008 19:25:00.049306 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4n7l4" event={"ID":"e4cdbedb-7f69-4647-be75-d07f953493e6","Type":"ContainerStarted","Data":"384c89029d5ba3e3307c132b8efabb7dc1f24030312a42d877986ec19fc6b94d"} Oct 08 19:25:00 crc kubenswrapper[4750]: I1008 19:25:00.054390 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6rqpp" event={"ID":"6b93be46-f5c7-4fda-b8b7-6c8038de0308","Type":"ContainerStarted","Data":"f3254645152e20070bd347036faad62f940b16fc3e8479c11d30c7158ed0d09b"} Oct 08 19:25:00 crc kubenswrapper[4750]: I1008 19:25:00.073605 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4n7l4" podStartSLOduration=3.597059391 podStartE2EDuration="6.07358036s" podCreationTimestamp="2025-10-08 19:24:54 +0000 UTC" firstStartedPulling="2025-10-08 19:24:56.001745279 +0000 UTC m=+4451.914716312" lastFinishedPulling="2025-10-08 19:24:58.478266258 +0000 UTC m=+4454.391237281" observedRunningTime="2025-10-08 19:25:00.068287228 +0000 UTC m=+4455.981258261" watchObservedRunningTime="2025-10-08 19:25:00.07358036 +0000 UTC m=+4455.986551363" Oct 08 19:25:00 crc kubenswrapper[4750]: I1008 19:25:00.092823 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6rqpp" podStartSLOduration=3.651751918 podStartE2EDuration="6.092803791s" podCreationTimestamp="2025-10-08 19:24:54 +0000 UTC" firstStartedPulling="2025-10-08 19:24:55.999093023 +0000 UTC m=+4451.912064056" lastFinishedPulling="2025-10-08 19:24:58.440144916 +0000 UTC m=+4454.353115929" observedRunningTime="2025-10-08 19:25:00.090036192 +0000 UTC m=+4456.003007215" 
watchObservedRunningTime="2025-10-08 19:25:00.092803791 +0000 UTC m=+4456.005774814" Oct 08 19:25:02 crc kubenswrapper[4750]: I1008 19:25:02.734211 4750 scope.go:117] "RemoveContainer" containerID="2d501006b0e8fb04661a88c708dbd92c92878af845524f77124aac3cc6ff30bc" Oct 08 19:25:02 crc kubenswrapper[4750]: E1008 19:25:02.735011 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:25:04 crc kubenswrapper[4750]: I1008 19:25:04.999085 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4n7l4" Oct 08 19:25:04 crc kubenswrapper[4750]: I1008 19:25:04.999746 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4n7l4" Oct 08 19:25:05 crc kubenswrapper[4750]: I1008 19:25:05.067939 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4n7l4" Oct 08 19:25:05 crc kubenswrapper[4750]: I1008 19:25:05.154234 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4n7l4" Oct 08 19:25:05 crc kubenswrapper[4750]: I1008 19:25:05.189400 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6rqpp" Oct 08 19:25:05 crc kubenswrapper[4750]: I1008 19:25:05.189458 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6rqpp" Oct 08 19:25:05 crc kubenswrapper[4750]: I1008 19:25:05.233979 4750 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6rqpp" Oct 08 19:25:05 crc kubenswrapper[4750]: I1008 19:25:05.450490 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4n7l4"] Oct 08 19:25:06 crc kubenswrapper[4750]: I1008 19:25:06.178983 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6rqpp" Oct 08 19:25:07 crc kubenswrapper[4750]: I1008 19:25:07.120091 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4n7l4" podUID="e4cdbedb-7f69-4647-be75-d07f953493e6" containerName="registry-server" containerID="cri-o://384c89029d5ba3e3307c132b8efabb7dc1f24030312a42d877986ec19fc6b94d" gracePeriod=2 Oct 08 19:25:07 crc kubenswrapper[4750]: I1008 19:25:07.604770 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4n7l4" Oct 08 19:25:07 crc kubenswrapper[4750]: I1008 19:25:07.644727 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6rqpp"] Oct 08 19:25:07 crc kubenswrapper[4750]: I1008 19:25:07.756904 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4cdbedb-7f69-4647-be75-d07f953493e6-catalog-content\") pod \"e4cdbedb-7f69-4647-be75-d07f953493e6\" (UID: \"e4cdbedb-7f69-4647-be75-d07f953493e6\") " Oct 08 19:25:07 crc kubenswrapper[4750]: I1008 19:25:07.757017 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjhzx\" (UniqueName: \"kubernetes.io/projected/e4cdbedb-7f69-4647-be75-d07f953493e6-kube-api-access-mjhzx\") pod \"e4cdbedb-7f69-4647-be75-d07f953493e6\" (UID: \"e4cdbedb-7f69-4647-be75-d07f953493e6\") " Oct 08 19:25:07 crc kubenswrapper[4750]: I1008 19:25:07.757086 4750 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4cdbedb-7f69-4647-be75-d07f953493e6-utilities\") pod \"e4cdbedb-7f69-4647-be75-d07f953493e6\" (UID: \"e4cdbedb-7f69-4647-be75-d07f953493e6\") " Oct 08 19:25:07 crc kubenswrapper[4750]: I1008 19:25:07.758830 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4cdbedb-7f69-4647-be75-d07f953493e6-utilities" (OuterVolumeSpecName: "utilities") pod "e4cdbedb-7f69-4647-be75-d07f953493e6" (UID: "e4cdbedb-7f69-4647-be75-d07f953493e6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:25:07 crc kubenswrapper[4750]: I1008 19:25:07.770488 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4cdbedb-7f69-4647-be75-d07f953493e6-kube-api-access-mjhzx" (OuterVolumeSpecName: "kube-api-access-mjhzx") pod "e4cdbedb-7f69-4647-be75-d07f953493e6" (UID: "e4cdbedb-7f69-4647-be75-d07f953493e6"). InnerVolumeSpecName "kube-api-access-mjhzx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:25:07 crc kubenswrapper[4750]: I1008 19:25:07.778481 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4cdbedb-7f69-4647-be75-d07f953493e6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e4cdbedb-7f69-4647-be75-d07f953493e6" (UID: "e4cdbedb-7f69-4647-be75-d07f953493e6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:25:07 crc kubenswrapper[4750]: I1008 19:25:07.858873 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4cdbedb-7f69-4647-be75-d07f953493e6-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 19:25:07 crc kubenswrapper[4750]: I1008 19:25:07.858915 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4cdbedb-7f69-4647-be75-d07f953493e6-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 19:25:07 crc kubenswrapper[4750]: I1008 19:25:07.858931 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjhzx\" (UniqueName: \"kubernetes.io/projected/e4cdbedb-7f69-4647-be75-d07f953493e6-kube-api-access-mjhzx\") on node \"crc\" DevicePath \"\"" Oct 08 19:25:08 crc kubenswrapper[4750]: I1008 19:25:08.140410 4750 generic.go:334] "Generic (PLEG): container finished" podID="e4cdbedb-7f69-4647-be75-d07f953493e6" containerID="384c89029d5ba3e3307c132b8efabb7dc1f24030312a42d877986ec19fc6b94d" exitCode=0 Oct 08 19:25:08 crc kubenswrapper[4750]: I1008 19:25:08.140522 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4n7l4" event={"ID":"e4cdbedb-7f69-4647-be75-d07f953493e6","Type":"ContainerDied","Data":"384c89029d5ba3e3307c132b8efabb7dc1f24030312a42d877986ec19fc6b94d"} Oct 08 19:25:08 crc kubenswrapper[4750]: I1008 19:25:08.140625 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4n7l4" event={"ID":"e4cdbedb-7f69-4647-be75-d07f953493e6","Type":"ContainerDied","Data":"b0f0d6c81d3ccd19225584a111fe935b58da400f1e5f02545397d48902191152"} Oct 08 19:25:08 crc kubenswrapper[4750]: I1008 19:25:08.140656 4750 scope.go:117] "RemoveContainer" containerID="384c89029d5ba3e3307c132b8efabb7dc1f24030312a42d877986ec19fc6b94d" Oct 08 19:25:08 crc kubenswrapper[4750]: I1008 
19:25:08.140571 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4n7l4" Oct 08 19:25:08 crc kubenswrapper[4750]: I1008 19:25:08.140811 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6rqpp" podUID="6b93be46-f5c7-4fda-b8b7-6c8038de0308" containerName="registry-server" containerID="cri-o://f3254645152e20070bd347036faad62f940b16fc3e8479c11d30c7158ed0d09b" gracePeriod=2 Oct 08 19:25:08 crc kubenswrapper[4750]: I1008 19:25:08.176355 4750 scope.go:117] "RemoveContainer" containerID="2f6a8b7c2c72af27109d03eedcac0237997acaf3c49160c6668289d118d9a861" Oct 08 19:25:08 crc kubenswrapper[4750]: I1008 19:25:08.204131 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4n7l4"] Oct 08 19:25:08 crc kubenswrapper[4750]: I1008 19:25:08.211977 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4n7l4"] Oct 08 19:25:08 crc kubenswrapper[4750]: I1008 19:25:08.213147 4750 scope.go:117] "RemoveContainer" containerID="eced70840c9b508bf67dfdb5455ed40b25783994937b940becb1fa8cbbb3d6ba" Oct 08 19:25:08 crc kubenswrapper[4750]: I1008 19:25:08.244722 4750 scope.go:117] "RemoveContainer" containerID="384c89029d5ba3e3307c132b8efabb7dc1f24030312a42d877986ec19fc6b94d" Oct 08 19:25:08 crc kubenswrapper[4750]: E1008 19:25:08.245525 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"384c89029d5ba3e3307c132b8efabb7dc1f24030312a42d877986ec19fc6b94d\": container with ID starting with 384c89029d5ba3e3307c132b8efabb7dc1f24030312a42d877986ec19fc6b94d not found: ID does not exist" containerID="384c89029d5ba3e3307c132b8efabb7dc1f24030312a42d877986ec19fc6b94d" Oct 08 19:25:08 crc kubenswrapper[4750]: I1008 19:25:08.245660 4750 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"384c89029d5ba3e3307c132b8efabb7dc1f24030312a42d877986ec19fc6b94d"} err="failed to get container status \"384c89029d5ba3e3307c132b8efabb7dc1f24030312a42d877986ec19fc6b94d\": rpc error: code = NotFound desc = could not find container \"384c89029d5ba3e3307c132b8efabb7dc1f24030312a42d877986ec19fc6b94d\": container with ID starting with 384c89029d5ba3e3307c132b8efabb7dc1f24030312a42d877986ec19fc6b94d not found: ID does not exist" Oct 08 19:25:08 crc kubenswrapper[4750]: I1008 19:25:08.245705 4750 scope.go:117] "RemoveContainer" containerID="2f6a8b7c2c72af27109d03eedcac0237997acaf3c49160c6668289d118d9a861" Oct 08 19:25:08 crc kubenswrapper[4750]: E1008 19:25:08.246299 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f6a8b7c2c72af27109d03eedcac0237997acaf3c49160c6668289d118d9a861\": container with ID starting with 2f6a8b7c2c72af27109d03eedcac0237997acaf3c49160c6668289d118d9a861 not found: ID does not exist" containerID="2f6a8b7c2c72af27109d03eedcac0237997acaf3c49160c6668289d118d9a861" Oct 08 19:25:08 crc kubenswrapper[4750]: I1008 19:25:08.246333 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f6a8b7c2c72af27109d03eedcac0237997acaf3c49160c6668289d118d9a861"} err="failed to get container status \"2f6a8b7c2c72af27109d03eedcac0237997acaf3c49160c6668289d118d9a861\": rpc error: code = NotFound desc = could not find container \"2f6a8b7c2c72af27109d03eedcac0237997acaf3c49160c6668289d118d9a861\": container with ID starting with 2f6a8b7c2c72af27109d03eedcac0237997acaf3c49160c6668289d118d9a861 not found: ID does not exist" Oct 08 19:25:08 crc kubenswrapper[4750]: I1008 19:25:08.246363 4750 scope.go:117] "RemoveContainer" containerID="eced70840c9b508bf67dfdb5455ed40b25783994937b940becb1fa8cbbb3d6ba" Oct 08 19:25:08 crc kubenswrapper[4750]: E1008 19:25:08.246752 4750 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"eced70840c9b508bf67dfdb5455ed40b25783994937b940becb1fa8cbbb3d6ba\": container with ID starting with eced70840c9b508bf67dfdb5455ed40b25783994937b940becb1fa8cbbb3d6ba not found: ID does not exist" containerID="eced70840c9b508bf67dfdb5455ed40b25783994937b940becb1fa8cbbb3d6ba" Oct 08 19:25:08 crc kubenswrapper[4750]: I1008 19:25:08.246814 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eced70840c9b508bf67dfdb5455ed40b25783994937b940becb1fa8cbbb3d6ba"} err="failed to get container status \"eced70840c9b508bf67dfdb5455ed40b25783994937b940becb1fa8cbbb3d6ba\": rpc error: code = NotFound desc = could not find container \"eced70840c9b508bf67dfdb5455ed40b25783994937b940becb1fa8cbbb3d6ba\": container with ID starting with eced70840c9b508bf67dfdb5455ed40b25783994937b940becb1fa8cbbb3d6ba not found: ID does not exist" Oct 08 19:25:08 crc kubenswrapper[4750]: I1008 19:25:08.749982 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4cdbedb-7f69-4647-be75-d07f953493e6" path="/var/lib/kubelet/pods/e4cdbedb-7f69-4647-be75-d07f953493e6/volumes" Oct 08 19:25:09 crc kubenswrapper[4750]: I1008 19:25:09.153820 4750 generic.go:334] "Generic (PLEG): container finished" podID="6b93be46-f5c7-4fda-b8b7-6c8038de0308" containerID="f3254645152e20070bd347036faad62f940b16fc3e8479c11d30c7158ed0d09b" exitCode=0 Oct 08 19:25:09 crc kubenswrapper[4750]: I1008 19:25:09.153885 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6rqpp" event={"ID":"6b93be46-f5c7-4fda-b8b7-6c8038de0308","Type":"ContainerDied","Data":"f3254645152e20070bd347036faad62f940b16fc3e8479c11d30c7158ed0d09b"} Oct 08 19:25:09 crc kubenswrapper[4750]: I1008 19:25:09.259660 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6rqpp" Oct 08 19:25:09 crc kubenswrapper[4750]: I1008 19:25:09.285981 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6b47b\" (UniqueName: \"kubernetes.io/projected/6b93be46-f5c7-4fda-b8b7-6c8038de0308-kube-api-access-6b47b\") pod \"6b93be46-f5c7-4fda-b8b7-6c8038de0308\" (UID: \"6b93be46-f5c7-4fda-b8b7-6c8038de0308\") " Oct 08 19:25:09 crc kubenswrapper[4750]: I1008 19:25:09.286108 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b93be46-f5c7-4fda-b8b7-6c8038de0308-utilities\") pod \"6b93be46-f5c7-4fda-b8b7-6c8038de0308\" (UID: \"6b93be46-f5c7-4fda-b8b7-6c8038de0308\") " Oct 08 19:25:09 crc kubenswrapper[4750]: I1008 19:25:09.286132 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b93be46-f5c7-4fda-b8b7-6c8038de0308-catalog-content\") pod \"6b93be46-f5c7-4fda-b8b7-6c8038de0308\" (UID: \"6b93be46-f5c7-4fda-b8b7-6c8038de0308\") " Oct 08 19:25:09 crc kubenswrapper[4750]: I1008 19:25:09.288514 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b93be46-f5c7-4fda-b8b7-6c8038de0308-utilities" (OuterVolumeSpecName: "utilities") pod "6b93be46-f5c7-4fda-b8b7-6c8038de0308" (UID: "6b93be46-f5c7-4fda-b8b7-6c8038de0308"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:25:09 crc kubenswrapper[4750]: I1008 19:25:09.292596 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b93be46-f5c7-4fda-b8b7-6c8038de0308-kube-api-access-6b47b" (OuterVolumeSpecName: "kube-api-access-6b47b") pod "6b93be46-f5c7-4fda-b8b7-6c8038de0308" (UID: "6b93be46-f5c7-4fda-b8b7-6c8038de0308"). InnerVolumeSpecName "kube-api-access-6b47b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:25:09 crc kubenswrapper[4750]: I1008 19:25:09.388139 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6b47b\" (UniqueName: \"kubernetes.io/projected/6b93be46-f5c7-4fda-b8b7-6c8038de0308-kube-api-access-6b47b\") on node \"crc\" DevicePath \"\"" Oct 08 19:25:09 crc kubenswrapper[4750]: I1008 19:25:09.388186 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b93be46-f5c7-4fda-b8b7-6c8038de0308-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 19:25:09 crc kubenswrapper[4750]: I1008 19:25:09.402375 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b93be46-f5c7-4fda-b8b7-6c8038de0308-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6b93be46-f5c7-4fda-b8b7-6c8038de0308" (UID: "6b93be46-f5c7-4fda-b8b7-6c8038de0308"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:25:09 crc kubenswrapper[4750]: I1008 19:25:09.490256 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b93be46-f5c7-4fda-b8b7-6c8038de0308-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 19:25:10 crc kubenswrapper[4750]: I1008 19:25:10.173074 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6rqpp" event={"ID":"6b93be46-f5c7-4fda-b8b7-6c8038de0308","Type":"ContainerDied","Data":"9f261bd14a655f4102ba89cdc799ab5ee24f2804a6f5b60d6e1249bcf92d1fc9"} Oct 08 19:25:10 crc kubenswrapper[4750]: I1008 19:25:10.173603 4750 scope.go:117] "RemoveContainer" containerID="f3254645152e20070bd347036faad62f940b16fc3e8479c11d30c7158ed0d09b" Oct 08 19:25:10 crc kubenswrapper[4750]: I1008 19:25:10.173174 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6rqpp" Oct 08 19:25:10 crc kubenswrapper[4750]: I1008 19:25:10.226136 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6rqpp"] Oct 08 19:25:10 crc kubenswrapper[4750]: I1008 19:25:10.239063 4750 scope.go:117] "RemoveContainer" containerID="ddb010a216f7f6efa4993dde429f0862ca1445b120ee1f58ae5f566328c6b13e" Oct 08 19:25:10 crc kubenswrapper[4750]: I1008 19:25:10.240402 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6rqpp"] Oct 08 19:25:10 crc kubenswrapper[4750]: I1008 19:25:10.277613 4750 scope.go:117] "RemoveContainer" containerID="8396ad2f368a497365aea1763c9497d454cc828a77e18626ab8ab0733a2a1f24" Oct 08 19:25:10 crc kubenswrapper[4750]: I1008 19:25:10.755411 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b93be46-f5c7-4fda-b8b7-6c8038de0308" path="/var/lib/kubelet/pods/6b93be46-f5c7-4fda-b8b7-6c8038de0308/volumes" Oct 08 19:25:17 crc kubenswrapper[4750]: I1008 19:25:17.736021 4750 scope.go:117] "RemoveContainer" containerID="2d501006b0e8fb04661a88c708dbd92c92878af845524f77124aac3cc6ff30bc" Oct 08 19:25:17 crc kubenswrapper[4750]: E1008 19:25:17.737340 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:25:32 crc kubenswrapper[4750]: I1008 19:25:32.733911 4750 scope.go:117] "RemoveContainer" containerID="2d501006b0e8fb04661a88c708dbd92c92878af845524f77124aac3cc6ff30bc" Oct 08 19:25:32 crc kubenswrapper[4750]: E1008 19:25:32.734836 4750 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:25:47 crc kubenswrapper[4750]: I1008 19:25:47.734437 4750 scope.go:117] "RemoveContainer" containerID="2d501006b0e8fb04661a88c708dbd92c92878af845524f77124aac3cc6ff30bc" Oct 08 19:25:47 crc kubenswrapper[4750]: E1008 19:25:47.735469 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:26:01 crc kubenswrapper[4750]: I1008 19:26:01.734828 4750 scope.go:117] "RemoveContainer" containerID="2d501006b0e8fb04661a88c708dbd92c92878af845524f77124aac3cc6ff30bc" Oct 08 19:26:01 crc kubenswrapper[4750]: E1008 19:26:01.736170 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:26:12 crc kubenswrapper[4750]: I1008 19:26:12.327670 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9dljd"] Oct 08 19:26:12 crc kubenswrapper[4750]: E1008 19:26:12.329874 4750 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="6b93be46-f5c7-4fda-b8b7-6c8038de0308" containerName="extract-content" Oct 08 19:26:12 crc kubenswrapper[4750]: I1008 19:26:12.330078 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b93be46-f5c7-4fda-b8b7-6c8038de0308" containerName="extract-content" Oct 08 19:26:12 crc kubenswrapper[4750]: E1008 19:26:12.330153 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b93be46-f5c7-4fda-b8b7-6c8038de0308" containerName="registry-server" Oct 08 19:26:12 crc kubenswrapper[4750]: I1008 19:26:12.330212 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b93be46-f5c7-4fda-b8b7-6c8038de0308" containerName="registry-server" Oct 08 19:26:12 crc kubenswrapper[4750]: E1008 19:26:12.330288 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4cdbedb-7f69-4647-be75-d07f953493e6" containerName="extract-utilities" Oct 08 19:26:12 crc kubenswrapper[4750]: I1008 19:26:12.330347 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4cdbedb-7f69-4647-be75-d07f953493e6" containerName="extract-utilities" Oct 08 19:26:12 crc kubenswrapper[4750]: E1008 19:26:12.330414 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4cdbedb-7f69-4647-be75-d07f953493e6" containerName="extract-content" Oct 08 19:26:12 crc kubenswrapper[4750]: I1008 19:26:12.330479 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4cdbedb-7f69-4647-be75-d07f953493e6" containerName="extract-content" Oct 08 19:26:12 crc kubenswrapper[4750]: E1008 19:26:12.332050 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4cdbedb-7f69-4647-be75-d07f953493e6" containerName="registry-server" Oct 08 19:26:12 crc kubenswrapper[4750]: I1008 19:26:12.332092 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4cdbedb-7f69-4647-be75-d07f953493e6" containerName="registry-server" Oct 08 19:26:12 crc kubenswrapper[4750]: E1008 19:26:12.332127 4750 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="6b93be46-f5c7-4fda-b8b7-6c8038de0308" containerName="extract-utilities" Oct 08 19:26:12 crc kubenswrapper[4750]: I1008 19:26:12.332175 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b93be46-f5c7-4fda-b8b7-6c8038de0308" containerName="extract-utilities" Oct 08 19:26:12 crc kubenswrapper[4750]: I1008 19:26:12.332668 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b93be46-f5c7-4fda-b8b7-6c8038de0308" containerName="registry-server" Oct 08 19:26:12 crc kubenswrapper[4750]: I1008 19:26:12.332699 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4cdbedb-7f69-4647-be75-d07f953493e6" containerName="registry-server" Oct 08 19:26:12 crc kubenswrapper[4750]: I1008 19:26:12.334036 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9dljd" Oct 08 19:26:12 crc kubenswrapper[4750]: I1008 19:26:12.342350 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9dljd"] Oct 08 19:26:12 crc kubenswrapper[4750]: I1008 19:26:12.503724 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbl6j\" (UniqueName: \"kubernetes.io/projected/52183c04-85c3-4c74-999c-458d313d9fd8-kube-api-access-wbl6j\") pod \"community-operators-9dljd\" (UID: \"52183c04-85c3-4c74-999c-458d313d9fd8\") " pod="openshift-marketplace/community-operators-9dljd" Oct 08 19:26:12 crc kubenswrapper[4750]: I1008 19:26:12.503918 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52183c04-85c3-4c74-999c-458d313d9fd8-catalog-content\") pod \"community-operators-9dljd\" (UID: \"52183c04-85c3-4c74-999c-458d313d9fd8\") " pod="openshift-marketplace/community-operators-9dljd" Oct 08 19:26:12 crc kubenswrapper[4750]: I1008 19:26:12.504132 4750 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52183c04-85c3-4c74-999c-458d313d9fd8-utilities\") pod \"community-operators-9dljd\" (UID: \"52183c04-85c3-4c74-999c-458d313d9fd8\") " pod="openshift-marketplace/community-operators-9dljd" Oct 08 19:26:12 crc kubenswrapper[4750]: I1008 19:26:12.605533 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52183c04-85c3-4c74-999c-458d313d9fd8-utilities\") pod \"community-operators-9dljd\" (UID: \"52183c04-85c3-4c74-999c-458d313d9fd8\") " pod="openshift-marketplace/community-operators-9dljd" Oct 08 19:26:12 crc kubenswrapper[4750]: I1008 19:26:12.605630 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbl6j\" (UniqueName: \"kubernetes.io/projected/52183c04-85c3-4c74-999c-458d313d9fd8-kube-api-access-wbl6j\") pod \"community-operators-9dljd\" (UID: \"52183c04-85c3-4c74-999c-458d313d9fd8\") " pod="openshift-marketplace/community-operators-9dljd" Oct 08 19:26:12 crc kubenswrapper[4750]: I1008 19:26:12.605686 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52183c04-85c3-4c74-999c-458d313d9fd8-catalog-content\") pod \"community-operators-9dljd\" (UID: \"52183c04-85c3-4c74-999c-458d313d9fd8\") " pod="openshift-marketplace/community-operators-9dljd" Oct 08 19:26:12 crc kubenswrapper[4750]: I1008 19:26:12.607053 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52183c04-85c3-4c74-999c-458d313d9fd8-catalog-content\") pod \"community-operators-9dljd\" (UID: \"52183c04-85c3-4c74-999c-458d313d9fd8\") " pod="openshift-marketplace/community-operators-9dljd" Oct 08 19:26:12 crc kubenswrapper[4750]: I1008 19:26:12.607099 4750 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52183c04-85c3-4c74-999c-458d313d9fd8-utilities\") pod \"community-operators-9dljd\" (UID: \"52183c04-85c3-4c74-999c-458d313d9fd8\") " pod="openshift-marketplace/community-operators-9dljd" Oct 08 19:26:12 crc kubenswrapper[4750]: I1008 19:26:12.629962 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbl6j\" (UniqueName: \"kubernetes.io/projected/52183c04-85c3-4c74-999c-458d313d9fd8-kube-api-access-wbl6j\") pod \"community-operators-9dljd\" (UID: \"52183c04-85c3-4c74-999c-458d313d9fd8\") " pod="openshift-marketplace/community-operators-9dljd" Oct 08 19:26:12 crc kubenswrapper[4750]: I1008 19:26:12.661148 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9dljd" Oct 08 19:26:13 crc kubenswrapper[4750]: I1008 19:26:13.220442 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9dljd"] Oct 08 19:26:13 crc kubenswrapper[4750]: I1008 19:26:13.807988 4750 generic.go:334] "Generic (PLEG): container finished" podID="52183c04-85c3-4c74-999c-458d313d9fd8" containerID="43a33dd63594e9294a43d9d0c6aaca699d99c134c9e11948f701e767961f806c" exitCode=0 Oct 08 19:26:13 crc kubenswrapper[4750]: I1008 19:26:13.808103 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9dljd" event={"ID":"52183c04-85c3-4c74-999c-458d313d9fd8","Type":"ContainerDied","Data":"43a33dd63594e9294a43d9d0c6aaca699d99c134c9e11948f701e767961f806c"} Oct 08 19:26:13 crc kubenswrapper[4750]: I1008 19:26:13.808682 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9dljd" event={"ID":"52183c04-85c3-4c74-999c-458d313d9fd8","Type":"ContainerStarted","Data":"4ee170fb9f3156dac0658b05099204664c2782f9673e85a95ec326f4295c3f8d"} Oct 08 19:26:14 crc kubenswrapper[4750]: I1008 
19:26:14.820095 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9dljd" event={"ID":"52183c04-85c3-4c74-999c-458d313d9fd8","Type":"ContainerStarted","Data":"a125ccf70260efbb356328f8f96b079bd44bd1e41eeea2b4ff7e0f2b9146888f"} Oct 08 19:26:15 crc kubenswrapper[4750]: I1008 19:26:15.829608 4750 generic.go:334] "Generic (PLEG): container finished" podID="52183c04-85c3-4c74-999c-458d313d9fd8" containerID="a125ccf70260efbb356328f8f96b079bd44bd1e41eeea2b4ff7e0f2b9146888f" exitCode=0 Oct 08 19:26:15 crc kubenswrapper[4750]: I1008 19:26:15.829681 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9dljd" event={"ID":"52183c04-85c3-4c74-999c-458d313d9fd8","Type":"ContainerDied","Data":"a125ccf70260efbb356328f8f96b079bd44bd1e41eeea2b4ff7e0f2b9146888f"} Oct 08 19:26:16 crc kubenswrapper[4750]: I1008 19:26:16.734683 4750 scope.go:117] "RemoveContainer" containerID="2d501006b0e8fb04661a88c708dbd92c92878af845524f77124aac3cc6ff30bc" Oct 08 19:26:16 crc kubenswrapper[4750]: E1008 19:26:16.735514 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:26:16 crc kubenswrapper[4750]: I1008 19:26:16.841119 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9dljd" event={"ID":"52183c04-85c3-4c74-999c-458d313d9fd8","Type":"ContainerStarted","Data":"aeca7eb3f67131b4899ef5a743acbf2f62911fe38702f402d6a515231907677b"} Oct 08 19:26:16 crc kubenswrapper[4750]: I1008 19:26:16.869976 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-9dljd" podStartSLOduration=2.344873152 podStartE2EDuration="4.869943496s" podCreationTimestamp="2025-10-08 19:26:12 +0000 UTC" firstStartedPulling="2025-10-08 19:26:13.809924756 +0000 UTC m=+4529.722895779" lastFinishedPulling="2025-10-08 19:26:16.33499511 +0000 UTC m=+4532.247966123" observedRunningTime="2025-10-08 19:26:16.85931297 +0000 UTC m=+4532.772284023" watchObservedRunningTime="2025-10-08 19:26:16.869943496 +0000 UTC m=+4532.782914539" Oct 08 19:26:22 crc kubenswrapper[4750]: I1008 19:26:22.661772 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9dljd" Oct 08 19:26:22 crc kubenswrapper[4750]: I1008 19:26:22.662404 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9dljd" Oct 08 19:26:22 crc kubenswrapper[4750]: I1008 19:26:22.720403 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9dljd" Oct 08 19:26:22 crc kubenswrapper[4750]: I1008 19:26:22.934148 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9dljd" Oct 08 19:26:22 crc kubenswrapper[4750]: I1008 19:26:22.982995 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9dljd"] Oct 08 19:26:24 crc kubenswrapper[4750]: I1008 19:26:24.909967 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9dljd" podUID="52183c04-85c3-4c74-999c-458d313d9fd8" containerName="registry-server" containerID="cri-o://aeca7eb3f67131b4899ef5a743acbf2f62911fe38702f402d6a515231907677b" gracePeriod=2 Oct 08 19:26:25 crc kubenswrapper[4750]: I1008 19:26:25.336718 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9dljd" Oct 08 19:26:25 crc kubenswrapper[4750]: I1008 19:26:25.425470 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbl6j\" (UniqueName: \"kubernetes.io/projected/52183c04-85c3-4c74-999c-458d313d9fd8-kube-api-access-wbl6j\") pod \"52183c04-85c3-4c74-999c-458d313d9fd8\" (UID: \"52183c04-85c3-4c74-999c-458d313d9fd8\") " Oct 08 19:26:25 crc kubenswrapper[4750]: I1008 19:26:25.425689 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52183c04-85c3-4c74-999c-458d313d9fd8-utilities\") pod \"52183c04-85c3-4c74-999c-458d313d9fd8\" (UID: \"52183c04-85c3-4c74-999c-458d313d9fd8\") " Oct 08 19:26:25 crc kubenswrapper[4750]: I1008 19:26:25.425832 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52183c04-85c3-4c74-999c-458d313d9fd8-catalog-content\") pod \"52183c04-85c3-4c74-999c-458d313d9fd8\" (UID: \"52183c04-85c3-4c74-999c-458d313d9fd8\") " Oct 08 19:26:25 crc kubenswrapper[4750]: I1008 19:26:25.427664 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52183c04-85c3-4c74-999c-458d313d9fd8-utilities" (OuterVolumeSpecName: "utilities") pod "52183c04-85c3-4c74-999c-458d313d9fd8" (UID: "52183c04-85c3-4c74-999c-458d313d9fd8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:26:25 crc kubenswrapper[4750]: I1008 19:26:25.433252 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52183c04-85c3-4c74-999c-458d313d9fd8-kube-api-access-wbl6j" (OuterVolumeSpecName: "kube-api-access-wbl6j") pod "52183c04-85c3-4c74-999c-458d313d9fd8" (UID: "52183c04-85c3-4c74-999c-458d313d9fd8"). InnerVolumeSpecName "kube-api-access-wbl6j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:26:25 crc kubenswrapper[4750]: I1008 19:26:25.493855 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52183c04-85c3-4c74-999c-458d313d9fd8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "52183c04-85c3-4c74-999c-458d313d9fd8" (UID: "52183c04-85c3-4c74-999c-458d313d9fd8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:26:25 crc kubenswrapper[4750]: I1008 19:26:25.528625 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52183c04-85c3-4c74-999c-458d313d9fd8-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 19:26:25 crc kubenswrapper[4750]: I1008 19:26:25.528691 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52183c04-85c3-4c74-999c-458d313d9fd8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 19:26:25 crc kubenswrapper[4750]: I1008 19:26:25.528710 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbl6j\" (UniqueName: \"kubernetes.io/projected/52183c04-85c3-4c74-999c-458d313d9fd8-kube-api-access-wbl6j\") on node \"crc\" DevicePath \"\"" Oct 08 19:26:25 crc kubenswrapper[4750]: I1008 19:26:25.923604 4750 generic.go:334] "Generic (PLEG): container finished" podID="52183c04-85c3-4c74-999c-458d313d9fd8" containerID="aeca7eb3f67131b4899ef5a743acbf2f62911fe38702f402d6a515231907677b" exitCode=0 Oct 08 19:26:25 crc kubenswrapper[4750]: I1008 19:26:25.923686 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9dljd" event={"ID":"52183c04-85c3-4c74-999c-458d313d9fd8","Type":"ContainerDied","Data":"aeca7eb3f67131b4899ef5a743acbf2f62911fe38702f402d6a515231907677b"} Oct 08 19:26:25 crc kubenswrapper[4750]: I1008 19:26:25.923752 4750 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-9dljd" event={"ID":"52183c04-85c3-4c74-999c-458d313d9fd8","Type":"ContainerDied","Data":"4ee170fb9f3156dac0658b05099204664c2782f9673e85a95ec326f4295c3f8d"} Oct 08 19:26:25 crc kubenswrapper[4750]: I1008 19:26:25.923762 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9dljd" Oct 08 19:26:25 crc kubenswrapper[4750]: I1008 19:26:25.923774 4750 scope.go:117] "RemoveContainer" containerID="aeca7eb3f67131b4899ef5a743acbf2f62911fe38702f402d6a515231907677b" Oct 08 19:26:25 crc kubenswrapper[4750]: I1008 19:26:25.945661 4750 scope.go:117] "RemoveContainer" containerID="a125ccf70260efbb356328f8f96b079bd44bd1e41eeea2b4ff7e0f2b9146888f" Oct 08 19:26:25 crc kubenswrapper[4750]: I1008 19:26:25.976071 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9dljd"] Oct 08 19:26:25 crc kubenswrapper[4750]: I1008 19:26:25.979882 4750 scope.go:117] "RemoveContainer" containerID="43a33dd63594e9294a43d9d0c6aaca699d99c134c9e11948f701e767961f806c" Oct 08 19:26:25 crc kubenswrapper[4750]: I1008 19:26:25.985465 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9dljd"] Oct 08 19:26:26 crc kubenswrapper[4750]: I1008 19:26:26.010074 4750 scope.go:117] "RemoveContainer" containerID="aeca7eb3f67131b4899ef5a743acbf2f62911fe38702f402d6a515231907677b" Oct 08 19:26:26 crc kubenswrapper[4750]: E1008 19:26:26.010881 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aeca7eb3f67131b4899ef5a743acbf2f62911fe38702f402d6a515231907677b\": container with ID starting with aeca7eb3f67131b4899ef5a743acbf2f62911fe38702f402d6a515231907677b not found: ID does not exist" containerID="aeca7eb3f67131b4899ef5a743acbf2f62911fe38702f402d6a515231907677b" Oct 08 19:26:26 crc kubenswrapper[4750]: I1008 
19:26:26.010940 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aeca7eb3f67131b4899ef5a743acbf2f62911fe38702f402d6a515231907677b"} err="failed to get container status \"aeca7eb3f67131b4899ef5a743acbf2f62911fe38702f402d6a515231907677b\": rpc error: code = NotFound desc = could not find container \"aeca7eb3f67131b4899ef5a743acbf2f62911fe38702f402d6a515231907677b\": container with ID starting with aeca7eb3f67131b4899ef5a743acbf2f62911fe38702f402d6a515231907677b not found: ID does not exist" Oct 08 19:26:26 crc kubenswrapper[4750]: I1008 19:26:26.010972 4750 scope.go:117] "RemoveContainer" containerID="a125ccf70260efbb356328f8f96b079bd44bd1e41eeea2b4ff7e0f2b9146888f" Oct 08 19:26:26 crc kubenswrapper[4750]: E1008 19:26:26.011330 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a125ccf70260efbb356328f8f96b079bd44bd1e41eeea2b4ff7e0f2b9146888f\": container with ID starting with a125ccf70260efbb356328f8f96b079bd44bd1e41eeea2b4ff7e0f2b9146888f not found: ID does not exist" containerID="a125ccf70260efbb356328f8f96b079bd44bd1e41eeea2b4ff7e0f2b9146888f" Oct 08 19:26:26 crc kubenswrapper[4750]: I1008 19:26:26.011358 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a125ccf70260efbb356328f8f96b079bd44bd1e41eeea2b4ff7e0f2b9146888f"} err="failed to get container status \"a125ccf70260efbb356328f8f96b079bd44bd1e41eeea2b4ff7e0f2b9146888f\": rpc error: code = NotFound desc = could not find container \"a125ccf70260efbb356328f8f96b079bd44bd1e41eeea2b4ff7e0f2b9146888f\": container with ID starting with a125ccf70260efbb356328f8f96b079bd44bd1e41eeea2b4ff7e0f2b9146888f not found: ID does not exist" Oct 08 19:26:26 crc kubenswrapper[4750]: I1008 19:26:26.011390 4750 scope.go:117] "RemoveContainer" containerID="43a33dd63594e9294a43d9d0c6aaca699d99c134c9e11948f701e767961f806c" Oct 08 19:26:26 crc 
kubenswrapper[4750]: E1008 19:26:26.011896 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43a33dd63594e9294a43d9d0c6aaca699d99c134c9e11948f701e767961f806c\": container with ID starting with 43a33dd63594e9294a43d9d0c6aaca699d99c134c9e11948f701e767961f806c not found: ID does not exist" containerID="43a33dd63594e9294a43d9d0c6aaca699d99c134c9e11948f701e767961f806c" Oct 08 19:26:26 crc kubenswrapper[4750]: I1008 19:26:26.011921 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43a33dd63594e9294a43d9d0c6aaca699d99c134c9e11948f701e767961f806c"} err="failed to get container status \"43a33dd63594e9294a43d9d0c6aaca699d99c134c9e11948f701e767961f806c\": rpc error: code = NotFound desc = could not find container \"43a33dd63594e9294a43d9d0c6aaca699d99c134c9e11948f701e767961f806c\": container with ID starting with 43a33dd63594e9294a43d9d0c6aaca699d99c134c9e11948f701e767961f806c not found: ID does not exist" Oct 08 19:26:26 crc kubenswrapper[4750]: I1008 19:26:26.755601 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52183c04-85c3-4c74-999c-458d313d9fd8" path="/var/lib/kubelet/pods/52183c04-85c3-4c74-999c-458d313d9fd8/volumes" Oct 08 19:26:31 crc kubenswrapper[4750]: I1008 19:26:31.738063 4750 scope.go:117] "RemoveContainer" containerID="2d501006b0e8fb04661a88c708dbd92c92878af845524f77124aac3cc6ff30bc" Oct 08 19:26:31 crc kubenswrapper[4750]: E1008 19:26:31.739138 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:26:40 crc 
kubenswrapper[4750]: I1008 19:26:40.972506 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-p4srs"] Oct 08 19:26:40 crc kubenswrapper[4750]: I1008 19:26:40.977054 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-p4srs"] Oct 08 19:26:41 crc kubenswrapper[4750]: I1008 19:26:41.130981 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-2hwgr"] Oct 08 19:26:41 crc kubenswrapper[4750]: E1008 19:26:41.131538 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52183c04-85c3-4c74-999c-458d313d9fd8" containerName="extract-content" Oct 08 19:26:41 crc kubenswrapper[4750]: I1008 19:26:41.131608 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="52183c04-85c3-4c74-999c-458d313d9fd8" containerName="extract-content" Oct 08 19:26:41 crc kubenswrapper[4750]: E1008 19:26:41.131654 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52183c04-85c3-4c74-999c-458d313d9fd8" containerName="extract-utilities" Oct 08 19:26:41 crc kubenswrapper[4750]: I1008 19:26:41.131665 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="52183c04-85c3-4c74-999c-458d313d9fd8" containerName="extract-utilities" Oct 08 19:26:41 crc kubenswrapper[4750]: E1008 19:26:41.131679 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52183c04-85c3-4c74-999c-458d313d9fd8" containerName="registry-server" Oct 08 19:26:41 crc kubenswrapper[4750]: I1008 19:26:41.131689 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="52183c04-85c3-4c74-999c-458d313d9fd8" containerName="registry-server" Oct 08 19:26:41 crc kubenswrapper[4750]: I1008 19:26:41.131879 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="52183c04-85c3-4c74-999c-458d313d9fd8" containerName="registry-server" Oct 08 19:26:41 crc kubenswrapper[4750]: I1008 19:26:41.132689 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-2hwgr" Oct 08 19:26:41 crc kubenswrapper[4750]: I1008 19:26:41.135884 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Oct 08 19:26:41 crc kubenswrapper[4750]: I1008 19:26:41.135895 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Oct 08 19:26:41 crc kubenswrapper[4750]: I1008 19:26:41.136237 4750 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-nchbf" Oct 08 19:26:41 crc kubenswrapper[4750]: I1008 19:26:41.136515 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Oct 08 19:26:41 crc kubenswrapper[4750]: I1008 19:26:41.141750 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-2hwgr"] Oct 08 19:26:41 crc kubenswrapper[4750]: I1008 19:26:41.206586 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgv49\" (UniqueName: \"kubernetes.io/projected/f713cbd0-74ca-4b4a-ac9b-7b33fac2dc6b-kube-api-access-hgv49\") pod \"crc-storage-crc-2hwgr\" (UID: \"f713cbd0-74ca-4b4a-ac9b-7b33fac2dc6b\") " pod="crc-storage/crc-storage-crc-2hwgr" Oct 08 19:26:41 crc kubenswrapper[4750]: I1008 19:26:41.206643 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f713cbd0-74ca-4b4a-ac9b-7b33fac2dc6b-crc-storage\") pod \"crc-storage-crc-2hwgr\" (UID: \"f713cbd0-74ca-4b4a-ac9b-7b33fac2dc6b\") " pod="crc-storage/crc-storage-crc-2hwgr" Oct 08 19:26:41 crc kubenswrapper[4750]: I1008 19:26:41.206934 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f713cbd0-74ca-4b4a-ac9b-7b33fac2dc6b-node-mnt\") pod \"crc-storage-crc-2hwgr\" (UID: 
\"f713cbd0-74ca-4b4a-ac9b-7b33fac2dc6b\") " pod="crc-storage/crc-storage-crc-2hwgr" Oct 08 19:26:41 crc kubenswrapper[4750]: I1008 19:26:41.308886 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgv49\" (UniqueName: \"kubernetes.io/projected/f713cbd0-74ca-4b4a-ac9b-7b33fac2dc6b-kube-api-access-hgv49\") pod \"crc-storage-crc-2hwgr\" (UID: \"f713cbd0-74ca-4b4a-ac9b-7b33fac2dc6b\") " pod="crc-storage/crc-storage-crc-2hwgr" Oct 08 19:26:41 crc kubenswrapper[4750]: I1008 19:26:41.308946 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f713cbd0-74ca-4b4a-ac9b-7b33fac2dc6b-crc-storage\") pod \"crc-storage-crc-2hwgr\" (UID: \"f713cbd0-74ca-4b4a-ac9b-7b33fac2dc6b\") " pod="crc-storage/crc-storage-crc-2hwgr" Oct 08 19:26:41 crc kubenswrapper[4750]: I1008 19:26:41.309963 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f713cbd0-74ca-4b4a-ac9b-7b33fac2dc6b-crc-storage\") pod \"crc-storage-crc-2hwgr\" (UID: \"f713cbd0-74ca-4b4a-ac9b-7b33fac2dc6b\") " pod="crc-storage/crc-storage-crc-2hwgr" Oct 08 19:26:41 crc kubenswrapper[4750]: I1008 19:26:41.310087 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f713cbd0-74ca-4b4a-ac9b-7b33fac2dc6b-node-mnt\") pod \"crc-storage-crc-2hwgr\" (UID: \"f713cbd0-74ca-4b4a-ac9b-7b33fac2dc6b\") " pod="crc-storage/crc-storage-crc-2hwgr" Oct 08 19:26:41 crc kubenswrapper[4750]: I1008 19:26:41.310419 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f713cbd0-74ca-4b4a-ac9b-7b33fac2dc6b-node-mnt\") pod \"crc-storage-crc-2hwgr\" (UID: \"f713cbd0-74ca-4b4a-ac9b-7b33fac2dc6b\") " pod="crc-storage/crc-storage-crc-2hwgr" Oct 08 19:26:41 crc kubenswrapper[4750]: I1008 19:26:41.333574 4750 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgv49\" (UniqueName: \"kubernetes.io/projected/f713cbd0-74ca-4b4a-ac9b-7b33fac2dc6b-kube-api-access-hgv49\") pod \"crc-storage-crc-2hwgr\" (UID: \"f713cbd0-74ca-4b4a-ac9b-7b33fac2dc6b\") " pod="crc-storage/crc-storage-crc-2hwgr" Oct 08 19:26:41 crc kubenswrapper[4750]: I1008 19:26:41.464433 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-2hwgr" Oct 08 19:26:41 crc kubenswrapper[4750]: I1008 19:26:41.983272 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-2hwgr"] Oct 08 19:26:42 crc kubenswrapper[4750]: I1008 19:26:42.082786 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-2hwgr" event={"ID":"f713cbd0-74ca-4b4a-ac9b-7b33fac2dc6b","Type":"ContainerStarted","Data":"3e93ea4c9f6b35095f25761cc8541aea04a66181b67e140e70eb1c3555f8367d"} Oct 08 19:26:42 crc kubenswrapper[4750]: I1008 19:26:42.778064 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42cddc68-0cc7-4d38-ab9d-cb01de038724" path="/var/lib/kubelet/pods/42cddc68-0cc7-4d38-ab9d-cb01de038724/volumes" Oct 08 19:26:43 crc kubenswrapper[4750]: I1008 19:26:43.095381 4750 generic.go:334] "Generic (PLEG): container finished" podID="f713cbd0-74ca-4b4a-ac9b-7b33fac2dc6b" containerID="fcfc556e6b64a14753ba1448420aa325a1d13206de48f3ab2121b7dcfdb22491" exitCode=0 Oct 08 19:26:43 crc kubenswrapper[4750]: I1008 19:26:43.095536 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-2hwgr" event={"ID":"f713cbd0-74ca-4b4a-ac9b-7b33fac2dc6b","Type":"ContainerDied","Data":"fcfc556e6b64a14753ba1448420aa325a1d13206de48f3ab2121b7dcfdb22491"} Oct 08 19:26:44 crc kubenswrapper[4750]: I1008 19:26:44.459250 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-2hwgr" Oct 08 19:26:44 crc kubenswrapper[4750]: I1008 19:26:44.584069 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f713cbd0-74ca-4b4a-ac9b-7b33fac2dc6b-crc-storage\") pod \"f713cbd0-74ca-4b4a-ac9b-7b33fac2dc6b\" (UID: \"f713cbd0-74ca-4b4a-ac9b-7b33fac2dc6b\") " Oct 08 19:26:44 crc kubenswrapper[4750]: I1008 19:26:44.584186 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgv49\" (UniqueName: \"kubernetes.io/projected/f713cbd0-74ca-4b4a-ac9b-7b33fac2dc6b-kube-api-access-hgv49\") pod \"f713cbd0-74ca-4b4a-ac9b-7b33fac2dc6b\" (UID: \"f713cbd0-74ca-4b4a-ac9b-7b33fac2dc6b\") " Oct 08 19:26:44 crc kubenswrapper[4750]: I1008 19:26:44.584297 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f713cbd0-74ca-4b4a-ac9b-7b33fac2dc6b-node-mnt\") pod \"f713cbd0-74ca-4b4a-ac9b-7b33fac2dc6b\" (UID: \"f713cbd0-74ca-4b4a-ac9b-7b33fac2dc6b\") " Oct 08 19:26:44 crc kubenswrapper[4750]: I1008 19:26:44.584502 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f713cbd0-74ca-4b4a-ac9b-7b33fac2dc6b-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "f713cbd0-74ca-4b4a-ac9b-7b33fac2dc6b" (UID: "f713cbd0-74ca-4b4a-ac9b-7b33fac2dc6b"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 19:26:44 crc kubenswrapper[4750]: I1008 19:26:44.585647 4750 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f713cbd0-74ca-4b4a-ac9b-7b33fac2dc6b-node-mnt\") on node \"crc\" DevicePath \"\"" Oct 08 19:26:44 crc kubenswrapper[4750]: I1008 19:26:44.592290 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f713cbd0-74ca-4b4a-ac9b-7b33fac2dc6b-kube-api-access-hgv49" (OuterVolumeSpecName: "kube-api-access-hgv49") pod "f713cbd0-74ca-4b4a-ac9b-7b33fac2dc6b" (UID: "f713cbd0-74ca-4b4a-ac9b-7b33fac2dc6b"). InnerVolumeSpecName "kube-api-access-hgv49". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:26:44 crc kubenswrapper[4750]: I1008 19:26:44.611270 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f713cbd0-74ca-4b4a-ac9b-7b33fac2dc6b-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "f713cbd0-74ca-4b4a-ac9b-7b33fac2dc6b" (UID: "f713cbd0-74ca-4b4a-ac9b-7b33fac2dc6b"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:26:44 crc kubenswrapper[4750]: I1008 19:26:44.687495 4750 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f713cbd0-74ca-4b4a-ac9b-7b33fac2dc6b-crc-storage\") on node \"crc\" DevicePath \"\"" Oct 08 19:26:44 crc kubenswrapper[4750]: I1008 19:26:44.687579 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgv49\" (UniqueName: \"kubernetes.io/projected/f713cbd0-74ca-4b4a-ac9b-7b33fac2dc6b-kube-api-access-hgv49\") on node \"crc\" DevicePath \"\"" Oct 08 19:26:44 crc kubenswrapper[4750]: I1008 19:26:44.743517 4750 scope.go:117] "RemoveContainer" containerID="2d501006b0e8fb04661a88c708dbd92c92878af845524f77124aac3cc6ff30bc" Oct 08 19:26:44 crc kubenswrapper[4750]: E1008 19:26:44.744134 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:26:45 crc kubenswrapper[4750]: I1008 19:26:45.121701 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-2hwgr" event={"ID":"f713cbd0-74ca-4b4a-ac9b-7b33fac2dc6b","Type":"ContainerDied","Data":"3e93ea4c9f6b35095f25761cc8541aea04a66181b67e140e70eb1c3555f8367d"} Oct 08 19:26:45 crc kubenswrapper[4750]: I1008 19:26:45.122142 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e93ea4c9f6b35095f25761cc8541aea04a66181b67e140e70eb1c3555f8367d" Oct 08 19:26:45 crc kubenswrapper[4750]: I1008 19:26:45.121773 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-2hwgr" Oct 08 19:26:46 crc kubenswrapper[4750]: I1008 19:26:46.781064 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-2hwgr"] Oct 08 19:26:46 crc kubenswrapper[4750]: I1008 19:26:46.787507 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-2hwgr"] Oct 08 19:26:46 crc kubenswrapper[4750]: I1008 19:26:46.903976 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-8fpk7"] Oct 08 19:26:46 crc kubenswrapper[4750]: E1008 19:26:46.904616 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f713cbd0-74ca-4b4a-ac9b-7b33fac2dc6b" containerName="storage" Oct 08 19:26:46 crc kubenswrapper[4750]: I1008 19:26:46.904650 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="f713cbd0-74ca-4b4a-ac9b-7b33fac2dc6b" containerName="storage" Oct 08 19:26:46 crc kubenswrapper[4750]: I1008 19:26:46.905125 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="f713cbd0-74ca-4b4a-ac9b-7b33fac2dc6b" containerName="storage" Oct 08 19:26:46 crc kubenswrapper[4750]: I1008 19:26:46.906654 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-8fpk7" Oct 08 19:26:46 crc kubenswrapper[4750]: I1008 19:26:46.909453 4750 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-nchbf" Oct 08 19:26:46 crc kubenswrapper[4750]: I1008 19:26:46.910255 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Oct 08 19:26:46 crc kubenswrapper[4750]: I1008 19:26:46.911769 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-8fpk7"] Oct 08 19:26:46 crc kubenswrapper[4750]: I1008 19:26:46.913088 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Oct 08 19:26:46 crc kubenswrapper[4750]: I1008 19:26:46.913105 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Oct 08 19:26:47 crc kubenswrapper[4750]: I1008 19:26:47.038186 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/9d273d32-9f45-40d8-8174-f494f0f0014a-node-mnt\") pod \"crc-storage-crc-8fpk7\" (UID: \"9d273d32-9f45-40d8-8174-f494f0f0014a\") " pod="crc-storage/crc-storage-crc-8fpk7" Oct 08 19:26:47 crc kubenswrapper[4750]: I1008 19:26:47.038509 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp89m\" (UniqueName: \"kubernetes.io/projected/9d273d32-9f45-40d8-8174-f494f0f0014a-kube-api-access-hp89m\") pod \"crc-storage-crc-8fpk7\" (UID: \"9d273d32-9f45-40d8-8174-f494f0f0014a\") " pod="crc-storage/crc-storage-crc-8fpk7" Oct 08 19:26:47 crc kubenswrapper[4750]: I1008 19:26:47.038677 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/9d273d32-9f45-40d8-8174-f494f0f0014a-crc-storage\") pod \"crc-storage-crc-8fpk7\" (UID: 
\"9d273d32-9f45-40d8-8174-f494f0f0014a\") " pod="crc-storage/crc-storage-crc-8fpk7" Oct 08 19:26:47 crc kubenswrapper[4750]: I1008 19:26:47.139796 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/9d273d32-9f45-40d8-8174-f494f0f0014a-crc-storage\") pod \"crc-storage-crc-8fpk7\" (UID: \"9d273d32-9f45-40d8-8174-f494f0f0014a\") " pod="crc-storage/crc-storage-crc-8fpk7" Oct 08 19:26:47 crc kubenswrapper[4750]: I1008 19:26:47.139912 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/9d273d32-9f45-40d8-8174-f494f0f0014a-node-mnt\") pod \"crc-storage-crc-8fpk7\" (UID: \"9d273d32-9f45-40d8-8174-f494f0f0014a\") " pod="crc-storage/crc-storage-crc-8fpk7" Oct 08 19:26:47 crc kubenswrapper[4750]: I1008 19:26:47.139957 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hp89m\" (UniqueName: \"kubernetes.io/projected/9d273d32-9f45-40d8-8174-f494f0f0014a-kube-api-access-hp89m\") pod \"crc-storage-crc-8fpk7\" (UID: \"9d273d32-9f45-40d8-8174-f494f0f0014a\") " pod="crc-storage/crc-storage-crc-8fpk7" Oct 08 19:26:47 crc kubenswrapper[4750]: I1008 19:26:47.140460 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/9d273d32-9f45-40d8-8174-f494f0f0014a-node-mnt\") pod \"crc-storage-crc-8fpk7\" (UID: \"9d273d32-9f45-40d8-8174-f494f0f0014a\") " pod="crc-storage/crc-storage-crc-8fpk7" Oct 08 19:26:47 crc kubenswrapper[4750]: I1008 19:26:47.141398 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/9d273d32-9f45-40d8-8174-f494f0f0014a-crc-storage\") pod \"crc-storage-crc-8fpk7\" (UID: \"9d273d32-9f45-40d8-8174-f494f0f0014a\") " pod="crc-storage/crc-storage-crc-8fpk7" Oct 08 19:26:47 crc kubenswrapper[4750]: I1008 19:26:47.173614 4750 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp89m\" (UniqueName: \"kubernetes.io/projected/9d273d32-9f45-40d8-8174-f494f0f0014a-kube-api-access-hp89m\") pod \"crc-storage-crc-8fpk7\" (UID: \"9d273d32-9f45-40d8-8174-f494f0f0014a\") " pod="crc-storage/crc-storage-crc-8fpk7" Oct 08 19:26:47 crc kubenswrapper[4750]: I1008 19:26:47.235658 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-8fpk7" Oct 08 19:26:47 crc kubenswrapper[4750]: I1008 19:26:47.733456 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-8fpk7"] Oct 08 19:26:48 crc kubenswrapper[4750]: I1008 19:26:48.159721 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-8fpk7" event={"ID":"9d273d32-9f45-40d8-8174-f494f0f0014a","Type":"ContainerStarted","Data":"fb08b041779e4b4fcb8da3107a3ac33354a5581b8b8bdb2d6661f6f7fbace3b3"} Oct 08 19:26:48 crc kubenswrapper[4750]: I1008 19:26:48.743866 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f713cbd0-74ca-4b4a-ac9b-7b33fac2dc6b" path="/var/lib/kubelet/pods/f713cbd0-74ca-4b4a-ac9b-7b33fac2dc6b/volumes" Oct 08 19:26:49 crc kubenswrapper[4750]: I1008 19:26:49.174072 4750 generic.go:334] "Generic (PLEG): container finished" podID="9d273d32-9f45-40d8-8174-f494f0f0014a" containerID="df7095160639f19468fe9637939f5c1fa08726b48d776435a38e46c0a63b013e" exitCode=0 Oct 08 19:26:49 crc kubenswrapper[4750]: I1008 19:26:49.174181 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-8fpk7" event={"ID":"9d273d32-9f45-40d8-8174-f494f0f0014a","Type":"ContainerDied","Data":"df7095160639f19468fe9637939f5c1fa08726b48d776435a38e46c0a63b013e"} Oct 08 19:26:50 crc kubenswrapper[4750]: I1008 19:26:50.479205 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-8fpk7" Oct 08 19:26:50 crc kubenswrapper[4750]: I1008 19:26:50.603501 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/9d273d32-9f45-40d8-8174-f494f0f0014a-node-mnt\") pod \"9d273d32-9f45-40d8-8174-f494f0f0014a\" (UID: \"9d273d32-9f45-40d8-8174-f494f0f0014a\") " Oct 08 19:26:50 crc kubenswrapper[4750]: I1008 19:26:50.603646 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hp89m\" (UniqueName: \"kubernetes.io/projected/9d273d32-9f45-40d8-8174-f494f0f0014a-kube-api-access-hp89m\") pod \"9d273d32-9f45-40d8-8174-f494f0f0014a\" (UID: \"9d273d32-9f45-40d8-8174-f494f0f0014a\") " Oct 08 19:26:50 crc kubenswrapper[4750]: I1008 19:26:50.603684 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9d273d32-9f45-40d8-8174-f494f0f0014a-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "9d273d32-9f45-40d8-8174-f494f0f0014a" (UID: "9d273d32-9f45-40d8-8174-f494f0f0014a"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 19:26:50 crc kubenswrapper[4750]: I1008 19:26:50.603758 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/9d273d32-9f45-40d8-8174-f494f0f0014a-crc-storage\") pod \"9d273d32-9f45-40d8-8174-f494f0f0014a\" (UID: \"9d273d32-9f45-40d8-8174-f494f0f0014a\") " Oct 08 19:26:50 crc kubenswrapper[4750]: I1008 19:26:50.604259 4750 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/9d273d32-9f45-40d8-8174-f494f0f0014a-node-mnt\") on node \"crc\" DevicePath \"\"" Oct 08 19:26:50 crc kubenswrapper[4750]: I1008 19:26:50.613277 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d273d32-9f45-40d8-8174-f494f0f0014a-kube-api-access-hp89m" (OuterVolumeSpecName: "kube-api-access-hp89m") pod "9d273d32-9f45-40d8-8174-f494f0f0014a" (UID: "9d273d32-9f45-40d8-8174-f494f0f0014a"). InnerVolumeSpecName "kube-api-access-hp89m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:26:50 crc kubenswrapper[4750]: I1008 19:26:50.631810 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d273d32-9f45-40d8-8174-f494f0f0014a-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "9d273d32-9f45-40d8-8174-f494f0f0014a" (UID: "9d273d32-9f45-40d8-8174-f494f0f0014a"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:26:50 crc kubenswrapper[4750]: I1008 19:26:50.706014 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hp89m\" (UniqueName: \"kubernetes.io/projected/9d273d32-9f45-40d8-8174-f494f0f0014a-kube-api-access-hp89m\") on node \"crc\" DevicePath \"\"" Oct 08 19:26:50 crc kubenswrapper[4750]: I1008 19:26:50.706114 4750 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/9d273d32-9f45-40d8-8174-f494f0f0014a-crc-storage\") on node \"crc\" DevicePath \"\"" Oct 08 19:26:51 crc kubenswrapper[4750]: I1008 19:26:51.202051 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-8fpk7" event={"ID":"9d273d32-9f45-40d8-8174-f494f0f0014a","Type":"ContainerDied","Data":"fb08b041779e4b4fcb8da3107a3ac33354a5581b8b8bdb2d6661f6f7fbace3b3"} Oct 08 19:26:51 crc kubenswrapper[4750]: I1008 19:26:51.202125 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb08b041779e4b4fcb8da3107a3ac33354a5581b8b8bdb2d6661f6f7fbace3b3" Oct 08 19:26:51 crc kubenswrapper[4750]: I1008 19:26:51.202155 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-8fpk7" Oct 08 19:26:55 crc kubenswrapper[4750]: I1008 19:26:55.734645 4750 scope.go:117] "RemoveContainer" containerID="2d501006b0e8fb04661a88c708dbd92c92878af845524f77124aac3cc6ff30bc" Oct 08 19:26:55 crc kubenswrapper[4750]: E1008 19:26:55.735385 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:27:02 crc kubenswrapper[4750]: I1008 19:27:02.549022 4750 scope.go:117] "RemoveContainer" containerID="a40408f8da3c343fa2267b44a9121e1dff8310915a7c556e894ca9bb049a70d0" Oct 08 19:27:07 crc kubenswrapper[4750]: I1008 19:27:07.734485 4750 scope.go:117] "RemoveContainer" containerID="2d501006b0e8fb04661a88c708dbd92c92878af845524f77124aac3cc6ff30bc" Oct 08 19:27:07 crc kubenswrapper[4750]: E1008 19:27:07.735993 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:27:18 crc kubenswrapper[4750]: I1008 19:27:18.734643 4750 scope.go:117] "RemoveContainer" containerID="2d501006b0e8fb04661a88c708dbd92c92878af845524f77124aac3cc6ff30bc" Oct 08 19:27:18 crc kubenswrapper[4750]: E1008 19:27:18.735616 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4"
Oct 08 19:27:31 crc kubenswrapper[4750]: I1008 19:27:31.735221 4750 scope.go:117] "RemoveContainer" containerID="2d501006b0e8fb04661a88c708dbd92c92878af845524f77124aac3cc6ff30bc"
Oct 08 19:27:31 crc kubenswrapper[4750]: E1008 19:27:31.736086 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4"
Oct 08 19:27:43 crc kubenswrapper[4750]: I1008 19:27:43.735009 4750 scope.go:117] "RemoveContainer" containerID="2d501006b0e8fb04661a88c708dbd92c92878af845524f77124aac3cc6ff30bc"
Oct 08 19:27:43 crc kubenswrapper[4750]: E1008 19:27:43.735824 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4"
Oct 08 19:27:58 crc kubenswrapper[4750]: I1008 19:27:58.734891 4750 scope.go:117] "RemoveContainer" containerID="2d501006b0e8fb04661a88c708dbd92c92878af845524f77124aac3cc6ff30bc"
Oct 08 19:27:58 crc kubenswrapper[4750]: E1008 19:27:58.737866 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4"
Oct 08 19:28:10 crc kubenswrapper[4750]: I1008 19:28:10.734803 4750 scope.go:117] "RemoveContainer" containerID="2d501006b0e8fb04661a88c708dbd92c92878af845524f77124aac3cc6ff30bc"
Oct 08 19:28:10 crc kubenswrapper[4750]: E1008 19:28:10.735635 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4"
Oct 08 19:28:25 crc kubenswrapper[4750]: I1008 19:28:25.735782 4750 scope.go:117] "RemoveContainer" containerID="2d501006b0e8fb04661a88c708dbd92c92878af845524f77124aac3cc6ff30bc"
Oct 08 19:28:25 crc kubenswrapper[4750]: E1008 19:28:25.737278 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4"
Oct 08 19:28:38 crc kubenswrapper[4750]: I1008 19:28:38.734926 4750 scope.go:117] "RemoveContainer" containerID="2d501006b0e8fb04661a88c708dbd92c92878af845524f77124aac3cc6ff30bc"
Oct 08 19:28:38 crc kubenswrapper[4750]: E1008 19:28:38.736063 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4"
Oct 08 19:28:50 crc kubenswrapper[4750]: I1008 19:28:50.735143 4750 scope.go:117] "RemoveContainer" containerID="2d501006b0e8fb04661a88c708dbd92c92878af845524f77124aac3cc6ff30bc"
Oct 08 19:28:50 crc kubenswrapper[4750]: E1008 19:28:50.736393 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4"
Oct 08 19:29:03 crc kubenswrapper[4750]: I1008 19:29:03.734920 4750 scope.go:117] "RemoveContainer" containerID="2d501006b0e8fb04661a88c708dbd92c92878af845524f77124aac3cc6ff30bc"
Oct 08 19:29:04 crc kubenswrapper[4750]: I1008 19:29:04.527989 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerStarted","Data":"1641173c95d4171a3b1be3458fb6da9f585cc8093883355dc63c1b1b9fca71ea"}
Oct 08 19:29:37 crc kubenswrapper[4750]: I1008 19:29:37.203077 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sl9xk"]
Oct 08 19:29:37 crc kubenswrapper[4750]: E1008 19:29:37.204653 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d273d32-9f45-40d8-8174-f494f0f0014a" containerName="storage"
Oct 08 19:29:37 crc kubenswrapper[4750]: I1008 19:29:37.204675 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d273d32-9f45-40d8-8174-f494f0f0014a" containerName="storage"
Oct 08 19:29:37 crc kubenswrapper[4750]: I1008 19:29:37.204883 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d273d32-9f45-40d8-8174-f494f0f0014a" containerName="storage"
Oct 08 19:29:37 crc kubenswrapper[4750]: I1008 19:29:37.209794 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sl9xk"
Oct 08 19:29:37 crc kubenswrapper[4750]: I1008 19:29:37.215809 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sl9xk"]
Oct 08 19:29:37 crc kubenswrapper[4750]: I1008 19:29:37.294590 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99cqh\" (UniqueName: \"kubernetes.io/projected/bf522436-a8e2-4904-99d5-4eedc86ece10-kube-api-access-99cqh\") pod \"certified-operators-sl9xk\" (UID: \"bf522436-a8e2-4904-99d5-4eedc86ece10\") " pod="openshift-marketplace/certified-operators-sl9xk"
Oct 08 19:29:37 crc kubenswrapper[4750]: I1008 19:29:37.294816 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf522436-a8e2-4904-99d5-4eedc86ece10-catalog-content\") pod \"certified-operators-sl9xk\" (UID: \"bf522436-a8e2-4904-99d5-4eedc86ece10\") " pod="openshift-marketplace/certified-operators-sl9xk"
Oct 08 19:29:37 crc kubenswrapper[4750]: I1008 19:29:37.295078 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf522436-a8e2-4904-99d5-4eedc86ece10-utilities\") pod \"certified-operators-sl9xk\" (UID: \"bf522436-a8e2-4904-99d5-4eedc86ece10\") " pod="openshift-marketplace/certified-operators-sl9xk"
Oct 08 19:29:37 crc kubenswrapper[4750]: I1008 19:29:37.397256 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf522436-a8e2-4904-99d5-4eedc86ece10-catalog-content\") pod \"certified-operators-sl9xk\" (UID: \"bf522436-a8e2-4904-99d5-4eedc86ece10\") " pod="openshift-marketplace/certified-operators-sl9xk"
Oct 08 19:29:37 crc kubenswrapper[4750]: I1008 19:29:37.397362 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf522436-a8e2-4904-99d5-4eedc86ece10-utilities\") pod \"certified-operators-sl9xk\" (UID: \"bf522436-a8e2-4904-99d5-4eedc86ece10\") " pod="openshift-marketplace/certified-operators-sl9xk"
Oct 08 19:29:37 crc kubenswrapper[4750]: I1008 19:29:37.397420 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99cqh\" (UniqueName: \"kubernetes.io/projected/bf522436-a8e2-4904-99d5-4eedc86ece10-kube-api-access-99cqh\") pod \"certified-operators-sl9xk\" (UID: \"bf522436-a8e2-4904-99d5-4eedc86ece10\") " pod="openshift-marketplace/certified-operators-sl9xk"
Oct 08 19:29:37 crc kubenswrapper[4750]: I1008 19:29:37.398432 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf522436-a8e2-4904-99d5-4eedc86ece10-catalog-content\") pod \"certified-operators-sl9xk\" (UID: \"bf522436-a8e2-4904-99d5-4eedc86ece10\") " pod="openshift-marketplace/certified-operators-sl9xk"
Oct 08 19:29:37 crc kubenswrapper[4750]: I1008 19:29:37.398820 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf522436-a8e2-4904-99d5-4eedc86ece10-utilities\") pod \"certified-operators-sl9xk\" (UID: \"bf522436-a8e2-4904-99d5-4eedc86ece10\") " pod="openshift-marketplace/certified-operators-sl9xk"
Oct 08 19:29:37 crc kubenswrapper[4750]: I1008 19:29:37.420180 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99cqh\" (UniqueName: \"kubernetes.io/projected/bf522436-a8e2-4904-99d5-4eedc86ece10-kube-api-access-99cqh\") pod \"certified-operators-sl9xk\" (UID: \"bf522436-a8e2-4904-99d5-4eedc86ece10\") " pod="openshift-marketplace/certified-operators-sl9xk"
Oct 08 19:29:37 crc kubenswrapper[4750]: I1008 19:29:37.533223 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sl9xk"
Oct 08 19:29:38 crc kubenswrapper[4750]: I1008 19:29:38.043442 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sl9xk"]
Oct 08 19:29:38 crc kubenswrapper[4750]: I1008 19:29:38.845272 4750 generic.go:334] "Generic (PLEG): container finished" podID="bf522436-a8e2-4904-99d5-4eedc86ece10" containerID="6fe3c05e14a4fb595f8c47d5fc6317ced738c899172eae7c353ee4c994e67885" exitCode=0
Oct 08 19:29:38 crc kubenswrapper[4750]: I1008 19:29:38.845428 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sl9xk" event={"ID":"bf522436-a8e2-4904-99d5-4eedc86ece10","Type":"ContainerDied","Data":"6fe3c05e14a4fb595f8c47d5fc6317ced738c899172eae7c353ee4c994e67885"}
Oct 08 19:29:38 crc kubenswrapper[4750]: I1008 19:29:38.846217 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sl9xk" event={"ID":"bf522436-a8e2-4904-99d5-4eedc86ece10","Type":"ContainerStarted","Data":"0fc41c4349fb8f524950ebbd3b1ad44673e8133a48b15757a884f0f72b2366e5"}
Oct 08 19:29:39 crc kubenswrapper[4750]: I1008 19:29:39.863125 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sl9xk" event={"ID":"bf522436-a8e2-4904-99d5-4eedc86ece10","Type":"ContainerStarted","Data":"e702e6391414025d461b44422bbb4bcf9e473cadf968c334e1588c292d74ed31"}
Oct 08 19:29:40 crc kubenswrapper[4750]: I1008 19:29:40.877160 4750 generic.go:334] "Generic (PLEG): container finished" podID="bf522436-a8e2-4904-99d5-4eedc86ece10" containerID="e702e6391414025d461b44422bbb4bcf9e473cadf968c334e1588c292d74ed31" exitCode=0
Oct 08 19:29:40 crc kubenswrapper[4750]: I1008 19:29:40.877264 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sl9xk" event={"ID":"bf522436-a8e2-4904-99d5-4eedc86ece10","Type":"ContainerDied","Data":"e702e6391414025d461b44422bbb4bcf9e473cadf968c334e1588c292d74ed31"}
Oct 08 19:29:41 crc kubenswrapper[4750]: I1008 19:29:41.892241 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sl9xk" event={"ID":"bf522436-a8e2-4904-99d5-4eedc86ece10","Type":"ContainerStarted","Data":"ffcbd5938f8ae41a4d432ba0edd20c9dec995469c3f0544ab1658ea63e1e52d1"}
Oct 08 19:29:41 crc kubenswrapper[4750]: I1008 19:29:41.921213 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sl9xk" podStartSLOduration=2.481619022 podStartE2EDuration="4.921192237s" podCreationTimestamp="2025-10-08 19:29:37 +0000 UTC" firstStartedPulling="2025-10-08 19:29:38.849842646 +0000 UTC m=+4734.762813669" lastFinishedPulling="2025-10-08 19:29:41.289415871 +0000 UTC m=+4737.202386884" observedRunningTime="2025-10-08 19:29:41.915989836 +0000 UTC m=+4737.828960849" watchObservedRunningTime="2025-10-08 19:29:41.921192237 +0000 UTC m=+4737.834163250"
Oct 08 19:29:47 crc kubenswrapper[4750]: I1008 19:29:47.534256 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sl9xk"
Oct 08 19:29:47 crc kubenswrapper[4750]: I1008 19:29:47.534775 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sl9xk"
Oct 08 19:29:47 crc kubenswrapper[4750]: I1008 19:29:47.598970 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sl9xk"
Oct 08 19:29:48 crc kubenswrapper[4750]: I1008 19:29:48.011943 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sl9xk"
Oct 08 19:29:48 crc kubenswrapper[4750]: I1008 19:29:48.068087 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sl9xk"]
Oct 08 19:29:49 crc kubenswrapper[4750]: I1008 19:29:49.966517 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sl9xk" podUID="bf522436-a8e2-4904-99d5-4eedc86ece10" containerName="registry-server" containerID="cri-o://ffcbd5938f8ae41a4d432ba0edd20c9dec995469c3f0544ab1658ea63e1e52d1" gracePeriod=2
Oct 08 19:29:50 crc kubenswrapper[4750]: I1008 19:29:50.384952 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sl9xk"
Oct 08 19:29:50 crc kubenswrapper[4750]: I1008 19:29:50.510820 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf522436-a8e2-4904-99d5-4eedc86ece10-catalog-content\") pod \"bf522436-a8e2-4904-99d5-4eedc86ece10\" (UID: \"bf522436-a8e2-4904-99d5-4eedc86ece10\") "
Oct 08 19:29:50 crc kubenswrapper[4750]: I1008 19:29:50.510912 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99cqh\" (UniqueName: \"kubernetes.io/projected/bf522436-a8e2-4904-99d5-4eedc86ece10-kube-api-access-99cqh\") pod \"bf522436-a8e2-4904-99d5-4eedc86ece10\" (UID: \"bf522436-a8e2-4904-99d5-4eedc86ece10\") "
Oct 08 19:29:50 crc kubenswrapper[4750]: I1008 19:29:50.510958 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf522436-a8e2-4904-99d5-4eedc86ece10-utilities\") pod \"bf522436-a8e2-4904-99d5-4eedc86ece10\" (UID: \"bf522436-a8e2-4904-99d5-4eedc86ece10\") "
Oct 08 19:29:50 crc kubenswrapper[4750]: I1008 19:29:50.512260 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf522436-a8e2-4904-99d5-4eedc86ece10-utilities" (OuterVolumeSpecName: "utilities") pod "bf522436-a8e2-4904-99d5-4eedc86ece10" (UID: "bf522436-a8e2-4904-99d5-4eedc86ece10"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 19:29:50 crc kubenswrapper[4750]: I1008 19:29:50.524009 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf522436-a8e2-4904-99d5-4eedc86ece10-kube-api-access-99cqh" (OuterVolumeSpecName: "kube-api-access-99cqh") pod "bf522436-a8e2-4904-99d5-4eedc86ece10" (UID: "bf522436-a8e2-4904-99d5-4eedc86ece10"). InnerVolumeSpecName "kube-api-access-99cqh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 19:29:50 crc kubenswrapper[4750]: I1008 19:29:50.612973 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99cqh\" (UniqueName: \"kubernetes.io/projected/bf522436-a8e2-4904-99d5-4eedc86ece10-kube-api-access-99cqh\") on node \"crc\" DevicePath \"\""
Oct 08 19:29:50 crc kubenswrapper[4750]: I1008 19:29:50.613021 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf522436-a8e2-4904-99d5-4eedc86ece10-utilities\") on node \"crc\" DevicePath \"\""
Oct 08 19:29:50 crc kubenswrapper[4750]: I1008 19:29:50.701116 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf522436-a8e2-4904-99d5-4eedc86ece10-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bf522436-a8e2-4904-99d5-4eedc86ece10" (UID: "bf522436-a8e2-4904-99d5-4eedc86ece10"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 19:29:50 crc kubenswrapper[4750]: I1008 19:29:50.715084 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf522436-a8e2-4904-99d5-4eedc86ece10-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 08 19:29:50 crc kubenswrapper[4750]: I1008 19:29:50.980886 4750 generic.go:334] "Generic (PLEG): container finished" podID="bf522436-a8e2-4904-99d5-4eedc86ece10" containerID="ffcbd5938f8ae41a4d432ba0edd20c9dec995469c3f0544ab1658ea63e1e52d1" exitCode=0
Oct 08 19:29:50 crc kubenswrapper[4750]: I1008 19:29:50.980934 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sl9xk" event={"ID":"bf522436-a8e2-4904-99d5-4eedc86ece10","Type":"ContainerDied","Data":"ffcbd5938f8ae41a4d432ba0edd20c9dec995469c3f0544ab1658ea63e1e52d1"}
Oct 08 19:29:50 crc kubenswrapper[4750]: I1008 19:29:50.980993 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sl9xk"
Oct 08 19:29:50 crc kubenswrapper[4750]: I1008 19:29:50.981272 4750 scope.go:117] "RemoveContainer" containerID="ffcbd5938f8ae41a4d432ba0edd20c9dec995469c3f0544ab1658ea63e1e52d1"
Oct 08 19:29:50 crc kubenswrapper[4750]: I1008 19:29:50.981254 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sl9xk" event={"ID":"bf522436-a8e2-4904-99d5-4eedc86ece10","Type":"ContainerDied","Data":"0fc41c4349fb8f524950ebbd3b1ad44673e8133a48b15757a884f0f72b2366e5"}
Oct 08 19:29:51 crc kubenswrapper[4750]: I1008 19:29:51.017214 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sl9xk"]
Oct 08 19:29:51 crc kubenswrapper[4750]: I1008 19:29:51.022919 4750 scope.go:117] "RemoveContainer" containerID="e702e6391414025d461b44422bbb4bcf9e473cadf968c334e1588c292d74ed31"
Oct 08 19:29:51 crc kubenswrapper[4750]: I1008 19:29:51.031397 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sl9xk"]
Oct 08 19:29:51 crc kubenswrapper[4750]: I1008 19:29:51.055786 4750 scope.go:117] "RemoveContainer" containerID="6fe3c05e14a4fb595f8c47d5fc6317ced738c899172eae7c353ee4c994e67885"
Oct 08 19:29:51 crc kubenswrapper[4750]: I1008 19:29:51.083396 4750 scope.go:117] "RemoveContainer" containerID="ffcbd5938f8ae41a4d432ba0edd20c9dec995469c3f0544ab1658ea63e1e52d1"
Oct 08 19:29:51 crc kubenswrapper[4750]: E1008 19:29:51.084185 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffcbd5938f8ae41a4d432ba0edd20c9dec995469c3f0544ab1658ea63e1e52d1\": container with ID starting with ffcbd5938f8ae41a4d432ba0edd20c9dec995469c3f0544ab1658ea63e1e52d1 not found: ID does not exist" containerID="ffcbd5938f8ae41a4d432ba0edd20c9dec995469c3f0544ab1658ea63e1e52d1"
Oct 08 19:29:51 crc kubenswrapper[4750]: I1008 19:29:51.084270 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffcbd5938f8ae41a4d432ba0edd20c9dec995469c3f0544ab1658ea63e1e52d1"} err="failed to get container status \"ffcbd5938f8ae41a4d432ba0edd20c9dec995469c3f0544ab1658ea63e1e52d1\": rpc error: code = NotFound desc = could not find container \"ffcbd5938f8ae41a4d432ba0edd20c9dec995469c3f0544ab1658ea63e1e52d1\": container with ID starting with ffcbd5938f8ae41a4d432ba0edd20c9dec995469c3f0544ab1658ea63e1e52d1 not found: ID does not exist"
Oct 08 19:29:51 crc kubenswrapper[4750]: I1008 19:29:51.084318 4750 scope.go:117] "RemoveContainer" containerID="e702e6391414025d461b44422bbb4bcf9e473cadf968c334e1588c292d74ed31"
Oct 08 19:29:51 crc kubenswrapper[4750]: E1008 19:29:51.084806 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e702e6391414025d461b44422bbb4bcf9e473cadf968c334e1588c292d74ed31\": container with ID starting with e702e6391414025d461b44422bbb4bcf9e473cadf968c334e1588c292d74ed31 not found: ID does not exist" containerID="e702e6391414025d461b44422bbb4bcf9e473cadf968c334e1588c292d74ed31"
Oct 08 19:29:51 crc kubenswrapper[4750]: I1008 19:29:51.084873 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e702e6391414025d461b44422bbb4bcf9e473cadf968c334e1588c292d74ed31"} err="failed to get container status \"e702e6391414025d461b44422bbb4bcf9e473cadf968c334e1588c292d74ed31\": rpc error: code = NotFound desc = could not find container \"e702e6391414025d461b44422bbb4bcf9e473cadf968c334e1588c292d74ed31\": container with ID starting with e702e6391414025d461b44422bbb4bcf9e473cadf968c334e1588c292d74ed31 not found: ID does not exist"
Oct 08 19:29:51 crc kubenswrapper[4750]: I1008 19:29:51.084906 4750 scope.go:117] "RemoveContainer" containerID="6fe3c05e14a4fb595f8c47d5fc6317ced738c899172eae7c353ee4c994e67885"
Oct 08 19:29:51 crc kubenswrapper[4750]: E1008 19:29:51.085199 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fe3c05e14a4fb595f8c47d5fc6317ced738c899172eae7c353ee4c994e67885\": container with ID starting with 6fe3c05e14a4fb595f8c47d5fc6317ced738c899172eae7c353ee4c994e67885 not found: ID does not exist" containerID="6fe3c05e14a4fb595f8c47d5fc6317ced738c899172eae7c353ee4c994e67885"
Oct 08 19:29:51 crc kubenswrapper[4750]: I1008 19:29:51.085224 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fe3c05e14a4fb595f8c47d5fc6317ced738c899172eae7c353ee4c994e67885"} err="failed to get container status \"6fe3c05e14a4fb595f8c47d5fc6317ced738c899172eae7c353ee4c994e67885\": rpc error: code = NotFound desc = could not find container \"6fe3c05e14a4fb595f8c47d5fc6317ced738c899172eae7c353ee4c994e67885\": container with ID starting with 6fe3c05e14a4fb595f8c47d5fc6317ced738c899172eae7c353ee4c994e67885 not found: ID does not exist"
Oct 08 19:29:52 crc kubenswrapper[4750]: I1008 19:29:52.758819 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf522436-a8e2-4904-99d5-4eedc86ece10" path="/var/lib/kubelet/pods/bf522436-a8e2-4904-99d5-4eedc86ece10/volumes"
Oct 08 19:30:00 crc kubenswrapper[4750]: I1008 19:30:00.133484 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332530-2hwwz"]
Oct 08 19:30:00 crc kubenswrapper[4750]: E1008 19:30:00.134758 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf522436-a8e2-4904-99d5-4eedc86ece10" containerName="extract-content"
Oct 08 19:30:00 crc kubenswrapper[4750]: I1008 19:30:00.134780 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf522436-a8e2-4904-99d5-4eedc86ece10" containerName="extract-content"
Oct 08 19:30:00 crc kubenswrapper[4750]: E1008 19:30:00.134802 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf522436-a8e2-4904-99d5-4eedc86ece10" containerName="registry-server"
Oct 08 19:30:00 crc kubenswrapper[4750]: I1008 19:30:00.134810 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf522436-a8e2-4904-99d5-4eedc86ece10" containerName="registry-server"
Oct 08 19:30:00 crc kubenswrapper[4750]: E1008 19:30:00.134837 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf522436-a8e2-4904-99d5-4eedc86ece10" containerName="extract-utilities"
Oct 08 19:30:00 crc kubenswrapper[4750]: I1008 19:30:00.134846 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf522436-a8e2-4904-99d5-4eedc86ece10" containerName="extract-utilities"
Oct 08 19:30:00 crc kubenswrapper[4750]: I1008 19:30:00.135031 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf522436-a8e2-4904-99d5-4eedc86ece10" containerName="registry-server"
Oct 08 19:30:00 crc kubenswrapper[4750]: I1008 19:30:00.135832 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332530-2hwwz"
Oct 08 19:30:00 crc kubenswrapper[4750]: I1008 19:30:00.140295 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 08 19:30:00 crc kubenswrapper[4750]: I1008 19:30:00.141126 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 08 19:30:00 crc kubenswrapper[4750]: I1008 19:30:00.148636 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332530-2hwwz"]
Oct 08 19:30:00 crc kubenswrapper[4750]: I1008 19:30:00.178202 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7wd2\" (UniqueName: \"kubernetes.io/projected/51e6f319-6b3e-4820-bcb4-3ed1fa911614-kube-api-access-s7wd2\") pod \"collect-profiles-29332530-2hwwz\" (UID: \"51e6f319-6b3e-4820-bcb4-3ed1fa911614\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332530-2hwwz"
Oct 08 19:30:00 crc kubenswrapper[4750]: I1008 19:30:00.178350 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/51e6f319-6b3e-4820-bcb4-3ed1fa911614-secret-volume\") pod \"collect-profiles-29332530-2hwwz\" (UID: \"51e6f319-6b3e-4820-bcb4-3ed1fa911614\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332530-2hwwz"
Oct 08 19:30:00 crc kubenswrapper[4750]: I1008 19:30:00.178401 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/51e6f319-6b3e-4820-bcb4-3ed1fa911614-config-volume\") pod \"collect-profiles-29332530-2hwwz\" (UID: \"51e6f319-6b3e-4820-bcb4-3ed1fa911614\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332530-2hwwz"
Oct 08 19:30:00 crc kubenswrapper[4750]: I1008 19:30:00.279326 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/51e6f319-6b3e-4820-bcb4-3ed1fa911614-config-volume\") pod \"collect-profiles-29332530-2hwwz\" (UID: \"51e6f319-6b3e-4820-bcb4-3ed1fa911614\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332530-2hwwz"
Oct 08 19:30:00 crc kubenswrapper[4750]: I1008 19:30:00.279468 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7wd2\" (UniqueName: \"kubernetes.io/projected/51e6f319-6b3e-4820-bcb4-3ed1fa911614-kube-api-access-s7wd2\") pod \"collect-profiles-29332530-2hwwz\" (UID: \"51e6f319-6b3e-4820-bcb4-3ed1fa911614\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332530-2hwwz"
Oct 08 19:30:00 crc kubenswrapper[4750]: I1008 19:30:00.279596 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/51e6f319-6b3e-4820-bcb4-3ed1fa911614-secret-volume\") pod \"collect-profiles-29332530-2hwwz\" (UID: \"51e6f319-6b3e-4820-bcb4-3ed1fa911614\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332530-2hwwz"
Oct 08 19:30:00 crc kubenswrapper[4750]: I1008 19:30:00.280896 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/51e6f319-6b3e-4820-bcb4-3ed1fa911614-config-volume\") pod \"collect-profiles-29332530-2hwwz\" (UID: \"51e6f319-6b3e-4820-bcb4-3ed1fa911614\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332530-2hwwz"
Oct 08 19:30:00 crc kubenswrapper[4750]: I1008 19:30:00.288111 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/51e6f319-6b3e-4820-bcb4-3ed1fa911614-secret-volume\") pod \"collect-profiles-29332530-2hwwz\" (UID: \"51e6f319-6b3e-4820-bcb4-3ed1fa911614\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332530-2hwwz"
Oct 08 19:30:00 crc kubenswrapper[4750]: I1008 19:30:00.299505 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7wd2\" (UniqueName: \"kubernetes.io/projected/51e6f319-6b3e-4820-bcb4-3ed1fa911614-kube-api-access-s7wd2\") pod \"collect-profiles-29332530-2hwwz\" (UID: \"51e6f319-6b3e-4820-bcb4-3ed1fa911614\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332530-2hwwz"
Oct 08 19:30:00 crc kubenswrapper[4750]: I1008 19:30:00.506515 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332530-2hwwz"
Oct 08 19:30:01 crc kubenswrapper[4750]: I1008 19:30:01.001678 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332530-2hwwz"]
Oct 08 19:30:01 crc kubenswrapper[4750]: I1008 19:30:01.083065 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332530-2hwwz" event={"ID":"51e6f319-6b3e-4820-bcb4-3ed1fa911614","Type":"ContainerStarted","Data":"d47ed398a4efc1bb106df748ed945e4b46cac774cfd77cccf9d3c3d89aff14b5"}
Oct 08 19:30:02 crc kubenswrapper[4750]: I1008 19:30:02.093135 4750 generic.go:334] "Generic (PLEG): container finished" podID="51e6f319-6b3e-4820-bcb4-3ed1fa911614" containerID="9fd3f91d8575161a55ceb19d33e987d8453521505dd4f1468dff79dc4f0e10dd" exitCode=0
Oct 08 19:30:02 crc kubenswrapper[4750]: I1008 19:30:02.093241 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332530-2hwwz" event={"ID":"51e6f319-6b3e-4820-bcb4-3ed1fa911614","Type":"ContainerDied","Data":"9fd3f91d8575161a55ceb19d33e987d8453521505dd4f1468dff79dc4f0e10dd"}
Oct 08 19:30:03 crc kubenswrapper[4750]: I1008 19:30:03.473403 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332530-2hwwz"
Oct 08 19:30:03 crc kubenswrapper[4750]: I1008 19:30:03.547528 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/51e6f319-6b3e-4820-bcb4-3ed1fa911614-config-volume\") pod \"51e6f319-6b3e-4820-bcb4-3ed1fa911614\" (UID: \"51e6f319-6b3e-4820-bcb4-3ed1fa911614\") "
Oct 08 19:30:03 crc kubenswrapper[4750]: I1008 19:30:03.548019 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/51e6f319-6b3e-4820-bcb4-3ed1fa911614-secret-volume\") pod \"51e6f319-6b3e-4820-bcb4-3ed1fa911614\" (UID: \"51e6f319-6b3e-4820-bcb4-3ed1fa911614\") "
Oct 08 19:30:03 crc kubenswrapper[4750]: I1008 19:30:03.548062 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7wd2\" (UniqueName: \"kubernetes.io/projected/51e6f319-6b3e-4820-bcb4-3ed1fa911614-kube-api-access-s7wd2\") pod \"51e6f319-6b3e-4820-bcb4-3ed1fa911614\" (UID: \"51e6f319-6b3e-4820-bcb4-3ed1fa911614\") "
Oct 08 19:30:03 crc kubenswrapper[4750]: I1008 19:30:03.550003 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51e6f319-6b3e-4820-bcb4-3ed1fa911614-config-volume" (OuterVolumeSpecName: "config-volume") pod "51e6f319-6b3e-4820-bcb4-3ed1fa911614" (UID: "51e6f319-6b3e-4820-bcb4-3ed1fa911614"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 19:30:03 crc kubenswrapper[4750]: I1008 19:30:03.565882 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51e6f319-6b3e-4820-bcb4-3ed1fa911614-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "51e6f319-6b3e-4820-bcb4-3ed1fa911614" (UID: "51e6f319-6b3e-4820-bcb4-3ed1fa911614"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 19:30:03 crc kubenswrapper[4750]: I1008 19:30:03.565932 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51e6f319-6b3e-4820-bcb4-3ed1fa911614-kube-api-access-s7wd2" (OuterVolumeSpecName: "kube-api-access-s7wd2") pod "51e6f319-6b3e-4820-bcb4-3ed1fa911614" (UID: "51e6f319-6b3e-4820-bcb4-3ed1fa911614"). InnerVolumeSpecName "kube-api-access-s7wd2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 19:30:03 crc kubenswrapper[4750]: I1008 19:30:03.649462 4750 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/51e6f319-6b3e-4820-bcb4-3ed1fa911614-config-volume\") on node \"crc\" DevicePath \"\""
Oct 08 19:30:03 crc kubenswrapper[4750]: I1008 19:30:03.649509 4750 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/51e6f319-6b3e-4820-bcb4-3ed1fa911614-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 08 19:30:03 crc kubenswrapper[4750]: I1008 19:30:03.649520 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7wd2\" (UniqueName: \"kubernetes.io/projected/51e6f319-6b3e-4820-bcb4-3ed1fa911614-kube-api-access-s7wd2\") on node \"crc\" DevicePath \"\""
Oct 08 19:30:03 crc kubenswrapper[4750]: I1008 19:30:03.716212 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8b7964457-qlsvg"]
Oct 08 19:30:03 crc kubenswrapper[4750]: E1008 19:30:03.716810 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51e6f319-6b3e-4820-bcb4-3ed1fa911614" containerName="collect-profiles"
Oct 08 19:30:03 crc kubenswrapper[4750]: I1008 19:30:03.717394 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="51e6f319-6b3e-4820-bcb4-3ed1fa911614" containerName="collect-profiles"
Oct 08 19:30:03 crc kubenswrapper[4750]: I1008 19:30:03.717640 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="51e6f319-6b3e-4820-bcb4-3ed1fa911614" containerName="collect-profiles"
Oct 08 19:30:03 crc kubenswrapper[4750]: I1008 19:30:03.718986 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b7964457-qlsvg"
Oct 08 19:30:03 crc kubenswrapper[4750]: I1008 19:30:03.723835 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Oct 08 19:30:03 crc kubenswrapper[4750]: I1008 19:30:03.724206 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Oct 08 19:30:03 crc kubenswrapper[4750]: I1008 19:30:03.723875 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Oct 08 19:30:03 crc kubenswrapper[4750]: I1008 19:30:03.724077 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Oct 08 19:30:03 crc kubenswrapper[4750]: I1008 19:30:03.724124 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-9tfpd"
Oct 08 19:30:03 crc kubenswrapper[4750]: I1008 19:30:03.739307 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b7964457-qlsvg"]
Oct 08 19:30:03 crc kubenswrapper[4750]: I1008 19:30:03.854377 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c065054-a32c-4a75-81fd-b6172078680e-config\") pod \"dnsmasq-dns-8b7964457-qlsvg\" (UID: \"0c065054-a32c-4a75-81fd-b6172078680e\") " pod="openstack/dnsmasq-dns-8b7964457-qlsvg"
Oct 08 19:30:03 crc kubenswrapper[4750]: I1008 19:30:03.854437 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n95d\" (UniqueName: \"kubernetes.io/projected/0c065054-a32c-4a75-81fd-b6172078680e-kube-api-access-4n95d\") pod \"dnsmasq-dns-8b7964457-qlsvg\" (UID: \"0c065054-a32c-4a75-81fd-b6172078680e\") " pod="openstack/dnsmasq-dns-8b7964457-qlsvg"
Oct 08 19:30:03 crc kubenswrapper[4750]: I1008 19:30:03.854463 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c065054-a32c-4a75-81fd-b6172078680e-dns-svc\") pod \"dnsmasq-dns-8b7964457-qlsvg\" (UID: \"0c065054-a32c-4a75-81fd-b6172078680e\") " pod="openstack/dnsmasq-dns-8b7964457-qlsvg"
Oct 08 19:30:03 crc kubenswrapper[4750]: I1008 19:30:03.955853 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c065054-a32c-4a75-81fd-b6172078680e-config\") pod \"dnsmasq-dns-8b7964457-qlsvg\" (UID: \"0c065054-a32c-4a75-81fd-b6172078680e\") " pod="openstack/dnsmasq-dns-8b7964457-qlsvg"
Oct 08 19:30:03 crc kubenswrapper[4750]: I1008 19:30:03.955919 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n95d\" (UniqueName: \"kubernetes.io/projected/0c065054-a32c-4a75-81fd-b6172078680e-kube-api-access-4n95d\") pod \"dnsmasq-dns-8b7964457-qlsvg\" (UID: \"0c065054-a32c-4a75-81fd-b6172078680e\") " pod="openstack/dnsmasq-dns-8b7964457-qlsvg"
Oct 08 19:30:03 crc kubenswrapper[4750]: I1008 19:30:03.955955 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c065054-a32c-4a75-81fd-b6172078680e-dns-svc\") pod \"dnsmasq-dns-8b7964457-qlsvg\" (UID: \"0c065054-a32c-4a75-81fd-b6172078680e\") " pod="openstack/dnsmasq-dns-8b7964457-qlsvg"
Oct 08 19:30:03 crc kubenswrapper[4750]: I1008 19:30:03.957635 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c065054-a32c-4a75-81fd-b6172078680e-dns-svc\") pod \"dnsmasq-dns-8b7964457-qlsvg\" (UID: \"0c065054-a32c-4a75-81fd-b6172078680e\") " pod="openstack/dnsmasq-dns-8b7964457-qlsvg"
Oct 08
19:30:03 crc kubenswrapper[4750]: I1008 19:30:03.957779 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c065054-a32c-4a75-81fd-b6172078680e-config\") pod \"dnsmasq-dns-8b7964457-qlsvg\" (UID: \"0c065054-a32c-4a75-81fd-b6172078680e\") " pod="openstack/dnsmasq-dns-8b7964457-qlsvg" Oct 08 19:30:03 crc kubenswrapper[4750]: I1008 19:30:03.996206 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n95d\" (UniqueName: \"kubernetes.io/projected/0c065054-a32c-4a75-81fd-b6172078680e-kube-api-access-4n95d\") pod \"dnsmasq-dns-8b7964457-qlsvg\" (UID: \"0c065054-a32c-4a75-81fd-b6172078680e\") " pod="openstack/dnsmasq-dns-8b7964457-qlsvg" Oct 08 19:30:04 crc kubenswrapper[4750]: I1008 19:30:04.028974 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67d9f7fb89-cwjtl"] Oct 08 19:30:04 crc kubenswrapper[4750]: I1008 19:30:04.030256 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67d9f7fb89-cwjtl" Oct 08 19:30:04 crc kubenswrapper[4750]: I1008 19:30:04.045150 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b7964457-qlsvg" Oct 08 19:30:04 crc kubenswrapper[4750]: I1008 19:30:04.056189 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67d9f7fb89-cwjtl"] Oct 08 19:30:04 crc kubenswrapper[4750]: I1008 19:30:04.123927 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332530-2hwwz" event={"ID":"51e6f319-6b3e-4820-bcb4-3ed1fa911614","Type":"ContainerDied","Data":"d47ed398a4efc1bb106df748ed945e4b46cac774cfd77cccf9d3c3d89aff14b5"} Oct 08 19:30:04 crc kubenswrapper[4750]: I1008 19:30:04.123981 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d47ed398a4efc1bb106df748ed945e4b46cac774cfd77cccf9d3c3d89aff14b5" Oct 08 19:30:04 crc kubenswrapper[4750]: I1008 19:30:04.124057 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332530-2hwwz" Oct 08 19:30:04 crc kubenswrapper[4750]: I1008 19:30:04.159085 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f0dd8be-3bb8-49fd-9369-531d32b127ee-dns-svc\") pod \"dnsmasq-dns-67d9f7fb89-cwjtl\" (UID: \"5f0dd8be-3bb8-49fd-9369-531d32b127ee\") " pod="openstack/dnsmasq-dns-67d9f7fb89-cwjtl" Oct 08 19:30:04 crc kubenswrapper[4750]: I1008 19:30:04.159172 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f0dd8be-3bb8-49fd-9369-531d32b127ee-config\") pod \"dnsmasq-dns-67d9f7fb89-cwjtl\" (UID: \"5f0dd8be-3bb8-49fd-9369-531d32b127ee\") " pod="openstack/dnsmasq-dns-67d9f7fb89-cwjtl" Oct 08 19:30:04 crc kubenswrapper[4750]: I1008 19:30:04.159228 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgmvc\" 
(UniqueName: \"kubernetes.io/projected/5f0dd8be-3bb8-49fd-9369-531d32b127ee-kube-api-access-zgmvc\") pod \"dnsmasq-dns-67d9f7fb89-cwjtl\" (UID: \"5f0dd8be-3bb8-49fd-9369-531d32b127ee\") " pod="openstack/dnsmasq-dns-67d9f7fb89-cwjtl" Oct 08 19:30:04 crc kubenswrapper[4750]: I1008 19:30:04.260866 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f0dd8be-3bb8-49fd-9369-531d32b127ee-config\") pod \"dnsmasq-dns-67d9f7fb89-cwjtl\" (UID: \"5f0dd8be-3bb8-49fd-9369-531d32b127ee\") " pod="openstack/dnsmasq-dns-67d9f7fb89-cwjtl" Oct 08 19:30:04 crc kubenswrapper[4750]: I1008 19:30:04.260966 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgmvc\" (UniqueName: \"kubernetes.io/projected/5f0dd8be-3bb8-49fd-9369-531d32b127ee-kube-api-access-zgmvc\") pod \"dnsmasq-dns-67d9f7fb89-cwjtl\" (UID: \"5f0dd8be-3bb8-49fd-9369-531d32b127ee\") " pod="openstack/dnsmasq-dns-67d9f7fb89-cwjtl" Oct 08 19:30:04 crc kubenswrapper[4750]: I1008 19:30:04.261066 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f0dd8be-3bb8-49fd-9369-531d32b127ee-dns-svc\") pod \"dnsmasq-dns-67d9f7fb89-cwjtl\" (UID: \"5f0dd8be-3bb8-49fd-9369-531d32b127ee\") " pod="openstack/dnsmasq-dns-67d9f7fb89-cwjtl" Oct 08 19:30:04 crc kubenswrapper[4750]: I1008 19:30:04.262041 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f0dd8be-3bb8-49fd-9369-531d32b127ee-dns-svc\") pod \"dnsmasq-dns-67d9f7fb89-cwjtl\" (UID: \"5f0dd8be-3bb8-49fd-9369-531d32b127ee\") " pod="openstack/dnsmasq-dns-67d9f7fb89-cwjtl" Oct 08 19:30:04 crc kubenswrapper[4750]: I1008 19:30:04.262075 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f0dd8be-3bb8-49fd-9369-531d32b127ee-config\") pod 
\"dnsmasq-dns-67d9f7fb89-cwjtl\" (UID: \"5f0dd8be-3bb8-49fd-9369-531d32b127ee\") " pod="openstack/dnsmasq-dns-67d9f7fb89-cwjtl" Oct 08 19:30:04 crc kubenswrapper[4750]: I1008 19:30:04.280836 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgmvc\" (UniqueName: \"kubernetes.io/projected/5f0dd8be-3bb8-49fd-9369-531d32b127ee-kube-api-access-zgmvc\") pod \"dnsmasq-dns-67d9f7fb89-cwjtl\" (UID: \"5f0dd8be-3bb8-49fd-9369-531d32b127ee\") " pod="openstack/dnsmasq-dns-67d9f7fb89-cwjtl" Oct 08 19:30:04 crc kubenswrapper[4750]: I1008 19:30:04.356701 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67d9f7fb89-cwjtl" Oct 08 19:30:04 crc kubenswrapper[4750]: I1008 19:30:04.547482 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b7964457-qlsvg"] Oct 08 19:30:04 crc kubenswrapper[4750]: I1008 19:30:04.586757 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332485-9g9tq"] Oct 08 19:30:04 crc kubenswrapper[4750]: I1008 19:30:04.592311 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332485-9g9tq"] Oct 08 19:30:04 crc kubenswrapper[4750]: I1008 19:30:04.752089 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6df48bfd-4f65-4ed8-92ef-0a3c4c4f6be8" path="/var/lib/kubelet/pods/6df48bfd-4f65-4ed8-92ef-0a3c4c4f6be8/volumes" Oct 08 19:30:04 crc kubenswrapper[4750]: I1008 19:30:04.833869 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67d9f7fb89-cwjtl"] Oct 08 19:30:04 crc kubenswrapper[4750]: I1008 19:30:04.856445 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 19:30:04 crc kubenswrapper[4750]: I1008 19:30:04.858260 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:30:04 crc kubenswrapper[4750]: I1008 19:30:04.862763 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 08 19:30:04 crc kubenswrapper[4750]: W1008 19:30:04.866675 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f0dd8be_3bb8_49fd_9369_531d32b127ee.slice/crio-369d008d917d19a17a0187c6f90e2e9c9b607eee4d11dc90d4119ba425bc5006 WatchSource:0}: Error finding container 369d008d917d19a17a0187c6f90e2e9c9b607eee4d11dc90d4119ba425bc5006: Status 404 returned error can't find the container with id 369d008d917d19a17a0187c6f90e2e9c9b607eee4d11dc90d4119ba425bc5006 Oct 08 19:30:04 crc kubenswrapper[4750]: I1008 19:30:04.869765 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 08 19:30:04 crc kubenswrapper[4750]: I1008 19:30:04.869856 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-k4pqw" Oct 08 19:30:04 crc kubenswrapper[4750]: I1008 19:30:04.870418 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 08 19:30:04 crc kubenswrapper[4750]: I1008 19:30:04.870640 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 08 19:30:04 crc kubenswrapper[4750]: I1008 19:30:04.890920 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 19:30:04 crc kubenswrapper[4750]: I1008 19:30:04.973541 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/365d4e67-6b18-4a2c-8547-6eda3058dcab-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"365d4e67-6b18-4a2c-8547-6eda3058dcab\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:30:04 crc kubenswrapper[4750]: I1008 19:30:04.973682 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/365d4e67-6b18-4a2c-8547-6eda3058dcab-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"365d4e67-6b18-4a2c-8547-6eda3058dcab\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:30:04 crc kubenswrapper[4750]: I1008 19:30:04.973863 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/365d4e67-6b18-4a2c-8547-6eda3058dcab-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"365d4e67-6b18-4a2c-8547-6eda3058dcab\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:30:04 crc kubenswrapper[4750]: I1008 19:30:04.973934 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e953d341-e6e5-4c4b-b484-ddc33671479f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e953d341-e6e5-4c4b-b484-ddc33671479f\") pod \"rabbitmq-cell1-server-0\" (UID: \"365d4e67-6b18-4a2c-8547-6eda3058dcab\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:30:04 crc kubenswrapper[4750]: I1008 19:30:04.974001 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/365d4e67-6b18-4a2c-8547-6eda3058dcab-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"365d4e67-6b18-4a2c-8547-6eda3058dcab\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:30:04 crc kubenswrapper[4750]: I1008 19:30:04.974025 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/365d4e67-6b18-4a2c-8547-6eda3058dcab-rabbitmq-confd\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"365d4e67-6b18-4a2c-8547-6eda3058dcab\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:30:04 crc kubenswrapper[4750]: I1008 19:30:04.974077 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-545cj\" (UniqueName: \"kubernetes.io/projected/365d4e67-6b18-4a2c-8547-6eda3058dcab-kube-api-access-545cj\") pod \"rabbitmq-cell1-server-0\" (UID: \"365d4e67-6b18-4a2c-8547-6eda3058dcab\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:30:04 crc kubenswrapper[4750]: I1008 19:30:04.974105 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/365d4e67-6b18-4a2c-8547-6eda3058dcab-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"365d4e67-6b18-4a2c-8547-6eda3058dcab\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:30:04 crc kubenswrapper[4750]: I1008 19:30:04.974187 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/365d4e67-6b18-4a2c-8547-6eda3058dcab-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"365d4e67-6b18-4a2c-8547-6eda3058dcab\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.076705 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/365d4e67-6b18-4a2c-8547-6eda3058dcab-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"365d4e67-6b18-4a2c-8547-6eda3058dcab\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.076765 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/365d4e67-6b18-4a2c-8547-6eda3058dcab-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"365d4e67-6b18-4a2c-8547-6eda3058dcab\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.076798 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-545cj\" (UniqueName: \"kubernetes.io/projected/365d4e67-6b18-4a2c-8547-6eda3058dcab-kube-api-access-545cj\") pod \"rabbitmq-cell1-server-0\" (UID: \"365d4e67-6b18-4a2c-8547-6eda3058dcab\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.076838 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/365d4e67-6b18-4a2c-8547-6eda3058dcab-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"365d4e67-6b18-4a2c-8547-6eda3058dcab\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.076880 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/365d4e67-6b18-4a2c-8547-6eda3058dcab-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"365d4e67-6b18-4a2c-8547-6eda3058dcab\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.076919 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/365d4e67-6b18-4a2c-8547-6eda3058dcab-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"365d4e67-6b18-4a2c-8547-6eda3058dcab\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.076943 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/365d4e67-6b18-4a2c-8547-6eda3058dcab-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"365d4e67-6b18-4a2c-8547-6eda3058dcab\") " 
pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.076994 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/365d4e67-6b18-4a2c-8547-6eda3058dcab-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"365d4e67-6b18-4a2c-8547-6eda3058dcab\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.077030 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e953d341-e6e5-4c4b-b484-ddc33671479f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e953d341-e6e5-4c4b-b484-ddc33671479f\") pod \"rabbitmq-cell1-server-0\" (UID: \"365d4e67-6b18-4a2c-8547-6eda3058dcab\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.077929 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/365d4e67-6b18-4a2c-8547-6eda3058dcab-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"365d4e67-6b18-4a2c-8547-6eda3058dcab\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.078269 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/365d4e67-6b18-4a2c-8547-6eda3058dcab-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"365d4e67-6b18-4a2c-8547-6eda3058dcab\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.078390 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/365d4e67-6b18-4a2c-8547-6eda3058dcab-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"365d4e67-6b18-4a2c-8547-6eda3058dcab\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:30:05 crc 
kubenswrapper[4750]: I1008 19:30:05.079143 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/365d4e67-6b18-4a2c-8547-6eda3058dcab-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"365d4e67-6b18-4a2c-8547-6eda3058dcab\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.082639 4750 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.082696 4750 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e953d341-e6e5-4c4b-b484-ddc33671479f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e953d341-e6e5-4c4b-b484-ddc33671479f\") pod \"rabbitmq-cell1-server-0\" (UID: \"365d4e67-6b18-4a2c-8547-6eda3058dcab\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9e2ad084b98242fe6b123829a7d0147f7766d5d7d1467ca61afb3d9648652ac0/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.092961 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/365d4e67-6b18-4a2c-8547-6eda3058dcab-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"365d4e67-6b18-4a2c-8547-6eda3058dcab\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.093027 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/365d4e67-6b18-4a2c-8547-6eda3058dcab-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"365d4e67-6b18-4a2c-8547-6eda3058dcab\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.093261 4750 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/365d4e67-6b18-4a2c-8547-6eda3058dcab-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"365d4e67-6b18-4a2c-8547-6eda3058dcab\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.103528 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-545cj\" (UniqueName: \"kubernetes.io/projected/365d4e67-6b18-4a2c-8547-6eda3058dcab-kube-api-access-545cj\") pod \"rabbitmq-cell1-server-0\" (UID: \"365d4e67-6b18-4a2c-8547-6eda3058dcab\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.135222 4750 generic.go:334] "Generic (PLEG): container finished" podID="0c065054-a32c-4a75-81fd-b6172078680e" containerID="e409dcf56fe67fb37c9dee925799c2f2cd7cfa12d984c81d443235224761bde9" exitCode=0 Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.135360 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b7964457-qlsvg" event={"ID":"0c065054-a32c-4a75-81fd-b6172078680e","Type":"ContainerDied","Data":"e409dcf56fe67fb37c9dee925799c2f2cd7cfa12d984c81d443235224761bde9"} Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.135402 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b7964457-qlsvg" event={"ID":"0c065054-a32c-4a75-81fd-b6172078680e","Type":"ContainerStarted","Data":"b1e925bec15ff5e12a0fb95ae9309be9446f780529756e017b8cf16b059f7135"} Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.138973 4750 generic.go:334] "Generic (PLEG): container finished" podID="5f0dd8be-3bb8-49fd-9369-531d32b127ee" containerID="0e4b59236424900e84370a42f23c23f42468cb607078c329f694512529598dea" exitCode=0 Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.139110 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67d9f7fb89-cwjtl" 
event={"ID":"5f0dd8be-3bb8-49fd-9369-531d32b127ee","Type":"ContainerDied","Data":"0e4b59236424900e84370a42f23c23f42468cb607078c329f694512529598dea"} Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.139150 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67d9f7fb89-cwjtl" event={"ID":"5f0dd8be-3bb8-49fd-9369-531d32b127ee","Type":"ContainerStarted","Data":"369d008d917d19a17a0187c6f90e2e9c9b607eee4d11dc90d4119ba425bc5006"} Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.145489 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e953d341-e6e5-4c4b-b484-ddc33671479f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e953d341-e6e5-4c4b-b484-ddc33671479f\") pod \"rabbitmq-cell1-server-0\" (UID: \"365d4e67-6b18-4a2c-8547-6eda3058dcab\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.178142 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.238494 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.240273 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.244959 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.245279 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.245491 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.245619 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-q94qm" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.245859 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.247265 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.281629 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/78303b30-ff92-45ac-8a55-4d80f6878e9b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"78303b30-ff92-45ac-8a55-4d80f6878e9b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.281677 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/78303b30-ff92-45ac-8a55-4d80f6878e9b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"78303b30-ff92-45ac-8a55-4d80f6878e9b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.281724 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/78303b30-ff92-45ac-8a55-4d80f6878e9b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"78303b30-ff92-45ac-8a55-4d80f6878e9b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.281750 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/78303b30-ff92-45ac-8a55-4d80f6878e9b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"78303b30-ff92-45ac-8a55-4d80f6878e9b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.281769 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4s4vm\" (UniqueName: \"kubernetes.io/projected/78303b30-ff92-45ac-8a55-4d80f6878e9b-kube-api-access-4s4vm\") pod \"rabbitmq-server-0\" (UID: \"78303b30-ff92-45ac-8a55-4d80f6878e9b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.281791 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/78303b30-ff92-45ac-8a55-4d80f6878e9b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"78303b30-ff92-45ac-8a55-4d80f6878e9b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.281819 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/78303b30-ff92-45ac-8a55-4d80f6878e9b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"78303b30-ff92-45ac-8a55-4d80f6878e9b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.281840 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/78303b30-ff92-45ac-8a55-4d80f6878e9b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"78303b30-ff92-45ac-8a55-4d80f6878e9b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.281865 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7e321cf3-b945-437d-a328-e9147c97102c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7e321cf3-b945-437d-a328-e9147c97102c\") pod \"rabbitmq-server-0\" (UID: \"78303b30-ff92-45ac-8a55-4d80f6878e9b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.383977 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/78303b30-ff92-45ac-8a55-4d80f6878e9b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"78303b30-ff92-45ac-8a55-4d80f6878e9b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.384422 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/78303b30-ff92-45ac-8a55-4d80f6878e9b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"78303b30-ff92-45ac-8a55-4d80f6878e9b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.384451 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4s4vm\" (UniqueName: \"kubernetes.io/projected/78303b30-ff92-45ac-8a55-4d80f6878e9b-kube-api-access-4s4vm\") pod \"rabbitmq-server-0\" (UID: \"78303b30-ff92-45ac-8a55-4d80f6878e9b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.384482 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/78303b30-ff92-45ac-8a55-4d80f6878e9b-pod-info\") pod 
\"rabbitmq-server-0\" (UID: \"78303b30-ff92-45ac-8a55-4d80f6878e9b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.384538 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/78303b30-ff92-45ac-8a55-4d80f6878e9b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"78303b30-ff92-45ac-8a55-4d80f6878e9b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.384580 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/78303b30-ff92-45ac-8a55-4d80f6878e9b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"78303b30-ff92-45ac-8a55-4d80f6878e9b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.384612 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7e321cf3-b945-437d-a328-e9147c97102c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7e321cf3-b945-437d-a328-e9147c97102c\") pod \"rabbitmq-server-0\" (UID: \"78303b30-ff92-45ac-8a55-4d80f6878e9b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.384672 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/78303b30-ff92-45ac-8a55-4d80f6878e9b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"78303b30-ff92-45ac-8a55-4d80f6878e9b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.384694 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/78303b30-ff92-45ac-8a55-4d80f6878e9b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"78303b30-ff92-45ac-8a55-4d80f6878e9b\") " 
pod="openstack/rabbitmq-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.385342 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/78303b30-ff92-45ac-8a55-4d80f6878e9b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"78303b30-ff92-45ac-8a55-4d80f6878e9b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.386771 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/78303b30-ff92-45ac-8a55-4d80f6878e9b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"78303b30-ff92-45ac-8a55-4d80f6878e9b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.387072 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/78303b30-ff92-45ac-8a55-4d80f6878e9b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"78303b30-ff92-45ac-8a55-4d80f6878e9b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.387348 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/78303b30-ff92-45ac-8a55-4d80f6878e9b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"78303b30-ff92-45ac-8a55-4d80f6878e9b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.397975 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/78303b30-ff92-45ac-8a55-4d80f6878e9b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"78303b30-ff92-45ac-8a55-4d80f6878e9b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.400149 4750 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability 
not set. Skipping MountDevice... Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.400213 4750 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7e321cf3-b945-437d-a328-e9147c97102c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7e321cf3-b945-437d-a328-e9147c97102c\") pod \"rabbitmq-server-0\" (UID: \"78303b30-ff92-45ac-8a55-4d80f6878e9b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f77bda10fe1591ba937c9cd19f213805df9569050f04fd63e1479b33d1f0a8cd/globalmount\"" pod="openstack/rabbitmq-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.401934 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/78303b30-ff92-45ac-8a55-4d80f6878e9b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"78303b30-ff92-45ac-8a55-4d80f6878e9b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.423008 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/78303b30-ff92-45ac-8a55-4d80f6878e9b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"78303b30-ff92-45ac-8a55-4d80f6878e9b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.426267 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4s4vm\" (UniqueName: \"kubernetes.io/projected/78303b30-ff92-45ac-8a55-4d80f6878e9b-kube-api-access-4s4vm\") pod \"rabbitmq-server-0\" (UID: \"78303b30-ff92-45ac-8a55-4d80f6878e9b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.468699 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7e321cf3-b945-437d-a328-e9147c97102c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7e321cf3-b945-437d-a328-e9147c97102c\") pod 
\"rabbitmq-server-0\" (UID: \"78303b30-ff92-45ac-8a55-4d80f6878e9b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.588105 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 08 19:30:05 crc kubenswrapper[4750]: I1008 19:30:05.700518 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.113290 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.116398 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.120306 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-pf5km" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.120602 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.120802 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.122329 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.122876 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.125091 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.130578 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.166424 4750 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b7964457-qlsvg" event={"ID":"0c065054-a32c-4a75-81fd-b6172078680e","Type":"ContainerStarted","Data":"28bb15e3ed813e20283a588d68e0e1391b9a804cac74ff4646f70705a677a6ab"} Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.167037 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8b7964457-qlsvg" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.172750 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67d9f7fb89-cwjtl" event={"ID":"5f0dd8be-3bb8-49fd-9369-531d32b127ee","Type":"ContainerStarted","Data":"b9ee3e82e7e3c0d9b4ee8365fe6176b5fbdebf027e57d999659e0273b378280a"} Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.173040 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67d9f7fb89-cwjtl" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.175291 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"365d4e67-6b18-4a2c-8547-6eda3058dcab","Type":"ContainerStarted","Data":"8467ed7c634129a9841e6c2dcd480b2a2d60ee647043dad616ceb3d6fead8472"} Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.187581 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8b7964457-qlsvg" podStartSLOduration=3.187519367 podStartE2EDuration="3.187519367s" podCreationTimestamp="2025-10-08 19:30:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:30:06.185918026 +0000 UTC m=+4762.098889039" watchObservedRunningTime="2025-10-08 19:30:06.187519367 +0000 UTC m=+4762.100490380" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.197080 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/d258d002-5471-49c2-b43b-557992058385-config-data-default\") pod \"openstack-galera-0\" (UID: \"d258d002-5471-49c2-b43b-557992058385\") " pod="openstack/openstack-galera-0" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.197147 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btnr6\" (UniqueName: \"kubernetes.io/projected/d258d002-5471-49c2-b43b-557992058385-kube-api-access-btnr6\") pod \"openstack-galera-0\" (UID: \"d258d002-5471-49c2-b43b-557992058385\") " pod="openstack/openstack-galera-0" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.197179 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d258d002-5471-49c2-b43b-557992058385-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d258d002-5471-49c2-b43b-557992058385\") " pod="openstack/openstack-galera-0" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.197407 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2d938e63-bc9e-49f8-85c5-af797b16929f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2d938e63-bc9e-49f8-85c5-af797b16929f\") pod \"openstack-galera-0\" (UID: \"d258d002-5471-49c2-b43b-557992058385\") " pod="openstack/openstack-galera-0" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.197506 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d258d002-5471-49c2-b43b-557992058385-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d258d002-5471-49c2-b43b-557992058385\") " pod="openstack/openstack-galera-0" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.197619 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" 
(UniqueName: \"kubernetes.io/secret/d258d002-5471-49c2-b43b-557992058385-secrets\") pod \"openstack-galera-0\" (UID: \"d258d002-5471-49c2-b43b-557992058385\") " pod="openstack/openstack-galera-0" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.197776 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d258d002-5471-49c2-b43b-557992058385-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d258d002-5471-49c2-b43b-557992058385\") " pod="openstack/openstack-galera-0" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.197840 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d258d002-5471-49c2-b43b-557992058385-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d258d002-5471-49c2-b43b-557992058385\") " pod="openstack/openstack-galera-0" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.197924 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d258d002-5471-49c2-b43b-557992058385-kolla-config\") pod \"openstack-galera-0\" (UID: \"d258d002-5471-49c2-b43b-557992058385\") " pod="openstack/openstack-galera-0" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.204501 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67d9f7fb89-cwjtl" podStartSLOduration=3.204476513 podStartE2EDuration="3.204476513s" podCreationTimestamp="2025-10-08 19:30:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:30:06.202335678 +0000 UTC m=+4762.115306691" watchObservedRunningTime="2025-10-08 19:30:06.204476513 +0000 UTC m=+4762.117447526" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.300191 4750 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d258d002-5471-49c2-b43b-557992058385-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d258d002-5471-49c2-b43b-557992058385\") " pod="openstack/openstack-galera-0" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.300285 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d258d002-5471-49c2-b43b-557992058385-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d258d002-5471-49c2-b43b-557992058385\") " pod="openstack/openstack-galera-0" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.300350 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d258d002-5471-49c2-b43b-557992058385-kolla-config\") pod \"openstack-galera-0\" (UID: \"d258d002-5471-49c2-b43b-557992058385\") " pod="openstack/openstack-galera-0" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.300425 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d258d002-5471-49c2-b43b-557992058385-config-data-default\") pod \"openstack-galera-0\" (UID: \"d258d002-5471-49c2-b43b-557992058385\") " pod="openstack/openstack-galera-0" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.300458 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btnr6\" (UniqueName: \"kubernetes.io/projected/d258d002-5471-49c2-b43b-557992058385-kube-api-access-btnr6\") pod \"openstack-galera-0\" (UID: \"d258d002-5471-49c2-b43b-557992058385\") " pod="openstack/openstack-galera-0" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.300521 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/d258d002-5471-49c2-b43b-557992058385-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d258d002-5471-49c2-b43b-557992058385\") " pod="openstack/openstack-galera-0" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.300590 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2d938e63-bc9e-49f8-85c5-af797b16929f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2d938e63-bc9e-49f8-85c5-af797b16929f\") pod \"openstack-galera-0\" (UID: \"d258d002-5471-49c2-b43b-557992058385\") " pod="openstack/openstack-galera-0" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.300655 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d258d002-5471-49c2-b43b-557992058385-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d258d002-5471-49c2-b43b-557992058385\") " pod="openstack/openstack-galera-0" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.300687 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/d258d002-5471-49c2-b43b-557992058385-secrets\") pod \"openstack-galera-0\" (UID: \"d258d002-5471-49c2-b43b-557992058385\") " pod="openstack/openstack-galera-0" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.302114 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d258d002-5471-49c2-b43b-557992058385-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d258d002-5471-49c2-b43b-557992058385\") " pod="openstack/openstack-galera-0" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.302294 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d258d002-5471-49c2-b43b-557992058385-kolla-config\") pod \"openstack-galera-0\" (UID: 
\"d258d002-5471-49c2-b43b-557992058385\") " pod="openstack/openstack-galera-0" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.303061 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d258d002-5471-49c2-b43b-557992058385-config-data-default\") pod \"openstack-galera-0\" (UID: \"d258d002-5471-49c2-b43b-557992058385\") " pod="openstack/openstack-galera-0" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.303165 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d258d002-5471-49c2-b43b-557992058385-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d258d002-5471-49c2-b43b-557992058385\") " pod="openstack/openstack-galera-0" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.306048 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/d258d002-5471-49c2-b43b-557992058385-secrets\") pod \"openstack-galera-0\" (UID: \"d258d002-5471-49c2-b43b-557992058385\") " pod="openstack/openstack-galera-0" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.307811 4750 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.307894 4750 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2d938e63-bc9e-49f8-85c5-af797b16929f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2d938e63-bc9e-49f8-85c5-af797b16929f\") pod \"openstack-galera-0\" (UID: \"d258d002-5471-49c2-b43b-557992058385\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/58e19d1c5d90fceccd79cf0632996b0d29336a1dde3739668823d07c34ca7797/globalmount\"" pod="openstack/openstack-galera-0" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.311296 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d258d002-5471-49c2-b43b-557992058385-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d258d002-5471-49c2-b43b-557992058385\") " pod="openstack/openstack-galera-0" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.321018 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d258d002-5471-49c2-b43b-557992058385-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d258d002-5471-49c2-b43b-557992058385\") " pod="openstack/openstack-galera-0" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.322933 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btnr6\" (UniqueName: \"kubernetes.io/projected/d258d002-5471-49c2-b43b-557992058385-kube-api-access-btnr6\") pod \"openstack-galera-0\" (UID: \"d258d002-5471-49c2-b43b-557992058385\") " pod="openstack/openstack-galera-0" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.355796 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2d938e63-bc9e-49f8-85c5-af797b16929f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2d938e63-bc9e-49f8-85c5-af797b16929f\") pod 
\"openstack-galera-0\" (UID: \"d258d002-5471-49c2-b43b-557992058385\") " pod="openstack/openstack-galera-0" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.369126 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 19:30:06 crc kubenswrapper[4750]: W1008 19:30:06.385835 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78303b30_ff92_45ac_8a55_4d80f6878e9b.slice/crio-3b0efee5db5c8886185a76ff16f620e0b1d9a5ac006f85d6643babc474f1f044 WatchSource:0}: Error finding container 3b0efee5db5c8886185a76ff16f620e0b1d9a5ac006f85d6643babc474f1f044: Status 404 returned error can't find the container with id 3b0efee5db5c8886185a76ff16f620e0b1d9a5ac006f85d6643babc474f1f044 Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.455021 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.465462 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.467415 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.472037 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.476990 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-xmgtq" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.494368 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.608879 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7mgx\" (UniqueName: \"kubernetes.io/projected/7454840f-78e3-41f6-a91b-2d34a96d5090-kube-api-access-l7mgx\") pod \"memcached-0\" (UID: \"7454840f-78e3-41f6-a91b-2d34a96d5090\") " pod="openstack/memcached-0" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.609036 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7454840f-78e3-41f6-a91b-2d34a96d5090-kolla-config\") pod \"memcached-0\" (UID: \"7454840f-78e3-41f6-a91b-2d34a96d5090\") " pod="openstack/memcached-0" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.609097 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7454840f-78e3-41f6-a91b-2d34a96d5090-config-data\") pod \"memcached-0\" (UID: \"7454840f-78e3-41f6-a91b-2d34a96d5090\") " pod="openstack/memcached-0" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.710229 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7454840f-78e3-41f6-a91b-2d34a96d5090-kolla-config\") pod \"memcached-0\" (UID: \"7454840f-78e3-41f6-a91b-2d34a96d5090\") " 
pod="openstack/memcached-0" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.710615 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7454840f-78e3-41f6-a91b-2d34a96d5090-config-data\") pod \"memcached-0\" (UID: \"7454840f-78e3-41f6-a91b-2d34a96d5090\") " pod="openstack/memcached-0" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.710661 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7mgx\" (UniqueName: \"kubernetes.io/projected/7454840f-78e3-41f6-a91b-2d34a96d5090-kube-api-access-l7mgx\") pod \"memcached-0\" (UID: \"7454840f-78e3-41f6-a91b-2d34a96d5090\") " pod="openstack/memcached-0" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.711829 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7454840f-78e3-41f6-a91b-2d34a96d5090-config-data\") pod \"memcached-0\" (UID: \"7454840f-78e3-41f6-a91b-2d34a96d5090\") " pod="openstack/memcached-0" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.711835 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7454840f-78e3-41f6-a91b-2d34a96d5090-kolla-config\") pod \"memcached-0\" (UID: \"7454840f-78e3-41f6-a91b-2d34a96d5090\") " pod="openstack/memcached-0" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.809051 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7mgx\" (UniqueName: \"kubernetes.io/projected/7454840f-78e3-41f6-a91b-2d34a96d5090-kube-api-access-l7mgx\") pod \"memcached-0\" (UID: \"7454840f-78e3-41f6-a91b-2d34a96d5090\") " pod="openstack/memcached-0" Oct 08 19:30:06 crc kubenswrapper[4750]: I1008 19:30:06.889052 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Oct 08 19:30:07 crc kubenswrapper[4750]: I1008 19:30:07.002922 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 08 19:30:07 crc kubenswrapper[4750]: W1008 19:30:07.012780 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd258d002_5471_49c2_b43b_557992058385.slice/crio-19ae18350f84bf869616075cb26c3b9107e668e3b90d73dd3f514e4c102f448d WatchSource:0}: Error finding container 19ae18350f84bf869616075cb26c3b9107e668e3b90d73dd3f514e4c102f448d: Status 404 returned error can't find the container with id 19ae18350f84bf869616075cb26c3b9107e668e3b90d73dd3f514e4c102f448d Oct 08 19:30:07 crc kubenswrapper[4750]: I1008 19:30:07.183151 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"78303b30-ff92-45ac-8a55-4d80f6878e9b","Type":"ContainerStarted","Data":"3b0efee5db5c8886185a76ff16f620e0b1d9a5ac006f85d6643babc474f1f044"} Oct 08 19:30:07 crc kubenswrapper[4750]: I1008 19:30:07.185411 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d258d002-5471-49c2-b43b-557992058385","Type":"ContainerStarted","Data":"19ae18350f84bf869616075cb26c3b9107e668e3b90d73dd3f514e4c102f448d"} Oct 08 19:30:07 crc kubenswrapper[4750]: I1008 19:30:07.432251 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 08 19:30:07 crc kubenswrapper[4750]: W1008 19:30:07.439894 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7454840f_78e3_41f6_a91b_2d34a96d5090.slice/crio-46fddf0c8c717e8fd3ce2c75a5cd06364e84785d93bdf011b2573df47f2d7b2d WatchSource:0}: Error finding container 46fddf0c8c717e8fd3ce2c75a5cd06364e84785d93bdf011b2573df47f2d7b2d: Status 404 returned error can't find the container with id 
46fddf0c8c717e8fd3ce2c75a5cd06364e84785d93bdf011b2573df47f2d7b2d Oct 08 19:30:07 crc kubenswrapper[4750]: I1008 19:30:07.516456 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 08 19:30:07 crc kubenswrapper[4750]: I1008 19:30:07.521872 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 08 19:30:07 crc kubenswrapper[4750]: I1008 19:30:07.525442 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-8j5kj" Oct 08 19:30:07 crc kubenswrapper[4750]: I1008 19:30:07.528213 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 08 19:30:07 crc kubenswrapper[4750]: I1008 19:30:07.528431 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 08 19:30:07 crc kubenswrapper[4750]: I1008 19:30:07.528517 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 08 19:30:07 crc kubenswrapper[4750]: I1008 19:30:07.530754 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 08 19:30:07 crc kubenswrapper[4750]: I1008 19:30:07.628543 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16db7ca3-b357-4651-897e-1329b6b0b4d9-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"16db7ca3-b357-4651-897e-1329b6b0b4d9\") " pod="openstack/openstack-cell1-galera-0" Oct 08 19:30:07 crc kubenswrapper[4750]: I1008 19:30:07.628626 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c5bfd24e-39fa-45d1-868f-ad10e518c42a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c5bfd24e-39fa-45d1-868f-ad10e518c42a\") pod 
\"openstack-cell1-galera-0\" (UID: \"16db7ca3-b357-4651-897e-1329b6b0b4d9\") " pod="openstack/openstack-cell1-galera-0" Oct 08 19:30:07 crc kubenswrapper[4750]: I1008 19:30:07.628662 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16db7ca3-b357-4651-897e-1329b6b0b4d9-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"16db7ca3-b357-4651-897e-1329b6b0b4d9\") " pod="openstack/openstack-cell1-galera-0" Oct 08 19:30:07 crc kubenswrapper[4750]: I1008 19:30:07.628677 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/16db7ca3-b357-4651-897e-1329b6b0b4d9-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"16db7ca3-b357-4651-897e-1329b6b0b4d9\") " pod="openstack/openstack-cell1-galera-0" Oct 08 19:30:07 crc kubenswrapper[4750]: I1008 19:30:07.628740 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/16db7ca3-b357-4651-897e-1329b6b0b4d9-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"16db7ca3-b357-4651-897e-1329b6b0b4d9\") " pod="openstack/openstack-cell1-galera-0" Oct 08 19:30:07 crc kubenswrapper[4750]: I1008 19:30:07.628847 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/16db7ca3-b357-4651-897e-1329b6b0b4d9-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"16db7ca3-b357-4651-897e-1329b6b0b4d9\") " pod="openstack/openstack-cell1-galera-0" Oct 08 19:30:07 crc kubenswrapper[4750]: I1008 19:30:07.628887 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkmw4\" (UniqueName: 
\"kubernetes.io/projected/16db7ca3-b357-4651-897e-1329b6b0b4d9-kube-api-access-xkmw4\") pod \"openstack-cell1-galera-0\" (UID: \"16db7ca3-b357-4651-897e-1329b6b0b4d9\") " pod="openstack/openstack-cell1-galera-0" Oct 08 19:30:07 crc kubenswrapper[4750]: I1008 19:30:07.628917 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/16db7ca3-b357-4651-897e-1329b6b0b4d9-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"16db7ca3-b357-4651-897e-1329b6b0b4d9\") " pod="openstack/openstack-cell1-galera-0" Oct 08 19:30:07 crc kubenswrapper[4750]: I1008 19:30:07.628990 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/16db7ca3-b357-4651-897e-1329b6b0b4d9-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"16db7ca3-b357-4651-897e-1329b6b0b4d9\") " pod="openstack/openstack-cell1-galera-0" Oct 08 19:30:07 crc kubenswrapper[4750]: I1008 19:30:07.730784 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/16db7ca3-b357-4651-897e-1329b6b0b4d9-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"16db7ca3-b357-4651-897e-1329b6b0b4d9\") " pod="openstack/openstack-cell1-galera-0" Oct 08 19:30:07 crc kubenswrapper[4750]: I1008 19:30:07.730844 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/16db7ca3-b357-4651-897e-1329b6b0b4d9-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"16db7ca3-b357-4651-897e-1329b6b0b4d9\") " pod="openstack/openstack-cell1-galera-0" Oct 08 19:30:07 crc kubenswrapper[4750]: I1008 19:30:07.730865 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkmw4\" (UniqueName: 
\"kubernetes.io/projected/16db7ca3-b357-4651-897e-1329b6b0b4d9-kube-api-access-xkmw4\") pod \"openstack-cell1-galera-0\" (UID: \"16db7ca3-b357-4651-897e-1329b6b0b4d9\") " pod="openstack/openstack-cell1-galera-0" Oct 08 19:30:07 crc kubenswrapper[4750]: I1008 19:30:07.730883 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/16db7ca3-b357-4651-897e-1329b6b0b4d9-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"16db7ca3-b357-4651-897e-1329b6b0b4d9\") " pod="openstack/openstack-cell1-galera-0" Oct 08 19:30:07 crc kubenswrapper[4750]: I1008 19:30:07.731283 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/16db7ca3-b357-4651-897e-1329b6b0b4d9-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"16db7ca3-b357-4651-897e-1329b6b0b4d9\") " pod="openstack/openstack-cell1-galera-0" Oct 08 19:30:07 crc kubenswrapper[4750]: I1008 19:30:07.732676 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/16db7ca3-b357-4651-897e-1329b6b0b4d9-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"16db7ca3-b357-4651-897e-1329b6b0b4d9\") " pod="openstack/openstack-cell1-galera-0" Oct 08 19:30:07 crc kubenswrapper[4750]: I1008 19:30:07.733308 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/16db7ca3-b357-4651-897e-1329b6b0b4d9-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"16db7ca3-b357-4651-897e-1329b6b0b4d9\") " pod="openstack/openstack-cell1-galera-0" Oct 08 19:30:07 crc kubenswrapper[4750]: I1008 19:30:07.733787 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16db7ca3-b357-4651-897e-1329b6b0b4d9-operator-scripts\") pod 
\"openstack-cell1-galera-0\" (UID: \"16db7ca3-b357-4651-897e-1329b6b0b4d9\") " pod="openstack/openstack-cell1-galera-0" Oct 08 19:30:07 crc kubenswrapper[4750]: I1008 19:30:07.733925 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c5bfd24e-39fa-45d1-868f-ad10e518c42a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c5bfd24e-39fa-45d1-868f-ad10e518c42a\") pod \"openstack-cell1-galera-0\" (UID: \"16db7ca3-b357-4651-897e-1329b6b0b4d9\") " pod="openstack/openstack-cell1-galera-0" Oct 08 19:30:07 crc kubenswrapper[4750]: I1008 19:30:07.733998 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16db7ca3-b357-4651-897e-1329b6b0b4d9-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"16db7ca3-b357-4651-897e-1329b6b0b4d9\") " pod="openstack/openstack-cell1-galera-0" Oct 08 19:30:07 crc kubenswrapper[4750]: I1008 19:30:07.734022 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/16db7ca3-b357-4651-897e-1329b6b0b4d9-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"16db7ca3-b357-4651-897e-1329b6b0b4d9\") " pod="openstack/openstack-cell1-galera-0" Oct 08 19:30:07 crc kubenswrapper[4750]: I1008 19:30:07.734999 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/16db7ca3-b357-4651-897e-1329b6b0b4d9-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"16db7ca3-b357-4651-897e-1329b6b0b4d9\") " pod="openstack/openstack-cell1-galera-0" Oct 08 19:30:07 crc kubenswrapper[4750]: I1008 19:30:07.737071 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16db7ca3-b357-4651-897e-1329b6b0b4d9-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: 
\"16db7ca3-b357-4651-897e-1329b6b0b4d9\") " pod="openstack/openstack-cell1-galera-0" Oct 08 19:30:07 crc kubenswrapper[4750]: I1008 19:30:07.741433 4750 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 08 19:30:07 crc kubenswrapper[4750]: I1008 19:30:07.741476 4750 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c5bfd24e-39fa-45d1-868f-ad10e518c42a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c5bfd24e-39fa-45d1-868f-ad10e518c42a\") pod \"openstack-cell1-galera-0\" (UID: \"16db7ca3-b357-4651-897e-1329b6b0b4d9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0d0cc8385061acb59dc4551a39da32488fbed76cc77bdd388807e7fd9ce8ef25/globalmount\"" pod="openstack/openstack-cell1-galera-0" Oct 08 19:30:07 crc kubenswrapper[4750]: I1008 19:30:07.741834 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16db7ca3-b357-4651-897e-1329b6b0b4d9-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"16db7ca3-b357-4651-897e-1329b6b0b4d9\") " pod="openstack/openstack-cell1-galera-0" Oct 08 19:30:07 crc kubenswrapper[4750]: I1008 19:30:07.744089 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/16db7ca3-b357-4651-897e-1329b6b0b4d9-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"16db7ca3-b357-4651-897e-1329b6b0b4d9\") " pod="openstack/openstack-cell1-galera-0" Oct 08 19:30:07 crc kubenswrapper[4750]: I1008 19:30:07.747477 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/16db7ca3-b357-4651-897e-1329b6b0b4d9-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"16db7ca3-b357-4651-897e-1329b6b0b4d9\") " pod="openstack/openstack-cell1-galera-0" Oct 08 19:30:07 
crc kubenswrapper[4750]: I1008 19:30:07.750009 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkmw4\" (UniqueName: \"kubernetes.io/projected/16db7ca3-b357-4651-897e-1329b6b0b4d9-kube-api-access-xkmw4\") pod \"openstack-cell1-galera-0\" (UID: \"16db7ca3-b357-4651-897e-1329b6b0b4d9\") " pod="openstack/openstack-cell1-galera-0" Oct 08 19:30:07 crc kubenswrapper[4750]: I1008 19:30:07.779645 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c5bfd24e-39fa-45d1-868f-ad10e518c42a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c5bfd24e-39fa-45d1-868f-ad10e518c42a\") pod \"openstack-cell1-galera-0\" (UID: \"16db7ca3-b357-4651-897e-1329b6b0b4d9\") " pod="openstack/openstack-cell1-galera-0" Oct 08 19:30:07 crc kubenswrapper[4750]: I1008 19:30:07.853303 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 08 19:30:08 crc kubenswrapper[4750]: I1008 19:30:08.195993 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"7454840f-78e3-41f6-a91b-2d34a96d5090","Type":"ContainerStarted","Data":"98003608b9881342aab57c3d6b825132004c5020a424a2a87d152f8571d68dab"} Oct 08 19:30:08 crc kubenswrapper[4750]: I1008 19:30:08.196456 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 08 19:30:08 crc kubenswrapper[4750]: I1008 19:30:08.196474 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"7454840f-78e3-41f6-a91b-2d34a96d5090","Type":"ContainerStarted","Data":"46fddf0c8c717e8fd3ce2c75a5cd06364e84785d93bdf011b2573df47f2d7b2d"} Oct 08 19:30:08 crc kubenswrapper[4750]: I1008 19:30:08.199437 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"d258d002-5471-49c2-b43b-557992058385","Type":"ContainerStarted","Data":"56984fdcf578e51aa36e3f895d79392a909d4757662e9ac4b2af92df55c6ef77"} Oct 08 19:30:08 crc kubenswrapper[4750]: I1008 19:30:08.201428 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"365d4e67-6b18-4a2c-8547-6eda3058dcab","Type":"ContainerStarted","Data":"2bfc787935fd8f7e0ea0b24ac1843a291671e35e15a7cc410878c5c1413bcdd9"} Oct 08 19:30:08 crc kubenswrapper[4750]: I1008 19:30:08.203295 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"78303b30-ff92-45ac-8a55-4d80f6878e9b","Type":"ContainerStarted","Data":"8e3a028b275d63d94a2c9d692a3357ccc705d0fc8da642c1151a2a696623668d"} Oct 08 19:30:08 crc kubenswrapper[4750]: I1008 19:30:08.231427 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.231388281 podStartE2EDuration="2.231388281s" podCreationTimestamp="2025-10-08 19:30:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:30:08.215503543 +0000 UTC m=+4764.128474576" watchObservedRunningTime="2025-10-08 19:30:08.231388281 +0000 UTC m=+4764.144359294" Oct 08 19:30:08 crc kubenswrapper[4750]: I1008 19:30:08.305271 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 08 19:30:08 crc kubenswrapper[4750]: W1008 19:30:08.313470 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16db7ca3_b357_4651_897e_1329b6b0b4d9.slice/crio-6e60654fe5d1118738bd8b237a34c4541f51f1b207d801d87d1e54dcf6199c1a WatchSource:0}: Error finding container 6e60654fe5d1118738bd8b237a34c4541f51f1b207d801d87d1e54dcf6199c1a: Status 404 returned error can't find the container with id 
6e60654fe5d1118738bd8b237a34c4541f51f1b207d801d87d1e54dcf6199c1a Oct 08 19:30:09 crc kubenswrapper[4750]: I1008 19:30:09.214772 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"16db7ca3-b357-4651-897e-1329b6b0b4d9","Type":"ContainerStarted","Data":"1777464d664cc1b89784d7a5e47303713585f69f7f79bc48942f69fd754cd33a"} Oct 08 19:30:09 crc kubenswrapper[4750]: I1008 19:30:09.215258 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"16db7ca3-b357-4651-897e-1329b6b0b4d9","Type":"ContainerStarted","Data":"6e60654fe5d1118738bd8b237a34c4541f51f1b207d801d87d1e54dcf6199c1a"} Oct 08 19:30:11 crc kubenswrapper[4750]: I1008 19:30:11.240044 4750 generic.go:334] "Generic (PLEG): container finished" podID="d258d002-5471-49c2-b43b-557992058385" containerID="56984fdcf578e51aa36e3f895d79392a909d4757662e9ac4b2af92df55c6ef77" exitCode=0 Oct 08 19:30:11 crc kubenswrapper[4750]: I1008 19:30:11.240176 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d258d002-5471-49c2-b43b-557992058385","Type":"ContainerDied","Data":"56984fdcf578e51aa36e3f895d79392a909d4757662e9ac4b2af92df55c6ef77"} Oct 08 19:30:12 crc kubenswrapper[4750]: I1008 19:30:12.261059 4750 generic.go:334] "Generic (PLEG): container finished" podID="16db7ca3-b357-4651-897e-1329b6b0b4d9" containerID="1777464d664cc1b89784d7a5e47303713585f69f7f79bc48942f69fd754cd33a" exitCode=0 Oct 08 19:30:12 crc kubenswrapper[4750]: I1008 19:30:12.261254 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"16db7ca3-b357-4651-897e-1329b6b0b4d9","Type":"ContainerDied","Data":"1777464d664cc1b89784d7a5e47303713585f69f7f79bc48942f69fd754cd33a"} Oct 08 19:30:12 crc kubenswrapper[4750]: I1008 19:30:12.266097 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"d258d002-5471-49c2-b43b-557992058385","Type":"ContainerStarted","Data":"b23c0ca6d2237b4d93d04a3bcf18bbf2ad009d49889dea8252121ce9520d6f9a"} Oct 08 19:30:13 crc kubenswrapper[4750]: I1008 19:30:13.283765 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"16db7ca3-b357-4651-897e-1329b6b0b4d9","Type":"ContainerStarted","Data":"85c417e92794610b6701d7c4c6301862c686984df5a1f71c1dd61054b3dfb790"} Oct 08 19:30:13 crc kubenswrapper[4750]: I1008 19:30:13.316766 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=7.316735247 podStartE2EDuration="7.316735247s" podCreationTimestamp="2025-10-08 19:30:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:30:13.314741868 +0000 UTC m=+4769.227712911" watchObservedRunningTime="2025-10-08 19:30:13.316735247 +0000 UTC m=+4769.229706270" Oct 08 19:30:13 crc kubenswrapper[4750]: I1008 19:30:13.318647 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=8.318637996 podStartE2EDuration="8.318637996s" podCreationTimestamp="2025-10-08 19:30:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:30:12.32307121 +0000 UTC m=+4768.236042243" watchObservedRunningTime="2025-10-08 19:30:13.318637996 +0000 UTC m=+4769.231609019" Oct 08 19:30:14 crc kubenswrapper[4750]: I1008 19:30:14.047767 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8b7964457-qlsvg" Oct 08 19:30:14 crc kubenswrapper[4750]: I1008 19:30:14.358487 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67d9f7fb89-cwjtl" Oct 08 19:30:14 crc kubenswrapper[4750]: I1008 
19:30:14.428847 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b7964457-qlsvg"] Oct 08 19:30:14 crc kubenswrapper[4750]: I1008 19:30:14.429482 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8b7964457-qlsvg" podUID="0c065054-a32c-4a75-81fd-b6172078680e" containerName="dnsmasq-dns" containerID="cri-o://28bb15e3ed813e20283a588d68e0e1391b9a804cac74ff4646f70705a677a6ab" gracePeriod=10 Oct 08 19:30:14 crc kubenswrapper[4750]: I1008 19:30:14.896026 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b7964457-qlsvg" Oct 08 19:30:15 crc kubenswrapper[4750]: I1008 19:30:15.006087 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c065054-a32c-4a75-81fd-b6172078680e-dns-svc\") pod \"0c065054-a32c-4a75-81fd-b6172078680e\" (UID: \"0c065054-a32c-4a75-81fd-b6172078680e\") " Oct 08 19:30:15 crc kubenswrapper[4750]: I1008 19:30:15.006629 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c065054-a32c-4a75-81fd-b6172078680e-config\") pod \"0c065054-a32c-4a75-81fd-b6172078680e\" (UID: \"0c065054-a32c-4a75-81fd-b6172078680e\") " Oct 08 19:30:15 crc kubenswrapper[4750]: I1008 19:30:15.006834 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4n95d\" (UniqueName: \"kubernetes.io/projected/0c065054-a32c-4a75-81fd-b6172078680e-kube-api-access-4n95d\") pod \"0c065054-a32c-4a75-81fd-b6172078680e\" (UID: \"0c065054-a32c-4a75-81fd-b6172078680e\") " Oct 08 19:30:15 crc kubenswrapper[4750]: I1008 19:30:15.025695 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c065054-a32c-4a75-81fd-b6172078680e-kube-api-access-4n95d" (OuterVolumeSpecName: "kube-api-access-4n95d") pod 
"0c065054-a32c-4a75-81fd-b6172078680e" (UID: "0c065054-a32c-4a75-81fd-b6172078680e"). InnerVolumeSpecName "kube-api-access-4n95d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:30:15 crc kubenswrapper[4750]: I1008 19:30:15.048458 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c065054-a32c-4a75-81fd-b6172078680e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0c065054-a32c-4a75-81fd-b6172078680e" (UID: "0c065054-a32c-4a75-81fd-b6172078680e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:30:15 crc kubenswrapper[4750]: I1008 19:30:15.051286 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c065054-a32c-4a75-81fd-b6172078680e-config" (OuterVolumeSpecName: "config") pod "0c065054-a32c-4a75-81fd-b6172078680e" (UID: "0c065054-a32c-4a75-81fd-b6172078680e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:30:15 crc kubenswrapper[4750]: I1008 19:30:15.109438 4750 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c065054-a32c-4a75-81fd-b6172078680e-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 19:30:15 crc kubenswrapper[4750]: I1008 19:30:15.109509 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c065054-a32c-4a75-81fd-b6172078680e-config\") on node \"crc\" DevicePath \"\"" Oct 08 19:30:15 crc kubenswrapper[4750]: I1008 19:30:15.109581 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4n95d\" (UniqueName: \"kubernetes.io/projected/0c065054-a32c-4a75-81fd-b6172078680e-kube-api-access-4n95d\") on node \"crc\" DevicePath \"\"" Oct 08 19:30:15 crc kubenswrapper[4750]: I1008 19:30:15.304084 4750 generic.go:334] "Generic (PLEG): container finished" podID="0c065054-a32c-4a75-81fd-b6172078680e" 
containerID="28bb15e3ed813e20283a588d68e0e1391b9a804cac74ff4646f70705a677a6ab" exitCode=0 Oct 08 19:30:15 crc kubenswrapper[4750]: I1008 19:30:15.304156 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b7964457-qlsvg" event={"ID":"0c065054-a32c-4a75-81fd-b6172078680e","Type":"ContainerDied","Data":"28bb15e3ed813e20283a588d68e0e1391b9a804cac74ff4646f70705a677a6ab"} Oct 08 19:30:15 crc kubenswrapper[4750]: I1008 19:30:15.304188 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b7964457-qlsvg" Oct 08 19:30:15 crc kubenswrapper[4750]: I1008 19:30:15.304216 4750 scope.go:117] "RemoveContainer" containerID="28bb15e3ed813e20283a588d68e0e1391b9a804cac74ff4646f70705a677a6ab" Oct 08 19:30:15 crc kubenswrapper[4750]: I1008 19:30:15.304200 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b7964457-qlsvg" event={"ID":"0c065054-a32c-4a75-81fd-b6172078680e","Type":"ContainerDied","Data":"b1e925bec15ff5e12a0fb95ae9309be9446f780529756e017b8cf16b059f7135"} Oct 08 19:30:15 crc kubenswrapper[4750]: I1008 19:30:15.334474 4750 scope.go:117] "RemoveContainer" containerID="e409dcf56fe67fb37c9dee925799c2f2cd7cfa12d984c81d443235224761bde9" Oct 08 19:30:15 crc kubenswrapper[4750]: I1008 19:30:15.345725 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b7964457-qlsvg"] Oct 08 19:30:15 crc kubenswrapper[4750]: I1008 19:30:15.354893 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8b7964457-qlsvg"] Oct 08 19:30:15 crc kubenswrapper[4750]: I1008 19:30:15.365670 4750 scope.go:117] "RemoveContainer" containerID="28bb15e3ed813e20283a588d68e0e1391b9a804cac74ff4646f70705a677a6ab" Oct 08 19:30:15 crc kubenswrapper[4750]: E1008 19:30:15.366377 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"28bb15e3ed813e20283a588d68e0e1391b9a804cac74ff4646f70705a677a6ab\": container with ID starting with 28bb15e3ed813e20283a588d68e0e1391b9a804cac74ff4646f70705a677a6ab not found: ID does not exist" containerID="28bb15e3ed813e20283a588d68e0e1391b9a804cac74ff4646f70705a677a6ab" Oct 08 19:30:15 crc kubenswrapper[4750]: I1008 19:30:15.366412 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28bb15e3ed813e20283a588d68e0e1391b9a804cac74ff4646f70705a677a6ab"} err="failed to get container status \"28bb15e3ed813e20283a588d68e0e1391b9a804cac74ff4646f70705a677a6ab\": rpc error: code = NotFound desc = could not find container \"28bb15e3ed813e20283a588d68e0e1391b9a804cac74ff4646f70705a677a6ab\": container with ID starting with 28bb15e3ed813e20283a588d68e0e1391b9a804cac74ff4646f70705a677a6ab not found: ID does not exist" Oct 08 19:30:15 crc kubenswrapper[4750]: I1008 19:30:15.366439 4750 scope.go:117] "RemoveContainer" containerID="e409dcf56fe67fb37c9dee925799c2f2cd7cfa12d984c81d443235224761bde9" Oct 08 19:30:15 crc kubenswrapper[4750]: E1008 19:30:15.366914 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e409dcf56fe67fb37c9dee925799c2f2cd7cfa12d984c81d443235224761bde9\": container with ID starting with e409dcf56fe67fb37c9dee925799c2f2cd7cfa12d984c81d443235224761bde9 not found: ID does not exist" containerID="e409dcf56fe67fb37c9dee925799c2f2cd7cfa12d984c81d443235224761bde9" Oct 08 19:30:15 crc kubenswrapper[4750]: I1008 19:30:15.366954 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e409dcf56fe67fb37c9dee925799c2f2cd7cfa12d984c81d443235224761bde9"} err="failed to get container status \"e409dcf56fe67fb37c9dee925799c2f2cd7cfa12d984c81d443235224761bde9\": rpc error: code = NotFound desc = could not find container \"e409dcf56fe67fb37c9dee925799c2f2cd7cfa12d984c81d443235224761bde9\": container with ID 
starting with e409dcf56fe67fb37c9dee925799c2f2cd7cfa12d984c81d443235224761bde9 not found: ID does not exist" Oct 08 19:30:16 crc kubenswrapper[4750]: I1008 19:30:16.455485 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 08 19:30:16 crc kubenswrapper[4750]: I1008 19:30:16.455762 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 08 19:30:16 crc kubenswrapper[4750]: I1008 19:30:16.745960 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c065054-a32c-4a75-81fd-b6172078680e" path="/var/lib/kubelet/pods/0c065054-a32c-4a75-81fd-b6172078680e/volumes" Oct 08 19:30:16 crc kubenswrapper[4750]: I1008 19:30:16.891376 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 08 19:30:17 crc kubenswrapper[4750]: I1008 19:30:17.853484 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 08 19:30:17 crc kubenswrapper[4750]: I1008 19:30:17.854444 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 08 19:30:18 crc kubenswrapper[4750]: I1008 19:30:18.521278 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 08 19:30:18 crc kubenswrapper[4750]: I1008 19:30:18.592637 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 08 19:30:19 crc kubenswrapper[4750]: I1008 19:30:19.912062 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 08 19:30:19 crc kubenswrapper[4750]: I1008 19:30:19.977567 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 08 19:30:39 crc kubenswrapper[4750]: I1008 19:30:39.557609 4750 
generic.go:334] "Generic (PLEG): container finished" podID="365d4e67-6b18-4a2c-8547-6eda3058dcab" containerID="2bfc787935fd8f7e0ea0b24ac1843a291671e35e15a7cc410878c5c1413bcdd9" exitCode=0 Oct 08 19:30:39 crc kubenswrapper[4750]: I1008 19:30:39.557742 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"365d4e67-6b18-4a2c-8547-6eda3058dcab","Type":"ContainerDied","Data":"2bfc787935fd8f7e0ea0b24ac1843a291671e35e15a7cc410878c5c1413bcdd9"} Oct 08 19:30:40 crc kubenswrapper[4750]: I1008 19:30:40.568757 4750 generic.go:334] "Generic (PLEG): container finished" podID="78303b30-ff92-45ac-8a55-4d80f6878e9b" containerID="8e3a028b275d63d94a2c9d692a3357ccc705d0fc8da642c1151a2a696623668d" exitCode=0 Oct 08 19:30:40 crc kubenswrapper[4750]: I1008 19:30:40.568927 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"78303b30-ff92-45ac-8a55-4d80f6878e9b","Type":"ContainerDied","Data":"8e3a028b275d63d94a2c9d692a3357ccc705d0fc8da642c1151a2a696623668d"} Oct 08 19:30:40 crc kubenswrapper[4750]: I1008 19:30:40.574170 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"365d4e67-6b18-4a2c-8547-6eda3058dcab","Type":"ContainerStarted","Data":"1ffc89fa1333fe9f1136679d5e08a2d4823f47e491d666962a08d3dd7b578b6b"} Oct 08 19:30:40 crc kubenswrapper[4750]: I1008 19:30:40.574415 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:30:40 crc kubenswrapper[4750]: I1008 19:30:40.640398 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.640327832 podStartE2EDuration="37.640327832s" podCreationTimestamp="2025-10-08 19:30:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:30:40.639935133 +0000 UTC 
m=+4796.552906146" watchObservedRunningTime="2025-10-08 19:30:40.640327832 +0000 UTC m=+4796.553298845" Oct 08 19:30:41 crc kubenswrapper[4750]: I1008 19:30:41.583104 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"78303b30-ff92-45ac-8a55-4d80f6878e9b","Type":"ContainerStarted","Data":"8ca6967af0c161515d7369c5955c488980421e792b3f0c263248d4711791b356"} Oct 08 19:30:41 crc kubenswrapper[4750]: I1008 19:30:41.583808 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 08 19:30:41 crc kubenswrapper[4750]: I1008 19:30:41.606071 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.606045719 podStartE2EDuration="37.606045719s" podCreationTimestamp="2025-10-08 19:30:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:30:41.604687575 +0000 UTC m=+4797.517658608" watchObservedRunningTime="2025-10-08 19:30:41.606045719 +0000 UTC m=+4797.519016732" Oct 08 19:30:55 crc kubenswrapper[4750]: I1008 19:30:55.181767 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:30:55 crc kubenswrapper[4750]: I1008 19:30:55.591575 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 08 19:30:59 crc kubenswrapper[4750]: I1008 19:30:59.743942 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5fdc957c47-rj9w9"] Oct 08 19:30:59 crc kubenswrapper[4750]: E1008 19:30:59.744791 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c065054-a32c-4a75-81fd-b6172078680e" containerName="init" Oct 08 19:30:59 crc kubenswrapper[4750]: I1008 19:30:59.744809 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c065054-a32c-4a75-81fd-b6172078680e" 
containerName="init" Oct 08 19:30:59 crc kubenswrapper[4750]: E1008 19:30:59.744849 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c065054-a32c-4a75-81fd-b6172078680e" containerName="dnsmasq-dns" Oct 08 19:30:59 crc kubenswrapper[4750]: I1008 19:30:59.744858 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c065054-a32c-4a75-81fd-b6172078680e" containerName="dnsmasq-dns" Oct 08 19:30:59 crc kubenswrapper[4750]: I1008 19:30:59.745059 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c065054-a32c-4a75-81fd-b6172078680e" containerName="dnsmasq-dns" Oct 08 19:30:59 crc kubenswrapper[4750]: I1008 19:30:59.746209 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fdc957c47-rj9w9" Oct 08 19:30:59 crc kubenswrapper[4750]: I1008 19:30:59.769279 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fdc957c47-rj9w9"] Oct 08 19:30:59 crc kubenswrapper[4750]: I1008 19:30:59.843731 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcnzh\" (UniqueName: \"kubernetes.io/projected/eb22965d-c55e-4c81-adf0-7a6f84e5494e-kube-api-access-tcnzh\") pod \"dnsmasq-dns-5fdc957c47-rj9w9\" (UID: \"eb22965d-c55e-4c81-adf0-7a6f84e5494e\") " pod="openstack/dnsmasq-dns-5fdc957c47-rj9w9" Oct 08 19:30:59 crc kubenswrapper[4750]: I1008 19:30:59.843802 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb22965d-c55e-4c81-adf0-7a6f84e5494e-dns-svc\") pod \"dnsmasq-dns-5fdc957c47-rj9w9\" (UID: \"eb22965d-c55e-4c81-adf0-7a6f84e5494e\") " pod="openstack/dnsmasq-dns-5fdc957c47-rj9w9" Oct 08 19:30:59 crc kubenswrapper[4750]: I1008 19:30:59.844110 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/eb22965d-c55e-4c81-adf0-7a6f84e5494e-config\") pod \"dnsmasq-dns-5fdc957c47-rj9w9\" (UID: \"eb22965d-c55e-4c81-adf0-7a6f84e5494e\") " pod="openstack/dnsmasq-dns-5fdc957c47-rj9w9" Oct 08 19:30:59 crc kubenswrapper[4750]: I1008 19:30:59.946402 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcnzh\" (UniqueName: \"kubernetes.io/projected/eb22965d-c55e-4c81-adf0-7a6f84e5494e-kube-api-access-tcnzh\") pod \"dnsmasq-dns-5fdc957c47-rj9w9\" (UID: \"eb22965d-c55e-4c81-adf0-7a6f84e5494e\") " pod="openstack/dnsmasq-dns-5fdc957c47-rj9w9" Oct 08 19:30:59 crc kubenswrapper[4750]: I1008 19:30:59.946480 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb22965d-c55e-4c81-adf0-7a6f84e5494e-dns-svc\") pod \"dnsmasq-dns-5fdc957c47-rj9w9\" (UID: \"eb22965d-c55e-4c81-adf0-7a6f84e5494e\") " pod="openstack/dnsmasq-dns-5fdc957c47-rj9w9" Oct 08 19:30:59 crc kubenswrapper[4750]: I1008 19:30:59.946568 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb22965d-c55e-4c81-adf0-7a6f84e5494e-config\") pod \"dnsmasq-dns-5fdc957c47-rj9w9\" (UID: \"eb22965d-c55e-4c81-adf0-7a6f84e5494e\") " pod="openstack/dnsmasq-dns-5fdc957c47-rj9w9" Oct 08 19:30:59 crc kubenswrapper[4750]: I1008 19:30:59.947810 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb22965d-c55e-4c81-adf0-7a6f84e5494e-config\") pod \"dnsmasq-dns-5fdc957c47-rj9w9\" (UID: \"eb22965d-c55e-4c81-adf0-7a6f84e5494e\") " pod="openstack/dnsmasq-dns-5fdc957c47-rj9w9" Oct 08 19:30:59 crc kubenswrapper[4750]: I1008 19:30:59.948088 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb22965d-c55e-4c81-adf0-7a6f84e5494e-dns-svc\") pod \"dnsmasq-dns-5fdc957c47-rj9w9\" (UID: 
\"eb22965d-c55e-4c81-adf0-7a6f84e5494e\") " pod="openstack/dnsmasq-dns-5fdc957c47-rj9w9" Oct 08 19:30:59 crc kubenswrapper[4750]: I1008 19:30:59.968210 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcnzh\" (UniqueName: \"kubernetes.io/projected/eb22965d-c55e-4c81-adf0-7a6f84e5494e-kube-api-access-tcnzh\") pod \"dnsmasq-dns-5fdc957c47-rj9w9\" (UID: \"eb22965d-c55e-4c81-adf0-7a6f84e5494e\") " pod="openstack/dnsmasq-dns-5fdc957c47-rj9w9" Oct 08 19:31:00 crc kubenswrapper[4750]: I1008 19:31:00.069356 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fdc957c47-rj9w9" Oct 08 19:31:00 crc kubenswrapper[4750]: I1008 19:31:00.535248 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 19:31:00 crc kubenswrapper[4750]: I1008 19:31:00.611290 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fdc957c47-rj9w9"] Oct 08 19:31:00 crc kubenswrapper[4750]: I1008 19:31:00.754177 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fdc957c47-rj9w9" event={"ID":"eb22965d-c55e-4c81-adf0-7a6f84e5494e","Type":"ContainerStarted","Data":"e3710c239b3769ef6e9953edb8044fd80c2d0844e229981ef971518ee9b29103"} Oct 08 19:31:01 crc kubenswrapper[4750]: I1008 19:31:01.127418 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 19:31:01 crc kubenswrapper[4750]: I1008 19:31:01.762388 4750 generic.go:334] "Generic (PLEG): container finished" podID="eb22965d-c55e-4c81-adf0-7a6f84e5494e" containerID="b15e6bcc3bd8fd63308224daadb590c4ac9795da0122f416dce0aae8f576a82f" exitCode=0 Oct 08 19:31:01 crc kubenswrapper[4750]: I1008 19:31:01.762458 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fdc957c47-rj9w9" 
event={"ID":"eb22965d-c55e-4c81-adf0-7a6f84e5494e","Type":"ContainerDied","Data":"b15e6bcc3bd8fd63308224daadb590c4ac9795da0122f416dce0aae8f576a82f"} Oct 08 19:31:02 crc kubenswrapper[4750]: I1008 19:31:02.699752 4750 scope.go:117] "RemoveContainer" containerID="84069d1173f634d8410ba0bb1baabff7ff57b42e929616e3aed04a6ebaf6aaf4" Oct 08 19:31:02 crc kubenswrapper[4750]: I1008 19:31:02.773949 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fdc957c47-rj9w9" event={"ID":"eb22965d-c55e-4c81-adf0-7a6f84e5494e","Type":"ContainerStarted","Data":"7d2cc2c5760f5f0424a341b0ee36a1c4e054084f523d99aa37a746a4fb99b61d"} Oct 08 19:31:02 crc kubenswrapper[4750]: I1008 19:31:02.774410 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5fdc957c47-rj9w9" Oct 08 19:31:02 crc kubenswrapper[4750]: I1008 19:31:02.800966 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5fdc957c47-rj9w9" podStartSLOduration=3.800938024 podStartE2EDuration="3.800938024s" podCreationTimestamp="2025-10-08 19:30:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:31:02.792661216 +0000 UTC m=+4818.705632239" watchObservedRunningTime="2025-10-08 19:31:02.800938024 +0000 UTC m=+4818.713909057" Oct 08 19:31:02 crc kubenswrapper[4750]: I1008 19:31:02.903040 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="78303b30-ff92-45ac-8a55-4d80f6878e9b" containerName="rabbitmq" containerID="cri-o://8ca6967af0c161515d7369c5955c488980421e792b3f0c263248d4711791b356" gracePeriod=604798 Oct 08 19:31:03 crc kubenswrapper[4750]: I1008 19:31:03.442897 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="365d4e67-6b18-4a2c-8547-6eda3058dcab" containerName="rabbitmq" 
containerID="cri-o://1ffc89fa1333fe9f1136679d5e08a2d4823f47e491d666962a08d3dd7b578b6b" gracePeriod=604798 Oct 08 19:31:05 crc kubenswrapper[4750]: I1008 19:31:05.179758 4750 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="365d4e67-6b18-4a2c-8547-6eda3058dcab" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.243:5672: connect: connection refused" Oct 08 19:31:05 crc kubenswrapper[4750]: I1008 19:31:05.597902 4750 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="78303b30-ff92-45ac-8a55-4d80f6878e9b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.244:5672: connect: connection refused" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:09.767921 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:09.861173 4750 generic.go:334] "Generic (PLEG): container finished" podID="365d4e67-6b18-4a2c-8547-6eda3058dcab" containerID="1ffc89fa1333fe9f1136679d5e08a2d4823f47e491d666962a08d3dd7b578b6b" exitCode=0 Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:09.861251 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"365d4e67-6b18-4a2c-8547-6eda3058dcab","Type":"ContainerDied","Data":"1ffc89fa1333fe9f1136679d5e08a2d4823f47e491d666962a08d3dd7b578b6b"} Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:09.867581 4750 generic.go:334] "Generic (PLEG): container finished" podID="78303b30-ff92-45ac-8a55-4d80f6878e9b" containerID="8ca6967af0c161515d7369c5955c488980421e792b3f0c263248d4711791b356" exitCode=0 Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:09.867645 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"78303b30-ff92-45ac-8a55-4d80f6878e9b","Type":"ContainerDied","Data":"8ca6967af0c161515d7369c5955c488980421e792b3f0c263248d4711791b356"} Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:09.867693 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"78303b30-ff92-45ac-8a55-4d80f6878e9b","Type":"ContainerDied","Data":"3b0efee5db5c8886185a76ff16f620e0b1d9a5ac006f85d6643babc474f1f044"} Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:09.867719 4750 scope.go:117] "RemoveContainer" containerID="8ca6967af0c161515d7369c5955c488980421e792b3f0c263248d4711791b356" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:09.867652 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:09.894673 4750 scope.go:117] "RemoveContainer" containerID="8e3a028b275d63d94a2c9d692a3357ccc705d0fc8da642c1151a2a696623668d" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:09.922333 4750 scope.go:117] "RemoveContainer" containerID="8ca6967af0c161515d7369c5955c488980421e792b3f0c263248d4711791b356" Oct 08 19:31:10 crc kubenswrapper[4750]: E1008 19:31:09.926062 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ca6967af0c161515d7369c5955c488980421e792b3f0c263248d4711791b356\": container with ID starting with 8ca6967af0c161515d7369c5955c488980421e792b3f0c263248d4711791b356 not found: ID does not exist" containerID="8ca6967af0c161515d7369c5955c488980421e792b3f0c263248d4711791b356" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:09.926136 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ca6967af0c161515d7369c5955c488980421e792b3f0c263248d4711791b356"} err="failed to get container status \"8ca6967af0c161515d7369c5955c488980421e792b3f0c263248d4711791b356\": rpc error: code = NotFound desc 
= could not find container \"8ca6967af0c161515d7369c5955c488980421e792b3f0c263248d4711791b356\": container with ID starting with 8ca6967af0c161515d7369c5955c488980421e792b3f0c263248d4711791b356 not found: ID does not exist" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:09.926176 4750 scope.go:117] "RemoveContainer" containerID="8e3a028b275d63d94a2c9d692a3357ccc705d0fc8da642c1151a2a696623668d" Oct 08 19:31:10 crc kubenswrapper[4750]: E1008 19:31:09.926574 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e3a028b275d63d94a2c9d692a3357ccc705d0fc8da642c1151a2a696623668d\": container with ID starting with 8e3a028b275d63d94a2c9d692a3357ccc705d0fc8da642c1151a2a696623668d not found: ID does not exist" containerID="8e3a028b275d63d94a2c9d692a3357ccc705d0fc8da642c1151a2a696623668d" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:09.926606 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e3a028b275d63d94a2c9d692a3357ccc705d0fc8da642c1151a2a696623668d"} err="failed to get container status \"8e3a028b275d63d94a2c9d692a3357ccc705d0fc8da642c1151a2a696623668d\": rpc error: code = NotFound desc = could not find container \"8e3a028b275d63d94a2c9d692a3357ccc705d0fc8da642c1151a2a696623668d\": container with ID starting with 8e3a028b275d63d94a2c9d692a3357ccc705d0fc8da642c1151a2a696623668d not found: ID does not exist" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:09.941055 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/78303b30-ff92-45ac-8a55-4d80f6878e9b-pod-info\") pod \"78303b30-ff92-45ac-8a55-4d80f6878e9b\" (UID: \"78303b30-ff92-45ac-8a55-4d80f6878e9b\") " Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:09.941306 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7e321cf3-b945-437d-a328-e9147c97102c\") pod \"78303b30-ff92-45ac-8a55-4d80f6878e9b\" (UID: \"78303b30-ff92-45ac-8a55-4d80f6878e9b\") " Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:09.941333 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/78303b30-ff92-45ac-8a55-4d80f6878e9b-rabbitmq-erlang-cookie\") pod \"78303b30-ff92-45ac-8a55-4d80f6878e9b\" (UID: \"78303b30-ff92-45ac-8a55-4d80f6878e9b\") " Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:09.941371 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4s4vm\" (UniqueName: \"kubernetes.io/projected/78303b30-ff92-45ac-8a55-4d80f6878e9b-kube-api-access-4s4vm\") pod \"78303b30-ff92-45ac-8a55-4d80f6878e9b\" (UID: \"78303b30-ff92-45ac-8a55-4d80f6878e9b\") " Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:09.941404 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/78303b30-ff92-45ac-8a55-4d80f6878e9b-rabbitmq-plugins\") pod \"78303b30-ff92-45ac-8a55-4d80f6878e9b\" (UID: \"78303b30-ff92-45ac-8a55-4d80f6878e9b\") " Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:09.941460 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/78303b30-ff92-45ac-8a55-4d80f6878e9b-server-conf\") pod \"78303b30-ff92-45ac-8a55-4d80f6878e9b\" (UID: \"78303b30-ff92-45ac-8a55-4d80f6878e9b\") " Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:09.941479 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/78303b30-ff92-45ac-8a55-4d80f6878e9b-erlang-cookie-secret\") pod \"78303b30-ff92-45ac-8a55-4d80f6878e9b\" (UID: \"78303b30-ff92-45ac-8a55-4d80f6878e9b\") " 
Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:09.941588 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/78303b30-ff92-45ac-8a55-4d80f6878e9b-rabbitmq-confd\") pod \"78303b30-ff92-45ac-8a55-4d80f6878e9b\" (UID: \"78303b30-ff92-45ac-8a55-4d80f6878e9b\") " Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:09.941647 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/78303b30-ff92-45ac-8a55-4d80f6878e9b-plugins-conf\") pod \"78303b30-ff92-45ac-8a55-4d80f6878e9b\" (UID: \"78303b30-ff92-45ac-8a55-4d80f6878e9b\") " Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:09.941955 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78303b30-ff92-45ac-8a55-4d80f6878e9b-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "78303b30-ff92-45ac-8a55-4d80f6878e9b" (UID: "78303b30-ff92-45ac-8a55-4d80f6878e9b"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:09.942151 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78303b30-ff92-45ac-8a55-4d80f6878e9b-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "78303b30-ff92-45ac-8a55-4d80f6878e9b" (UID: "78303b30-ff92-45ac-8a55-4d80f6878e9b"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:09.942500 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78303b30-ff92-45ac-8a55-4d80f6878e9b-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "78303b30-ff92-45ac-8a55-4d80f6878e9b" (UID: "78303b30-ff92-45ac-8a55-4d80f6878e9b"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:09.947540 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/78303b30-ff92-45ac-8a55-4d80f6878e9b-pod-info" (OuterVolumeSpecName: "pod-info") pod "78303b30-ff92-45ac-8a55-4d80f6878e9b" (UID: "78303b30-ff92-45ac-8a55-4d80f6878e9b"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:09.948436 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78303b30-ff92-45ac-8a55-4d80f6878e9b-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "78303b30-ff92-45ac-8a55-4d80f6878e9b" (UID: "78303b30-ff92-45ac-8a55-4d80f6878e9b"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:09.948610 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78303b30-ff92-45ac-8a55-4d80f6878e9b-kube-api-access-4s4vm" (OuterVolumeSpecName: "kube-api-access-4s4vm") pod "78303b30-ff92-45ac-8a55-4d80f6878e9b" (UID: "78303b30-ff92-45ac-8a55-4d80f6878e9b"). InnerVolumeSpecName "kube-api-access-4s4vm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:09.954428 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7e321cf3-b945-437d-a328-e9147c97102c" (OuterVolumeSpecName: "persistence") pod "78303b30-ff92-45ac-8a55-4d80f6878e9b" (UID: "78303b30-ff92-45ac-8a55-4d80f6878e9b"). InnerVolumeSpecName "pvc-7e321cf3-b945-437d-a328-e9147c97102c". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:09.970828 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78303b30-ff92-45ac-8a55-4d80f6878e9b-server-conf" (OuterVolumeSpecName: "server-conf") pod "78303b30-ff92-45ac-8a55-4d80f6878e9b" (UID: "78303b30-ff92-45ac-8a55-4d80f6878e9b"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.037182 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78303b30-ff92-45ac-8a55-4d80f6878e9b-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "78303b30-ff92-45ac-8a55-4d80f6878e9b" (UID: "78303b30-ff92-45ac-8a55-4d80f6878e9b"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.043674 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4s4vm\" (UniqueName: \"kubernetes.io/projected/78303b30-ff92-45ac-8a55-4d80f6878e9b-kube-api-access-4s4vm\") on node \"crc\" DevicePath \"\"" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.043720 4750 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/78303b30-ff92-45ac-8a55-4d80f6878e9b-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.043734 4750 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/78303b30-ff92-45ac-8a55-4d80f6878e9b-server-conf\") on node \"crc\" DevicePath \"\"" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.043748 4750 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/78303b30-ff92-45ac-8a55-4d80f6878e9b-erlang-cookie-secret\") on node 
\"crc\" DevicePath \"\"" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.043760 4750 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/78303b30-ff92-45ac-8a55-4d80f6878e9b-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.043770 4750 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/78303b30-ff92-45ac-8a55-4d80f6878e9b-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.043781 4750 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/78303b30-ff92-45ac-8a55-4d80f6878e9b-pod-info\") on node \"crc\" DevicePath \"\"" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.043842 4750 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-7e321cf3-b945-437d-a328-e9147c97102c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7e321cf3-b945-437d-a328-e9147c97102c\") on node \"crc\" " Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.043862 4750 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/78303b30-ff92-45ac-8a55-4d80f6878e9b-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.063285 4750 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.063574 4750 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-7e321cf3-b945-437d-a328-e9147c97102c" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7e321cf3-b945-437d-a328-e9147c97102c") on node "crc" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.070823 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5fdc957c47-rj9w9" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.125543 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67d9f7fb89-cwjtl"] Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.125835 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67d9f7fb89-cwjtl" podUID="5f0dd8be-3bb8-49fd-9369-531d32b127ee" containerName="dnsmasq-dns" containerID="cri-o://b9ee3e82e7e3c0d9b4ee8365fe6176b5fbdebf027e57d999659e0273b378280a" gracePeriod=10 Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.146062 4750 reconciler_common.go:293] "Volume detached for volume \"pvc-7e321cf3-b945-437d-a328-e9147c97102c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7e321cf3-b945-437d-a328-e9147c97102c\") on node \"crc\" DevicePath \"\"" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.260751 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.306863 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.321726 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 19:31:10 crc kubenswrapper[4750]: E1008 19:31:10.322887 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78303b30-ff92-45ac-8a55-4d80f6878e9b" containerName="rabbitmq" Oct 08 19:31:10 crc 
kubenswrapper[4750]: I1008 19:31:10.322917 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="78303b30-ff92-45ac-8a55-4d80f6878e9b" containerName="rabbitmq" Oct 08 19:31:10 crc kubenswrapper[4750]: E1008 19:31:10.322959 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78303b30-ff92-45ac-8a55-4d80f6878e9b" containerName="setup-container" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.322968 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="78303b30-ff92-45ac-8a55-4d80f6878e9b" containerName="setup-container" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.323434 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="78303b30-ff92-45ac-8a55-4d80f6878e9b" containerName="rabbitmq" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.338144 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.338397 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.342735 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-q94qm" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.342901 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.342989 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.342735 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.343763 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.391788 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.460019 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/365d4e67-6b18-4a2c-8547-6eda3058dcab-plugins-conf\") pod \"365d4e67-6b18-4a2c-8547-6eda3058dcab\" (UID: \"365d4e67-6b18-4a2c-8547-6eda3058dcab\") " Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.460079 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/365d4e67-6b18-4a2c-8547-6eda3058dcab-pod-info\") pod \"365d4e67-6b18-4a2c-8547-6eda3058dcab\" (UID: \"365d4e67-6b18-4a2c-8547-6eda3058dcab\") " Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.460302 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e953d341-e6e5-4c4b-b484-ddc33671479f\") pod \"365d4e67-6b18-4a2c-8547-6eda3058dcab\" (UID: \"365d4e67-6b18-4a2c-8547-6eda3058dcab\") " Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.460369 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/365d4e67-6b18-4a2c-8547-6eda3058dcab-rabbitmq-plugins\") pod \"365d4e67-6b18-4a2c-8547-6eda3058dcab\" (UID: \"365d4e67-6b18-4a2c-8547-6eda3058dcab\") " Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.460416 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/365d4e67-6b18-4a2c-8547-6eda3058dcab-erlang-cookie-secret\") pod \"365d4e67-6b18-4a2c-8547-6eda3058dcab\" (UID: \"365d4e67-6b18-4a2c-8547-6eda3058dcab\") " Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.460451 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/365d4e67-6b18-4a2c-8547-6eda3058dcab-rabbitmq-erlang-cookie\") pod \"365d4e67-6b18-4a2c-8547-6eda3058dcab\" (UID: \"365d4e67-6b18-4a2c-8547-6eda3058dcab\") " Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.460518 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-545cj\" (UniqueName: \"kubernetes.io/projected/365d4e67-6b18-4a2c-8547-6eda3058dcab-kube-api-access-545cj\") pod \"365d4e67-6b18-4a2c-8547-6eda3058dcab\" (UID: \"365d4e67-6b18-4a2c-8547-6eda3058dcab\") " Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.460578 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/365d4e67-6b18-4a2c-8547-6eda3058dcab-rabbitmq-confd\") pod \"365d4e67-6b18-4a2c-8547-6eda3058dcab\" (UID: \"365d4e67-6b18-4a2c-8547-6eda3058dcab\") " Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.460598 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/365d4e67-6b18-4a2c-8547-6eda3058dcab-server-conf\") pod \"365d4e67-6b18-4a2c-8547-6eda3058dcab\" (UID: \"365d4e67-6b18-4a2c-8547-6eda3058dcab\") " Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.460813 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6d34b3ab-fa13-494f-a40c-552c9e3e305b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6d34b3ab-fa13-494f-a40c-552c9e3e305b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.460852 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjjsb\" (UniqueName: \"kubernetes.io/projected/6d34b3ab-fa13-494f-a40c-552c9e3e305b-kube-api-access-hjjsb\") pod \"rabbitmq-server-0\" 
(UID: \"6d34b3ab-fa13-494f-a40c-552c9e3e305b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.460876 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6d34b3ab-fa13-494f-a40c-552c9e3e305b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6d34b3ab-fa13-494f-a40c-552c9e3e305b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.460916 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6d34b3ab-fa13-494f-a40c-552c9e3e305b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6d34b3ab-fa13-494f-a40c-552c9e3e305b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.460964 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6d34b3ab-fa13-494f-a40c-552c9e3e305b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6d34b3ab-fa13-494f-a40c-552c9e3e305b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.461004 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6d34b3ab-fa13-494f-a40c-552c9e3e305b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6d34b3ab-fa13-494f-a40c-552c9e3e305b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.461027 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6d34b3ab-fa13-494f-a40c-552c9e3e305b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6d34b3ab-fa13-494f-a40c-552c9e3e305b\") " 
pod="openstack/rabbitmq-server-0" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.461050 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6d34b3ab-fa13-494f-a40c-552c9e3e305b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6d34b3ab-fa13-494f-a40c-552c9e3e305b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.461085 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7e321cf3-b945-437d-a328-e9147c97102c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7e321cf3-b945-437d-a328-e9147c97102c\") pod \"rabbitmq-server-0\" (UID: \"6d34b3ab-fa13-494f-a40c-552c9e3e305b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.467496 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/365d4e67-6b18-4a2c-8547-6eda3058dcab-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "365d4e67-6b18-4a2c-8547-6eda3058dcab" (UID: "365d4e67-6b18-4a2c-8547-6eda3058dcab"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.468012 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/365d4e67-6b18-4a2c-8547-6eda3058dcab-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "365d4e67-6b18-4a2c-8547-6eda3058dcab" (UID: "365d4e67-6b18-4a2c-8547-6eda3058dcab"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.468665 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/365d4e67-6b18-4a2c-8547-6eda3058dcab-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "365d4e67-6b18-4a2c-8547-6eda3058dcab" (UID: "365d4e67-6b18-4a2c-8547-6eda3058dcab"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.477588 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/365d4e67-6b18-4a2c-8547-6eda3058dcab-kube-api-access-545cj" (OuterVolumeSpecName: "kube-api-access-545cj") pod "365d4e67-6b18-4a2c-8547-6eda3058dcab" (UID: "365d4e67-6b18-4a2c-8547-6eda3058dcab"). InnerVolumeSpecName "kube-api-access-545cj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.482313 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/365d4e67-6b18-4a2c-8547-6eda3058dcab-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "365d4e67-6b18-4a2c-8547-6eda3058dcab" (UID: "365d4e67-6b18-4a2c-8547-6eda3058dcab"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.486125 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e953d341-e6e5-4c4b-b484-ddc33671479f" (OuterVolumeSpecName: "persistence") pod "365d4e67-6b18-4a2c-8547-6eda3058dcab" (UID: "365d4e67-6b18-4a2c-8547-6eda3058dcab"). InnerVolumeSpecName "pvc-e953d341-e6e5-4c4b-b484-ddc33671479f". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.489708 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/365d4e67-6b18-4a2c-8547-6eda3058dcab-pod-info" (OuterVolumeSpecName: "pod-info") pod "365d4e67-6b18-4a2c-8547-6eda3058dcab" (UID: "365d4e67-6b18-4a2c-8547-6eda3058dcab"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.499764 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/365d4e67-6b18-4a2c-8547-6eda3058dcab-server-conf" (OuterVolumeSpecName: "server-conf") pod "365d4e67-6b18-4a2c-8547-6eda3058dcab" (UID: "365d4e67-6b18-4a2c-8547-6eda3058dcab"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.562463 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6d34b3ab-fa13-494f-a40c-552c9e3e305b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6d34b3ab-fa13-494f-a40c-552c9e3e305b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.562585 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6d34b3ab-fa13-494f-a40c-552c9e3e305b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6d34b3ab-fa13-494f-a40c-552c9e3e305b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.562614 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6d34b3ab-fa13-494f-a40c-552c9e3e305b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6d34b3ab-fa13-494f-a40c-552c9e3e305b\") " 
pod="openstack/rabbitmq-server-0" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.562640 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6d34b3ab-fa13-494f-a40c-552c9e3e305b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6d34b3ab-fa13-494f-a40c-552c9e3e305b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.562671 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7e321cf3-b945-437d-a328-e9147c97102c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7e321cf3-b945-437d-a328-e9147c97102c\") pod \"rabbitmq-server-0\" (UID: \"6d34b3ab-fa13-494f-a40c-552c9e3e305b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.562695 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6d34b3ab-fa13-494f-a40c-552c9e3e305b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6d34b3ab-fa13-494f-a40c-552c9e3e305b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.562716 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjjsb\" (UniqueName: \"kubernetes.io/projected/6d34b3ab-fa13-494f-a40c-552c9e3e305b-kube-api-access-hjjsb\") pod \"rabbitmq-server-0\" (UID: \"6d34b3ab-fa13-494f-a40c-552c9e3e305b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.562734 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6d34b3ab-fa13-494f-a40c-552c9e3e305b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6d34b3ab-fa13-494f-a40c-552c9e3e305b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.562762 4750 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6d34b3ab-fa13-494f-a40c-552c9e3e305b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6d34b3ab-fa13-494f-a40c-552c9e3e305b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.562817 4750 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/365d4e67-6b18-4a2c-8547-6eda3058dcab-pod-info\") on node \"crc\" DevicePath \"\"" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.562842 4750 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-e953d341-e6e5-4c4b-b484-ddc33671479f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e953d341-e6e5-4c4b-b484-ddc33671479f\") on node \"crc\" " Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.562854 4750 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/365d4e67-6b18-4a2c-8547-6eda3058dcab-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.562867 4750 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/365d4e67-6b18-4a2c-8547-6eda3058dcab-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.562878 4750 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/365d4e67-6b18-4a2c-8547-6eda3058dcab-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.562891 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-545cj\" (UniqueName: \"kubernetes.io/projected/365d4e67-6b18-4a2c-8547-6eda3058dcab-kube-api-access-545cj\") on node \"crc\" DevicePath 
\"\"" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.562901 4750 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/365d4e67-6b18-4a2c-8547-6eda3058dcab-server-conf\") on node \"crc\" DevicePath \"\"" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.562910 4750 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/365d4e67-6b18-4a2c-8547-6eda3058dcab-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.563962 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6d34b3ab-fa13-494f-a40c-552c9e3e305b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6d34b3ab-fa13-494f-a40c-552c9e3e305b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.565458 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6d34b3ab-fa13-494f-a40c-552c9e3e305b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6d34b3ab-fa13-494f-a40c-552c9e3e305b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.566030 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6d34b3ab-fa13-494f-a40c-552c9e3e305b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6d34b3ab-fa13-494f-a40c-552c9e3e305b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.566145 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6d34b3ab-fa13-494f-a40c-552c9e3e305b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6d34b3ab-fa13-494f-a40c-552c9e3e305b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:31:10 crc 
kubenswrapper[4750]: I1008 19:31:10.567405 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6d34b3ab-fa13-494f-a40c-552c9e3e305b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6d34b3ab-fa13-494f-a40c-552c9e3e305b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.567920 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6d34b3ab-fa13-494f-a40c-552c9e3e305b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6d34b3ab-fa13-494f-a40c-552c9e3e305b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.569345 4750 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.569382 4750 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7e321cf3-b945-437d-a328-e9147c97102c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7e321cf3-b945-437d-a328-e9147c97102c\") pod \"rabbitmq-server-0\" (UID: \"6d34b3ab-fa13-494f-a40c-552c9e3e305b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f77bda10fe1591ba937c9cd19f213805df9569050f04fd63e1479b33d1f0a8cd/globalmount\"" pod="openstack/rabbitmq-server-0" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.569472 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6d34b3ab-fa13-494f-a40c-552c9e3e305b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6d34b3ab-fa13-494f-a40c-552c9e3e305b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.581989 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/365d4e67-6b18-4a2c-8547-6eda3058dcab-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "365d4e67-6b18-4a2c-8547-6eda3058dcab" (UID: "365d4e67-6b18-4a2c-8547-6eda3058dcab"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.586510 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjjsb\" (UniqueName: \"kubernetes.io/projected/6d34b3ab-fa13-494f-a40c-552c9e3e305b-kube-api-access-hjjsb\") pod \"rabbitmq-server-0\" (UID: \"6d34b3ab-fa13-494f-a40c-552c9e3e305b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.592224 4750 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.592378 4750 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-e953d341-e6e5-4c4b-b484-ddc33671479f" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e953d341-e6e5-4c4b-b484-ddc33671479f") on node "crc" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.615972 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7e321cf3-b945-437d-a328-e9147c97102c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7e321cf3-b945-437d-a328-e9147c97102c\") pod \"rabbitmq-server-0\" (UID: \"6d34b3ab-fa13-494f-a40c-552c9e3e305b\") " pod="openstack/rabbitmq-server-0" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.658775 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67d9f7fb89-cwjtl" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.665025 4750 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/365d4e67-6b18-4a2c-8547-6eda3058dcab-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.665058 4750 reconciler_common.go:293] "Volume detached for volume \"pvc-e953d341-e6e5-4c4b-b484-ddc33671479f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e953d341-e6e5-4c4b-b484-ddc33671479f\") on node \"crc\" DevicePath \"\"" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.715131 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.748631 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78303b30-ff92-45ac-8a55-4d80f6878e9b" path="/var/lib/kubelet/pods/78303b30-ff92-45ac-8a55-4d80f6878e9b/volumes" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.766909 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f0dd8be-3bb8-49fd-9369-531d32b127ee-dns-svc\") pod \"5f0dd8be-3bb8-49fd-9369-531d32b127ee\" (UID: \"5f0dd8be-3bb8-49fd-9369-531d32b127ee\") " Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.767008 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgmvc\" (UniqueName: \"kubernetes.io/projected/5f0dd8be-3bb8-49fd-9369-531d32b127ee-kube-api-access-zgmvc\") pod \"5f0dd8be-3bb8-49fd-9369-531d32b127ee\" (UID: \"5f0dd8be-3bb8-49fd-9369-531d32b127ee\") " Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.767207 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5f0dd8be-3bb8-49fd-9369-531d32b127ee-config\") pod \"5f0dd8be-3bb8-49fd-9369-531d32b127ee\" (UID: \"5f0dd8be-3bb8-49fd-9369-531d32b127ee\") " Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.773535 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f0dd8be-3bb8-49fd-9369-531d32b127ee-kube-api-access-zgmvc" (OuterVolumeSpecName: "kube-api-access-zgmvc") pod "5f0dd8be-3bb8-49fd-9369-531d32b127ee" (UID: "5f0dd8be-3bb8-49fd-9369-531d32b127ee"). InnerVolumeSpecName "kube-api-access-zgmvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.815271 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f0dd8be-3bb8-49fd-9369-531d32b127ee-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5f0dd8be-3bb8-49fd-9369-531d32b127ee" (UID: "5f0dd8be-3bb8-49fd-9369-531d32b127ee"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.821312 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f0dd8be-3bb8-49fd-9369-531d32b127ee-config" (OuterVolumeSpecName: "config") pod "5f0dd8be-3bb8-49fd-9369-531d32b127ee" (UID: "5f0dd8be-3bb8-49fd-9369-531d32b127ee"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.870258 4750 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f0dd8be-3bb8-49fd-9369-531d32b127ee-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.870287 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgmvc\" (UniqueName: \"kubernetes.io/projected/5f0dd8be-3bb8-49fd-9369-531d32b127ee-kube-api-access-zgmvc\") on node \"crc\" DevicePath \"\"" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.870324 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f0dd8be-3bb8-49fd-9369-531d32b127ee-config\") on node \"crc\" DevicePath \"\"" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.879427 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"365d4e67-6b18-4a2c-8547-6eda3058dcab","Type":"ContainerDied","Data":"8467ed7c634129a9841e6c2dcd480b2a2d60ee647043dad616ceb3d6fead8472"} Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.879484 4750 scope.go:117] "RemoveContainer" containerID="1ffc89fa1333fe9f1136679d5e08a2d4823f47e491d666962a08d3dd7b578b6b" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.879648 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.887365 4750 generic.go:334] "Generic (PLEG): container finished" podID="5f0dd8be-3bb8-49fd-9369-531d32b127ee" containerID="b9ee3e82e7e3c0d9b4ee8365fe6176b5fbdebf027e57d999659e0273b378280a" exitCode=0 Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.887434 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67d9f7fb89-cwjtl" event={"ID":"5f0dd8be-3bb8-49fd-9369-531d32b127ee","Type":"ContainerDied","Data":"b9ee3e82e7e3c0d9b4ee8365fe6176b5fbdebf027e57d999659e0273b378280a"} Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.887478 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67d9f7fb89-cwjtl" event={"ID":"5f0dd8be-3bb8-49fd-9369-531d32b127ee","Type":"ContainerDied","Data":"369d008d917d19a17a0187c6f90e2e9c9b607eee4d11dc90d4119ba425bc5006"} Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.887596 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67d9f7fb89-cwjtl" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.922414 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.925199 4750 scope.go:117] "RemoveContainer" containerID="2bfc787935fd8f7e0ea0b24ac1843a291671e35e15a7cc410878c5c1413bcdd9" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.948574 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.965105 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 19:31:10 crc kubenswrapper[4750]: E1008 19:31:10.965466 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f0dd8be-3bb8-49fd-9369-531d32b127ee" containerName="init" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.965479 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f0dd8be-3bb8-49fd-9369-531d32b127ee" containerName="init" Oct 08 19:31:10 crc kubenswrapper[4750]: E1008 19:31:10.965496 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="365d4e67-6b18-4a2c-8547-6eda3058dcab" containerName="rabbitmq" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.965502 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="365d4e67-6b18-4a2c-8547-6eda3058dcab" containerName="rabbitmq" Oct 08 19:31:10 crc kubenswrapper[4750]: E1008 19:31:10.965519 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f0dd8be-3bb8-49fd-9369-531d32b127ee" containerName="dnsmasq-dns" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.965526 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f0dd8be-3bb8-49fd-9369-531d32b127ee" containerName="dnsmasq-dns" Oct 08 19:31:10 crc kubenswrapper[4750]: E1008 19:31:10.965568 4750 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="365d4e67-6b18-4a2c-8547-6eda3058dcab" containerName="setup-container" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.965575 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="365d4e67-6b18-4a2c-8547-6eda3058dcab" containerName="setup-container" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.965728 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f0dd8be-3bb8-49fd-9369-531d32b127ee" containerName="dnsmasq-dns" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.965744 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="365d4e67-6b18-4a2c-8547-6eda3058dcab" containerName="rabbitmq" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.966737 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.970580 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.970869 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-k4pqw" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.970990 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.971124 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.971315 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.974726 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67d9f7fb89-cwjtl"] Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.985744 4750 scope.go:117] "RemoveContainer" 
containerID="b9ee3e82e7e3c0d9b4ee8365fe6176b5fbdebf027e57d999659e0273b378280a" Oct 08 19:31:10 crc kubenswrapper[4750]: I1008 19:31:10.991559 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67d9f7fb89-cwjtl"] Oct 08 19:31:11 crc kubenswrapper[4750]: I1008 19:31:11.002325 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 19:31:11 crc kubenswrapper[4750]: I1008 19:31:11.032529 4750 scope.go:117] "RemoveContainer" containerID="0e4b59236424900e84370a42f23c23f42468cb607078c329f694512529598dea" Oct 08 19:31:11 crc kubenswrapper[4750]: I1008 19:31:11.053578 4750 scope.go:117] "RemoveContainer" containerID="b9ee3e82e7e3c0d9b4ee8365fe6176b5fbdebf027e57d999659e0273b378280a" Oct 08 19:31:11 crc kubenswrapper[4750]: E1008 19:31:11.054103 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9ee3e82e7e3c0d9b4ee8365fe6176b5fbdebf027e57d999659e0273b378280a\": container with ID starting with b9ee3e82e7e3c0d9b4ee8365fe6176b5fbdebf027e57d999659e0273b378280a not found: ID does not exist" containerID="b9ee3e82e7e3c0d9b4ee8365fe6176b5fbdebf027e57d999659e0273b378280a" Oct 08 19:31:11 crc kubenswrapper[4750]: I1008 19:31:11.054204 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9ee3e82e7e3c0d9b4ee8365fe6176b5fbdebf027e57d999659e0273b378280a"} err="failed to get container status \"b9ee3e82e7e3c0d9b4ee8365fe6176b5fbdebf027e57d999659e0273b378280a\": rpc error: code = NotFound desc = could not find container \"b9ee3e82e7e3c0d9b4ee8365fe6176b5fbdebf027e57d999659e0273b378280a\": container with ID starting with b9ee3e82e7e3c0d9b4ee8365fe6176b5fbdebf027e57d999659e0273b378280a not found: ID does not exist" Oct 08 19:31:11 crc kubenswrapper[4750]: I1008 19:31:11.054261 4750 scope.go:117] "RemoveContainer" containerID="0e4b59236424900e84370a42f23c23f42468cb607078c329f694512529598dea" 
Oct 08 19:31:11 crc kubenswrapper[4750]: E1008 19:31:11.054672 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e4b59236424900e84370a42f23c23f42468cb607078c329f694512529598dea\": container with ID starting with 0e4b59236424900e84370a42f23c23f42468cb607078c329f694512529598dea not found: ID does not exist" containerID="0e4b59236424900e84370a42f23c23f42468cb607078c329f694512529598dea" Oct 08 19:31:11 crc kubenswrapper[4750]: I1008 19:31:11.054727 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e4b59236424900e84370a42f23c23f42468cb607078c329f694512529598dea"} err="failed to get container status \"0e4b59236424900e84370a42f23c23f42468cb607078c329f694512529598dea\": rpc error: code = NotFound desc = could not find container \"0e4b59236424900e84370a42f23c23f42468cb607078c329f694512529598dea\": container with ID starting with 0e4b59236424900e84370a42f23c23f42468cb607078c329f694512529598dea not found: ID does not exist" Oct 08 19:31:11 crc kubenswrapper[4750]: I1008 19:31:11.072275 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3580a244-3fbc-4341-b898-69cd465a21a3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3580a244-3fbc-4341-b898-69cd465a21a3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:31:11 crc kubenswrapper[4750]: I1008 19:31:11.072336 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3580a244-3fbc-4341-b898-69cd465a21a3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3580a244-3fbc-4341-b898-69cd465a21a3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:31:11 crc kubenswrapper[4750]: I1008 19:31:11.072373 4750 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3580a244-3fbc-4341-b898-69cd465a21a3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3580a244-3fbc-4341-b898-69cd465a21a3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:31:11 crc kubenswrapper[4750]: I1008 19:31:11.072396 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qmsk\" (UniqueName: \"kubernetes.io/projected/3580a244-3fbc-4341-b898-69cd465a21a3-kube-api-access-8qmsk\") pod \"rabbitmq-cell1-server-0\" (UID: \"3580a244-3fbc-4341-b898-69cd465a21a3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:31:11 crc kubenswrapper[4750]: I1008 19:31:11.072441 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3580a244-3fbc-4341-b898-69cd465a21a3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3580a244-3fbc-4341-b898-69cd465a21a3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:31:11 crc kubenswrapper[4750]: I1008 19:31:11.072472 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3580a244-3fbc-4341-b898-69cd465a21a3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3580a244-3fbc-4341-b898-69cd465a21a3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:31:11 crc kubenswrapper[4750]: I1008 19:31:11.072501 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3580a244-3fbc-4341-b898-69cd465a21a3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3580a244-3fbc-4341-b898-69cd465a21a3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:31:11 crc kubenswrapper[4750]: I1008 19:31:11.072531 4750 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3580a244-3fbc-4341-b898-69cd465a21a3-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3580a244-3fbc-4341-b898-69cd465a21a3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:31:11 crc kubenswrapper[4750]: I1008 19:31:11.072602 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e953d341-e6e5-4c4b-b484-ddc33671479f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e953d341-e6e5-4c4b-b484-ddc33671479f\") pod \"rabbitmq-cell1-server-0\" (UID: \"3580a244-3fbc-4341-b898-69cd465a21a3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:31:11 crc kubenswrapper[4750]: I1008 19:31:11.174225 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e953d341-e6e5-4c4b-b484-ddc33671479f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e953d341-e6e5-4c4b-b484-ddc33671479f\") pod \"rabbitmq-cell1-server-0\" (UID: \"3580a244-3fbc-4341-b898-69cd465a21a3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:31:11 crc kubenswrapper[4750]: I1008 19:31:11.174287 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3580a244-3fbc-4341-b898-69cd465a21a3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3580a244-3fbc-4341-b898-69cd465a21a3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:31:11 crc kubenswrapper[4750]: I1008 19:31:11.174324 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3580a244-3fbc-4341-b898-69cd465a21a3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3580a244-3fbc-4341-b898-69cd465a21a3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:31:11 crc kubenswrapper[4750]: 
I1008 19:31:11.174366 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3580a244-3fbc-4341-b898-69cd465a21a3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3580a244-3fbc-4341-b898-69cd465a21a3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:31:11 crc kubenswrapper[4750]: I1008 19:31:11.174391 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qmsk\" (UniqueName: \"kubernetes.io/projected/3580a244-3fbc-4341-b898-69cd465a21a3-kube-api-access-8qmsk\") pod \"rabbitmq-cell1-server-0\" (UID: \"3580a244-3fbc-4341-b898-69cd465a21a3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:31:11 crc kubenswrapper[4750]: I1008 19:31:11.174438 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3580a244-3fbc-4341-b898-69cd465a21a3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3580a244-3fbc-4341-b898-69cd465a21a3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:31:11 crc kubenswrapper[4750]: I1008 19:31:11.174469 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3580a244-3fbc-4341-b898-69cd465a21a3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3580a244-3fbc-4341-b898-69cd465a21a3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:31:11 crc kubenswrapper[4750]: I1008 19:31:11.174505 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3580a244-3fbc-4341-b898-69cd465a21a3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3580a244-3fbc-4341-b898-69cd465a21a3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:31:11 crc kubenswrapper[4750]: I1008 19:31:11.174543 4750 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3580a244-3fbc-4341-b898-69cd465a21a3-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3580a244-3fbc-4341-b898-69cd465a21a3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:31:11 crc kubenswrapper[4750]: I1008 19:31:11.175200 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3580a244-3fbc-4341-b898-69cd465a21a3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3580a244-3fbc-4341-b898-69cd465a21a3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:31:11 crc kubenswrapper[4750]: I1008 19:31:11.176378 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3580a244-3fbc-4341-b898-69cd465a21a3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3580a244-3fbc-4341-b898-69cd465a21a3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:31:11 crc kubenswrapper[4750]: I1008 19:31:11.176709 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3580a244-3fbc-4341-b898-69cd465a21a3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3580a244-3fbc-4341-b898-69cd465a21a3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:31:11 crc kubenswrapper[4750]: I1008 19:31:11.176727 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3580a244-3fbc-4341-b898-69cd465a21a3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3580a244-3fbc-4341-b898-69cd465a21a3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:31:11 crc kubenswrapper[4750]: I1008 19:31:11.178956 4750 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 08 19:31:11 crc kubenswrapper[4750]: I1008 19:31:11.179036 4750 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e953d341-e6e5-4c4b-b484-ddc33671479f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e953d341-e6e5-4c4b-b484-ddc33671479f\") pod \"rabbitmq-cell1-server-0\" (UID: \"3580a244-3fbc-4341-b898-69cd465a21a3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9e2ad084b98242fe6b123829a7d0147f7766d5d7d1467ca61afb3d9648652ac0/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:31:11 crc kubenswrapper[4750]: I1008 19:31:11.180687 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 19:31:11 crc kubenswrapper[4750]: I1008 19:31:11.181017 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3580a244-3fbc-4341-b898-69cd465a21a3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3580a244-3fbc-4341-b898-69cd465a21a3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:31:11 crc kubenswrapper[4750]: I1008 19:31:11.181241 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3580a244-3fbc-4341-b898-69cd465a21a3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3580a244-3fbc-4341-b898-69cd465a21a3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:31:11 crc kubenswrapper[4750]: I1008 19:31:11.182357 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3580a244-3fbc-4341-b898-69cd465a21a3-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3580a244-3fbc-4341-b898-69cd465a21a3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:31:11 crc kubenswrapper[4750]: I1008 19:31:11.205861 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8qmsk\" (UniqueName: \"kubernetes.io/projected/3580a244-3fbc-4341-b898-69cd465a21a3-kube-api-access-8qmsk\") pod \"rabbitmq-cell1-server-0\" (UID: \"3580a244-3fbc-4341-b898-69cd465a21a3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:31:11 crc kubenswrapper[4750]: I1008 19:31:11.227881 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e953d341-e6e5-4c4b-b484-ddc33671479f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e953d341-e6e5-4c4b-b484-ddc33671479f\") pod \"rabbitmq-cell1-server-0\" (UID: \"3580a244-3fbc-4341-b898-69cd465a21a3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:31:11 crc kubenswrapper[4750]: I1008 19:31:11.301955 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:31:11 crc kubenswrapper[4750]: I1008 19:31:11.576800 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 19:31:11 crc kubenswrapper[4750]: I1008 19:31:11.921263 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3580a244-3fbc-4341-b898-69cd465a21a3","Type":"ContainerStarted","Data":"006e11f3aaa364f27f5cc11c78c83c2765178d4d534e86ca645b9f4deab362ae"} Oct 08 19:31:11 crc kubenswrapper[4750]: I1008 19:31:11.926225 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6d34b3ab-fa13-494f-a40c-552c9e3e305b","Type":"ContainerStarted","Data":"9748b2a4e4cf27a020bc67f587b8689b965bdf40b00fba3e540b7d51363ff59d"} Oct 08 19:31:12 crc kubenswrapper[4750]: I1008 19:31:12.752081 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="365d4e67-6b18-4a2c-8547-6eda3058dcab" path="/var/lib/kubelet/pods/365d4e67-6b18-4a2c-8547-6eda3058dcab/volumes" Oct 08 19:31:12 crc kubenswrapper[4750]: I1008 19:31:12.753899 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5f0dd8be-3bb8-49fd-9369-531d32b127ee" path="/var/lib/kubelet/pods/5f0dd8be-3bb8-49fd-9369-531d32b127ee/volumes" Oct 08 19:31:12 crc kubenswrapper[4750]: I1008 19:31:12.944866 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6d34b3ab-fa13-494f-a40c-552c9e3e305b","Type":"ContainerStarted","Data":"e97abf7545269da89dbe8fbdb17809a28e0eb799bef154dcf77d566f829a95d8"} Oct 08 19:31:13 crc kubenswrapper[4750]: I1008 19:31:13.954050 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3580a244-3fbc-4341-b898-69cd465a21a3","Type":"ContainerStarted","Data":"707a18acef485dd6ffab86409d815d90a8c7e1996430d66cf055762f30f7f271"} Oct 08 19:31:29 crc kubenswrapper[4750]: I1008 19:31:29.707102 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 19:31:29 crc kubenswrapper[4750]: I1008 19:31:29.707968 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 19:31:46 crc kubenswrapper[4750]: I1008 19:31:46.292638 4750 generic.go:334] "Generic (PLEG): container finished" podID="3580a244-3fbc-4341-b898-69cd465a21a3" containerID="707a18acef485dd6ffab86409d815d90a8c7e1996430d66cf055762f30f7f271" exitCode=0 Oct 08 19:31:46 crc kubenswrapper[4750]: I1008 19:31:46.292733 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"3580a244-3fbc-4341-b898-69cd465a21a3","Type":"ContainerDied","Data":"707a18acef485dd6ffab86409d815d90a8c7e1996430d66cf055762f30f7f271"} Oct 08 19:31:46 crc kubenswrapper[4750]: I1008 19:31:46.295984 4750 generic.go:334] "Generic (PLEG): container finished" podID="6d34b3ab-fa13-494f-a40c-552c9e3e305b" containerID="e97abf7545269da89dbe8fbdb17809a28e0eb799bef154dcf77d566f829a95d8" exitCode=0 Oct 08 19:31:46 crc kubenswrapper[4750]: I1008 19:31:46.296016 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6d34b3ab-fa13-494f-a40c-552c9e3e305b","Type":"ContainerDied","Data":"e97abf7545269da89dbe8fbdb17809a28e0eb799bef154dcf77d566f829a95d8"} Oct 08 19:31:47 crc kubenswrapper[4750]: I1008 19:31:47.306897 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6d34b3ab-fa13-494f-a40c-552c9e3e305b","Type":"ContainerStarted","Data":"94a84939da8f90a4a6490cce3b3187e63b46dd824fef665a3d8917aa4c162e23"} Oct 08 19:31:47 crc kubenswrapper[4750]: I1008 19:31:47.308030 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 08 19:31:47 crc kubenswrapper[4750]: I1008 19:31:47.309486 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3580a244-3fbc-4341-b898-69cd465a21a3","Type":"ContainerStarted","Data":"8e46af1b6c7602fd8ef08985e62e2ba3865fa479d2cc6d4a95098e8eaf25d1b2"} Oct 08 19:31:47 crc kubenswrapper[4750]: I1008 19:31:47.309781 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:31:47 crc kubenswrapper[4750]: I1008 19:31:47.339781 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.339757327 podStartE2EDuration="37.339757327s" podCreationTimestamp="2025-10-08 19:31:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:31:47.330435873 +0000 UTC m=+4863.243406886" watchObservedRunningTime="2025-10-08 19:31:47.339757327 +0000 UTC m=+4863.252728340" Oct 08 19:31:47 crc kubenswrapper[4750]: I1008 19:31:47.360708 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.360678632 podStartE2EDuration="37.360678632s" podCreationTimestamp="2025-10-08 19:31:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:31:47.353273826 +0000 UTC m=+4863.266244849" watchObservedRunningTime="2025-10-08 19:31:47.360678632 +0000 UTC m=+4863.273649645" Oct 08 19:31:59 crc kubenswrapper[4750]: I1008 19:31:59.707096 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 19:31:59 crc kubenswrapper[4750]: I1008 19:31:59.708281 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 19:32:00 crc kubenswrapper[4750]: I1008 19:32:00.720053 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 08 19:32:01 crc kubenswrapper[4750]: I1008 19:32:01.304759 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 08 19:32:12 crc kubenswrapper[4750]: I1008 19:32:12.790038 4750 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/mariadb-client-1-default"] Oct 08 19:32:12 crc kubenswrapper[4750]: I1008 19:32:12.792167 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Oct 08 19:32:12 crc kubenswrapper[4750]: I1008 19:32:12.795632 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-5p88g" Oct 08 19:32:12 crc kubenswrapper[4750]: I1008 19:32:12.802779 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Oct 08 19:32:12 crc kubenswrapper[4750]: I1008 19:32:12.904231 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thkk6\" (UniqueName: \"kubernetes.io/projected/4fd6c9ac-fad3-46a2-9494-a1f24467f22d-kube-api-access-thkk6\") pod \"mariadb-client-1-default\" (UID: \"4fd6c9ac-fad3-46a2-9494-a1f24467f22d\") " pod="openstack/mariadb-client-1-default" Oct 08 19:32:13 crc kubenswrapper[4750]: I1008 19:32:13.006854 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thkk6\" (UniqueName: \"kubernetes.io/projected/4fd6c9ac-fad3-46a2-9494-a1f24467f22d-kube-api-access-thkk6\") pod \"mariadb-client-1-default\" (UID: \"4fd6c9ac-fad3-46a2-9494-a1f24467f22d\") " pod="openstack/mariadb-client-1-default" Oct 08 19:32:13 crc kubenswrapper[4750]: I1008 19:32:13.113281 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thkk6\" (UniqueName: \"kubernetes.io/projected/4fd6c9ac-fad3-46a2-9494-a1f24467f22d-kube-api-access-thkk6\") pod \"mariadb-client-1-default\" (UID: \"4fd6c9ac-fad3-46a2-9494-a1f24467f22d\") " pod="openstack/mariadb-client-1-default" Oct 08 19:32:13 crc kubenswrapper[4750]: I1008 19:32:13.121371 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-default" Oct 08 19:32:13 crc kubenswrapper[4750]: I1008 19:32:13.760087 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Oct 08 19:32:13 crc kubenswrapper[4750]: I1008 19:32:13.762109 4750 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 19:32:14 crc kubenswrapper[4750]: I1008 19:32:14.562057 4750 generic.go:334] "Generic (PLEG): container finished" podID="4fd6c9ac-fad3-46a2-9494-a1f24467f22d" containerID="1abc6208ea85b1a98af8b03ed8c49e3e7f2dd09c3b0768fe75f381dd3bf88c93" exitCode=0 Oct 08 19:32:14 crc kubenswrapper[4750]: I1008 19:32:14.562142 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"4fd6c9ac-fad3-46a2-9494-a1f24467f22d","Type":"ContainerDied","Data":"1abc6208ea85b1a98af8b03ed8c49e3e7f2dd09c3b0768fe75f381dd3bf88c93"} Oct 08 19:32:14 crc kubenswrapper[4750]: I1008 19:32:14.562842 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"4fd6c9ac-fad3-46a2-9494-a1f24467f22d","Type":"ContainerStarted","Data":"a70550c2da771b745c68742eecb886771e20f610c6470d0df6815a6aa6ce5cff"} Oct 08 19:32:16 crc kubenswrapper[4750]: I1008 19:32:16.029415 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-default" Oct 08 19:32:16 crc kubenswrapper[4750]: I1008 19:32:16.063272 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1-default_4fd6c9ac-fad3-46a2-9494-a1f24467f22d/mariadb-client-1-default/0.log" Oct 08 19:32:16 crc kubenswrapper[4750]: I1008 19:32:16.102255 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1-default"] Oct 08 19:32:16 crc kubenswrapper[4750]: I1008 19:32:16.109670 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1-default"] Oct 08 19:32:16 crc kubenswrapper[4750]: I1008 19:32:16.171078 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thkk6\" (UniqueName: \"kubernetes.io/projected/4fd6c9ac-fad3-46a2-9494-a1f24467f22d-kube-api-access-thkk6\") pod \"4fd6c9ac-fad3-46a2-9494-a1f24467f22d\" (UID: \"4fd6c9ac-fad3-46a2-9494-a1f24467f22d\") " Oct 08 19:32:16 crc kubenswrapper[4750]: I1008 19:32:16.183400 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fd6c9ac-fad3-46a2-9494-a1f24467f22d-kube-api-access-thkk6" (OuterVolumeSpecName: "kube-api-access-thkk6") pod "4fd6c9ac-fad3-46a2-9494-a1f24467f22d" (UID: "4fd6c9ac-fad3-46a2-9494-a1f24467f22d"). InnerVolumeSpecName "kube-api-access-thkk6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:32:16 crc kubenswrapper[4750]: I1008 19:32:16.272884 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thkk6\" (UniqueName: \"kubernetes.io/projected/4fd6c9ac-fad3-46a2-9494-a1f24467f22d-kube-api-access-thkk6\") on node \"crc\" DevicePath \"\"" Oct 08 19:32:16 crc kubenswrapper[4750]: I1008 19:32:16.582936 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a70550c2da771b745c68742eecb886771e20f610c6470d0df6815a6aa6ce5cff" Oct 08 19:32:16 crc kubenswrapper[4750]: I1008 19:32:16.583034 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Oct 08 19:32:16 crc kubenswrapper[4750]: I1008 19:32:16.607833 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2-default"] Oct 08 19:32:16 crc kubenswrapper[4750]: E1008 19:32:16.608247 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fd6c9ac-fad3-46a2-9494-a1f24467f22d" containerName="mariadb-client-1-default" Oct 08 19:32:16 crc kubenswrapper[4750]: I1008 19:32:16.608274 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fd6c9ac-fad3-46a2-9494-a1f24467f22d" containerName="mariadb-client-1-default" Oct 08 19:32:16 crc kubenswrapper[4750]: I1008 19:32:16.608531 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fd6c9ac-fad3-46a2-9494-a1f24467f22d" containerName="mariadb-client-1-default" Oct 08 19:32:16 crc kubenswrapper[4750]: I1008 19:32:16.609298 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Oct 08 19:32:16 crc kubenswrapper[4750]: I1008 19:32:16.613230 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-5p88g" Oct 08 19:32:16 crc kubenswrapper[4750]: I1008 19:32:16.618493 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Oct 08 19:32:16 crc kubenswrapper[4750]: I1008 19:32:16.748787 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fd6c9ac-fad3-46a2-9494-a1f24467f22d" path="/var/lib/kubelet/pods/4fd6c9ac-fad3-46a2-9494-a1f24467f22d/volumes" Oct 08 19:32:16 crc kubenswrapper[4750]: I1008 19:32:16.782163 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thp96\" (UniqueName: \"kubernetes.io/projected/6d8fb18f-204e-4dc4-b1b7-ee993a60fbfd-kube-api-access-thp96\") pod \"mariadb-client-2-default\" (UID: \"6d8fb18f-204e-4dc4-b1b7-ee993a60fbfd\") " pod="openstack/mariadb-client-2-default" Oct 08 19:32:16 crc kubenswrapper[4750]: I1008 19:32:16.884773 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thp96\" (UniqueName: \"kubernetes.io/projected/6d8fb18f-204e-4dc4-b1b7-ee993a60fbfd-kube-api-access-thp96\") pod \"mariadb-client-2-default\" (UID: \"6d8fb18f-204e-4dc4-b1b7-ee993a60fbfd\") " pod="openstack/mariadb-client-2-default" Oct 08 19:32:16 crc kubenswrapper[4750]: I1008 19:32:16.903470 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thp96\" (UniqueName: \"kubernetes.io/projected/6d8fb18f-204e-4dc4-b1b7-ee993a60fbfd-kube-api-access-thp96\") pod \"mariadb-client-2-default\" (UID: \"6d8fb18f-204e-4dc4-b1b7-ee993a60fbfd\") " pod="openstack/mariadb-client-2-default" Oct 08 19:32:16 crc kubenswrapper[4750]: I1008 19:32:16.926666 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Oct 08 19:32:17 crc kubenswrapper[4750]: I1008 19:32:17.486240 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Oct 08 19:32:17 crc kubenswrapper[4750]: W1008 19:32:17.498772 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d8fb18f_204e_4dc4_b1b7_ee993a60fbfd.slice/crio-0ddbcbc70b0ae525dab9a6cbd6b6a58040afb1ac500ad36eabe07949999e1987 WatchSource:0}: Error finding container 0ddbcbc70b0ae525dab9a6cbd6b6a58040afb1ac500ad36eabe07949999e1987: Status 404 returned error can't find the container with id 0ddbcbc70b0ae525dab9a6cbd6b6a58040afb1ac500ad36eabe07949999e1987 Oct 08 19:32:17 crc kubenswrapper[4750]: I1008 19:32:17.606104 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"6d8fb18f-204e-4dc4-b1b7-ee993a60fbfd","Type":"ContainerStarted","Data":"0ddbcbc70b0ae525dab9a6cbd6b6a58040afb1ac500ad36eabe07949999e1987"} Oct 08 19:32:18 crc kubenswrapper[4750]: I1008 19:32:18.638001 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"6d8fb18f-204e-4dc4-b1b7-ee993a60fbfd","Type":"ContainerStarted","Data":"456ebeefa91a765a33447733bee565b1f5a98f5688cb0bba3ae1755f1dc58368"} Oct 08 19:32:18 crc kubenswrapper[4750]: I1008 19:32:18.663515 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-2-default" podStartSLOduration=2.6634875239999998 podStartE2EDuration="2.663487524s" podCreationTimestamp="2025-10-08 19:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:32:18.660193011 +0000 UTC m=+4894.573164034" watchObservedRunningTime="2025-10-08 19:32:18.663487524 +0000 UTC m=+4894.576458537" Oct 08 19:32:19 crc 
kubenswrapper[4750]: I1008 19:32:19.654662 4750 generic.go:334] "Generic (PLEG): container finished" podID="6d8fb18f-204e-4dc4-b1b7-ee993a60fbfd" containerID="456ebeefa91a765a33447733bee565b1f5a98f5688cb0bba3ae1755f1dc58368" exitCode=0 Oct 08 19:32:19 crc kubenswrapper[4750]: I1008 19:32:19.654750 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"6d8fb18f-204e-4dc4-b1b7-ee993a60fbfd","Type":"ContainerDied","Data":"456ebeefa91a765a33447733bee565b1f5a98f5688cb0bba3ae1755f1dc58368"} Oct 08 19:32:21 crc kubenswrapper[4750]: I1008 19:32:21.061067 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default" Oct 08 19:32:21 crc kubenswrapper[4750]: I1008 19:32:21.110055 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2-default"] Oct 08 19:32:21 crc kubenswrapper[4750]: I1008 19:32:21.117689 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2-default"] Oct 08 19:32:21 crc kubenswrapper[4750]: I1008 19:32:21.162303 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thp96\" (UniqueName: \"kubernetes.io/projected/6d8fb18f-204e-4dc4-b1b7-ee993a60fbfd-kube-api-access-thp96\") pod \"6d8fb18f-204e-4dc4-b1b7-ee993a60fbfd\" (UID: \"6d8fb18f-204e-4dc4-b1b7-ee993a60fbfd\") " Oct 08 19:32:21 crc kubenswrapper[4750]: I1008 19:32:21.169707 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d8fb18f-204e-4dc4-b1b7-ee993a60fbfd-kube-api-access-thp96" (OuterVolumeSpecName: "kube-api-access-thp96") pod "6d8fb18f-204e-4dc4-b1b7-ee993a60fbfd" (UID: "6d8fb18f-204e-4dc4-b1b7-ee993a60fbfd"). InnerVolumeSpecName "kube-api-access-thp96". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:32:21 crc kubenswrapper[4750]: I1008 19:32:21.265472 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thp96\" (UniqueName: \"kubernetes.io/projected/6d8fb18f-204e-4dc4-b1b7-ee993a60fbfd-kube-api-access-thp96\") on node \"crc\" DevicePath \"\"" Oct 08 19:32:21 crc kubenswrapper[4750]: I1008 19:32:21.522814 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1"] Oct 08 19:32:21 crc kubenswrapper[4750]: E1008 19:32:21.523414 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d8fb18f-204e-4dc4-b1b7-ee993a60fbfd" containerName="mariadb-client-2-default" Oct 08 19:32:21 crc kubenswrapper[4750]: I1008 19:32:21.523441 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d8fb18f-204e-4dc4-b1b7-ee993a60fbfd" containerName="mariadb-client-2-default" Oct 08 19:32:21 crc kubenswrapper[4750]: I1008 19:32:21.523705 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d8fb18f-204e-4dc4-b1b7-ee993a60fbfd" containerName="mariadb-client-2-default" Oct 08 19:32:21 crc kubenswrapper[4750]: I1008 19:32:21.525219 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Oct 08 19:32:21 crc kubenswrapper[4750]: I1008 19:32:21.536434 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Oct 08 19:32:21 crc kubenswrapper[4750]: I1008 19:32:21.671415 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvfcn\" (UniqueName: \"kubernetes.io/projected/fd7bcb8c-0d88-45a3-8118-6c1cf36023a9-kube-api-access-fvfcn\") pod \"mariadb-client-1\" (UID: \"fd7bcb8c-0d88-45a3-8118-6c1cf36023a9\") " pod="openstack/mariadb-client-1" Oct 08 19:32:21 crc kubenswrapper[4750]: I1008 19:32:21.675366 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ddbcbc70b0ae525dab9a6cbd6b6a58040afb1ac500ad36eabe07949999e1987" Oct 08 19:32:21 crc kubenswrapper[4750]: I1008 19:32:21.675472 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default" Oct 08 19:32:21 crc kubenswrapper[4750]: I1008 19:32:21.774065 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvfcn\" (UniqueName: \"kubernetes.io/projected/fd7bcb8c-0d88-45a3-8118-6c1cf36023a9-kube-api-access-fvfcn\") pod \"mariadb-client-1\" (UID: \"fd7bcb8c-0d88-45a3-8118-6c1cf36023a9\") " pod="openstack/mariadb-client-1" Oct 08 19:32:21 crc kubenswrapper[4750]: I1008 19:32:21.797893 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvfcn\" (UniqueName: \"kubernetes.io/projected/fd7bcb8c-0d88-45a3-8118-6c1cf36023a9-kube-api-access-fvfcn\") pod \"mariadb-client-1\" (UID: \"fd7bcb8c-0d88-45a3-8118-6c1cf36023a9\") " pod="openstack/mariadb-client-1" Oct 08 19:32:21 crc kubenswrapper[4750]: I1008 19:32:21.851774 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Oct 08 19:32:22 crc kubenswrapper[4750]: I1008 19:32:22.402531 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Oct 08 19:32:22 crc kubenswrapper[4750]: W1008 19:32:22.413324 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd7bcb8c_0d88_45a3_8118_6c1cf36023a9.slice/crio-5559b9e1a086001464c08646c7333a8ce18fc40376017dac628d5f5bc9d03507 WatchSource:0}: Error finding container 5559b9e1a086001464c08646c7333a8ce18fc40376017dac628d5f5bc9d03507: Status 404 returned error can't find the container with id 5559b9e1a086001464c08646c7333a8ce18fc40376017dac628d5f5bc9d03507 Oct 08 19:32:22 crc kubenswrapper[4750]: I1008 19:32:22.688426 4750 generic.go:334] "Generic (PLEG): container finished" podID="fd7bcb8c-0d88-45a3-8118-6c1cf36023a9" containerID="6884c875b92de39684d14bb3858d43508ee4e28c09d770a4824ee11d381bed5b" exitCode=0 Oct 08 19:32:22 crc kubenswrapper[4750]: I1008 19:32:22.688519 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"fd7bcb8c-0d88-45a3-8118-6c1cf36023a9","Type":"ContainerDied","Data":"6884c875b92de39684d14bb3858d43508ee4e28c09d770a4824ee11d381bed5b"} Oct 08 19:32:22 crc kubenswrapper[4750]: I1008 19:32:22.688748 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"fd7bcb8c-0d88-45a3-8118-6c1cf36023a9","Type":"ContainerStarted","Data":"5559b9e1a086001464c08646c7333a8ce18fc40376017dac628d5f5bc9d03507"} Oct 08 19:32:22 crc kubenswrapper[4750]: I1008 19:32:22.750121 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d8fb18f-204e-4dc4-b1b7-ee993a60fbfd" path="/var/lib/kubelet/pods/6d8fb18f-204e-4dc4-b1b7-ee993a60fbfd/volumes" Oct 08 19:32:24 crc kubenswrapper[4750]: I1008 19:32:24.485970 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Oct 08 19:32:24 crc kubenswrapper[4750]: I1008 19:32:24.506633 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1_fd7bcb8c-0d88-45a3-8118-6c1cf36023a9/mariadb-client-1/0.log" Oct 08 19:32:24 crc kubenswrapper[4750]: I1008 19:32:24.570682 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1"] Oct 08 19:32:24 crc kubenswrapper[4750]: I1008 19:32:24.577987 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1"] Oct 08 19:32:24 crc kubenswrapper[4750]: I1008 19:32:24.650333 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvfcn\" (UniqueName: \"kubernetes.io/projected/fd7bcb8c-0d88-45a3-8118-6c1cf36023a9-kube-api-access-fvfcn\") pod \"fd7bcb8c-0d88-45a3-8118-6c1cf36023a9\" (UID: \"fd7bcb8c-0d88-45a3-8118-6c1cf36023a9\") " Oct 08 19:32:24 crc kubenswrapper[4750]: I1008 19:32:24.660226 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd7bcb8c-0d88-45a3-8118-6c1cf36023a9-kube-api-access-fvfcn" (OuterVolumeSpecName: "kube-api-access-fvfcn") pod "fd7bcb8c-0d88-45a3-8118-6c1cf36023a9" (UID: "fd7bcb8c-0d88-45a3-8118-6c1cf36023a9"). InnerVolumeSpecName "kube-api-access-fvfcn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:32:24 crc kubenswrapper[4750]: I1008 19:32:24.708651 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5559b9e1a086001464c08646c7333a8ce18fc40376017dac628d5f5bc9d03507" Oct 08 19:32:24 crc kubenswrapper[4750]: I1008 19:32:24.708708 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Oct 08 19:32:24 crc kubenswrapper[4750]: I1008 19:32:24.749670 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd7bcb8c-0d88-45a3-8118-6c1cf36023a9" path="/var/lib/kubelet/pods/fd7bcb8c-0d88-45a3-8118-6c1cf36023a9/volumes" Oct 08 19:32:24 crc kubenswrapper[4750]: I1008 19:32:24.752741 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvfcn\" (UniqueName: \"kubernetes.io/projected/fd7bcb8c-0d88-45a3-8118-6c1cf36023a9-kube-api-access-fvfcn\") on node \"crc\" DevicePath \"\"" Oct 08 19:32:25 crc kubenswrapper[4750]: I1008 19:32:25.038442 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-4-default"] Oct 08 19:32:25 crc kubenswrapper[4750]: E1008 19:32:25.039062 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd7bcb8c-0d88-45a3-8118-6c1cf36023a9" containerName="mariadb-client-1" Oct 08 19:32:25 crc kubenswrapper[4750]: I1008 19:32:25.039098 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd7bcb8c-0d88-45a3-8118-6c1cf36023a9" containerName="mariadb-client-1" Oct 08 19:32:25 crc kubenswrapper[4750]: I1008 19:32:25.039400 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd7bcb8c-0d88-45a3-8118-6c1cf36023a9" containerName="mariadb-client-1" Oct 08 19:32:25 crc kubenswrapper[4750]: I1008 19:32:25.040432 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Oct 08 19:32:25 crc kubenswrapper[4750]: I1008 19:32:25.045666 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-5p88g" Oct 08 19:32:25 crc kubenswrapper[4750]: I1008 19:32:25.051319 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Oct 08 19:32:25 crc kubenswrapper[4750]: I1008 19:32:25.160658 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8dk4\" (UniqueName: \"kubernetes.io/projected/42e2b8c1-00cb-4af0-bc85-2d797c121777-kube-api-access-p8dk4\") pod \"mariadb-client-4-default\" (UID: \"42e2b8c1-00cb-4af0-bc85-2d797c121777\") " pod="openstack/mariadb-client-4-default" Oct 08 19:32:25 crc kubenswrapper[4750]: I1008 19:32:25.262983 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8dk4\" (UniqueName: \"kubernetes.io/projected/42e2b8c1-00cb-4af0-bc85-2d797c121777-kube-api-access-p8dk4\") pod \"mariadb-client-4-default\" (UID: \"42e2b8c1-00cb-4af0-bc85-2d797c121777\") " pod="openstack/mariadb-client-4-default" Oct 08 19:32:25 crc kubenswrapper[4750]: I1008 19:32:25.292687 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8dk4\" (UniqueName: \"kubernetes.io/projected/42e2b8c1-00cb-4af0-bc85-2d797c121777-kube-api-access-p8dk4\") pod \"mariadb-client-4-default\" (UID: \"42e2b8c1-00cb-4af0-bc85-2d797c121777\") " pod="openstack/mariadb-client-4-default" Oct 08 19:32:25 crc kubenswrapper[4750]: I1008 19:32:25.376417 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Oct 08 19:32:25 crc kubenswrapper[4750]: I1008 19:32:25.930161 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Oct 08 19:32:26 crc kubenswrapper[4750]: W1008 19:32:26.214074 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42e2b8c1_00cb_4af0_bc85_2d797c121777.slice/crio-db68536a7e2e3862bc3cafb1dbc3ac36458ac3217b1444acab962897c45f9c4b WatchSource:0}: Error finding container db68536a7e2e3862bc3cafb1dbc3ac36458ac3217b1444acab962897c45f9c4b: Status 404 returned error can't find the container with id db68536a7e2e3862bc3cafb1dbc3ac36458ac3217b1444acab962897c45f9c4b Oct 08 19:32:26 crc kubenswrapper[4750]: I1008 19:32:26.727988 4750 generic.go:334] "Generic (PLEG): container finished" podID="42e2b8c1-00cb-4af0-bc85-2d797c121777" containerID="8ca3d0e5368280c47c33a5534b92001219cee415b61f74eb1b60b1b5b6f919a2" exitCode=0 Oct 08 19:32:26 crc kubenswrapper[4750]: I1008 19:32:26.728075 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"42e2b8c1-00cb-4af0-bc85-2d797c121777","Type":"ContainerDied","Data":"8ca3d0e5368280c47c33a5534b92001219cee415b61f74eb1b60b1b5b6f919a2"} Oct 08 19:32:26 crc kubenswrapper[4750]: I1008 19:32:26.728333 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"42e2b8c1-00cb-4af0-bc85-2d797c121777","Type":"ContainerStarted","Data":"db68536a7e2e3862bc3cafb1dbc3ac36458ac3217b1444acab962897c45f9c4b"} Oct 08 19:32:28 crc kubenswrapper[4750]: I1008 19:32:28.183105 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Oct 08 19:32:28 crc kubenswrapper[4750]: I1008 19:32:28.211207 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-4-default_42e2b8c1-00cb-4af0-bc85-2d797c121777/mariadb-client-4-default/0.log" Oct 08 19:32:28 crc kubenswrapper[4750]: I1008 19:32:28.247608 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-4-default"] Oct 08 19:32:28 crc kubenswrapper[4750]: I1008 19:32:28.255914 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-4-default"] Oct 08 19:32:28 crc kubenswrapper[4750]: I1008 19:32:28.325779 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8dk4\" (UniqueName: \"kubernetes.io/projected/42e2b8c1-00cb-4af0-bc85-2d797c121777-kube-api-access-p8dk4\") pod \"42e2b8c1-00cb-4af0-bc85-2d797c121777\" (UID: \"42e2b8c1-00cb-4af0-bc85-2d797c121777\") " Oct 08 19:32:28 crc kubenswrapper[4750]: I1008 19:32:28.333219 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42e2b8c1-00cb-4af0-bc85-2d797c121777-kube-api-access-p8dk4" (OuterVolumeSpecName: "kube-api-access-p8dk4") pod "42e2b8c1-00cb-4af0-bc85-2d797c121777" (UID: "42e2b8c1-00cb-4af0-bc85-2d797c121777"). InnerVolumeSpecName "kube-api-access-p8dk4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:32:28 crc kubenswrapper[4750]: I1008 19:32:28.427628 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8dk4\" (UniqueName: \"kubernetes.io/projected/42e2b8c1-00cb-4af0-bc85-2d797c121777-kube-api-access-p8dk4\") on node \"crc\" DevicePath \"\"" Oct 08 19:32:28 crc kubenswrapper[4750]: I1008 19:32:28.745806 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Oct 08 19:32:28 crc kubenswrapper[4750]: I1008 19:32:28.752847 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42e2b8c1-00cb-4af0-bc85-2d797c121777" path="/var/lib/kubelet/pods/42e2b8c1-00cb-4af0-bc85-2d797c121777/volumes" Oct 08 19:32:28 crc kubenswrapper[4750]: I1008 19:32:28.753584 4750 scope.go:117] "RemoveContainer" containerID="8ca3d0e5368280c47c33a5534b92001219cee415b61f74eb1b60b1b5b6f919a2" Oct 08 19:32:29 crc kubenswrapper[4750]: I1008 19:32:29.707148 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 19:32:29 crc kubenswrapper[4750]: I1008 19:32:29.707244 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 19:32:29 crc kubenswrapper[4750]: I1008 19:32:29.707310 4750 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-grddb" Oct 08 19:32:29 crc kubenswrapper[4750]: I1008 19:32:29.708170 4750 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1641173c95d4171a3b1be3458fb6da9f585cc8093883355dc63c1b1b9fca71ea"} pod="openshift-machine-config-operator/machine-config-daemon-grddb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 19:32:29 crc kubenswrapper[4750]: I1008 19:32:29.708234 4750 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" containerID="cri-o://1641173c95d4171a3b1be3458fb6da9f585cc8093883355dc63c1b1b9fca71ea" gracePeriod=600 Oct 08 19:32:30 crc kubenswrapper[4750]: I1008 19:32:30.766936 4750 generic.go:334] "Generic (PLEG): container finished" podID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerID="1641173c95d4171a3b1be3458fb6da9f585cc8093883355dc63c1b1b9fca71ea" exitCode=0 Oct 08 19:32:30 crc kubenswrapper[4750]: I1008 19:32:30.766985 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerDied","Data":"1641173c95d4171a3b1be3458fb6da9f585cc8093883355dc63c1b1b9fca71ea"} Oct 08 19:32:30 crc kubenswrapper[4750]: I1008 19:32:30.767875 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerStarted","Data":"42c2a7a6ce3026c9ad8c8eff781fbf4cba179ca631dc8f4fcdd0691647588e7a"} Oct 08 19:32:30 crc kubenswrapper[4750]: I1008 19:32:30.767909 4750 scope.go:117] "RemoveContainer" containerID="2d501006b0e8fb04661a88c708dbd92c92878af845524f77124aac3cc6ff30bc" Oct 08 19:32:32 crc kubenswrapper[4750]: I1008 19:32:32.598991 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-5-default"] Oct 08 19:32:32 crc kubenswrapper[4750]: E1008 19:32:32.599802 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42e2b8c1-00cb-4af0-bc85-2d797c121777" containerName="mariadb-client-4-default" Oct 08 19:32:32 crc kubenswrapper[4750]: I1008 19:32:32.599821 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="42e2b8c1-00cb-4af0-bc85-2d797c121777" containerName="mariadb-client-4-default" Oct 08 19:32:32 crc kubenswrapper[4750]: 
I1008 19:32:32.600073 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="42e2b8c1-00cb-4af0-bc85-2d797c121777" containerName="mariadb-client-4-default" Oct 08 19:32:32 crc kubenswrapper[4750]: I1008 19:32:32.600779 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default" Oct 08 19:32:32 crc kubenswrapper[4750]: I1008 19:32:32.602946 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-5p88g" Oct 08 19:32:32 crc kubenswrapper[4750]: I1008 19:32:32.617728 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Oct 08 19:32:32 crc kubenswrapper[4750]: I1008 19:32:32.705892 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd7s7\" (UniqueName: \"kubernetes.io/projected/a525d8ec-53b5-4b15-a804-0fc0a8cb8a99-kube-api-access-sd7s7\") pod \"mariadb-client-5-default\" (UID: \"a525d8ec-53b5-4b15-a804-0fc0a8cb8a99\") " pod="openstack/mariadb-client-5-default" Oct 08 19:32:32 crc kubenswrapper[4750]: I1008 19:32:32.807876 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd7s7\" (UniqueName: \"kubernetes.io/projected/a525d8ec-53b5-4b15-a804-0fc0a8cb8a99-kube-api-access-sd7s7\") pod \"mariadb-client-5-default\" (UID: \"a525d8ec-53b5-4b15-a804-0fc0a8cb8a99\") " pod="openstack/mariadb-client-5-default" Oct 08 19:32:32 crc kubenswrapper[4750]: I1008 19:32:32.838601 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd7s7\" (UniqueName: \"kubernetes.io/projected/a525d8ec-53b5-4b15-a804-0fc0a8cb8a99-kube-api-access-sd7s7\") pod \"mariadb-client-5-default\" (UID: \"a525d8ec-53b5-4b15-a804-0fc0a8cb8a99\") " pod="openstack/mariadb-client-5-default" Oct 08 19:32:32 crc kubenswrapper[4750]: I1008 19:32:32.937833 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Oct 08 19:32:33 crc kubenswrapper[4750]: I1008 19:32:33.465736 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Oct 08 19:32:33 crc kubenswrapper[4750]: W1008 19:32:33.467943 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda525d8ec_53b5_4b15_a804_0fc0a8cb8a99.slice/crio-4320c379890f3db3df4ff0f2bc427fcad2c60a9c8fcec64902f1bb8fe53c7464 WatchSource:0}: Error finding container 4320c379890f3db3df4ff0f2bc427fcad2c60a9c8fcec64902f1bb8fe53c7464: Status 404 returned error can't find the container with id 4320c379890f3db3df4ff0f2bc427fcad2c60a9c8fcec64902f1bb8fe53c7464 Oct 08 19:32:33 crc kubenswrapper[4750]: I1008 19:32:33.813179 4750 generic.go:334] "Generic (PLEG): container finished" podID="a525d8ec-53b5-4b15-a804-0fc0a8cb8a99" containerID="c83adb2e8c78b3f8169342cdfdd81d447670a1af31be2b1690b870d77bef4f09" exitCode=0 Oct 08 19:32:33 crc kubenswrapper[4750]: I1008 19:32:33.813267 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"a525d8ec-53b5-4b15-a804-0fc0a8cb8a99","Type":"ContainerDied","Data":"c83adb2e8c78b3f8169342cdfdd81d447670a1af31be2b1690b870d77bef4f09"} Oct 08 19:32:33 crc kubenswrapper[4750]: I1008 19:32:33.813345 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"a525d8ec-53b5-4b15-a804-0fc0a8cb8a99","Type":"ContainerStarted","Data":"4320c379890f3db3df4ff0f2bc427fcad2c60a9c8fcec64902f1bb8fe53c7464"} Oct 08 19:32:35 crc kubenswrapper[4750]: I1008 19:32:35.212495 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Oct 08 19:32:35 crc kubenswrapper[4750]: I1008 19:32:35.239427 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-5-default_a525d8ec-53b5-4b15-a804-0fc0a8cb8a99/mariadb-client-5-default/0.log" Oct 08 19:32:35 crc kubenswrapper[4750]: I1008 19:32:35.268434 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-5-default"] Oct 08 19:32:35 crc kubenswrapper[4750]: I1008 19:32:35.274747 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-5-default"] Oct 08 19:32:35 crc kubenswrapper[4750]: I1008 19:32:35.352956 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sd7s7\" (UniqueName: \"kubernetes.io/projected/a525d8ec-53b5-4b15-a804-0fc0a8cb8a99-kube-api-access-sd7s7\") pod \"a525d8ec-53b5-4b15-a804-0fc0a8cb8a99\" (UID: \"a525d8ec-53b5-4b15-a804-0fc0a8cb8a99\") " Oct 08 19:32:35 crc kubenswrapper[4750]: I1008 19:32:35.370182 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a525d8ec-53b5-4b15-a804-0fc0a8cb8a99-kube-api-access-sd7s7" (OuterVolumeSpecName: "kube-api-access-sd7s7") pod "a525d8ec-53b5-4b15-a804-0fc0a8cb8a99" (UID: "a525d8ec-53b5-4b15-a804-0fc0a8cb8a99"). InnerVolumeSpecName "kube-api-access-sd7s7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:32:35 crc kubenswrapper[4750]: I1008 19:32:35.402138 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-6-default"] Oct 08 19:32:35 crc kubenswrapper[4750]: E1008 19:32:35.402596 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a525d8ec-53b5-4b15-a804-0fc0a8cb8a99" containerName="mariadb-client-5-default" Oct 08 19:32:35 crc kubenswrapper[4750]: I1008 19:32:35.402614 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="a525d8ec-53b5-4b15-a804-0fc0a8cb8a99" containerName="mariadb-client-5-default" Oct 08 19:32:35 crc kubenswrapper[4750]: I1008 19:32:35.402793 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="a525d8ec-53b5-4b15-a804-0fc0a8cb8a99" containerName="mariadb-client-5-default" Oct 08 19:32:35 crc kubenswrapper[4750]: I1008 19:32:35.403481 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Oct 08 19:32:35 crc kubenswrapper[4750]: I1008 19:32:35.413497 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Oct 08 19:32:35 crc kubenswrapper[4750]: I1008 19:32:35.455329 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sd7s7\" (UniqueName: \"kubernetes.io/projected/a525d8ec-53b5-4b15-a804-0fc0a8cb8a99-kube-api-access-sd7s7\") on node \"crc\" DevicePath \"\"" Oct 08 19:32:35 crc kubenswrapper[4750]: I1008 19:32:35.557223 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmhnt\" (UniqueName: \"kubernetes.io/projected/fc3686d9-1f85-402c-aed1-668110c99eea-kube-api-access-vmhnt\") pod \"mariadb-client-6-default\" (UID: \"fc3686d9-1f85-402c-aed1-668110c99eea\") " pod="openstack/mariadb-client-6-default" Oct 08 19:32:35 crc kubenswrapper[4750]: I1008 19:32:35.659345 4750 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vmhnt\" (UniqueName: \"kubernetes.io/projected/fc3686d9-1f85-402c-aed1-668110c99eea-kube-api-access-vmhnt\") pod \"mariadb-client-6-default\" (UID: \"fc3686d9-1f85-402c-aed1-668110c99eea\") " pod="openstack/mariadb-client-6-default" Oct 08 19:32:35 crc kubenswrapper[4750]: I1008 19:32:35.678628 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmhnt\" (UniqueName: \"kubernetes.io/projected/fc3686d9-1f85-402c-aed1-668110c99eea-kube-api-access-vmhnt\") pod \"mariadb-client-6-default\" (UID: \"fc3686d9-1f85-402c-aed1-668110c99eea\") " pod="openstack/mariadb-client-6-default" Oct 08 19:32:35 crc kubenswrapper[4750]: I1008 19:32:35.745089 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Oct 08 19:32:35 crc kubenswrapper[4750]: I1008 19:32:35.852723 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4320c379890f3db3df4ff0f2bc427fcad2c60a9c8fcec64902f1bb8fe53c7464" Oct 08 19:32:35 crc kubenswrapper[4750]: I1008 19:32:35.853274 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Oct 08 19:32:36 crc kubenswrapper[4750]: I1008 19:32:36.264010 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Oct 08 19:32:36 crc kubenswrapper[4750]: I1008 19:32:36.760022 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a525d8ec-53b5-4b15-a804-0fc0a8cb8a99" path="/var/lib/kubelet/pods/a525d8ec-53b5-4b15-a804-0fc0a8cb8a99/volumes" Oct 08 19:32:36 crc kubenswrapper[4750]: I1008 19:32:36.874449 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"fc3686d9-1f85-402c-aed1-668110c99eea","Type":"ContainerStarted","Data":"1fd8455bed30a674374e17da2eabd2825de4b4d3e7112dd8295134025a705397"} Oct 08 19:32:36 crc kubenswrapper[4750]: I1008 19:32:36.874500 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"fc3686d9-1f85-402c-aed1-668110c99eea","Type":"ContainerStarted","Data":"dcfa730920273a1228d5b0a8b265d635904474ac6c026027f070243fe90713b9"} Oct 08 19:32:36 crc kubenswrapper[4750]: I1008 19:32:36.902419 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-6-default" podStartSLOduration=1.902390194 podStartE2EDuration="1.902390194s" podCreationTimestamp="2025-10-08 19:32:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:32:36.899466 +0000 UTC m=+4912.812437033" watchObservedRunningTime="2025-10-08 19:32:36.902390194 +0000 UTC m=+4912.815361207" Oct 08 19:32:37 crc kubenswrapper[4750]: I1008 19:32:37.884632 4750 generic.go:334] "Generic (PLEG): container finished" podID="fc3686d9-1f85-402c-aed1-668110c99eea" containerID="1fd8455bed30a674374e17da2eabd2825de4b4d3e7112dd8295134025a705397" exitCode=0 Oct 08 19:32:37 crc kubenswrapper[4750]: I1008 19:32:37.884711 4750 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"fc3686d9-1f85-402c-aed1-668110c99eea","Type":"ContainerDied","Data":"1fd8455bed30a674374e17da2eabd2825de4b4d3e7112dd8295134025a705397"} Oct 08 19:32:39 crc kubenswrapper[4750]: I1008 19:32:39.451777 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Oct 08 19:32:39 crc kubenswrapper[4750]: I1008 19:32:39.510149 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-6-default"] Oct 08 19:32:39 crc kubenswrapper[4750]: I1008 19:32:39.520427 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-6-default"] Oct 08 19:32:39 crc kubenswrapper[4750]: I1008 19:32:39.540520 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmhnt\" (UniqueName: \"kubernetes.io/projected/fc3686d9-1f85-402c-aed1-668110c99eea-kube-api-access-vmhnt\") pod \"fc3686d9-1f85-402c-aed1-668110c99eea\" (UID: \"fc3686d9-1f85-402c-aed1-668110c99eea\") " Oct 08 19:32:39 crc kubenswrapper[4750]: I1008 19:32:39.547926 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc3686d9-1f85-402c-aed1-668110c99eea-kube-api-access-vmhnt" (OuterVolumeSpecName: "kube-api-access-vmhnt") pod "fc3686d9-1f85-402c-aed1-668110c99eea" (UID: "fc3686d9-1f85-402c-aed1-668110c99eea"). InnerVolumeSpecName "kube-api-access-vmhnt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:32:39 crc kubenswrapper[4750]: I1008 19:32:39.644260 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmhnt\" (UniqueName: \"kubernetes.io/projected/fc3686d9-1f85-402c-aed1-668110c99eea-kube-api-access-vmhnt\") on node \"crc\" DevicePath \"\"" Oct 08 19:32:39 crc kubenswrapper[4750]: I1008 19:32:39.703196 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-7-default"] Oct 08 19:32:39 crc kubenswrapper[4750]: E1008 19:32:39.703608 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc3686d9-1f85-402c-aed1-668110c99eea" containerName="mariadb-client-6-default" Oct 08 19:32:39 crc kubenswrapper[4750]: I1008 19:32:39.703626 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc3686d9-1f85-402c-aed1-668110c99eea" containerName="mariadb-client-6-default" Oct 08 19:32:39 crc kubenswrapper[4750]: I1008 19:32:39.703786 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc3686d9-1f85-402c-aed1-668110c99eea" containerName="mariadb-client-6-default" Oct 08 19:32:39 crc kubenswrapper[4750]: I1008 19:32:39.704417 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Oct 08 19:32:39 crc kubenswrapper[4750]: I1008 19:32:39.719181 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Oct 08 19:32:39 crc kubenswrapper[4750]: I1008 19:32:39.847970 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khlxd\" (UniqueName: \"kubernetes.io/projected/28f7d5c2-0103-4874-8ed7-92dd54129f5a-kube-api-access-khlxd\") pod \"mariadb-client-7-default\" (UID: \"28f7d5c2-0103-4874-8ed7-92dd54129f5a\") " pod="openstack/mariadb-client-7-default" Oct 08 19:32:39 crc kubenswrapper[4750]: I1008 19:32:39.902604 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dcfa730920273a1228d5b0a8b265d635904474ac6c026027f070243fe90713b9" Oct 08 19:32:39 crc kubenswrapper[4750]: I1008 19:32:39.902745 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Oct 08 19:32:39 crc kubenswrapper[4750]: I1008 19:32:39.950436 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khlxd\" (UniqueName: \"kubernetes.io/projected/28f7d5c2-0103-4874-8ed7-92dd54129f5a-kube-api-access-khlxd\") pod \"mariadb-client-7-default\" (UID: \"28f7d5c2-0103-4874-8ed7-92dd54129f5a\") " pod="openstack/mariadb-client-7-default" Oct 08 19:32:39 crc kubenswrapper[4750]: I1008 19:32:39.970956 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khlxd\" (UniqueName: \"kubernetes.io/projected/28f7d5c2-0103-4874-8ed7-92dd54129f5a-kube-api-access-khlxd\") pod \"mariadb-client-7-default\" (UID: \"28f7d5c2-0103-4874-8ed7-92dd54129f5a\") " pod="openstack/mariadb-client-7-default" Oct 08 19:32:40 crc kubenswrapper[4750]: I1008 19:32:40.024097 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Oct 08 19:32:40 crc kubenswrapper[4750]: I1008 19:32:40.564262 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Oct 08 19:32:40 crc kubenswrapper[4750]: W1008 19:32:40.574952 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28f7d5c2_0103_4874_8ed7_92dd54129f5a.slice/crio-a0143b8ee01e7c703385329f72439c03fde538d93be4cb29750deff063911249 WatchSource:0}: Error finding container a0143b8ee01e7c703385329f72439c03fde538d93be4cb29750deff063911249: Status 404 returned error can't find the container with id a0143b8ee01e7c703385329f72439c03fde538d93be4cb29750deff063911249 Oct 08 19:32:40 crc kubenswrapper[4750]: I1008 19:32:40.746088 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc3686d9-1f85-402c-aed1-668110c99eea" path="/var/lib/kubelet/pods/fc3686d9-1f85-402c-aed1-668110c99eea/volumes" Oct 08 19:32:40 crc kubenswrapper[4750]: I1008 19:32:40.913332 4750 generic.go:334] "Generic (PLEG): container finished" podID="28f7d5c2-0103-4874-8ed7-92dd54129f5a" containerID="12bfdf92fbb175d223de71a1b3618f42edadcf8bb854dd7447d8d071e994fce4" exitCode=0 Oct 08 19:32:40 crc kubenswrapper[4750]: I1008 19:32:40.913388 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"28f7d5c2-0103-4874-8ed7-92dd54129f5a","Type":"ContainerDied","Data":"12bfdf92fbb175d223de71a1b3618f42edadcf8bb854dd7447d8d071e994fce4"} Oct 08 19:32:40 crc kubenswrapper[4750]: I1008 19:32:40.913423 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"28f7d5c2-0103-4874-8ed7-92dd54129f5a","Type":"ContainerStarted","Data":"a0143b8ee01e7c703385329f72439c03fde538d93be4cb29750deff063911249"} Oct 08 19:32:42 crc kubenswrapper[4750]: I1008 19:32:42.351171 4750 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/mariadb-client-7-default" Oct 08 19:32:42 crc kubenswrapper[4750]: I1008 19:32:42.376446 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-7-default_28f7d5c2-0103-4874-8ed7-92dd54129f5a/mariadb-client-7-default/0.log" Oct 08 19:32:42 crc kubenswrapper[4750]: I1008 19:32:42.402238 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-7-default"] Oct 08 19:32:42 crc kubenswrapper[4750]: I1008 19:32:42.410285 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-7-default"] Oct 08 19:32:42 crc kubenswrapper[4750]: I1008 19:32:42.501023 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khlxd\" (UniqueName: \"kubernetes.io/projected/28f7d5c2-0103-4874-8ed7-92dd54129f5a-kube-api-access-khlxd\") pod \"28f7d5c2-0103-4874-8ed7-92dd54129f5a\" (UID: \"28f7d5c2-0103-4874-8ed7-92dd54129f5a\") " Oct 08 19:32:42 crc kubenswrapper[4750]: I1008 19:32:42.511194 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28f7d5c2-0103-4874-8ed7-92dd54129f5a-kube-api-access-khlxd" (OuterVolumeSpecName: "kube-api-access-khlxd") pod "28f7d5c2-0103-4874-8ed7-92dd54129f5a" (UID: "28f7d5c2-0103-4874-8ed7-92dd54129f5a"). InnerVolumeSpecName "kube-api-access-khlxd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:32:42 crc kubenswrapper[4750]: I1008 19:32:42.572786 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2"] Oct 08 19:32:42 crc kubenswrapper[4750]: E1008 19:32:42.573267 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28f7d5c2-0103-4874-8ed7-92dd54129f5a" containerName="mariadb-client-7-default" Oct 08 19:32:42 crc kubenswrapper[4750]: I1008 19:32:42.573288 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="28f7d5c2-0103-4874-8ed7-92dd54129f5a" containerName="mariadb-client-7-default" Oct 08 19:32:42 crc kubenswrapper[4750]: I1008 19:32:42.573494 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="28f7d5c2-0103-4874-8ed7-92dd54129f5a" containerName="mariadb-client-7-default" Oct 08 19:32:42 crc kubenswrapper[4750]: I1008 19:32:42.574238 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Oct 08 19:32:42 crc kubenswrapper[4750]: I1008 19:32:42.589054 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Oct 08 19:32:42 crc kubenswrapper[4750]: I1008 19:32:42.603656 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khlxd\" (UniqueName: \"kubernetes.io/projected/28f7d5c2-0103-4874-8ed7-92dd54129f5a-kube-api-access-khlxd\") on node \"crc\" DevicePath \"\"" Oct 08 19:32:42 crc kubenswrapper[4750]: I1008 19:32:42.704974 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqm6w\" (UniqueName: \"kubernetes.io/projected/b58837f2-7a2e-4676-b598-6ac7eb77c494-kube-api-access-gqm6w\") pod \"mariadb-client-2\" (UID: \"b58837f2-7a2e-4676-b598-6ac7eb77c494\") " pod="openstack/mariadb-client-2" Oct 08 19:32:42 crc kubenswrapper[4750]: I1008 19:32:42.744917 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="28f7d5c2-0103-4874-8ed7-92dd54129f5a" path="/var/lib/kubelet/pods/28f7d5c2-0103-4874-8ed7-92dd54129f5a/volumes" Oct 08 19:32:42 crc kubenswrapper[4750]: I1008 19:32:42.806969 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqm6w\" (UniqueName: \"kubernetes.io/projected/b58837f2-7a2e-4676-b598-6ac7eb77c494-kube-api-access-gqm6w\") pod \"mariadb-client-2\" (UID: \"b58837f2-7a2e-4676-b598-6ac7eb77c494\") " pod="openstack/mariadb-client-2" Oct 08 19:32:42 crc kubenswrapper[4750]: I1008 19:32:42.829167 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqm6w\" (UniqueName: \"kubernetes.io/projected/b58837f2-7a2e-4676-b598-6ac7eb77c494-kube-api-access-gqm6w\") pod \"mariadb-client-2\" (UID: \"b58837f2-7a2e-4676-b598-6ac7eb77c494\") " pod="openstack/mariadb-client-2" Oct 08 19:32:42 crc kubenswrapper[4750]: I1008 19:32:42.895536 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Oct 08 19:32:42 crc kubenswrapper[4750]: I1008 19:32:42.933861 4750 scope.go:117] "RemoveContainer" containerID="12bfdf92fbb175d223de71a1b3618f42edadcf8bb854dd7447d8d071e994fce4" Oct 08 19:32:42 crc kubenswrapper[4750]: I1008 19:32:42.934072 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Oct 08 19:32:43 crc kubenswrapper[4750]: I1008 19:32:43.432078 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Oct 08 19:32:43 crc kubenswrapper[4750]: I1008 19:32:43.944341 4750 generic.go:334] "Generic (PLEG): container finished" podID="b58837f2-7a2e-4676-b598-6ac7eb77c494" containerID="6a177b9ea171a3b77046c64dc9ed918a7a6e04f4f09cdcfd2dad1faf4e1d67e9" exitCode=0 Oct 08 19:32:43 crc kubenswrapper[4750]: I1008 19:32:43.944434 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"b58837f2-7a2e-4676-b598-6ac7eb77c494","Type":"ContainerDied","Data":"6a177b9ea171a3b77046c64dc9ed918a7a6e04f4f09cdcfd2dad1faf4e1d67e9"} Oct 08 19:32:43 crc kubenswrapper[4750]: I1008 19:32:43.944830 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"b58837f2-7a2e-4676-b598-6ac7eb77c494","Type":"ContainerStarted","Data":"cb7ea942d0f270558dfdc0073e2f2991ec3e8aadb99864ee18c4394f341f7bb3"} Oct 08 19:32:45 crc kubenswrapper[4750]: I1008 19:32:45.375105 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2" Oct 08 19:32:45 crc kubenswrapper[4750]: I1008 19:32:45.397915 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-2_b58837f2-7a2e-4676-b598-6ac7eb77c494/mariadb-client-2/0.log" Oct 08 19:32:45 crc kubenswrapper[4750]: I1008 19:32:45.428842 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2"] Oct 08 19:32:45 crc kubenswrapper[4750]: I1008 19:32:45.438021 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2"] Oct 08 19:32:45 crc kubenswrapper[4750]: I1008 19:32:45.555203 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqm6w\" (UniqueName: \"kubernetes.io/projected/b58837f2-7a2e-4676-b598-6ac7eb77c494-kube-api-access-gqm6w\") pod \"b58837f2-7a2e-4676-b598-6ac7eb77c494\" (UID: \"b58837f2-7a2e-4676-b598-6ac7eb77c494\") " Oct 08 19:32:45 crc kubenswrapper[4750]: I1008 19:32:45.561941 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b58837f2-7a2e-4676-b598-6ac7eb77c494-kube-api-access-gqm6w" (OuterVolumeSpecName: "kube-api-access-gqm6w") pod "b58837f2-7a2e-4676-b598-6ac7eb77c494" (UID: "b58837f2-7a2e-4676-b598-6ac7eb77c494"). InnerVolumeSpecName "kube-api-access-gqm6w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:32:45 crc kubenswrapper[4750]: I1008 19:32:45.657162 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqm6w\" (UniqueName: \"kubernetes.io/projected/b58837f2-7a2e-4676-b598-6ac7eb77c494-kube-api-access-gqm6w\") on node \"crc\" DevicePath \"\"" Oct 08 19:32:45 crc kubenswrapper[4750]: I1008 19:32:45.970833 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb7ea942d0f270558dfdc0073e2f2991ec3e8aadb99864ee18c4394f341f7bb3" Oct 08 19:32:45 crc kubenswrapper[4750]: I1008 19:32:45.970950 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Oct 08 19:32:46 crc kubenswrapper[4750]: I1008 19:32:46.746366 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b58837f2-7a2e-4676-b598-6ac7eb77c494" path="/var/lib/kubelet/pods/b58837f2-7a2e-4676-b598-6ac7eb77c494/volumes" Oct 08 19:33:02 crc kubenswrapper[4750]: I1008 19:33:02.843477 4750 scope.go:117] "RemoveContainer" containerID="fcfc556e6b64a14753ba1448420aa325a1d13206de48f3ab2121b7dcfdb22491" Oct 08 19:34:57 crc kubenswrapper[4750]: I1008 19:34:57.671366 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7pm9m"] Oct 08 19:34:57 crc kubenswrapper[4750]: E1008 19:34:57.673271 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b58837f2-7a2e-4676-b598-6ac7eb77c494" containerName="mariadb-client-2" Oct 08 19:34:57 crc kubenswrapper[4750]: I1008 19:34:57.673310 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="b58837f2-7a2e-4676-b598-6ac7eb77c494" containerName="mariadb-client-2" Oct 08 19:34:57 crc kubenswrapper[4750]: I1008 19:34:57.673850 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="b58837f2-7a2e-4676-b598-6ac7eb77c494" containerName="mariadb-client-2" Oct 08 19:34:57 crc kubenswrapper[4750]: I1008 19:34:57.682216 
4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7pm9m" Oct 08 19:34:57 crc kubenswrapper[4750]: I1008 19:34:57.687716 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7pm9m"] Oct 08 19:34:57 crc kubenswrapper[4750]: I1008 19:34:57.774542 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bc2782d-9bfe-4b2b-8e55-b93a020c3b79-utilities\") pod \"redhat-marketplace-7pm9m\" (UID: \"4bc2782d-9bfe-4b2b-8e55-b93a020c3b79\") " pod="openshift-marketplace/redhat-marketplace-7pm9m" Oct 08 19:34:57 crc kubenswrapper[4750]: I1008 19:34:57.774677 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bc2782d-9bfe-4b2b-8e55-b93a020c3b79-catalog-content\") pod \"redhat-marketplace-7pm9m\" (UID: \"4bc2782d-9bfe-4b2b-8e55-b93a020c3b79\") " pod="openshift-marketplace/redhat-marketplace-7pm9m" Oct 08 19:34:57 crc kubenswrapper[4750]: I1008 19:34:57.774725 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzhc4\" (UniqueName: \"kubernetes.io/projected/4bc2782d-9bfe-4b2b-8e55-b93a020c3b79-kube-api-access-dzhc4\") pod \"redhat-marketplace-7pm9m\" (UID: \"4bc2782d-9bfe-4b2b-8e55-b93a020c3b79\") " pod="openshift-marketplace/redhat-marketplace-7pm9m" Oct 08 19:34:57 crc kubenswrapper[4750]: I1008 19:34:57.877460 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bc2782d-9bfe-4b2b-8e55-b93a020c3b79-utilities\") pod \"redhat-marketplace-7pm9m\" (UID: \"4bc2782d-9bfe-4b2b-8e55-b93a020c3b79\") " pod="openshift-marketplace/redhat-marketplace-7pm9m" Oct 08 19:34:57 crc kubenswrapper[4750]: I1008 19:34:57.877649 4750 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bc2782d-9bfe-4b2b-8e55-b93a020c3b79-catalog-content\") pod \"redhat-marketplace-7pm9m\" (UID: \"4bc2782d-9bfe-4b2b-8e55-b93a020c3b79\") " pod="openshift-marketplace/redhat-marketplace-7pm9m" Oct 08 19:34:57 crc kubenswrapper[4750]: I1008 19:34:57.877742 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzhc4\" (UniqueName: \"kubernetes.io/projected/4bc2782d-9bfe-4b2b-8e55-b93a020c3b79-kube-api-access-dzhc4\") pod \"redhat-marketplace-7pm9m\" (UID: \"4bc2782d-9bfe-4b2b-8e55-b93a020c3b79\") " pod="openshift-marketplace/redhat-marketplace-7pm9m" Oct 08 19:34:57 crc kubenswrapper[4750]: I1008 19:34:57.878193 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bc2782d-9bfe-4b2b-8e55-b93a020c3b79-utilities\") pod \"redhat-marketplace-7pm9m\" (UID: \"4bc2782d-9bfe-4b2b-8e55-b93a020c3b79\") " pod="openshift-marketplace/redhat-marketplace-7pm9m" Oct 08 19:34:57 crc kubenswrapper[4750]: I1008 19:34:57.878218 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bc2782d-9bfe-4b2b-8e55-b93a020c3b79-catalog-content\") pod \"redhat-marketplace-7pm9m\" (UID: \"4bc2782d-9bfe-4b2b-8e55-b93a020c3b79\") " pod="openshift-marketplace/redhat-marketplace-7pm9m" Oct 08 19:34:57 crc kubenswrapper[4750]: I1008 19:34:57.912587 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzhc4\" (UniqueName: \"kubernetes.io/projected/4bc2782d-9bfe-4b2b-8e55-b93a020c3b79-kube-api-access-dzhc4\") pod \"redhat-marketplace-7pm9m\" (UID: \"4bc2782d-9bfe-4b2b-8e55-b93a020c3b79\") " pod="openshift-marketplace/redhat-marketplace-7pm9m" Oct 08 19:34:58 crc kubenswrapper[4750]: I1008 19:34:58.023175 4750 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7pm9m" Oct 08 19:34:58 crc kubenswrapper[4750]: I1008 19:34:58.476294 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7pm9m"] Oct 08 19:34:59 crc kubenswrapper[4750]: I1008 19:34:59.363531 4750 generic.go:334] "Generic (PLEG): container finished" podID="4bc2782d-9bfe-4b2b-8e55-b93a020c3b79" containerID="747dff29716b9aef347d9fec416e8821c79a5719eecfe1995888dbf683faec37" exitCode=0 Oct 08 19:34:59 crc kubenswrapper[4750]: I1008 19:34:59.363610 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7pm9m" event={"ID":"4bc2782d-9bfe-4b2b-8e55-b93a020c3b79","Type":"ContainerDied","Data":"747dff29716b9aef347d9fec416e8821c79a5719eecfe1995888dbf683faec37"} Oct 08 19:34:59 crc kubenswrapper[4750]: I1008 19:34:59.363645 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7pm9m" event={"ID":"4bc2782d-9bfe-4b2b-8e55-b93a020c3b79","Type":"ContainerStarted","Data":"c4073eb8f123e44023b255cc1986ad9bc216cce08be1863e1c9cf75201a2e052"} Oct 08 19:34:59 crc kubenswrapper[4750]: I1008 19:34:59.707401 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 19:34:59 crc kubenswrapper[4750]: I1008 19:34:59.707481 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 19:35:01 crc kubenswrapper[4750]: I1008 19:35:01.389695 4750 generic.go:334] "Generic 
(PLEG): container finished" podID="4bc2782d-9bfe-4b2b-8e55-b93a020c3b79" containerID="b3bc4a8a57ac6fbe785d23524c356ec3de8f793645a027dd55274e9ec31ee2e4" exitCode=0 Oct 08 19:35:01 crc kubenswrapper[4750]: I1008 19:35:01.389833 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7pm9m" event={"ID":"4bc2782d-9bfe-4b2b-8e55-b93a020c3b79","Type":"ContainerDied","Data":"b3bc4a8a57ac6fbe785d23524c356ec3de8f793645a027dd55274e9ec31ee2e4"} Oct 08 19:35:06 crc kubenswrapper[4750]: I1008 19:35:06.441969 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7pm9m" event={"ID":"4bc2782d-9bfe-4b2b-8e55-b93a020c3b79","Type":"ContainerStarted","Data":"897cee143757156b5f432bb730ea584668b4cb7d4e0557a943fcfc5e8bed6b2d"} Oct 08 19:35:06 crc kubenswrapper[4750]: I1008 19:35:06.488103 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7pm9m" podStartSLOduration=2.771255806 podStartE2EDuration="9.488075987s" podCreationTimestamp="2025-10-08 19:34:57 +0000 UTC" firstStartedPulling="2025-10-08 19:34:59.368001596 +0000 UTC m=+5055.280972619" lastFinishedPulling="2025-10-08 19:35:06.084821787 +0000 UTC m=+5061.997792800" observedRunningTime="2025-10-08 19:35:06.474131747 +0000 UTC m=+5062.387102820" watchObservedRunningTime="2025-10-08 19:35:06.488075987 +0000 UTC m=+5062.401047010" Oct 08 19:35:08 crc kubenswrapper[4750]: I1008 19:35:08.024369 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7pm9m" Oct 08 19:35:08 crc kubenswrapper[4750]: I1008 19:35:08.025384 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7pm9m" Oct 08 19:35:08 crc kubenswrapper[4750]: I1008 19:35:08.091332 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-7pm9m" Oct 08 19:35:18 crc kubenswrapper[4750]: I1008 19:35:18.087768 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7pm9m" Oct 08 19:35:18 crc kubenswrapper[4750]: I1008 19:35:18.165935 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7pm9m"] Oct 08 19:35:18 crc kubenswrapper[4750]: I1008 19:35:18.570262 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7pm9m" podUID="4bc2782d-9bfe-4b2b-8e55-b93a020c3b79" containerName="registry-server" containerID="cri-o://897cee143757156b5f432bb730ea584668b4cb7d4e0557a943fcfc5e8bed6b2d" gracePeriod=2 Oct 08 19:35:19 crc kubenswrapper[4750]: I1008 19:35:19.014855 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7pm9m" Oct 08 19:35:19 crc kubenswrapper[4750]: I1008 19:35:19.191408 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bc2782d-9bfe-4b2b-8e55-b93a020c3b79-catalog-content\") pod \"4bc2782d-9bfe-4b2b-8e55-b93a020c3b79\" (UID: \"4bc2782d-9bfe-4b2b-8e55-b93a020c3b79\") " Oct 08 19:35:19 crc kubenswrapper[4750]: I1008 19:35:19.191731 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bc2782d-9bfe-4b2b-8e55-b93a020c3b79-utilities\") pod \"4bc2782d-9bfe-4b2b-8e55-b93a020c3b79\" (UID: \"4bc2782d-9bfe-4b2b-8e55-b93a020c3b79\") " Oct 08 19:35:19 crc kubenswrapper[4750]: I1008 19:35:19.191955 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzhc4\" (UniqueName: \"kubernetes.io/projected/4bc2782d-9bfe-4b2b-8e55-b93a020c3b79-kube-api-access-dzhc4\") pod 
\"4bc2782d-9bfe-4b2b-8e55-b93a020c3b79\" (UID: \"4bc2782d-9bfe-4b2b-8e55-b93a020c3b79\") " Oct 08 19:35:19 crc kubenswrapper[4750]: I1008 19:35:19.192978 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bc2782d-9bfe-4b2b-8e55-b93a020c3b79-utilities" (OuterVolumeSpecName: "utilities") pod "4bc2782d-9bfe-4b2b-8e55-b93a020c3b79" (UID: "4bc2782d-9bfe-4b2b-8e55-b93a020c3b79"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:35:19 crc kubenswrapper[4750]: I1008 19:35:19.203619 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bc2782d-9bfe-4b2b-8e55-b93a020c3b79-kube-api-access-dzhc4" (OuterVolumeSpecName: "kube-api-access-dzhc4") pod "4bc2782d-9bfe-4b2b-8e55-b93a020c3b79" (UID: "4bc2782d-9bfe-4b2b-8e55-b93a020c3b79"). InnerVolumeSpecName "kube-api-access-dzhc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:35:19 crc kubenswrapper[4750]: I1008 19:35:19.219169 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bc2782d-9bfe-4b2b-8e55-b93a020c3b79-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4bc2782d-9bfe-4b2b-8e55-b93a020c3b79" (UID: "4bc2782d-9bfe-4b2b-8e55-b93a020c3b79"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:35:19 crc kubenswrapper[4750]: I1008 19:35:19.293716 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzhc4\" (UniqueName: \"kubernetes.io/projected/4bc2782d-9bfe-4b2b-8e55-b93a020c3b79-kube-api-access-dzhc4\") on node \"crc\" DevicePath \"\"" Oct 08 19:35:19 crc kubenswrapper[4750]: I1008 19:35:19.293758 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bc2782d-9bfe-4b2b-8e55-b93a020c3b79-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 19:35:19 crc kubenswrapper[4750]: I1008 19:35:19.293771 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bc2782d-9bfe-4b2b-8e55-b93a020c3b79-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 19:35:19 crc kubenswrapper[4750]: I1008 19:35:19.585626 4750 generic.go:334] "Generic (PLEG): container finished" podID="4bc2782d-9bfe-4b2b-8e55-b93a020c3b79" containerID="897cee143757156b5f432bb730ea584668b4cb7d4e0557a943fcfc5e8bed6b2d" exitCode=0 Oct 08 19:35:19 crc kubenswrapper[4750]: I1008 19:35:19.585708 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7pm9m" event={"ID":"4bc2782d-9bfe-4b2b-8e55-b93a020c3b79","Type":"ContainerDied","Data":"897cee143757156b5f432bb730ea584668b4cb7d4e0557a943fcfc5e8bed6b2d"} Oct 08 19:35:19 crc kubenswrapper[4750]: I1008 19:35:19.585744 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7pm9m" Oct 08 19:35:19 crc kubenswrapper[4750]: I1008 19:35:19.586297 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7pm9m" event={"ID":"4bc2782d-9bfe-4b2b-8e55-b93a020c3b79","Type":"ContainerDied","Data":"c4073eb8f123e44023b255cc1986ad9bc216cce08be1863e1c9cf75201a2e052"} Oct 08 19:35:19 crc kubenswrapper[4750]: I1008 19:35:19.586345 4750 scope.go:117] "RemoveContainer" containerID="897cee143757156b5f432bb730ea584668b4cb7d4e0557a943fcfc5e8bed6b2d" Oct 08 19:35:19 crc kubenswrapper[4750]: I1008 19:35:19.631622 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7pm9m"] Oct 08 19:35:19 crc kubenswrapper[4750]: I1008 19:35:19.632774 4750 scope.go:117] "RemoveContainer" containerID="b3bc4a8a57ac6fbe785d23524c356ec3de8f793645a027dd55274e9ec31ee2e4" Oct 08 19:35:19 crc kubenswrapper[4750]: I1008 19:35:19.638945 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7pm9m"] Oct 08 19:35:19 crc kubenswrapper[4750]: I1008 19:35:19.662580 4750 scope.go:117] "RemoveContainer" containerID="747dff29716b9aef347d9fec416e8821c79a5719eecfe1995888dbf683faec37" Oct 08 19:35:19 crc kubenswrapper[4750]: I1008 19:35:19.704933 4750 scope.go:117] "RemoveContainer" containerID="897cee143757156b5f432bb730ea584668b4cb7d4e0557a943fcfc5e8bed6b2d" Oct 08 19:35:19 crc kubenswrapper[4750]: E1008 19:35:19.705430 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"897cee143757156b5f432bb730ea584668b4cb7d4e0557a943fcfc5e8bed6b2d\": container with ID starting with 897cee143757156b5f432bb730ea584668b4cb7d4e0557a943fcfc5e8bed6b2d not found: ID does not exist" containerID="897cee143757156b5f432bb730ea584668b4cb7d4e0557a943fcfc5e8bed6b2d" Oct 08 19:35:19 crc kubenswrapper[4750]: I1008 19:35:19.705471 4750 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"897cee143757156b5f432bb730ea584668b4cb7d4e0557a943fcfc5e8bed6b2d"} err="failed to get container status \"897cee143757156b5f432bb730ea584668b4cb7d4e0557a943fcfc5e8bed6b2d\": rpc error: code = NotFound desc = could not find container \"897cee143757156b5f432bb730ea584668b4cb7d4e0557a943fcfc5e8bed6b2d\": container with ID starting with 897cee143757156b5f432bb730ea584668b4cb7d4e0557a943fcfc5e8bed6b2d not found: ID does not exist" Oct 08 19:35:19 crc kubenswrapper[4750]: I1008 19:35:19.705494 4750 scope.go:117] "RemoveContainer" containerID="b3bc4a8a57ac6fbe785d23524c356ec3de8f793645a027dd55274e9ec31ee2e4" Oct 08 19:35:19 crc kubenswrapper[4750]: E1008 19:35:19.705980 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3bc4a8a57ac6fbe785d23524c356ec3de8f793645a027dd55274e9ec31ee2e4\": container with ID starting with b3bc4a8a57ac6fbe785d23524c356ec3de8f793645a027dd55274e9ec31ee2e4 not found: ID does not exist" containerID="b3bc4a8a57ac6fbe785d23524c356ec3de8f793645a027dd55274e9ec31ee2e4" Oct 08 19:35:19 crc kubenswrapper[4750]: I1008 19:35:19.706015 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3bc4a8a57ac6fbe785d23524c356ec3de8f793645a027dd55274e9ec31ee2e4"} err="failed to get container status \"b3bc4a8a57ac6fbe785d23524c356ec3de8f793645a027dd55274e9ec31ee2e4\": rpc error: code = NotFound desc = could not find container \"b3bc4a8a57ac6fbe785d23524c356ec3de8f793645a027dd55274e9ec31ee2e4\": container with ID starting with b3bc4a8a57ac6fbe785d23524c356ec3de8f793645a027dd55274e9ec31ee2e4 not found: ID does not exist" Oct 08 19:35:19 crc kubenswrapper[4750]: I1008 19:35:19.706036 4750 scope.go:117] "RemoveContainer" containerID="747dff29716b9aef347d9fec416e8821c79a5719eecfe1995888dbf683faec37" Oct 08 19:35:19 crc kubenswrapper[4750]: E1008 
19:35:19.706381 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"747dff29716b9aef347d9fec416e8821c79a5719eecfe1995888dbf683faec37\": container with ID starting with 747dff29716b9aef347d9fec416e8821c79a5719eecfe1995888dbf683faec37 not found: ID does not exist" containerID="747dff29716b9aef347d9fec416e8821c79a5719eecfe1995888dbf683faec37" Oct 08 19:35:19 crc kubenswrapper[4750]: I1008 19:35:19.706404 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"747dff29716b9aef347d9fec416e8821c79a5719eecfe1995888dbf683faec37"} err="failed to get container status \"747dff29716b9aef347d9fec416e8821c79a5719eecfe1995888dbf683faec37\": rpc error: code = NotFound desc = could not find container \"747dff29716b9aef347d9fec416e8821c79a5719eecfe1995888dbf683faec37\": container with ID starting with 747dff29716b9aef347d9fec416e8821c79a5719eecfe1995888dbf683faec37 not found: ID does not exist" Oct 08 19:35:20 crc kubenswrapper[4750]: I1008 19:35:20.753188 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bc2782d-9bfe-4b2b-8e55-b93a020c3b79" path="/var/lib/kubelet/pods/4bc2782d-9bfe-4b2b-8e55-b93a020c3b79/volumes" Oct 08 19:35:29 crc kubenswrapper[4750]: I1008 19:35:29.706992 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 19:35:29 crc kubenswrapper[4750]: I1008 19:35:29.707893 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Oct 08 19:35:59 crc kubenswrapper[4750]: I1008 19:35:59.707131 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 19:35:59 crc kubenswrapper[4750]: I1008 19:35:59.707876 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 19:35:59 crc kubenswrapper[4750]: I1008 19:35:59.707947 4750 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-grddb" Oct 08 19:35:59 crc kubenswrapper[4750]: I1008 19:35:59.708732 4750 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"42c2a7a6ce3026c9ad8c8eff781fbf4cba179ca631dc8f4fcdd0691647588e7a"} pod="openshift-machine-config-operator/machine-config-daemon-grddb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 19:35:59 crc kubenswrapper[4750]: I1008 19:35:59.708796 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" containerID="cri-o://42c2a7a6ce3026c9ad8c8eff781fbf4cba179ca631dc8f4fcdd0691647588e7a" gracePeriod=600 Oct 08 19:35:59 crc kubenswrapper[4750]: E1008 19:35:59.841789 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:36:00 crc kubenswrapper[4750]: I1008 19:36:00.015980 4750 generic.go:334] "Generic (PLEG): container finished" podID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerID="42c2a7a6ce3026c9ad8c8eff781fbf4cba179ca631dc8f4fcdd0691647588e7a" exitCode=0 Oct 08 19:36:00 crc kubenswrapper[4750]: I1008 19:36:00.016038 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerDied","Data":"42c2a7a6ce3026c9ad8c8eff781fbf4cba179ca631dc8f4fcdd0691647588e7a"} Oct 08 19:36:00 crc kubenswrapper[4750]: I1008 19:36:00.016080 4750 scope.go:117] "RemoveContainer" containerID="1641173c95d4171a3b1be3458fb6da9f585cc8093883355dc63c1b1b9fca71ea" Oct 08 19:36:00 crc kubenswrapper[4750]: I1008 19:36:00.016743 4750 scope.go:117] "RemoveContainer" containerID="42c2a7a6ce3026c9ad8c8eff781fbf4cba179ca631dc8f4fcdd0691647588e7a" Oct 08 19:36:00 crc kubenswrapper[4750]: E1008 19:36:00.017261 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:36:11 crc kubenswrapper[4750]: I1008 19:36:11.734773 4750 scope.go:117] "RemoveContainer" containerID="42c2a7a6ce3026c9ad8c8eff781fbf4cba179ca631dc8f4fcdd0691647588e7a" Oct 08 19:36:11 crc kubenswrapper[4750]: E1008 19:36:11.736995 4750 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:36:16 crc kubenswrapper[4750]: I1008 19:36:16.681824 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-csh78"] Oct 08 19:36:16 crc kubenswrapper[4750]: E1008 19:36:16.683312 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bc2782d-9bfe-4b2b-8e55-b93a020c3b79" containerName="extract-content" Oct 08 19:36:16 crc kubenswrapper[4750]: I1008 19:36:16.683337 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bc2782d-9bfe-4b2b-8e55-b93a020c3b79" containerName="extract-content" Oct 08 19:36:16 crc kubenswrapper[4750]: E1008 19:36:16.683375 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bc2782d-9bfe-4b2b-8e55-b93a020c3b79" containerName="extract-utilities" Oct 08 19:36:16 crc kubenswrapper[4750]: I1008 19:36:16.683388 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bc2782d-9bfe-4b2b-8e55-b93a020c3b79" containerName="extract-utilities" Oct 08 19:36:16 crc kubenswrapper[4750]: E1008 19:36:16.683419 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bc2782d-9bfe-4b2b-8e55-b93a020c3b79" containerName="registry-server" Oct 08 19:36:16 crc kubenswrapper[4750]: I1008 19:36:16.683432 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bc2782d-9bfe-4b2b-8e55-b93a020c3b79" containerName="registry-server" Oct 08 19:36:16 crc kubenswrapper[4750]: I1008 19:36:16.683780 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bc2782d-9bfe-4b2b-8e55-b93a020c3b79" containerName="registry-server" Oct 08 19:36:16 crc 
kubenswrapper[4750]: I1008 19:36:16.686138 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-csh78" Oct 08 19:36:16 crc kubenswrapper[4750]: I1008 19:36:16.709320 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-csh78"] Oct 08 19:36:16 crc kubenswrapper[4750]: I1008 19:36:16.718263 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0676fec7-c711-4003-a560-c6ccef095934-utilities\") pod \"redhat-operators-csh78\" (UID: \"0676fec7-c711-4003-a560-c6ccef095934\") " pod="openshift-marketplace/redhat-operators-csh78" Oct 08 19:36:16 crc kubenswrapper[4750]: I1008 19:36:16.718519 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0676fec7-c711-4003-a560-c6ccef095934-catalog-content\") pod \"redhat-operators-csh78\" (UID: \"0676fec7-c711-4003-a560-c6ccef095934\") " pod="openshift-marketplace/redhat-operators-csh78" Oct 08 19:36:16 crc kubenswrapper[4750]: I1008 19:36:16.718928 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcmt9\" (UniqueName: \"kubernetes.io/projected/0676fec7-c711-4003-a560-c6ccef095934-kube-api-access-gcmt9\") pod \"redhat-operators-csh78\" (UID: \"0676fec7-c711-4003-a560-c6ccef095934\") " pod="openshift-marketplace/redhat-operators-csh78" Oct 08 19:36:16 crc kubenswrapper[4750]: I1008 19:36:16.820096 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0676fec7-c711-4003-a560-c6ccef095934-catalog-content\") pod \"redhat-operators-csh78\" (UID: \"0676fec7-c711-4003-a560-c6ccef095934\") " pod="openshift-marketplace/redhat-operators-csh78" Oct 08 19:36:16 crc 
kubenswrapper[4750]: I1008 19:36:16.820229 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcmt9\" (UniqueName: \"kubernetes.io/projected/0676fec7-c711-4003-a560-c6ccef095934-kube-api-access-gcmt9\") pod \"redhat-operators-csh78\" (UID: \"0676fec7-c711-4003-a560-c6ccef095934\") " pod="openshift-marketplace/redhat-operators-csh78" Oct 08 19:36:16 crc kubenswrapper[4750]: I1008 19:36:16.820276 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0676fec7-c711-4003-a560-c6ccef095934-utilities\") pod \"redhat-operators-csh78\" (UID: \"0676fec7-c711-4003-a560-c6ccef095934\") " pod="openshift-marketplace/redhat-operators-csh78" Oct 08 19:36:16 crc kubenswrapper[4750]: I1008 19:36:16.820821 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0676fec7-c711-4003-a560-c6ccef095934-utilities\") pod \"redhat-operators-csh78\" (UID: \"0676fec7-c711-4003-a560-c6ccef095934\") " pod="openshift-marketplace/redhat-operators-csh78" Oct 08 19:36:16 crc kubenswrapper[4750]: I1008 19:36:16.822160 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0676fec7-c711-4003-a560-c6ccef095934-catalog-content\") pod \"redhat-operators-csh78\" (UID: \"0676fec7-c711-4003-a560-c6ccef095934\") " pod="openshift-marketplace/redhat-operators-csh78" Oct 08 19:36:16 crc kubenswrapper[4750]: I1008 19:36:16.843399 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcmt9\" (UniqueName: \"kubernetes.io/projected/0676fec7-c711-4003-a560-c6ccef095934-kube-api-access-gcmt9\") pod \"redhat-operators-csh78\" (UID: \"0676fec7-c711-4003-a560-c6ccef095934\") " pod="openshift-marketplace/redhat-operators-csh78" Oct 08 19:36:17 crc kubenswrapper[4750]: I1008 19:36:17.018656 4750 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-csh78" Oct 08 19:36:17 crc kubenswrapper[4750]: I1008 19:36:17.493007 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-csh78"] Oct 08 19:36:18 crc kubenswrapper[4750]: I1008 19:36:18.196194 4750 generic.go:334] "Generic (PLEG): container finished" podID="0676fec7-c711-4003-a560-c6ccef095934" containerID="19de009d1783e1f30fcc8122d567f4c3f5982a72ca7aad6870193afe440bfa29" exitCode=0 Oct 08 19:36:18 crc kubenswrapper[4750]: I1008 19:36:18.196246 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csh78" event={"ID":"0676fec7-c711-4003-a560-c6ccef095934","Type":"ContainerDied","Data":"19de009d1783e1f30fcc8122d567f4c3f5982a72ca7aad6870193afe440bfa29"} Oct 08 19:36:18 crc kubenswrapper[4750]: I1008 19:36:18.196813 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csh78" event={"ID":"0676fec7-c711-4003-a560-c6ccef095934","Type":"ContainerStarted","Data":"df3adbe527a598ec6f27b31cc686cc0c5203848ead060aea19ed0c34814f8f10"} Oct 08 19:36:20 crc kubenswrapper[4750]: I1008 19:36:20.221787 4750 generic.go:334] "Generic (PLEG): container finished" podID="0676fec7-c711-4003-a560-c6ccef095934" containerID="07522962d545900268039a011fcc3e2477d52bc5d12cf683cffdef2afbf2bc3b" exitCode=0 Oct 08 19:36:20 crc kubenswrapper[4750]: I1008 19:36:20.221908 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csh78" event={"ID":"0676fec7-c711-4003-a560-c6ccef095934","Type":"ContainerDied","Data":"07522962d545900268039a011fcc3e2477d52bc5d12cf683cffdef2afbf2bc3b"} Oct 08 19:36:21 crc kubenswrapper[4750]: I1008 19:36:21.237267 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csh78" 
event={"ID":"0676fec7-c711-4003-a560-c6ccef095934","Type":"ContainerStarted","Data":"e4cc6fb1243f42b900f8dc0caef18e34485360226aefcbb890802e62ef4b0a9b"} Oct 08 19:36:24 crc kubenswrapper[4750]: I1008 19:36:24.740314 4750 scope.go:117] "RemoveContainer" containerID="42c2a7a6ce3026c9ad8c8eff781fbf4cba179ca631dc8f4fcdd0691647588e7a" Oct 08 19:36:24 crc kubenswrapper[4750]: E1008 19:36:24.742581 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:36:27 crc kubenswrapper[4750]: I1008 19:36:27.019413 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-csh78" Oct 08 19:36:27 crc kubenswrapper[4750]: I1008 19:36:27.019495 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-csh78" Oct 08 19:36:27 crc kubenswrapper[4750]: I1008 19:36:27.101748 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-csh78" Oct 08 19:36:27 crc kubenswrapper[4750]: I1008 19:36:27.130913 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-csh78" podStartSLOduration=8.478386429 podStartE2EDuration="11.130882939s" podCreationTimestamp="2025-10-08 19:36:16 +0000 UTC" firstStartedPulling="2025-10-08 19:36:18.199846228 +0000 UTC m=+5134.112817281" lastFinishedPulling="2025-10-08 19:36:20.852342778 +0000 UTC m=+5136.765313791" observedRunningTime="2025-10-08 19:36:21.261130067 +0000 UTC m=+5137.174101100" watchObservedRunningTime="2025-10-08 19:36:27.130882939 +0000 UTC 
m=+5143.043853962" Oct 08 19:36:27 crc kubenswrapper[4750]: I1008 19:36:27.353307 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-csh78" Oct 08 19:36:27 crc kubenswrapper[4750]: I1008 19:36:27.401985 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-csh78"] Oct 08 19:36:29 crc kubenswrapper[4750]: I1008 19:36:29.318624 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-csh78" podUID="0676fec7-c711-4003-a560-c6ccef095934" containerName="registry-server" containerID="cri-o://e4cc6fb1243f42b900f8dc0caef18e34485360226aefcbb890802e62ef4b0a9b" gracePeriod=2 Oct 08 19:36:29 crc kubenswrapper[4750]: I1008 19:36:29.787133 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-csh78" Oct 08 19:36:29 crc kubenswrapper[4750]: I1008 19:36:29.862802 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0676fec7-c711-4003-a560-c6ccef095934-utilities\") pod \"0676fec7-c711-4003-a560-c6ccef095934\" (UID: \"0676fec7-c711-4003-a560-c6ccef095934\") " Oct 08 19:36:29 crc kubenswrapper[4750]: I1008 19:36:29.862913 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcmt9\" (UniqueName: \"kubernetes.io/projected/0676fec7-c711-4003-a560-c6ccef095934-kube-api-access-gcmt9\") pod \"0676fec7-c711-4003-a560-c6ccef095934\" (UID: \"0676fec7-c711-4003-a560-c6ccef095934\") " Oct 08 19:36:29 crc kubenswrapper[4750]: I1008 19:36:29.863007 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0676fec7-c711-4003-a560-c6ccef095934-catalog-content\") pod \"0676fec7-c711-4003-a560-c6ccef095934\" (UID: 
\"0676fec7-c711-4003-a560-c6ccef095934\") " Oct 08 19:36:29 crc kubenswrapper[4750]: I1008 19:36:29.864219 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0676fec7-c711-4003-a560-c6ccef095934-utilities" (OuterVolumeSpecName: "utilities") pod "0676fec7-c711-4003-a560-c6ccef095934" (UID: "0676fec7-c711-4003-a560-c6ccef095934"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:36:29 crc kubenswrapper[4750]: I1008 19:36:29.870982 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0676fec7-c711-4003-a560-c6ccef095934-kube-api-access-gcmt9" (OuterVolumeSpecName: "kube-api-access-gcmt9") pod "0676fec7-c711-4003-a560-c6ccef095934" (UID: "0676fec7-c711-4003-a560-c6ccef095934"). InnerVolumeSpecName "kube-api-access-gcmt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:36:29 crc kubenswrapper[4750]: I1008 19:36:29.964727 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0676fec7-c711-4003-a560-c6ccef095934-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 19:36:29 crc kubenswrapper[4750]: I1008 19:36:29.964769 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcmt9\" (UniqueName: \"kubernetes.io/projected/0676fec7-c711-4003-a560-c6ccef095934-kube-api-access-gcmt9\") on node \"crc\" DevicePath \"\"" Oct 08 19:36:30 crc kubenswrapper[4750]: I1008 19:36:30.333116 4750 generic.go:334] "Generic (PLEG): container finished" podID="0676fec7-c711-4003-a560-c6ccef095934" containerID="e4cc6fb1243f42b900f8dc0caef18e34485360226aefcbb890802e62ef4b0a9b" exitCode=0 Oct 08 19:36:30 crc kubenswrapper[4750]: I1008 19:36:30.333193 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csh78" 
event={"ID":"0676fec7-c711-4003-a560-c6ccef095934","Type":"ContainerDied","Data":"e4cc6fb1243f42b900f8dc0caef18e34485360226aefcbb890802e62ef4b0a9b"} Oct 08 19:36:30 crc kubenswrapper[4750]: I1008 19:36:30.333256 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-csh78" Oct 08 19:36:30 crc kubenswrapper[4750]: I1008 19:36:30.333762 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csh78" event={"ID":"0676fec7-c711-4003-a560-c6ccef095934","Type":"ContainerDied","Data":"df3adbe527a598ec6f27b31cc686cc0c5203848ead060aea19ed0c34814f8f10"} Oct 08 19:36:30 crc kubenswrapper[4750]: I1008 19:36:30.333813 4750 scope.go:117] "RemoveContainer" containerID="e4cc6fb1243f42b900f8dc0caef18e34485360226aefcbb890802e62ef4b0a9b" Oct 08 19:36:30 crc kubenswrapper[4750]: I1008 19:36:30.378897 4750 scope.go:117] "RemoveContainer" containerID="07522962d545900268039a011fcc3e2477d52bc5d12cf683cffdef2afbf2bc3b" Oct 08 19:36:30 crc kubenswrapper[4750]: I1008 19:36:30.406105 4750 scope.go:117] "RemoveContainer" containerID="19de009d1783e1f30fcc8122d567f4c3f5982a72ca7aad6870193afe440bfa29" Oct 08 19:36:30 crc kubenswrapper[4750]: I1008 19:36:30.456720 4750 scope.go:117] "RemoveContainer" containerID="e4cc6fb1243f42b900f8dc0caef18e34485360226aefcbb890802e62ef4b0a9b" Oct 08 19:36:30 crc kubenswrapper[4750]: E1008 19:36:30.460782 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4cc6fb1243f42b900f8dc0caef18e34485360226aefcbb890802e62ef4b0a9b\": container with ID starting with e4cc6fb1243f42b900f8dc0caef18e34485360226aefcbb890802e62ef4b0a9b not found: ID does not exist" containerID="e4cc6fb1243f42b900f8dc0caef18e34485360226aefcbb890802e62ef4b0a9b" Oct 08 19:36:30 crc kubenswrapper[4750]: I1008 19:36:30.460831 4750 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e4cc6fb1243f42b900f8dc0caef18e34485360226aefcbb890802e62ef4b0a9b"} err="failed to get container status \"e4cc6fb1243f42b900f8dc0caef18e34485360226aefcbb890802e62ef4b0a9b\": rpc error: code = NotFound desc = could not find container \"e4cc6fb1243f42b900f8dc0caef18e34485360226aefcbb890802e62ef4b0a9b\": container with ID starting with e4cc6fb1243f42b900f8dc0caef18e34485360226aefcbb890802e62ef4b0a9b not found: ID does not exist" Oct 08 19:36:30 crc kubenswrapper[4750]: I1008 19:36:30.460863 4750 scope.go:117] "RemoveContainer" containerID="07522962d545900268039a011fcc3e2477d52bc5d12cf683cffdef2afbf2bc3b" Oct 08 19:36:30 crc kubenswrapper[4750]: E1008 19:36:30.461284 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07522962d545900268039a011fcc3e2477d52bc5d12cf683cffdef2afbf2bc3b\": container with ID starting with 07522962d545900268039a011fcc3e2477d52bc5d12cf683cffdef2afbf2bc3b not found: ID does not exist" containerID="07522962d545900268039a011fcc3e2477d52bc5d12cf683cffdef2afbf2bc3b" Oct 08 19:36:30 crc kubenswrapper[4750]: I1008 19:36:30.461315 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07522962d545900268039a011fcc3e2477d52bc5d12cf683cffdef2afbf2bc3b"} err="failed to get container status \"07522962d545900268039a011fcc3e2477d52bc5d12cf683cffdef2afbf2bc3b\": rpc error: code = NotFound desc = could not find container \"07522962d545900268039a011fcc3e2477d52bc5d12cf683cffdef2afbf2bc3b\": container with ID starting with 07522962d545900268039a011fcc3e2477d52bc5d12cf683cffdef2afbf2bc3b not found: ID does not exist" Oct 08 19:36:30 crc kubenswrapper[4750]: I1008 19:36:30.461336 4750 scope.go:117] "RemoveContainer" containerID="19de009d1783e1f30fcc8122d567f4c3f5982a72ca7aad6870193afe440bfa29" Oct 08 19:36:30 crc kubenswrapper[4750]: E1008 19:36:30.461775 4750 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"19de009d1783e1f30fcc8122d567f4c3f5982a72ca7aad6870193afe440bfa29\": container with ID starting with 19de009d1783e1f30fcc8122d567f4c3f5982a72ca7aad6870193afe440bfa29 not found: ID does not exist" containerID="19de009d1783e1f30fcc8122d567f4c3f5982a72ca7aad6870193afe440bfa29" Oct 08 19:36:30 crc kubenswrapper[4750]: I1008 19:36:30.461806 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19de009d1783e1f30fcc8122d567f4c3f5982a72ca7aad6870193afe440bfa29"} err="failed to get container status \"19de009d1783e1f30fcc8122d567f4c3f5982a72ca7aad6870193afe440bfa29\": rpc error: code = NotFound desc = could not find container \"19de009d1783e1f30fcc8122d567f4c3f5982a72ca7aad6870193afe440bfa29\": container with ID starting with 19de009d1783e1f30fcc8122d567f4c3f5982a72ca7aad6870193afe440bfa29 not found: ID does not exist" Oct 08 19:36:31 crc kubenswrapper[4750]: I1008 19:36:31.028141 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0676fec7-c711-4003-a560-c6ccef095934-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0676fec7-c711-4003-a560-c6ccef095934" (UID: "0676fec7-c711-4003-a560-c6ccef095934"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:36:31 crc kubenswrapper[4750]: I1008 19:36:31.086365 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0676fec7-c711-4003-a560-c6ccef095934-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 19:36:31 crc kubenswrapper[4750]: I1008 19:36:31.275199 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-csh78"] Oct 08 19:36:31 crc kubenswrapper[4750]: I1008 19:36:31.282949 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-csh78"] Oct 08 19:36:32 crc kubenswrapper[4750]: I1008 19:36:32.743542 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0676fec7-c711-4003-a560-c6ccef095934" path="/var/lib/kubelet/pods/0676fec7-c711-4003-a560-c6ccef095934/volumes" Oct 08 19:36:38 crc kubenswrapper[4750]: I1008 19:36:38.734407 4750 scope.go:117] "RemoveContainer" containerID="42c2a7a6ce3026c9ad8c8eff781fbf4cba179ca631dc8f4fcdd0691647588e7a" Oct 08 19:36:38 crc kubenswrapper[4750]: E1008 19:36:38.735195 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:36:49 crc kubenswrapper[4750]: I1008 19:36:49.735209 4750 scope.go:117] "RemoveContainer" containerID="42c2a7a6ce3026c9ad8c8eff781fbf4cba179ca631dc8f4fcdd0691647588e7a" Oct 08 19:36:49 crc kubenswrapper[4750]: E1008 19:36:49.736254 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:37:02 crc kubenswrapper[4750]: I1008 19:37:02.735297 4750 scope.go:117] "RemoveContainer" containerID="42c2a7a6ce3026c9ad8c8eff781fbf4cba179ca631dc8f4fcdd0691647588e7a" Oct 08 19:37:02 crc kubenswrapper[4750]: E1008 19:37:02.736276 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:37:16 crc kubenswrapper[4750]: I1008 19:37:16.952188 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9tvjp"] Oct 08 19:37:16 crc kubenswrapper[4750]: E1008 19:37:16.953287 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0676fec7-c711-4003-a560-c6ccef095934" containerName="extract-utilities" Oct 08 19:37:16 crc kubenswrapper[4750]: I1008 19:37:16.953304 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="0676fec7-c711-4003-a560-c6ccef095934" containerName="extract-utilities" Oct 08 19:37:16 crc kubenswrapper[4750]: E1008 19:37:16.953319 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0676fec7-c711-4003-a560-c6ccef095934" containerName="registry-server" Oct 08 19:37:16 crc kubenswrapper[4750]: I1008 19:37:16.953327 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="0676fec7-c711-4003-a560-c6ccef095934" containerName="registry-server" Oct 08 19:37:16 crc kubenswrapper[4750]: E1008 19:37:16.953350 4750 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="0676fec7-c711-4003-a560-c6ccef095934" containerName="extract-content" Oct 08 19:37:16 crc kubenswrapper[4750]: I1008 19:37:16.953357 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="0676fec7-c711-4003-a560-c6ccef095934" containerName="extract-content" Oct 08 19:37:16 crc kubenswrapper[4750]: I1008 19:37:16.953600 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="0676fec7-c711-4003-a560-c6ccef095934" containerName="registry-server" Oct 08 19:37:16 crc kubenswrapper[4750]: I1008 19:37:16.955182 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9tvjp" Oct 08 19:37:16 crc kubenswrapper[4750]: I1008 19:37:16.963305 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9tvjp"] Oct 08 19:37:17 crc kubenswrapper[4750]: I1008 19:37:17.109485 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/080c4cad-febe-4c3a-9e6e-bfa2955ab21c-utilities\") pod \"community-operators-9tvjp\" (UID: \"080c4cad-febe-4c3a-9e6e-bfa2955ab21c\") " pod="openshift-marketplace/community-operators-9tvjp" Oct 08 19:37:17 crc kubenswrapper[4750]: I1008 19:37:17.109576 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/080c4cad-febe-4c3a-9e6e-bfa2955ab21c-catalog-content\") pod \"community-operators-9tvjp\" (UID: \"080c4cad-febe-4c3a-9e6e-bfa2955ab21c\") " pod="openshift-marketplace/community-operators-9tvjp" Oct 08 19:37:17 crc kubenswrapper[4750]: I1008 19:37:17.109756 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddkg9\" (UniqueName: \"kubernetes.io/projected/080c4cad-febe-4c3a-9e6e-bfa2955ab21c-kube-api-access-ddkg9\") pod 
\"community-operators-9tvjp\" (UID: \"080c4cad-febe-4c3a-9e6e-bfa2955ab21c\") " pod="openshift-marketplace/community-operators-9tvjp" Oct 08 19:37:17 crc kubenswrapper[4750]: I1008 19:37:17.211673 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddkg9\" (UniqueName: \"kubernetes.io/projected/080c4cad-febe-4c3a-9e6e-bfa2955ab21c-kube-api-access-ddkg9\") pod \"community-operators-9tvjp\" (UID: \"080c4cad-febe-4c3a-9e6e-bfa2955ab21c\") " pod="openshift-marketplace/community-operators-9tvjp" Oct 08 19:37:17 crc kubenswrapper[4750]: I1008 19:37:17.211737 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/080c4cad-febe-4c3a-9e6e-bfa2955ab21c-utilities\") pod \"community-operators-9tvjp\" (UID: \"080c4cad-febe-4c3a-9e6e-bfa2955ab21c\") " pod="openshift-marketplace/community-operators-9tvjp" Oct 08 19:37:17 crc kubenswrapper[4750]: I1008 19:37:17.211774 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/080c4cad-febe-4c3a-9e6e-bfa2955ab21c-catalog-content\") pod \"community-operators-9tvjp\" (UID: \"080c4cad-febe-4c3a-9e6e-bfa2955ab21c\") " pod="openshift-marketplace/community-operators-9tvjp" Oct 08 19:37:17 crc kubenswrapper[4750]: I1008 19:37:17.212722 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/080c4cad-febe-4c3a-9e6e-bfa2955ab21c-utilities\") pod \"community-operators-9tvjp\" (UID: \"080c4cad-febe-4c3a-9e6e-bfa2955ab21c\") " pod="openshift-marketplace/community-operators-9tvjp" Oct 08 19:37:17 crc kubenswrapper[4750]: I1008 19:37:17.212805 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/080c4cad-febe-4c3a-9e6e-bfa2955ab21c-catalog-content\") pod \"community-operators-9tvjp\" (UID: 
\"080c4cad-febe-4c3a-9e6e-bfa2955ab21c\") " pod="openshift-marketplace/community-operators-9tvjp" Oct 08 19:37:17 crc kubenswrapper[4750]: I1008 19:37:17.236681 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddkg9\" (UniqueName: \"kubernetes.io/projected/080c4cad-febe-4c3a-9e6e-bfa2955ab21c-kube-api-access-ddkg9\") pod \"community-operators-9tvjp\" (UID: \"080c4cad-febe-4c3a-9e6e-bfa2955ab21c\") " pod="openshift-marketplace/community-operators-9tvjp" Oct 08 19:37:17 crc kubenswrapper[4750]: I1008 19:37:17.294890 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9tvjp" Oct 08 19:37:17 crc kubenswrapper[4750]: I1008 19:37:17.735105 4750 scope.go:117] "RemoveContainer" containerID="42c2a7a6ce3026c9ad8c8eff781fbf4cba179ca631dc8f4fcdd0691647588e7a" Oct 08 19:37:17 crc kubenswrapper[4750]: E1008 19:37:17.735745 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:37:17 crc kubenswrapper[4750]: I1008 19:37:17.858990 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9tvjp"] Oct 08 19:37:18 crc kubenswrapper[4750]: I1008 19:37:18.829305 4750 generic.go:334] "Generic (PLEG): container finished" podID="080c4cad-febe-4c3a-9e6e-bfa2955ab21c" containerID="a15c918b9eefbab4e700d5f209699f4042545d46c81bb0f5e00fe403cf208840" exitCode=0 Oct 08 19:37:18 crc kubenswrapper[4750]: I1008 19:37:18.829380 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9tvjp" 
event={"ID":"080c4cad-febe-4c3a-9e6e-bfa2955ab21c","Type":"ContainerDied","Data":"a15c918b9eefbab4e700d5f209699f4042545d46c81bb0f5e00fe403cf208840"} Oct 08 19:37:18 crc kubenswrapper[4750]: I1008 19:37:18.829426 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9tvjp" event={"ID":"080c4cad-febe-4c3a-9e6e-bfa2955ab21c","Type":"ContainerStarted","Data":"8b4190cff8f54ba15672d2ea1580ba91f5ef4e2b124e72480d661c5a164556ec"} Oct 08 19:37:18 crc kubenswrapper[4750]: I1008 19:37:18.832998 4750 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 19:37:20 crc kubenswrapper[4750]: I1008 19:37:20.852069 4750 generic.go:334] "Generic (PLEG): container finished" podID="080c4cad-febe-4c3a-9e6e-bfa2955ab21c" containerID="9a2c9381418609545f4ac3b53803e741f39cba379205e2bfbf9eecd4e2cf6086" exitCode=0 Oct 08 19:37:20 crc kubenswrapper[4750]: I1008 19:37:20.852178 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9tvjp" event={"ID":"080c4cad-febe-4c3a-9e6e-bfa2955ab21c","Type":"ContainerDied","Data":"9a2c9381418609545f4ac3b53803e741f39cba379205e2bfbf9eecd4e2cf6086"} Oct 08 19:37:21 crc kubenswrapper[4750]: I1008 19:37:21.870275 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9tvjp" event={"ID":"080c4cad-febe-4c3a-9e6e-bfa2955ab21c","Type":"ContainerStarted","Data":"81e69ba4c0ac728e6f49c7314ee08685de2d7415480c78f6309bde5d39bf534e"} Oct 08 19:37:21 crc kubenswrapper[4750]: I1008 19:37:21.924686 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9tvjp" podStartSLOduration=3.438069325 podStartE2EDuration="5.924657s" podCreationTimestamp="2025-10-08 19:37:16 +0000 UTC" firstStartedPulling="2025-10-08 19:37:18.832578868 +0000 UTC m=+5194.745549911" lastFinishedPulling="2025-10-08 19:37:21.319166573 +0000 UTC 
m=+5197.232137586" observedRunningTime="2025-10-08 19:37:21.920593468 +0000 UTC m=+5197.833564511" watchObservedRunningTime="2025-10-08 19:37:21.924657 +0000 UTC m=+5197.837628013" Oct 08 19:37:27 crc kubenswrapper[4750]: I1008 19:37:27.295828 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9tvjp" Oct 08 19:37:27 crc kubenswrapper[4750]: I1008 19:37:27.296638 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9tvjp" Oct 08 19:37:27 crc kubenswrapper[4750]: I1008 19:37:27.367085 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9tvjp" Oct 08 19:37:27 crc kubenswrapper[4750]: I1008 19:37:27.978847 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9tvjp" Oct 08 19:37:28 crc kubenswrapper[4750]: I1008 19:37:28.039115 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9tvjp"] Oct 08 19:37:29 crc kubenswrapper[4750]: I1008 19:37:29.944510 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9tvjp" podUID="080c4cad-febe-4c3a-9e6e-bfa2955ab21c" containerName="registry-server" containerID="cri-o://81e69ba4c0ac728e6f49c7314ee08685de2d7415480c78f6309bde5d39bf534e" gracePeriod=2 Oct 08 19:37:30 crc kubenswrapper[4750]: I1008 19:37:30.739225 4750 scope.go:117] "RemoveContainer" containerID="42c2a7a6ce3026c9ad8c8eff781fbf4cba179ca631dc8f4fcdd0691647588e7a" Oct 08 19:37:30 crc kubenswrapper[4750]: E1008 19:37:30.746810 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:37:30 crc kubenswrapper[4750]: I1008 19:37:30.919199 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9tvjp" Oct 08 19:37:30 crc kubenswrapper[4750]: I1008 19:37:30.966911 4750 generic.go:334] "Generic (PLEG): container finished" podID="080c4cad-febe-4c3a-9e6e-bfa2955ab21c" containerID="81e69ba4c0ac728e6f49c7314ee08685de2d7415480c78f6309bde5d39bf534e" exitCode=0 Oct 08 19:37:30 crc kubenswrapper[4750]: I1008 19:37:30.968170 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9tvjp" Oct 08 19:37:30 crc kubenswrapper[4750]: I1008 19:37:30.968769 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9tvjp" event={"ID":"080c4cad-febe-4c3a-9e6e-bfa2955ab21c","Type":"ContainerDied","Data":"81e69ba4c0ac728e6f49c7314ee08685de2d7415480c78f6309bde5d39bf534e"} Oct 08 19:37:30 crc kubenswrapper[4750]: I1008 19:37:30.968817 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9tvjp" event={"ID":"080c4cad-febe-4c3a-9e6e-bfa2955ab21c","Type":"ContainerDied","Data":"8b4190cff8f54ba15672d2ea1580ba91f5ef4e2b124e72480d661c5a164556ec"} Oct 08 19:37:30 crc kubenswrapper[4750]: I1008 19:37:30.968847 4750 scope.go:117] "RemoveContainer" containerID="81e69ba4c0ac728e6f49c7314ee08685de2d7415480c78f6309bde5d39bf534e" Oct 08 19:37:31 crc kubenswrapper[4750]: I1008 19:37:31.007006 4750 scope.go:117] "RemoveContainer" containerID="9a2c9381418609545f4ac3b53803e741f39cba379205e2bfbf9eecd4e2cf6086" Oct 08 19:37:31 crc kubenswrapper[4750]: I1008 19:37:31.037817 4750 scope.go:117] "RemoveContainer" 
containerID="a15c918b9eefbab4e700d5f209699f4042545d46c81bb0f5e00fe403cf208840" Oct 08 19:37:31 crc kubenswrapper[4750]: I1008 19:37:31.058352 4750 scope.go:117] "RemoveContainer" containerID="81e69ba4c0ac728e6f49c7314ee08685de2d7415480c78f6309bde5d39bf534e" Oct 08 19:37:31 crc kubenswrapper[4750]: E1008 19:37:31.058954 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81e69ba4c0ac728e6f49c7314ee08685de2d7415480c78f6309bde5d39bf534e\": container with ID starting with 81e69ba4c0ac728e6f49c7314ee08685de2d7415480c78f6309bde5d39bf534e not found: ID does not exist" containerID="81e69ba4c0ac728e6f49c7314ee08685de2d7415480c78f6309bde5d39bf534e" Oct 08 19:37:31 crc kubenswrapper[4750]: I1008 19:37:31.058992 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81e69ba4c0ac728e6f49c7314ee08685de2d7415480c78f6309bde5d39bf534e"} err="failed to get container status \"81e69ba4c0ac728e6f49c7314ee08685de2d7415480c78f6309bde5d39bf534e\": rpc error: code = NotFound desc = could not find container \"81e69ba4c0ac728e6f49c7314ee08685de2d7415480c78f6309bde5d39bf534e\": container with ID starting with 81e69ba4c0ac728e6f49c7314ee08685de2d7415480c78f6309bde5d39bf534e not found: ID does not exist" Oct 08 19:37:31 crc kubenswrapper[4750]: I1008 19:37:31.059018 4750 scope.go:117] "RemoveContainer" containerID="9a2c9381418609545f4ac3b53803e741f39cba379205e2bfbf9eecd4e2cf6086" Oct 08 19:37:31 crc kubenswrapper[4750]: E1008 19:37:31.059616 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a2c9381418609545f4ac3b53803e741f39cba379205e2bfbf9eecd4e2cf6086\": container with ID starting with 9a2c9381418609545f4ac3b53803e741f39cba379205e2bfbf9eecd4e2cf6086 not found: ID does not exist" containerID="9a2c9381418609545f4ac3b53803e741f39cba379205e2bfbf9eecd4e2cf6086" Oct 08 19:37:31 crc 
kubenswrapper[4750]: I1008 19:37:31.059665 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a2c9381418609545f4ac3b53803e741f39cba379205e2bfbf9eecd4e2cf6086"} err="failed to get container status \"9a2c9381418609545f4ac3b53803e741f39cba379205e2bfbf9eecd4e2cf6086\": rpc error: code = NotFound desc = could not find container \"9a2c9381418609545f4ac3b53803e741f39cba379205e2bfbf9eecd4e2cf6086\": container with ID starting with 9a2c9381418609545f4ac3b53803e741f39cba379205e2bfbf9eecd4e2cf6086 not found: ID does not exist" Oct 08 19:37:31 crc kubenswrapper[4750]: I1008 19:37:31.059699 4750 scope.go:117] "RemoveContainer" containerID="a15c918b9eefbab4e700d5f209699f4042545d46c81bb0f5e00fe403cf208840" Oct 08 19:37:31 crc kubenswrapper[4750]: E1008 19:37:31.059999 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a15c918b9eefbab4e700d5f209699f4042545d46c81bb0f5e00fe403cf208840\": container with ID starting with a15c918b9eefbab4e700d5f209699f4042545d46c81bb0f5e00fe403cf208840 not found: ID does not exist" containerID="a15c918b9eefbab4e700d5f209699f4042545d46c81bb0f5e00fe403cf208840" Oct 08 19:37:31 crc kubenswrapper[4750]: I1008 19:37:31.060033 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a15c918b9eefbab4e700d5f209699f4042545d46c81bb0f5e00fe403cf208840"} err="failed to get container status \"a15c918b9eefbab4e700d5f209699f4042545d46c81bb0f5e00fe403cf208840\": rpc error: code = NotFound desc = could not find container \"a15c918b9eefbab4e700d5f209699f4042545d46c81bb0f5e00fe403cf208840\": container with ID starting with a15c918b9eefbab4e700d5f209699f4042545d46c81bb0f5e00fe403cf208840 not found: ID does not exist" Oct 08 19:37:31 crc kubenswrapper[4750]: I1008 19:37:31.068719 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddkg9\" (UniqueName: 
\"kubernetes.io/projected/080c4cad-febe-4c3a-9e6e-bfa2955ab21c-kube-api-access-ddkg9\") pod \"080c4cad-febe-4c3a-9e6e-bfa2955ab21c\" (UID: \"080c4cad-febe-4c3a-9e6e-bfa2955ab21c\") " Oct 08 19:37:31 crc kubenswrapper[4750]: I1008 19:37:31.068912 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/080c4cad-febe-4c3a-9e6e-bfa2955ab21c-utilities\") pod \"080c4cad-febe-4c3a-9e6e-bfa2955ab21c\" (UID: \"080c4cad-febe-4c3a-9e6e-bfa2955ab21c\") " Oct 08 19:37:31 crc kubenswrapper[4750]: I1008 19:37:31.068961 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/080c4cad-febe-4c3a-9e6e-bfa2955ab21c-catalog-content\") pod \"080c4cad-febe-4c3a-9e6e-bfa2955ab21c\" (UID: \"080c4cad-febe-4c3a-9e6e-bfa2955ab21c\") " Oct 08 19:37:31 crc kubenswrapper[4750]: I1008 19:37:31.070436 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/080c4cad-febe-4c3a-9e6e-bfa2955ab21c-utilities" (OuterVolumeSpecName: "utilities") pod "080c4cad-febe-4c3a-9e6e-bfa2955ab21c" (UID: "080c4cad-febe-4c3a-9e6e-bfa2955ab21c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:37:31 crc kubenswrapper[4750]: I1008 19:37:31.075336 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/080c4cad-febe-4c3a-9e6e-bfa2955ab21c-kube-api-access-ddkg9" (OuterVolumeSpecName: "kube-api-access-ddkg9") pod "080c4cad-febe-4c3a-9e6e-bfa2955ab21c" (UID: "080c4cad-febe-4c3a-9e6e-bfa2955ab21c"). InnerVolumeSpecName "kube-api-access-ddkg9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:37:31 crc kubenswrapper[4750]: I1008 19:37:31.138916 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/080c4cad-febe-4c3a-9e6e-bfa2955ab21c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "080c4cad-febe-4c3a-9e6e-bfa2955ab21c" (UID: "080c4cad-febe-4c3a-9e6e-bfa2955ab21c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:37:31 crc kubenswrapper[4750]: I1008 19:37:31.170657 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/080c4cad-febe-4c3a-9e6e-bfa2955ab21c-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 19:37:31 crc kubenswrapper[4750]: I1008 19:37:31.170696 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/080c4cad-febe-4c3a-9e6e-bfa2955ab21c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 19:37:31 crc kubenswrapper[4750]: I1008 19:37:31.170709 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddkg9\" (UniqueName: \"kubernetes.io/projected/080c4cad-febe-4c3a-9e6e-bfa2955ab21c-kube-api-access-ddkg9\") on node \"crc\" DevicePath \"\"" Oct 08 19:37:31 crc kubenswrapper[4750]: I1008 19:37:31.313653 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9tvjp"] Oct 08 19:37:31 crc kubenswrapper[4750]: I1008 19:37:31.317401 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9tvjp"] Oct 08 19:37:32 crc kubenswrapper[4750]: I1008 19:37:32.751833 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="080c4cad-febe-4c3a-9e6e-bfa2955ab21c" path="/var/lib/kubelet/pods/080c4cad-febe-4c3a-9e6e-bfa2955ab21c/volumes" Oct 08 19:37:42 crc kubenswrapper[4750]: I1008 19:37:42.735659 4750 scope.go:117] 
"RemoveContainer" containerID="42c2a7a6ce3026c9ad8c8eff781fbf4cba179ca631dc8f4fcdd0691647588e7a" Oct 08 19:37:42 crc kubenswrapper[4750]: E1008 19:37:42.737341 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:37:54 crc kubenswrapper[4750]: I1008 19:37:54.142839 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Oct 08 19:37:54 crc kubenswrapper[4750]: E1008 19:37:54.144216 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="080c4cad-febe-4c3a-9e6e-bfa2955ab21c" containerName="registry-server" Oct 08 19:37:54 crc kubenswrapper[4750]: I1008 19:37:54.144239 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="080c4cad-febe-4c3a-9e6e-bfa2955ab21c" containerName="registry-server" Oct 08 19:37:54 crc kubenswrapper[4750]: E1008 19:37:54.144262 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="080c4cad-febe-4c3a-9e6e-bfa2955ab21c" containerName="extract-content" Oct 08 19:37:54 crc kubenswrapper[4750]: I1008 19:37:54.144272 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="080c4cad-febe-4c3a-9e6e-bfa2955ab21c" containerName="extract-content" Oct 08 19:37:54 crc kubenswrapper[4750]: E1008 19:37:54.144294 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="080c4cad-febe-4c3a-9e6e-bfa2955ab21c" containerName="extract-utilities" Oct 08 19:37:54 crc kubenswrapper[4750]: I1008 19:37:54.144303 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="080c4cad-febe-4c3a-9e6e-bfa2955ab21c" containerName="extract-utilities" Oct 08 19:37:54 crc kubenswrapper[4750]: I1008 19:37:54.144522 
4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="080c4cad-febe-4c3a-9e6e-bfa2955ab21c" containerName="registry-server" Oct 08 19:37:54 crc kubenswrapper[4750]: I1008 19:37:54.145356 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Oct 08 19:37:54 crc kubenswrapper[4750]: I1008 19:37:54.149773 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-5p88g" Oct 08 19:37:54 crc kubenswrapper[4750]: I1008 19:37:54.153858 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Oct 08 19:37:54 crc kubenswrapper[4750]: I1008 19:37:54.213099 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0ee0f5f6-2247-4da4-97ae-28ee21aceb5e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0ee0f5f6-2247-4da4-97ae-28ee21aceb5e\") pod \"mariadb-copy-data\" (UID: \"20764f96-f8a9-499f-9341-941096cf77ce\") " pod="openstack/mariadb-copy-data" Oct 08 19:37:54 crc kubenswrapper[4750]: I1008 19:37:54.213153 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx5tq\" (UniqueName: \"kubernetes.io/projected/20764f96-f8a9-499f-9341-941096cf77ce-kube-api-access-gx5tq\") pod \"mariadb-copy-data\" (UID: \"20764f96-f8a9-499f-9341-941096cf77ce\") " pod="openstack/mariadb-copy-data" Oct 08 19:37:54 crc kubenswrapper[4750]: I1008 19:37:54.315228 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0ee0f5f6-2247-4da4-97ae-28ee21aceb5e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0ee0f5f6-2247-4da4-97ae-28ee21aceb5e\") pod \"mariadb-copy-data\" (UID: \"20764f96-f8a9-499f-9341-941096cf77ce\") " pod="openstack/mariadb-copy-data" Oct 08 19:37:54 crc kubenswrapper[4750]: I1008 19:37:54.315326 4750 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-gx5tq\" (UniqueName: \"kubernetes.io/projected/20764f96-f8a9-499f-9341-941096cf77ce-kube-api-access-gx5tq\") pod \"mariadb-copy-data\" (UID: \"20764f96-f8a9-499f-9341-941096cf77ce\") " pod="openstack/mariadb-copy-data" Oct 08 19:37:54 crc kubenswrapper[4750]: I1008 19:37:54.319487 4750 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 08 19:37:54 crc kubenswrapper[4750]: I1008 19:37:54.319540 4750 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0ee0f5f6-2247-4da4-97ae-28ee21aceb5e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0ee0f5f6-2247-4da4-97ae-28ee21aceb5e\") pod \"mariadb-copy-data\" (UID: \"20764f96-f8a9-499f-9341-941096cf77ce\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4baaa0ad1d4d4dac7f26e43c509ad6388521970a836746f1563864926628508a/globalmount\"" pod="openstack/mariadb-copy-data" Oct 08 19:37:54 crc kubenswrapper[4750]: I1008 19:37:54.348590 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gx5tq\" (UniqueName: \"kubernetes.io/projected/20764f96-f8a9-499f-9341-941096cf77ce-kube-api-access-gx5tq\") pod \"mariadb-copy-data\" (UID: \"20764f96-f8a9-499f-9341-941096cf77ce\") " pod="openstack/mariadb-copy-data" Oct 08 19:37:54 crc kubenswrapper[4750]: I1008 19:37:54.388130 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0ee0f5f6-2247-4da4-97ae-28ee21aceb5e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0ee0f5f6-2247-4da4-97ae-28ee21aceb5e\") pod \"mariadb-copy-data\" (UID: \"20764f96-f8a9-499f-9341-941096cf77ce\") " pod="openstack/mariadb-copy-data" Oct 08 19:37:54 crc kubenswrapper[4750]: I1008 19:37:54.480569 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Oct 08 19:37:55 crc kubenswrapper[4750]: I1008 19:37:55.026436 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Oct 08 19:37:55 crc kubenswrapper[4750]: W1008 19:37:55.027435 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20764f96_f8a9_499f_9341_941096cf77ce.slice/crio-dc9b85ff924250c98b57523c7be37043bffc8338725cc51081272a38c33b81ab WatchSource:0}: Error finding container dc9b85ff924250c98b57523c7be37043bffc8338725cc51081272a38c33b81ab: Status 404 returned error can't find the container with id dc9b85ff924250c98b57523c7be37043bffc8338725cc51081272a38c33b81ab Oct 08 19:37:55 crc kubenswrapper[4750]: I1008 19:37:55.258337 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"20764f96-f8a9-499f-9341-941096cf77ce","Type":"ContainerStarted","Data":"7fabd745f1245f743f58884167041d134c2dc6793911dcb872cd7a5325b7264d"} Oct 08 19:37:55 crc kubenswrapper[4750]: I1008 19:37:55.258845 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"20764f96-f8a9-499f-9341-941096cf77ce","Type":"ContainerStarted","Data":"dc9b85ff924250c98b57523c7be37043bffc8338725cc51081272a38c33b81ab"} Oct 08 19:37:55 crc kubenswrapper[4750]: I1008 19:37:55.284465 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=2.284415097 podStartE2EDuration="2.284415097s" podCreationTimestamp="2025-10-08 19:37:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:37:55.277893993 +0000 UTC m=+5231.190865026" watchObservedRunningTime="2025-10-08 19:37:55.284415097 +0000 UTC m=+5231.197386160" Oct 08 19:37:57 crc kubenswrapper[4750]: I1008 19:37:57.122242 4750 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Oct 08 19:37:57 crc kubenswrapper[4750]: I1008 19:37:57.124264 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Oct 08 19:37:57 crc kubenswrapper[4750]: I1008 19:37:57.130665 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Oct 08 19:37:57 crc kubenswrapper[4750]: I1008 19:37:57.165827 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gdxx\" (UniqueName: \"kubernetes.io/projected/13000279-d44c-4fdd-b436-77a860f7e775-kube-api-access-5gdxx\") pod \"mariadb-client\" (UID: \"13000279-d44c-4fdd-b436-77a860f7e775\") " pod="openstack/mariadb-client" Oct 08 19:37:57 crc kubenswrapper[4750]: I1008 19:37:57.266930 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gdxx\" (UniqueName: \"kubernetes.io/projected/13000279-d44c-4fdd-b436-77a860f7e775-kube-api-access-5gdxx\") pod \"mariadb-client\" (UID: \"13000279-d44c-4fdd-b436-77a860f7e775\") " pod="openstack/mariadb-client" Oct 08 19:37:57 crc kubenswrapper[4750]: I1008 19:37:57.297847 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gdxx\" (UniqueName: \"kubernetes.io/projected/13000279-d44c-4fdd-b436-77a860f7e775-kube-api-access-5gdxx\") pod \"mariadb-client\" (UID: \"13000279-d44c-4fdd-b436-77a860f7e775\") " pod="openstack/mariadb-client" Oct 08 19:37:57 crc kubenswrapper[4750]: I1008 19:37:57.461701 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Oct 08 19:37:57 crc kubenswrapper[4750]: I1008 19:37:57.729616 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Oct 08 19:37:57 crc kubenswrapper[4750]: I1008 19:37:57.734034 4750 scope.go:117] "RemoveContainer" containerID="42c2a7a6ce3026c9ad8c8eff781fbf4cba179ca631dc8f4fcdd0691647588e7a" Oct 08 19:37:57 crc kubenswrapper[4750]: E1008 19:37:57.734297 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:37:58 crc kubenswrapper[4750]: I1008 19:37:58.296542 4750 generic.go:334] "Generic (PLEG): container finished" podID="13000279-d44c-4fdd-b436-77a860f7e775" containerID="e43d5f1ca7af778e5e2dc05bfebf82183bce50911e4a8418399894f93c50aaf5" exitCode=0 Oct 08 19:37:58 crc kubenswrapper[4750]: I1008 19:37:58.296656 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"13000279-d44c-4fdd-b436-77a860f7e775","Type":"ContainerDied","Data":"e43d5f1ca7af778e5e2dc05bfebf82183bce50911e4a8418399894f93c50aaf5"} Oct 08 19:37:58 crc kubenswrapper[4750]: I1008 19:37:58.297021 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"13000279-d44c-4fdd-b436-77a860f7e775","Type":"ContainerStarted","Data":"d96164c099ec53b3202b06edf72f86c5d9f1c289701121e98db65899ada1b083"} Oct 08 19:37:59 crc kubenswrapper[4750]: I1008 19:37:59.618485 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Oct 08 19:37:59 crc kubenswrapper[4750]: I1008 19:37:59.652793 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_13000279-d44c-4fdd-b436-77a860f7e775/mariadb-client/0.log" Oct 08 19:37:59 crc kubenswrapper[4750]: I1008 19:37:59.686261 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Oct 08 19:37:59 crc kubenswrapper[4750]: I1008 19:37:59.693717 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Oct 08 19:37:59 crc kubenswrapper[4750]: I1008 19:37:59.814613 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gdxx\" (UniqueName: \"kubernetes.io/projected/13000279-d44c-4fdd-b436-77a860f7e775-kube-api-access-5gdxx\") pod \"13000279-d44c-4fdd-b436-77a860f7e775\" (UID: \"13000279-d44c-4fdd-b436-77a860f7e775\") " Oct 08 19:37:59 crc kubenswrapper[4750]: I1008 19:37:59.823471 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13000279-d44c-4fdd-b436-77a860f7e775-kube-api-access-5gdxx" (OuterVolumeSpecName: "kube-api-access-5gdxx") pod "13000279-d44c-4fdd-b436-77a860f7e775" (UID: "13000279-d44c-4fdd-b436-77a860f7e775"). InnerVolumeSpecName "kube-api-access-5gdxx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:37:59 crc kubenswrapper[4750]: I1008 19:37:59.852646 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Oct 08 19:37:59 crc kubenswrapper[4750]: E1008 19:37:59.853169 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13000279-d44c-4fdd-b436-77a860f7e775" containerName="mariadb-client" Oct 08 19:37:59 crc kubenswrapper[4750]: I1008 19:37:59.853197 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="13000279-d44c-4fdd-b436-77a860f7e775" containerName="mariadb-client" Oct 08 19:37:59 crc kubenswrapper[4750]: I1008 19:37:59.853380 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="13000279-d44c-4fdd-b436-77a860f7e775" containerName="mariadb-client" Oct 08 19:37:59 crc kubenswrapper[4750]: I1008 19:37:59.854090 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Oct 08 19:37:59 crc kubenswrapper[4750]: I1008 19:37:59.865876 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Oct 08 19:37:59 crc kubenswrapper[4750]: I1008 19:37:59.917306 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gdxx\" (UniqueName: \"kubernetes.io/projected/13000279-d44c-4fdd-b436-77a860f7e775-kube-api-access-5gdxx\") on node \"crc\" DevicePath \"\"" Oct 08 19:38:00 crc kubenswrapper[4750]: I1008 19:38:00.019729 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8m8m\" (UniqueName: \"kubernetes.io/projected/61d0e000-a9f6-48ca-80fe-d3135666b3ec-kube-api-access-c8m8m\") pod \"mariadb-client\" (UID: \"61d0e000-a9f6-48ca-80fe-d3135666b3ec\") " pod="openstack/mariadb-client" Oct 08 19:38:00 crc kubenswrapper[4750]: I1008 19:38:00.121884 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8m8m\" (UniqueName: 
\"kubernetes.io/projected/61d0e000-a9f6-48ca-80fe-d3135666b3ec-kube-api-access-c8m8m\") pod \"mariadb-client\" (UID: \"61d0e000-a9f6-48ca-80fe-d3135666b3ec\") " pod="openstack/mariadb-client" Oct 08 19:38:00 crc kubenswrapper[4750]: I1008 19:38:00.143520 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8m8m\" (UniqueName: \"kubernetes.io/projected/61d0e000-a9f6-48ca-80fe-d3135666b3ec-kube-api-access-c8m8m\") pod \"mariadb-client\" (UID: \"61d0e000-a9f6-48ca-80fe-d3135666b3ec\") " pod="openstack/mariadb-client" Oct 08 19:38:00 crc kubenswrapper[4750]: I1008 19:38:00.182648 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Oct 08 19:38:00 crc kubenswrapper[4750]: I1008 19:38:00.320788 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d96164c099ec53b3202b06edf72f86c5d9f1c289701121e98db65899ada1b083" Oct 08 19:38:00 crc kubenswrapper[4750]: I1008 19:38:00.320883 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Oct 08 19:38:00 crc kubenswrapper[4750]: I1008 19:38:00.344023 4750 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="13000279-d44c-4fdd-b436-77a860f7e775" podUID="61d0e000-a9f6-48ca-80fe-d3135666b3ec" Oct 08 19:38:00 crc kubenswrapper[4750]: I1008 19:38:00.387453 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Oct 08 19:38:00 crc kubenswrapper[4750]: W1008 19:38:00.392965 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61d0e000_a9f6_48ca_80fe_d3135666b3ec.slice/crio-5828bfe14d42800363b9913960a595287a6f7d0f5390d426f2eadcea5c04c5e4 WatchSource:0}: Error finding container 5828bfe14d42800363b9913960a595287a6f7d0f5390d426f2eadcea5c04c5e4: Status 404 returned error can't find the container with id 5828bfe14d42800363b9913960a595287a6f7d0f5390d426f2eadcea5c04c5e4 Oct 08 19:38:00 crc kubenswrapper[4750]: I1008 19:38:00.745721 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13000279-d44c-4fdd-b436-77a860f7e775" path="/var/lib/kubelet/pods/13000279-d44c-4fdd-b436-77a860f7e775/volumes" Oct 08 19:38:01 crc kubenswrapper[4750]: I1008 19:38:01.342051 4750 generic.go:334] "Generic (PLEG): container finished" podID="61d0e000-a9f6-48ca-80fe-d3135666b3ec" containerID="5b4ea5eeddf85fce23d62c88367bd935fdfe85e9f6f36fce3510e6002e668923" exitCode=0 Oct 08 19:38:01 crc kubenswrapper[4750]: I1008 19:38:01.344252 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"61d0e000-a9f6-48ca-80fe-d3135666b3ec","Type":"ContainerDied","Data":"5b4ea5eeddf85fce23d62c88367bd935fdfe85e9f6f36fce3510e6002e668923"} Oct 08 19:38:01 crc kubenswrapper[4750]: I1008 19:38:01.344304 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" 
event={"ID":"61d0e000-a9f6-48ca-80fe-d3135666b3ec","Type":"ContainerStarted","Data":"5828bfe14d42800363b9913960a595287a6f7d0f5390d426f2eadcea5c04c5e4"} Oct 08 19:38:02 crc kubenswrapper[4750]: I1008 19:38:02.750592 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Oct 08 19:38:02 crc kubenswrapper[4750]: I1008 19:38:02.772263 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_61d0e000-a9f6-48ca-80fe-d3135666b3ec/mariadb-client/0.log" Oct 08 19:38:02 crc kubenswrapper[4750]: I1008 19:38:02.803390 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Oct 08 19:38:02 crc kubenswrapper[4750]: I1008 19:38:02.809103 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Oct 08 19:38:02 crc kubenswrapper[4750]: I1008 19:38:02.882800 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8m8m\" (UniqueName: \"kubernetes.io/projected/61d0e000-a9f6-48ca-80fe-d3135666b3ec-kube-api-access-c8m8m\") pod \"61d0e000-a9f6-48ca-80fe-d3135666b3ec\" (UID: \"61d0e000-a9f6-48ca-80fe-d3135666b3ec\") " Oct 08 19:38:02 crc kubenswrapper[4750]: I1008 19:38:02.893538 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61d0e000-a9f6-48ca-80fe-d3135666b3ec-kube-api-access-c8m8m" (OuterVolumeSpecName: "kube-api-access-c8m8m") pod "61d0e000-a9f6-48ca-80fe-d3135666b3ec" (UID: "61d0e000-a9f6-48ca-80fe-d3135666b3ec"). InnerVolumeSpecName "kube-api-access-c8m8m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:38:02 crc kubenswrapper[4750]: I1008 19:38:02.985278 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8m8m\" (UniqueName: \"kubernetes.io/projected/61d0e000-a9f6-48ca-80fe-d3135666b3ec-kube-api-access-c8m8m\") on node \"crc\" DevicePath \"\"" Oct 08 19:38:03 crc kubenswrapper[4750]: I1008 19:38:03.364653 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5828bfe14d42800363b9913960a595287a6f7d0f5390d426f2eadcea5c04c5e4" Oct 08 19:38:03 crc kubenswrapper[4750]: I1008 19:38:03.365014 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Oct 08 19:38:04 crc kubenswrapper[4750]: I1008 19:38:04.748989 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61d0e000-a9f6-48ca-80fe-d3135666b3ec" path="/var/lib/kubelet/pods/61d0e000-a9f6-48ca-80fe-d3135666b3ec/volumes" Oct 08 19:38:11 crc kubenswrapper[4750]: I1008 19:38:11.734340 4750 scope.go:117] "RemoveContainer" containerID="42c2a7a6ce3026c9ad8c8eff781fbf4cba179ca631dc8f4fcdd0691647588e7a" Oct 08 19:38:11 crc kubenswrapper[4750]: E1008 19:38:11.735758 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:38:25 crc kubenswrapper[4750]: I1008 19:38:25.734323 4750 scope.go:117] "RemoveContainer" containerID="42c2a7a6ce3026c9ad8c8eff781fbf4cba179ca631dc8f4fcdd0691647588e7a" Oct 08 19:38:25 crc kubenswrapper[4750]: E1008 19:38:25.736021 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.582390 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 08 19:38:36 crc kubenswrapper[4750]: E1008 19:38:36.585732 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61d0e000-a9f6-48ca-80fe-d3135666b3ec" containerName="mariadb-client" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.585772 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="61d0e000-a9f6-48ca-80fe-d3135666b3ec" containerName="mariadb-client" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.586107 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="61d0e000-a9f6-48ca-80fe-d3135666b3ec" containerName="mariadb-client" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.587735 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.595265 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.595458 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.595670 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-nzhzl" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.600136 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.619582 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.621982 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.631147 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.632969 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-1" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.651804 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.663300 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.702711 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/595b9fc0-5a3e-4761-beaa-91924ecf4f54-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"595b9fc0-5a3e-4761-beaa-91924ecf4f54\") " pod="openstack/ovsdbserver-sb-0" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.702786 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-fa147cac-5ae0-481b-b0ba-21e5b93164dc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fa147cac-5ae0-481b-b0ba-21e5b93164dc\") pod \"ovsdbserver-sb-0\" (UID: \"595b9fc0-5a3e-4761-beaa-91924ecf4f54\") " pod="openstack/ovsdbserver-sb-0" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.703247 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/595b9fc0-5a3e-4761-beaa-91924ecf4f54-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"595b9fc0-5a3e-4761-beaa-91924ecf4f54\") " pod="openstack/ovsdbserver-sb-0" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.703349 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf79z\" (UniqueName: \"kubernetes.io/projected/595b9fc0-5a3e-4761-beaa-91924ecf4f54-kube-api-access-kf79z\") pod \"ovsdbserver-sb-0\" (UID: \"595b9fc0-5a3e-4761-beaa-91924ecf4f54\") " pod="openstack/ovsdbserver-sb-0" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.703395 
4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/595b9fc0-5a3e-4761-beaa-91924ecf4f54-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"595b9fc0-5a3e-4761-beaa-91924ecf4f54\") " pod="openstack/ovsdbserver-sb-0" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.703425 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/595b9fc0-5a3e-4761-beaa-91924ecf4f54-config\") pod \"ovsdbserver-sb-0\" (UID: \"595b9fc0-5a3e-4761-beaa-91924ecf4f54\") " pod="openstack/ovsdbserver-sb-0" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.807893 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28356cd5-1bba-4d6c-8d73-c5650f3daa80-config\") pod \"ovsdbserver-sb-2\" (UID: \"28356cd5-1bba-4d6c-8d73-c5650f3daa80\") " pod="openstack/ovsdbserver-sb-2" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.808064 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28356cd5-1bba-4d6c-8d73-c5650f3daa80-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"28356cd5-1bba-4d6c-8d73-c5650f3daa80\") " pod="openstack/ovsdbserver-sb-2" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.808184 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/595b9fc0-5a3e-4761-beaa-91924ecf4f54-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"595b9fc0-5a3e-4761-beaa-91924ecf4f54\") " pod="openstack/ovsdbserver-sb-0" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.808221 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf79z\" (UniqueName: 
\"kubernetes.io/projected/595b9fc0-5a3e-4761-beaa-91924ecf4f54-kube-api-access-kf79z\") pod \"ovsdbserver-sb-0\" (UID: \"595b9fc0-5a3e-4761-beaa-91924ecf4f54\") " pod="openstack/ovsdbserver-sb-0" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.808304 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/595b9fc0-5a3e-4761-beaa-91924ecf4f54-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"595b9fc0-5a3e-4761-beaa-91924ecf4f54\") " pod="openstack/ovsdbserver-sb-0" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.808361 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/28356cd5-1bba-4d6c-8d73-c5650f3daa80-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"28356cd5-1bba-4d6c-8d73-c5650f3daa80\") " pod="openstack/ovsdbserver-sb-2" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.808445 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/595b9fc0-5a3e-4761-beaa-91924ecf4f54-config\") pod \"ovsdbserver-sb-0\" (UID: \"595b9fc0-5a3e-4761-beaa-91924ecf4f54\") " pod="openstack/ovsdbserver-sb-0" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.808511 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1aa84d3-2d58-482f-8a8c-fe18543714de-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"d1aa84d3-2d58-482f-8a8c-fe18543714de\") " pod="openstack/ovsdbserver-sb-1" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.808588 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d899h\" (UniqueName: \"kubernetes.io/projected/d1aa84d3-2d58-482f-8a8c-fe18543714de-kube-api-access-d899h\") pod \"ovsdbserver-sb-1\" 
(UID: \"d1aa84d3-2d58-482f-8a8c-fe18543714de\") " pod="openstack/ovsdbserver-sb-1" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.808621 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1aa84d3-2d58-482f-8a8c-fe18543714de-config\") pod \"ovsdbserver-sb-1\" (UID: \"d1aa84d3-2d58-482f-8a8c-fe18543714de\") " pod="openstack/ovsdbserver-sb-1" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.809079 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/595b9fc0-5a3e-4761-beaa-91924ecf4f54-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"595b9fc0-5a3e-4761-beaa-91924ecf4f54\") " pod="openstack/ovsdbserver-sb-0" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.809428 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/595b9fc0-5a3e-4761-beaa-91924ecf4f54-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"595b9fc0-5a3e-4761-beaa-91924ecf4f54\") " pod="openstack/ovsdbserver-sb-0" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.809614 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-791df69b-8fb6-4469-aeb6-4d303ac8b8b6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-791df69b-8fb6-4469-aeb6-4d303ac8b8b6\") pod \"ovsdbserver-sb-2\" (UID: \"28356cd5-1bba-4d6c-8d73-c5650f3daa80\") " pod="openstack/ovsdbserver-sb-2" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.809919 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-fa147cac-5ae0-481b-b0ba-21e5b93164dc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fa147cac-5ae0-481b-b0ba-21e5b93164dc\") pod \"ovsdbserver-sb-0\" (UID: \"595b9fc0-5a3e-4761-beaa-91924ecf4f54\") " pod="openstack/ovsdbserver-sb-0" Oct 08 
19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.809996 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4a212b96-3bc1-4a4f-8362-71bf26952fce\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4a212b96-3bc1-4a4f-8362-71bf26952fce\") pod \"ovsdbserver-sb-1\" (UID: \"d1aa84d3-2d58-482f-8a8c-fe18543714de\") " pod="openstack/ovsdbserver-sb-1" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.810059 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq7bf\" (UniqueName: \"kubernetes.io/projected/28356cd5-1bba-4d6c-8d73-c5650f3daa80-kube-api-access-tq7bf\") pod \"ovsdbserver-sb-2\" (UID: \"28356cd5-1bba-4d6c-8d73-c5650f3daa80\") " pod="openstack/ovsdbserver-sb-2" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.810084 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d1aa84d3-2d58-482f-8a8c-fe18543714de-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"d1aa84d3-2d58-482f-8a8c-fe18543714de\") " pod="openstack/ovsdbserver-sb-1" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.810117 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d1aa84d3-2d58-482f-8a8c-fe18543714de-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"d1aa84d3-2d58-482f-8a8c-fe18543714de\") " pod="openstack/ovsdbserver-sb-1" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.810363 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28356cd5-1bba-4d6c-8d73-c5650f3daa80-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"28356cd5-1bba-4d6c-8d73-c5650f3daa80\") " pod="openstack/ovsdbserver-sb-2" Oct 08 19:38:36 crc kubenswrapper[4750]: 
I1008 19:38:36.811076 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/595b9fc0-5a3e-4761-beaa-91924ecf4f54-config\") pod \"ovsdbserver-sb-0\" (UID: \"595b9fc0-5a3e-4761-beaa-91924ecf4f54\") " pod="openstack/ovsdbserver-sb-0" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.811417 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/595b9fc0-5a3e-4761-beaa-91924ecf4f54-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"595b9fc0-5a3e-4761-beaa-91924ecf4f54\") " pod="openstack/ovsdbserver-sb-0" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.829766 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/595b9fc0-5a3e-4761-beaa-91924ecf4f54-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"595b9fc0-5a3e-4761-beaa-91924ecf4f54\") " pod="openstack/ovsdbserver-sb-0" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.830688 4750 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.830775 4750 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-fa147cac-5ae0-481b-b0ba-21e5b93164dc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fa147cac-5ae0-481b-b0ba-21e5b93164dc\") pod \"ovsdbserver-sb-0\" (UID: \"595b9fc0-5a3e-4761-beaa-91924ecf4f54\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/39adb5a3bacd1b2bcb0151898f2544cf758da9f724735108cd9ab338c8e1a056/globalmount\"" pod="openstack/ovsdbserver-sb-0" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.832981 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.837343 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.843041 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.843389 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.843944 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-f77nh" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.859011 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf79z\" (UniqueName: \"kubernetes.io/projected/595b9fc0-5a3e-4761-beaa-91924ecf4f54-kube-api-access-kf79z\") pod \"ovsdbserver-sb-0\" (UID: \"595b9fc0-5a3e-4761-beaa-91924ecf4f54\") " pod="openstack/ovsdbserver-sb-0" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.865484 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 
19:38:36.867624 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.900848 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.904784 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.911171 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-fa147cac-5ae0-481b-b0ba-21e5b93164dc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fa147cac-5ae0-481b-b0ba-21e5b93164dc\") pod \"ovsdbserver-sb-0\" (UID: \"595b9fc0-5a3e-4761-beaa-91924ecf4f54\") " pod="openstack/ovsdbserver-sb-0" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.911892 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.912633 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28356cd5-1bba-4d6c-8d73-c5650f3daa80-config\") pod \"ovsdbserver-sb-2\" (UID: \"28356cd5-1bba-4d6c-8d73-c5650f3daa80\") " pod="openstack/ovsdbserver-sb-2" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.912690 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28356cd5-1bba-4d6c-8d73-c5650f3daa80-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"28356cd5-1bba-4d6c-8d73-c5650f3daa80\") " pod="openstack/ovsdbserver-sb-2" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.912771 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95eb24ea-3c42-4690-9623-99af22f79703-combined-ca-bundle\") pod 
\"ovsdbserver-nb-0\" (UID: \"95eb24ea-3c42-4690-9623-99af22f79703\") " pod="openstack/ovsdbserver-nb-0" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.912798 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/28356cd5-1bba-4d6c-8d73-c5650f3daa80-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"28356cd5-1bba-4d6c-8d73-c5650f3daa80\") " pod="openstack/ovsdbserver-sb-2" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.912828 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-55e6b89f-fec7-4c72-9a3f-9113e0434b88\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-55e6b89f-fec7-4c72-9a3f-9113e0434b88\") pod \"ovsdbserver-nb-0\" (UID: \"95eb24ea-3c42-4690-9623-99af22f79703\") " pod="openstack/ovsdbserver-nb-0" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.912857 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95eb24ea-3c42-4690-9623-99af22f79703-config\") pod \"ovsdbserver-nb-0\" (UID: \"95eb24ea-3c42-4690-9623-99af22f79703\") " pod="openstack/ovsdbserver-nb-0" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.912885 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1aa84d3-2d58-482f-8a8c-fe18543714de-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"d1aa84d3-2d58-482f-8a8c-fe18543714de\") " pod="openstack/ovsdbserver-sb-1" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.912913 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwpw6\" (UniqueName: \"kubernetes.io/projected/95eb24ea-3c42-4690-9623-99af22f79703-kube-api-access-dwpw6\") pod \"ovsdbserver-nb-0\" (UID: \"95eb24ea-3c42-4690-9623-99af22f79703\") " 
pod="openstack/ovsdbserver-nb-0" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.912941 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d899h\" (UniqueName: \"kubernetes.io/projected/d1aa84d3-2d58-482f-8a8c-fe18543714de-kube-api-access-d899h\") pod \"ovsdbserver-sb-1\" (UID: \"d1aa84d3-2d58-482f-8a8c-fe18543714de\") " pod="openstack/ovsdbserver-sb-1" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.912970 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1aa84d3-2d58-482f-8a8c-fe18543714de-config\") pod \"ovsdbserver-sb-1\" (UID: \"d1aa84d3-2d58-482f-8a8c-fe18543714de\") " pod="openstack/ovsdbserver-sb-1" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.913002 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/95eb24ea-3c42-4690-9623-99af22f79703-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"95eb24ea-3c42-4690-9623-99af22f79703\") " pod="openstack/ovsdbserver-nb-0" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.913046 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-791df69b-8fb6-4469-aeb6-4d303ac8b8b6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-791df69b-8fb6-4469-aeb6-4d303ac8b8b6\") pod \"ovsdbserver-sb-2\" (UID: \"28356cd5-1bba-4d6c-8d73-c5650f3daa80\") " pod="openstack/ovsdbserver-sb-2" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.913240 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/95eb24ea-3c42-4690-9623-99af22f79703-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"95eb24ea-3c42-4690-9623-99af22f79703\") " pod="openstack/ovsdbserver-nb-0" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.913325 4750 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4a212b96-3bc1-4a4f-8362-71bf26952fce\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4a212b96-3bc1-4a4f-8362-71bf26952fce\") pod \"ovsdbserver-sb-1\" (UID: \"d1aa84d3-2d58-482f-8a8c-fe18543714de\") " pod="openstack/ovsdbserver-sb-1" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.913434 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq7bf\" (UniqueName: \"kubernetes.io/projected/28356cd5-1bba-4d6c-8d73-c5650f3daa80-kube-api-access-tq7bf\") pod \"ovsdbserver-sb-2\" (UID: \"28356cd5-1bba-4d6c-8d73-c5650f3daa80\") " pod="openstack/ovsdbserver-sb-2" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.913477 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d1aa84d3-2d58-482f-8a8c-fe18543714de-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"d1aa84d3-2d58-482f-8a8c-fe18543714de\") " pod="openstack/ovsdbserver-sb-1" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.913521 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d1aa84d3-2d58-482f-8a8c-fe18543714de-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"d1aa84d3-2d58-482f-8a8c-fe18543714de\") " pod="openstack/ovsdbserver-sb-1" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.913535 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/28356cd5-1bba-4d6c-8d73-c5650f3daa80-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"28356cd5-1bba-4d6c-8d73-c5650f3daa80\") " pod="openstack/ovsdbserver-sb-2" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.913832 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/28356cd5-1bba-4d6c-8d73-c5650f3daa80-config\") pod \"ovsdbserver-sb-2\" (UID: \"28356cd5-1bba-4d6c-8d73-c5650f3daa80\") " pod="openstack/ovsdbserver-sb-2" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.914005 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28356cd5-1bba-4d6c-8d73-c5650f3daa80-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"28356cd5-1bba-4d6c-8d73-c5650f3daa80\") " pod="openstack/ovsdbserver-sb-2" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.914332 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d1aa84d3-2d58-482f-8a8c-fe18543714de-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"d1aa84d3-2d58-482f-8a8c-fe18543714de\") " pod="openstack/ovsdbserver-sb-1" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.914450 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28356cd5-1bba-4d6c-8d73-c5650f3daa80-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"28356cd5-1bba-4d6c-8d73-c5650f3daa80\") " pod="openstack/ovsdbserver-sb-2" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.914974 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1aa84d3-2d58-482f-8a8c-fe18543714de-config\") pod \"ovsdbserver-sb-1\" (UID: \"d1aa84d3-2d58-482f-8a8c-fe18543714de\") " pod="openstack/ovsdbserver-sb-1" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.915174 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d1aa84d3-2d58-482f-8a8c-fe18543714de-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"d1aa84d3-2d58-482f-8a8c-fe18543714de\") " pod="openstack/ovsdbserver-sb-1" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.920482 4750 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.931387 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.932956 4750 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.933018 4750 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4a212b96-3bc1-4a4f-8362-71bf26952fce\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4a212b96-3bc1-4a4f-8362-71bf26952fce\") pod \"ovsdbserver-sb-1\" (UID: \"d1aa84d3-2d58-482f-8a8c-fe18543714de\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ca6a6a2aa1b39580a0e876b925c5e6c09bc01dd17f7f082d81ac8da5c74468f5/globalmount\"" pod="openstack/ovsdbserver-sb-1" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.933569 4750 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.933619 4750 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-791df69b-8fb6-4469-aeb6-4d303ac8b8b6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-791df69b-8fb6-4469-aeb6-4d303ac8b8b6\") pod \"ovsdbserver-sb-2\" (UID: \"28356cd5-1bba-4d6c-8d73-c5650f3daa80\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0eba9a3ca370079cd92c8d516ab657b1314a07389d12cbf990e73dc90ec42dbf/globalmount\"" pod="openstack/ovsdbserver-sb-2" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.933572 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.939798 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1aa84d3-2d58-482f-8a8c-fe18543714de-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"d1aa84d3-2d58-482f-8a8c-fe18543714de\") " pod="openstack/ovsdbserver-sb-1" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.940359 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq7bf\" (UniqueName: \"kubernetes.io/projected/28356cd5-1bba-4d6c-8d73-c5650f3daa80-kube-api-access-tq7bf\") pod \"ovsdbserver-sb-2\" (UID: \"28356cd5-1bba-4d6c-8d73-c5650f3daa80\") " pod="openstack/ovsdbserver-sb-2" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.940978 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d899h\" (UniqueName: \"kubernetes.io/projected/d1aa84d3-2d58-482f-8a8c-fe18543714de-kube-api-access-d899h\") pod \"ovsdbserver-sb-1\" (UID: \"d1aa84d3-2d58-482f-8a8c-fe18543714de\") " pod="openstack/ovsdbserver-sb-1" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.942354 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/28356cd5-1bba-4d6c-8d73-c5650f3daa80-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"28356cd5-1bba-4d6c-8d73-c5650f3daa80\") " pod="openstack/ovsdbserver-sb-2" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.972377 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4a212b96-3bc1-4a4f-8362-71bf26952fce\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4a212b96-3bc1-4a4f-8362-71bf26952fce\") pod \"ovsdbserver-sb-1\" (UID: \"d1aa84d3-2d58-482f-8a8c-fe18543714de\") " pod="openstack/ovsdbserver-sb-1" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.977753 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-791df69b-8fb6-4469-aeb6-4d303ac8b8b6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-791df69b-8fb6-4469-aeb6-4d303ac8b8b6\") pod \"ovsdbserver-sb-2\" (UID: \"28356cd5-1bba-4d6c-8d73-c5650f3daa80\") " pod="openstack/ovsdbserver-sb-2" Oct 08 19:38:36 crc kubenswrapper[4750]: I1008 19:38:36.981253 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-1" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.014932 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/714778b9-d1d0-4767-a601-5bc178cd3199-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"714778b9-d1d0-4767-a601-5bc178cd3199\") " pod="openstack/ovsdbserver-nb-2" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.014990 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4d468db2-b592-4bf9-8cb4-d4fbad07292b-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"4d468db2-b592-4bf9-8cb4-d4fbad07292b\") " pod="openstack/ovsdbserver-nb-1" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.015021 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/95eb24ea-3c42-4690-9623-99af22f79703-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"95eb24ea-3c42-4690-9623-99af22f79703\") " pod="openstack/ovsdbserver-nb-0" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.015066 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d468db2-b592-4bf9-8cb4-d4fbad07292b-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"4d468db2-b592-4bf9-8cb4-d4fbad07292b\") " pod="openstack/ovsdbserver-nb-1" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.015091 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/714778b9-d1d0-4767-a601-5bc178cd3199-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"714778b9-d1d0-4767-a601-5bc178cd3199\") " pod="openstack/ovsdbserver-nb-2" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.015115 4750 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt5fn\" (UniqueName: \"kubernetes.io/projected/4d468db2-b592-4bf9-8cb4-d4fbad07292b-kube-api-access-lt5fn\") pod \"ovsdbserver-nb-1\" (UID: \"4d468db2-b592-4bf9-8cb4-d4fbad07292b\") " pod="openstack/ovsdbserver-nb-1" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.015141 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-324508d8-a2a0-48fb-b75b-d9b4608c0318\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-324508d8-a2a0-48fb-b75b-d9b4608c0318\") pod \"ovsdbserver-nb-2\" (UID: \"714778b9-d1d0-4767-a601-5bc178cd3199\") " pod="openstack/ovsdbserver-nb-2" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.015175 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcqxr\" (UniqueName: \"kubernetes.io/projected/714778b9-d1d0-4767-a601-5bc178cd3199-kube-api-access-tcqxr\") pod \"ovsdbserver-nb-2\" (UID: \"714778b9-d1d0-4767-a601-5bc178cd3199\") " pod="openstack/ovsdbserver-nb-2" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.015199 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d468db2-b592-4bf9-8cb4-d4fbad07292b-config\") pod \"ovsdbserver-nb-1\" (UID: \"4d468db2-b592-4bf9-8cb4-d4fbad07292b\") " pod="openstack/ovsdbserver-nb-1" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.015254 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95eb24ea-3c42-4690-9623-99af22f79703-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"95eb24ea-3c42-4690-9623-99af22f79703\") " pod="openstack/ovsdbserver-nb-0" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.015279 4750 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6b505780-ce66-4ac3-8c6c-fc4741096506\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6b505780-ce66-4ac3-8c6c-fc4741096506\") pod \"ovsdbserver-nb-1\" (UID: \"4d468db2-b592-4bf9-8cb4-d4fbad07292b\") " pod="openstack/ovsdbserver-nb-1" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.015306 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-55e6b89f-fec7-4c72-9a3f-9113e0434b88\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-55e6b89f-fec7-4c72-9a3f-9113e0434b88\") pod \"ovsdbserver-nb-0\" (UID: \"95eb24ea-3c42-4690-9623-99af22f79703\") " pod="openstack/ovsdbserver-nb-0" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.015325 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95eb24ea-3c42-4690-9623-99af22f79703-config\") pod \"ovsdbserver-nb-0\" (UID: \"95eb24ea-3c42-4690-9623-99af22f79703\") " pod="openstack/ovsdbserver-nb-0" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.015348 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/714778b9-d1d0-4767-a601-5bc178cd3199-config\") pod \"ovsdbserver-nb-2\" (UID: \"714778b9-d1d0-4767-a601-5bc178cd3199\") " pod="openstack/ovsdbserver-nb-2" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.015374 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4d468db2-b592-4bf9-8cb4-d4fbad07292b-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"4d468db2-b592-4bf9-8cb4-d4fbad07292b\") " pod="openstack/ovsdbserver-nb-1" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.015397 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwpw6\" 
(UniqueName: \"kubernetes.io/projected/95eb24ea-3c42-4690-9623-99af22f79703-kube-api-access-dwpw6\") pod \"ovsdbserver-nb-0\" (UID: \"95eb24ea-3c42-4690-9623-99af22f79703\") " pod="openstack/ovsdbserver-nb-0" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.015435 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/714778b9-d1d0-4767-a601-5bc178cd3199-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"714778b9-d1d0-4767-a601-5bc178cd3199\") " pod="openstack/ovsdbserver-nb-2" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.015459 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/95eb24ea-3c42-4690-9623-99af22f79703-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"95eb24ea-3c42-4690-9623-99af22f79703\") " pod="openstack/ovsdbserver-nb-0" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.016155 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/95eb24ea-3c42-4690-9623-99af22f79703-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"95eb24ea-3c42-4690-9623-99af22f79703\") " pod="openstack/ovsdbserver-nb-0" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.017151 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/95eb24ea-3c42-4690-9623-99af22f79703-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"95eb24ea-3c42-4690-9623-99af22f79703\") " pod="openstack/ovsdbserver-nb-0" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.018749 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95eb24ea-3c42-4690-9623-99af22f79703-config\") pod \"ovsdbserver-nb-0\" (UID: \"95eb24ea-3c42-4690-9623-99af22f79703\") " pod="openstack/ovsdbserver-nb-0" Oct 
08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.020359 4750 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.020394 4750 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-55e6b89f-fec7-4c72-9a3f-9113e0434b88\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-55e6b89f-fec7-4c72-9a3f-9113e0434b88\") pod \"ovsdbserver-nb-0\" (UID: \"95eb24ea-3c42-4690-9623-99af22f79703\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/823ec4679322dab7a1a52d981b9625508dae8aef2912a328164519f7001e7697/globalmount\"" pod="openstack/ovsdbserver-nb-0" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.046692 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwpw6\" (UniqueName: \"kubernetes.io/projected/95eb24ea-3c42-4690-9623-99af22f79703-kube-api-access-dwpw6\") pod \"ovsdbserver-nb-0\" (UID: \"95eb24ea-3c42-4690-9623-99af22f79703\") " pod="openstack/ovsdbserver-nb-0" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.050511 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95eb24ea-3c42-4690-9623-99af22f79703-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"95eb24ea-3c42-4690-9623-99af22f79703\") " pod="openstack/ovsdbserver-nb-0" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.056613 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-55e6b89f-fec7-4c72-9a3f-9113e0434b88\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-55e6b89f-fec7-4c72-9a3f-9113e0434b88\") pod \"ovsdbserver-nb-0\" (UID: \"95eb24ea-3c42-4690-9623-99af22f79703\") " pod="openstack/ovsdbserver-nb-0" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.117705 4750 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/714778b9-d1d0-4767-a601-5bc178cd3199-config\") pod \"ovsdbserver-nb-2\" (UID: \"714778b9-d1d0-4767-a601-5bc178cd3199\") " pod="openstack/ovsdbserver-nb-2" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.117768 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4d468db2-b592-4bf9-8cb4-d4fbad07292b-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"4d468db2-b592-4bf9-8cb4-d4fbad07292b\") " pod="openstack/ovsdbserver-nb-1" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.117819 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/714778b9-d1d0-4767-a601-5bc178cd3199-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"714778b9-d1d0-4767-a601-5bc178cd3199\") " pod="openstack/ovsdbserver-nb-2" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.117855 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/714778b9-d1d0-4767-a601-5bc178cd3199-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"714778b9-d1d0-4767-a601-5bc178cd3199\") " pod="openstack/ovsdbserver-nb-2" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.117878 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4d468db2-b592-4bf9-8cb4-d4fbad07292b-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"4d468db2-b592-4bf9-8cb4-d4fbad07292b\") " pod="openstack/ovsdbserver-nb-1" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.117923 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d468db2-b592-4bf9-8cb4-d4fbad07292b-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: 
\"4d468db2-b592-4bf9-8cb4-d4fbad07292b\") " pod="openstack/ovsdbserver-nb-1" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.117945 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/714778b9-d1d0-4767-a601-5bc178cd3199-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"714778b9-d1d0-4767-a601-5bc178cd3199\") " pod="openstack/ovsdbserver-nb-2" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.117964 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt5fn\" (UniqueName: \"kubernetes.io/projected/4d468db2-b592-4bf9-8cb4-d4fbad07292b-kube-api-access-lt5fn\") pod \"ovsdbserver-nb-1\" (UID: \"4d468db2-b592-4bf9-8cb4-d4fbad07292b\") " pod="openstack/ovsdbserver-nb-1" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.117988 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-324508d8-a2a0-48fb-b75b-d9b4608c0318\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-324508d8-a2a0-48fb-b75b-d9b4608c0318\") pod \"ovsdbserver-nb-2\" (UID: \"714778b9-d1d0-4767-a601-5bc178cd3199\") " pod="openstack/ovsdbserver-nb-2" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.118021 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcqxr\" (UniqueName: \"kubernetes.io/projected/714778b9-d1d0-4767-a601-5bc178cd3199-kube-api-access-tcqxr\") pod \"ovsdbserver-nb-2\" (UID: \"714778b9-d1d0-4767-a601-5bc178cd3199\") " pod="openstack/ovsdbserver-nb-2" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.118109 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d468db2-b592-4bf9-8cb4-d4fbad07292b-config\") pod \"ovsdbserver-nb-1\" (UID: \"4d468db2-b592-4bf9-8cb4-d4fbad07292b\") " pod="openstack/ovsdbserver-nb-1" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 
19:38:37.118354 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6b505780-ce66-4ac3-8c6c-fc4741096506\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6b505780-ce66-4ac3-8c6c-fc4741096506\") pod \"ovsdbserver-nb-1\" (UID: \"4d468db2-b592-4bf9-8cb4-d4fbad07292b\") " pod="openstack/ovsdbserver-nb-1" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.118981 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/714778b9-d1d0-4767-a601-5bc178cd3199-config\") pod \"ovsdbserver-nb-2\" (UID: \"714778b9-d1d0-4767-a601-5bc178cd3199\") " pod="openstack/ovsdbserver-nb-2" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.119804 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4d468db2-b592-4bf9-8cb4-d4fbad07292b-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"4d468db2-b592-4bf9-8cb4-d4fbad07292b\") " pod="openstack/ovsdbserver-nb-1" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.120253 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/714778b9-d1d0-4767-a601-5bc178cd3199-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"714778b9-d1d0-4767-a601-5bc178cd3199\") " pod="openstack/ovsdbserver-nb-2" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.120943 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4d468db2-b592-4bf9-8cb4-d4fbad07292b-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"4d468db2-b592-4bf9-8cb4-d4fbad07292b\") " pod="openstack/ovsdbserver-nb-1" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.121057 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d468db2-b592-4bf9-8cb4-d4fbad07292b-config\") pod 
\"ovsdbserver-nb-1\" (UID: \"4d468db2-b592-4bf9-8cb4-d4fbad07292b\") " pod="openstack/ovsdbserver-nb-1" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.121383 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/714778b9-d1d0-4767-a601-5bc178cd3199-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"714778b9-d1d0-4767-a601-5bc178cd3199\") " pod="openstack/ovsdbserver-nb-2" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.125480 4750 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.125507 4750 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-324508d8-a2a0-48fb-b75b-d9b4608c0318\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-324508d8-a2a0-48fb-b75b-d9b4608c0318\") pod \"ovsdbserver-nb-2\" (UID: \"714778b9-d1d0-4767-a601-5bc178cd3199\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/92e6567ffa48d6dfb181069aa7dd26fa65367549a935d8c18b53f6fe5e5c78b0/globalmount\"" pod="openstack/ovsdbserver-nb-2" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.125992 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d468db2-b592-4bf9-8cb4-d4fbad07292b-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"4d468db2-b592-4bf9-8cb4-d4fbad07292b\") " pod="openstack/ovsdbserver-nb-1" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.126144 4750 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.126164 4750 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6b505780-ce66-4ac3-8c6c-fc4741096506\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6b505780-ce66-4ac3-8c6c-fc4741096506\") pod \"ovsdbserver-nb-1\" (UID: \"4d468db2-b592-4bf9-8cb4-d4fbad07292b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b4d99befd5a944376886eea357ee6e0306aae64a326913cb30c1bcd748f87918/globalmount\"" pod="openstack/ovsdbserver-nb-1" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.126605 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/714778b9-d1d0-4767-a601-5bc178cd3199-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"714778b9-d1d0-4767-a601-5bc178cd3199\") " pod="openstack/ovsdbserver-nb-2" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.136599 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcqxr\" (UniqueName: \"kubernetes.io/projected/714778b9-d1d0-4767-a601-5bc178cd3199-kube-api-access-tcqxr\") pod \"ovsdbserver-nb-2\" (UID: \"714778b9-d1d0-4767-a601-5bc178cd3199\") " pod="openstack/ovsdbserver-nb-2" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.138814 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt5fn\" (UniqueName: \"kubernetes.io/projected/4d468db2-b592-4bf9-8cb4-d4fbad07292b-kube-api-access-lt5fn\") pod \"ovsdbserver-nb-1\" (UID: \"4d468db2-b592-4bf9-8cb4-d4fbad07292b\") " pod="openstack/ovsdbserver-nb-1" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.161658 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6b505780-ce66-4ac3-8c6c-fc4741096506\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6b505780-ce66-4ac3-8c6c-fc4741096506\") pod \"ovsdbserver-nb-1\" 
(UID: \"4d468db2-b592-4bf9-8cb4-d4fbad07292b\") " pod="openstack/ovsdbserver-nb-1" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.165122 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-324508d8-a2a0-48fb-b75b-d9b4608c0318\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-324508d8-a2a0-48fb-b75b-d9b4608c0318\") pod \"ovsdbserver-nb-2\" (UID: \"714778b9-d1d0-4767-a601-5bc178cd3199\") " pod="openstack/ovsdbserver-nb-2" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.262946 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.331573 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.347158 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.357789 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-2" Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.514660 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.606472 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Oct 08 19:38:37 crc kubenswrapper[4750]: W1008 19:38:37.617110 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1aa84d3_2d58_482f_8a8c_fe18543714de.slice/crio-138c6e144738cc95090706be8a5bd8b8cb7e4fb4dc8d13e9f67c6c91fbf5cbbf WatchSource:0}: Error finding container 138c6e144738cc95090706be8a5bd8b8cb7e4fb4dc8d13e9f67c6c91fbf5cbbf: Status 404 returned error can't find the container with id 138c6e144738cc95090706be8a5bd8b8cb7e4fb4dc8d13e9f67c6c91fbf5cbbf Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.706431 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"d1aa84d3-2d58-482f-8a8c-fe18543714de","Type":"ContainerStarted","Data":"138c6e144738cc95090706be8a5bd8b8cb7e4fb4dc8d13e9f67c6c91fbf5cbbf"} Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.707678 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"595b9fc0-5a3e-4761-beaa-91924ecf4f54","Type":"ContainerStarted","Data":"f46547a4a2db5835e1b9449f021f416c41fe765c5fb311a7153bd54db58a4f00"} Oct 08 19:38:37 crc kubenswrapper[4750]: I1008 19:38:37.929860 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Oct 08 19:38:37 crc kubenswrapper[4750]: W1008 19:38:37.953017 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28356cd5_1bba_4d6c_8d73_c5650f3daa80.slice/crio-6657acba3cc82c377bc482f649b69fb0e1f6463cfecaf0c1a1bafd2376d84418 WatchSource:0}: Error 
finding container 6657acba3cc82c377bc482f649b69fb0e1f6463cfecaf0c1a1bafd2376d84418: Status 404 returned error can't find the container with id 6657acba3cc82c377bc482f649b69fb0e1f6463cfecaf0c1a1bafd2376d84418 Oct 08 19:38:38 crc kubenswrapper[4750]: I1008 19:38:38.057927 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Oct 08 19:38:38 crc kubenswrapper[4750]: W1008 19:38:38.069612 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d468db2_b592_4bf9_8cb4_d4fbad07292b.slice/crio-c6644e13da245c8ff50eb3946a57d13813832b47ef03d948a5bb0068a5dce540 WatchSource:0}: Error finding container c6644e13da245c8ff50eb3946a57d13813832b47ef03d948a5bb0068a5dce540: Status 404 returned error can't find the container with id c6644e13da245c8ff50eb3946a57d13813832b47ef03d948a5bb0068a5dce540 Oct 08 19:38:38 crc kubenswrapper[4750]: I1008 19:38:38.707188 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Oct 08 19:38:38 crc kubenswrapper[4750]: W1008 19:38:38.714653 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod714778b9_d1d0_4767_a601_5bc178cd3199.slice/crio-b1659684d4da91038482eead67546c02ee57cfb51fbe23faf8f852afbb986576 WatchSource:0}: Error finding container b1659684d4da91038482eead67546c02ee57cfb51fbe23faf8f852afbb986576: Status 404 returned error can't find the container with id b1659684d4da91038482eead67546c02ee57cfb51fbe23faf8f852afbb986576 Oct 08 19:38:38 crc kubenswrapper[4750]: I1008 19:38:38.721829 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"4d468db2-b592-4bf9-8cb4-d4fbad07292b","Type":"ContainerStarted","Data":"44b161270480e425b936598c61e88509748ca8e88e2786fa8c82d409ab64d46b"} Oct 08 19:38:38 crc kubenswrapper[4750]: I1008 19:38:38.721942 4750 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"4d468db2-b592-4bf9-8cb4-d4fbad07292b","Type":"ContainerStarted","Data":"e6e14d0fadd8b72bb92619617a3e96ebe40d1b3c88934ee1a5a569a6d39356a8"} Oct 08 19:38:38 crc kubenswrapper[4750]: I1008 19:38:38.721964 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"4d468db2-b592-4bf9-8cb4-d4fbad07292b","Type":"ContainerStarted","Data":"c6644e13da245c8ff50eb3946a57d13813832b47ef03d948a5bb0068a5dce540"} Oct 08 19:38:38 crc kubenswrapper[4750]: I1008 19:38:38.735484 4750 scope.go:117] "RemoveContainer" containerID="42c2a7a6ce3026c9ad8c8eff781fbf4cba179ca631dc8f4fcdd0691647588e7a" Oct 08 19:38:38 crc kubenswrapper[4750]: E1008 19:38:38.736111 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:38:38 crc kubenswrapper[4750]: I1008 19:38:38.770102 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=3.770065492 podStartE2EDuration="3.770065492s" podCreationTimestamp="2025-10-08 19:38:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:38:38.748989877 +0000 UTC m=+5274.661960910" watchObservedRunningTime="2025-10-08 19:38:38.770065492 +0000 UTC m=+5274.683036515" Oct 08 19:38:38 crc kubenswrapper[4750]: I1008 19:38:38.786816 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" 
event={"ID":"d1aa84d3-2d58-482f-8a8c-fe18543714de","Type":"ContainerStarted","Data":"42bb058b9bb1f2208d0cf63d25eec618272906dcffe7401db45d16c5ebc0bf92"} Oct 08 19:38:38 crc kubenswrapper[4750]: I1008 19:38:38.786881 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"d1aa84d3-2d58-482f-8a8c-fe18543714de","Type":"ContainerStarted","Data":"fd1659608ca619595a6224c6ffad50c56a9427e73453933e977cb7e2d20262a7"} Oct 08 19:38:38 crc kubenswrapper[4750]: I1008 19:38:38.786896 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"28356cd5-1bba-4d6c-8d73-c5650f3daa80","Type":"ContainerStarted","Data":"7b54683431c99804d7c86af2a397c9d3e77149f9cc75753f3f4c0c665d1921ce"} Oct 08 19:38:38 crc kubenswrapper[4750]: I1008 19:38:38.786920 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"28356cd5-1bba-4d6c-8d73-c5650f3daa80","Type":"ContainerStarted","Data":"8579ed47b32d4ba00872a6b0cfb1aff1eee37166817c96742226e9e6c4f8e111"} Oct 08 19:38:38 crc kubenswrapper[4750]: I1008 19:38:38.786943 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"28356cd5-1bba-4d6c-8d73-c5650f3daa80","Type":"ContainerStarted","Data":"6657acba3cc82c377bc482f649b69fb0e1f6463cfecaf0c1a1bafd2376d84418"} Oct 08 19:38:38 crc kubenswrapper[4750]: I1008 19:38:38.786957 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"595b9fc0-5a3e-4761-beaa-91924ecf4f54","Type":"ContainerStarted","Data":"d3c8205b58b77ed40a0a879119f1cdad34cde06d281ca059c4318d83934d4eb4"} Oct 08 19:38:38 crc kubenswrapper[4750]: I1008 19:38:38.786971 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"595b9fc0-5a3e-4761-beaa-91924ecf4f54","Type":"ContainerStarted","Data":"e14b0e64cb88b80d5ea9d94eebb840a40cd32bc7f4a73b7a689058fa8b619186"} Oct 08 19:38:38 crc 
kubenswrapper[4750]: I1008 19:38:38.802664 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=3.802635373 podStartE2EDuration="3.802635373s" podCreationTimestamp="2025-10-08 19:38:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:38:38.797779173 +0000 UTC m=+5274.710750256" watchObservedRunningTime="2025-10-08 19:38:38.802635373 +0000 UTC m=+5274.715606406" Oct 08 19:38:38 crc kubenswrapper[4750]: I1008 19:38:38.805527 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=3.805515625 podStartE2EDuration="3.805515625s" podCreationTimestamp="2025-10-08 19:38:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:38:38.779810975 +0000 UTC m=+5274.692782008" watchObservedRunningTime="2025-10-08 19:38:38.805515625 +0000 UTC m=+5274.718486648" Oct 08 19:38:38 crc kubenswrapper[4750]: I1008 19:38:38.831699 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=3.831672867 podStartE2EDuration="3.831672867s" podCreationTimestamp="2025-10-08 19:38:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:38:38.823933284 +0000 UTC m=+5274.736904297" watchObservedRunningTime="2025-10-08 19:38:38.831672867 +0000 UTC m=+5274.744643880" Oct 08 19:38:39 crc kubenswrapper[4750]: I1008 19:38:39.490526 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 08 19:38:39 crc kubenswrapper[4750]: I1008 19:38:39.784680 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"95eb24ea-3c42-4690-9623-99af22f79703","Type":"ContainerStarted","Data":"3ab02693956fbceffd7821cc1bb36f9521cfe57d0fa872135edff25733c32451"} Oct 08 19:38:39 crc kubenswrapper[4750]: I1008 19:38:39.785174 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"95eb24ea-3c42-4690-9623-99af22f79703","Type":"ContainerStarted","Data":"e07481e4290d40bc5f06a34ede0583210bfdf33d2ec7bc18419e43e116555a28"} Oct 08 19:38:39 crc kubenswrapper[4750]: I1008 19:38:39.787514 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"714778b9-d1d0-4767-a601-5bc178cd3199","Type":"ContainerStarted","Data":"87d94d15bfd6241710db15618d2e85fdc989294b23eff43de2c30b7dc795f29c"} Oct 08 19:38:39 crc kubenswrapper[4750]: I1008 19:38:39.787633 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"714778b9-d1d0-4767-a601-5bc178cd3199","Type":"ContainerStarted","Data":"64bf4edfb188e347c15c9c4507093e0c5d8f194d31b524deb17f32349da5df24"} Oct 08 19:38:39 crc kubenswrapper[4750]: I1008 19:38:39.787665 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"714778b9-d1d0-4767-a601-5bc178cd3199","Type":"ContainerStarted","Data":"b1659684d4da91038482eead67546c02ee57cfb51fbe23faf8f852afbb986576"} Oct 08 19:38:39 crc kubenswrapper[4750]: I1008 19:38:39.813649 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=4.813630849 podStartE2EDuration="4.813630849s" podCreationTimestamp="2025-10-08 19:38:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:38:39.806573223 +0000 UTC m=+5275.719544246" watchObservedRunningTime="2025-10-08 19:38:39.813630849 +0000 UTC m=+5275.726601872" Oct 08 19:38:39 crc kubenswrapper[4750]: I1008 19:38:39.931946 4750 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 08 19:38:39 crc kubenswrapper[4750]: I1008 19:38:39.982253 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Oct 08 19:38:40 crc kubenswrapper[4750]: I1008 19:38:40.263680 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Oct 08 19:38:40 crc kubenswrapper[4750]: I1008 19:38:40.348437 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Oct 08 19:38:40 crc kubenswrapper[4750]: I1008 19:38:40.358837 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Oct 08 19:38:40 crc kubenswrapper[4750]: I1008 19:38:40.803825 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"95eb24ea-3c42-4690-9623-99af22f79703","Type":"ContainerStarted","Data":"6fbf7b8c4a4d4775955d99720cc1790fa9e1aeb5192d41fd6d115bfa0adde435"} Oct 08 19:38:40 crc kubenswrapper[4750]: I1008 19:38:40.845113 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=5.845083604 podStartE2EDuration="5.845083604s" podCreationTimestamp="2025-10-08 19:38:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:38:40.840207812 +0000 UTC m=+5276.753178865" watchObservedRunningTime="2025-10-08 19:38:40.845083604 +0000 UTC m=+5276.758054647" Oct 08 19:38:41 crc kubenswrapper[4750]: I1008 19:38:41.931690 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 08 19:38:41 crc kubenswrapper[4750]: I1008 19:38:41.982117 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1" Oct 08 19:38:42 
crc kubenswrapper[4750]: I1008 19:38:42.263673 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Oct 08 19:38:42 crc kubenswrapper[4750]: I1008 19:38:42.332265 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 08 19:38:42 crc kubenswrapper[4750]: I1008 19:38:42.347855 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Oct 08 19:38:42 crc kubenswrapper[4750]: I1008 19:38:42.358826 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2" Oct 08 19:38:43 crc kubenswrapper[4750]: I1008 19:38:43.009079 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 08 19:38:43 crc kubenswrapper[4750]: I1008 19:38:43.042536 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Oct 08 19:38:43 crc kubenswrapper[4750]: I1008 19:38:43.090933 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 08 19:38:43 crc kubenswrapper[4750]: I1008 19:38:43.109394 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" Oct 08 19:38:43 crc kubenswrapper[4750]: I1008 19:38:43.320017 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Oct 08 19:38:43 crc kubenswrapper[4750]: I1008 19:38:43.334878 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 08 19:38:43 crc kubenswrapper[4750]: I1008 19:38:43.375689 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Oct 08 19:38:43 crc kubenswrapper[4750]: I1008 19:38:43.411263 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/ovsdbserver-nb-1" Oct 08 19:38:43 crc kubenswrapper[4750]: I1008 19:38:43.411819 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Oct 08 19:38:43 crc kubenswrapper[4750]: I1008 19:38:43.418939 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 08 19:38:43 crc kubenswrapper[4750]: I1008 19:38:43.434512 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c947bfff-cq9r7"] Oct 08 19:38:43 crc kubenswrapper[4750]: I1008 19:38:43.436497 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c947bfff-cq9r7" Oct 08 19:38:43 crc kubenswrapper[4750]: I1008 19:38:43.446816 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 08 19:38:43 crc kubenswrapper[4750]: I1008 19:38:43.461531 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c947bfff-cq9r7"] Oct 08 19:38:43 crc kubenswrapper[4750]: I1008 19:38:43.473603 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Oct 08 19:38:43 crc kubenswrapper[4750]: I1008 19:38:43.568820 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6a3a16c-640a-47e4-b105-e7b0c12e8db6-config\") pod \"dnsmasq-dns-5c947bfff-cq9r7\" (UID: \"e6a3a16c-640a-47e4-b105-e7b0c12e8db6\") " pod="openstack/dnsmasq-dns-5c947bfff-cq9r7" Oct 08 19:38:43 crc kubenswrapper[4750]: I1008 19:38:43.568876 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6a3a16c-640a-47e4-b105-e7b0c12e8db6-ovsdbserver-sb\") pod \"dnsmasq-dns-5c947bfff-cq9r7\" (UID: \"e6a3a16c-640a-47e4-b105-e7b0c12e8db6\") " 
pod="openstack/dnsmasq-dns-5c947bfff-cq9r7" Oct 08 19:38:43 crc kubenswrapper[4750]: I1008 19:38:43.569082 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6a3a16c-640a-47e4-b105-e7b0c12e8db6-dns-svc\") pod \"dnsmasq-dns-5c947bfff-cq9r7\" (UID: \"e6a3a16c-640a-47e4-b105-e7b0c12e8db6\") " pod="openstack/dnsmasq-dns-5c947bfff-cq9r7" Oct 08 19:38:43 crc kubenswrapper[4750]: I1008 19:38:43.569307 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbfnp\" (UniqueName: \"kubernetes.io/projected/e6a3a16c-640a-47e4-b105-e7b0c12e8db6-kube-api-access-dbfnp\") pod \"dnsmasq-dns-5c947bfff-cq9r7\" (UID: \"e6a3a16c-640a-47e4-b105-e7b0c12e8db6\") " pod="openstack/dnsmasq-dns-5c947bfff-cq9r7" Oct 08 19:38:43 crc kubenswrapper[4750]: I1008 19:38:43.671298 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6a3a16c-640a-47e4-b105-e7b0c12e8db6-ovsdbserver-sb\") pod \"dnsmasq-dns-5c947bfff-cq9r7\" (UID: \"e6a3a16c-640a-47e4-b105-e7b0c12e8db6\") " pod="openstack/dnsmasq-dns-5c947bfff-cq9r7" Oct 08 19:38:43 crc kubenswrapper[4750]: I1008 19:38:43.671379 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6a3a16c-640a-47e4-b105-e7b0c12e8db6-dns-svc\") pod \"dnsmasq-dns-5c947bfff-cq9r7\" (UID: \"e6a3a16c-640a-47e4-b105-e7b0c12e8db6\") " pod="openstack/dnsmasq-dns-5c947bfff-cq9r7" Oct 08 19:38:43 crc kubenswrapper[4750]: I1008 19:38:43.671446 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbfnp\" (UniqueName: \"kubernetes.io/projected/e6a3a16c-640a-47e4-b105-e7b0c12e8db6-kube-api-access-dbfnp\") pod \"dnsmasq-dns-5c947bfff-cq9r7\" (UID: \"e6a3a16c-640a-47e4-b105-e7b0c12e8db6\") " 
pod="openstack/dnsmasq-dns-5c947bfff-cq9r7" Oct 08 19:38:43 crc kubenswrapper[4750]: I1008 19:38:43.671521 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6a3a16c-640a-47e4-b105-e7b0c12e8db6-config\") pod \"dnsmasq-dns-5c947bfff-cq9r7\" (UID: \"e6a3a16c-640a-47e4-b105-e7b0c12e8db6\") " pod="openstack/dnsmasq-dns-5c947bfff-cq9r7" Oct 08 19:38:43 crc kubenswrapper[4750]: I1008 19:38:43.672737 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6a3a16c-640a-47e4-b105-e7b0c12e8db6-config\") pod \"dnsmasq-dns-5c947bfff-cq9r7\" (UID: \"e6a3a16c-640a-47e4-b105-e7b0c12e8db6\") " pod="openstack/dnsmasq-dns-5c947bfff-cq9r7" Oct 08 19:38:43 crc kubenswrapper[4750]: I1008 19:38:43.673220 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6a3a16c-640a-47e4-b105-e7b0c12e8db6-ovsdbserver-sb\") pod \"dnsmasq-dns-5c947bfff-cq9r7\" (UID: \"e6a3a16c-640a-47e4-b105-e7b0c12e8db6\") " pod="openstack/dnsmasq-dns-5c947bfff-cq9r7" Oct 08 19:38:43 crc kubenswrapper[4750]: I1008 19:38:43.673299 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6a3a16c-640a-47e4-b105-e7b0c12e8db6-dns-svc\") pod \"dnsmasq-dns-5c947bfff-cq9r7\" (UID: \"e6a3a16c-640a-47e4-b105-e7b0c12e8db6\") " pod="openstack/dnsmasq-dns-5c947bfff-cq9r7" Oct 08 19:38:43 crc kubenswrapper[4750]: I1008 19:38:43.698511 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbfnp\" (UniqueName: \"kubernetes.io/projected/e6a3a16c-640a-47e4-b105-e7b0c12e8db6-kube-api-access-dbfnp\") pod \"dnsmasq-dns-5c947bfff-cq9r7\" (UID: \"e6a3a16c-640a-47e4-b105-e7b0c12e8db6\") " pod="openstack/dnsmasq-dns-5c947bfff-cq9r7" Oct 08 19:38:43 crc kubenswrapper[4750]: I1008 19:38:43.759334 4750 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c947bfff-cq9r7" Oct 08 19:38:43 crc kubenswrapper[4750]: I1008 19:38:43.899853 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2" Oct 08 19:38:43 crc kubenswrapper[4750]: I1008 19:38:43.947568 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c947bfff-cq9r7"] Oct 08 19:38:43 crc kubenswrapper[4750]: I1008 19:38:43.981331 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ffd6fd84f-p46d7"] Oct 08 19:38:43 crc kubenswrapper[4750]: I1008 19:38:43.984020 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ffd6fd84f-p46d7" Oct 08 19:38:43 crc kubenswrapper[4750]: I1008 19:38:43.992232 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 08 19:38:44 crc kubenswrapper[4750]: I1008 19:38:44.003696 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ffd6fd84f-p46d7"] Oct 08 19:38:44 crc kubenswrapper[4750]: I1008 19:38:44.087337 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a933fd08-d47b-43e6-9bc3-2ad4a439d8d5-ovsdbserver-sb\") pod \"dnsmasq-dns-5ffd6fd84f-p46d7\" (UID: \"a933fd08-d47b-43e6-9bc3-2ad4a439d8d5\") " pod="openstack/dnsmasq-dns-5ffd6fd84f-p46d7" Oct 08 19:38:44 crc kubenswrapper[4750]: I1008 19:38:44.087440 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a933fd08-d47b-43e6-9bc3-2ad4a439d8d5-config\") pod \"dnsmasq-dns-5ffd6fd84f-p46d7\" (UID: \"a933fd08-d47b-43e6-9bc3-2ad4a439d8d5\") " pod="openstack/dnsmasq-dns-5ffd6fd84f-p46d7" Oct 08 19:38:44 crc kubenswrapper[4750]: I1008 19:38:44.087510 4750 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzxxk\" (UniqueName: \"kubernetes.io/projected/a933fd08-d47b-43e6-9bc3-2ad4a439d8d5-kube-api-access-fzxxk\") pod \"dnsmasq-dns-5ffd6fd84f-p46d7\" (UID: \"a933fd08-d47b-43e6-9bc3-2ad4a439d8d5\") " pod="openstack/dnsmasq-dns-5ffd6fd84f-p46d7" Oct 08 19:38:44 crc kubenswrapper[4750]: I1008 19:38:44.087532 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a933fd08-d47b-43e6-9bc3-2ad4a439d8d5-ovsdbserver-nb\") pod \"dnsmasq-dns-5ffd6fd84f-p46d7\" (UID: \"a933fd08-d47b-43e6-9bc3-2ad4a439d8d5\") " pod="openstack/dnsmasq-dns-5ffd6fd84f-p46d7" Oct 08 19:38:44 crc kubenswrapper[4750]: I1008 19:38:44.087574 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a933fd08-d47b-43e6-9bc3-2ad4a439d8d5-dns-svc\") pod \"dnsmasq-dns-5ffd6fd84f-p46d7\" (UID: \"a933fd08-d47b-43e6-9bc3-2ad4a439d8d5\") " pod="openstack/dnsmasq-dns-5ffd6fd84f-p46d7" Oct 08 19:38:44 crc kubenswrapper[4750]: I1008 19:38:44.191076 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a933fd08-d47b-43e6-9bc3-2ad4a439d8d5-config\") pod \"dnsmasq-dns-5ffd6fd84f-p46d7\" (UID: \"a933fd08-d47b-43e6-9bc3-2ad4a439d8d5\") " pod="openstack/dnsmasq-dns-5ffd6fd84f-p46d7" Oct 08 19:38:44 crc kubenswrapper[4750]: I1008 19:38:44.191212 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzxxk\" (UniqueName: \"kubernetes.io/projected/a933fd08-d47b-43e6-9bc3-2ad4a439d8d5-kube-api-access-fzxxk\") pod \"dnsmasq-dns-5ffd6fd84f-p46d7\" (UID: \"a933fd08-d47b-43e6-9bc3-2ad4a439d8d5\") " pod="openstack/dnsmasq-dns-5ffd6fd84f-p46d7" Oct 08 19:38:44 crc kubenswrapper[4750]: I1008 19:38:44.191248 4750 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a933fd08-d47b-43e6-9bc3-2ad4a439d8d5-ovsdbserver-nb\") pod \"dnsmasq-dns-5ffd6fd84f-p46d7\" (UID: \"a933fd08-d47b-43e6-9bc3-2ad4a439d8d5\") " pod="openstack/dnsmasq-dns-5ffd6fd84f-p46d7" Oct 08 19:38:44 crc kubenswrapper[4750]: I1008 19:38:44.191713 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a933fd08-d47b-43e6-9bc3-2ad4a439d8d5-dns-svc\") pod \"dnsmasq-dns-5ffd6fd84f-p46d7\" (UID: \"a933fd08-d47b-43e6-9bc3-2ad4a439d8d5\") " pod="openstack/dnsmasq-dns-5ffd6fd84f-p46d7" Oct 08 19:38:44 crc kubenswrapper[4750]: I1008 19:38:44.191770 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a933fd08-d47b-43e6-9bc3-2ad4a439d8d5-ovsdbserver-sb\") pod \"dnsmasq-dns-5ffd6fd84f-p46d7\" (UID: \"a933fd08-d47b-43e6-9bc3-2ad4a439d8d5\") " pod="openstack/dnsmasq-dns-5ffd6fd84f-p46d7" Oct 08 19:38:44 crc kubenswrapper[4750]: I1008 19:38:44.192363 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a933fd08-d47b-43e6-9bc3-2ad4a439d8d5-ovsdbserver-nb\") pod \"dnsmasq-dns-5ffd6fd84f-p46d7\" (UID: \"a933fd08-d47b-43e6-9bc3-2ad4a439d8d5\") " pod="openstack/dnsmasq-dns-5ffd6fd84f-p46d7" Oct 08 19:38:44 crc kubenswrapper[4750]: I1008 19:38:44.192363 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a933fd08-d47b-43e6-9bc3-2ad4a439d8d5-config\") pod \"dnsmasq-dns-5ffd6fd84f-p46d7\" (UID: \"a933fd08-d47b-43e6-9bc3-2ad4a439d8d5\") " pod="openstack/dnsmasq-dns-5ffd6fd84f-p46d7" Oct 08 19:38:44 crc kubenswrapper[4750]: I1008 19:38:44.193244 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/a933fd08-d47b-43e6-9bc3-2ad4a439d8d5-dns-svc\") pod \"dnsmasq-dns-5ffd6fd84f-p46d7\" (UID: \"a933fd08-d47b-43e6-9bc3-2ad4a439d8d5\") " pod="openstack/dnsmasq-dns-5ffd6fd84f-p46d7" Oct 08 19:38:44 crc kubenswrapper[4750]: I1008 19:38:44.193291 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a933fd08-d47b-43e6-9bc3-2ad4a439d8d5-ovsdbserver-sb\") pod \"dnsmasq-dns-5ffd6fd84f-p46d7\" (UID: \"a933fd08-d47b-43e6-9bc3-2ad4a439d8d5\") " pod="openstack/dnsmasq-dns-5ffd6fd84f-p46d7" Oct 08 19:38:44 crc kubenswrapper[4750]: I1008 19:38:44.220208 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzxxk\" (UniqueName: \"kubernetes.io/projected/a933fd08-d47b-43e6-9bc3-2ad4a439d8d5-kube-api-access-fzxxk\") pod \"dnsmasq-dns-5ffd6fd84f-p46d7\" (UID: \"a933fd08-d47b-43e6-9bc3-2ad4a439d8d5\") " pod="openstack/dnsmasq-dns-5ffd6fd84f-p46d7" Oct 08 19:38:44 crc kubenswrapper[4750]: I1008 19:38:44.318270 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ffd6fd84f-p46d7" Oct 08 19:38:44 crc kubenswrapper[4750]: I1008 19:38:44.362412 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c947bfff-cq9r7"] Oct 08 19:38:44 crc kubenswrapper[4750]: W1008 19:38:44.367976 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6a3a16c_640a_47e4_b105_e7b0c12e8db6.slice/crio-2ab1584748885cd72c57c311c82e88879a19530c319ec0c85b0d484c983db58a WatchSource:0}: Error finding container 2ab1584748885cd72c57c311c82e88879a19530c319ec0c85b0d484c983db58a: Status 404 returned error can't find the container with id 2ab1584748885cd72c57c311c82e88879a19530c319ec0c85b0d484c983db58a Oct 08 19:38:44 crc kubenswrapper[4750]: I1008 19:38:44.819979 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ffd6fd84f-p46d7"] Oct 08 19:38:44 crc kubenswrapper[4750]: I1008 19:38:44.852079 4750 generic.go:334] "Generic (PLEG): container finished" podID="e6a3a16c-640a-47e4-b105-e7b0c12e8db6" containerID="edee0335560b768de708d51bfbbc178f8f511717b6e171865384eed77d45e06c" exitCode=0 Oct 08 19:38:44 crc kubenswrapper[4750]: I1008 19:38:44.852172 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c947bfff-cq9r7" event={"ID":"e6a3a16c-640a-47e4-b105-e7b0c12e8db6","Type":"ContainerDied","Data":"edee0335560b768de708d51bfbbc178f8f511717b6e171865384eed77d45e06c"} Oct 08 19:38:44 crc kubenswrapper[4750]: I1008 19:38:44.852239 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c947bfff-cq9r7" event={"ID":"e6a3a16c-640a-47e4-b105-e7b0c12e8db6","Type":"ContainerStarted","Data":"2ab1584748885cd72c57c311c82e88879a19530c319ec0c85b0d484c983db58a"} Oct 08 19:38:44 crc kubenswrapper[4750]: I1008 19:38:44.859882 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ffd6fd84f-p46d7" 
event={"ID":"a933fd08-d47b-43e6-9bc3-2ad4a439d8d5","Type":"ContainerStarted","Data":"bd8c10d7a9e4ee40da3e49f3f2f4f7a90068d96257e6285c67152fec1e3d7217"} Oct 08 19:38:44 crc kubenswrapper[4750]: I1008 19:38:44.924870 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 08 19:38:45 crc kubenswrapper[4750]: I1008 19:38:45.113764 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c947bfff-cq9r7" Oct 08 19:38:45 crc kubenswrapper[4750]: I1008 19:38:45.212455 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6a3a16c-640a-47e4-b105-e7b0c12e8db6-dns-svc\") pod \"e6a3a16c-640a-47e4-b105-e7b0c12e8db6\" (UID: \"e6a3a16c-640a-47e4-b105-e7b0c12e8db6\") " Oct 08 19:38:45 crc kubenswrapper[4750]: I1008 19:38:45.213626 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbfnp\" (UniqueName: \"kubernetes.io/projected/e6a3a16c-640a-47e4-b105-e7b0c12e8db6-kube-api-access-dbfnp\") pod \"e6a3a16c-640a-47e4-b105-e7b0c12e8db6\" (UID: \"e6a3a16c-640a-47e4-b105-e7b0c12e8db6\") " Oct 08 19:38:45 crc kubenswrapper[4750]: I1008 19:38:45.214075 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6a3a16c-640a-47e4-b105-e7b0c12e8db6-ovsdbserver-sb\") pod \"e6a3a16c-640a-47e4-b105-e7b0c12e8db6\" (UID: \"e6a3a16c-640a-47e4-b105-e7b0c12e8db6\") " Oct 08 19:38:45 crc kubenswrapper[4750]: I1008 19:38:45.214127 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6a3a16c-640a-47e4-b105-e7b0c12e8db6-config\") pod \"e6a3a16c-640a-47e4-b105-e7b0c12e8db6\" (UID: \"e6a3a16c-640a-47e4-b105-e7b0c12e8db6\") " Oct 08 19:38:45 crc kubenswrapper[4750]: I1008 19:38:45.219363 4750 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6a3a16c-640a-47e4-b105-e7b0c12e8db6-kube-api-access-dbfnp" (OuterVolumeSpecName: "kube-api-access-dbfnp") pod "e6a3a16c-640a-47e4-b105-e7b0c12e8db6" (UID: "e6a3a16c-640a-47e4-b105-e7b0c12e8db6"). InnerVolumeSpecName "kube-api-access-dbfnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:38:45 crc kubenswrapper[4750]: I1008 19:38:45.240040 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6a3a16c-640a-47e4-b105-e7b0c12e8db6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e6a3a16c-640a-47e4-b105-e7b0c12e8db6" (UID: "e6a3a16c-640a-47e4-b105-e7b0c12e8db6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:38:45 crc kubenswrapper[4750]: I1008 19:38:45.240369 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6a3a16c-640a-47e4-b105-e7b0c12e8db6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e6a3a16c-640a-47e4-b105-e7b0c12e8db6" (UID: "e6a3a16c-640a-47e4-b105-e7b0c12e8db6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:38:45 crc kubenswrapper[4750]: I1008 19:38:45.241304 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6a3a16c-640a-47e4-b105-e7b0c12e8db6-config" (OuterVolumeSpecName: "config") pod "e6a3a16c-640a-47e4-b105-e7b0c12e8db6" (UID: "e6a3a16c-640a-47e4-b105-e7b0c12e8db6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:38:45 crc kubenswrapper[4750]: I1008 19:38:45.316761 4750 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6a3a16c-640a-47e4-b105-e7b0c12e8db6-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 19:38:45 crc kubenswrapper[4750]: I1008 19:38:45.317008 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbfnp\" (UniqueName: \"kubernetes.io/projected/e6a3a16c-640a-47e4-b105-e7b0c12e8db6-kube-api-access-dbfnp\") on node \"crc\" DevicePath \"\"" Oct 08 19:38:45 crc kubenswrapper[4750]: I1008 19:38:45.317067 4750 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6a3a16c-640a-47e4-b105-e7b0c12e8db6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 19:38:45 crc kubenswrapper[4750]: I1008 19:38:45.317120 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6a3a16c-640a-47e4-b105-e7b0c12e8db6-config\") on node \"crc\" DevicePath \"\"" Oct 08 19:38:45 crc kubenswrapper[4750]: I1008 19:38:45.871126 4750 generic.go:334] "Generic (PLEG): container finished" podID="a933fd08-d47b-43e6-9bc3-2ad4a439d8d5" containerID="5776ffa4c79ad777ceb9794893e3563e5fa4ed76a5a88fd791c490699dd82f89" exitCode=0 Oct 08 19:38:45 crc kubenswrapper[4750]: I1008 19:38:45.871249 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ffd6fd84f-p46d7" event={"ID":"a933fd08-d47b-43e6-9bc3-2ad4a439d8d5","Type":"ContainerDied","Data":"5776ffa4c79ad777ceb9794893e3563e5fa4ed76a5a88fd791c490699dd82f89"} Oct 08 19:38:45 crc kubenswrapper[4750]: I1008 19:38:45.872883 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c947bfff-cq9r7" event={"ID":"e6a3a16c-640a-47e4-b105-e7b0c12e8db6","Type":"ContainerDied","Data":"2ab1584748885cd72c57c311c82e88879a19530c319ec0c85b0d484c983db58a"} Oct 08 
19:38:45 crc kubenswrapper[4750]: I1008 19:38:45.872930 4750 scope.go:117] "RemoveContainer" containerID="edee0335560b768de708d51bfbbc178f8f511717b6e171865384eed77d45e06c" Oct 08 19:38:45 crc kubenswrapper[4750]: I1008 19:38:45.873073 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c947bfff-cq9r7" Oct 08 19:38:46 crc kubenswrapper[4750]: I1008 19:38:46.119761 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c947bfff-cq9r7"] Oct 08 19:38:46 crc kubenswrapper[4750]: I1008 19:38:46.126054 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c947bfff-cq9r7"] Oct 08 19:38:46 crc kubenswrapper[4750]: I1008 19:38:46.749809 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6a3a16c-640a-47e4-b105-e7b0c12e8db6" path="/var/lib/kubelet/pods/e6a3a16c-640a-47e4-b105-e7b0c12e8db6/volumes" Oct 08 19:38:46 crc kubenswrapper[4750]: I1008 19:38:46.887083 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ffd6fd84f-p46d7" event={"ID":"a933fd08-d47b-43e6-9bc3-2ad4a439d8d5","Type":"ContainerStarted","Data":"b0d83c5b5697d86f822999beb851e2fd21d142caac9b119b2127c87dccbab396"} Oct 08 19:38:46 crc kubenswrapper[4750]: I1008 19:38:46.887496 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ffd6fd84f-p46d7" Oct 08 19:38:46 crc kubenswrapper[4750]: I1008 19:38:46.913255 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ffd6fd84f-p46d7" podStartSLOduration=3.91322294 podStartE2EDuration="3.91322294s" podCreationTimestamp="2025-10-08 19:38:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:38:46.912328167 +0000 UTC m=+5282.825299190" watchObservedRunningTime="2025-10-08 19:38:46.91322294 +0000 UTC m=+5282.826193953" Oct 
08 19:38:47 crc kubenswrapper[4750]: I1008 19:38:47.976767 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Oct 08 19:38:47 crc kubenswrapper[4750]: E1008 19:38:47.979538 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6a3a16c-640a-47e4-b105-e7b0c12e8db6" containerName="init" Oct 08 19:38:47 crc kubenswrapper[4750]: I1008 19:38:47.979728 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6a3a16c-640a-47e4-b105-e7b0c12e8db6" containerName="init" Oct 08 19:38:47 crc kubenswrapper[4750]: I1008 19:38:47.981377 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6a3a16c-640a-47e4-b105-e7b0c12e8db6" containerName="init" Oct 08 19:38:47 crc kubenswrapper[4750]: I1008 19:38:47.983062 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Oct 08 19:38:47 crc kubenswrapper[4750]: I1008 19:38:47.992801 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Oct 08 19:38:47 crc kubenswrapper[4750]: I1008 19:38:47.993622 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Oct 08 19:38:48 crc kubenswrapper[4750]: I1008 19:38:48.065148 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-22bb5796-6bca-4ca9-abf9-67ba09443c09\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-22bb5796-6bca-4ca9-abf9-67ba09443c09\") pod \"ovn-copy-data\" (UID: \"eb40b769-eb69-4127-92aa-8520cf6c0883\") " pod="openstack/ovn-copy-data" Oct 08 19:38:48 crc kubenswrapper[4750]: I1008 19:38:48.065931 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzwd9\" (UniqueName: \"kubernetes.io/projected/eb40b769-eb69-4127-92aa-8520cf6c0883-kube-api-access-kzwd9\") pod \"ovn-copy-data\" (UID: \"eb40b769-eb69-4127-92aa-8520cf6c0883\") " pod="openstack/ovn-copy-data" 
Oct 08 19:38:48 crc kubenswrapper[4750]: I1008 19:38:48.066260 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/eb40b769-eb69-4127-92aa-8520cf6c0883-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"eb40b769-eb69-4127-92aa-8520cf6c0883\") " pod="openstack/ovn-copy-data" Oct 08 19:38:48 crc kubenswrapper[4750]: I1008 19:38:48.168875 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-22bb5796-6bca-4ca9-abf9-67ba09443c09\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-22bb5796-6bca-4ca9-abf9-67ba09443c09\") pod \"ovn-copy-data\" (UID: \"eb40b769-eb69-4127-92aa-8520cf6c0883\") " pod="openstack/ovn-copy-data" Oct 08 19:38:48 crc kubenswrapper[4750]: I1008 19:38:48.168936 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzwd9\" (UniqueName: \"kubernetes.io/projected/eb40b769-eb69-4127-92aa-8520cf6c0883-kube-api-access-kzwd9\") pod \"ovn-copy-data\" (UID: \"eb40b769-eb69-4127-92aa-8520cf6c0883\") " pod="openstack/ovn-copy-data" Oct 08 19:38:48 crc kubenswrapper[4750]: I1008 19:38:48.169001 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/eb40b769-eb69-4127-92aa-8520cf6c0883-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"eb40b769-eb69-4127-92aa-8520cf6c0883\") " pod="openstack/ovn-copy-data" Oct 08 19:38:48 crc kubenswrapper[4750]: I1008 19:38:48.173022 4750 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 08 19:38:48 crc kubenswrapper[4750]: I1008 19:38:48.173119 4750 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-22bb5796-6bca-4ca9-abf9-67ba09443c09\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-22bb5796-6bca-4ca9-abf9-67ba09443c09\") pod \"ovn-copy-data\" (UID: \"eb40b769-eb69-4127-92aa-8520cf6c0883\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d1182ef1b4a20f66c6634a30de6af285f640ef85b12ff195ce4396b13505869a/globalmount\"" pod="openstack/ovn-copy-data" Oct 08 19:38:48 crc kubenswrapper[4750]: I1008 19:38:48.190074 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/eb40b769-eb69-4127-92aa-8520cf6c0883-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"eb40b769-eb69-4127-92aa-8520cf6c0883\") " pod="openstack/ovn-copy-data" Oct 08 19:38:48 crc kubenswrapper[4750]: I1008 19:38:48.198678 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzwd9\" (UniqueName: \"kubernetes.io/projected/eb40b769-eb69-4127-92aa-8520cf6c0883-kube-api-access-kzwd9\") pod \"ovn-copy-data\" (UID: \"eb40b769-eb69-4127-92aa-8520cf6c0883\") " pod="openstack/ovn-copy-data" Oct 08 19:38:48 crc kubenswrapper[4750]: I1008 19:38:48.209791 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-22bb5796-6bca-4ca9-abf9-67ba09443c09\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-22bb5796-6bca-4ca9-abf9-67ba09443c09\") pod \"ovn-copy-data\" (UID: \"eb40b769-eb69-4127-92aa-8520cf6c0883\") " pod="openstack/ovn-copy-data" Oct 08 19:38:48 crc kubenswrapper[4750]: I1008 19:38:48.313088 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Oct 08 19:38:48 crc kubenswrapper[4750]: I1008 19:38:48.900110 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Oct 08 19:38:48 crc kubenswrapper[4750]: I1008 19:38:48.919807 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"eb40b769-eb69-4127-92aa-8520cf6c0883","Type":"ContainerStarted","Data":"9a0cbc6dbfa58c1352c5f4cf69d1ad58ec5f6a393d99ab3750db9babf3ec706e"} Oct 08 19:38:49 crc kubenswrapper[4750]: I1008 19:38:49.734930 4750 scope.go:117] "RemoveContainer" containerID="42c2a7a6ce3026c9ad8c8eff781fbf4cba179ca631dc8f4fcdd0691647588e7a" Oct 08 19:38:49 crc kubenswrapper[4750]: E1008 19:38:49.735636 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:38:49 crc kubenswrapper[4750]: I1008 19:38:49.934651 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"eb40b769-eb69-4127-92aa-8520cf6c0883","Type":"ContainerStarted","Data":"bfbbfeab83342987b24aa85ccc6e69f7ce2829e17346b44b3a8e3773de29d2c4"} Oct 08 19:38:49 crc kubenswrapper[4750]: I1008 19:38:49.981331 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=3.460161613 podStartE2EDuration="3.98129541s" podCreationTimestamp="2025-10-08 19:38:46 +0000 UTC" firstStartedPulling="2025-10-08 19:38:48.909006137 +0000 UTC m=+5284.821977140" lastFinishedPulling="2025-10-08 19:38:49.430139924 +0000 UTC m=+5285.343110937" observedRunningTime="2025-10-08 19:38:49.975105326 +0000 UTC 
m=+5285.888076419" watchObservedRunningTime="2025-10-08 19:38:49.98129541 +0000 UTC m=+5285.894266463" Oct 08 19:38:52 crc kubenswrapper[4750]: E1008 19:38:52.349247 4750 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.75:48580->38.102.83.75:34167: read tcp 38.102.83.75:48580->38.102.83.75:34167: read: connection reset by peer Oct 08 19:38:54 crc kubenswrapper[4750]: I1008 19:38:54.319840 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5ffd6fd84f-p46d7" Oct 08 19:38:54 crc kubenswrapper[4750]: I1008 19:38:54.384187 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fdc957c47-rj9w9"] Oct 08 19:38:54 crc kubenswrapper[4750]: I1008 19:38:54.384528 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5fdc957c47-rj9w9" podUID="eb22965d-c55e-4c81-adf0-7a6f84e5494e" containerName="dnsmasq-dns" containerID="cri-o://7d2cc2c5760f5f0424a341b0ee36a1c4e054084f523d99aa37a746a4fb99b61d" gracePeriod=10 Oct 08 19:38:54 crc kubenswrapper[4750]: I1008 19:38:54.918154 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fdc957c47-rj9w9" Oct 08 19:38:54 crc kubenswrapper[4750]: I1008 19:38:54.994870 4750 generic.go:334] "Generic (PLEG): container finished" podID="eb22965d-c55e-4c81-adf0-7a6f84e5494e" containerID="7d2cc2c5760f5f0424a341b0ee36a1c4e054084f523d99aa37a746a4fb99b61d" exitCode=0 Oct 08 19:38:54 crc kubenswrapper[4750]: I1008 19:38:54.994926 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fdc957c47-rj9w9" event={"ID":"eb22965d-c55e-4c81-adf0-7a6f84e5494e","Type":"ContainerDied","Data":"7d2cc2c5760f5f0424a341b0ee36a1c4e054084f523d99aa37a746a4fb99b61d"} Oct 08 19:38:54 crc kubenswrapper[4750]: I1008 19:38:54.994977 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fdc957c47-rj9w9" event={"ID":"eb22965d-c55e-4c81-adf0-7a6f84e5494e","Type":"ContainerDied","Data":"e3710c239b3769ef6e9953edb8044fd80c2d0844e229981ef971518ee9b29103"} Oct 08 19:38:54 crc kubenswrapper[4750]: I1008 19:38:54.995011 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fdc957c47-rj9w9" Oct 08 19:38:54 crc kubenswrapper[4750]: I1008 19:38:54.995116 4750 scope.go:117] "RemoveContainer" containerID="7d2cc2c5760f5f0424a341b0ee36a1c4e054084f523d99aa37a746a4fb99b61d" Oct 08 19:38:55 crc kubenswrapper[4750]: I1008 19:38:55.025383 4750 scope.go:117] "RemoveContainer" containerID="b15e6bcc3bd8fd63308224daadb590c4ac9795da0122f416dce0aae8f576a82f" Oct 08 19:38:55 crc kubenswrapper[4750]: I1008 19:38:55.071736 4750 scope.go:117] "RemoveContainer" containerID="7d2cc2c5760f5f0424a341b0ee36a1c4e054084f523d99aa37a746a4fb99b61d" Oct 08 19:38:55 crc kubenswrapper[4750]: E1008 19:38:55.072488 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d2cc2c5760f5f0424a341b0ee36a1c4e054084f523d99aa37a746a4fb99b61d\": container with ID starting with 7d2cc2c5760f5f0424a341b0ee36a1c4e054084f523d99aa37a746a4fb99b61d not found: ID does not exist" containerID="7d2cc2c5760f5f0424a341b0ee36a1c4e054084f523d99aa37a746a4fb99b61d" Oct 08 19:38:55 crc kubenswrapper[4750]: I1008 19:38:55.072569 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d2cc2c5760f5f0424a341b0ee36a1c4e054084f523d99aa37a746a4fb99b61d"} err="failed to get container status \"7d2cc2c5760f5f0424a341b0ee36a1c4e054084f523d99aa37a746a4fb99b61d\": rpc error: code = NotFound desc = could not find container \"7d2cc2c5760f5f0424a341b0ee36a1c4e054084f523d99aa37a746a4fb99b61d\": container with ID starting with 7d2cc2c5760f5f0424a341b0ee36a1c4e054084f523d99aa37a746a4fb99b61d not found: ID does not exist" Oct 08 19:38:55 crc kubenswrapper[4750]: I1008 19:38:55.072605 4750 scope.go:117] "RemoveContainer" containerID="b15e6bcc3bd8fd63308224daadb590c4ac9795da0122f416dce0aae8f576a82f" Oct 08 19:38:55 crc kubenswrapper[4750]: E1008 19:38:55.075128 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"b15e6bcc3bd8fd63308224daadb590c4ac9795da0122f416dce0aae8f576a82f\": container with ID starting with b15e6bcc3bd8fd63308224daadb590c4ac9795da0122f416dce0aae8f576a82f not found: ID does not exist" containerID="b15e6bcc3bd8fd63308224daadb590c4ac9795da0122f416dce0aae8f576a82f" Oct 08 19:38:55 crc kubenswrapper[4750]: I1008 19:38:55.075157 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b15e6bcc3bd8fd63308224daadb590c4ac9795da0122f416dce0aae8f576a82f"} err="failed to get container status \"b15e6bcc3bd8fd63308224daadb590c4ac9795da0122f416dce0aae8f576a82f\": rpc error: code = NotFound desc = could not find container \"b15e6bcc3bd8fd63308224daadb590c4ac9795da0122f416dce0aae8f576a82f\": container with ID starting with b15e6bcc3bd8fd63308224daadb590c4ac9795da0122f416dce0aae8f576a82f not found: ID does not exist" Oct 08 19:38:55 crc kubenswrapper[4750]: I1008 19:38:55.086020 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 08 19:38:55 crc kubenswrapper[4750]: E1008 19:38:55.086464 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb22965d-c55e-4c81-adf0-7a6f84e5494e" containerName="dnsmasq-dns" Oct 08 19:38:55 crc kubenswrapper[4750]: I1008 19:38:55.086483 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb22965d-c55e-4c81-adf0-7a6f84e5494e" containerName="dnsmasq-dns" Oct 08 19:38:55 crc kubenswrapper[4750]: E1008 19:38:55.086511 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb22965d-c55e-4c81-adf0-7a6f84e5494e" containerName="init" Oct 08 19:38:55 crc kubenswrapper[4750]: I1008 19:38:55.086518 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb22965d-c55e-4c81-adf0-7a6f84e5494e" containerName="init" Oct 08 19:38:55 crc kubenswrapper[4750]: I1008 19:38:55.086705 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb22965d-c55e-4c81-adf0-7a6f84e5494e" 
containerName="dnsmasq-dns" Oct 08 19:38:55 crc kubenswrapper[4750]: I1008 19:38:55.087767 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 08 19:38:55 crc kubenswrapper[4750]: I1008 19:38:55.092152 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-r6qkx" Oct 08 19:38:55 crc kubenswrapper[4750]: I1008 19:38:55.092330 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 08 19:38:55 crc kubenswrapper[4750]: I1008 19:38:55.099477 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 08 19:38:55 crc kubenswrapper[4750]: I1008 19:38:55.104165 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 08 19:38:55 crc kubenswrapper[4750]: I1008 19:38:55.121952 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb22965d-c55e-4c81-adf0-7a6f84e5494e-dns-svc\") pod \"eb22965d-c55e-4c81-adf0-7a6f84e5494e\" (UID: \"eb22965d-c55e-4c81-adf0-7a6f84e5494e\") " Oct 08 19:38:55 crc kubenswrapper[4750]: I1008 19:38:55.122015 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb22965d-c55e-4c81-adf0-7a6f84e5494e-config\") pod \"eb22965d-c55e-4c81-adf0-7a6f84e5494e\" (UID: \"eb22965d-c55e-4c81-adf0-7a6f84e5494e\") " Oct 08 19:38:55 crc kubenswrapper[4750]: I1008 19:38:55.122126 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcnzh\" (UniqueName: \"kubernetes.io/projected/eb22965d-c55e-4c81-adf0-7a6f84e5494e-kube-api-access-tcnzh\") pod \"eb22965d-c55e-4c81-adf0-7a6f84e5494e\" (UID: \"eb22965d-c55e-4c81-adf0-7a6f84e5494e\") " Oct 08 19:38:55 crc kubenswrapper[4750]: I1008 19:38:55.190830 4750 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb22965d-c55e-4c81-adf0-7a6f84e5494e-kube-api-access-tcnzh" (OuterVolumeSpecName: "kube-api-access-tcnzh") pod "eb22965d-c55e-4c81-adf0-7a6f84e5494e" (UID: "eb22965d-c55e-4c81-adf0-7a6f84e5494e"). InnerVolumeSpecName "kube-api-access-tcnzh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:38:55 crc kubenswrapper[4750]: I1008 19:38:55.224762 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/139a64cf-5b9a-40b4-b6b7-ab8132b9a856-scripts\") pod \"ovn-northd-0\" (UID: \"139a64cf-5b9a-40b4-b6b7-ab8132b9a856\") " pod="openstack/ovn-northd-0" Oct 08 19:38:55 crc kubenswrapper[4750]: I1008 19:38:55.224855 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/139a64cf-5b9a-40b4-b6b7-ab8132b9a856-config\") pod \"ovn-northd-0\" (UID: \"139a64cf-5b9a-40b4-b6b7-ab8132b9a856\") " pod="openstack/ovn-northd-0" Oct 08 19:38:55 crc kubenswrapper[4750]: I1008 19:38:55.224884 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/139a64cf-5b9a-40b4-b6b7-ab8132b9a856-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"139a64cf-5b9a-40b4-b6b7-ab8132b9a856\") " pod="openstack/ovn-northd-0" Oct 08 19:38:55 crc kubenswrapper[4750]: I1008 19:38:55.224911 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/139a64cf-5b9a-40b4-b6b7-ab8132b9a856-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"139a64cf-5b9a-40b4-b6b7-ab8132b9a856\") " pod="openstack/ovn-northd-0" Oct 08 19:38:55 crc kubenswrapper[4750]: I1008 19:38:55.224935 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-jcjfb\" (UniqueName: \"kubernetes.io/projected/139a64cf-5b9a-40b4-b6b7-ab8132b9a856-kube-api-access-jcjfb\") pod \"ovn-northd-0\" (UID: \"139a64cf-5b9a-40b4-b6b7-ab8132b9a856\") " pod="openstack/ovn-northd-0" Oct 08 19:38:55 crc kubenswrapper[4750]: I1008 19:38:55.224977 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcnzh\" (UniqueName: \"kubernetes.io/projected/eb22965d-c55e-4c81-adf0-7a6f84e5494e-kube-api-access-tcnzh\") on node \"crc\" DevicePath \"\"" Oct 08 19:38:55 crc kubenswrapper[4750]: I1008 19:38:55.235747 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb22965d-c55e-4c81-adf0-7a6f84e5494e-config" (OuterVolumeSpecName: "config") pod "eb22965d-c55e-4c81-adf0-7a6f84e5494e" (UID: "eb22965d-c55e-4c81-adf0-7a6f84e5494e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:38:55 crc kubenswrapper[4750]: I1008 19:38:55.249062 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb22965d-c55e-4c81-adf0-7a6f84e5494e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "eb22965d-c55e-4c81-adf0-7a6f84e5494e" (UID: "eb22965d-c55e-4c81-adf0-7a6f84e5494e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:38:55 crc kubenswrapper[4750]: I1008 19:38:55.326859 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/139a64cf-5b9a-40b4-b6b7-ab8132b9a856-scripts\") pod \"ovn-northd-0\" (UID: \"139a64cf-5b9a-40b4-b6b7-ab8132b9a856\") " pod="openstack/ovn-northd-0" Oct 08 19:38:55 crc kubenswrapper[4750]: I1008 19:38:55.326974 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/139a64cf-5b9a-40b4-b6b7-ab8132b9a856-config\") pod \"ovn-northd-0\" (UID: \"139a64cf-5b9a-40b4-b6b7-ab8132b9a856\") " pod="openstack/ovn-northd-0" Oct 08 19:38:55 crc kubenswrapper[4750]: I1008 19:38:55.326998 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/139a64cf-5b9a-40b4-b6b7-ab8132b9a856-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"139a64cf-5b9a-40b4-b6b7-ab8132b9a856\") " pod="openstack/ovn-northd-0" Oct 08 19:38:55 crc kubenswrapper[4750]: I1008 19:38:55.327046 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/139a64cf-5b9a-40b4-b6b7-ab8132b9a856-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"139a64cf-5b9a-40b4-b6b7-ab8132b9a856\") " pod="openstack/ovn-northd-0" Oct 08 19:38:55 crc kubenswrapper[4750]: I1008 19:38:55.327066 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcjfb\" (UniqueName: \"kubernetes.io/projected/139a64cf-5b9a-40b4-b6b7-ab8132b9a856-kube-api-access-jcjfb\") pod \"ovn-northd-0\" (UID: \"139a64cf-5b9a-40b4-b6b7-ab8132b9a856\") " pod="openstack/ovn-northd-0" Oct 08 19:38:55 crc kubenswrapper[4750]: I1008 19:38:55.327120 4750 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/eb22965d-c55e-4c81-adf0-7a6f84e5494e-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 19:38:55 crc kubenswrapper[4750]: I1008 19:38:55.327131 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb22965d-c55e-4c81-adf0-7a6f84e5494e-config\") on node \"crc\" DevicePath \"\"" Oct 08 19:38:55 crc kubenswrapper[4750]: I1008 19:38:55.328288 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/139a64cf-5b9a-40b4-b6b7-ab8132b9a856-scripts\") pod \"ovn-northd-0\" (UID: \"139a64cf-5b9a-40b4-b6b7-ab8132b9a856\") " pod="openstack/ovn-northd-0" Oct 08 19:38:55 crc kubenswrapper[4750]: I1008 19:38:55.328878 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/139a64cf-5b9a-40b4-b6b7-ab8132b9a856-config\") pod \"ovn-northd-0\" (UID: \"139a64cf-5b9a-40b4-b6b7-ab8132b9a856\") " pod="openstack/ovn-northd-0" Oct 08 19:38:55 crc kubenswrapper[4750]: I1008 19:38:55.329497 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/139a64cf-5b9a-40b4-b6b7-ab8132b9a856-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"139a64cf-5b9a-40b4-b6b7-ab8132b9a856\") " pod="openstack/ovn-northd-0" Oct 08 19:38:55 crc kubenswrapper[4750]: I1008 19:38:55.334207 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/139a64cf-5b9a-40b4-b6b7-ab8132b9a856-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"139a64cf-5b9a-40b4-b6b7-ab8132b9a856\") " pod="openstack/ovn-northd-0" Oct 08 19:38:55 crc kubenswrapper[4750]: I1008 19:38:55.356303 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcjfb\" (UniqueName: \"kubernetes.io/projected/139a64cf-5b9a-40b4-b6b7-ab8132b9a856-kube-api-access-jcjfb\") pod 
\"ovn-northd-0\" (UID: \"139a64cf-5b9a-40b4-b6b7-ab8132b9a856\") " pod="openstack/ovn-northd-0" Oct 08 19:38:55 crc kubenswrapper[4750]: I1008 19:38:55.357643 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fdc957c47-rj9w9"] Oct 08 19:38:55 crc kubenswrapper[4750]: I1008 19:38:55.363880 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5fdc957c47-rj9w9"] Oct 08 19:38:55 crc kubenswrapper[4750]: I1008 19:38:55.446341 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 08 19:38:55 crc kubenswrapper[4750]: I1008 19:38:55.946312 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 08 19:38:56 crc kubenswrapper[4750]: I1008 19:38:56.010169 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"139a64cf-5b9a-40b4-b6b7-ab8132b9a856","Type":"ContainerStarted","Data":"917fd98dfd7c3a47718e18e8adfcb14e9a2c74595b640e6c5591dfb8ad388c88"} Oct 08 19:38:56 crc kubenswrapper[4750]: I1008 19:38:56.746672 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb22965d-c55e-4c81-adf0-7a6f84e5494e" path="/var/lib/kubelet/pods/eb22965d-c55e-4c81-adf0-7a6f84e5494e/volumes" Oct 08 19:38:57 crc kubenswrapper[4750]: I1008 19:38:57.024122 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"139a64cf-5b9a-40b4-b6b7-ab8132b9a856","Type":"ContainerStarted","Data":"de0feaf4759bf84574d7dcd203d51ac710b9c70eef6391c6c36e0d32c9ae9330"} Oct 08 19:38:57 crc kubenswrapper[4750]: I1008 19:38:57.024186 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"139a64cf-5b9a-40b4-b6b7-ab8132b9a856","Type":"ContainerStarted","Data":"049f1c0543e9b5965d3fc99e1bcfef73f39fd6f1990c764814393693d1ada54e"} Oct 08 19:38:57 crc kubenswrapper[4750]: I1008 19:38:57.024838 4750 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 08 19:38:57 crc kubenswrapper[4750]: I1008 19:38:57.053101 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.053082109 podStartE2EDuration="2.053082109s" podCreationTimestamp="2025-10-08 19:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:38:57.047386357 +0000 UTC m=+5292.960357420" watchObservedRunningTime="2025-10-08 19:38:57.053082109 +0000 UTC m=+5292.966053122" Oct 08 19:39:00 crc kubenswrapper[4750]: I1008 19:39:00.735295 4750 scope.go:117] "RemoveContainer" containerID="42c2a7a6ce3026c9ad8c8eff781fbf4cba179ca631dc8f4fcdd0691647588e7a" Oct 08 19:39:00 crc kubenswrapper[4750]: E1008 19:39:00.736438 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:39:00 crc kubenswrapper[4750]: I1008 19:39:00.922504 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-8snxc"] Oct 08 19:39:00 crc kubenswrapper[4750]: I1008 19:39:00.923854 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-8snxc" Oct 08 19:39:00 crc kubenswrapper[4750]: I1008 19:39:00.935445 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-8snxc"] Oct 08 19:39:01 crc kubenswrapper[4750]: I1008 19:39:01.055766 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt5qd\" (UniqueName: \"kubernetes.io/projected/f042f49a-6626-4f29-8f84-2da116657330-kube-api-access-gt5qd\") pod \"keystone-db-create-8snxc\" (UID: \"f042f49a-6626-4f29-8f84-2da116657330\") " pod="openstack/keystone-db-create-8snxc" Oct 08 19:39:01 crc kubenswrapper[4750]: I1008 19:39:01.157349 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gt5qd\" (UniqueName: \"kubernetes.io/projected/f042f49a-6626-4f29-8f84-2da116657330-kube-api-access-gt5qd\") pod \"keystone-db-create-8snxc\" (UID: \"f042f49a-6626-4f29-8f84-2da116657330\") " pod="openstack/keystone-db-create-8snxc" Oct 08 19:39:01 crc kubenswrapper[4750]: I1008 19:39:01.188535 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt5qd\" (UniqueName: \"kubernetes.io/projected/f042f49a-6626-4f29-8f84-2da116657330-kube-api-access-gt5qd\") pod \"keystone-db-create-8snxc\" (UID: \"f042f49a-6626-4f29-8f84-2da116657330\") " pod="openstack/keystone-db-create-8snxc" Oct 08 19:39:01 crc kubenswrapper[4750]: I1008 19:39:01.248093 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-8snxc" Oct 08 19:39:01 crc kubenswrapper[4750]: I1008 19:39:01.740684 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-8snxc"] Oct 08 19:39:01 crc kubenswrapper[4750]: W1008 19:39:01.744852 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf042f49a_6626_4f29_8f84_2da116657330.slice/crio-d7fe7888438585de682f51b669f9c70605103568c4811e2c41328d3de683d27a WatchSource:0}: Error finding container d7fe7888438585de682f51b669f9c70605103568c4811e2c41328d3de683d27a: Status 404 returned error can't find the container with id d7fe7888438585de682f51b669f9c70605103568c4811e2c41328d3de683d27a Oct 08 19:39:02 crc kubenswrapper[4750]: I1008 19:39:02.083628 4750 generic.go:334] "Generic (PLEG): container finished" podID="f042f49a-6626-4f29-8f84-2da116657330" containerID="5e78fa5b8a46197551b717a36e1a9a9846aec41994aa4b310cb064fb3d3ddb31" exitCode=0 Oct 08 19:39:02 crc kubenswrapper[4750]: I1008 19:39:02.083706 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-8snxc" event={"ID":"f042f49a-6626-4f29-8f84-2da116657330","Type":"ContainerDied","Data":"5e78fa5b8a46197551b717a36e1a9a9846aec41994aa4b310cb064fb3d3ddb31"} Oct 08 19:39:02 crc kubenswrapper[4750]: I1008 19:39:02.084863 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-8snxc" event={"ID":"f042f49a-6626-4f29-8f84-2da116657330","Type":"ContainerStarted","Data":"d7fe7888438585de682f51b669f9c70605103568c4811e2c41328d3de683d27a"} Oct 08 19:39:03 crc kubenswrapper[4750]: I1008 19:39:03.109427 4750 scope.go:117] "RemoveContainer" containerID="6a177b9ea171a3b77046c64dc9ed918a7a6e04f4f09cdcfd2dad1faf4e1d67e9" Oct 08 19:39:03 crc kubenswrapper[4750]: I1008 19:39:03.142149 4750 scope.go:117] "RemoveContainer" containerID="1abc6208ea85b1a98af8b03ed8c49e3e7f2dd09c3b0768fe75f381dd3bf88c93" 
Oct 08 19:39:03 crc kubenswrapper[4750]: I1008 19:39:03.208885 4750 scope.go:117] "RemoveContainer" containerID="6884c875b92de39684d14bb3858d43508ee4e28c09d770a4824ee11d381bed5b" Oct 08 19:39:03 crc kubenswrapper[4750]: I1008 19:39:03.246581 4750 scope.go:117] "RemoveContainer" containerID="c83adb2e8c78b3f8169342cdfdd81d447670a1af31be2b1690b870d77bef4f09" Oct 08 19:39:03 crc kubenswrapper[4750]: I1008 19:39:03.287735 4750 scope.go:117] "RemoveContainer" containerID="456ebeefa91a765a33447733bee565b1f5a98f5688cb0bba3ae1755f1dc58368" Oct 08 19:39:03 crc kubenswrapper[4750]: I1008 19:39:03.326828 4750 scope.go:117] "RemoveContainer" containerID="1fd8455bed30a674374e17da2eabd2825de4b4d3e7112dd8295134025a705397" Oct 08 19:39:03 crc kubenswrapper[4750]: I1008 19:39:03.424607 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-8snxc" Oct 08 19:39:03 crc kubenswrapper[4750]: I1008 19:39:03.602265 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gt5qd\" (UniqueName: \"kubernetes.io/projected/f042f49a-6626-4f29-8f84-2da116657330-kube-api-access-gt5qd\") pod \"f042f49a-6626-4f29-8f84-2da116657330\" (UID: \"f042f49a-6626-4f29-8f84-2da116657330\") " Oct 08 19:39:03 crc kubenswrapper[4750]: I1008 19:39:03.612710 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f042f49a-6626-4f29-8f84-2da116657330-kube-api-access-gt5qd" (OuterVolumeSpecName: "kube-api-access-gt5qd") pod "f042f49a-6626-4f29-8f84-2da116657330" (UID: "f042f49a-6626-4f29-8f84-2da116657330"). InnerVolumeSpecName "kube-api-access-gt5qd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:39:03 crc kubenswrapper[4750]: I1008 19:39:03.705820 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gt5qd\" (UniqueName: \"kubernetes.io/projected/f042f49a-6626-4f29-8f84-2da116657330-kube-api-access-gt5qd\") on node \"crc\" DevicePath \"\"" Oct 08 19:39:04 crc kubenswrapper[4750]: I1008 19:39:04.110872 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-8snxc" event={"ID":"f042f49a-6626-4f29-8f84-2da116657330","Type":"ContainerDied","Data":"d7fe7888438585de682f51b669f9c70605103568c4811e2c41328d3de683d27a"} Oct 08 19:39:04 crc kubenswrapper[4750]: I1008 19:39:04.111409 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7fe7888438585de682f51b669f9c70605103568c4811e2c41328d3de683d27a" Oct 08 19:39:04 crc kubenswrapper[4750]: I1008 19:39:04.110989 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-8snxc" Oct 08 19:39:10 crc kubenswrapper[4750]: I1008 19:39:10.550055 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 08 19:39:11 crc kubenswrapper[4750]: I1008 19:39:11.030918 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-269e-account-create-8kk82"] Oct 08 19:39:11 crc kubenswrapper[4750]: E1008 19:39:11.031347 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f042f49a-6626-4f29-8f84-2da116657330" containerName="mariadb-database-create" Oct 08 19:39:11 crc kubenswrapper[4750]: I1008 19:39:11.031363 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="f042f49a-6626-4f29-8f84-2da116657330" containerName="mariadb-database-create" Oct 08 19:39:11 crc kubenswrapper[4750]: I1008 19:39:11.031580 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="f042f49a-6626-4f29-8f84-2da116657330" 
containerName="mariadb-database-create" Oct 08 19:39:11 crc kubenswrapper[4750]: I1008 19:39:11.032238 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-269e-account-create-8kk82" Oct 08 19:39:11 crc kubenswrapper[4750]: I1008 19:39:11.035077 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 08 19:39:11 crc kubenswrapper[4750]: I1008 19:39:11.046890 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-269e-account-create-8kk82"] Oct 08 19:39:11 crc kubenswrapper[4750]: I1008 19:39:11.159967 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h7vn\" (UniqueName: \"kubernetes.io/projected/db277cda-d138-4a8e-a30c-b767ce163d5b-kube-api-access-5h7vn\") pod \"keystone-269e-account-create-8kk82\" (UID: \"db277cda-d138-4a8e-a30c-b767ce163d5b\") " pod="openstack/keystone-269e-account-create-8kk82" Oct 08 19:39:11 crc kubenswrapper[4750]: I1008 19:39:11.262047 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5h7vn\" (UniqueName: \"kubernetes.io/projected/db277cda-d138-4a8e-a30c-b767ce163d5b-kube-api-access-5h7vn\") pod \"keystone-269e-account-create-8kk82\" (UID: \"db277cda-d138-4a8e-a30c-b767ce163d5b\") " pod="openstack/keystone-269e-account-create-8kk82" Oct 08 19:39:11 crc kubenswrapper[4750]: I1008 19:39:11.294675 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h7vn\" (UniqueName: \"kubernetes.io/projected/db277cda-d138-4a8e-a30c-b767ce163d5b-kube-api-access-5h7vn\") pod \"keystone-269e-account-create-8kk82\" (UID: \"db277cda-d138-4a8e-a30c-b767ce163d5b\") " pod="openstack/keystone-269e-account-create-8kk82" Oct 08 19:39:11 crc kubenswrapper[4750]: I1008 19:39:11.358694 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-269e-account-create-8kk82" Oct 08 19:39:11 crc kubenswrapper[4750]: I1008 19:39:11.894904 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-269e-account-create-8kk82"] Oct 08 19:39:11 crc kubenswrapper[4750]: W1008 19:39:11.905202 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb277cda_d138_4a8e_a30c_b767ce163d5b.slice/crio-e3c389bdcfb9a1b04eca58757ad9c196b9060844282a108b8c1919dbd3ade0c0 WatchSource:0}: Error finding container e3c389bdcfb9a1b04eca58757ad9c196b9060844282a108b8c1919dbd3ade0c0: Status 404 returned error can't find the container with id e3c389bdcfb9a1b04eca58757ad9c196b9060844282a108b8c1919dbd3ade0c0 Oct 08 19:39:12 crc kubenswrapper[4750]: I1008 19:39:12.199259 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-269e-account-create-8kk82" event={"ID":"db277cda-d138-4a8e-a30c-b767ce163d5b","Type":"ContainerStarted","Data":"e6d6f0860428ad7d7170a7ebca0a3d6260c7fdbee74938172fda4bfb0352625d"} Oct 08 19:39:12 crc kubenswrapper[4750]: I1008 19:39:12.199330 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-269e-account-create-8kk82" event={"ID":"db277cda-d138-4a8e-a30c-b767ce163d5b","Type":"ContainerStarted","Data":"e3c389bdcfb9a1b04eca58757ad9c196b9060844282a108b8c1919dbd3ade0c0"} Oct 08 19:39:12 crc kubenswrapper[4750]: I1008 19:39:12.228097 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-269e-account-create-8kk82" podStartSLOduration=1.22807018 podStartE2EDuration="1.22807018s" podCreationTimestamp="2025-10-08 19:39:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:39:12.219086027 +0000 UTC m=+5308.132057050" watchObservedRunningTime="2025-10-08 19:39:12.22807018 +0000 UTC 
m=+5308.141041203" Oct 08 19:39:13 crc kubenswrapper[4750]: I1008 19:39:13.216657 4750 generic.go:334] "Generic (PLEG): container finished" podID="db277cda-d138-4a8e-a30c-b767ce163d5b" containerID="e6d6f0860428ad7d7170a7ebca0a3d6260c7fdbee74938172fda4bfb0352625d" exitCode=0 Oct 08 19:39:13 crc kubenswrapper[4750]: I1008 19:39:13.216853 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-269e-account-create-8kk82" event={"ID":"db277cda-d138-4a8e-a30c-b767ce163d5b","Type":"ContainerDied","Data":"e6d6f0860428ad7d7170a7ebca0a3d6260c7fdbee74938172fda4bfb0352625d"} Oct 08 19:39:14 crc kubenswrapper[4750]: I1008 19:39:14.577774 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-269e-account-create-8kk82" Oct 08 19:39:14 crc kubenswrapper[4750]: I1008 19:39:14.736697 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5h7vn\" (UniqueName: \"kubernetes.io/projected/db277cda-d138-4a8e-a30c-b767ce163d5b-kube-api-access-5h7vn\") pod \"db277cda-d138-4a8e-a30c-b767ce163d5b\" (UID: \"db277cda-d138-4a8e-a30c-b767ce163d5b\") " Oct 08 19:39:14 crc kubenswrapper[4750]: I1008 19:39:14.747459 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db277cda-d138-4a8e-a30c-b767ce163d5b-kube-api-access-5h7vn" (OuterVolumeSpecName: "kube-api-access-5h7vn") pod "db277cda-d138-4a8e-a30c-b767ce163d5b" (UID: "db277cda-d138-4a8e-a30c-b767ce163d5b"). InnerVolumeSpecName "kube-api-access-5h7vn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:39:14 crc kubenswrapper[4750]: I1008 19:39:14.841539 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5h7vn\" (UniqueName: \"kubernetes.io/projected/db277cda-d138-4a8e-a30c-b767ce163d5b-kube-api-access-5h7vn\") on node \"crc\" DevicePath \"\"" Oct 08 19:39:15 crc kubenswrapper[4750]: I1008 19:39:15.241657 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-269e-account-create-8kk82" event={"ID":"db277cda-d138-4a8e-a30c-b767ce163d5b","Type":"ContainerDied","Data":"e3c389bdcfb9a1b04eca58757ad9c196b9060844282a108b8c1919dbd3ade0c0"} Oct 08 19:39:15 crc kubenswrapper[4750]: I1008 19:39:15.241723 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3c389bdcfb9a1b04eca58757ad9c196b9060844282a108b8c1919dbd3ade0c0" Oct 08 19:39:15 crc kubenswrapper[4750]: I1008 19:39:15.241746 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-269e-account-create-8kk82" Oct 08 19:39:15 crc kubenswrapper[4750]: I1008 19:39:15.734537 4750 scope.go:117] "RemoveContainer" containerID="42c2a7a6ce3026c9ad8c8eff781fbf4cba179ca631dc8f4fcdd0691647588e7a" Oct 08 19:39:15 crc kubenswrapper[4750]: E1008 19:39:15.735156 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:39:16 crc kubenswrapper[4750]: I1008 19:39:16.467183 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-d2hls"] Oct 08 19:39:16 crc kubenswrapper[4750]: E1008 19:39:16.468171 4750 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="db277cda-d138-4a8e-a30c-b767ce163d5b" containerName="mariadb-account-create" Oct 08 19:39:16 crc kubenswrapper[4750]: I1008 19:39:16.468194 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="db277cda-d138-4a8e-a30c-b767ce163d5b" containerName="mariadb-account-create" Oct 08 19:39:16 crc kubenswrapper[4750]: I1008 19:39:16.468434 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="db277cda-d138-4a8e-a30c-b767ce163d5b" containerName="mariadb-account-create" Oct 08 19:39:16 crc kubenswrapper[4750]: I1008 19:39:16.469356 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-d2hls" Oct 08 19:39:16 crc kubenswrapper[4750]: I1008 19:39:16.488121 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 08 19:39:16 crc kubenswrapper[4750]: I1008 19:39:16.488477 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 08 19:39:16 crc kubenswrapper[4750]: I1008 19:39:16.488871 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-lvddj" Oct 08 19:39:16 crc kubenswrapper[4750]: I1008 19:39:16.489651 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 08 19:39:16 crc kubenswrapper[4750]: I1008 19:39:16.491183 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-d2hls"] Oct 08 19:39:16 crc kubenswrapper[4750]: I1008 19:39:16.579286 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5rrx\" (UniqueName: \"kubernetes.io/projected/9e171e3a-c2b0-4b44-8a1b-7d345e7e9545-kube-api-access-t5rrx\") pod \"keystone-db-sync-d2hls\" (UID: \"9e171e3a-c2b0-4b44-8a1b-7d345e7e9545\") " pod="openstack/keystone-db-sync-d2hls" Oct 08 19:39:16 crc kubenswrapper[4750]: I1008 19:39:16.579363 
4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e171e3a-c2b0-4b44-8a1b-7d345e7e9545-combined-ca-bundle\") pod \"keystone-db-sync-d2hls\" (UID: \"9e171e3a-c2b0-4b44-8a1b-7d345e7e9545\") " pod="openstack/keystone-db-sync-d2hls" Oct 08 19:39:16 crc kubenswrapper[4750]: I1008 19:39:16.579837 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e171e3a-c2b0-4b44-8a1b-7d345e7e9545-config-data\") pod \"keystone-db-sync-d2hls\" (UID: \"9e171e3a-c2b0-4b44-8a1b-7d345e7e9545\") " pod="openstack/keystone-db-sync-d2hls" Oct 08 19:39:16 crc kubenswrapper[4750]: I1008 19:39:16.682206 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e171e3a-c2b0-4b44-8a1b-7d345e7e9545-combined-ca-bundle\") pod \"keystone-db-sync-d2hls\" (UID: \"9e171e3a-c2b0-4b44-8a1b-7d345e7e9545\") " pod="openstack/keystone-db-sync-d2hls" Oct 08 19:39:16 crc kubenswrapper[4750]: I1008 19:39:16.682517 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e171e3a-c2b0-4b44-8a1b-7d345e7e9545-config-data\") pod \"keystone-db-sync-d2hls\" (UID: \"9e171e3a-c2b0-4b44-8a1b-7d345e7e9545\") " pod="openstack/keystone-db-sync-d2hls" Oct 08 19:39:16 crc kubenswrapper[4750]: I1008 19:39:16.682662 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5rrx\" (UniqueName: \"kubernetes.io/projected/9e171e3a-c2b0-4b44-8a1b-7d345e7e9545-kube-api-access-t5rrx\") pod \"keystone-db-sync-d2hls\" (UID: \"9e171e3a-c2b0-4b44-8a1b-7d345e7e9545\") " pod="openstack/keystone-db-sync-d2hls" Oct 08 19:39:16 crc kubenswrapper[4750]: I1008 19:39:16.690291 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e171e3a-c2b0-4b44-8a1b-7d345e7e9545-combined-ca-bundle\") pod \"keystone-db-sync-d2hls\" (UID: \"9e171e3a-c2b0-4b44-8a1b-7d345e7e9545\") " pod="openstack/keystone-db-sync-d2hls" Oct 08 19:39:16 crc kubenswrapper[4750]: I1008 19:39:16.690752 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e171e3a-c2b0-4b44-8a1b-7d345e7e9545-config-data\") pod \"keystone-db-sync-d2hls\" (UID: \"9e171e3a-c2b0-4b44-8a1b-7d345e7e9545\") " pod="openstack/keystone-db-sync-d2hls" Oct 08 19:39:16 crc kubenswrapper[4750]: I1008 19:39:16.705469 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5rrx\" (UniqueName: \"kubernetes.io/projected/9e171e3a-c2b0-4b44-8a1b-7d345e7e9545-kube-api-access-t5rrx\") pod \"keystone-db-sync-d2hls\" (UID: \"9e171e3a-c2b0-4b44-8a1b-7d345e7e9545\") " pod="openstack/keystone-db-sync-d2hls" Oct 08 19:39:16 crc kubenswrapper[4750]: I1008 19:39:16.790678 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-d2hls" Oct 08 19:39:17 crc kubenswrapper[4750]: I1008 19:39:17.313349 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-d2hls"] Oct 08 19:39:17 crc kubenswrapper[4750]: W1008 19:39:17.324014 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e171e3a_c2b0_4b44_8a1b_7d345e7e9545.slice/crio-e4c410d826285512bf70344a321ff83dff6ab7ec5ec1da5502fe406e6c000962 WatchSource:0}: Error finding container e4c410d826285512bf70344a321ff83dff6ab7ec5ec1da5502fe406e6c000962: Status 404 returned error can't find the container with id e4c410d826285512bf70344a321ff83dff6ab7ec5ec1da5502fe406e6c000962 Oct 08 19:39:18 crc kubenswrapper[4750]: I1008 19:39:18.276659 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-d2hls" event={"ID":"9e171e3a-c2b0-4b44-8a1b-7d345e7e9545","Type":"ContainerStarted","Data":"134dec9c46b49f238cea8b903cbf59e9f6f06bb1e97ba812b4bcfbb71f17ea24"} Oct 08 19:39:18 crc kubenswrapper[4750]: I1008 19:39:18.277337 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-d2hls" event={"ID":"9e171e3a-c2b0-4b44-8a1b-7d345e7e9545","Type":"ContainerStarted","Data":"e4c410d826285512bf70344a321ff83dff6ab7ec5ec1da5502fe406e6c000962"} Oct 08 19:39:18 crc kubenswrapper[4750]: I1008 19:39:18.297392 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-d2hls" podStartSLOduration=2.297365626 podStartE2EDuration="2.297365626s" podCreationTimestamp="2025-10-08 19:39:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:39:18.296038662 +0000 UTC m=+5314.209009685" watchObservedRunningTime="2025-10-08 19:39:18.297365626 +0000 UTC m=+5314.210336639" Oct 08 19:39:19 crc kubenswrapper[4750]: I1008 
19:39:19.293379 4750 generic.go:334] "Generic (PLEG): container finished" podID="9e171e3a-c2b0-4b44-8a1b-7d345e7e9545" containerID="134dec9c46b49f238cea8b903cbf59e9f6f06bb1e97ba812b4bcfbb71f17ea24" exitCode=0 Oct 08 19:39:19 crc kubenswrapper[4750]: I1008 19:39:19.293486 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-d2hls" event={"ID":"9e171e3a-c2b0-4b44-8a1b-7d345e7e9545","Type":"ContainerDied","Data":"134dec9c46b49f238cea8b903cbf59e9f6f06bb1e97ba812b4bcfbb71f17ea24"} Oct 08 19:39:20 crc kubenswrapper[4750]: I1008 19:39:20.680653 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-d2hls" Oct 08 19:39:20 crc kubenswrapper[4750]: I1008 19:39:20.824057 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e171e3a-c2b0-4b44-8a1b-7d345e7e9545-combined-ca-bundle\") pod \"9e171e3a-c2b0-4b44-8a1b-7d345e7e9545\" (UID: \"9e171e3a-c2b0-4b44-8a1b-7d345e7e9545\") " Oct 08 19:39:20 crc kubenswrapper[4750]: I1008 19:39:20.824133 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e171e3a-c2b0-4b44-8a1b-7d345e7e9545-config-data\") pod \"9e171e3a-c2b0-4b44-8a1b-7d345e7e9545\" (UID: \"9e171e3a-c2b0-4b44-8a1b-7d345e7e9545\") " Oct 08 19:39:20 crc kubenswrapper[4750]: I1008 19:39:20.824330 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5rrx\" (UniqueName: \"kubernetes.io/projected/9e171e3a-c2b0-4b44-8a1b-7d345e7e9545-kube-api-access-t5rrx\") pod \"9e171e3a-c2b0-4b44-8a1b-7d345e7e9545\" (UID: \"9e171e3a-c2b0-4b44-8a1b-7d345e7e9545\") " Oct 08 19:39:20 crc kubenswrapper[4750]: I1008 19:39:20.839080 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e171e3a-c2b0-4b44-8a1b-7d345e7e9545-kube-api-access-t5rrx" 
(OuterVolumeSpecName: "kube-api-access-t5rrx") pod "9e171e3a-c2b0-4b44-8a1b-7d345e7e9545" (UID: "9e171e3a-c2b0-4b44-8a1b-7d345e7e9545"). InnerVolumeSpecName "kube-api-access-t5rrx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:39:20 crc kubenswrapper[4750]: I1008 19:39:20.864871 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e171e3a-c2b0-4b44-8a1b-7d345e7e9545-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9e171e3a-c2b0-4b44-8a1b-7d345e7e9545" (UID: "9e171e3a-c2b0-4b44-8a1b-7d345e7e9545"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:39:20 crc kubenswrapper[4750]: I1008 19:39:20.881832 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e171e3a-c2b0-4b44-8a1b-7d345e7e9545-config-data" (OuterVolumeSpecName: "config-data") pod "9e171e3a-c2b0-4b44-8a1b-7d345e7e9545" (UID: "9e171e3a-c2b0-4b44-8a1b-7d345e7e9545"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:39:20 crc kubenswrapper[4750]: I1008 19:39:20.926583 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5rrx\" (UniqueName: \"kubernetes.io/projected/9e171e3a-c2b0-4b44-8a1b-7d345e7e9545-kube-api-access-t5rrx\") on node \"crc\" DevicePath \"\"" Oct 08 19:39:20 crc kubenswrapper[4750]: I1008 19:39:20.926652 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e171e3a-c2b0-4b44-8a1b-7d345e7e9545-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 19:39:20 crc kubenswrapper[4750]: I1008 19:39:20.926672 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e171e3a-c2b0-4b44-8a1b-7d345e7e9545-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.323985 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-d2hls" event={"ID":"9e171e3a-c2b0-4b44-8a1b-7d345e7e9545","Type":"ContainerDied","Data":"e4c410d826285512bf70344a321ff83dff6ab7ec5ec1da5502fe406e6c000962"} Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.324067 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4c410d826285512bf70344a321ff83dff6ab7ec5ec1da5502fe406e6c000962" Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.324172 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-d2hls" Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.631668 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-rd5l8"] Oct 08 19:39:21 crc kubenswrapper[4750]: E1008 19:39:21.632198 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e171e3a-c2b0-4b44-8a1b-7d345e7e9545" containerName="keystone-db-sync" Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.632219 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e171e3a-c2b0-4b44-8a1b-7d345e7e9545" containerName="keystone-db-sync" Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.632446 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e171e3a-c2b0-4b44-8a1b-7d345e7e9545" containerName="keystone-db-sync" Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.633239 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rd5l8" Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.635781 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.637768 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-lvddj" Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.637963 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.638135 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.642653 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfee363b-c0e2-4747-b197-fff595e23653-combined-ca-bundle\") pod \"keystone-bootstrap-rd5l8\" (UID: 
\"dfee363b-c0e2-4747-b197-fff595e23653\") " pod="openstack/keystone-bootstrap-rd5l8" Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.642728 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dfee363b-c0e2-4747-b197-fff595e23653-fernet-keys\") pod \"keystone-bootstrap-rd5l8\" (UID: \"dfee363b-c0e2-4747-b197-fff595e23653\") " pod="openstack/keystone-bootstrap-rd5l8" Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.642759 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dfee363b-c0e2-4747-b197-fff595e23653-credential-keys\") pod \"keystone-bootstrap-rd5l8\" (UID: \"dfee363b-c0e2-4747-b197-fff595e23653\") " pod="openstack/keystone-bootstrap-rd5l8" Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.642778 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b86wj\" (UniqueName: \"kubernetes.io/projected/dfee363b-c0e2-4747-b197-fff595e23653-kube-api-access-b86wj\") pod \"keystone-bootstrap-rd5l8\" (UID: \"dfee363b-c0e2-4747-b197-fff595e23653\") " pod="openstack/keystone-bootstrap-rd5l8" Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.642815 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfee363b-c0e2-4747-b197-fff595e23653-config-data\") pod \"keystone-bootstrap-rd5l8\" (UID: \"dfee363b-c0e2-4747-b197-fff595e23653\") " pod="openstack/keystone-bootstrap-rd5l8" Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.642847 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfee363b-c0e2-4747-b197-fff595e23653-scripts\") pod \"keystone-bootstrap-rd5l8\" (UID: 
\"dfee363b-c0e2-4747-b197-fff595e23653\") " pod="openstack/keystone-bootstrap-rd5l8" Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.647015 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77f55cbcff-gwv8t"] Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.649258 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77f55cbcff-gwv8t" Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.656632 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rd5l8"] Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.668018 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77f55cbcff-gwv8t"] Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.745054 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f363218-3c2f-46a8-84cc-985e5db78d56-dns-svc\") pod \"dnsmasq-dns-77f55cbcff-gwv8t\" (UID: \"2f363218-3c2f-46a8-84cc-985e5db78d56\") " pod="openstack/dnsmasq-dns-77f55cbcff-gwv8t" Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.745122 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f363218-3c2f-46a8-84cc-985e5db78d56-ovsdbserver-sb\") pod \"dnsmasq-dns-77f55cbcff-gwv8t\" (UID: \"2f363218-3c2f-46a8-84cc-985e5db78d56\") " pod="openstack/dnsmasq-dns-77f55cbcff-gwv8t" Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.745153 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfee363b-c0e2-4747-b197-fff595e23653-combined-ca-bundle\") pod \"keystone-bootstrap-rd5l8\" (UID: \"dfee363b-c0e2-4747-b197-fff595e23653\") " pod="openstack/keystone-bootstrap-rd5l8" Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 
19:39:21.745210 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2js8m\" (UniqueName: \"kubernetes.io/projected/2f363218-3c2f-46a8-84cc-985e5db78d56-kube-api-access-2js8m\") pod \"dnsmasq-dns-77f55cbcff-gwv8t\" (UID: \"2f363218-3c2f-46a8-84cc-985e5db78d56\") " pod="openstack/dnsmasq-dns-77f55cbcff-gwv8t" Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.745251 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dfee363b-c0e2-4747-b197-fff595e23653-fernet-keys\") pod \"keystone-bootstrap-rd5l8\" (UID: \"dfee363b-c0e2-4747-b197-fff595e23653\") " pod="openstack/keystone-bootstrap-rd5l8" Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.745273 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dfee363b-c0e2-4747-b197-fff595e23653-credential-keys\") pod \"keystone-bootstrap-rd5l8\" (UID: \"dfee363b-c0e2-4747-b197-fff595e23653\") " pod="openstack/keystone-bootstrap-rd5l8" Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.745294 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b86wj\" (UniqueName: \"kubernetes.io/projected/dfee363b-c0e2-4747-b197-fff595e23653-kube-api-access-b86wj\") pod \"keystone-bootstrap-rd5l8\" (UID: \"dfee363b-c0e2-4747-b197-fff595e23653\") " pod="openstack/keystone-bootstrap-rd5l8" Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.745313 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f363218-3c2f-46a8-84cc-985e5db78d56-ovsdbserver-nb\") pod \"dnsmasq-dns-77f55cbcff-gwv8t\" (UID: \"2f363218-3c2f-46a8-84cc-985e5db78d56\") " pod="openstack/dnsmasq-dns-77f55cbcff-gwv8t" Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.745335 4750 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfee363b-c0e2-4747-b197-fff595e23653-config-data\") pod \"keystone-bootstrap-rd5l8\" (UID: \"dfee363b-c0e2-4747-b197-fff595e23653\") " pod="openstack/keystone-bootstrap-rd5l8" Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.745372 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f363218-3c2f-46a8-84cc-985e5db78d56-config\") pod \"dnsmasq-dns-77f55cbcff-gwv8t\" (UID: \"2f363218-3c2f-46a8-84cc-985e5db78d56\") " pod="openstack/dnsmasq-dns-77f55cbcff-gwv8t" Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.745445 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfee363b-c0e2-4747-b197-fff595e23653-scripts\") pod \"keystone-bootstrap-rd5l8\" (UID: \"dfee363b-c0e2-4747-b197-fff595e23653\") " pod="openstack/keystone-bootstrap-rd5l8" Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.752813 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dfee363b-c0e2-4747-b197-fff595e23653-credential-keys\") pod \"keystone-bootstrap-rd5l8\" (UID: \"dfee363b-c0e2-4747-b197-fff595e23653\") " pod="openstack/keystone-bootstrap-rd5l8" Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.752999 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfee363b-c0e2-4747-b197-fff595e23653-scripts\") pod \"keystone-bootstrap-rd5l8\" (UID: \"dfee363b-c0e2-4747-b197-fff595e23653\") " pod="openstack/keystone-bootstrap-rd5l8" Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.756693 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/dfee363b-c0e2-4747-b197-fff595e23653-config-data\") pod \"keystone-bootstrap-rd5l8\" (UID: \"dfee363b-c0e2-4747-b197-fff595e23653\") " pod="openstack/keystone-bootstrap-rd5l8" Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.752343 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dfee363b-c0e2-4747-b197-fff595e23653-fernet-keys\") pod \"keystone-bootstrap-rd5l8\" (UID: \"dfee363b-c0e2-4747-b197-fff595e23653\") " pod="openstack/keystone-bootstrap-rd5l8" Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.768706 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfee363b-c0e2-4747-b197-fff595e23653-combined-ca-bundle\") pod \"keystone-bootstrap-rd5l8\" (UID: \"dfee363b-c0e2-4747-b197-fff595e23653\") " pod="openstack/keystone-bootstrap-rd5l8" Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.768874 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b86wj\" (UniqueName: \"kubernetes.io/projected/dfee363b-c0e2-4747-b197-fff595e23653-kube-api-access-b86wj\") pod \"keystone-bootstrap-rd5l8\" (UID: \"dfee363b-c0e2-4747-b197-fff595e23653\") " pod="openstack/keystone-bootstrap-rd5l8" Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.846808 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f363218-3c2f-46a8-84cc-985e5db78d56-dns-svc\") pod \"dnsmasq-dns-77f55cbcff-gwv8t\" (UID: \"2f363218-3c2f-46a8-84cc-985e5db78d56\") " pod="openstack/dnsmasq-dns-77f55cbcff-gwv8t" Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.847352 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f363218-3c2f-46a8-84cc-985e5db78d56-ovsdbserver-sb\") pod \"dnsmasq-dns-77f55cbcff-gwv8t\" 
(UID: \"2f363218-3c2f-46a8-84cc-985e5db78d56\") " pod="openstack/dnsmasq-dns-77f55cbcff-gwv8t" Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.847399 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2js8m\" (UniqueName: \"kubernetes.io/projected/2f363218-3c2f-46a8-84cc-985e5db78d56-kube-api-access-2js8m\") pod \"dnsmasq-dns-77f55cbcff-gwv8t\" (UID: \"2f363218-3c2f-46a8-84cc-985e5db78d56\") " pod="openstack/dnsmasq-dns-77f55cbcff-gwv8t" Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.847441 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f363218-3c2f-46a8-84cc-985e5db78d56-ovsdbserver-nb\") pod \"dnsmasq-dns-77f55cbcff-gwv8t\" (UID: \"2f363218-3c2f-46a8-84cc-985e5db78d56\") " pod="openstack/dnsmasq-dns-77f55cbcff-gwv8t" Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.847475 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f363218-3c2f-46a8-84cc-985e5db78d56-config\") pod \"dnsmasq-dns-77f55cbcff-gwv8t\" (UID: \"2f363218-3c2f-46a8-84cc-985e5db78d56\") " pod="openstack/dnsmasq-dns-77f55cbcff-gwv8t" Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.848179 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f363218-3c2f-46a8-84cc-985e5db78d56-dns-svc\") pod \"dnsmasq-dns-77f55cbcff-gwv8t\" (UID: \"2f363218-3c2f-46a8-84cc-985e5db78d56\") " pod="openstack/dnsmasq-dns-77f55cbcff-gwv8t" Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.848386 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f363218-3c2f-46a8-84cc-985e5db78d56-ovsdbserver-sb\") pod \"dnsmasq-dns-77f55cbcff-gwv8t\" (UID: \"2f363218-3c2f-46a8-84cc-985e5db78d56\") " 
pod="openstack/dnsmasq-dns-77f55cbcff-gwv8t" Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.848532 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f363218-3c2f-46a8-84cc-985e5db78d56-config\") pod \"dnsmasq-dns-77f55cbcff-gwv8t\" (UID: \"2f363218-3c2f-46a8-84cc-985e5db78d56\") " pod="openstack/dnsmasq-dns-77f55cbcff-gwv8t" Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.848907 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f363218-3c2f-46a8-84cc-985e5db78d56-ovsdbserver-nb\") pod \"dnsmasq-dns-77f55cbcff-gwv8t\" (UID: \"2f363218-3c2f-46a8-84cc-985e5db78d56\") " pod="openstack/dnsmasq-dns-77f55cbcff-gwv8t" Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.871743 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2js8m\" (UniqueName: \"kubernetes.io/projected/2f363218-3c2f-46a8-84cc-985e5db78d56-kube-api-access-2js8m\") pod \"dnsmasq-dns-77f55cbcff-gwv8t\" (UID: \"2f363218-3c2f-46a8-84cc-985e5db78d56\") " pod="openstack/dnsmasq-dns-77f55cbcff-gwv8t" Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.962161 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rd5l8" Oct 08 19:39:21 crc kubenswrapper[4750]: I1008 19:39:21.980968 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77f55cbcff-gwv8t" Oct 08 19:39:22 crc kubenswrapper[4750]: I1008 19:39:22.253673 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rd5l8"] Oct 08 19:39:22 crc kubenswrapper[4750]: I1008 19:39:22.335678 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rd5l8" event={"ID":"dfee363b-c0e2-4747-b197-fff595e23653","Type":"ContainerStarted","Data":"01dd12fc5c70e41059788f1a3f6f5b3b5b46ac99c72fb51d6f75957ae52a9a67"} Oct 08 19:39:22 crc kubenswrapper[4750]: I1008 19:39:22.352140 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77f55cbcff-gwv8t"] Oct 08 19:39:23 crc kubenswrapper[4750]: I1008 19:39:23.350755 4750 generic.go:334] "Generic (PLEG): container finished" podID="2f363218-3c2f-46a8-84cc-985e5db78d56" containerID="3378856a0503ed5a92bfcede45ce6e23a7d3cb5780b394041d241c8e585d742a" exitCode=0 Oct 08 19:39:23 crc kubenswrapper[4750]: I1008 19:39:23.350842 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77f55cbcff-gwv8t" event={"ID":"2f363218-3c2f-46a8-84cc-985e5db78d56","Type":"ContainerDied","Data":"3378856a0503ed5a92bfcede45ce6e23a7d3cb5780b394041d241c8e585d742a"} Oct 08 19:39:23 crc kubenswrapper[4750]: I1008 19:39:23.351290 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77f55cbcff-gwv8t" event={"ID":"2f363218-3c2f-46a8-84cc-985e5db78d56","Type":"ContainerStarted","Data":"5847ed4245d97e1690df9f5d66523882c8e26934f589df0ed1c45fd68f5f10ae"} Oct 08 19:39:23 crc kubenswrapper[4750]: I1008 19:39:23.354229 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rd5l8" event={"ID":"dfee363b-c0e2-4747-b197-fff595e23653","Type":"ContainerStarted","Data":"30072bff486fa535b27f17325f43675ac18fd6c5ae3424ff79cec63176f583d0"} Oct 08 19:39:23 crc kubenswrapper[4750]: I1008 19:39:23.424255 4750 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-rd5l8" podStartSLOduration=2.424227523 podStartE2EDuration="2.424227523s" podCreationTimestamp="2025-10-08 19:39:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:39:23.412740507 +0000 UTC m=+5319.325711550" watchObservedRunningTime="2025-10-08 19:39:23.424227523 +0000 UTC m=+5319.337198556" Oct 08 19:39:24 crc kubenswrapper[4750]: I1008 19:39:24.370923 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77f55cbcff-gwv8t" event={"ID":"2f363218-3c2f-46a8-84cc-985e5db78d56","Type":"ContainerStarted","Data":"ccab34ae3e95c9a57ecf972c84073ab27ab5f560b07f7235b59fce97d22f10a9"} Oct 08 19:39:24 crc kubenswrapper[4750]: I1008 19:39:24.371033 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77f55cbcff-gwv8t" Oct 08 19:39:24 crc kubenswrapper[4750]: I1008 19:39:24.405520 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77f55cbcff-gwv8t" podStartSLOduration=3.405472927 podStartE2EDuration="3.405472927s" podCreationTimestamp="2025-10-08 19:39:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:39:24.403844177 +0000 UTC m=+5320.316815200" watchObservedRunningTime="2025-10-08 19:39:24.405472927 +0000 UTC m=+5320.318443970" Oct 08 19:39:26 crc kubenswrapper[4750]: I1008 19:39:26.397409 4750 generic.go:334] "Generic (PLEG): container finished" podID="dfee363b-c0e2-4747-b197-fff595e23653" containerID="30072bff486fa535b27f17325f43675ac18fd6c5ae3424ff79cec63176f583d0" exitCode=0 Oct 08 19:39:26 crc kubenswrapper[4750]: I1008 19:39:26.397511 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rd5l8" 
event={"ID":"dfee363b-c0e2-4747-b197-fff595e23653","Type":"ContainerDied","Data":"30072bff486fa535b27f17325f43675ac18fd6c5ae3424ff79cec63176f583d0"} Oct 08 19:39:26 crc kubenswrapper[4750]: I1008 19:39:26.735492 4750 scope.go:117] "RemoveContainer" containerID="42c2a7a6ce3026c9ad8c8eff781fbf4cba179ca631dc8f4fcdd0691647588e7a" Oct 08 19:39:26 crc kubenswrapper[4750]: E1008 19:39:26.736000 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:39:27 crc kubenswrapper[4750]: I1008 19:39:27.787879 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rd5l8" Oct 08 19:39:27 crc kubenswrapper[4750]: I1008 19:39:27.799083 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b86wj\" (UniqueName: \"kubernetes.io/projected/dfee363b-c0e2-4747-b197-fff595e23653-kube-api-access-b86wj\") pod \"dfee363b-c0e2-4747-b197-fff595e23653\" (UID: \"dfee363b-c0e2-4747-b197-fff595e23653\") " Oct 08 19:39:27 crc kubenswrapper[4750]: I1008 19:39:27.799127 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfee363b-c0e2-4747-b197-fff595e23653-combined-ca-bundle\") pod \"dfee363b-c0e2-4747-b197-fff595e23653\" (UID: \"dfee363b-c0e2-4747-b197-fff595e23653\") " Oct 08 19:39:27 crc kubenswrapper[4750]: I1008 19:39:27.799250 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dfee363b-c0e2-4747-b197-fff595e23653-credential-keys\") pod 
\"dfee363b-c0e2-4747-b197-fff595e23653\" (UID: \"dfee363b-c0e2-4747-b197-fff595e23653\") " Oct 08 19:39:27 crc kubenswrapper[4750]: I1008 19:39:27.799301 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dfee363b-c0e2-4747-b197-fff595e23653-fernet-keys\") pod \"dfee363b-c0e2-4747-b197-fff595e23653\" (UID: \"dfee363b-c0e2-4747-b197-fff595e23653\") " Oct 08 19:39:27 crc kubenswrapper[4750]: I1008 19:39:27.803713 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfee363b-c0e2-4747-b197-fff595e23653-scripts\") pod \"dfee363b-c0e2-4747-b197-fff595e23653\" (UID: \"dfee363b-c0e2-4747-b197-fff595e23653\") " Oct 08 19:39:27 crc kubenswrapper[4750]: I1008 19:39:27.804332 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfee363b-c0e2-4747-b197-fff595e23653-config-data\") pod \"dfee363b-c0e2-4747-b197-fff595e23653\" (UID: \"dfee363b-c0e2-4747-b197-fff595e23653\") " Oct 08 19:39:27 crc kubenswrapper[4750]: I1008 19:39:27.814762 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfee363b-c0e2-4747-b197-fff595e23653-scripts" (OuterVolumeSpecName: "scripts") pod "dfee363b-c0e2-4747-b197-fff595e23653" (UID: "dfee363b-c0e2-4747-b197-fff595e23653"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:39:27 crc kubenswrapper[4750]: I1008 19:39:27.818318 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfee363b-c0e2-4747-b197-fff595e23653-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "dfee363b-c0e2-4747-b197-fff595e23653" (UID: "dfee363b-c0e2-4747-b197-fff595e23653"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:39:27 crc kubenswrapper[4750]: I1008 19:39:27.867975 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfee363b-c0e2-4747-b197-fff595e23653-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "dfee363b-c0e2-4747-b197-fff595e23653" (UID: "dfee363b-c0e2-4747-b197-fff595e23653"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:39:27 crc kubenswrapper[4750]: I1008 19:39:27.873975 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfee363b-c0e2-4747-b197-fff595e23653-config-data" (OuterVolumeSpecName: "config-data") pod "dfee363b-c0e2-4747-b197-fff595e23653" (UID: "dfee363b-c0e2-4747-b197-fff595e23653"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:39:27 crc kubenswrapper[4750]: I1008 19:39:27.885245 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfee363b-c0e2-4747-b197-fff595e23653-kube-api-access-b86wj" (OuterVolumeSpecName: "kube-api-access-b86wj") pod "dfee363b-c0e2-4747-b197-fff595e23653" (UID: "dfee363b-c0e2-4747-b197-fff595e23653"). InnerVolumeSpecName "kube-api-access-b86wj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:39:27 crc kubenswrapper[4750]: I1008 19:39:27.890494 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfee363b-c0e2-4747-b197-fff595e23653-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dfee363b-c0e2-4747-b197-fff595e23653" (UID: "dfee363b-c0e2-4747-b197-fff595e23653"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:39:27 crc kubenswrapper[4750]: I1008 19:39:27.907756 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfee363b-c0e2-4747-b197-fff595e23653-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 19:39:27 crc kubenswrapper[4750]: I1008 19:39:27.907795 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfee363b-c0e2-4747-b197-fff595e23653-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 19:39:27 crc kubenswrapper[4750]: I1008 19:39:27.907810 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b86wj\" (UniqueName: \"kubernetes.io/projected/dfee363b-c0e2-4747-b197-fff595e23653-kube-api-access-b86wj\") on node \"crc\" DevicePath \"\"" Oct 08 19:39:27 crc kubenswrapper[4750]: I1008 19:39:27.907823 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfee363b-c0e2-4747-b197-fff595e23653-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 19:39:27 crc kubenswrapper[4750]: I1008 19:39:27.907832 4750 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dfee363b-c0e2-4747-b197-fff595e23653-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 08 19:39:27 crc kubenswrapper[4750]: I1008 19:39:27.907840 4750 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dfee363b-c0e2-4747-b197-fff595e23653-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 08 19:39:28 crc kubenswrapper[4750]: I1008 19:39:28.417442 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rd5l8" event={"ID":"dfee363b-c0e2-4747-b197-fff595e23653","Type":"ContainerDied","Data":"01dd12fc5c70e41059788f1a3f6f5b3b5b46ac99c72fb51d6f75957ae52a9a67"} Oct 08 19:39:28 crc kubenswrapper[4750]: I1008 
19:39:28.417499 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01dd12fc5c70e41059788f1a3f6f5b3b5b46ac99c72fb51d6f75957ae52a9a67" Oct 08 19:39:28 crc kubenswrapper[4750]: I1008 19:39:28.417576 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rd5l8" Oct 08 19:39:28 crc kubenswrapper[4750]: I1008 19:39:28.525100 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-rd5l8"] Oct 08 19:39:28 crc kubenswrapper[4750]: I1008 19:39:28.531031 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-rd5l8"] Oct 08 19:39:28 crc kubenswrapper[4750]: I1008 19:39:28.594039 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-fv7rm"] Oct 08 19:39:28 crc kubenswrapper[4750]: E1008 19:39:28.594594 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfee363b-c0e2-4747-b197-fff595e23653" containerName="keystone-bootstrap" Oct 08 19:39:28 crc kubenswrapper[4750]: I1008 19:39:28.594618 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfee363b-c0e2-4747-b197-fff595e23653" containerName="keystone-bootstrap" Oct 08 19:39:28 crc kubenswrapper[4750]: I1008 19:39:28.594877 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfee363b-c0e2-4747-b197-fff595e23653" containerName="keystone-bootstrap" Oct 08 19:39:28 crc kubenswrapper[4750]: I1008 19:39:28.595754 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-fv7rm" Oct 08 19:39:28 crc kubenswrapper[4750]: I1008 19:39:28.598097 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 08 19:39:28 crc kubenswrapper[4750]: I1008 19:39:28.598695 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 08 19:39:28 crc kubenswrapper[4750]: I1008 19:39:28.602023 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 08 19:39:28 crc kubenswrapper[4750]: I1008 19:39:28.602199 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-lvddj" Oct 08 19:39:28 crc kubenswrapper[4750]: I1008 19:39:28.606186 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-fv7rm"] Oct 08 19:39:28 crc kubenswrapper[4750]: I1008 19:39:28.620342 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/347b7991-7079-420d-a1a8-1506d1a3ff01-credential-keys\") pod \"keystone-bootstrap-fv7rm\" (UID: \"347b7991-7079-420d-a1a8-1506d1a3ff01\") " pod="openstack/keystone-bootstrap-fv7rm" Oct 08 19:39:28 crc kubenswrapper[4750]: I1008 19:39:28.620403 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/347b7991-7079-420d-a1a8-1506d1a3ff01-combined-ca-bundle\") pod \"keystone-bootstrap-fv7rm\" (UID: \"347b7991-7079-420d-a1a8-1506d1a3ff01\") " pod="openstack/keystone-bootstrap-fv7rm" Oct 08 19:39:28 crc kubenswrapper[4750]: I1008 19:39:28.620478 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/347b7991-7079-420d-a1a8-1506d1a3ff01-config-data\") pod \"keystone-bootstrap-fv7rm\" (UID: 
\"347b7991-7079-420d-a1a8-1506d1a3ff01\") " pod="openstack/keystone-bootstrap-fv7rm" Oct 08 19:39:28 crc kubenswrapper[4750]: I1008 19:39:28.620573 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7ldf\" (UniqueName: \"kubernetes.io/projected/347b7991-7079-420d-a1a8-1506d1a3ff01-kube-api-access-j7ldf\") pod \"keystone-bootstrap-fv7rm\" (UID: \"347b7991-7079-420d-a1a8-1506d1a3ff01\") " pod="openstack/keystone-bootstrap-fv7rm" Oct 08 19:39:28 crc kubenswrapper[4750]: I1008 19:39:28.620615 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/347b7991-7079-420d-a1a8-1506d1a3ff01-fernet-keys\") pod \"keystone-bootstrap-fv7rm\" (UID: \"347b7991-7079-420d-a1a8-1506d1a3ff01\") " pod="openstack/keystone-bootstrap-fv7rm" Oct 08 19:39:28 crc kubenswrapper[4750]: I1008 19:39:28.620664 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/347b7991-7079-420d-a1a8-1506d1a3ff01-scripts\") pod \"keystone-bootstrap-fv7rm\" (UID: \"347b7991-7079-420d-a1a8-1506d1a3ff01\") " pod="openstack/keystone-bootstrap-fv7rm" Oct 08 19:39:28 crc kubenswrapper[4750]: I1008 19:39:28.722953 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/347b7991-7079-420d-a1a8-1506d1a3ff01-credential-keys\") pod \"keystone-bootstrap-fv7rm\" (UID: \"347b7991-7079-420d-a1a8-1506d1a3ff01\") " pod="openstack/keystone-bootstrap-fv7rm" Oct 08 19:39:28 crc kubenswrapper[4750]: I1008 19:39:28.723187 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/347b7991-7079-420d-a1a8-1506d1a3ff01-combined-ca-bundle\") pod \"keystone-bootstrap-fv7rm\" (UID: \"347b7991-7079-420d-a1a8-1506d1a3ff01\") " 
pod="openstack/keystone-bootstrap-fv7rm" Oct 08 19:39:28 crc kubenswrapper[4750]: I1008 19:39:28.723318 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/347b7991-7079-420d-a1a8-1506d1a3ff01-config-data\") pod \"keystone-bootstrap-fv7rm\" (UID: \"347b7991-7079-420d-a1a8-1506d1a3ff01\") " pod="openstack/keystone-bootstrap-fv7rm" Oct 08 19:39:28 crc kubenswrapper[4750]: I1008 19:39:28.723714 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7ldf\" (UniqueName: \"kubernetes.io/projected/347b7991-7079-420d-a1a8-1506d1a3ff01-kube-api-access-j7ldf\") pod \"keystone-bootstrap-fv7rm\" (UID: \"347b7991-7079-420d-a1a8-1506d1a3ff01\") " pod="openstack/keystone-bootstrap-fv7rm" Oct 08 19:39:28 crc kubenswrapper[4750]: I1008 19:39:28.723787 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/347b7991-7079-420d-a1a8-1506d1a3ff01-fernet-keys\") pod \"keystone-bootstrap-fv7rm\" (UID: \"347b7991-7079-420d-a1a8-1506d1a3ff01\") " pod="openstack/keystone-bootstrap-fv7rm" Oct 08 19:39:28 crc kubenswrapper[4750]: I1008 19:39:28.724507 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/347b7991-7079-420d-a1a8-1506d1a3ff01-scripts\") pod \"keystone-bootstrap-fv7rm\" (UID: \"347b7991-7079-420d-a1a8-1506d1a3ff01\") " pod="openstack/keystone-bootstrap-fv7rm" Oct 08 19:39:28 crc kubenswrapper[4750]: I1008 19:39:28.728490 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/347b7991-7079-420d-a1a8-1506d1a3ff01-config-data\") pod \"keystone-bootstrap-fv7rm\" (UID: \"347b7991-7079-420d-a1a8-1506d1a3ff01\") " pod="openstack/keystone-bootstrap-fv7rm" Oct 08 19:39:28 crc kubenswrapper[4750]: I1008 19:39:28.729184 4750 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/347b7991-7079-420d-a1a8-1506d1a3ff01-combined-ca-bundle\") pod \"keystone-bootstrap-fv7rm\" (UID: \"347b7991-7079-420d-a1a8-1506d1a3ff01\") " pod="openstack/keystone-bootstrap-fv7rm" Oct 08 19:39:28 crc kubenswrapper[4750]: I1008 19:39:28.737393 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/347b7991-7079-420d-a1a8-1506d1a3ff01-fernet-keys\") pod \"keystone-bootstrap-fv7rm\" (UID: \"347b7991-7079-420d-a1a8-1506d1a3ff01\") " pod="openstack/keystone-bootstrap-fv7rm" Oct 08 19:39:28 crc kubenswrapper[4750]: I1008 19:39:28.737419 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/347b7991-7079-420d-a1a8-1506d1a3ff01-scripts\") pod \"keystone-bootstrap-fv7rm\" (UID: \"347b7991-7079-420d-a1a8-1506d1a3ff01\") " pod="openstack/keystone-bootstrap-fv7rm" Oct 08 19:39:28 crc kubenswrapper[4750]: I1008 19:39:28.737499 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/347b7991-7079-420d-a1a8-1506d1a3ff01-credential-keys\") pod \"keystone-bootstrap-fv7rm\" (UID: \"347b7991-7079-420d-a1a8-1506d1a3ff01\") " pod="openstack/keystone-bootstrap-fv7rm" Oct 08 19:39:28 crc kubenswrapper[4750]: I1008 19:39:28.748480 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7ldf\" (UniqueName: \"kubernetes.io/projected/347b7991-7079-420d-a1a8-1506d1a3ff01-kube-api-access-j7ldf\") pod \"keystone-bootstrap-fv7rm\" (UID: \"347b7991-7079-420d-a1a8-1506d1a3ff01\") " pod="openstack/keystone-bootstrap-fv7rm" Oct 08 19:39:28 crc kubenswrapper[4750]: I1008 19:39:28.758950 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfee363b-c0e2-4747-b197-fff595e23653" 
path="/var/lib/kubelet/pods/dfee363b-c0e2-4747-b197-fff595e23653/volumes" Oct 08 19:39:28 crc kubenswrapper[4750]: I1008 19:39:28.933905 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fv7rm" Oct 08 19:39:29 crc kubenswrapper[4750]: I1008 19:39:29.465796 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-fv7rm"] Oct 08 19:39:30 crc kubenswrapper[4750]: I1008 19:39:30.442082 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fv7rm" event={"ID":"347b7991-7079-420d-a1a8-1506d1a3ff01","Type":"ContainerStarted","Data":"5eaa609b6cd6658b2111b2d08a4af90b0ad1f2769220807377eaddc6bbb500f3"} Oct 08 19:39:30 crc kubenswrapper[4750]: I1008 19:39:30.442136 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fv7rm" event={"ID":"347b7991-7079-420d-a1a8-1506d1a3ff01","Type":"ContainerStarted","Data":"b11e476e2f9dbbfad1736316be2158138cbdd458662575d291e0d828dbe48bb6"} Oct 08 19:39:30 crc kubenswrapper[4750]: I1008 19:39:30.466954 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-fv7rm" podStartSLOduration=2.466920876 podStartE2EDuration="2.466920876s" podCreationTimestamp="2025-10-08 19:39:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:39:30.463634835 +0000 UTC m=+5326.376605858" watchObservedRunningTime="2025-10-08 19:39:30.466920876 +0000 UTC m=+5326.379891959" Oct 08 19:39:31 crc kubenswrapper[4750]: I1008 19:39:31.982858 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77f55cbcff-gwv8t" Oct 08 19:39:32 crc kubenswrapper[4750]: I1008 19:39:32.074421 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ffd6fd84f-p46d7"] Oct 08 19:39:32 crc kubenswrapper[4750]: I1008 
19:39:32.074744 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ffd6fd84f-p46d7" podUID="a933fd08-d47b-43e6-9bc3-2ad4a439d8d5" containerName="dnsmasq-dns" containerID="cri-o://b0d83c5b5697d86f822999beb851e2fd21d142caac9b119b2127c87dccbab396" gracePeriod=10 Oct 08 19:39:32 crc kubenswrapper[4750]: I1008 19:39:32.465204 4750 generic.go:334] "Generic (PLEG): container finished" podID="a933fd08-d47b-43e6-9bc3-2ad4a439d8d5" containerID="b0d83c5b5697d86f822999beb851e2fd21d142caac9b119b2127c87dccbab396" exitCode=0 Oct 08 19:39:32 crc kubenswrapper[4750]: I1008 19:39:32.465314 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ffd6fd84f-p46d7" event={"ID":"a933fd08-d47b-43e6-9bc3-2ad4a439d8d5","Type":"ContainerDied","Data":"b0d83c5b5697d86f822999beb851e2fd21d142caac9b119b2127c87dccbab396"} Oct 08 19:39:32 crc kubenswrapper[4750]: I1008 19:39:32.649135 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ffd6fd84f-p46d7" Oct 08 19:39:32 crc kubenswrapper[4750]: I1008 19:39:32.819428 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a933fd08-d47b-43e6-9bc3-2ad4a439d8d5-ovsdbserver-nb\") pod \"a933fd08-d47b-43e6-9bc3-2ad4a439d8d5\" (UID: \"a933fd08-d47b-43e6-9bc3-2ad4a439d8d5\") " Oct 08 19:39:32 crc kubenswrapper[4750]: I1008 19:39:32.819506 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a933fd08-d47b-43e6-9bc3-2ad4a439d8d5-ovsdbserver-sb\") pod \"a933fd08-d47b-43e6-9bc3-2ad4a439d8d5\" (UID: \"a933fd08-d47b-43e6-9bc3-2ad4a439d8d5\") " Oct 08 19:39:32 crc kubenswrapper[4750]: I1008 19:39:32.819794 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzxxk\" (UniqueName: 
\"kubernetes.io/projected/a933fd08-d47b-43e6-9bc3-2ad4a439d8d5-kube-api-access-fzxxk\") pod \"a933fd08-d47b-43e6-9bc3-2ad4a439d8d5\" (UID: \"a933fd08-d47b-43e6-9bc3-2ad4a439d8d5\") " Oct 08 19:39:32 crc kubenswrapper[4750]: I1008 19:39:32.819924 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a933fd08-d47b-43e6-9bc3-2ad4a439d8d5-dns-svc\") pod \"a933fd08-d47b-43e6-9bc3-2ad4a439d8d5\" (UID: \"a933fd08-d47b-43e6-9bc3-2ad4a439d8d5\") " Oct 08 19:39:32 crc kubenswrapper[4750]: I1008 19:39:32.819969 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a933fd08-d47b-43e6-9bc3-2ad4a439d8d5-config\") pod \"a933fd08-d47b-43e6-9bc3-2ad4a439d8d5\" (UID: \"a933fd08-d47b-43e6-9bc3-2ad4a439d8d5\") " Oct 08 19:39:32 crc kubenswrapper[4750]: I1008 19:39:32.829022 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a933fd08-d47b-43e6-9bc3-2ad4a439d8d5-kube-api-access-fzxxk" (OuterVolumeSpecName: "kube-api-access-fzxxk") pod "a933fd08-d47b-43e6-9bc3-2ad4a439d8d5" (UID: "a933fd08-d47b-43e6-9bc3-2ad4a439d8d5"). InnerVolumeSpecName "kube-api-access-fzxxk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:39:32 crc kubenswrapper[4750]: I1008 19:39:32.865772 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a933fd08-d47b-43e6-9bc3-2ad4a439d8d5-config" (OuterVolumeSpecName: "config") pod "a933fd08-d47b-43e6-9bc3-2ad4a439d8d5" (UID: "a933fd08-d47b-43e6-9bc3-2ad4a439d8d5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:39:32 crc kubenswrapper[4750]: I1008 19:39:32.865811 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a933fd08-d47b-43e6-9bc3-2ad4a439d8d5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a933fd08-d47b-43e6-9bc3-2ad4a439d8d5" (UID: "a933fd08-d47b-43e6-9bc3-2ad4a439d8d5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:39:32 crc kubenswrapper[4750]: I1008 19:39:32.867007 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a933fd08-d47b-43e6-9bc3-2ad4a439d8d5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a933fd08-d47b-43e6-9bc3-2ad4a439d8d5" (UID: "a933fd08-d47b-43e6-9bc3-2ad4a439d8d5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:39:32 crc kubenswrapper[4750]: I1008 19:39:32.895962 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a933fd08-d47b-43e6-9bc3-2ad4a439d8d5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a933fd08-d47b-43e6-9bc3-2ad4a439d8d5" (UID: "a933fd08-d47b-43e6-9bc3-2ad4a439d8d5"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:39:32 crc kubenswrapper[4750]: I1008 19:39:32.922974 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzxxk\" (UniqueName: \"kubernetes.io/projected/a933fd08-d47b-43e6-9bc3-2ad4a439d8d5-kube-api-access-fzxxk\") on node \"crc\" DevicePath \"\"" Oct 08 19:39:32 crc kubenswrapper[4750]: I1008 19:39:32.927018 4750 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a933fd08-d47b-43e6-9bc3-2ad4a439d8d5-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 19:39:32 crc kubenswrapper[4750]: I1008 19:39:32.927100 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a933fd08-d47b-43e6-9bc3-2ad4a439d8d5-config\") on node \"crc\" DevicePath \"\"" Oct 08 19:39:32 crc kubenswrapper[4750]: I1008 19:39:32.927127 4750 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a933fd08-d47b-43e6-9bc3-2ad4a439d8d5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 19:39:32 crc kubenswrapper[4750]: I1008 19:39:32.927148 4750 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a933fd08-d47b-43e6-9bc3-2ad4a439d8d5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 19:39:33 crc kubenswrapper[4750]: I1008 19:39:33.483161 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ffd6fd84f-p46d7" Oct 08 19:39:33 crc kubenswrapper[4750]: I1008 19:39:33.483469 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ffd6fd84f-p46d7" event={"ID":"a933fd08-d47b-43e6-9bc3-2ad4a439d8d5","Type":"ContainerDied","Data":"bd8c10d7a9e4ee40da3e49f3f2f4f7a90068d96257e6285c67152fec1e3d7217"} Oct 08 19:39:33 crc kubenswrapper[4750]: I1008 19:39:33.483591 4750 scope.go:117] "RemoveContainer" containerID="b0d83c5b5697d86f822999beb851e2fd21d142caac9b119b2127c87dccbab396" Oct 08 19:39:33 crc kubenswrapper[4750]: I1008 19:39:33.486058 4750 generic.go:334] "Generic (PLEG): container finished" podID="347b7991-7079-420d-a1a8-1506d1a3ff01" containerID="5eaa609b6cd6658b2111b2d08a4af90b0ad1f2769220807377eaddc6bbb500f3" exitCode=0 Oct 08 19:39:33 crc kubenswrapper[4750]: I1008 19:39:33.486124 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fv7rm" event={"ID":"347b7991-7079-420d-a1a8-1506d1a3ff01","Type":"ContainerDied","Data":"5eaa609b6cd6658b2111b2d08a4af90b0ad1f2769220807377eaddc6bbb500f3"} Oct 08 19:39:33 crc kubenswrapper[4750]: I1008 19:39:33.517527 4750 scope.go:117] "RemoveContainer" containerID="5776ffa4c79ad777ceb9794893e3563e5fa4ed76a5a88fd791c490699dd82f89" Oct 08 19:39:33 crc kubenswrapper[4750]: I1008 19:39:33.542934 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ffd6fd84f-p46d7"] Oct 08 19:39:33 crc kubenswrapper[4750]: I1008 19:39:33.554140 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ffd6fd84f-p46d7"] Oct 08 19:39:34 crc kubenswrapper[4750]: I1008 19:39:34.757724 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a933fd08-d47b-43e6-9bc3-2ad4a439d8d5" path="/var/lib/kubelet/pods/a933fd08-d47b-43e6-9bc3-2ad4a439d8d5/volumes" Oct 08 19:39:34 crc kubenswrapper[4750]: I1008 19:39:34.912985 4750 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/keystone-bootstrap-fv7rm" Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.070147 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/347b7991-7079-420d-a1a8-1506d1a3ff01-fernet-keys\") pod \"347b7991-7079-420d-a1a8-1506d1a3ff01\" (UID: \"347b7991-7079-420d-a1a8-1506d1a3ff01\") " Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.070279 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/347b7991-7079-420d-a1a8-1506d1a3ff01-combined-ca-bundle\") pod \"347b7991-7079-420d-a1a8-1506d1a3ff01\" (UID: \"347b7991-7079-420d-a1a8-1506d1a3ff01\") " Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.070365 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/347b7991-7079-420d-a1a8-1506d1a3ff01-config-data\") pod \"347b7991-7079-420d-a1a8-1506d1a3ff01\" (UID: \"347b7991-7079-420d-a1a8-1506d1a3ff01\") " Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.070495 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/347b7991-7079-420d-a1a8-1506d1a3ff01-credential-keys\") pod \"347b7991-7079-420d-a1a8-1506d1a3ff01\" (UID: \"347b7991-7079-420d-a1a8-1506d1a3ff01\") " Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.070534 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/347b7991-7079-420d-a1a8-1506d1a3ff01-scripts\") pod \"347b7991-7079-420d-a1a8-1506d1a3ff01\" (UID: \"347b7991-7079-420d-a1a8-1506d1a3ff01\") " Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.070600 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7ldf\" (UniqueName: 
\"kubernetes.io/projected/347b7991-7079-420d-a1a8-1506d1a3ff01-kube-api-access-j7ldf\") pod \"347b7991-7079-420d-a1a8-1506d1a3ff01\" (UID: \"347b7991-7079-420d-a1a8-1506d1a3ff01\") " Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.078183 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/347b7991-7079-420d-a1a8-1506d1a3ff01-scripts" (OuterVolumeSpecName: "scripts") pod "347b7991-7079-420d-a1a8-1506d1a3ff01" (UID: "347b7991-7079-420d-a1a8-1506d1a3ff01"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.078501 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/347b7991-7079-420d-a1a8-1506d1a3ff01-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "347b7991-7079-420d-a1a8-1506d1a3ff01" (UID: "347b7991-7079-420d-a1a8-1506d1a3ff01"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.078912 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/347b7991-7079-420d-a1a8-1506d1a3ff01-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "347b7991-7079-420d-a1a8-1506d1a3ff01" (UID: "347b7991-7079-420d-a1a8-1506d1a3ff01"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.079017 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/347b7991-7079-420d-a1a8-1506d1a3ff01-kube-api-access-j7ldf" (OuterVolumeSpecName: "kube-api-access-j7ldf") pod "347b7991-7079-420d-a1a8-1506d1a3ff01" (UID: "347b7991-7079-420d-a1a8-1506d1a3ff01"). InnerVolumeSpecName "kube-api-access-j7ldf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.105162 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/347b7991-7079-420d-a1a8-1506d1a3ff01-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "347b7991-7079-420d-a1a8-1506d1a3ff01" (UID: "347b7991-7079-420d-a1a8-1506d1a3ff01"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.114084 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/347b7991-7079-420d-a1a8-1506d1a3ff01-config-data" (OuterVolumeSpecName: "config-data") pod "347b7991-7079-420d-a1a8-1506d1a3ff01" (UID: "347b7991-7079-420d-a1a8-1506d1a3ff01"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.173519 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7ldf\" (UniqueName: \"kubernetes.io/projected/347b7991-7079-420d-a1a8-1506d1a3ff01-kube-api-access-j7ldf\") on node \"crc\" DevicePath \"\"" Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.173595 4750 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/347b7991-7079-420d-a1a8-1506d1a3ff01-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.173610 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/347b7991-7079-420d-a1a8-1506d1a3ff01-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.173627 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/347b7991-7079-420d-a1a8-1506d1a3ff01-config-data\") on node \"crc\" DevicePath 
\"\"" Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.173640 4750 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/347b7991-7079-420d-a1a8-1506d1a3ff01-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.173653 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/347b7991-7079-420d-a1a8-1506d1a3ff01-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.523139 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fv7rm" event={"ID":"347b7991-7079-420d-a1a8-1506d1a3ff01","Type":"ContainerDied","Data":"b11e476e2f9dbbfad1736316be2158138cbdd458662575d291e0d828dbe48bb6"} Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.523187 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b11e476e2f9dbbfad1736316be2158138cbdd458662575d291e0d828dbe48bb6" Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.523300 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-fv7rm" Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.641886 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7747bfd995-l7pz4"] Oct 08 19:39:35 crc kubenswrapper[4750]: E1008 19:39:35.642437 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="347b7991-7079-420d-a1a8-1506d1a3ff01" containerName="keystone-bootstrap" Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.642462 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="347b7991-7079-420d-a1a8-1506d1a3ff01" containerName="keystone-bootstrap" Oct 08 19:39:35 crc kubenswrapper[4750]: E1008 19:39:35.642492 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a933fd08-d47b-43e6-9bc3-2ad4a439d8d5" containerName="dnsmasq-dns" Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.642504 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="a933fd08-d47b-43e6-9bc3-2ad4a439d8d5" containerName="dnsmasq-dns" Oct 08 19:39:35 crc kubenswrapper[4750]: E1008 19:39:35.642527 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a933fd08-d47b-43e6-9bc3-2ad4a439d8d5" containerName="init" Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.642535 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="a933fd08-d47b-43e6-9bc3-2ad4a439d8d5" containerName="init" Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.642823 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="a933fd08-d47b-43e6-9bc3-2ad4a439d8d5" containerName="dnsmasq-dns" Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.642853 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="347b7991-7079-420d-a1a8-1506d1a3ff01" containerName="keystone-bootstrap" Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.643726 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7747bfd995-l7pz4" Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.647990 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.648107 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.653418 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-lvddj" Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.654212 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.664260 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7747bfd995-l7pz4"] Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.695048 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa5ad06d-f897-4d65-b17c-c2affae142d6-combined-ca-bundle\") pod \"keystone-7747bfd995-l7pz4\" (UID: \"aa5ad06d-f897-4d65-b17c-c2affae142d6\") " pod="openstack/keystone-7747bfd995-l7pz4" Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.695457 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa5ad06d-f897-4d65-b17c-c2affae142d6-config-data\") pod \"keystone-7747bfd995-l7pz4\" (UID: \"aa5ad06d-f897-4d65-b17c-c2affae142d6\") " pod="openstack/keystone-7747bfd995-l7pz4" Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.695618 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8zl4\" (UniqueName: \"kubernetes.io/projected/aa5ad06d-f897-4d65-b17c-c2affae142d6-kube-api-access-c8zl4\") pod 
\"keystone-7747bfd995-l7pz4\" (UID: \"aa5ad06d-f897-4d65-b17c-c2affae142d6\") " pod="openstack/keystone-7747bfd995-l7pz4" Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.695656 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/aa5ad06d-f897-4d65-b17c-c2affae142d6-credential-keys\") pod \"keystone-7747bfd995-l7pz4\" (UID: \"aa5ad06d-f897-4d65-b17c-c2affae142d6\") " pod="openstack/keystone-7747bfd995-l7pz4" Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.695957 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/aa5ad06d-f897-4d65-b17c-c2affae142d6-fernet-keys\") pod \"keystone-7747bfd995-l7pz4\" (UID: \"aa5ad06d-f897-4d65-b17c-c2affae142d6\") " pod="openstack/keystone-7747bfd995-l7pz4" Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.695993 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa5ad06d-f897-4d65-b17c-c2affae142d6-scripts\") pod \"keystone-7747bfd995-l7pz4\" (UID: \"aa5ad06d-f897-4d65-b17c-c2affae142d6\") " pod="openstack/keystone-7747bfd995-l7pz4" Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.797876 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/aa5ad06d-f897-4d65-b17c-c2affae142d6-fernet-keys\") pod \"keystone-7747bfd995-l7pz4\" (UID: \"aa5ad06d-f897-4d65-b17c-c2affae142d6\") " pod="openstack/keystone-7747bfd995-l7pz4" Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.797935 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa5ad06d-f897-4d65-b17c-c2affae142d6-scripts\") pod \"keystone-7747bfd995-l7pz4\" (UID: \"aa5ad06d-f897-4d65-b17c-c2affae142d6\") " 
pod="openstack/keystone-7747bfd995-l7pz4" Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.797981 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa5ad06d-f897-4d65-b17c-c2affae142d6-combined-ca-bundle\") pod \"keystone-7747bfd995-l7pz4\" (UID: \"aa5ad06d-f897-4d65-b17c-c2affae142d6\") " pod="openstack/keystone-7747bfd995-l7pz4" Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.798028 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa5ad06d-f897-4d65-b17c-c2affae142d6-config-data\") pod \"keystone-7747bfd995-l7pz4\" (UID: \"aa5ad06d-f897-4d65-b17c-c2affae142d6\") " pod="openstack/keystone-7747bfd995-l7pz4" Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.798063 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/aa5ad06d-f897-4d65-b17c-c2affae142d6-credential-keys\") pod \"keystone-7747bfd995-l7pz4\" (UID: \"aa5ad06d-f897-4d65-b17c-c2affae142d6\") " pod="openstack/keystone-7747bfd995-l7pz4" Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.798081 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8zl4\" (UniqueName: \"kubernetes.io/projected/aa5ad06d-f897-4d65-b17c-c2affae142d6-kube-api-access-c8zl4\") pod \"keystone-7747bfd995-l7pz4\" (UID: \"aa5ad06d-f897-4d65-b17c-c2affae142d6\") " pod="openstack/keystone-7747bfd995-l7pz4" Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.802310 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa5ad06d-f897-4d65-b17c-c2affae142d6-scripts\") pod \"keystone-7747bfd995-l7pz4\" (UID: \"aa5ad06d-f897-4d65-b17c-c2affae142d6\") " pod="openstack/keystone-7747bfd995-l7pz4" Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 
19:39:35.804087 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/aa5ad06d-f897-4d65-b17c-c2affae142d6-fernet-keys\") pod \"keystone-7747bfd995-l7pz4\" (UID: \"aa5ad06d-f897-4d65-b17c-c2affae142d6\") " pod="openstack/keystone-7747bfd995-l7pz4" Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.804328 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa5ad06d-f897-4d65-b17c-c2affae142d6-combined-ca-bundle\") pod \"keystone-7747bfd995-l7pz4\" (UID: \"aa5ad06d-f897-4d65-b17c-c2affae142d6\") " pod="openstack/keystone-7747bfd995-l7pz4" Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.805174 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa5ad06d-f897-4d65-b17c-c2affae142d6-config-data\") pod \"keystone-7747bfd995-l7pz4\" (UID: \"aa5ad06d-f897-4d65-b17c-c2affae142d6\") " pod="openstack/keystone-7747bfd995-l7pz4" Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.812746 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/aa5ad06d-f897-4d65-b17c-c2affae142d6-credential-keys\") pod \"keystone-7747bfd995-l7pz4\" (UID: \"aa5ad06d-f897-4d65-b17c-c2affae142d6\") " pod="openstack/keystone-7747bfd995-l7pz4" Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.818102 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8zl4\" (UniqueName: \"kubernetes.io/projected/aa5ad06d-f897-4d65-b17c-c2affae142d6-kube-api-access-c8zl4\") pod \"keystone-7747bfd995-l7pz4\" (UID: \"aa5ad06d-f897-4d65-b17c-c2affae142d6\") " pod="openstack/keystone-7747bfd995-l7pz4" Oct 08 19:39:35 crc kubenswrapper[4750]: I1008 19:39:35.969064 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7747bfd995-l7pz4" Oct 08 19:39:36 crc kubenswrapper[4750]: I1008 19:39:36.252410 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7747bfd995-l7pz4"] Oct 08 19:39:36 crc kubenswrapper[4750]: W1008 19:39:36.260861 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa5ad06d_f897_4d65_b17c_c2affae142d6.slice/crio-37a0d912ef56adb498938036efe9cdbe2ce56222c606201bdadc8e3f37cf1d87 WatchSource:0}: Error finding container 37a0d912ef56adb498938036efe9cdbe2ce56222c606201bdadc8e3f37cf1d87: Status 404 returned error can't find the container with id 37a0d912ef56adb498938036efe9cdbe2ce56222c606201bdadc8e3f37cf1d87 Oct 08 19:39:36 crc kubenswrapper[4750]: I1008 19:39:36.535715 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7747bfd995-l7pz4" event={"ID":"aa5ad06d-f897-4d65-b17c-c2affae142d6","Type":"ContainerStarted","Data":"5cc39e3dd78ba238908dcf5fd729dca8ed25527cb2995238f2430775148b0772"} Oct 08 19:39:36 crc kubenswrapper[4750]: I1008 19:39:36.536253 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7747bfd995-l7pz4" event={"ID":"aa5ad06d-f897-4d65-b17c-c2affae142d6","Type":"ContainerStarted","Data":"37a0d912ef56adb498938036efe9cdbe2ce56222c606201bdadc8e3f37cf1d87"} Oct 08 19:39:36 crc kubenswrapper[4750]: I1008 19:39:36.536294 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7747bfd995-l7pz4" Oct 08 19:39:39 crc kubenswrapper[4750]: I1008 19:39:39.736210 4750 scope.go:117] "RemoveContainer" containerID="42c2a7a6ce3026c9ad8c8eff781fbf4cba179ca631dc8f4fcdd0691647588e7a" Oct 08 19:39:39 crc kubenswrapper[4750]: E1008 19:39:39.737070 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:39:52 crc kubenswrapper[4750]: I1008 19:39:52.734188 4750 scope.go:117] "RemoveContainer" containerID="42c2a7a6ce3026c9ad8c8eff781fbf4cba179ca631dc8f4fcdd0691647588e7a" Oct 08 19:39:52 crc kubenswrapper[4750]: E1008 19:39:52.735458 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:40:07 crc kubenswrapper[4750]: I1008 19:40:07.483829 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7747bfd995-l7pz4" Oct 08 19:40:07 crc kubenswrapper[4750]: I1008 19:40:07.512501 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7747bfd995-l7pz4" podStartSLOduration=32.512474472 podStartE2EDuration="32.512474472s" podCreationTimestamp="2025-10-08 19:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:39:36.563461281 +0000 UTC m=+5332.476432334" watchObservedRunningTime="2025-10-08 19:40:07.512474472 +0000 UTC m=+5363.425445485" Oct 08 19:40:07 crc kubenswrapper[4750]: I1008 19:40:07.734462 4750 scope.go:117] "RemoveContainer" containerID="42c2a7a6ce3026c9ad8c8eff781fbf4cba179ca631dc8f4fcdd0691647588e7a" Oct 08 19:40:07 crc kubenswrapper[4750]: E1008 19:40:07.734861 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:40:11 crc kubenswrapper[4750]: I1008 19:40:11.079144 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 08 19:40:11 crc kubenswrapper[4750]: I1008 19:40:11.081281 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 08 19:40:11 crc kubenswrapper[4750]: I1008 19:40:11.085789 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-zcwx5" Oct 08 19:40:11 crc kubenswrapper[4750]: I1008 19:40:11.086368 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 08 19:40:11 crc kubenswrapper[4750]: I1008 19:40:11.086895 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 08 19:40:11 crc kubenswrapper[4750]: I1008 19:40:11.092099 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 08 19:40:11 crc kubenswrapper[4750]: I1008 19:40:11.100194 4750 status_manager.go:875] "Failed to update status for pod" pod="openstack/openstackclient" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cf30935-3ddf-48cd-9b68-54ea01864338\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T19:40:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T19:40:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T19:40:11Z\\\",\\\"message\\\":\\\"containers with unready status: [openstackclient]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T19:40:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[openstackclient]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/podified-antelope-centos9/openstack-openstackclient@sha256:b8bff6857fec93c3c1521f1a8c23de21bcb86fc0f960972e81f6c3f95d4185be\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"openstackclient\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/home/cloud-admin/.config/openstack/clouds.yaml\\\",\\\"name\\\":\\\"openstack-config\\\"},{\\\"mountPath\\\":\\\"/home/cloud-admin/.config/openstack/secure.yaml\\\",\\\"name\\\":\\\"openstack-config-secret\\\"},{\\\"mountPath\\\":\\\"/home/cloud-admin/cloudrc\\\",\\\"name\\\":\\\"openstack-config-secret\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nqsq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T19:40:11Z\\\"}}\" for pod \"openstack\"/\"openstackclient\": pods \"openstackclient\" not found" Oct 08 19:40:11 crc kubenswrapper[4750]: I1008 19:40:11.124613 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 08 19:40:11 crc kubenswrapper[4750]: E1008 19:40:11.125085 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-7nqsq openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[kube-api-access-7nqsq openstack-config openstack-config-secret]: context canceled" pod="openstack/openstackclient" podUID="7cf30935-3ddf-48cd-9b68-54ea01864338" Oct 08 19:40:11 crc kubenswrapper[4750]: I1008 19:40:11.137541 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/openstackclient"] Oct 08 19:40:11 crc kubenswrapper[4750]: I1008 19:40:11.153887 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 08 19:40:11 crc kubenswrapper[4750]: I1008 19:40:11.155736 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 08 19:40:11 crc kubenswrapper[4750]: I1008 19:40:11.169348 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 08 19:40:11 crc kubenswrapper[4750]: I1008 19:40:11.184643 4750 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="7cf30935-3ddf-48cd-9b68-54ea01864338" podUID="a62829b0-9993-4ccd-bfea-fc3c9fe0aaee" Oct 08 19:40:11 crc kubenswrapper[4750]: I1008 19:40:11.222105 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98bq8\" (UniqueName: \"kubernetes.io/projected/a62829b0-9993-4ccd-bfea-fc3c9fe0aaee-kube-api-access-98bq8\") pod \"openstackclient\" (UID: \"a62829b0-9993-4ccd-bfea-fc3c9fe0aaee\") " pod="openstack/openstackclient" Oct 08 19:40:11 crc kubenswrapper[4750]: I1008 19:40:11.222190 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a62829b0-9993-4ccd-bfea-fc3c9fe0aaee-openstack-config\") pod \"openstackclient\" (UID: \"a62829b0-9993-4ccd-bfea-fc3c9fe0aaee\") " pod="openstack/openstackclient" Oct 08 19:40:11 crc kubenswrapper[4750]: I1008 19:40:11.222230 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a62829b0-9993-4ccd-bfea-fc3c9fe0aaee-openstack-config-secret\") pod \"openstackclient\" (UID: \"a62829b0-9993-4ccd-bfea-fc3c9fe0aaee\") " pod="openstack/openstackclient" Oct 08 19:40:11 crc 
kubenswrapper[4750]: I1008 19:40:11.323940 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98bq8\" (UniqueName: \"kubernetes.io/projected/a62829b0-9993-4ccd-bfea-fc3c9fe0aaee-kube-api-access-98bq8\") pod \"openstackclient\" (UID: \"a62829b0-9993-4ccd-bfea-fc3c9fe0aaee\") " pod="openstack/openstackclient" Oct 08 19:40:11 crc kubenswrapper[4750]: I1008 19:40:11.324013 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a62829b0-9993-4ccd-bfea-fc3c9fe0aaee-openstack-config\") pod \"openstackclient\" (UID: \"a62829b0-9993-4ccd-bfea-fc3c9fe0aaee\") " pod="openstack/openstackclient" Oct 08 19:40:11 crc kubenswrapper[4750]: I1008 19:40:11.324044 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a62829b0-9993-4ccd-bfea-fc3c9fe0aaee-openstack-config-secret\") pod \"openstackclient\" (UID: \"a62829b0-9993-4ccd-bfea-fc3c9fe0aaee\") " pod="openstack/openstackclient" Oct 08 19:40:11 crc kubenswrapper[4750]: I1008 19:40:11.325101 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a62829b0-9993-4ccd-bfea-fc3c9fe0aaee-openstack-config\") pod \"openstackclient\" (UID: \"a62829b0-9993-4ccd-bfea-fc3c9fe0aaee\") " pod="openstack/openstackclient" Oct 08 19:40:11 crc kubenswrapper[4750]: I1008 19:40:11.331057 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a62829b0-9993-4ccd-bfea-fc3c9fe0aaee-openstack-config-secret\") pod \"openstackclient\" (UID: \"a62829b0-9993-4ccd-bfea-fc3c9fe0aaee\") " pod="openstack/openstackclient" Oct 08 19:40:11 crc kubenswrapper[4750]: I1008 19:40:11.346502 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98bq8\" 
(UniqueName: \"kubernetes.io/projected/a62829b0-9993-4ccd-bfea-fc3c9fe0aaee-kube-api-access-98bq8\") pod \"openstackclient\" (UID: \"a62829b0-9993-4ccd-bfea-fc3c9fe0aaee\") " pod="openstack/openstackclient" Oct 08 19:40:11 crc kubenswrapper[4750]: I1008 19:40:11.475477 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 08 19:40:11 crc kubenswrapper[4750]: I1008 19:40:11.928704 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 08 19:40:11 crc kubenswrapper[4750]: I1008 19:40:11.934036 4750 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="7cf30935-3ddf-48cd-9b68-54ea01864338" podUID="a62829b0-9993-4ccd-bfea-fc3c9fe0aaee" Oct 08 19:40:11 crc kubenswrapper[4750]: I1008 19:40:11.943697 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 08 19:40:11 crc kubenswrapper[4750]: I1008 19:40:11.947396 4750 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="7cf30935-3ddf-48cd-9b68-54ea01864338" podUID="a62829b0-9993-4ccd-bfea-fc3c9fe0aaee" Oct 08 19:40:11 crc kubenswrapper[4750]: I1008 19:40:11.965970 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 08 19:40:12 crc kubenswrapper[4750]: I1008 19:40:12.745195 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cf30935-3ddf-48cd-9b68-54ea01864338" path="/var/lib/kubelet/pods/7cf30935-3ddf-48cd-9b68-54ea01864338/volumes" Oct 08 19:40:12 crc kubenswrapper[4750]: I1008 19:40:12.939909 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 08 19:40:12 crc kubenswrapper[4750]: I1008 19:40:12.939912 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a62829b0-9993-4ccd-bfea-fc3c9fe0aaee","Type":"ContainerStarted","Data":"f2c3e83659946de46e30c1fcf132ef19fcf9fd026c254bd1a157559c4becf3a5"} Oct 08 19:40:12 crc kubenswrapper[4750]: I1008 19:40:12.940355 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a62829b0-9993-4ccd-bfea-fc3c9fe0aaee","Type":"ContainerStarted","Data":"4b20875f1aa1a5271cfaabc349e8954a1839fb5a83c2298667cb443da2f00412"} Oct 08 19:40:12 crc kubenswrapper[4750]: I1008 19:40:12.957878 4750 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="7cf30935-3ddf-48cd-9b68-54ea01864338" podUID="a62829b0-9993-4ccd-bfea-fc3c9fe0aaee" Oct 08 19:40:12 crc kubenswrapper[4750]: I1008 19:40:12.964806 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.96477758 podStartE2EDuration="1.96477758s" podCreationTimestamp="2025-10-08 19:40:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:40:12.953606422 +0000 UTC m=+5368.866577445" watchObservedRunningTime="2025-10-08 19:40:12.96477758 +0000 UTC m=+5368.877748593" Oct 08 19:40:19 crc kubenswrapper[4750]: I1008 19:40:19.734393 4750 scope.go:117] "RemoveContainer" containerID="42c2a7a6ce3026c9ad8c8eff781fbf4cba179ca631dc8f4fcdd0691647588e7a" Oct 08 19:40:19 crc kubenswrapper[4750]: E1008 19:40:19.735660 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:40:31 crc kubenswrapper[4750]: I1008 19:40:31.734244 4750 scope.go:117] "RemoveContainer" containerID="42c2a7a6ce3026c9ad8c8eff781fbf4cba179ca631dc8f4fcdd0691647588e7a" Oct 08 19:40:31 crc kubenswrapper[4750]: E1008 19:40:31.734999 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:40:46 crc kubenswrapper[4750]: I1008 19:40:46.735221 4750 scope.go:117] "RemoveContainer" containerID="42c2a7a6ce3026c9ad8c8eff781fbf4cba179ca631dc8f4fcdd0691647588e7a" Oct 08 19:40:46 crc kubenswrapper[4750]: E1008 19:40:46.736535 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:40:58 crc kubenswrapper[4750]: I1008 19:40:58.734695 4750 scope.go:117] "RemoveContainer" containerID="42c2a7a6ce3026c9ad8c8eff781fbf4cba179ca631dc8f4fcdd0691647588e7a" Oct 08 19:40:58 crc kubenswrapper[4750]: E1008 19:40:58.735479 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:41:12 crc kubenswrapper[4750]: I1008 19:41:12.735002 4750 scope.go:117] "RemoveContainer" containerID="42c2a7a6ce3026c9ad8c8eff781fbf4cba179ca631dc8f4fcdd0691647588e7a" Oct 08 19:41:13 crc kubenswrapper[4750]: I1008 19:41:13.559381 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerStarted","Data":"1499c5346a2a6057a03d90e92d9fde974f96055257fa4f56cf44bb72d0e5bf8e"} Oct 08 19:41:26 crc kubenswrapper[4750]: E1008 19:41:26.522305 4750 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.75:54566->38.102.83.75:34167: write tcp 38.102.83.75:54566->38.102.83.75:34167: write: broken pipe Oct 08 19:42:01 crc kubenswrapper[4750]: I1008 19:42:01.798901 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-gnlng"] Oct 08 19:42:01 crc kubenswrapper[4750]: I1008 19:42:01.801039 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-gnlng" Oct 08 19:42:01 crc kubenswrapper[4750]: I1008 19:42:01.821091 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-gnlng"] Oct 08 19:42:01 crc kubenswrapper[4750]: I1008 19:42:01.865239 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6hm5\" (UniqueName: \"kubernetes.io/projected/4bf32b68-0825-4468-b0b5-e4eb2eb7d10f-kube-api-access-h6hm5\") pod \"barbican-db-create-gnlng\" (UID: \"4bf32b68-0825-4468-b0b5-e4eb2eb7d10f\") " pod="openstack/barbican-db-create-gnlng" Oct 08 19:42:01 crc kubenswrapper[4750]: I1008 19:42:01.968075 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6hm5\" (UniqueName: \"kubernetes.io/projected/4bf32b68-0825-4468-b0b5-e4eb2eb7d10f-kube-api-access-h6hm5\") pod \"barbican-db-create-gnlng\" (UID: \"4bf32b68-0825-4468-b0b5-e4eb2eb7d10f\") " pod="openstack/barbican-db-create-gnlng" Oct 08 19:42:02 crc kubenswrapper[4750]: I1008 19:42:02.000022 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6hm5\" (UniqueName: \"kubernetes.io/projected/4bf32b68-0825-4468-b0b5-e4eb2eb7d10f-kube-api-access-h6hm5\") pod \"barbican-db-create-gnlng\" (UID: \"4bf32b68-0825-4468-b0b5-e4eb2eb7d10f\") " pod="openstack/barbican-db-create-gnlng" Oct 08 19:42:02 crc kubenswrapper[4750]: I1008 19:42:02.125234 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-gnlng" Oct 08 19:42:02 crc kubenswrapper[4750]: I1008 19:42:02.588341 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-gnlng"] Oct 08 19:42:03 crc kubenswrapper[4750]: I1008 19:42:03.093599 4750 generic.go:334] "Generic (PLEG): container finished" podID="4bf32b68-0825-4468-b0b5-e4eb2eb7d10f" containerID="d99b74b3c8d304c97e5b923cfa35a88af1e44b60c0a35db6166dec217de8ae30" exitCode=0 Oct 08 19:42:03 crc kubenswrapper[4750]: I1008 19:42:03.093757 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-gnlng" event={"ID":"4bf32b68-0825-4468-b0b5-e4eb2eb7d10f","Type":"ContainerDied","Data":"d99b74b3c8d304c97e5b923cfa35a88af1e44b60c0a35db6166dec217de8ae30"} Oct 08 19:42:03 crc kubenswrapper[4750]: I1008 19:42:03.094793 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-gnlng" event={"ID":"4bf32b68-0825-4468-b0b5-e4eb2eb7d10f","Type":"ContainerStarted","Data":"062dacc6f30a33332947884eeb94e643637b38a55db6ee7dcd3c6281be8f0734"} Oct 08 19:42:04 crc kubenswrapper[4750]: I1008 19:42:04.549359 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-gnlng" Oct 08 19:42:04 crc kubenswrapper[4750]: I1008 19:42:04.722320 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6hm5\" (UniqueName: \"kubernetes.io/projected/4bf32b68-0825-4468-b0b5-e4eb2eb7d10f-kube-api-access-h6hm5\") pod \"4bf32b68-0825-4468-b0b5-e4eb2eb7d10f\" (UID: \"4bf32b68-0825-4468-b0b5-e4eb2eb7d10f\") " Oct 08 19:42:04 crc kubenswrapper[4750]: I1008 19:42:04.733028 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bf32b68-0825-4468-b0b5-e4eb2eb7d10f-kube-api-access-h6hm5" (OuterVolumeSpecName: "kube-api-access-h6hm5") pod "4bf32b68-0825-4468-b0b5-e4eb2eb7d10f" (UID: "4bf32b68-0825-4468-b0b5-e4eb2eb7d10f"). InnerVolumeSpecName "kube-api-access-h6hm5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:42:04 crc kubenswrapper[4750]: I1008 19:42:04.824590 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6hm5\" (UniqueName: \"kubernetes.io/projected/4bf32b68-0825-4468-b0b5-e4eb2eb7d10f-kube-api-access-h6hm5\") on node \"crc\" DevicePath \"\"" Oct 08 19:42:05 crc kubenswrapper[4750]: I1008 19:42:05.126792 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-gnlng" event={"ID":"4bf32b68-0825-4468-b0b5-e4eb2eb7d10f","Type":"ContainerDied","Data":"062dacc6f30a33332947884eeb94e643637b38a55db6ee7dcd3c6281be8f0734"} Oct 08 19:42:05 crc kubenswrapper[4750]: I1008 19:42:05.127321 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="062dacc6f30a33332947884eeb94e643637b38a55db6ee7dcd3c6281be8f0734" Oct 08 19:42:05 crc kubenswrapper[4750]: I1008 19:42:05.126904 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-gnlng" Oct 08 19:42:11 crc kubenswrapper[4750]: I1008 19:42:11.811334 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-2123-account-create-n546k"] Oct 08 19:42:11 crc kubenswrapper[4750]: E1008 19:42:11.812665 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bf32b68-0825-4468-b0b5-e4eb2eb7d10f" containerName="mariadb-database-create" Oct 08 19:42:11 crc kubenswrapper[4750]: I1008 19:42:11.812684 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bf32b68-0825-4468-b0b5-e4eb2eb7d10f" containerName="mariadb-database-create" Oct 08 19:42:11 crc kubenswrapper[4750]: I1008 19:42:11.812943 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bf32b68-0825-4468-b0b5-e4eb2eb7d10f" containerName="mariadb-database-create" Oct 08 19:42:11 crc kubenswrapper[4750]: I1008 19:42:11.813736 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-2123-account-create-n546k" Oct 08 19:42:11 crc kubenswrapper[4750]: I1008 19:42:11.824214 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 08 19:42:11 crc kubenswrapper[4750]: I1008 19:42:11.833237 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-2123-account-create-n546k"] Oct 08 19:42:11 crc kubenswrapper[4750]: I1008 19:42:11.894043 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhfjl\" (UniqueName: \"kubernetes.io/projected/97ab5e96-40ba-4396-99bf-f77ecb9489f2-kube-api-access-vhfjl\") pod \"barbican-2123-account-create-n546k\" (UID: \"97ab5e96-40ba-4396-99bf-f77ecb9489f2\") " pod="openstack/barbican-2123-account-create-n546k" Oct 08 19:42:11 crc kubenswrapper[4750]: I1008 19:42:11.996708 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhfjl\" (UniqueName: 
\"kubernetes.io/projected/97ab5e96-40ba-4396-99bf-f77ecb9489f2-kube-api-access-vhfjl\") pod \"barbican-2123-account-create-n546k\" (UID: \"97ab5e96-40ba-4396-99bf-f77ecb9489f2\") " pod="openstack/barbican-2123-account-create-n546k" Oct 08 19:42:12 crc kubenswrapper[4750]: I1008 19:42:12.017195 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhfjl\" (UniqueName: \"kubernetes.io/projected/97ab5e96-40ba-4396-99bf-f77ecb9489f2-kube-api-access-vhfjl\") pod \"barbican-2123-account-create-n546k\" (UID: \"97ab5e96-40ba-4396-99bf-f77ecb9489f2\") " pod="openstack/barbican-2123-account-create-n546k" Oct 08 19:42:12 crc kubenswrapper[4750]: I1008 19:42:12.152494 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-2123-account-create-n546k" Oct 08 19:42:12 crc kubenswrapper[4750]: I1008 19:42:12.630826 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-2123-account-create-n546k"] Oct 08 19:42:13 crc kubenswrapper[4750]: I1008 19:42:13.221624 4750 generic.go:334] "Generic (PLEG): container finished" podID="97ab5e96-40ba-4396-99bf-f77ecb9489f2" containerID="345952453cb80c23c553ab767e9f173f3e6172cb77539869f6b5805ed5a53682" exitCode=0 Oct 08 19:42:13 crc kubenswrapper[4750]: I1008 19:42:13.221712 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2123-account-create-n546k" event={"ID":"97ab5e96-40ba-4396-99bf-f77ecb9489f2","Type":"ContainerDied","Data":"345952453cb80c23c553ab767e9f173f3e6172cb77539869f6b5805ed5a53682"} Oct 08 19:42:13 crc kubenswrapper[4750]: I1008 19:42:13.221767 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2123-account-create-n546k" event={"ID":"97ab5e96-40ba-4396-99bf-f77ecb9489f2","Type":"ContainerStarted","Data":"340d7f36fb51666e1f4d0dcb5d48f409aaec5e8bb9e0a9999f0050e6ca2eb370"} Oct 08 19:42:14 crc kubenswrapper[4750]: I1008 19:42:14.601525 4750 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/barbican-2123-account-create-n546k" Oct 08 19:42:14 crc kubenswrapper[4750]: I1008 19:42:14.669608 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhfjl\" (UniqueName: \"kubernetes.io/projected/97ab5e96-40ba-4396-99bf-f77ecb9489f2-kube-api-access-vhfjl\") pod \"97ab5e96-40ba-4396-99bf-f77ecb9489f2\" (UID: \"97ab5e96-40ba-4396-99bf-f77ecb9489f2\") " Oct 08 19:42:14 crc kubenswrapper[4750]: I1008 19:42:14.676924 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97ab5e96-40ba-4396-99bf-f77ecb9489f2-kube-api-access-vhfjl" (OuterVolumeSpecName: "kube-api-access-vhfjl") pod "97ab5e96-40ba-4396-99bf-f77ecb9489f2" (UID: "97ab5e96-40ba-4396-99bf-f77ecb9489f2"). InnerVolumeSpecName "kube-api-access-vhfjl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:42:14 crc kubenswrapper[4750]: I1008 19:42:14.772023 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhfjl\" (UniqueName: \"kubernetes.io/projected/97ab5e96-40ba-4396-99bf-f77ecb9489f2-kube-api-access-vhfjl\") on node \"crc\" DevicePath \"\"" Oct 08 19:42:15 crc kubenswrapper[4750]: I1008 19:42:15.255223 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2123-account-create-n546k" event={"ID":"97ab5e96-40ba-4396-99bf-f77ecb9489f2","Type":"ContainerDied","Data":"340d7f36fb51666e1f4d0dcb5d48f409aaec5e8bb9e0a9999f0050e6ca2eb370"} Oct 08 19:42:15 crc kubenswrapper[4750]: I1008 19:42:15.256183 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="340d7f36fb51666e1f4d0dcb5d48f409aaec5e8bb9e0a9999f0050e6ca2eb370" Oct 08 19:42:15 crc kubenswrapper[4750]: I1008 19:42:15.255480 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-2123-account-create-n546k" Oct 08 19:42:17 crc kubenswrapper[4750]: I1008 19:42:17.042437 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-fndk5"] Oct 08 19:42:17 crc kubenswrapper[4750]: E1008 19:42:17.043693 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97ab5e96-40ba-4396-99bf-f77ecb9489f2" containerName="mariadb-account-create" Oct 08 19:42:17 crc kubenswrapper[4750]: I1008 19:42:17.043722 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="97ab5e96-40ba-4396-99bf-f77ecb9489f2" containerName="mariadb-account-create" Oct 08 19:42:17 crc kubenswrapper[4750]: I1008 19:42:17.044044 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="97ab5e96-40ba-4396-99bf-f77ecb9489f2" containerName="mariadb-account-create" Oct 08 19:42:17 crc kubenswrapper[4750]: I1008 19:42:17.045156 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-fndk5" Oct 08 19:42:17 crc kubenswrapper[4750]: I1008 19:42:17.051373 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 08 19:42:17 crc kubenswrapper[4750]: I1008 19:42:17.052293 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-jtq4j" Oct 08 19:42:17 crc kubenswrapper[4750]: I1008 19:42:17.063631 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-fndk5"] Oct 08 19:42:17 crc kubenswrapper[4750]: I1008 19:42:17.123837 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eae594e-7b85-4ec1-b9ca-5ab025352efa-combined-ca-bundle\") pod \"barbican-db-sync-fndk5\" (UID: \"8eae594e-7b85-4ec1-b9ca-5ab025352efa\") " pod="openstack/barbican-db-sync-fndk5" Oct 08 19:42:17 crc kubenswrapper[4750]: I1008 19:42:17.123984 4750 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mppbj\" (UniqueName: \"kubernetes.io/projected/8eae594e-7b85-4ec1-b9ca-5ab025352efa-kube-api-access-mppbj\") pod \"barbican-db-sync-fndk5\" (UID: \"8eae594e-7b85-4ec1-b9ca-5ab025352efa\") " pod="openstack/barbican-db-sync-fndk5" Oct 08 19:42:17 crc kubenswrapper[4750]: I1008 19:42:17.124338 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8eae594e-7b85-4ec1-b9ca-5ab025352efa-db-sync-config-data\") pod \"barbican-db-sync-fndk5\" (UID: \"8eae594e-7b85-4ec1-b9ca-5ab025352efa\") " pod="openstack/barbican-db-sync-fndk5" Oct 08 19:42:17 crc kubenswrapper[4750]: I1008 19:42:17.227301 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eae594e-7b85-4ec1-b9ca-5ab025352efa-combined-ca-bundle\") pod \"barbican-db-sync-fndk5\" (UID: \"8eae594e-7b85-4ec1-b9ca-5ab025352efa\") " pod="openstack/barbican-db-sync-fndk5" Oct 08 19:42:17 crc kubenswrapper[4750]: I1008 19:42:17.227571 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mppbj\" (UniqueName: \"kubernetes.io/projected/8eae594e-7b85-4ec1-b9ca-5ab025352efa-kube-api-access-mppbj\") pod \"barbican-db-sync-fndk5\" (UID: \"8eae594e-7b85-4ec1-b9ca-5ab025352efa\") " pod="openstack/barbican-db-sync-fndk5" Oct 08 19:42:17 crc kubenswrapper[4750]: I1008 19:42:17.227661 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8eae594e-7b85-4ec1-b9ca-5ab025352efa-db-sync-config-data\") pod \"barbican-db-sync-fndk5\" (UID: \"8eae594e-7b85-4ec1-b9ca-5ab025352efa\") " pod="openstack/barbican-db-sync-fndk5" Oct 08 19:42:17 crc kubenswrapper[4750]: I1008 19:42:17.233197 4750 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8eae594e-7b85-4ec1-b9ca-5ab025352efa-db-sync-config-data\") pod \"barbican-db-sync-fndk5\" (UID: \"8eae594e-7b85-4ec1-b9ca-5ab025352efa\") " pod="openstack/barbican-db-sync-fndk5" Oct 08 19:42:17 crc kubenswrapper[4750]: I1008 19:42:17.246714 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eae594e-7b85-4ec1-b9ca-5ab025352efa-combined-ca-bundle\") pod \"barbican-db-sync-fndk5\" (UID: \"8eae594e-7b85-4ec1-b9ca-5ab025352efa\") " pod="openstack/barbican-db-sync-fndk5" Oct 08 19:42:17 crc kubenswrapper[4750]: I1008 19:42:17.251110 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mppbj\" (UniqueName: \"kubernetes.io/projected/8eae594e-7b85-4ec1-b9ca-5ab025352efa-kube-api-access-mppbj\") pod \"barbican-db-sync-fndk5\" (UID: \"8eae594e-7b85-4ec1-b9ca-5ab025352efa\") " pod="openstack/barbican-db-sync-fndk5" Oct 08 19:42:17 crc kubenswrapper[4750]: I1008 19:42:17.377334 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-fndk5" Oct 08 19:42:17 crc kubenswrapper[4750]: I1008 19:42:17.857540 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-fndk5"] Oct 08 19:42:18 crc kubenswrapper[4750]: I1008 19:42:18.289318 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-fndk5" event={"ID":"8eae594e-7b85-4ec1-b9ca-5ab025352efa","Type":"ContainerStarted","Data":"9848e83ea6f9a85bab38f4fceb1da300af516cddef7a398f1a26ee1270e3e863"} Oct 08 19:42:18 crc kubenswrapper[4750]: I1008 19:42:18.289942 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-fndk5" event={"ID":"8eae594e-7b85-4ec1-b9ca-5ab025352efa","Type":"ContainerStarted","Data":"22ffe3cef1ca0a902b52f566a2b5d05e8f7db78e76e1cd9bc771c1547b367725"} Oct 08 19:42:18 crc kubenswrapper[4750]: I1008 19:42:18.310053 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-fndk5" podStartSLOduration=1.31002159 podStartE2EDuration="1.31002159s" podCreationTimestamp="2025-10-08 19:42:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:42:18.303885508 +0000 UTC m=+5494.216856521" watchObservedRunningTime="2025-10-08 19:42:18.31002159 +0000 UTC m=+5494.222992613" Oct 08 19:42:20 crc kubenswrapper[4750]: I1008 19:42:20.315672 4750 generic.go:334] "Generic (PLEG): container finished" podID="8eae594e-7b85-4ec1-b9ca-5ab025352efa" containerID="9848e83ea6f9a85bab38f4fceb1da300af516cddef7a398f1a26ee1270e3e863" exitCode=0 Oct 08 19:42:20 crc kubenswrapper[4750]: I1008 19:42:20.315854 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-fndk5" event={"ID":"8eae594e-7b85-4ec1-b9ca-5ab025352efa","Type":"ContainerDied","Data":"9848e83ea6f9a85bab38f4fceb1da300af516cddef7a398f1a26ee1270e3e863"} Oct 08 19:42:21 crc 
kubenswrapper[4750]: I1008 19:42:21.703155 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-fndk5" Oct 08 19:42:21 crc kubenswrapper[4750]: I1008 19:42:21.824874 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mppbj\" (UniqueName: \"kubernetes.io/projected/8eae594e-7b85-4ec1-b9ca-5ab025352efa-kube-api-access-mppbj\") pod \"8eae594e-7b85-4ec1-b9ca-5ab025352efa\" (UID: \"8eae594e-7b85-4ec1-b9ca-5ab025352efa\") " Oct 08 19:42:21 crc kubenswrapper[4750]: I1008 19:42:21.825227 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eae594e-7b85-4ec1-b9ca-5ab025352efa-combined-ca-bundle\") pod \"8eae594e-7b85-4ec1-b9ca-5ab025352efa\" (UID: \"8eae594e-7b85-4ec1-b9ca-5ab025352efa\") " Oct 08 19:42:21 crc kubenswrapper[4750]: I1008 19:42:21.825409 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8eae594e-7b85-4ec1-b9ca-5ab025352efa-db-sync-config-data\") pod \"8eae594e-7b85-4ec1-b9ca-5ab025352efa\" (UID: \"8eae594e-7b85-4ec1-b9ca-5ab025352efa\") " Oct 08 19:42:21 crc kubenswrapper[4750]: I1008 19:42:21.842633 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8eae594e-7b85-4ec1-b9ca-5ab025352efa-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "8eae594e-7b85-4ec1-b9ca-5ab025352efa" (UID: "8eae594e-7b85-4ec1-b9ca-5ab025352efa"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:42:21 crc kubenswrapper[4750]: I1008 19:42:21.846705 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8eae594e-7b85-4ec1-b9ca-5ab025352efa-kube-api-access-mppbj" (OuterVolumeSpecName: "kube-api-access-mppbj") pod "8eae594e-7b85-4ec1-b9ca-5ab025352efa" (UID: "8eae594e-7b85-4ec1-b9ca-5ab025352efa"). InnerVolumeSpecName "kube-api-access-mppbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:42:21 crc kubenswrapper[4750]: I1008 19:42:21.851521 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8eae594e-7b85-4ec1-b9ca-5ab025352efa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8eae594e-7b85-4ec1-b9ca-5ab025352efa" (UID: "8eae594e-7b85-4ec1-b9ca-5ab025352efa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:42:21 crc kubenswrapper[4750]: I1008 19:42:21.927887 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mppbj\" (UniqueName: \"kubernetes.io/projected/8eae594e-7b85-4ec1-b9ca-5ab025352efa-kube-api-access-mppbj\") on node \"crc\" DevicePath \"\"" Oct 08 19:42:21 crc kubenswrapper[4750]: I1008 19:42:21.928149 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eae594e-7b85-4ec1-b9ca-5ab025352efa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 19:42:21 crc kubenswrapper[4750]: I1008 19:42:21.928233 4750 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8eae594e-7b85-4ec1-b9ca-5ab025352efa-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.335749 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-fndk5" 
event={"ID":"8eae594e-7b85-4ec1-b9ca-5ab025352efa","Type":"ContainerDied","Data":"22ffe3cef1ca0a902b52f566a2b5d05e8f7db78e76e1cd9bc771c1547b367725"} Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.335799 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22ffe3cef1ca0a902b52f566a2b5d05e8f7db78e76e1cd9bc771c1547b367725" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.335823 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-fndk5" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.594997 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-56986ffc7d-j59g6"] Oct 08 19:42:22 crc kubenswrapper[4750]: E1008 19:42:22.595780 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eae594e-7b85-4ec1-b9ca-5ab025352efa" containerName="barbican-db-sync" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.595800 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eae594e-7b85-4ec1-b9ca-5ab025352efa" containerName="barbican-db-sync" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.595986 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="8eae594e-7b85-4ec1-b9ca-5ab025352efa" containerName="barbican-db-sync" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.597005 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-56986ffc7d-j59g6" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.603433 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.603532 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-jtq4j" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.603745 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.641666 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48760c39-72f8-4889-9075-b2576d3f0209-combined-ca-bundle\") pod \"barbican-keystone-listener-56986ffc7d-j59g6\" (UID: \"48760c39-72f8-4889-9075-b2576d3f0209\") " pod="openstack/barbican-keystone-listener-56986ffc7d-j59g6" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.641715 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48760c39-72f8-4889-9075-b2576d3f0209-config-data-custom\") pod \"barbican-keystone-listener-56986ffc7d-j59g6\" (UID: \"48760c39-72f8-4889-9075-b2576d3f0209\") " pod="openstack/barbican-keystone-listener-56986ffc7d-j59g6" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.641757 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48760c39-72f8-4889-9075-b2576d3f0209-config-data\") pod \"barbican-keystone-listener-56986ffc7d-j59g6\" (UID: \"48760c39-72f8-4889-9075-b2576d3f0209\") " pod="openstack/barbican-keystone-listener-56986ffc7d-j59g6" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.641795 4750 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48760c39-72f8-4889-9075-b2576d3f0209-logs\") pod \"barbican-keystone-listener-56986ffc7d-j59g6\" (UID: \"48760c39-72f8-4889-9075-b2576d3f0209\") " pod="openstack/barbican-keystone-listener-56986ffc7d-j59g6" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.641842 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm595\" (UniqueName: \"kubernetes.io/projected/48760c39-72f8-4889-9075-b2576d3f0209-kube-api-access-tm595\") pod \"barbican-keystone-listener-56986ffc7d-j59g6\" (UID: \"48760c39-72f8-4889-9075-b2576d3f0209\") " pod="openstack/barbican-keystone-listener-56986ffc7d-j59g6" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.642774 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6f5fbc99ff-n2gwg"] Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.644648 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6f5fbc99ff-n2gwg" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.650983 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.659170 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-56986ffc7d-j59g6"] Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.681420 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6f5fbc99ff-n2gwg"] Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.743489 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/69865ad9-63c1-4f81-9873-4aa359bea376-config-data-custom\") pod \"barbican-worker-6f5fbc99ff-n2gwg\" (UID: \"69865ad9-63c1-4f81-9873-4aa359bea376\") " pod="openstack/barbican-worker-6f5fbc99ff-n2gwg" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.743622 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tm595\" (UniqueName: \"kubernetes.io/projected/48760c39-72f8-4889-9075-b2576d3f0209-kube-api-access-tm595\") pod \"barbican-keystone-listener-56986ffc7d-j59g6\" (UID: \"48760c39-72f8-4889-9075-b2576d3f0209\") " pod="openstack/barbican-keystone-listener-56986ffc7d-j59g6" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.743673 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69865ad9-63c1-4f81-9873-4aa359bea376-combined-ca-bundle\") pod \"barbican-worker-6f5fbc99ff-n2gwg\" (UID: \"69865ad9-63c1-4f81-9873-4aa359bea376\") " pod="openstack/barbican-worker-6f5fbc99ff-n2gwg" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.743708 4750 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69865ad9-63c1-4f81-9873-4aa359bea376-logs\") pod \"barbican-worker-6f5fbc99ff-n2gwg\" (UID: \"69865ad9-63c1-4f81-9873-4aa359bea376\") " pod="openstack/barbican-worker-6f5fbc99ff-n2gwg" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.743767 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69865ad9-63c1-4f81-9873-4aa359bea376-config-data\") pod \"barbican-worker-6f5fbc99ff-n2gwg\" (UID: \"69865ad9-63c1-4f81-9873-4aa359bea376\") " pod="openstack/barbican-worker-6f5fbc99ff-n2gwg" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.743802 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48760c39-72f8-4889-9075-b2576d3f0209-combined-ca-bundle\") pod \"barbican-keystone-listener-56986ffc7d-j59g6\" (UID: \"48760c39-72f8-4889-9075-b2576d3f0209\") " pod="openstack/barbican-keystone-listener-56986ffc7d-j59g6" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.743839 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48760c39-72f8-4889-9075-b2576d3f0209-config-data-custom\") pod \"barbican-keystone-listener-56986ffc7d-j59g6\" (UID: \"48760c39-72f8-4889-9075-b2576d3f0209\") " pod="openstack/barbican-keystone-listener-56986ffc7d-j59g6" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.743875 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48760c39-72f8-4889-9075-b2576d3f0209-config-data\") pod \"barbican-keystone-listener-56986ffc7d-j59g6\" (UID: \"48760c39-72f8-4889-9075-b2576d3f0209\") " pod="openstack/barbican-keystone-listener-56986ffc7d-j59g6" Oct 08 19:42:22 crc 
kubenswrapper[4750]: I1008 19:42:22.743904 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4p97\" (UniqueName: \"kubernetes.io/projected/69865ad9-63c1-4f81-9873-4aa359bea376-kube-api-access-w4p97\") pod \"barbican-worker-6f5fbc99ff-n2gwg\" (UID: \"69865ad9-63c1-4f81-9873-4aa359bea376\") " pod="openstack/barbican-worker-6f5fbc99ff-n2gwg" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.743934 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48760c39-72f8-4889-9075-b2576d3f0209-logs\") pod \"barbican-keystone-listener-56986ffc7d-j59g6\" (UID: \"48760c39-72f8-4889-9075-b2576d3f0209\") " pod="openstack/barbican-keystone-listener-56986ffc7d-j59g6" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.744591 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48760c39-72f8-4889-9075-b2576d3f0209-logs\") pod \"barbican-keystone-listener-56986ffc7d-j59g6\" (UID: \"48760c39-72f8-4889-9075-b2576d3f0209\") " pod="openstack/barbican-keystone-listener-56986ffc7d-j59g6" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.753302 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48760c39-72f8-4889-9075-b2576d3f0209-config-data\") pod \"barbican-keystone-listener-56986ffc7d-j59g6\" (UID: \"48760c39-72f8-4889-9075-b2576d3f0209\") " pod="openstack/barbican-keystone-listener-56986ffc7d-j59g6" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.754342 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8cbfd88b5-wzct4"] Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.757290 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/48760c39-72f8-4889-9075-b2576d3f0209-config-data-custom\") pod \"barbican-keystone-listener-56986ffc7d-j59g6\" (UID: \"48760c39-72f8-4889-9075-b2576d3f0209\") " pod="openstack/barbican-keystone-listener-56986ffc7d-j59g6" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.759716 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8cbfd88b5-wzct4" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.762741 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8cbfd88b5-wzct4"] Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.777059 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm595\" (UniqueName: \"kubernetes.io/projected/48760c39-72f8-4889-9075-b2576d3f0209-kube-api-access-tm595\") pod \"barbican-keystone-listener-56986ffc7d-j59g6\" (UID: \"48760c39-72f8-4889-9075-b2576d3f0209\") " pod="openstack/barbican-keystone-listener-56986ffc7d-j59g6" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.785131 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48760c39-72f8-4889-9075-b2576d3f0209-combined-ca-bundle\") pod \"barbican-keystone-listener-56986ffc7d-j59g6\" (UID: \"48760c39-72f8-4889-9075-b2576d3f0209\") " pod="openstack/barbican-keystone-listener-56986ffc7d-j59g6" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.846536 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/69865ad9-63c1-4f81-9873-4aa359bea376-config-data-custom\") pod \"barbican-worker-6f5fbc99ff-n2gwg\" (UID: \"69865ad9-63c1-4f81-9873-4aa359bea376\") " pod="openstack/barbican-worker-6f5fbc99ff-n2gwg" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.846618 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/4db79ea6-2794-442e-9137-255fd42641ad-config\") pod \"dnsmasq-dns-8cbfd88b5-wzct4\" (UID: \"4db79ea6-2794-442e-9137-255fd42641ad\") " pod="openstack/dnsmasq-dns-8cbfd88b5-wzct4" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.846655 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6tv8\" (UniqueName: \"kubernetes.io/projected/4db79ea6-2794-442e-9137-255fd42641ad-kube-api-access-x6tv8\") pod \"dnsmasq-dns-8cbfd88b5-wzct4\" (UID: \"4db79ea6-2794-442e-9137-255fd42641ad\") " pod="openstack/dnsmasq-dns-8cbfd88b5-wzct4" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.846718 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69865ad9-63c1-4f81-9873-4aa359bea376-combined-ca-bundle\") pod \"barbican-worker-6f5fbc99ff-n2gwg\" (UID: \"69865ad9-63c1-4f81-9873-4aa359bea376\") " pod="openstack/barbican-worker-6f5fbc99ff-n2gwg" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.846745 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69865ad9-63c1-4f81-9873-4aa359bea376-logs\") pod \"barbican-worker-6f5fbc99ff-n2gwg\" (UID: \"69865ad9-63c1-4f81-9873-4aa359bea376\") " pod="openstack/barbican-worker-6f5fbc99ff-n2gwg" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.846785 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69865ad9-63c1-4f81-9873-4aa359bea376-config-data\") pod \"barbican-worker-6f5fbc99ff-n2gwg\" (UID: \"69865ad9-63c1-4f81-9873-4aa359bea376\") " pod="openstack/barbican-worker-6f5fbc99ff-n2gwg" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.846810 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/4db79ea6-2794-442e-9137-255fd42641ad-ovsdbserver-sb\") pod \"dnsmasq-dns-8cbfd88b5-wzct4\" (UID: \"4db79ea6-2794-442e-9137-255fd42641ad\") " pod="openstack/dnsmasq-dns-8cbfd88b5-wzct4" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.846828 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4db79ea6-2794-442e-9137-255fd42641ad-ovsdbserver-nb\") pod \"dnsmasq-dns-8cbfd88b5-wzct4\" (UID: \"4db79ea6-2794-442e-9137-255fd42641ad\") " pod="openstack/dnsmasq-dns-8cbfd88b5-wzct4" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.846853 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4db79ea6-2794-442e-9137-255fd42641ad-dns-svc\") pod \"dnsmasq-dns-8cbfd88b5-wzct4\" (UID: \"4db79ea6-2794-442e-9137-255fd42641ad\") " pod="openstack/dnsmasq-dns-8cbfd88b5-wzct4" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.846889 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4p97\" (UniqueName: \"kubernetes.io/projected/69865ad9-63c1-4f81-9873-4aa359bea376-kube-api-access-w4p97\") pod \"barbican-worker-6f5fbc99ff-n2gwg\" (UID: \"69865ad9-63c1-4f81-9873-4aa359bea376\") " pod="openstack/barbican-worker-6f5fbc99ff-n2gwg" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.848952 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69865ad9-63c1-4f81-9873-4aa359bea376-logs\") pod \"barbican-worker-6f5fbc99ff-n2gwg\" (UID: \"69865ad9-63c1-4f81-9873-4aa359bea376\") " pod="openstack/barbican-worker-6f5fbc99ff-n2gwg" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.862081 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/69865ad9-63c1-4f81-9873-4aa359bea376-config-data-custom\") pod \"barbican-worker-6f5fbc99ff-n2gwg\" (UID: \"69865ad9-63c1-4f81-9873-4aa359bea376\") " pod="openstack/barbican-worker-6f5fbc99ff-n2gwg" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.862130 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69865ad9-63c1-4f81-9873-4aa359bea376-config-data\") pod \"barbican-worker-6f5fbc99ff-n2gwg\" (UID: \"69865ad9-63c1-4f81-9873-4aa359bea376\") " pod="openstack/barbican-worker-6f5fbc99ff-n2gwg" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.870286 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69865ad9-63c1-4f81-9873-4aa359bea376-combined-ca-bundle\") pod \"barbican-worker-6f5fbc99ff-n2gwg\" (UID: \"69865ad9-63c1-4f81-9873-4aa359bea376\") " pod="openstack/barbican-worker-6f5fbc99ff-n2gwg" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.875192 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4p97\" (UniqueName: \"kubernetes.io/projected/69865ad9-63c1-4f81-9873-4aa359bea376-kube-api-access-w4p97\") pod \"barbican-worker-6f5fbc99ff-n2gwg\" (UID: \"69865ad9-63c1-4f81-9873-4aa359bea376\") " pod="openstack/barbican-worker-6f5fbc99ff-n2gwg" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.944363 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-56986ffc7d-j59g6" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.952210 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4db79ea6-2794-442e-9137-255fd42641ad-ovsdbserver-sb\") pod \"dnsmasq-dns-8cbfd88b5-wzct4\" (UID: \"4db79ea6-2794-442e-9137-255fd42641ad\") " pod="openstack/dnsmasq-dns-8cbfd88b5-wzct4" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.952280 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4db79ea6-2794-442e-9137-255fd42641ad-ovsdbserver-nb\") pod \"dnsmasq-dns-8cbfd88b5-wzct4\" (UID: \"4db79ea6-2794-442e-9137-255fd42641ad\") " pod="openstack/dnsmasq-dns-8cbfd88b5-wzct4" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.952318 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4db79ea6-2794-442e-9137-255fd42641ad-dns-svc\") pod \"dnsmasq-dns-8cbfd88b5-wzct4\" (UID: \"4db79ea6-2794-442e-9137-255fd42641ad\") " pod="openstack/dnsmasq-dns-8cbfd88b5-wzct4" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.952379 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4db79ea6-2794-442e-9137-255fd42641ad-config\") pod \"dnsmasq-dns-8cbfd88b5-wzct4\" (UID: \"4db79ea6-2794-442e-9137-255fd42641ad\") " pod="openstack/dnsmasq-dns-8cbfd88b5-wzct4" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.952420 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6tv8\" (UniqueName: \"kubernetes.io/projected/4db79ea6-2794-442e-9137-255fd42641ad-kube-api-access-x6tv8\") pod \"dnsmasq-dns-8cbfd88b5-wzct4\" (UID: \"4db79ea6-2794-442e-9137-255fd42641ad\") " 
pod="openstack/dnsmasq-dns-8cbfd88b5-wzct4" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.953739 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4db79ea6-2794-442e-9137-255fd42641ad-ovsdbserver-nb\") pod \"dnsmasq-dns-8cbfd88b5-wzct4\" (UID: \"4db79ea6-2794-442e-9137-255fd42641ad\") " pod="openstack/dnsmasq-dns-8cbfd88b5-wzct4" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.954344 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4db79ea6-2794-442e-9137-255fd42641ad-dns-svc\") pod \"dnsmasq-dns-8cbfd88b5-wzct4\" (UID: \"4db79ea6-2794-442e-9137-255fd42641ad\") " pod="openstack/dnsmasq-dns-8cbfd88b5-wzct4" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.954389 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4db79ea6-2794-442e-9137-255fd42641ad-ovsdbserver-sb\") pod \"dnsmasq-dns-8cbfd88b5-wzct4\" (UID: \"4db79ea6-2794-442e-9137-255fd42641ad\") " pod="openstack/dnsmasq-dns-8cbfd88b5-wzct4" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.955737 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-56766f6486-vnhn9"] Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.959176 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-56766f6486-vnhn9" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.961154 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4db79ea6-2794-442e-9137-255fd42641ad-config\") pod \"dnsmasq-dns-8cbfd88b5-wzct4\" (UID: \"4db79ea6-2794-442e-9137-255fd42641ad\") " pod="openstack/dnsmasq-dns-8cbfd88b5-wzct4" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.962864 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.964394 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6f5fbc99ff-n2gwg" Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.977486 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-56766f6486-vnhn9"] Oct 08 19:42:22 crc kubenswrapper[4750]: I1008 19:42:22.985925 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6tv8\" (UniqueName: \"kubernetes.io/projected/4db79ea6-2794-442e-9137-255fd42641ad-kube-api-access-x6tv8\") pod \"dnsmasq-dns-8cbfd88b5-wzct4\" (UID: \"4db79ea6-2794-442e-9137-255fd42641ad\") " pod="openstack/dnsmasq-dns-8cbfd88b5-wzct4" Oct 08 19:42:23 crc kubenswrapper[4750]: I1008 19:42:23.054434 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f5045303-b61d-4680-bacd-47dad8064038-config-data-custom\") pod \"barbican-api-56766f6486-vnhn9\" (UID: \"f5045303-b61d-4680-bacd-47dad8064038\") " pod="openstack/barbican-api-56766f6486-vnhn9" Oct 08 19:42:23 crc kubenswrapper[4750]: I1008 19:42:23.054504 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrnfj\" (UniqueName: 
\"kubernetes.io/projected/f5045303-b61d-4680-bacd-47dad8064038-kube-api-access-nrnfj\") pod \"barbican-api-56766f6486-vnhn9\" (UID: \"f5045303-b61d-4680-bacd-47dad8064038\") " pod="openstack/barbican-api-56766f6486-vnhn9" Oct 08 19:42:23 crc kubenswrapper[4750]: I1008 19:42:23.054586 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5045303-b61d-4680-bacd-47dad8064038-combined-ca-bundle\") pod \"barbican-api-56766f6486-vnhn9\" (UID: \"f5045303-b61d-4680-bacd-47dad8064038\") " pod="openstack/barbican-api-56766f6486-vnhn9" Oct 08 19:42:23 crc kubenswrapper[4750]: I1008 19:42:23.054619 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5045303-b61d-4680-bacd-47dad8064038-logs\") pod \"barbican-api-56766f6486-vnhn9\" (UID: \"f5045303-b61d-4680-bacd-47dad8064038\") " pod="openstack/barbican-api-56766f6486-vnhn9" Oct 08 19:42:23 crc kubenswrapper[4750]: I1008 19:42:23.055017 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5045303-b61d-4680-bacd-47dad8064038-config-data\") pod \"barbican-api-56766f6486-vnhn9\" (UID: \"f5045303-b61d-4680-bacd-47dad8064038\") " pod="openstack/barbican-api-56766f6486-vnhn9" Oct 08 19:42:23 crc kubenswrapper[4750]: I1008 19:42:23.157525 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8cbfd88b5-wzct4" Oct 08 19:42:23 crc kubenswrapper[4750]: I1008 19:42:23.158436 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f5045303-b61d-4680-bacd-47dad8064038-config-data-custom\") pod \"barbican-api-56766f6486-vnhn9\" (UID: \"f5045303-b61d-4680-bacd-47dad8064038\") " pod="openstack/barbican-api-56766f6486-vnhn9" Oct 08 19:42:23 crc kubenswrapper[4750]: I1008 19:42:23.158561 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrnfj\" (UniqueName: \"kubernetes.io/projected/f5045303-b61d-4680-bacd-47dad8064038-kube-api-access-nrnfj\") pod \"barbican-api-56766f6486-vnhn9\" (UID: \"f5045303-b61d-4680-bacd-47dad8064038\") " pod="openstack/barbican-api-56766f6486-vnhn9" Oct 08 19:42:23 crc kubenswrapper[4750]: I1008 19:42:23.158622 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5045303-b61d-4680-bacd-47dad8064038-combined-ca-bundle\") pod \"barbican-api-56766f6486-vnhn9\" (UID: \"f5045303-b61d-4680-bacd-47dad8064038\") " pod="openstack/barbican-api-56766f6486-vnhn9" Oct 08 19:42:23 crc kubenswrapper[4750]: I1008 19:42:23.158657 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5045303-b61d-4680-bacd-47dad8064038-logs\") pod \"barbican-api-56766f6486-vnhn9\" (UID: \"f5045303-b61d-4680-bacd-47dad8064038\") " pod="openstack/barbican-api-56766f6486-vnhn9" Oct 08 19:42:23 crc kubenswrapper[4750]: I1008 19:42:23.158760 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5045303-b61d-4680-bacd-47dad8064038-config-data\") pod \"barbican-api-56766f6486-vnhn9\" (UID: \"f5045303-b61d-4680-bacd-47dad8064038\") " 
pod="openstack/barbican-api-56766f6486-vnhn9" Oct 08 19:42:23 crc kubenswrapper[4750]: I1008 19:42:23.159538 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5045303-b61d-4680-bacd-47dad8064038-logs\") pod \"barbican-api-56766f6486-vnhn9\" (UID: \"f5045303-b61d-4680-bacd-47dad8064038\") " pod="openstack/barbican-api-56766f6486-vnhn9" Oct 08 19:42:23 crc kubenswrapper[4750]: I1008 19:42:23.166284 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f5045303-b61d-4680-bacd-47dad8064038-config-data-custom\") pod \"barbican-api-56766f6486-vnhn9\" (UID: \"f5045303-b61d-4680-bacd-47dad8064038\") " pod="openstack/barbican-api-56766f6486-vnhn9" Oct 08 19:42:23 crc kubenswrapper[4750]: I1008 19:42:23.169782 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5045303-b61d-4680-bacd-47dad8064038-combined-ca-bundle\") pod \"barbican-api-56766f6486-vnhn9\" (UID: \"f5045303-b61d-4680-bacd-47dad8064038\") " pod="openstack/barbican-api-56766f6486-vnhn9" Oct 08 19:42:23 crc kubenswrapper[4750]: I1008 19:42:23.185307 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5045303-b61d-4680-bacd-47dad8064038-config-data\") pod \"barbican-api-56766f6486-vnhn9\" (UID: \"f5045303-b61d-4680-bacd-47dad8064038\") " pod="openstack/barbican-api-56766f6486-vnhn9" Oct 08 19:42:23 crc kubenswrapper[4750]: I1008 19:42:23.193680 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrnfj\" (UniqueName: \"kubernetes.io/projected/f5045303-b61d-4680-bacd-47dad8064038-kube-api-access-nrnfj\") pod \"barbican-api-56766f6486-vnhn9\" (UID: \"f5045303-b61d-4680-bacd-47dad8064038\") " pod="openstack/barbican-api-56766f6486-vnhn9" Oct 08 19:42:23 crc kubenswrapper[4750]: 
I1008 19:42:23.335276 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-56766f6486-vnhn9" Oct 08 19:42:23 crc kubenswrapper[4750]: I1008 19:42:23.503043 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6f5fbc99ff-n2gwg"] Oct 08 19:42:23 crc kubenswrapper[4750]: W1008 19:42:23.525909 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69865ad9_63c1_4f81_9873_4aa359bea376.slice/crio-101ae0669eb54fc8ede6300837935a4d074e6c3265bb2c3b97849065d22f6f30 WatchSource:0}: Error finding container 101ae0669eb54fc8ede6300837935a4d074e6c3265bb2c3b97849065d22f6f30: Status 404 returned error can't find the container with id 101ae0669eb54fc8ede6300837935a4d074e6c3265bb2c3b97849065d22f6f30 Oct 08 19:42:23 crc kubenswrapper[4750]: I1008 19:42:23.670407 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-56986ffc7d-j59g6"] Oct 08 19:42:23 crc kubenswrapper[4750]: I1008 19:42:23.721576 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-56766f6486-vnhn9"] Oct 08 19:42:23 crc kubenswrapper[4750]: I1008 19:42:23.740082 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8cbfd88b5-wzct4"] Oct 08 19:42:24 crc kubenswrapper[4750]: I1008 19:42:24.357826 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6f5fbc99ff-n2gwg" event={"ID":"69865ad9-63c1-4f81-9873-4aa359bea376","Type":"ContainerStarted","Data":"cb1f7a3c5d74402190712c27af8610681a3ecef2a75b58fefe1c35f00bfe74c3"} Oct 08 19:42:24 crc kubenswrapper[4750]: I1008 19:42:24.358423 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6f5fbc99ff-n2gwg" event={"ID":"69865ad9-63c1-4f81-9873-4aa359bea376","Type":"ContainerStarted","Data":"b1150d146f227e271ad2120589586eb830fcd23dc422942d22bb3bdea8967f8e"} 
Oct 08 19:42:24 crc kubenswrapper[4750]: I1008 19:42:24.358464 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6f5fbc99ff-n2gwg" event={"ID":"69865ad9-63c1-4f81-9873-4aa359bea376","Type":"ContainerStarted","Data":"101ae0669eb54fc8ede6300837935a4d074e6c3265bb2c3b97849065d22f6f30"} Oct 08 19:42:24 crc kubenswrapper[4750]: I1008 19:42:24.360678 4750 generic.go:334] "Generic (PLEG): container finished" podID="4db79ea6-2794-442e-9137-255fd42641ad" containerID="c53045eca0cd6eb7ca4b9404c791b4c8077a2d1b1598174f5d2a930cedd8c68b" exitCode=0 Oct 08 19:42:24 crc kubenswrapper[4750]: I1008 19:42:24.360761 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8cbfd88b5-wzct4" event={"ID":"4db79ea6-2794-442e-9137-255fd42641ad","Type":"ContainerDied","Data":"c53045eca0cd6eb7ca4b9404c791b4c8077a2d1b1598174f5d2a930cedd8c68b"} Oct 08 19:42:24 crc kubenswrapper[4750]: I1008 19:42:24.360796 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8cbfd88b5-wzct4" event={"ID":"4db79ea6-2794-442e-9137-255fd42641ad","Type":"ContainerStarted","Data":"9deecc194d9fda126c3b372e08924838e3368327953158004752aff03edd2a74"} Oct 08 19:42:24 crc kubenswrapper[4750]: I1008 19:42:24.367768 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-56986ffc7d-j59g6" event={"ID":"48760c39-72f8-4889-9075-b2576d3f0209","Type":"ContainerStarted","Data":"e7d3eba333d06e9ea6589027836463cf13a5dfa01aad0f1e7a19b62996ada1c4"} Oct 08 19:42:24 crc kubenswrapper[4750]: I1008 19:42:24.367846 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-56986ffc7d-j59g6" event={"ID":"48760c39-72f8-4889-9075-b2576d3f0209","Type":"ContainerStarted","Data":"1dc8152881abc6a8b520c4ec2ac202c53930e14a6d71679de209d3f968b3eb97"} Oct 08 19:42:24 crc kubenswrapper[4750]: I1008 19:42:24.367862 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-keystone-listener-56986ffc7d-j59g6" event={"ID":"48760c39-72f8-4889-9075-b2576d3f0209","Type":"ContainerStarted","Data":"2a75a245871b3ce4ac3ce9017c573fb897da9d6bd5a748383dd2b9c383baac07"} Oct 08 19:42:24 crc kubenswrapper[4750]: I1008 19:42:24.377887 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56766f6486-vnhn9" event={"ID":"f5045303-b61d-4680-bacd-47dad8064038","Type":"ContainerStarted","Data":"6ed2f78c4aec49cba4fea7e450509b673e3fc57f195ab406754ca710b88a3eaa"} Oct 08 19:42:24 crc kubenswrapper[4750]: I1008 19:42:24.377964 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56766f6486-vnhn9" event={"ID":"f5045303-b61d-4680-bacd-47dad8064038","Type":"ContainerStarted","Data":"5a286396465f3ed4843515bea613cadb5bf003986b945bfb63004c984a374f48"} Oct 08 19:42:24 crc kubenswrapper[4750]: I1008 19:42:24.377978 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56766f6486-vnhn9" event={"ID":"f5045303-b61d-4680-bacd-47dad8064038","Type":"ContainerStarted","Data":"28d9e82205fed1b0b30d001c75d948134d39654ab2be1fc76d8559405b4212a1"} Oct 08 19:42:24 crc kubenswrapper[4750]: I1008 19:42:24.378103 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-56766f6486-vnhn9" Oct 08 19:42:24 crc kubenswrapper[4750]: I1008 19:42:24.396219 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6f5fbc99ff-n2gwg" podStartSLOduration=2.396199106 podStartE2EDuration="2.396199106s" podCreationTimestamp="2025-10-08 19:42:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:42:24.391284144 +0000 UTC m=+5500.304255157" watchObservedRunningTime="2025-10-08 19:42:24.396199106 +0000 UTC m=+5500.309170119" Oct 08 19:42:24 crc kubenswrapper[4750]: I1008 19:42:24.430195 4750 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-56986ffc7d-j59g6" podStartSLOduration=2.430164312 podStartE2EDuration="2.430164312s" podCreationTimestamp="2025-10-08 19:42:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:42:24.425360453 +0000 UTC m=+5500.338331476" watchObservedRunningTime="2025-10-08 19:42:24.430164312 +0000 UTC m=+5500.343135315" Oct 08 19:42:24 crc kubenswrapper[4750]: I1008 19:42:24.503990 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-56766f6486-vnhn9" podStartSLOduration=2.50387965 podStartE2EDuration="2.50387965s" podCreationTimestamp="2025-10-08 19:42:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:42:24.45417098 +0000 UTC m=+5500.367142003" watchObservedRunningTime="2025-10-08 19:42:24.50387965 +0000 UTC m=+5500.416850663" Oct 08 19:42:25 crc kubenswrapper[4750]: I1008 19:42:25.390735 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8cbfd88b5-wzct4" event={"ID":"4db79ea6-2794-442e-9137-255fd42641ad","Type":"ContainerStarted","Data":"522b3161de3b7c218b9e4cf7c883e0256eab0b803c7415d22f4513858a819244"} Oct 08 19:42:25 crc kubenswrapper[4750]: I1008 19:42:25.392769 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-56766f6486-vnhn9" Oct 08 19:42:25 crc kubenswrapper[4750]: I1008 19:42:25.392823 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8cbfd88b5-wzct4" Oct 08 19:42:25 crc kubenswrapper[4750]: I1008 19:42:25.414168 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8cbfd88b5-wzct4" podStartSLOduration=3.414148875 podStartE2EDuration="3.414148875s" 
podCreationTimestamp="2025-10-08 19:42:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:42:25.41035215 +0000 UTC m=+5501.323323163" watchObservedRunningTime="2025-10-08 19:42:25.414148875 +0000 UTC m=+5501.327119888" Oct 08 19:42:33 crc kubenswrapper[4750]: I1008 19:42:33.160888 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8cbfd88b5-wzct4" Oct 08 19:42:33 crc kubenswrapper[4750]: I1008 19:42:33.265732 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77f55cbcff-gwv8t"] Oct 08 19:42:33 crc kubenswrapper[4750]: I1008 19:42:33.266059 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77f55cbcff-gwv8t" podUID="2f363218-3c2f-46a8-84cc-985e5db78d56" containerName="dnsmasq-dns" containerID="cri-o://ccab34ae3e95c9a57ecf972c84073ab27ab5f560b07f7235b59fce97d22f10a9" gracePeriod=10 Oct 08 19:42:33 crc kubenswrapper[4750]: I1008 19:42:33.495702 4750 generic.go:334] "Generic (PLEG): container finished" podID="2f363218-3c2f-46a8-84cc-985e5db78d56" containerID="ccab34ae3e95c9a57ecf972c84073ab27ab5f560b07f7235b59fce97d22f10a9" exitCode=0 Oct 08 19:42:33 crc kubenswrapper[4750]: I1008 19:42:33.495902 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77f55cbcff-gwv8t" event={"ID":"2f363218-3c2f-46a8-84cc-985e5db78d56","Type":"ContainerDied","Data":"ccab34ae3e95c9a57ecf972c84073ab27ab5f560b07f7235b59fce97d22f10a9"} Oct 08 19:42:33 crc kubenswrapper[4750]: I1008 19:42:33.948254 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77f55cbcff-gwv8t" Oct 08 19:42:34 crc kubenswrapper[4750]: I1008 19:42:34.104806 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2js8m\" (UniqueName: \"kubernetes.io/projected/2f363218-3c2f-46a8-84cc-985e5db78d56-kube-api-access-2js8m\") pod \"2f363218-3c2f-46a8-84cc-985e5db78d56\" (UID: \"2f363218-3c2f-46a8-84cc-985e5db78d56\") " Oct 08 19:42:34 crc kubenswrapper[4750]: I1008 19:42:34.104960 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f363218-3c2f-46a8-84cc-985e5db78d56-config\") pod \"2f363218-3c2f-46a8-84cc-985e5db78d56\" (UID: \"2f363218-3c2f-46a8-84cc-985e5db78d56\") " Oct 08 19:42:34 crc kubenswrapper[4750]: I1008 19:42:34.104998 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f363218-3c2f-46a8-84cc-985e5db78d56-dns-svc\") pod \"2f363218-3c2f-46a8-84cc-985e5db78d56\" (UID: \"2f363218-3c2f-46a8-84cc-985e5db78d56\") " Oct 08 19:42:34 crc kubenswrapper[4750]: I1008 19:42:34.105026 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f363218-3c2f-46a8-84cc-985e5db78d56-ovsdbserver-sb\") pod \"2f363218-3c2f-46a8-84cc-985e5db78d56\" (UID: \"2f363218-3c2f-46a8-84cc-985e5db78d56\") " Oct 08 19:42:34 crc kubenswrapper[4750]: I1008 19:42:34.105068 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f363218-3c2f-46a8-84cc-985e5db78d56-ovsdbserver-nb\") pod \"2f363218-3c2f-46a8-84cc-985e5db78d56\" (UID: \"2f363218-3c2f-46a8-84cc-985e5db78d56\") " Oct 08 19:42:34 crc kubenswrapper[4750]: I1008 19:42:34.124567 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/2f363218-3c2f-46a8-84cc-985e5db78d56-kube-api-access-2js8m" (OuterVolumeSpecName: "kube-api-access-2js8m") pod "2f363218-3c2f-46a8-84cc-985e5db78d56" (UID: "2f363218-3c2f-46a8-84cc-985e5db78d56"). InnerVolumeSpecName "kube-api-access-2js8m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:42:34 crc kubenswrapper[4750]: I1008 19:42:34.149627 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f363218-3c2f-46a8-84cc-985e5db78d56-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2f363218-3c2f-46a8-84cc-985e5db78d56" (UID: "2f363218-3c2f-46a8-84cc-985e5db78d56"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:42:34 crc kubenswrapper[4750]: I1008 19:42:34.163121 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f363218-3c2f-46a8-84cc-985e5db78d56-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2f363218-3c2f-46a8-84cc-985e5db78d56" (UID: "2f363218-3c2f-46a8-84cc-985e5db78d56"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:42:34 crc kubenswrapper[4750]: I1008 19:42:34.171241 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f363218-3c2f-46a8-84cc-985e5db78d56-config" (OuterVolumeSpecName: "config") pod "2f363218-3c2f-46a8-84cc-985e5db78d56" (UID: "2f363218-3c2f-46a8-84cc-985e5db78d56"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:42:34 crc kubenswrapper[4750]: I1008 19:42:34.172073 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f363218-3c2f-46a8-84cc-985e5db78d56-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2f363218-3c2f-46a8-84cc-985e5db78d56" (UID: "2f363218-3c2f-46a8-84cc-985e5db78d56"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:42:34 crc kubenswrapper[4750]: I1008 19:42:34.209246 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2js8m\" (UniqueName: \"kubernetes.io/projected/2f363218-3c2f-46a8-84cc-985e5db78d56-kube-api-access-2js8m\") on node \"crc\" DevicePath \"\"" Oct 08 19:42:34 crc kubenswrapper[4750]: I1008 19:42:34.209291 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f363218-3c2f-46a8-84cc-985e5db78d56-config\") on node \"crc\" DevicePath \"\"" Oct 08 19:42:34 crc kubenswrapper[4750]: I1008 19:42:34.209301 4750 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f363218-3c2f-46a8-84cc-985e5db78d56-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 19:42:34 crc kubenswrapper[4750]: I1008 19:42:34.209312 4750 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f363218-3c2f-46a8-84cc-985e5db78d56-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 19:42:34 crc kubenswrapper[4750]: I1008 19:42:34.209321 4750 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f363218-3c2f-46a8-84cc-985e5db78d56-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 19:42:34 crc kubenswrapper[4750]: I1008 19:42:34.509778 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77f55cbcff-gwv8t" event={"ID":"2f363218-3c2f-46a8-84cc-985e5db78d56","Type":"ContainerDied","Data":"5847ed4245d97e1690df9f5d66523882c8e26934f589df0ed1c45fd68f5f10ae"} Oct 08 19:42:34 crc kubenswrapper[4750]: I1008 19:42:34.509869 4750 scope.go:117] "RemoveContainer" containerID="ccab34ae3e95c9a57ecf972c84073ab27ab5f560b07f7235b59fce97d22f10a9" Oct 08 19:42:34 crc kubenswrapper[4750]: I1008 19:42:34.509907 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77f55cbcff-gwv8t" Oct 08 19:42:34 crc kubenswrapper[4750]: I1008 19:42:34.547672 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77f55cbcff-gwv8t"] Oct 08 19:42:34 crc kubenswrapper[4750]: I1008 19:42:34.547781 4750 scope.go:117] "RemoveContainer" containerID="3378856a0503ed5a92bfcede45ce6e23a7d3cb5780b394041d241c8e585d742a" Oct 08 19:42:34 crc kubenswrapper[4750]: I1008 19:42:34.558506 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77f55cbcff-gwv8t"] Oct 08 19:42:34 crc kubenswrapper[4750]: I1008 19:42:34.771747 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f363218-3c2f-46a8-84cc-985e5db78d56" path="/var/lib/kubelet/pods/2f363218-3c2f-46a8-84cc-985e5db78d56/volumes" Oct 08 19:42:35 crc kubenswrapper[4750]: I1008 19:42:35.032868 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-56766f6486-vnhn9" Oct 08 19:42:35 crc kubenswrapper[4750]: I1008 19:42:35.212114 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-56766f6486-vnhn9" Oct 08 19:42:39 crc kubenswrapper[4750]: I1008 19:42:39.773113 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-l297w"] Oct 08 19:42:39 crc kubenswrapper[4750]: E1008 19:42:39.776485 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f363218-3c2f-46a8-84cc-985e5db78d56" containerName="init" Oct 08 19:42:39 crc kubenswrapper[4750]: I1008 19:42:39.776508 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f363218-3c2f-46a8-84cc-985e5db78d56" containerName="init" Oct 08 19:42:39 crc kubenswrapper[4750]: E1008 19:42:39.776573 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f363218-3c2f-46a8-84cc-985e5db78d56" containerName="dnsmasq-dns" Oct 08 19:42:39 crc kubenswrapper[4750]: I1008 
19:42:39.776583 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f363218-3c2f-46a8-84cc-985e5db78d56" containerName="dnsmasq-dns" Oct 08 19:42:39 crc kubenswrapper[4750]: I1008 19:42:39.777034 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f363218-3c2f-46a8-84cc-985e5db78d56" containerName="dnsmasq-dns" Oct 08 19:42:39 crc kubenswrapper[4750]: I1008 19:42:39.782136 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l297w" Oct 08 19:42:39 crc kubenswrapper[4750]: I1008 19:42:39.792877 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l297w"] Oct 08 19:42:39 crc kubenswrapper[4750]: I1008 19:42:39.922123 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86cd538e-e4cb-426b-afe9-a151b52f518a-catalog-content\") pod \"certified-operators-l297w\" (UID: \"86cd538e-e4cb-426b-afe9-a151b52f518a\") " pod="openshift-marketplace/certified-operators-l297w" Oct 08 19:42:39 crc kubenswrapper[4750]: I1008 19:42:39.922802 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86cd538e-e4cb-426b-afe9-a151b52f518a-utilities\") pod \"certified-operators-l297w\" (UID: \"86cd538e-e4cb-426b-afe9-a151b52f518a\") " pod="openshift-marketplace/certified-operators-l297w" Oct 08 19:42:39 crc kubenswrapper[4750]: I1008 19:42:39.922907 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b9qs\" (UniqueName: \"kubernetes.io/projected/86cd538e-e4cb-426b-afe9-a151b52f518a-kube-api-access-7b9qs\") pod \"certified-operators-l297w\" (UID: \"86cd538e-e4cb-426b-afe9-a151b52f518a\") " pod="openshift-marketplace/certified-operators-l297w" Oct 08 19:42:40 crc kubenswrapper[4750]: 
I1008 19:42:40.024984 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86cd538e-e4cb-426b-afe9-a151b52f518a-catalog-content\") pod \"certified-operators-l297w\" (UID: \"86cd538e-e4cb-426b-afe9-a151b52f518a\") " pod="openshift-marketplace/certified-operators-l297w" Oct 08 19:42:40 crc kubenswrapper[4750]: I1008 19:42:40.025046 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86cd538e-e4cb-426b-afe9-a151b52f518a-utilities\") pod \"certified-operators-l297w\" (UID: \"86cd538e-e4cb-426b-afe9-a151b52f518a\") " pod="openshift-marketplace/certified-operators-l297w" Oct 08 19:42:40 crc kubenswrapper[4750]: I1008 19:42:40.025105 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b9qs\" (UniqueName: \"kubernetes.io/projected/86cd538e-e4cb-426b-afe9-a151b52f518a-kube-api-access-7b9qs\") pod \"certified-operators-l297w\" (UID: \"86cd538e-e4cb-426b-afe9-a151b52f518a\") " pod="openshift-marketplace/certified-operators-l297w" Oct 08 19:42:40 crc kubenswrapper[4750]: I1008 19:42:40.026104 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86cd538e-e4cb-426b-afe9-a151b52f518a-utilities\") pod \"certified-operators-l297w\" (UID: \"86cd538e-e4cb-426b-afe9-a151b52f518a\") " pod="openshift-marketplace/certified-operators-l297w" Oct 08 19:42:40 crc kubenswrapper[4750]: I1008 19:42:40.026358 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86cd538e-e4cb-426b-afe9-a151b52f518a-catalog-content\") pod \"certified-operators-l297w\" (UID: \"86cd538e-e4cb-426b-afe9-a151b52f518a\") " pod="openshift-marketplace/certified-operators-l297w" Oct 08 19:42:40 crc kubenswrapper[4750]: I1008 19:42:40.055238 4750 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b9qs\" (UniqueName: \"kubernetes.io/projected/86cd538e-e4cb-426b-afe9-a151b52f518a-kube-api-access-7b9qs\") pod \"certified-operators-l297w\" (UID: \"86cd538e-e4cb-426b-afe9-a151b52f518a\") " pod="openshift-marketplace/certified-operators-l297w" Oct 08 19:42:40 crc kubenswrapper[4750]: I1008 19:42:40.107820 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l297w" Oct 08 19:42:40 crc kubenswrapper[4750]: I1008 19:42:40.645861 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l297w"] Oct 08 19:42:41 crc kubenswrapper[4750]: I1008 19:42:41.609527 4750 generic.go:334] "Generic (PLEG): container finished" podID="86cd538e-e4cb-426b-afe9-a151b52f518a" containerID="804cc0c7bbe4d36f999a04e18b64475429db8779846faee7fe97db3a7f35dec4" exitCode=0 Oct 08 19:42:41 crc kubenswrapper[4750]: I1008 19:42:41.609665 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l297w" event={"ID":"86cd538e-e4cb-426b-afe9-a151b52f518a","Type":"ContainerDied","Data":"804cc0c7bbe4d36f999a04e18b64475429db8779846faee7fe97db3a7f35dec4"} Oct 08 19:42:41 crc kubenswrapper[4750]: I1008 19:42:41.609999 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l297w" event={"ID":"86cd538e-e4cb-426b-afe9-a151b52f518a","Type":"ContainerStarted","Data":"14ad8c0a171524c555e22035ff17a39d7ecb4a13384d4e334192b40f7979f582"} Oct 08 19:42:41 crc kubenswrapper[4750]: I1008 19:42:41.612589 4750 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 19:42:43 crc kubenswrapper[4750]: I1008 19:42:43.633530 4750 generic.go:334] "Generic (PLEG): container finished" podID="86cd538e-e4cb-426b-afe9-a151b52f518a" 
containerID="bab5e2bb1b86b0498761c46d859b86a355019154937cc1ac17d914d79ea7b5ea" exitCode=0 Oct 08 19:42:43 crc kubenswrapper[4750]: I1008 19:42:43.634605 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l297w" event={"ID":"86cd538e-e4cb-426b-afe9-a151b52f518a","Type":"ContainerDied","Data":"bab5e2bb1b86b0498761c46d859b86a355019154937cc1ac17d914d79ea7b5ea"} Oct 08 19:42:44 crc kubenswrapper[4750]: I1008 19:42:44.648850 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l297w" event={"ID":"86cd538e-e4cb-426b-afe9-a151b52f518a","Type":"ContainerStarted","Data":"ac64997b2045bb7a5f6e6b37682f5f7e3ae30907c196242ecb0ede9ffe5d4200"} Oct 08 19:42:44 crc kubenswrapper[4750]: I1008 19:42:44.680940 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-l297w" podStartSLOduration=3.17068595 podStartE2EDuration="5.680918559s" podCreationTimestamp="2025-10-08 19:42:39 +0000 UTC" firstStartedPulling="2025-10-08 19:42:41.612269063 +0000 UTC m=+5517.525240076" lastFinishedPulling="2025-10-08 19:42:44.122501662 +0000 UTC m=+5520.035472685" observedRunningTime="2025-10-08 19:42:44.676581351 +0000 UTC m=+5520.589552384" watchObservedRunningTime="2025-10-08 19:42:44.680918559 +0000 UTC m=+5520.593889582" Oct 08 19:42:50 crc kubenswrapper[4750]: I1008 19:42:50.108734 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-l297w" Oct 08 19:42:50 crc kubenswrapper[4750]: I1008 19:42:50.109443 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-l297w" Oct 08 19:42:50 crc kubenswrapper[4750]: I1008 19:42:50.162148 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-l297w" Oct 08 19:42:50 crc kubenswrapper[4750]: I1008 
19:42:50.469283 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-rc4g4"] Oct 08 19:42:50 crc kubenswrapper[4750]: I1008 19:42:50.470911 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-rc4g4" Oct 08 19:42:50 crc kubenswrapper[4750]: I1008 19:42:50.481437 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-rc4g4"] Oct 08 19:42:50 crc kubenswrapper[4750]: I1008 19:42:50.572349 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfcqd\" (UniqueName: \"kubernetes.io/projected/c2a86550-1abc-4058-98fb-aca6386d130b-kube-api-access-bfcqd\") pod \"neutron-db-create-rc4g4\" (UID: \"c2a86550-1abc-4058-98fb-aca6386d130b\") " pod="openstack/neutron-db-create-rc4g4" Oct 08 19:42:50 crc kubenswrapper[4750]: I1008 19:42:50.674445 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfcqd\" (UniqueName: \"kubernetes.io/projected/c2a86550-1abc-4058-98fb-aca6386d130b-kube-api-access-bfcqd\") pod \"neutron-db-create-rc4g4\" (UID: \"c2a86550-1abc-4058-98fb-aca6386d130b\") " pod="openstack/neutron-db-create-rc4g4" Oct 08 19:42:50 crc kubenswrapper[4750]: I1008 19:42:50.702537 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfcqd\" (UniqueName: \"kubernetes.io/projected/c2a86550-1abc-4058-98fb-aca6386d130b-kube-api-access-bfcqd\") pod \"neutron-db-create-rc4g4\" (UID: \"c2a86550-1abc-4058-98fb-aca6386d130b\") " pod="openstack/neutron-db-create-rc4g4" Oct 08 19:42:50 crc kubenswrapper[4750]: I1008 19:42:50.783763 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-l297w" Oct 08 19:42:50 crc kubenswrapper[4750]: I1008 19:42:50.799642 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-rc4g4" Oct 08 19:42:51 crc kubenswrapper[4750]: I1008 19:42:51.116289 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-rc4g4"] Oct 08 19:42:51 crc kubenswrapper[4750]: I1008 19:42:51.605574 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l297w"] Oct 08 19:42:51 crc kubenswrapper[4750]: I1008 19:42:51.736505 4750 generic.go:334] "Generic (PLEG): container finished" podID="c2a86550-1abc-4058-98fb-aca6386d130b" containerID="08772aaace6893b2ec900936b3a724a48803643bc4eca66c8a2d1539158254b1" exitCode=0 Oct 08 19:42:51 crc kubenswrapper[4750]: I1008 19:42:51.736753 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-rc4g4" event={"ID":"c2a86550-1abc-4058-98fb-aca6386d130b","Type":"ContainerDied","Data":"08772aaace6893b2ec900936b3a724a48803643bc4eca66c8a2d1539158254b1"} Oct 08 19:42:51 crc kubenswrapper[4750]: I1008 19:42:51.736836 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-rc4g4" event={"ID":"c2a86550-1abc-4058-98fb-aca6386d130b","Type":"ContainerStarted","Data":"77467fe2b0892f2eb282076c652a486328bd2a89028148272ff54f040778d78d"} Oct 08 19:42:52 crc kubenswrapper[4750]: I1008 19:42:52.748042 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-l297w" podUID="86cd538e-e4cb-426b-afe9-a151b52f518a" containerName="registry-server" containerID="cri-o://ac64997b2045bb7a5f6e6b37682f5f7e3ae30907c196242ecb0ede9ffe5d4200" gracePeriod=2 Oct 08 19:42:53 crc kubenswrapper[4750]: I1008 19:42:53.216934 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-rc4g4" Oct 08 19:42:53 crc kubenswrapper[4750]: I1008 19:42:53.227267 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l297w" Oct 08 19:42:53 crc kubenswrapper[4750]: I1008 19:42:53.239223 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86cd538e-e4cb-426b-afe9-a151b52f518a-catalog-content\") pod \"86cd538e-e4cb-426b-afe9-a151b52f518a\" (UID: \"86cd538e-e4cb-426b-afe9-a151b52f518a\") " Oct 08 19:42:53 crc kubenswrapper[4750]: I1008 19:42:53.239387 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7b9qs\" (UniqueName: \"kubernetes.io/projected/86cd538e-e4cb-426b-afe9-a151b52f518a-kube-api-access-7b9qs\") pod \"86cd538e-e4cb-426b-afe9-a151b52f518a\" (UID: \"86cd538e-e4cb-426b-afe9-a151b52f518a\") " Oct 08 19:42:53 crc kubenswrapper[4750]: I1008 19:42:53.239518 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfcqd\" (UniqueName: \"kubernetes.io/projected/c2a86550-1abc-4058-98fb-aca6386d130b-kube-api-access-bfcqd\") pod \"c2a86550-1abc-4058-98fb-aca6386d130b\" (UID: \"c2a86550-1abc-4058-98fb-aca6386d130b\") " Oct 08 19:42:53 crc kubenswrapper[4750]: I1008 19:42:53.240036 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86cd538e-e4cb-426b-afe9-a151b52f518a-utilities\") pod \"86cd538e-e4cb-426b-afe9-a151b52f518a\" (UID: \"86cd538e-e4cb-426b-afe9-a151b52f518a\") " Oct 08 19:42:53 crc kubenswrapper[4750]: I1008 19:42:53.242687 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86cd538e-e4cb-426b-afe9-a151b52f518a-utilities" (OuterVolumeSpecName: "utilities") pod "86cd538e-e4cb-426b-afe9-a151b52f518a" (UID: "86cd538e-e4cb-426b-afe9-a151b52f518a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:42:53 crc kubenswrapper[4750]: I1008 19:42:53.248964 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2a86550-1abc-4058-98fb-aca6386d130b-kube-api-access-bfcqd" (OuterVolumeSpecName: "kube-api-access-bfcqd") pod "c2a86550-1abc-4058-98fb-aca6386d130b" (UID: "c2a86550-1abc-4058-98fb-aca6386d130b"). InnerVolumeSpecName "kube-api-access-bfcqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:42:53 crc kubenswrapper[4750]: I1008 19:42:53.259424 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86cd538e-e4cb-426b-afe9-a151b52f518a-kube-api-access-7b9qs" (OuterVolumeSpecName: "kube-api-access-7b9qs") pod "86cd538e-e4cb-426b-afe9-a151b52f518a" (UID: "86cd538e-e4cb-426b-afe9-a151b52f518a"). InnerVolumeSpecName "kube-api-access-7b9qs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:42:53 crc kubenswrapper[4750]: I1008 19:42:53.343018 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86cd538e-e4cb-426b-afe9-a151b52f518a-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 19:42:53 crc kubenswrapper[4750]: I1008 19:42:53.343335 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7b9qs\" (UniqueName: \"kubernetes.io/projected/86cd538e-e4cb-426b-afe9-a151b52f518a-kube-api-access-7b9qs\") on node \"crc\" DevicePath \"\"" Oct 08 19:42:53 crc kubenswrapper[4750]: I1008 19:42:53.343451 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfcqd\" (UniqueName: \"kubernetes.io/projected/c2a86550-1abc-4058-98fb-aca6386d130b-kube-api-access-bfcqd\") on node \"crc\" DevicePath \"\"" Oct 08 19:42:53 crc kubenswrapper[4750]: I1008 19:42:53.639993 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/86cd538e-e4cb-426b-afe9-a151b52f518a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "86cd538e-e4cb-426b-afe9-a151b52f518a" (UID: "86cd538e-e4cb-426b-afe9-a151b52f518a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:42:53 crc kubenswrapper[4750]: I1008 19:42:53.649186 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86cd538e-e4cb-426b-afe9-a151b52f518a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 19:42:53 crc kubenswrapper[4750]: I1008 19:42:53.762754 4750 generic.go:334] "Generic (PLEG): container finished" podID="86cd538e-e4cb-426b-afe9-a151b52f518a" containerID="ac64997b2045bb7a5f6e6b37682f5f7e3ae30907c196242ecb0ede9ffe5d4200" exitCode=0 Oct 08 19:42:53 crc kubenswrapper[4750]: I1008 19:42:53.762853 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l297w" Oct 08 19:42:53 crc kubenswrapper[4750]: I1008 19:42:53.762866 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l297w" event={"ID":"86cd538e-e4cb-426b-afe9-a151b52f518a","Type":"ContainerDied","Data":"ac64997b2045bb7a5f6e6b37682f5f7e3ae30907c196242ecb0ede9ffe5d4200"} Oct 08 19:42:53 crc kubenswrapper[4750]: I1008 19:42:53.763983 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l297w" event={"ID":"86cd538e-e4cb-426b-afe9-a151b52f518a","Type":"ContainerDied","Data":"14ad8c0a171524c555e22035ff17a39d7ecb4a13384d4e334192b40f7979f582"} Oct 08 19:42:53 crc kubenswrapper[4750]: I1008 19:42:53.764054 4750 scope.go:117] "RemoveContainer" containerID="ac64997b2045bb7a5f6e6b37682f5f7e3ae30907c196242ecb0ede9ffe5d4200" Oct 08 19:42:53 crc kubenswrapper[4750]: I1008 19:42:53.768136 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-db-create-rc4g4" event={"ID":"c2a86550-1abc-4058-98fb-aca6386d130b","Type":"ContainerDied","Data":"77467fe2b0892f2eb282076c652a486328bd2a89028148272ff54f040778d78d"} Oct 08 19:42:53 crc kubenswrapper[4750]: I1008 19:42:53.768276 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77467fe2b0892f2eb282076c652a486328bd2a89028148272ff54f040778d78d" Oct 08 19:42:53 crc kubenswrapper[4750]: I1008 19:42:53.768410 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-rc4g4" Oct 08 19:42:53 crc kubenswrapper[4750]: I1008 19:42:53.795608 4750 scope.go:117] "RemoveContainer" containerID="bab5e2bb1b86b0498761c46d859b86a355019154937cc1ac17d914d79ea7b5ea" Oct 08 19:42:53 crc kubenswrapper[4750]: I1008 19:42:53.806649 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l297w"] Oct 08 19:42:53 crc kubenswrapper[4750]: I1008 19:42:53.814483 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-l297w"] Oct 08 19:42:53 crc kubenswrapper[4750]: I1008 19:42:53.830375 4750 scope.go:117] "RemoveContainer" containerID="804cc0c7bbe4d36f999a04e18b64475429db8779846faee7fe97db3a7f35dec4" Oct 08 19:42:53 crc kubenswrapper[4750]: I1008 19:42:53.850023 4750 scope.go:117] "RemoveContainer" containerID="ac64997b2045bb7a5f6e6b37682f5f7e3ae30907c196242ecb0ede9ffe5d4200" Oct 08 19:42:53 crc kubenswrapper[4750]: E1008 19:42:53.850498 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac64997b2045bb7a5f6e6b37682f5f7e3ae30907c196242ecb0ede9ffe5d4200\": container with ID starting with ac64997b2045bb7a5f6e6b37682f5f7e3ae30907c196242ecb0ede9ffe5d4200 not found: ID does not exist" containerID="ac64997b2045bb7a5f6e6b37682f5f7e3ae30907c196242ecb0ede9ffe5d4200" Oct 08 19:42:53 crc kubenswrapper[4750]: I1008 
19:42:53.850534 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac64997b2045bb7a5f6e6b37682f5f7e3ae30907c196242ecb0ede9ffe5d4200"} err="failed to get container status \"ac64997b2045bb7a5f6e6b37682f5f7e3ae30907c196242ecb0ede9ffe5d4200\": rpc error: code = NotFound desc = could not find container \"ac64997b2045bb7a5f6e6b37682f5f7e3ae30907c196242ecb0ede9ffe5d4200\": container with ID starting with ac64997b2045bb7a5f6e6b37682f5f7e3ae30907c196242ecb0ede9ffe5d4200 not found: ID does not exist" Oct 08 19:42:53 crc kubenswrapper[4750]: I1008 19:42:53.850572 4750 scope.go:117] "RemoveContainer" containerID="bab5e2bb1b86b0498761c46d859b86a355019154937cc1ac17d914d79ea7b5ea" Oct 08 19:42:53 crc kubenswrapper[4750]: E1008 19:42:53.850968 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bab5e2bb1b86b0498761c46d859b86a355019154937cc1ac17d914d79ea7b5ea\": container with ID starting with bab5e2bb1b86b0498761c46d859b86a355019154937cc1ac17d914d79ea7b5ea not found: ID does not exist" containerID="bab5e2bb1b86b0498761c46d859b86a355019154937cc1ac17d914d79ea7b5ea" Oct 08 19:42:53 crc kubenswrapper[4750]: I1008 19:42:53.851081 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bab5e2bb1b86b0498761c46d859b86a355019154937cc1ac17d914d79ea7b5ea"} err="failed to get container status \"bab5e2bb1b86b0498761c46d859b86a355019154937cc1ac17d914d79ea7b5ea\": rpc error: code = NotFound desc = could not find container \"bab5e2bb1b86b0498761c46d859b86a355019154937cc1ac17d914d79ea7b5ea\": container with ID starting with bab5e2bb1b86b0498761c46d859b86a355019154937cc1ac17d914d79ea7b5ea not found: ID does not exist" Oct 08 19:42:53 crc kubenswrapper[4750]: I1008 19:42:53.851171 4750 scope.go:117] "RemoveContainer" containerID="804cc0c7bbe4d36f999a04e18b64475429db8779846faee7fe97db3a7f35dec4" Oct 08 19:42:53 crc 
kubenswrapper[4750]: E1008 19:42:53.851675 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"804cc0c7bbe4d36f999a04e18b64475429db8779846faee7fe97db3a7f35dec4\": container with ID starting with 804cc0c7bbe4d36f999a04e18b64475429db8779846faee7fe97db3a7f35dec4 not found: ID does not exist" containerID="804cc0c7bbe4d36f999a04e18b64475429db8779846faee7fe97db3a7f35dec4" Oct 08 19:42:53 crc kubenswrapper[4750]: I1008 19:42:53.851787 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"804cc0c7bbe4d36f999a04e18b64475429db8779846faee7fe97db3a7f35dec4"} err="failed to get container status \"804cc0c7bbe4d36f999a04e18b64475429db8779846faee7fe97db3a7f35dec4\": rpc error: code = NotFound desc = could not find container \"804cc0c7bbe4d36f999a04e18b64475429db8779846faee7fe97db3a7f35dec4\": container with ID starting with 804cc0c7bbe4d36f999a04e18b64475429db8779846faee7fe97db3a7f35dec4 not found: ID does not exist" Oct 08 19:42:54 crc kubenswrapper[4750]: I1008 19:42:54.758627 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86cd538e-e4cb-426b-afe9-a151b52f518a" path="/var/lib/kubelet/pods/86cd538e-e4cb-426b-afe9-a151b52f518a/volumes" Oct 08 19:43:00 crc kubenswrapper[4750]: I1008 19:43:00.644392 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-8c11-account-create-d29w4"] Oct 08 19:43:00 crc kubenswrapper[4750]: E1008 19:43:00.645615 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86cd538e-e4cb-426b-afe9-a151b52f518a" containerName="registry-server" Oct 08 19:43:00 crc kubenswrapper[4750]: I1008 19:43:00.645643 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="86cd538e-e4cb-426b-afe9-a151b52f518a" containerName="registry-server" Oct 08 19:43:00 crc kubenswrapper[4750]: E1008 19:43:00.645668 4750 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="86cd538e-e4cb-426b-afe9-a151b52f518a" containerName="extract-content" Oct 08 19:43:00 crc kubenswrapper[4750]: I1008 19:43:00.645683 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="86cd538e-e4cb-426b-afe9-a151b52f518a" containerName="extract-content" Oct 08 19:43:00 crc kubenswrapper[4750]: E1008 19:43:00.645706 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86cd538e-e4cb-426b-afe9-a151b52f518a" containerName="extract-utilities" Oct 08 19:43:00 crc kubenswrapper[4750]: I1008 19:43:00.645719 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="86cd538e-e4cb-426b-afe9-a151b52f518a" containerName="extract-utilities" Oct 08 19:43:00 crc kubenswrapper[4750]: E1008 19:43:00.645759 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2a86550-1abc-4058-98fb-aca6386d130b" containerName="mariadb-database-create" Oct 08 19:43:00 crc kubenswrapper[4750]: I1008 19:43:00.645772 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2a86550-1abc-4058-98fb-aca6386d130b" containerName="mariadb-database-create" Oct 08 19:43:00 crc kubenswrapper[4750]: I1008 19:43:00.646083 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2a86550-1abc-4058-98fb-aca6386d130b" containerName="mariadb-database-create" Oct 08 19:43:00 crc kubenswrapper[4750]: I1008 19:43:00.646136 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="86cd538e-e4cb-426b-afe9-a151b52f518a" containerName="registry-server" Oct 08 19:43:00 crc kubenswrapper[4750]: I1008 19:43:00.647117 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-8c11-account-create-d29w4" Oct 08 19:43:00 crc kubenswrapper[4750]: I1008 19:43:00.654002 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 08 19:43:00 crc kubenswrapper[4750]: I1008 19:43:00.661923 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8c11-account-create-d29w4"] Oct 08 19:43:00 crc kubenswrapper[4750]: I1008 19:43:00.700318 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsjph\" (UniqueName: \"kubernetes.io/projected/cfc3046e-6de5-43ad-af62-252fa56a5c01-kube-api-access-jsjph\") pod \"neutron-8c11-account-create-d29w4\" (UID: \"cfc3046e-6de5-43ad-af62-252fa56a5c01\") " pod="openstack/neutron-8c11-account-create-d29w4" Oct 08 19:43:00 crc kubenswrapper[4750]: I1008 19:43:00.804736 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsjph\" (UniqueName: \"kubernetes.io/projected/cfc3046e-6de5-43ad-af62-252fa56a5c01-kube-api-access-jsjph\") pod \"neutron-8c11-account-create-d29w4\" (UID: \"cfc3046e-6de5-43ad-af62-252fa56a5c01\") " pod="openstack/neutron-8c11-account-create-d29w4" Oct 08 19:43:00 crc kubenswrapper[4750]: I1008 19:43:00.856654 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsjph\" (UniqueName: \"kubernetes.io/projected/cfc3046e-6de5-43ad-af62-252fa56a5c01-kube-api-access-jsjph\") pod \"neutron-8c11-account-create-d29w4\" (UID: \"cfc3046e-6de5-43ad-af62-252fa56a5c01\") " pod="openstack/neutron-8c11-account-create-d29w4" Oct 08 19:43:00 crc kubenswrapper[4750]: I1008 19:43:00.986014 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-8c11-account-create-d29w4" Oct 08 19:43:01 crc kubenswrapper[4750]: I1008 19:43:01.501156 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8c11-account-create-d29w4"] Oct 08 19:43:01 crc kubenswrapper[4750]: I1008 19:43:01.858477 4750 generic.go:334] "Generic (PLEG): container finished" podID="cfc3046e-6de5-43ad-af62-252fa56a5c01" containerID="11951490ed43f279736dc0e8be8a35da0371c45b7c7133ae81d656fe4f399c8b" exitCode=0 Oct 08 19:43:01 crc kubenswrapper[4750]: I1008 19:43:01.858612 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8c11-account-create-d29w4" event={"ID":"cfc3046e-6de5-43ad-af62-252fa56a5c01","Type":"ContainerDied","Data":"11951490ed43f279736dc0e8be8a35da0371c45b7c7133ae81d656fe4f399c8b"} Oct 08 19:43:01 crc kubenswrapper[4750]: I1008 19:43:01.860060 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8c11-account-create-d29w4" event={"ID":"cfc3046e-6de5-43ad-af62-252fa56a5c01","Type":"ContainerStarted","Data":"8abc22f4c7d60990839ad5f9ff20958ab80f23843e796d4b6302560ccb5e70ee"} Oct 08 19:43:03 crc kubenswrapper[4750]: I1008 19:43:03.209441 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-8c11-account-create-d29w4" Oct 08 19:43:03 crc kubenswrapper[4750]: I1008 19:43:03.359155 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsjph\" (UniqueName: \"kubernetes.io/projected/cfc3046e-6de5-43ad-af62-252fa56a5c01-kube-api-access-jsjph\") pod \"cfc3046e-6de5-43ad-af62-252fa56a5c01\" (UID: \"cfc3046e-6de5-43ad-af62-252fa56a5c01\") " Oct 08 19:43:03 crc kubenswrapper[4750]: I1008 19:43:03.368137 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfc3046e-6de5-43ad-af62-252fa56a5c01-kube-api-access-jsjph" (OuterVolumeSpecName: "kube-api-access-jsjph") pod "cfc3046e-6de5-43ad-af62-252fa56a5c01" (UID: "cfc3046e-6de5-43ad-af62-252fa56a5c01"). InnerVolumeSpecName "kube-api-access-jsjph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:43:03 crc kubenswrapper[4750]: I1008 19:43:03.462086 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsjph\" (UniqueName: \"kubernetes.io/projected/cfc3046e-6de5-43ad-af62-252fa56a5c01-kube-api-access-jsjph\") on node \"crc\" DevicePath \"\"" Oct 08 19:43:03 crc kubenswrapper[4750]: I1008 19:43:03.887184 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8c11-account-create-d29w4" event={"ID":"cfc3046e-6de5-43ad-af62-252fa56a5c01","Type":"ContainerDied","Data":"8abc22f4c7d60990839ad5f9ff20958ab80f23843e796d4b6302560ccb5e70ee"} Oct 08 19:43:03 crc kubenswrapper[4750]: I1008 19:43:03.887284 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8abc22f4c7d60990839ad5f9ff20958ab80f23843e796d4b6302560ccb5e70ee" Oct 08 19:43:03 crc kubenswrapper[4750]: I1008 19:43:03.887205 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-8c11-account-create-d29w4" Oct 08 19:43:05 crc kubenswrapper[4750]: I1008 19:43:05.770090 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-x7x4q"] Oct 08 19:43:05 crc kubenswrapper[4750]: E1008 19:43:05.771413 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfc3046e-6de5-43ad-af62-252fa56a5c01" containerName="mariadb-account-create" Oct 08 19:43:05 crc kubenswrapper[4750]: I1008 19:43:05.771435 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfc3046e-6de5-43ad-af62-252fa56a5c01" containerName="mariadb-account-create" Oct 08 19:43:05 crc kubenswrapper[4750]: I1008 19:43:05.772115 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfc3046e-6de5-43ad-af62-252fa56a5c01" containerName="mariadb-account-create" Oct 08 19:43:05 crc kubenswrapper[4750]: I1008 19:43:05.773129 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-x7x4q" Oct 08 19:43:05 crc kubenswrapper[4750]: I1008 19:43:05.778162 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-7xlxc" Oct 08 19:43:05 crc kubenswrapper[4750]: I1008 19:43:05.778899 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 08 19:43:05 crc kubenswrapper[4750]: I1008 19:43:05.779078 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-x7x4q"] Oct 08 19:43:05 crc kubenswrapper[4750]: I1008 19:43:05.781864 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 08 19:43:05 crc kubenswrapper[4750]: I1008 19:43:05.913904 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e6a65953-f112-4073-af48-43f88cff1bb9-config\") pod \"neutron-db-sync-x7x4q\" (UID: 
\"e6a65953-f112-4073-af48-43f88cff1bb9\") " pod="openstack/neutron-db-sync-x7x4q" Oct 08 19:43:05 crc kubenswrapper[4750]: I1008 19:43:05.914014 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6a65953-f112-4073-af48-43f88cff1bb9-combined-ca-bundle\") pod \"neutron-db-sync-x7x4q\" (UID: \"e6a65953-f112-4073-af48-43f88cff1bb9\") " pod="openstack/neutron-db-sync-x7x4q" Oct 08 19:43:05 crc kubenswrapper[4750]: I1008 19:43:05.914274 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p6kf\" (UniqueName: \"kubernetes.io/projected/e6a65953-f112-4073-af48-43f88cff1bb9-kube-api-access-4p6kf\") pod \"neutron-db-sync-x7x4q\" (UID: \"e6a65953-f112-4073-af48-43f88cff1bb9\") " pod="openstack/neutron-db-sync-x7x4q" Oct 08 19:43:06 crc kubenswrapper[4750]: I1008 19:43:06.016131 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p6kf\" (UniqueName: \"kubernetes.io/projected/e6a65953-f112-4073-af48-43f88cff1bb9-kube-api-access-4p6kf\") pod \"neutron-db-sync-x7x4q\" (UID: \"e6a65953-f112-4073-af48-43f88cff1bb9\") " pod="openstack/neutron-db-sync-x7x4q" Oct 08 19:43:06 crc kubenswrapper[4750]: I1008 19:43:06.016273 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e6a65953-f112-4073-af48-43f88cff1bb9-config\") pod \"neutron-db-sync-x7x4q\" (UID: \"e6a65953-f112-4073-af48-43f88cff1bb9\") " pod="openstack/neutron-db-sync-x7x4q" Oct 08 19:43:06 crc kubenswrapper[4750]: I1008 19:43:06.016348 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6a65953-f112-4073-af48-43f88cff1bb9-combined-ca-bundle\") pod \"neutron-db-sync-x7x4q\" (UID: \"e6a65953-f112-4073-af48-43f88cff1bb9\") " 
pod="openstack/neutron-db-sync-x7x4q" Oct 08 19:43:06 crc kubenswrapper[4750]: I1008 19:43:06.025821 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e6a65953-f112-4073-af48-43f88cff1bb9-config\") pod \"neutron-db-sync-x7x4q\" (UID: \"e6a65953-f112-4073-af48-43f88cff1bb9\") " pod="openstack/neutron-db-sync-x7x4q" Oct 08 19:43:06 crc kubenswrapper[4750]: I1008 19:43:06.029278 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6a65953-f112-4073-af48-43f88cff1bb9-combined-ca-bundle\") pod \"neutron-db-sync-x7x4q\" (UID: \"e6a65953-f112-4073-af48-43f88cff1bb9\") " pod="openstack/neutron-db-sync-x7x4q" Oct 08 19:43:06 crc kubenswrapper[4750]: I1008 19:43:06.042895 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p6kf\" (UniqueName: \"kubernetes.io/projected/e6a65953-f112-4073-af48-43f88cff1bb9-kube-api-access-4p6kf\") pod \"neutron-db-sync-x7x4q\" (UID: \"e6a65953-f112-4073-af48-43f88cff1bb9\") " pod="openstack/neutron-db-sync-x7x4q" Oct 08 19:43:06 crc kubenswrapper[4750]: I1008 19:43:06.103786 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-x7x4q" Oct 08 19:43:06 crc kubenswrapper[4750]: I1008 19:43:06.641875 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-x7x4q"] Oct 08 19:43:06 crc kubenswrapper[4750]: I1008 19:43:06.924198 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-x7x4q" event={"ID":"e6a65953-f112-4073-af48-43f88cff1bb9","Type":"ContainerStarted","Data":"41a3e7e12edb562fe1e15fc62faebf51789a62d568c396396e919b3fb68d9ced"} Oct 08 19:43:06 crc kubenswrapper[4750]: I1008 19:43:06.925009 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-x7x4q" event={"ID":"e6a65953-f112-4073-af48-43f88cff1bb9","Type":"ContainerStarted","Data":"c93f781fc34cfe1ce4545e0857b8336cd7546c627cc439f67c92c2f083070328"} Oct 08 19:43:06 crc kubenswrapper[4750]: I1008 19:43:06.954669 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-x7x4q" podStartSLOduration=1.954639649 podStartE2EDuration="1.954639649s" podCreationTimestamp="2025-10-08 19:43:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:43:06.942354742 +0000 UTC m=+5542.855325795" watchObservedRunningTime="2025-10-08 19:43:06.954639649 +0000 UTC m=+5542.867610672" Oct 08 19:43:12 crc kubenswrapper[4750]: I1008 19:43:12.010654 4750 generic.go:334] "Generic (PLEG): container finished" podID="e6a65953-f112-4073-af48-43f88cff1bb9" containerID="41a3e7e12edb562fe1e15fc62faebf51789a62d568c396396e919b3fb68d9ced" exitCode=0 Oct 08 19:43:12 crc kubenswrapper[4750]: I1008 19:43:12.010747 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-x7x4q" event={"ID":"e6a65953-f112-4073-af48-43f88cff1bb9","Type":"ContainerDied","Data":"41a3e7e12edb562fe1e15fc62faebf51789a62d568c396396e919b3fb68d9ced"} Oct 08 19:43:13 crc 
kubenswrapper[4750]: I1008 19:43:13.441251 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-x7x4q" Oct 08 19:43:13 crc kubenswrapper[4750]: I1008 19:43:13.575295 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4p6kf\" (UniqueName: \"kubernetes.io/projected/e6a65953-f112-4073-af48-43f88cff1bb9-kube-api-access-4p6kf\") pod \"e6a65953-f112-4073-af48-43f88cff1bb9\" (UID: \"e6a65953-f112-4073-af48-43f88cff1bb9\") " Oct 08 19:43:13 crc kubenswrapper[4750]: I1008 19:43:13.575415 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6a65953-f112-4073-af48-43f88cff1bb9-combined-ca-bundle\") pod \"e6a65953-f112-4073-af48-43f88cff1bb9\" (UID: \"e6a65953-f112-4073-af48-43f88cff1bb9\") " Oct 08 19:43:13 crc kubenswrapper[4750]: I1008 19:43:13.575795 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e6a65953-f112-4073-af48-43f88cff1bb9-config\") pod \"e6a65953-f112-4073-af48-43f88cff1bb9\" (UID: \"e6a65953-f112-4073-af48-43f88cff1bb9\") " Oct 08 19:43:13 crc kubenswrapper[4750]: I1008 19:43:13.584989 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6a65953-f112-4073-af48-43f88cff1bb9-kube-api-access-4p6kf" (OuterVolumeSpecName: "kube-api-access-4p6kf") pod "e6a65953-f112-4073-af48-43f88cff1bb9" (UID: "e6a65953-f112-4073-af48-43f88cff1bb9"). InnerVolumeSpecName "kube-api-access-4p6kf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:43:13 crc kubenswrapper[4750]: I1008 19:43:13.604995 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6a65953-f112-4073-af48-43f88cff1bb9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e6a65953-f112-4073-af48-43f88cff1bb9" (UID: "e6a65953-f112-4073-af48-43f88cff1bb9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:43:13 crc kubenswrapper[4750]: I1008 19:43:13.612714 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6a65953-f112-4073-af48-43f88cff1bb9-config" (OuterVolumeSpecName: "config") pod "e6a65953-f112-4073-af48-43f88cff1bb9" (UID: "e6a65953-f112-4073-af48-43f88cff1bb9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:43:13 crc kubenswrapper[4750]: I1008 19:43:13.680075 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e6a65953-f112-4073-af48-43f88cff1bb9-config\") on node \"crc\" DevicePath \"\"" Oct 08 19:43:13 crc kubenswrapper[4750]: I1008 19:43:13.680143 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4p6kf\" (UniqueName: \"kubernetes.io/projected/e6a65953-f112-4073-af48-43f88cff1bb9-kube-api-access-4p6kf\") on node \"crc\" DevicePath \"\"" Oct 08 19:43:13 crc kubenswrapper[4750]: I1008 19:43:13.680170 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6a65953-f112-4073-af48-43f88cff1bb9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 19:43:14 crc kubenswrapper[4750]: I1008 19:43:14.035996 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-x7x4q" 
event={"ID":"e6a65953-f112-4073-af48-43f88cff1bb9","Type":"ContainerDied","Data":"c93f781fc34cfe1ce4545e0857b8336cd7546c627cc439f67c92c2f083070328"} Oct 08 19:43:14 crc kubenswrapper[4750]: I1008 19:43:14.036479 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c93f781fc34cfe1ce4545e0857b8336cd7546c627cc439f67c92c2f083070328" Oct 08 19:43:14 crc kubenswrapper[4750]: I1008 19:43:14.036077 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-x7x4q" Oct 08 19:43:14 crc kubenswrapper[4750]: I1008 19:43:14.211003 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5dd4959d75-22kgk"] Oct 08 19:43:14 crc kubenswrapper[4750]: E1008 19:43:14.211573 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6a65953-f112-4073-af48-43f88cff1bb9" containerName="neutron-db-sync" Oct 08 19:43:14 crc kubenswrapper[4750]: I1008 19:43:14.211596 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6a65953-f112-4073-af48-43f88cff1bb9" containerName="neutron-db-sync" Oct 08 19:43:14 crc kubenswrapper[4750]: I1008 19:43:14.211875 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6a65953-f112-4073-af48-43f88cff1bb9" containerName="neutron-db-sync" Oct 08 19:43:14 crc kubenswrapper[4750]: I1008 19:43:14.217964 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5dd4959d75-22kgk" Oct 08 19:43:14 crc kubenswrapper[4750]: I1008 19:43:14.251964 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dd4959d75-22kgk"] Oct 08 19:43:14 crc kubenswrapper[4750]: I1008 19:43:14.295645 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c146308f-d2f5-4ae6-92b9-289dd99922ad-config\") pod \"dnsmasq-dns-5dd4959d75-22kgk\" (UID: \"c146308f-d2f5-4ae6-92b9-289dd99922ad\") " pod="openstack/dnsmasq-dns-5dd4959d75-22kgk" Oct 08 19:43:14 crc kubenswrapper[4750]: I1008 19:43:14.295720 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87nkp\" (UniqueName: \"kubernetes.io/projected/c146308f-d2f5-4ae6-92b9-289dd99922ad-kube-api-access-87nkp\") pod \"dnsmasq-dns-5dd4959d75-22kgk\" (UID: \"c146308f-d2f5-4ae6-92b9-289dd99922ad\") " pod="openstack/dnsmasq-dns-5dd4959d75-22kgk" Oct 08 19:43:14 crc kubenswrapper[4750]: I1008 19:43:14.295771 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c146308f-d2f5-4ae6-92b9-289dd99922ad-dns-svc\") pod \"dnsmasq-dns-5dd4959d75-22kgk\" (UID: \"c146308f-d2f5-4ae6-92b9-289dd99922ad\") " pod="openstack/dnsmasq-dns-5dd4959d75-22kgk" Oct 08 19:43:14 crc kubenswrapper[4750]: I1008 19:43:14.295821 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c146308f-d2f5-4ae6-92b9-289dd99922ad-ovsdbserver-nb\") pod \"dnsmasq-dns-5dd4959d75-22kgk\" (UID: \"c146308f-d2f5-4ae6-92b9-289dd99922ad\") " pod="openstack/dnsmasq-dns-5dd4959d75-22kgk" Oct 08 19:43:14 crc kubenswrapper[4750]: I1008 19:43:14.295855 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c146308f-d2f5-4ae6-92b9-289dd99922ad-ovsdbserver-sb\") pod \"dnsmasq-dns-5dd4959d75-22kgk\" (UID: \"c146308f-d2f5-4ae6-92b9-289dd99922ad\") " pod="openstack/dnsmasq-dns-5dd4959d75-22kgk" Oct 08 19:43:14 crc kubenswrapper[4750]: I1008 19:43:14.356542 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-548785f96f-fcdrq"] Oct 08 19:43:14 crc kubenswrapper[4750]: I1008 19:43:14.358372 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-548785f96f-fcdrq" Oct 08 19:43:14 crc kubenswrapper[4750]: I1008 19:43:14.361279 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-7xlxc" Oct 08 19:43:14 crc kubenswrapper[4750]: I1008 19:43:14.361540 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 08 19:43:14 crc kubenswrapper[4750]: I1008 19:43:14.363931 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 08 19:43:14 crc kubenswrapper[4750]: I1008 19:43:14.374184 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-548785f96f-fcdrq"] Oct 08 19:43:14 crc kubenswrapper[4750]: I1008 19:43:14.397918 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c146308f-d2f5-4ae6-92b9-289dd99922ad-config\") pod \"dnsmasq-dns-5dd4959d75-22kgk\" (UID: \"c146308f-d2f5-4ae6-92b9-289dd99922ad\") " pod="openstack/dnsmasq-dns-5dd4959d75-22kgk" Oct 08 19:43:14 crc kubenswrapper[4750]: I1008 19:43:14.398000 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87nkp\" (UniqueName: \"kubernetes.io/projected/c146308f-d2f5-4ae6-92b9-289dd99922ad-kube-api-access-87nkp\") pod \"dnsmasq-dns-5dd4959d75-22kgk\" (UID: \"c146308f-d2f5-4ae6-92b9-289dd99922ad\") " 
pod="openstack/dnsmasq-dns-5dd4959d75-22kgk" Oct 08 19:43:14 crc kubenswrapper[4750]: I1008 19:43:14.398040 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c146308f-d2f5-4ae6-92b9-289dd99922ad-dns-svc\") pod \"dnsmasq-dns-5dd4959d75-22kgk\" (UID: \"c146308f-d2f5-4ae6-92b9-289dd99922ad\") " pod="openstack/dnsmasq-dns-5dd4959d75-22kgk" Oct 08 19:43:14 crc kubenswrapper[4750]: I1008 19:43:14.398087 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c146308f-d2f5-4ae6-92b9-289dd99922ad-ovsdbserver-nb\") pod \"dnsmasq-dns-5dd4959d75-22kgk\" (UID: \"c146308f-d2f5-4ae6-92b9-289dd99922ad\") " pod="openstack/dnsmasq-dns-5dd4959d75-22kgk" Oct 08 19:43:14 crc kubenswrapper[4750]: I1008 19:43:14.398120 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c146308f-d2f5-4ae6-92b9-289dd99922ad-ovsdbserver-sb\") pod \"dnsmasq-dns-5dd4959d75-22kgk\" (UID: \"c146308f-d2f5-4ae6-92b9-289dd99922ad\") " pod="openstack/dnsmasq-dns-5dd4959d75-22kgk" Oct 08 19:43:14 crc kubenswrapper[4750]: I1008 19:43:14.399956 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c146308f-d2f5-4ae6-92b9-289dd99922ad-config\") pod \"dnsmasq-dns-5dd4959d75-22kgk\" (UID: \"c146308f-d2f5-4ae6-92b9-289dd99922ad\") " pod="openstack/dnsmasq-dns-5dd4959d75-22kgk" Oct 08 19:43:14 crc kubenswrapper[4750]: I1008 19:43:14.400696 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c146308f-d2f5-4ae6-92b9-289dd99922ad-dns-svc\") pod \"dnsmasq-dns-5dd4959d75-22kgk\" (UID: \"c146308f-d2f5-4ae6-92b9-289dd99922ad\") " pod="openstack/dnsmasq-dns-5dd4959d75-22kgk" Oct 08 19:43:14 crc kubenswrapper[4750]: I1008 19:43:14.400956 4750 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c146308f-d2f5-4ae6-92b9-289dd99922ad-ovsdbserver-sb\") pod \"dnsmasq-dns-5dd4959d75-22kgk\" (UID: \"c146308f-d2f5-4ae6-92b9-289dd99922ad\") " pod="openstack/dnsmasq-dns-5dd4959d75-22kgk" Oct 08 19:43:14 crc kubenswrapper[4750]: I1008 19:43:14.401399 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c146308f-d2f5-4ae6-92b9-289dd99922ad-ovsdbserver-nb\") pod \"dnsmasq-dns-5dd4959d75-22kgk\" (UID: \"c146308f-d2f5-4ae6-92b9-289dd99922ad\") " pod="openstack/dnsmasq-dns-5dd4959d75-22kgk" Oct 08 19:43:14 crc kubenswrapper[4750]: I1008 19:43:14.434998 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87nkp\" (UniqueName: \"kubernetes.io/projected/c146308f-d2f5-4ae6-92b9-289dd99922ad-kube-api-access-87nkp\") pod \"dnsmasq-dns-5dd4959d75-22kgk\" (UID: \"c146308f-d2f5-4ae6-92b9-289dd99922ad\") " pod="openstack/dnsmasq-dns-5dd4959d75-22kgk" Oct 08 19:43:14 crc kubenswrapper[4750]: I1008 19:43:14.499616 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3df4a146-e321-4c4b-86f8-dc0a22aeb45b-combined-ca-bundle\") pod \"neutron-548785f96f-fcdrq\" (UID: \"3df4a146-e321-4c4b-86f8-dc0a22aeb45b\") " pod="openstack/neutron-548785f96f-fcdrq" Oct 08 19:43:14 crc kubenswrapper[4750]: I1008 19:43:14.500844 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3df4a146-e321-4c4b-86f8-dc0a22aeb45b-config\") pod \"neutron-548785f96f-fcdrq\" (UID: \"3df4a146-e321-4c4b-86f8-dc0a22aeb45b\") " pod="openstack/neutron-548785f96f-fcdrq" Oct 08 19:43:14 crc kubenswrapper[4750]: I1008 19:43:14.500992 4750 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcd8h\" (UniqueName: \"kubernetes.io/projected/3df4a146-e321-4c4b-86f8-dc0a22aeb45b-kube-api-access-wcd8h\") pod \"neutron-548785f96f-fcdrq\" (UID: \"3df4a146-e321-4c4b-86f8-dc0a22aeb45b\") " pod="openstack/neutron-548785f96f-fcdrq" Oct 08 19:43:14 crc kubenswrapper[4750]: I1008 19:43:14.501515 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3df4a146-e321-4c4b-86f8-dc0a22aeb45b-httpd-config\") pod \"neutron-548785f96f-fcdrq\" (UID: \"3df4a146-e321-4c4b-86f8-dc0a22aeb45b\") " pod="openstack/neutron-548785f96f-fcdrq" Oct 08 19:43:14 crc kubenswrapper[4750]: I1008 19:43:14.568054 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dd4959d75-22kgk" Oct 08 19:43:14 crc kubenswrapper[4750]: I1008 19:43:14.606884 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3df4a146-e321-4c4b-86f8-dc0a22aeb45b-httpd-config\") pod \"neutron-548785f96f-fcdrq\" (UID: \"3df4a146-e321-4c4b-86f8-dc0a22aeb45b\") " pod="openstack/neutron-548785f96f-fcdrq" Oct 08 19:43:14 crc kubenswrapper[4750]: I1008 19:43:14.607195 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3df4a146-e321-4c4b-86f8-dc0a22aeb45b-combined-ca-bundle\") pod \"neutron-548785f96f-fcdrq\" (UID: \"3df4a146-e321-4c4b-86f8-dc0a22aeb45b\") " pod="openstack/neutron-548785f96f-fcdrq" Oct 08 19:43:14 crc kubenswrapper[4750]: I1008 19:43:14.607376 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcd8h\" (UniqueName: \"kubernetes.io/projected/3df4a146-e321-4c4b-86f8-dc0a22aeb45b-kube-api-access-wcd8h\") pod \"neutron-548785f96f-fcdrq\" (UID: 
\"3df4a146-e321-4c4b-86f8-dc0a22aeb45b\") " pod="openstack/neutron-548785f96f-fcdrq" Oct 08 19:43:14 crc kubenswrapper[4750]: I1008 19:43:14.607459 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3df4a146-e321-4c4b-86f8-dc0a22aeb45b-config\") pod \"neutron-548785f96f-fcdrq\" (UID: \"3df4a146-e321-4c4b-86f8-dc0a22aeb45b\") " pod="openstack/neutron-548785f96f-fcdrq" Oct 08 19:43:14 crc kubenswrapper[4750]: I1008 19:43:14.612410 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3df4a146-e321-4c4b-86f8-dc0a22aeb45b-combined-ca-bundle\") pod \"neutron-548785f96f-fcdrq\" (UID: \"3df4a146-e321-4c4b-86f8-dc0a22aeb45b\") " pod="openstack/neutron-548785f96f-fcdrq" Oct 08 19:43:14 crc kubenswrapper[4750]: I1008 19:43:14.613265 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3df4a146-e321-4c4b-86f8-dc0a22aeb45b-httpd-config\") pod \"neutron-548785f96f-fcdrq\" (UID: \"3df4a146-e321-4c4b-86f8-dc0a22aeb45b\") " pod="openstack/neutron-548785f96f-fcdrq" Oct 08 19:43:14 crc kubenswrapper[4750]: I1008 19:43:14.616501 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3df4a146-e321-4c4b-86f8-dc0a22aeb45b-config\") pod \"neutron-548785f96f-fcdrq\" (UID: \"3df4a146-e321-4c4b-86f8-dc0a22aeb45b\") " pod="openstack/neutron-548785f96f-fcdrq" Oct 08 19:43:14 crc kubenswrapper[4750]: I1008 19:43:14.643100 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcd8h\" (UniqueName: \"kubernetes.io/projected/3df4a146-e321-4c4b-86f8-dc0a22aeb45b-kube-api-access-wcd8h\") pod \"neutron-548785f96f-fcdrq\" (UID: \"3df4a146-e321-4c4b-86f8-dc0a22aeb45b\") " pod="openstack/neutron-548785f96f-fcdrq" Oct 08 19:43:14 crc kubenswrapper[4750]: I1008 19:43:14.684520 4750 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-548785f96f-fcdrq" Oct 08 19:43:15 crc kubenswrapper[4750]: I1008 19:43:15.078632 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dd4959d75-22kgk"] Oct 08 19:43:15 crc kubenswrapper[4750]: I1008 19:43:15.623766 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-548785f96f-fcdrq"] Oct 08 19:43:16 crc kubenswrapper[4750]: I1008 19:43:16.071283 4750 generic.go:334] "Generic (PLEG): container finished" podID="c146308f-d2f5-4ae6-92b9-289dd99922ad" containerID="116317564a520144e196ebb5c07616deaf7fdf6e10ad027224fb0f77947b76ea" exitCode=0 Oct 08 19:43:16 crc kubenswrapper[4750]: I1008 19:43:16.071352 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dd4959d75-22kgk" event={"ID":"c146308f-d2f5-4ae6-92b9-289dd99922ad","Type":"ContainerDied","Data":"116317564a520144e196ebb5c07616deaf7fdf6e10ad027224fb0f77947b76ea"} Oct 08 19:43:16 crc kubenswrapper[4750]: I1008 19:43:16.071432 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dd4959d75-22kgk" event={"ID":"c146308f-d2f5-4ae6-92b9-289dd99922ad","Type":"ContainerStarted","Data":"e0da70fda9afa5fccafaa4202374fbfa03c108f342c6eef60d934351f37c700e"} Oct 08 19:43:16 crc kubenswrapper[4750]: I1008 19:43:16.075220 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-548785f96f-fcdrq" event={"ID":"3df4a146-e321-4c4b-86f8-dc0a22aeb45b","Type":"ContainerStarted","Data":"20c30fbc1e5208767e2c4bc16a6af057037c7287b111504a424ed566425607b9"} Oct 08 19:43:16 crc kubenswrapper[4750]: I1008 19:43:16.075352 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-548785f96f-fcdrq" event={"ID":"3df4a146-e321-4c4b-86f8-dc0a22aeb45b","Type":"ContainerStarted","Data":"e23309f3d64397bfb2b07247c392f4f29ff947c91cc70cc030b24a8391703a74"} Oct 08 19:43:16 crc kubenswrapper[4750]: I1008 
19:43:16.075416 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-548785f96f-fcdrq" event={"ID":"3df4a146-e321-4c4b-86f8-dc0a22aeb45b","Type":"ContainerStarted","Data":"54d4cd9acc6fff97e1cb740175cdc3f03bb4d1534f08031032b81bf0f66f16fd"} Oct 08 19:43:16 crc kubenswrapper[4750]: I1008 19:43:16.075485 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-548785f96f-fcdrq" Oct 08 19:43:16 crc kubenswrapper[4750]: I1008 19:43:16.124775 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-548785f96f-fcdrq" podStartSLOduration=2.12474421 podStartE2EDuration="2.12474421s" podCreationTimestamp="2025-10-08 19:43:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:43:16.118244029 +0000 UTC m=+5552.031215052" watchObservedRunningTime="2025-10-08 19:43:16.12474421 +0000 UTC m=+5552.037715223" Oct 08 19:43:17 crc kubenswrapper[4750]: I1008 19:43:17.102643 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dd4959d75-22kgk" event={"ID":"c146308f-d2f5-4ae6-92b9-289dd99922ad","Type":"ContainerStarted","Data":"29ee4b013b9e27d657267adbbc929c79032c1184cba1c5f7bf9f7c489916a04c"} Oct 08 19:43:17 crc kubenswrapper[4750]: I1008 19:43:17.102942 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5dd4959d75-22kgk" Oct 08 19:43:17 crc kubenswrapper[4750]: I1008 19:43:17.126111 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5dd4959d75-22kgk" podStartSLOduration=3.126082224 podStartE2EDuration="3.126082224s" podCreationTimestamp="2025-10-08 19:43:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:43:17.125731866 +0000 UTC m=+5553.038702879" 
watchObservedRunningTime="2025-10-08 19:43:17.126082224 +0000 UTC m=+5553.039053237" Oct 08 19:43:24 crc kubenswrapper[4750]: I1008 19:43:24.570327 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5dd4959d75-22kgk" Oct 08 19:43:24 crc kubenswrapper[4750]: I1008 19:43:24.652042 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8cbfd88b5-wzct4"] Oct 08 19:43:24 crc kubenswrapper[4750]: I1008 19:43:24.652452 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8cbfd88b5-wzct4" podUID="4db79ea6-2794-442e-9137-255fd42641ad" containerName="dnsmasq-dns" containerID="cri-o://522b3161de3b7c218b9e4cf7c883e0256eab0b803c7415d22f4513858a819244" gracePeriod=10 Oct 08 19:43:25 crc kubenswrapper[4750]: I1008 19:43:25.189038 4750 generic.go:334] "Generic (PLEG): container finished" podID="4db79ea6-2794-442e-9137-255fd42641ad" containerID="522b3161de3b7c218b9e4cf7c883e0256eab0b803c7415d22f4513858a819244" exitCode=0 Oct 08 19:43:25 crc kubenswrapper[4750]: I1008 19:43:25.189132 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8cbfd88b5-wzct4" event={"ID":"4db79ea6-2794-442e-9137-255fd42641ad","Type":"ContainerDied","Data":"522b3161de3b7c218b9e4cf7c883e0256eab0b803c7415d22f4513858a819244"} Oct 08 19:43:25 crc kubenswrapper[4750]: I1008 19:43:25.189435 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8cbfd88b5-wzct4" event={"ID":"4db79ea6-2794-442e-9137-255fd42641ad","Type":"ContainerDied","Data":"9deecc194d9fda126c3b372e08924838e3368327953158004752aff03edd2a74"} Oct 08 19:43:25 crc kubenswrapper[4750]: I1008 19:43:25.189455 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9deecc194d9fda126c3b372e08924838e3368327953158004752aff03edd2a74" Oct 08 19:43:25 crc kubenswrapper[4750]: I1008 19:43:25.217428 4750 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-8cbfd88b5-wzct4" Oct 08 19:43:25 crc kubenswrapper[4750]: I1008 19:43:25.245256 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6tv8\" (UniqueName: \"kubernetes.io/projected/4db79ea6-2794-442e-9137-255fd42641ad-kube-api-access-x6tv8\") pod \"4db79ea6-2794-442e-9137-255fd42641ad\" (UID: \"4db79ea6-2794-442e-9137-255fd42641ad\") " Oct 08 19:43:25 crc kubenswrapper[4750]: I1008 19:43:25.245339 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4db79ea6-2794-442e-9137-255fd42641ad-dns-svc\") pod \"4db79ea6-2794-442e-9137-255fd42641ad\" (UID: \"4db79ea6-2794-442e-9137-255fd42641ad\") " Oct 08 19:43:25 crc kubenswrapper[4750]: I1008 19:43:25.245408 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4db79ea6-2794-442e-9137-255fd42641ad-ovsdbserver-nb\") pod \"4db79ea6-2794-442e-9137-255fd42641ad\" (UID: \"4db79ea6-2794-442e-9137-255fd42641ad\") " Oct 08 19:43:25 crc kubenswrapper[4750]: I1008 19:43:25.245491 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4db79ea6-2794-442e-9137-255fd42641ad-config\") pod \"4db79ea6-2794-442e-9137-255fd42641ad\" (UID: \"4db79ea6-2794-442e-9137-255fd42641ad\") " Oct 08 19:43:25 crc kubenswrapper[4750]: I1008 19:43:25.245527 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4db79ea6-2794-442e-9137-255fd42641ad-ovsdbserver-sb\") pod \"4db79ea6-2794-442e-9137-255fd42641ad\" (UID: \"4db79ea6-2794-442e-9137-255fd42641ad\") " Oct 08 19:43:25 crc kubenswrapper[4750]: I1008 19:43:25.258911 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/4db79ea6-2794-442e-9137-255fd42641ad-kube-api-access-x6tv8" (OuterVolumeSpecName: "kube-api-access-x6tv8") pod "4db79ea6-2794-442e-9137-255fd42641ad" (UID: "4db79ea6-2794-442e-9137-255fd42641ad"). InnerVolumeSpecName "kube-api-access-x6tv8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:43:25 crc kubenswrapper[4750]: I1008 19:43:25.315446 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4db79ea6-2794-442e-9137-255fd42641ad-config" (OuterVolumeSpecName: "config") pod "4db79ea6-2794-442e-9137-255fd42641ad" (UID: "4db79ea6-2794-442e-9137-255fd42641ad"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:43:25 crc kubenswrapper[4750]: I1008 19:43:25.332183 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4db79ea6-2794-442e-9137-255fd42641ad-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4db79ea6-2794-442e-9137-255fd42641ad" (UID: "4db79ea6-2794-442e-9137-255fd42641ad"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:43:25 crc kubenswrapper[4750]: I1008 19:43:25.343779 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4db79ea6-2794-442e-9137-255fd42641ad-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4db79ea6-2794-442e-9137-255fd42641ad" (UID: "4db79ea6-2794-442e-9137-255fd42641ad"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:43:25 crc kubenswrapper[4750]: I1008 19:43:25.344497 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4db79ea6-2794-442e-9137-255fd42641ad-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4db79ea6-2794-442e-9137-255fd42641ad" (UID: "4db79ea6-2794-442e-9137-255fd42641ad"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:43:25 crc kubenswrapper[4750]: I1008 19:43:25.349304 4750 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4db79ea6-2794-442e-9137-255fd42641ad-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 19:43:25 crc kubenswrapper[4750]: I1008 19:43:25.349346 4750 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4db79ea6-2794-442e-9137-255fd42641ad-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 19:43:25 crc kubenswrapper[4750]: I1008 19:43:25.349361 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4db79ea6-2794-442e-9137-255fd42641ad-config\") on node \"crc\" DevicePath \"\"" Oct 08 19:43:25 crc kubenswrapper[4750]: I1008 19:43:25.349370 4750 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4db79ea6-2794-442e-9137-255fd42641ad-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 19:43:25 crc kubenswrapper[4750]: I1008 19:43:25.349380 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6tv8\" (UniqueName: \"kubernetes.io/projected/4db79ea6-2794-442e-9137-255fd42641ad-kube-api-access-x6tv8\") on node \"crc\" DevicePath \"\"" Oct 08 19:43:26 crc kubenswrapper[4750]: I1008 19:43:26.197155 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8cbfd88b5-wzct4" Oct 08 19:43:26 crc kubenswrapper[4750]: I1008 19:43:26.236077 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8cbfd88b5-wzct4"] Oct 08 19:43:26 crc kubenswrapper[4750]: I1008 19:43:26.243269 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8cbfd88b5-wzct4"] Oct 08 19:43:26 crc kubenswrapper[4750]: I1008 19:43:26.744718 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4db79ea6-2794-442e-9137-255fd42641ad" path="/var/lib/kubelet/pods/4db79ea6-2794-442e-9137-255fd42641ad/volumes" Oct 08 19:43:29 crc kubenswrapper[4750]: I1008 19:43:29.706971 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 19:43:29 crc kubenswrapper[4750]: I1008 19:43:29.707622 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 19:43:44 crc kubenswrapper[4750]: I1008 19:43:44.703231 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-548785f96f-fcdrq" Oct 08 19:43:53 crc kubenswrapper[4750]: I1008 19:43:53.240324 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-9bnqf"] Oct 08 19:43:53 crc kubenswrapper[4750]: E1008 19:43:53.241334 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4db79ea6-2794-442e-9137-255fd42641ad" containerName="dnsmasq-dns" Oct 08 19:43:53 crc kubenswrapper[4750]: I1008 19:43:53.241354 4750 
state_mem.go:107] "Deleted CPUSet assignment" podUID="4db79ea6-2794-442e-9137-255fd42641ad" containerName="dnsmasq-dns" Oct 08 19:43:53 crc kubenswrapper[4750]: E1008 19:43:53.241377 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4db79ea6-2794-442e-9137-255fd42641ad" containerName="init" Oct 08 19:43:53 crc kubenswrapper[4750]: I1008 19:43:53.241386 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="4db79ea6-2794-442e-9137-255fd42641ad" containerName="init" Oct 08 19:43:53 crc kubenswrapper[4750]: I1008 19:43:53.241629 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="4db79ea6-2794-442e-9137-255fd42641ad" containerName="dnsmasq-dns" Oct 08 19:43:53 crc kubenswrapper[4750]: I1008 19:43:53.242331 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-9bnqf" Oct 08 19:43:53 crc kubenswrapper[4750]: I1008 19:43:53.257043 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-9bnqf"] Oct 08 19:43:53 crc kubenswrapper[4750]: I1008 19:43:53.389697 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xvp9\" (UniqueName: \"kubernetes.io/projected/860a7d97-a324-43f6-a675-13a2c7ee4189-kube-api-access-9xvp9\") pod \"glance-db-create-9bnqf\" (UID: \"860a7d97-a324-43f6-a675-13a2c7ee4189\") " pod="openstack/glance-db-create-9bnqf" Oct 08 19:43:53 crc kubenswrapper[4750]: I1008 19:43:53.491929 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xvp9\" (UniqueName: \"kubernetes.io/projected/860a7d97-a324-43f6-a675-13a2c7ee4189-kube-api-access-9xvp9\") pod \"glance-db-create-9bnqf\" (UID: \"860a7d97-a324-43f6-a675-13a2c7ee4189\") " pod="openstack/glance-db-create-9bnqf" Oct 08 19:43:53 crc kubenswrapper[4750]: I1008 19:43:53.511306 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xvp9\" 
(UniqueName: \"kubernetes.io/projected/860a7d97-a324-43f6-a675-13a2c7ee4189-kube-api-access-9xvp9\") pod \"glance-db-create-9bnqf\" (UID: \"860a7d97-a324-43f6-a675-13a2c7ee4189\") " pod="openstack/glance-db-create-9bnqf" Oct 08 19:43:53 crc kubenswrapper[4750]: I1008 19:43:53.567103 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-9bnqf" Oct 08 19:43:54 crc kubenswrapper[4750]: I1008 19:43:54.006142 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-9bnqf"] Oct 08 19:43:54 crc kubenswrapper[4750]: I1008 19:43:54.566379 4750 generic.go:334] "Generic (PLEG): container finished" podID="860a7d97-a324-43f6-a675-13a2c7ee4189" containerID="3af76fafbe6a9cd3fb5d157bfdc03a0c491293cf8e9b961d4c26a90a83e39f6a" exitCode=0 Oct 08 19:43:54 crc kubenswrapper[4750]: I1008 19:43:54.566476 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-9bnqf" event={"ID":"860a7d97-a324-43f6-a675-13a2c7ee4189","Type":"ContainerDied","Data":"3af76fafbe6a9cd3fb5d157bfdc03a0c491293cf8e9b961d4c26a90a83e39f6a"} Oct 08 19:43:54 crc kubenswrapper[4750]: I1008 19:43:54.566761 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-9bnqf" event={"ID":"860a7d97-a324-43f6-a675-13a2c7ee4189","Type":"ContainerStarted","Data":"6f4caffe2fa8c849b19d60064d12cf5b681fba51ec07cb20b03826508d440702"} Oct 08 19:43:55 crc kubenswrapper[4750]: I1008 19:43:55.931509 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-9bnqf" Oct 08 19:43:56 crc kubenswrapper[4750]: I1008 19:43:56.042483 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xvp9\" (UniqueName: \"kubernetes.io/projected/860a7d97-a324-43f6-a675-13a2c7ee4189-kube-api-access-9xvp9\") pod \"860a7d97-a324-43f6-a675-13a2c7ee4189\" (UID: \"860a7d97-a324-43f6-a675-13a2c7ee4189\") " Oct 08 19:43:56 crc kubenswrapper[4750]: I1008 19:43:56.050075 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/860a7d97-a324-43f6-a675-13a2c7ee4189-kube-api-access-9xvp9" (OuterVolumeSpecName: "kube-api-access-9xvp9") pod "860a7d97-a324-43f6-a675-13a2c7ee4189" (UID: "860a7d97-a324-43f6-a675-13a2c7ee4189"). InnerVolumeSpecName "kube-api-access-9xvp9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:43:56 crc kubenswrapper[4750]: I1008 19:43:56.145886 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xvp9\" (UniqueName: \"kubernetes.io/projected/860a7d97-a324-43f6-a675-13a2c7ee4189-kube-api-access-9xvp9\") on node \"crc\" DevicePath \"\"" Oct 08 19:43:56 crc kubenswrapper[4750]: I1008 19:43:56.615191 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-9bnqf" event={"ID":"860a7d97-a324-43f6-a675-13a2c7ee4189","Type":"ContainerDied","Data":"6f4caffe2fa8c849b19d60064d12cf5b681fba51ec07cb20b03826508d440702"} Oct 08 19:43:56 crc kubenswrapper[4750]: I1008 19:43:56.615257 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f4caffe2fa8c849b19d60064d12cf5b681fba51ec07cb20b03826508d440702" Oct 08 19:43:56 crc kubenswrapper[4750]: I1008 19:43:56.615345 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-9bnqf" Oct 08 19:43:59 crc kubenswrapper[4750]: I1008 19:43:59.707111 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 19:43:59 crc kubenswrapper[4750]: I1008 19:43:59.707615 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 19:44:03 crc kubenswrapper[4750]: I1008 19:44:03.340623 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-37e5-account-create-rgh78"] Oct 08 19:44:03 crc kubenswrapper[4750]: E1008 19:44:03.341759 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="860a7d97-a324-43f6-a675-13a2c7ee4189" containerName="mariadb-database-create" Oct 08 19:44:03 crc kubenswrapper[4750]: I1008 19:44:03.341779 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="860a7d97-a324-43f6-a675-13a2c7ee4189" containerName="mariadb-database-create" Oct 08 19:44:03 crc kubenswrapper[4750]: I1008 19:44:03.342001 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="860a7d97-a324-43f6-a675-13a2c7ee4189" containerName="mariadb-database-create" Oct 08 19:44:03 crc kubenswrapper[4750]: I1008 19:44:03.342835 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-37e5-account-create-rgh78" Oct 08 19:44:03 crc kubenswrapper[4750]: I1008 19:44:03.346382 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 08 19:44:03 crc kubenswrapper[4750]: I1008 19:44:03.360338 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-37e5-account-create-rgh78"] Oct 08 19:44:03 crc kubenswrapper[4750]: I1008 19:44:03.399337 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6r8r\" (UniqueName: \"kubernetes.io/projected/a0221949-25f4-467f-9b45-d06d2e0057c1-kube-api-access-p6r8r\") pod \"glance-37e5-account-create-rgh78\" (UID: \"a0221949-25f4-467f-9b45-d06d2e0057c1\") " pod="openstack/glance-37e5-account-create-rgh78" Oct 08 19:44:03 crc kubenswrapper[4750]: I1008 19:44:03.502255 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6r8r\" (UniqueName: \"kubernetes.io/projected/a0221949-25f4-467f-9b45-d06d2e0057c1-kube-api-access-p6r8r\") pod \"glance-37e5-account-create-rgh78\" (UID: \"a0221949-25f4-467f-9b45-d06d2e0057c1\") " pod="openstack/glance-37e5-account-create-rgh78" Oct 08 19:44:03 crc kubenswrapper[4750]: I1008 19:44:03.534707 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6r8r\" (UniqueName: \"kubernetes.io/projected/a0221949-25f4-467f-9b45-d06d2e0057c1-kube-api-access-p6r8r\") pod \"glance-37e5-account-create-rgh78\" (UID: \"a0221949-25f4-467f-9b45-d06d2e0057c1\") " pod="openstack/glance-37e5-account-create-rgh78" Oct 08 19:44:03 crc kubenswrapper[4750]: I1008 19:44:03.621873 4750 scope.go:117] "RemoveContainer" containerID="e43d5f1ca7af778e5e2dc05bfebf82183bce50911e4a8418399894f93c50aaf5" Oct 08 19:44:03 crc kubenswrapper[4750]: I1008 19:44:03.651614 4750 scope.go:117] "RemoveContainer" 
containerID="5b4ea5eeddf85fce23d62c88367bd935fdfe85e9f6f36fce3510e6002e668923" Oct 08 19:44:03 crc kubenswrapper[4750]: I1008 19:44:03.676405 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-37e5-account-create-rgh78" Oct 08 19:44:04 crc kubenswrapper[4750]: I1008 19:44:04.260757 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-37e5-account-create-rgh78"] Oct 08 19:44:04 crc kubenswrapper[4750]: I1008 19:44:04.706273 4750 generic.go:334] "Generic (PLEG): container finished" podID="a0221949-25f4-467f-9b45-d06d2e0057c1" containerID="05c29c0c6dce72fe94e80fbd924aee24534ee0f5b345fe31cae06418b982fa28" exitCode=0 Oct 08 19:44:04 crc kubenswrapper[4750]: I1008 19:44:04.706337 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-37e5-account-create-rgh78" event={"ID":"a0221949-25f4-467f-9b45-d06d2e0057c1","Type":"ContainerDied","Data":"05c29c0c6dce72fe94e80fbd924aee24534ee0f5b345fe31cae06418b982fa28"} Oct 08 19:44:04 crc kubenswrapper[4750]: I1008 19:44:04.706368 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-37e5-account-create-rgh78" event={"ID":"a0221949-25f4-467f-9b45-d06d2e0057c1","Type":"ContainerStarted","Data":"5fd33d713e1e9a58ca265ceab322d89bcaf215e8180c06803796fdd25e6c4815"} Oct 08 19:44:06 crc kubenswrapper[4750]: I1008 19:44:06.114358 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-37e5-account-create-rgh78" Oct 08 19:44:06 crc kubenswrapper[4750]: I1008 19:44:06.162522 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6r8r\" (UniqueName: \"kubernetes.io/projected/a0221949-25f4-467f-9b45-d06d2e0057c1-kube-api-access-p6r8r\") pod \"a0221949-25f4-467f-9b45-d06d2e0057c1\" (UID: \"a0221949-25f4-467f-9b45-d06d2e0057c1\") " Oct 08 19:44:06 crc kubenswrapper[4750]: I1008 19:44:06.173754 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0221949-25f4-467f-9b45-d06d2e0057c1-kube-api-access-p6r8r" (OuterVolumeSpecName: "kube-api-access-p6r8r") pod "a0221949-25f4-467f-9b45-d06d2e0057c1" (UID: "a0221949-25f4-467f-9b45-d06d2e0057c1"). InnerVolumeSpecName "kube-api-access-p6r8r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:44:06 crc kubenswrapper[4750]: I1008 19:44:06.264960 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6r8r\" (UniqueName: \"kubernetes.io/projected/a0221949-25f4-467f-9b45-d06d2e0057c1-kube-api-access-p6r8r\") on node \"crc\" DevicePath \"\"" Oct 08 19:44:06 crc kubenswrapper[4750]: I1008 19:44:06.731088 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-37e5-account-create-rgh78" event={"ID":"a0221949-25f4-467f-9b45-d06d2e0057c1","Type":"ContainerDied","Data":"5fd33d713e1e9a58ca265ceab322d89bcaf215e8180c06803796fdd25e6c4815"} Oct 08 19:44:06 crc kubenswrapper[4750]: I1008 19:44:06.731157 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5fd33d713e1e9a58ca265ceab322d89bcaf215e8180c06803796fdd25e6c4815" Oct 08 19:44:06 crc kubenswrapper[4750]: I1008 19:44:06.731172 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-37e5-account-create-rgh78" Oct 08 19:44:08 crc kubenswrapper[4750]: I1008 19:44:08.564730 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-vkmhg"] Oct 08 19:44:08 crc kubenswrapper[4750]: E1008 19:44:08.566696 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0221949-25f4-467f-9b45-d06d2e0057c1" containerName="mariadb-account-create" Oct 08 19:44:08 crc kubenswrapper[4750]: I1008 19:44:08.566780 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0221949-25f4-467f-9b45-d06d2e0057c1" containerName="mariadb-account-create" Oct 08 19:44:08 crc kubenswrapper[4750]: I1008 19:44:08.567029 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0221949-25f4-467f-9b45-d06d2e0057c1" containerName="mariadb-account-create" Oct 08 19:44:08 crc kubenswrapper[4750]: I1008 19:44:08.567795 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-vkmhg" Oct 08 19:44:08 crc kubenswrapper[4750]: I1008 19:44:08.569984 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 08 19:44:08 crc kubenswrapper[4750]: I1008 19:44:08.570678 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-wp2zx" Oct 08 19:44:08 crc kubenswrapper[4750]: I1008 19:44:08.584328 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-vkmhg"] Oct 08 19:44:08 crc kubenswrapper[4750]: I1008 19:44:08.624172 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e16b629-352a-4ef2-b318-5a342165dfef-combined-ca-bundle\") pod \"glance-db-sync-vkmhg\" (UID: \"2e16b629-352a-4ef2-b318-5a342165dfef\") " pod="openstack/glance-db-sync-vkmhg" Oct 08 19:44:08 crc kubenswrapper[4750]: I1008 19:44:08.624280 4750 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e16b629-352a-4ef2-b318-5a342165dfef-config-data\") pod \"glance-db-sync-vkmhg\" (UID: \"2e16b629-352a-4ef2-b318-5a342165dfef\") " pod="openstack/glance-db-sync-vkmhg" Oct 08 19:44:08 crc kubenswrapper[4750]: I1008 19:44:08.624438 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2e16b629-352a-4ef2-b318-5a342165dfef-db-sync-config-data\") pod \"glance-db-sync-vkmhg\" (UID: \"2e16b629-352a-4ef2-b318-5a342165dfef\") " pod="openstack/glance-db-sync-vkmhg" Oct 08 19:44:08 crc kubenswrapper[4750]: I1008 19:44:08.624540 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4v6q\" (UniqueName: \"kubernetes.io/projected/2e16b629-352a-4ef2-b318-5a342165dfef-kube-api-access-t4v6q\") pod \"glance-db-sync-vkmhg\" (UID: \"2e16b629-352a-4ef2-b318-5a342165dfef\") " pod="openstack/glance-db-sync-vkmhg" Oct 08 19:44:08 crc kubenswrapper[4750]: I1008 19:44:08.726958 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e16b629-352a-4ef2-b318-5a342165dfef-combined-ca-bundle\") pod \"glance-db-sync-vkmhg\" (UID: \"2e16b629-352a-4ef2-b318-5a342165dfef\") " pod="openstack/glance-db-sync-vkmhg" Oct 08 19:44:08 crc kubenswrapper[4750]: I1008 19:44:08.727056 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e16b629-352a-4ef2-b318-5a342165dfef-config-data\") pod \"glance-db-sync-vkmhg\" (UID: \"2e16b629-352a-4ef2-b318-5a342165dfef\") " pod="openstack/glance-db-sync-vkmhg" Oct 08 19:44:08 crc kubenswrapper[4750]: I1008 19:44:08.727104 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2e16b629-352a-4ef2-b318-5a342165dfef-db-sync-config-data\") pod \"glance-db-sync-vkmhg\" (UID: \"2e16b629-352a-4ef2-b318-5a342165dfef\") " pod="openstack/glance-db-sync-vkmhg" Oct 08 19:44:08 crc kubenswrapper[4750]: I1008 19:44:08.727133 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4v6q\" (UniqueName: \"kubernetes.io/projected/2e16b629-352a-4ef2-b318-5a342165dfef-kube-api-access-t4v6q\") pod \"glance-db-sync-vkmhg\" (UID: \"2e16b629-352a-4ef2-b318-5a342165dfef\") " pod="openstack/glance-db-sync-vkmhg" Oct 08 19:44:08 crc kubenswrapper[4750]: I1008 19:44:08.735640 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2e16b629-352a-4ef2-b318-5a342165dfef-db-sync-config-data\") pod \"glance-db-sync-vkmhg\" (UID: \"2e16b629-352a-4ef2-b318-5a342165dfef\") " pod="openstack/glance-db-sync-vkmhg" Oct 08 19:44:08 crc kubenswrapper[4750]: I1008 19:44:08.735743 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e16b629-352a-4ef2-b318-5a342165dfef-combined-ca-bundle\") pod \"glance-db-sync-vkmhg\" (UID: \"2e16b629-352a-4ef2-b318-5a342165dfef\") " pod="openstack/glance-db-sync-vkmhg" Oct 08 19:44:08 crc kubenswrapper[4750]: I1008 19:44:08.745066 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e16b629-352a-4ef2-b318-5a342165dfef-config-data\") pod \"glance-db-sync-vkmhg\" (UID: \"2e16b629-352a-4ef2-b318-5a342165dfef\") " pod="openstack/glance-db-sync-vkmhg" Oct 08 19:44:08 crc kubenswrapper[4750]: I1008 19:44:08.747786 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4v6q\" (UniqueName: \"kubernetes.io/projected/2e16b629-352a-4ef2-b318-5a342165dfef-kube-api-access-t4v6q\") pod 
\"glance-db-sync-vkmhg\" (UID: \"2e16b629-352a-4ef2-b318-5a342165dfef\") " pod="openstack/glance-db-sync-vkmhg" Oct 08 19:44:08 crc kubenswrapper[4750]: I1008 19:44:08.889765 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-vkmhg" Oct 08 19:44:09 crc kubenswrapper[4750]: I1008 19:44:09.483028 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-vkmhg"] Oct 08 19:44:09 crc kubenswrapper[4750]: I1008 19:44:09.785428 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vkmhg" event={"ID":"2e16b629-352a-4ef2-b318-5a342165dfef","Type":"ContainerStarted","Data":"96e83989ebec8a6b09d9f7f1cbf147d05f5e3d27fa187afd9603139e84e40e87"} Oct 08 19:44:10 crc kubenswrapper[4750]: I1008 19:44:10.797615 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vkmhg" event={"ID":"2e16b629-352a-4ef2-b318-5a342165dfef","Type":"ContainerStarted","Data":"2e25d2a914f228a8b6476c01c56486648c14db30e51db7a10c490d676f3928e0"} Oct 08 19:44:10 crc kubenswrapper[4750]: I1008 19:44:10.823406 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-vkmhg" podStartSLOduration=2.823380933 podStartE2EDuration="2.823380933s" podCreationTimestamp="2025-10-08 19:44:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:44:10.81683795 +0000 UTC m=+5606.729808973" watchObservedRunningTime="2025-10-08 19:44:10.823380933 +0000 UTC m=+5606.736351956" Oct 08 19:44:13 crc kubenswrapper[4750]: I1008 19:44:13.834157 4750 generic.go:334] "Generic (PLEG): container finished" podID="2e16b629-352a-4ef2-b318-5a342165dfef" containerID="2e25d2a914f228a8b6476c01c56486648c14db30e51db7a10c490d676f3928e0" exitCode=0 Oct 08 19:44:13 crc kubenswrapper[4750]: I1008 19:44:13.834274 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-db-sync-vkmhg" event={"ID":"2e16b629-352a-4ef2-b318-5a342165dfef","Type":"ContainerDied","Data":"2e25d2a914f228a8b6476c01c56486648c14db30e51db7a10c490d676f3928e0"} Oct 08 19:44:15 crc kubenswrapper[4750]: I1008 19:44:15.360673 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-vkmhg" Oct 08 19:44:15 crc kubenswrapper[4750]: I1008 19:44:15.469674 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2e16b629-352a-4ef2-b318-5a342165dfef-db-sync-config-data\") pod \"2e16b629-352a-4ef2-b318-5a342165dfef\" (UID: \"2e16b629-352a-4ef2-b318-5a342165dfef\") " Oct 08 19:44:15 crc kubenswrapper[4750]: I1008 19:44:15.469914 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e16b629-352a-4ef2-b318-5a342165dfef-combined-ca-bundle\") pod \"2e16b629-352a-4ef2-b318-5a342165dfef\" (UID: \"2e16b629-352a-4ef2-b318-5a342165dfef\") " Oct 08 19:44:15 crc kubenswrapper[4750]: I1008 19:44:15.469979 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e16b629-352a-4ef2-b318-5a342165dfef-config-data\") pod \"2e16b629-352a-4ef2-b318-5a342165dfef\" (UID: \"2e16b629-352a-4ef2-b318-5a342165dfef\") " Oct 08 19:44:15 crc kubenswrapper[4750]: I1008 19:44:15.470055 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4v6q\" (UniqueName: \"kubernetes.io/projected/2e16b629-352a-4ef2-b318-5a342165dfef-kube-api-access-t4v6q\") pod \"2e16b629-352a-4ef2-b318-5a342165dfef\" (UID: \"2e16b629-352a-4ef2-b318-5a342165dfef\") " Oct 08 19:44:15 crc kubenswrapper[4750]: I1008 19:44:15.477926 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/2e16b629-352a-4ef2-b318-5a342165dfef-kube-api-access-t4v6q" (OuterVolumeSpecName: "kube-api-access-t4v6q") pod "2e16b629-352a-4ef2-b318-5a342165dfef" (UID: "2e16b629-352a-4ef2-b318-5a342165dfef"). InnerVolumeSpecName "kube-api-access-t4v6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:44:15 crc kubenswrapper[4750]: I1008 19:44:15.477942 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e16b629-352a-4ef2-b318-5a342165dfef-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "2e16b629-352a-4ef2-b318-5a342165dfef" (UID: "2e16b629-352a-4ef2-b318-5a342165dfef"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:44:15 crc kubenswrapper[4750]: I1008 19:44:15.511819 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e16b629-352a-4ef2-b318-5a342165dfef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e16b629-352a-4ef2-b318-5a342165dfef" (UID: "2e16b629-352a-4ef2-b318-5a342165dfef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:44:15 crc kubenswrapper[4750]: I1008 19:44:15.563117 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e16b629-352a-4ef2-b318-5a342165dfef-config-data" (OuterVolumeSpecName: "config-data") pod "2e16b629-352a-4ef2-b318-5a342165dfef" (UID: "2e16b629-352a-4ef2-b318-5a342165dfef"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:44:15 crc kubenswrapper[4750]: I1008 19:44:15.573259 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e16b629-352a-4ef2-b318-5a342165dfef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 19:44:15 crc kubenswrapper[4750]: I1008 19:44:15.573331 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e16b629-352a-4ef2-b318-5a342165dfef-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 19:44:15 crc kubenswrapper[4750]: I1008 19:44:15.573356 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4v6q\" (UniqueName: \"kubernetes.io/projected/2e16b629-352a-4ef2-b318-5a342165dfef-kube-api-access-t4v6q\") on node \"crc\" DevicePath \"\"" Oct 08 19:44:15 crc kubenswrapper[4750]: I1008 19:44:15.573380 4750 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2e16b629-352a-4ef2-b318-5a342165dfef-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 19:44:15 crc kubenswrapper[4750]: I1008 19:44:15.858074 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vkmhg" event={"ID":"2e16b629-352a-4ef2-b318-5a342165dfef","Type":"ContainerDied","Data":"96e83989ebec8a6b09d9f7f1cbf147d05f5e3d27fa187afd9603139e84e40e87"} Oct 08 19:44:15 crc kubenswrapper[4750]: I1008 19:44:15.858129 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96e83989ebec8a6b09d9f7f1cbf147d05f5e3d27fa187afd9603139e84e40e87" Oct 08 19:44:15 crc kubenswrapper[4750]: I1008 19:44:15.858152 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-vkmhg" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.212969 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 19:44:16 crc kubenswrapper[4750]: E1008 19:44:16.213570 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e16b629-352a-4ef2-b318-5a342165dfef" containerName="glance-db-sync" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.213593 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e16b629-352a-4ef2-b318-5a342165dfef" containerName="glance-db-sync" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.213832 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e16b629-352a-4ef2-b318-5a342165dfef" containerName="glance-db-sync" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.215220 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.218096 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-wp2zx" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.218308 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.218494 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.218856 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.245253 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.286541 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-mkdm2\" (UniqueName: \"kubernetes.io/projected/a4c4bb7f-cb87-499e-a0a4-6447303aebf1-kube-api-access-mkdm2\") pod \"glance-default-external-api-0\" (UID: \"a4c4bb7f-cb87-499e-a0a4-6447303aebf1\") " pod="openstack/glance-default-external-api-0" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.286606 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a4c4bb7f-cb87-499e-a0a4-6447303aebf1-ceph\") pod \"glance-default-external-api-0\" (UID: \"a4c4bb7f-cb87-499e-a0a4-6447303aebf1\") " pod="openstack/glance-default-external-api-0" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.286638 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a4c4bb7f-cb87-499e-a0a4-6447303aebf1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a4c4bb7f-cb87-499e-a0a4-6447303aebf1\") " pod="openstack/glance-default-external-api-0" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.286673 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4c4bb7f-cb87-499e-a0a4-6447303aebf1-scripts\") pod \"glance-default-external-api-0\" (UID: \"a4c4bb7f-cb87-499e-a0a4-6447303aebf1\") " pod="openstack/glance-default-external-api-0" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.286701 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4c4bb7f-cb87-499e-a0a4-6447303aebf1-config-data\") pod \"glance-default-external-api-0\" (UID: \"a4c4bb7f-cb87-499e-a0a4-6447303aebf1\") " pod="openstack/glance-default-external-api-0" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.286726 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4c4bb7f-cb87-499e-a0a4-6447303aebf1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a4c4bb7f-cb87-499e-a0a4-6447303aebf1\") " pod="openstack/glance-default-external-api-0" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.286752 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4c4bb7f-cb87-499e-a0a4-6447303aebf1-logs\") pod \"glance-default-external-api-0\" (UID: \"a4c4bb7f-cb87-499e-a0a4-6447303aebf1\") " pod="openstack/glance-default-external-api-0" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.324520 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8646f84cbc-d7cc8"] Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.338371 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8646f84cbc-d7cc8" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.345302 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8646f84cbc-d7cc8"] Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.388624 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkdm2\" (UniqueName: \"kubernetes.io/projected/a4c4bb7f-cb87-499e-a0a4-6447303aebf1-kube-api-access-mkdm2\") pod \"glance-default-external-api-0\" (UID: \"a4c4bb7f-cb87-499e-a0a4-6447303aebf1\") " pod="openstack/glance-default-external-api-0" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.388713 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a4c4bb7f-cb87-499e-a0a4-6447303aebf1-ceph\") pod \"glance-default-external-api-0\" (UID: \"a4c4bb7f-cb87-499e-a0a4-6447303aebf1\") " pod="openstack/glance-default-external-api-0" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 
19:44:16.389351 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a4c4bb7f-cb87-499e-a0a4-6447303aebf1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a4c4bb7f-cb87-499e-a0a4-6447303aebf1\") " pod="openstack/glance-default-external-api-0" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.389375 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a4c4bb7f-cb87-499e-a0a4-6447303aebf1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a4c4bb7f-cb87-499e-a0a4-6447303aebf1\") " pod="openstack/glance-default-external-api-0" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.389434 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4c4bb7f-cb87-499e-a0a4-6447303aebf1-scripts\") pod \"glance-default-external-api-0\" (UID: \"a4c4bb7f-cb87-499e-a0a4-6447303aebf1\") " pod="openstack/glance-default-external-api-0" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.389474 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4c4bb7f-cb87-499e-a0a4-6447303aebf1-config-data\") pod \"glance-default-external-api-0\" (UID: \"a4c4bb7f-cb87-499e-a0a4-6447303aebf1\") " pod="openstack/glance-default-external-api-0" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.389508 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4c4bb7f-cb87-499e-a0a4-6447303aebf1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a4c4bb7f-cb87-499e-a0a4-6447303aebf1\") " pod="openstack/glance-default-external-api-0" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.389541 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4c4bb7f-cb87-499e-a0a4-6447303aebf1-logs\") pod \"glance-default-external-api-0\" (UID: \"a4c4bb7f-cb87-499e-a0a4-6447303aebf1\") " pod="openstack/glance-default-external-api-0" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.389957 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4c4bb7f-cb87-499e-a0a4-6447303aebf1-logs\") pod \"glance-default-external-api-0\" (UID: \"a4c4bb7f-cb87-499e-a0a4-6447303aebf1\") " pod="openstack/glance-default-external-api-0" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.399663 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4c4bb7f-cb87-499e-a0a4-6447303aebf1-config-data\") pod \"glance-default-external-api-0\" (UID: \"a4c4bb7f-cb87-499e-a0a4-6447303aebf1\") " pod="openstack/glance-default-external-api-0" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.400536 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4c4bb7f-cb87-499e-a0a4-6447303aebf1-scripts\") pod \"glance-default-external-api-0\" (UID: \"a4c4bb7f-cb87-499e-a0a4-6447303aebf1\") " pod="openstack/glance-default-external-api-0" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.401856 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4c4bb7f-cb87-499e-a0a4-6447303aebf1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a4c4bb7f-cb87-499e-a0a4-6447303aebf1\") " pod="openstack/glance-default-external-api-0" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.413953 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a4c4bb7f-cb87-499e-a0a4-6447303aebf1-ceph\") pod \"glance-default-external-api-0\" (UID: 
\"a4c4bb7f-cb87-499e-a0a4-6447303aebf1\") " pod="openstack/glance-default-external-api-0" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.417487 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkdm2\" (UniqueName: \"kubernetes.io/projected/a4c4bb7f-cb87-499e-a0a4-6447303aebf1-kube-api-access-mkdm2\") pod \"glance-default-external-api-0\" (UID: \"a4c4bb7f-cb87-499e-a0a4-6447303aebf1\") " pod="openstack/glance-default-external-api-0" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.464325 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.466225 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.472331 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.490849 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78e70311-917b-4069-bf21-c1764c8237e4-config\") pod \"dnsmasq-dns-8646f84cbc-d7cc8\" (UID: \"78e70311-917b-4069-bf21-c1764c8237e4\") " pod="openstack/dnsmasq-dns-8646f84cbc-d7cc8" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.490882 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l2cd\" (UniqueName: \"kubernetes.io/projected/78e70311-917b-4069-bf21-c1764c8237e4-kube-api-access-5l2cd\") pod \"dnsmasq-dns-8646f84cbc-d7cc8\" (UID: \"78e70311-917b-4069-bf21-c1764c8237e4\") " pod="openstack/dnsmasq-dns-8646f84cbc-d7cc8" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.490923 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78e70311-917b-4069-bf21-c1764c8237e4-dns-svc\") pod \"dnsmasq-dns-8646f84cbc-d7cc8\" (UID: \"78e70311-917b-4069-bf21-c1764c8237e4\") " pod="openstack/dnsmasq-dns-8646f84cbc-d7cc8" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.490979 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/78e70311-917b-4069-bf21-c1764c8237e4-ovsdbserver-nb\") pod \"dnsmasq-dns-8646f84cbc-d7cc8\" (UID: \"78e70311-917b-4069-bf21-c1764c8237e4\") " pod="openstack/dnsmasq-dns-8646f84cbc-d7cc8" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.491001 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/78e70311-917b-4069-bf21-c1764c8237e4-ovsdbserver-sb\") pod \"dnsmasq-dns-8646f84cbc-d7cc8\" (UID: \"78e70311-917b-4069-bf21-c1764c8237e4\") " pod="openstack/dnsmasq-dns-8646f84cbc-d7cc8" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.509906 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.565366 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.593199 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1864ef9-8a9e-4352-ad90-c70027c40eba-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d1864ef9-8a9e-4352-ad90-c70027c40eba\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.593289 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78e70311-917b-4069-bf21-c1764c8237e4-config\") pod \"dnsmasq-dns-8646f84cbc-d7cc8\" (UID: \"78e70311-917b-4069-bf21-c1764c8237e4\") " pod="openstack/dnsmasq-dns-8646f84cbc-d7cc8" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.593317 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l2cd\" (UniqueName: \"kubernetes.io/projected/78e70311-917b-4069-bf21-c1764c8237e4-kube-api-access-5l2cd\") pod \"dnsmasq-dns-8646f84cbc-d7cc8\" (UID: \"78e70311-917b-4069-bf21-c1764c8237e4\") " pod="openstack/dnsmasq-dns-8646f84cbc-d7cc8" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.593337 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1864ef9-8a9e-4352-ad90-c70027c40eba-logs\") pod \"glance-default-internal-api-0\" (UID: \"d1864ef9-8a9e-4352-ad90-c70027c40eba\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.593362 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d1864ef9-8a9e-4352-ad90-c70027c40eba-ceph\") pod \"glance-default-internal-api-0\" (UID: \"d1864ef9-8a9e-4352-ad90-c70027c40eba\") " 
pod="openstack/glance-default-internal-api-0" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.593391 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78e70311-917b-4069-bf21-c1764c8237e4-dns-svc\") pod \"dnsmasq-dns-8646f84cbc-d7cc8\" (UID: \"78e70311-917b-4069-bf21-c1764c8237e4\") " pod="openstack/dnsmasq-dns-8646f84cbc-d7cc8" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.593412 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1864ef9-8a9e-4352-ad90-c70027c40eba-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d1864ef9-8a9e-4352-ad90-c70027c40eba\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.593468 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/78e70311-917b-4069-bf21-c1764c8237e4-ovsdbserver-nb\") pod \"dnsmasq-dns-8646f84cbc-d7cc8\" (UID: \"78e70311-917b-4069-bf21-c1764c8237e4\") " pod="openstack/dnsmasq-dns-8646f84cbc-d7cc8" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.593495 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1864ef9-8a9e-4352-ad90-c70027c40eba-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d1864ef9-8a9e-4352-ad90-c70027c40eba\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.593516 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7ms2\" (UniqueName: \"kubernetes.io/projected/d1864ef9-8a9e-4352-ad90-c70027c40eba-kube-api-access-l7ms2\") pod \"glance-default-internal-api-0\" (UID: 
\"d1864ef9-8a9e-4352-ad90-c70027c40eba\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.593538 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/78e70311-917b-4069-bf21-c1764c8237e4-ovsdbserver-sb\") pod \"dnsmasq-dns-8646f84cbc-d7cc8\" (UID: \"78e70311-917b-4069-bf21-c1764c8237e4\") " pod="openstack/dnsmasq-dns-8646f84cbc-d7cc8" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.593606 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d1864ef9-8a9e-4352-ad90-c70027c40eba-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d1864ef9-8a9e-4352-ad90-c70027c40eba\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.594714 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78e70311-917b-4069-bf21-c1764c8237e4-config\") pod \"dnsmasq-dns-8646f84cbc-d7cc8\" (UID: \"78e70311-917b-4069-bf21-c1764c8237e4\") " pod="openstack/dnsmasq-dns-8646f84cbc-d7cc8" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.595756 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78e70311-917b-4069-bf21-c1764c8237e4-dns-svc\") pod \"dnsmasq-dns-8646f84cbc-d7cc8\" (UID: \"78e70311-917b-4069-bf21-c1764c8237e4\") " pod="openstack/dnsmasq-dns-8646f84cbc-d7cc8" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.596636 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/78e70311-917b-4069-bf21-c1764c8237e4-ovsdbserver-nb\") pod \"dnsmasq-dns-8646f84cbc-d7cc8\" (UID: \"78e70311-917b-4069-bf21-c1764c8237e4\") " pod="openstack/dnsmasq-dns-8646f84cbc-d7cc8" Oct 08 
19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.597524 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/78e70311-917b-4069-bf21-c1764c8237e4-ovsdbserver-sb\") pod \"dnsmasq-dns-8646f84cbc-d7cc8\" (UID: \"78e70311-917b-4069-bf21-c1764c8237e4\") " pod="openstack/dnsmasq-dns-8646f84cbc-d7cc8" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.624272 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l2cd\" (UniqueName: \"kubernetes.io/projected/78e70311-917b-4069-bf21-c1764c8237e4-kube-api-access-5l2cd\") pod \"dnsmasq-dns-8646f84cbc-d7cc8\" (UID: \"78e70311-917b-4069-bf21-c1764c8237e4\") " pod="openstack/dnsmasq-dns-8646f84cbc-d7cc8" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.678071 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8646f84cbc-d7cc8" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.695060 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d1864ef9-8a9e-4352-ad90-c70027c40eba-ceph\") pod \"glance-default-internal-api-0\" (UID: \"d1864ef9-8a9e-4352-ad90-c70027c40eba\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.695139 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1864ef9-8a9e-4352-ad90-c70027c40eba-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d1864ef9-8a9e-4352-ad90-c70027c40eba\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.695207 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1864ef9-8a9e-4352-ad90-c70027c40eba-combined-ca-bundle\") pod 
\"glance-default-internal-api-0\" (UID: \"d1864ef9-8a9e-4352-ad90-c70027c40eba\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.695228 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7ms2\" (UniqueName: \"kubernetes.io/projected/d1864ef9-8a9e-4352-ad90-c70027c40eba-kube-api-access-l7ms2\") pod \"glance-default-internal-api-0\" (UID: \"d1864ef9-8a9e-4352-ad90-c70027c40eba\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.695267 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d1864ef9-8a9e-4352-ad90-c70027c40eba-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d1864ef9-8a9e-4352-ad90-c70027c40eba\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.695304 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1864ef9-8a9e-4352-ad90-c70027c40eba-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d1864ef9-8a9e-4352-ad90-c70027c40eba\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.695339 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1864ef9-8a9e-4352-ad90-c70027c40eba-logs\") pod \"glance-default-internal-api-0\" (UID: \"d1864ef9-8a9e-4352-ad90-c70027c40eba\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.695890 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1864ef9-8a9e-4352-ad90-c70027c40eba-logs\") pod \"glance-default-internal-api-0\" (UID: \"d1864ef9-8a9e-4352-ad90-c70027c40eba\") " 
pod="openstack/glance-default-internal-api-0" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.696427 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d1864ef9-8a9e-4352-ad90-c70027c40eba-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d1864ef9-8a9e-4352-ad90-c70027c40eba\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.701030 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1864ef9-8a9e-4352-ad90-c70027c40eba-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d1864ef9-8a9e-4352-ad90-c70027c40eba\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.710419 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d1864ef9-8a9e-4352-ad90-c70027c40eba-ceph\") pod \"glance-default-internal-api-0\" (UID: \"d1864ef9-8a9e-4352-ad90-c70027c40eba\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.711100 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1864ef9-8a9e-4352-ad90-c70027c40eba-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d1864ef9-8a9e-4352-ad90-c70027c40eba\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.716881 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1864ef9-8a9e-4352-ad90-c70027c40eba-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d1864ef9-8a9e-4352-ad90-c70027c40eba\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.722379 4750 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7ms2\" (UniqueName: \"kubernetes.io/projected/d1864ef9-8a9e-4352-ad90-c70027c40eba-kube-api-access-l7ms2\") pod \"glance-default-internal-api-0\" (UID: \"d1864ef9-8a9e-4352-ad90-c70027c40eba\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:44:16 crc kubenswrapper[4750]: I1008 19:44:16.810251 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 19:44:17 crc kubenswrapper[4750]: I1008 19:44:17.300525 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8646f84cbc-d7cc8"] Oct 08 19:44:17 crc kubenswrapper[4750]: W1008 19:44:17.308624 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78e70311_917b_4069_bf21_c1764c8237e4.slice/crio-842047b4e8271de33bf19e265dddd68317143b38c1d675751f471b06cf200412 WatchSource:0}: Error finding container 842047b4e8271de33bf19e265dddd68317143b38c1d675751f471b06cf200412: Status 404 returned error can't find the container with id 842047b4e8271de33bf19e265dddd68317143b38c1d675751f471b06cf200412 Oct 08 19:44:17 crc kubenswrapper[4750]: I1008 19:44:17.435491 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 19:44:17 crc kubenswrapper[4750]: I1008 19:44:17.583376 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 19:44:17 crc kubenswrapper[4750]: W1008 19:44:17.587429 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1864ef9_8a9e_4352_ad90_c70027c40eba.slice/crio-1014a53dd16127d4bef7a1e1ee60da924ce3a8e8c7ca6f6250d1ea9dc7888692 WatchSource:0}: Error finding container 1014a53dd16127d4bef7a1e1ee60da924ce3a8e8c7ca6f6250d1ea9dc7888692: Status 404 returned error can't 
find the container with id 1014a53dd16127d4bef7a1e1ee60da924ce3a8e8c7ca6f6250d1ea9dc7888692 Oct 08 19:44:17 crc kubenswrapper[4750]: I1008 19:44:17.924209 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 19:44:17 crc kubenswrapper[4750]: I1008 19:44:17.939210 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a4c4bb7f-cb87-499e-a0a4-6447303aebf1","Type":"ContainerStarted","Data":"1e1ed05fd6d5947655b2d6cf57b1ad25bfc16327c3020e098cafc6e33ca8ac71"} Oct 08 19:44:17 crc kubenswrapper[4750]: I1008 19:44:17.946126 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d1864ef9-8a9e-4352-ad90-c70027c40eba","Type":"ContainerStarted","Data":"1014a53dd16127d4bef7a1e1ee60da924ce3a8e8c7ca6f6250d1ea9dc7888692"} Oct 08 19:44:17 crc kubenswrapper[4750]: I1008 19:44:17.950832 4750 generic.go:334] "Generic (PLEG): container finished" podID="78e70311-917b-4069-bf21-c1764c8237e4" containerID="cfd19482770f8f0ef7978d861d97c7f2255feff73bd4e75a5a58985f12ff5a73" exitCode=0 Oct 08 19:44:17 crc kubenswrapper[4750]: I1008 19:44:17.950862 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8646f84cbc-d7cc8" event={"ID":"78e70311-917b-4069-bf21-c1764c8237e4","Type":"ContainerDied","Data":"cfd19482770f8f0ef7978d861d97c7f2255feff73bd4e75a5a58985f12ff5a73"} Oct 08 19:44:17 crc kubenswrapper[4750]: I1008 19:44:17.950891 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8646f84cbc-d7cc8" event={"ID":"78e70311-917b-4069-bf21-c1764c8237e4","Type":"ContainerStarted","Data":"842047b4e8271de33bf19e265dddd68317143b38c1d675751f471b06cf200412"} Oct 08 19:44:18 crc kubenswrapper[4750]: I1008 19:44:18.968031 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"a4c4bb7f-cb87-499e-a0a4-6447303aebf1","Type":"ContainerStarted","Data":"55582034ef300325a631fef3f0202c9c98b33f5820d42999c75a2b52eb156327"} Oct 08 19:44:18 crc kubenswrapper[4750]: I1008 19:44:18.969012 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a4c4bb7f-cb87-499e-a0a4-6447303aebf1" containerName="glance-httpd" containerID="cri-o://55582034ef300325a631fef3f0202c9c98b33f5820d42999c75a2b52eb156327" gracePeriod=30 Oct 08 19:44:18 crc kubenswrapper[4750]: I1008 19:44:18.969021 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a4c4bb7f-cb87-499e-a0a4-6447303aebf1","Type":"ContainerStarted","Data":"002ef66335ec3930b2387d7b6a1aabc0e9dd9c23c860120dca46029c5091ed39"} Oct 08 19:44:18 crc kubenswrapper[4750]: I1008 19:44:18.968245 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a4c4bb7f-cb87-499e-a0a4-6447303aebf1" containerName="glance-log" containerID="cri-o://002ef66335ec3930b2387d7b6a1aabc0e9dd9c23c860120dca46029c5091ed39" gracePeriod=30 Oct 08 19:44:18 crc kubenswrapper[4750]: I1008 19:44:18.980710 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d1864ef9-8a9e-4352-ad90-c70027c40eba","Type":"ContainerStarted","Data":"87a1063fe84363e123e00c1fe76647141b81e9f3728c360d5e438d4be886530b"} Oct 08 19:44:18 crc kubenswrapper[4750]: I1008 19:44:18.980795 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d1864ef9-8a9e-4352-ad90-c70027c40eba","Type":"ContainerStarted","Data":"5f1ba1f59c107b906e54623700d89b2618c1893a0682f1679436839bd4b3d23c"} Oct 08 19:44:18 crc kubenswrapper[4750]: I1008 19:44:18.991042 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8646f84cbc-d7cc8" 
event={"ID":"78e70311-917b-4069-bf21-c1764c8237e4","Type":"ContainerStarted","Data":"e209a43efb15ffc079ad99c8e4b6b37aceb9f6699f73d2ccf43cf7b0ac58d2ab"} Oct 08 19:44:18 crc kubenswrapper[4750]: I1008 19:44:18.992586 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8646f84cbc-d7cc8" Oct 08 19:44:19 crc kubenswrapper[4750]: I1008 19:44:19.003976 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.003941813 podStartE2EDuration="3.003941813s" podCreationTimestamp="2025-10-08 19:44:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:44:18.991843702 +0000 UTC m=+5614.904814715" watchObservedRunningTime="2025-10-08 19:44:19.003941813 +0000 UTC m=+5614.916912836" Oct 08 19:44:19 crc kubenswrapper[4750]: I1008 19:44:19.020289 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8646f84cbc-d7cc8" podStartSLOduration=3.02026457 podStartE2EDuration="3.02026457s" podCreationTimestamp="2025-10-08 19:44:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:44:19.016530397 +0000 UTC m=+5614.929501410" watchObservedRunningTime="2025-10-08 19:44:19.02026457 +0000 UTC m=+5614.933235583" Oct 08 19:44:19 crc kubenswrapper[4750]: I1008 19:44:19.052102 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.052065943 podStartE2EDuration="3.052065943s" podCreationTimestamp="2025-10-08 19:44:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:44:19.044895974 +0000 UTC m=+5614.957866997" watchObservedRunningTime="2025-10-08 
19:44:19.052065943 +0000 UTC m=+5614.965036966" Oct 08 19:44:19 crc kubenswrapper[4750]: I1008 19:44:19.580564 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 19:44:19 crc kubenswrapper[4750]: I1008 19:44:19.692704 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4c4bb7f-cb87-499e-a0a4-6447303aebf1-logs\") pod \"a4c4bb7f-cb87-499e-a0a4-6447303aebf1\" (UID: \"a4c4bb7f-cb87-499e-a0a4-6447303aebf1\") " Oct 08 19:44:19 crc kubenswrapper[4750]: I1008 19:44:19.692826 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4c4bb7f-cb87-499e-a0a4-6447303aebf1-combined-ca-bundle\") pod \"a4c4bb7f-cb87-499e-a0a4-6447303aebf1\" (UID: \"a4c4bb7f-cb87-499e-a0a4-6447303aebf1\") " Oct 08 19:44:19 crc kubenswrapper[4750]: I1008 19:44:19.692876 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a4c4bb7f-cb87-499e-a0a4-6447303aebf1-ceph\") pod \"a4c4bb7f-cb87-499e-a0a4-6447303aebf1\" (UID: \"a4c4bb7f-cb87-499e-a0a4-6447303aebf1\") " Oct 08 19:44:19 crc kubenswrapper[4750]: I1008 19:44:19.693010 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkdm2\" (UniqueName: \"kubernetes.io/projected/a4c4bb7f-cb87-499e-a0a4-6447303aebf1-kube-api-access-mkdm2\") pod \"a4c4bb7f-cb87-499e-a0a4-6447303aebf1\" (UID: \"a4c4bb7f-cb87-499e-a0a4-6447303aebf1\") " Oct 08 19:44:19 crc kubenswrapper[4750]: I1008 19:44:19.693060 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a4c4bb7f-cb87-499e-a0a4-6447303aebf1-httpd-run\") pod \"a4c4bb7f-cb87-499e-a0a4-6447303aebf1\" (UID: \"a4c4bb7f-cb87-499e-a0a4-6447303aebf1\") " Oct 08 19:44:19 crc 
kubenswrapper[4750]: I1008 19:44:19.693114 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4c4bb7f-cb87-499e-a0a4-6447303aebf1-scripts\") pod \"a4c4bb7f-cb87-499e-a0a4-6447303aebf1\" (UID: \"a4c4bb7f-cb87-499e-a0a4-6447303aebf1\") " Oct 08 19:44:19 crc kubenswrapper[4750]: I1008 19:44:19.693165 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4c4bb7f-cb87-499e-a0a4-6447303aebf1-config-data\") pod \"a4c4bb7f-cb87-499e-a0a4-6447303aebf1\" (UID: \"a4c4bb7f-cb87-499e-a0a4-6447303aebf1\") " Oct 08 19:44:19 crc kubenswrapper[4750]: I1008 19:44:19.693740 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4c4bb7f-cb87-499e-a0a4-6447303aebf1-logs" (OuterVolumeSpecName: "logs") pod "a4c4bb7f-cb87-499e-a0a4-6447303aebf1" (UID: "a4c4bb7f-cb87-499e-a0a4-6447303aebf1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:44:19 crc kubenswrapper[4750]: I1008 19:44:19.694171 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4c4bb7f-cb87-499e-a0a4-6447303aebf1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a4c4bb7f-cb87-499e-a0a4-6447303aebf1" (UID: "a4c4bb7f-cb87-499e-a0a4-6447303aebf1"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:44:19 crc kubenswrapper[4750]: I1008 19:44:19.701196 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4c4bb7f-cb87-499e-a0a4-6447303aebf1-kube-api-access-mkdm2" (OuterVolumeSpecName: "kube-api-access-mkdm2") pod "a4c4bb7f-cb87-499e-a0a4-6447303aebf1" (UID: "a4c4bb7f-cb87-499e-a0a4-6447303aebf1"). InnerVolumeSpecName "kube-api-access-mkdm2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:44:19 crc kubenswrapper[4750]: I1008 19:44:19.701830 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4c4bb7f-cb87-499e-a0a4-6447303aebf1-scripts" (OuterVolumeSpecName: "scripts") pod "a4c4bb7f-cb87-499e-a0a4-6447303aebf1" (UID: "a4c4bb7f-cb87-499e-a0a4-6447303aebf1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:44:19 crc kubenswrapper[4750]: I1008 19:44:19.703291 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4c4bb7f-cb87-499e-a0a4-6447303aebf1-ceph" (OuterVolumeSpecName: "ceph") pod "a4c4bb7f-cb87-499e-a0a4-6447303aebf1" (UID: "a4c4bb7f-cb87-499e-a0a4-6447303aebf1"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:44:19 crc kubenswrapper[4750]: I1008 19:44:19.730756 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4c4bb7f-cb87-499e-a0a4-6447303aebf1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4c4bb7f-cb87-499e-a0a4-6447303aebf1" (UID: "a4c4bb7f-cb87-499e-a0a4-6447303aebf1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:44:19 crc kubenswrapper[4750]: I1008 19:44:19.769001 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4c4bb7f-cb87-499e-a0a4-6447303aebf1-config-data" (OuterVolumeSpecName: "config-data") pod "a4c4bb7f-cb87-499e-a0a4-6447303aebf1" (UID: "a4c4bb7f-cb87-499e-a0a4-6447303aebf1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:44:19 crc kubenswrapper[4750]: I1008 19:44:19.795401 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4c4bb7f-cb87-499e-a0a4-6447303aebf1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 19:44:19 crc kubenswrapper[4750]: I1008 19:44:19.795441 4750 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a4c4bb7f-cb87-499e-a0a4-6447303aebf1-ceph\") on node \"crc\" DevicePath \"\"" Oct 08 19:44:19 crc kubenswrapper[4750]: I1008 19:44:19.795452 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkdm2\" (UniqueName: \"kubernetes.io/projected/a4c4bb7f-cb87-499e-a0a4-6447303aebf1-kube-api-access-mkdm2\") on node \"crc\" DevicePath \"\"" Oct 08 19:44:19 crc kubenswrapper[4750]: I1008 19:44:19.795463 4750 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a4c4bb7f-cb87-499e-a0a4-6447303aebf1-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 08 19:44:19 crc kubenswrapper[4750]: I1008 19:44:19.795473 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4c4bb7f-cb87-499e-a0a4-6447303aebf1-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 19:44:19 crc kubenswrapper[4750]: I1008 19:44:19.795482 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4c4bb7f-cb87-499e-a0a4-6447303aebf1-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 19:44:19 crc kubenswrapper[4750]: I1008 19:44:19.795494 4750 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4c4bb7f-cb87-499e-a0a4-6447303aebf1-logs\") on node \"crc\" DevicePath \"\"" Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.007454 4750 generic.go:334] "Generic (PLEG): container finished" 
podID="a4c4bb7f-cb87-499e-a0a4-6447303aebf1" containerID="55582034ef300325a631fef3f0202c9c98b33f5820d42999c75a2b52eb156327" exitCode=143 Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.007513 4750 generic.go:334] "Generic (PLEG): container finished" podID="a4c4bb7f-cb87-499e-a0a4-6447303aebf1" containerID="002ef66335ec3930b2387d7b6a1aabc0e9dd9c23c860120dca46029c5091ed39" exitCode=143 Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.008995 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.010917 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a4c4bb7f-cb87-499e-a0a4-6447303aebf1","Type":"ContainerDied","Data":"55582034ef300325a631fef3f0202c9c98b33f5820d42999c75a2b52eb156327"} Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.011038 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a4c4bb7f-cb87-499e-a0a4-6447303aebf1","Type":"ContainerDied","Data":"002ef66335ec3930b2387d7b6a1aabc0e9dd9c23c860120dca46029c5091ed39"} Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.011054 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a4c4bb7f-cb87-499e-a0a4-6447303aebf1","Type":"ContainerDied","Data":"1e1ed05fd6d5947655b2d6cf57b1ad25bfc16327c3020e098cafc6e33ca8ac71"} Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.011078 4750 scope.go:117] "RemoveContainer" containerID="55582034ef300325a631fef3f0202c9c98b33f5820d42999c75a2b52eb156327" Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.050482 4750 scope.go:117] "RemoveContainer" containerID="002ef66335ec3930b2387d7b6a1aabc0e9dd9c23c860120dca46029c5091ed39" Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.059892 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-default-external-api-0"] Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.075789 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.086085 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 19:44:20 crc kubenswrapper[4750]: E1008 19:44:20.088436 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4c4bb7f-cb87-499e-a0a4-6447303aebf1" containerName="glance-log" Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.088477 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4c4bb7f-cb87-499e-a0a4-6447303aebf1" containerName="glance-log" Oct 08 19:44:20 crc kubenswrapper[4750]: E1008 19:44:20.088512 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4c4bb7f-cb87-499e-a0a4-6447303aebf1" containerName="glance-httpd" Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.088523 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4c4bb7f-cb87-499e-a0a4-6447303aebf1" containerName="glance-httpd" Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.088887 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4c4bb7f-cb87-499e-a0a4-6447303aebf1" containerName="glance-log" Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.088961 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4c4bb7f-cb87-499e-a0a4-6447303aebf1" containerName="glance-httpd" Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.088448 4750 scope.go:117] "RemoveContainer" containerID="55582034ef300325a631fef3f0202c9c98b33f5820d42999c75a2b52eb156327" Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.090211 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 19:44:20 crc kubenswrapper[4750]: E1008 19:44:20.090485 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55582034ef300325a631fef3f0202c9c98b33f5820d42999c75a2b52eb156327\": container with ID starting with 55582034ef300325a631fef3f0202c9c98b33f5820d42999c75a2b52eb156327 not found: ID does not exist" containerID="55582034ef300325a631fef3f0202c9c98b33f5820d42999c75a2b52eb156327" Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.090662 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55582034ef300325a631fef3f0202c9c98b33f5820d42999c75a2b52eb156327"} err="failed to get container status \"55582034ef300325a631fef3f0202c9c98b33f5820d42999c75a2b52eb156327\": rpc error: code = NotFound desc = could not find container \"55582034ef300325a631fef3f0202c9c98b33f5820d42999c75a2b52eb156327\": container with ID starting with 55582034ef300325a631fef3f0202c9c98b33f5820d42999c75a2b52eb156327 not found: ID does not exist" Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.090784 4750 scope.go:117] "RemoveContainer" containerID="002ef66335ec3930b2387d7b6a1aabc0e9dd9c23c860120dca46029c5091ed39" Oct 08 19:44:20 crc kubenswrapper[4750]: E1008 19:44:20.092176 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"002ef66335ec3930b2387d7b6a1aabc0e9dd9c23c860120dca46029c5091ed39\": container with ID starting with 002ef66335ec3930b2387d7b6a1aabc0e9dd9c23c860120dca46029c5091ed39 not found: ID does not exist" containerID="002ef66335ec3930b2387d7b6a1aabc0e9dd9c23c860120dca46029c5091ed39" Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.092292 4750 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"002ef66335ec3930b2387d7b6a1aabc0e9dd9c23c860120dca46029c5091ed39"} err="failed to get container status \"002ef66335ec3930b2387d7b6a1aabc0e9dd9c23c860120dca46029c5091ed39\": rpc error: code = NotFound desc = could not find container \"002ef66335ec3930b2387d7b6a1aabc0e9dd9c23c860120dca46029c5091ed39\": container with ID starting with 002ef66335ec3930b2387d7b6a1aabc0e9dd9c23c860120dca46029c5091ed39 not found: ID does not exist" Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.092524 4750 scope.go:117] "RemoveContainer" containerID="55582034ef300325a631fef3f0202c9c98b33f5820d42999c75a2b52eb156327" Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.094352 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.094611 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55582034ef300325a631fef3f0202c9c98b33f5820d42999c75a2b52eb156327"} err="failed to get container status \"55582034ef300325a631fef3f0202c9c98b33f5820d42999c75a2b52eb156327\": rpc error: code = NotFound desc = could not find container \"55582034ef300325a631fef3f0202c9c98b33f5820d42999c75a2b52eb156327\": container with ID starting with 55582034ef300325a631fef3f0202c9c98b33f5820d42999c75a2b52eb156327 not found: ID does not exist" Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.094720 4750 scope.go:117] "RemoveContainer" containerID="002ef66335ec3930b2387d7b6a1aabc0e9dd9c23c860120dca46029c5091ed39" Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.095402 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"002ef66335ec3930b2387d7b6a1aabc0e9dd9c23c860120dca46029c5091ed39"} err="failed to get container status \"002ef66335ec3930b2387d7b6a1aabc0e9dd9c23c860120dca46029c5091ed39\": rpc error: code = NotFound desc = could not find container 
\"002ef66335ec3930b2387d7b6a1aabc0e9dd9c23c860120dca46029c5091ed39\": container with ID starting with 002ef66335ec3930b2387d7b6a1aabc0e9dd9c23c860120dca46029c5091ed39 not found: ID does not exist" Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.099520 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.202818 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67d6b591-2df8-40b3-8420-a20514693983-config-data\") pod \"glance-default-external-api-0\" (UID: \"67d6b591-2df8-40b3-8420-a20514693983\") " pod="openstack/glance-default-external-api-0" Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.202896 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/67d6b591-2df8-40b3-8420-a20514693983-ceph\") pod \"glance-default-external-api-0\" (UID: \"67d6b591-2df8-40b3-8420-a20514693983\") " pod="openstack/glance-default-external-api-0" Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.202937 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67d6b591-2df8-40b3-8420-a20514693983-scripts\") pod \"glance-default-external-api-0\" (UID: \"67d6b591-2df8-40b3-8420-a20514693983\") " pod="openstack/glance-default-external-api-0" Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.202975 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67d6b591-2df8-40b3-8420-a20514693983-logs\") pod \"glance-default-external-api-0\" (UID: \"67d6b591-2df8-40b3-8420-a20514693983\") " pod="openstack/glance-default-external-api-0" Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.203020 
4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/67d6b591-2df8-40b3-8420-a20514693983-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"67d6b591-2df8-40b3-8420-a20514693983\") " pod="openstack/glance-default-external-api-0" Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.203049 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67d6b591-2df8-40b3-8420-a20514693983-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"67d6b591-2df8-40b3-8420-a20514693983\") " pod="openstack/glance-default-external-api-0" Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.203081 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgl7s\" (UniqueName: \"kubernetes.io/projected/67d6b591-2df8-40b3-8420-a20514693983-kube-api-access-jgl7s\") pod \"glance-default-external-api-0\" (UID: \"67d6b591-2df8-40b3-8420-a20514693983\") " pod="openstack/glance-default-external-api-0" Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.305027 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67d6b591-2df8-40b3-8420-a20514693983-scripts\") pod \"glance-default-external-api-0\" (UID: \"67d6b591-2df8-40b3-8420-a20514693983\") " pod="openstack/glance-default-external-api-0" Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.305149 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67d6b591-2df8-40b3-8420-a20514693983-logs\") pod \"glance-default-external-api-0\" (UID: \"67d6b591-2df8-40b3-8420-a20514693983\") " pod="openstack/glance-default-external-api-0" Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.305194 4750 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/67d6b591-2df8-40b3-8420-a20514693983-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"67d6b591-2df8-40b3-8420-a20514693983\") " pod="openstack/glance-default-external-api-0" Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.305229 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67d6b591-2df8-40b3-8420-a20514693983-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"67d6b591-2df8-40b3-8420-a20514693983\") " pod="openstack/glance-default-external-api-0" Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.305267 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgl7s\" (UniqueName: \"kubernetes.io/projected/67d6b591-2df8-40b3-8420-a20514693983-kube-api-access-jgl7s\") pod \"glance-default-external-api-0\" (UID: \"67d6b591-2df8-40b3-8420-a20514693983\") " pod="openstack/glance-default-external-api-0" Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.305365 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67d6b591-2df8-40b3-8420-a20514693983-config-data\") pod \"glance-default-external-api-0\" (UID: \"67d6b591-2df8-40b3-8420-a20514693983\") " pod="openstack/glance-default-external-api-0" Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.305413 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/67d6b591-2df8-40b3-8420-a20514693983-ceph\") pod \"glance-default-external-api-0\" (UID: \"67d6b591-2df8-40b3-8420-a20514693983\") " pod="openstack/glance-default-external-api-0" Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.306303 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67d6b591-2df8-40b3-8420-a20514693983-logs\") pod \"glance-default-external-api-0\" (UID: \"67d6b591-2df8-40b3-8420-a20514693983\") " pod="openstack/glance-default-external-api-0" Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.306434 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/67d6b591-2df8-40b3-8420-a20514693983-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"67d6b591-2df8-40b3-8420-a20514693983\") " pod="openstack/glance-default-external-api-0" Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.311644 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67d6b591-2df8-40b3-8420-a20514693983-config-data\") pod \"glance-default-external-api-0\" (UID: \"67d6b591-2df8-40b3-8420-a20514693983\") " pod="openstack/glance-default-external-api-0" Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.316353 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/67d6b591-2df8-40b3-8420-a20514693983-ceph\") pod \"glance-default-external-api-0\" (UID: \"67d6b591-2df8-40b3-8420-a20514693983\") " pod="openstack/glance-default-external-api-0" Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.316522 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67d6b591-2df8-40b3-8420-a20514693983-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"67d6b591-2df8-40b3-8420-a20514693983\") " pod="openstack/glance-default-external-api-0" Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.323541 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67d6b591-2df8-40b3-8420-a20514693983-scripts\") pod 
\"glance-default-external-api-0\" (UID: \"67d6b591-2df8-40b3-8420-a20514693983\") " pod="openstack/glance-default-external-api-0" Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.334565 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgl7s\" (UniqueName: \"kubernetes.io/projected/67d6b591-2df8-40b3-8420-a20514693983-kube-api-access-jgl7s\") pod \"glance-default-external-api-0\" (UID: \"67d6b591-2df8-40b3-8420-a20514693983\") " pod="openstack/glance-default-external-api-0" Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.417478 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.592238 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 19:44:20 crc kubenswrapper[4750]: I1008 19:44:20.748332 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4c4bb7f-cb87-499e-a0a4-6447303aebf1" path="/var/lib/kubelet/pods/a4c4bb7f-cb87-499e-a0a4-6447303aebf1/volumes" Oct 08 19:44:21 crc kubenswrapper[4750]: I1008 19:44:21.022137 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="d1864ef9-8a9e-4352-ad90-c70027c40eba" containerName="glance-log" containerID="cri-o://5f1ba1f59c107b906e54623700d89b2618c1893a0682f1679436839bd4b3d23c" gracePeriod=30 Oct 08 19:44:21 crc kubenswrapper[4750]: I1008 19:44:21.022236 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="d1864ef9-8a9e-4352-ad90-c70027c40eba" containerName="glance-httpd" containerID="cri-o://87a1063fe84363e123e00c1fe76647141b81e9f3728c360d5e438d4be886530b" gracePeriod=30 Oct 08 19:44:21 crc kubenswrapper[4750]: I1008 19:44:21.066070 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/glance-default-external-api-0"] Oct 08 19:44:21 crc kubenswrapper[4750]: W1008 19:44:21.070947 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67d6b591_2df8_40b3_8420_a20514693983.slice/crio-0718d318a6d97b888769dc756330838af4232356e416cab39896aacebfdebf05 WatchSource:0}: Error finding container 0718d318a6d97b888769dc756330838af4232356e416cab39896aacebfdebf05: Status 404 returned error can't find the container with id 0718d318a6d97b888769dc756330838af4232356e416cab39896aacebfdebf05 Oct 08 19:44:21 crc kubenswrapper[4750]: I1008 19:44:21.773460 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 19:44:21 crc kubenswrapper[4750]: I1008 19:44:21.844053 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7ms2\" (UniqueName: \"kubernetes.io/projected/d1864ef9-8a9e-4352-ad90-c70027c40eba-kube-api-access-l7ms2\") pod \"d1864ef9-8a9e-4352-ad90-c70027c40eba\" (UID: \"d1864ef9-8a9e-4352-ad90-c70027c40eba\") " Oct 08 19:44:21 crc kubenswrapper[4750]: I1008 19:44:21.844134 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d1864ef9-8a9e-4352-ad90-c70027c40eba-ceph\") pod \"d1864ef9-8a9e-4352-ad90-c70027c40eba\" (UID: \"d1864ef9-8a9e-4352-ad90-c70027c40eba\") " Oct 08 19:44:21 crc kubenswrapper[4750]: I1008 19:44:21.844238 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1864ef9-8a9e-4352-ad90-c70027c40eba-config-data\") pod \"d1864ef9-8a9e-4352-ad90-c70027c40eba\" (UID: \"d1864ef9-8a9e-4352-ad90-c70027c40eba\") " Oct 08 19:44:21 crc kubenswrapper[4750]: I1008 19:44:21.844342 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/d1864ef9-8a9e-4352-ad90-c70027c40eba-combined-ca-bundle\") pod \"d1864ef9-8a9e-4352-ad90-c70027c40eba\" (UID: \"d1864ef9-8a9e-4352-ad90-c70027c40eba\") " Oct 08 19:44:21 crc kubenswrapper[4750]: I1008 19:44:21.844372 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d1864ef9-8a9e-4352-ad90-c70027c40eba-httpd-run\") pod \"d1864ef9-8a9e-4352-ad90-c70027c40eba\" (UID: \"d1864ef9-8a9e-4352-ad90-c70027c40eba\") " Oct 08 19:44:21 crc kubenswrapper[4750]: I1008 19:44:21.844434 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1864ef9-8a9e-4352-ad90-c70027c40eba-logs\") pod \"d1864ef9-8a9e-4352-ad90-c70027c40eba\" (UID: \"d1864ef9-8a9e-4352-ad90-c70027c40eba\") " Oct 08 19:44:21 crc kubenswrapper[4750]: I1008 19:44:21.844521 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1864ef9-8a9e-4352-ad90-c70027c40eba-scripts\") pod \"d1864ef9-8a9e-4352-ad90-c70027c40eba\" (UID: \"d1864ef9-8a9e-4352-ad90-c70027c40eba\") " Oct 08 19:44:21 crc kubenswrapper[4750]: I1008 19:44:21.846901 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1864ef9-8a9e-4352-ad90-c70027c40eba-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d1864ef9-8a9e-4352-ad90-c70027c40eba" (UID: "d1864ef9-8a9e-4352-ad90-c70027c40eba"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:44:21 crc kubenswrapper[4750]: I1008 19:44:21.847358 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1864ef9-8a9e-4352-ad90-c70027c40eba-logs" (OuterVolumeSpecName: "logs") pod "d1864ef9-8a9e-4352-ad90-c70027c40eba" (UID: "d1864ef9-8a9e-4352-ad90-c70027c40eba"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:44:21 crc kubenswrapper[4750]: I1008 19:44:21.848958 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1864ef9-8a9e-4352-ad90-c70027c40eba-kube-api-access-l7ms2" (OuterVolumeSpecName: "kube-api-access-l7ms2") pod "d1864ef9-8a9e-4352-ad90-c70027c40eba" (UID: "d1864ef9-8a9e-4352-ad90-c70027c40eba"). InnerVolumeSpecName "kube-api-access-l7ms2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:44:21 crc kubenswrapper[4750]: I1008 19:44:21.850460 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1864ef9-8a9e-4352-ad90-c70027c40eba-scripts" (OuterVolumeSpecName: "scripts") pod "d1864ef9-8a9e-4352-ad90-c70027c40eba" (UID: "d1864ef9-8a9e-4352-ad90-c70027c40eba"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:44:21 crc kubenswrapper[4750]: I1008 19:44:21.851345 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1864ef9-8a9e-4352-ad90-c70027c40eba-ceph" (OuterVolumeSpecName: "ceph") pod "d1864ef9-8a9e-4352-ad90-c70027c40eba" (UID: "d1864ef9-8a9e-4352-ad90-c70027c40eba"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:44:21 crc kubenswrapper[4750]: I1008 19:44:21.883977 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1864ef9-8a9e-4352-ad90-c70027c40eba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d1864ef9-8a9e-4352-ad90-c70027c40eba" (UID: "d1864ef9-8a9e-4352-ad90-c70027c40eba"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:44:21 crc kubenswrapper[4750]: I1008 19:44:21.902239 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1864ef9-8a9e-4352-ad90-c70027c40eba-config-data" (OuterVolumeSpecName: "config-data") pod "d1864ef9-8a9e-4352-ad90-c70027c40eba" (UID: "d1864ef9-8a9e-4352-ad90-c70027c40eba"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:44:21 crc kubenswrapper[4750]: I1008 19:44:21.946664 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1864ef9-8a9e-4352-ad90-c70027c40eba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 19:44:21 crc kubenswrapper[4750]: I1008 19:44:21.946716 4750 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d1864ef9-8a9e-4352-ad90-c70027c40eba-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 08 19:44:21 crc kubenswrapper[4750]: I1008 19:44:21.946729 4750 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1864ef9-8a9e-4352-ad90-c70027c40eba-logs\") on node \"crc\" DevicePath \"\"" Oct 08 19:44:21 crc kubenswrapper[4750]: I1008 19:44:21.946740 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1864ef9-8a9e-4352-ad90-c70027c40eba-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 19:44:21 crc kubenswrapper[4750]: I1008 19:44:21.946749 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7ms2\" (UniqueName: \"kubernetes.io/projected/d1864ef9-8a9e-4352-ad90-c70027c40eba-kube-api-access-l7ms2\") on node \"crc\" DevicePath \"\"" Oct 08 19:44:21 crc kubenswrapper[4750]: I1008 19:44:21.946762 4750 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/d1864ef9-8a9e-4352-ad90-c70027c40eba-ceph\") on node \"crc\" DevicePath \"\"" Oct 08 19:44:21 crc kubenswrapper[4750]: I1008 19:44:21.946770 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1864ef9-8a9e-4352-ad90-c70027c40eba-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.050881 4750 generic.go:334] "Generic (PLEG): container finished" podID="d1864ef9-8a9e-4352-ad90-c70027c40eba" containerID="87a1063fe84363e123e00c1fe76647141b81e9f3728c360d5e438d4be886530b" exitCode=0 Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.050918 4750 generic.go:334] "Generic (PLEG): container finished" podID="d1864ef9-8a9e-4352-ad90-c70027c40eba" containerID="5f1ba1f59c107b906e54623700d89b2618c1893a0682f1679436839bd4b3d23c" exitCode=143 Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.050938 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.050980 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d1864ef9-8a9e-4352-ad90-c70027c40eba","Type":"ContainerDied","Data":"87a1063fe84363e123e00c1fe76647141b81e9f3728c360d5e438d4be886530b"} Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.051036 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d1864ef9-8a9e-4352-ad90-c70027c40eba","Type":"ContainerDied","Data":"5f1ba1f59c107b906e54623700d89b2618c1893a0682f1679436839bd4b3d23c"} Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.051048 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"d1864ef9-8a9e-4352-ad90-c70027c40eba","Type":"ContainerDied","Data":"1014a53dd16127d4bef7a1e1ee60da924ce3a8e8c7ca6f6250d1ea9dc7888692"} Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.051067 4750 scope.go:117] "RemoveContainer" containerID="87a1063fe84363e123e00c1fe76647141b81e9f3728c360d5e438d4be886530b" Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.059802 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"67d6b591-2df8-40b3-8420-a20514693983","Type":"ContainerStarted","Data":"20994929e1b89554a51abf0b62233d6f842765fe7f942a6c9ef3f20cca9600ba"} Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.059852 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"67d6b591-2df8-40b3-8420-a20514693983","Type":"ContainerStarted","Data":"0718d318a6d97b888769dc756330838af4232356e416cab39896aacebfdebf05"} Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.091732 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.102379 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.103857 4750 scope.go:117] "RemoveContainer" containerID="5f1ba1f59c107b906e54623700d89b2618c1893a0682f1679436839bd4b3d23c" Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.116270 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 19:44:22 crc kubenswrapper[4750]: E1008 19:44:22.117088 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1864ef9-8a9e-4352-ad90-c70027c40eba" containerName="glance-httpd" Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.117200 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1864ef9-8a9e-4352-ad90-c70027c40eba" 
containerName="glance-httpd" Oct 08 19:44:22 crc kubenswrapper[4750]: E1008 19:44:22.117317 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1864ef9-8a9e-4352-ad90-c70027c40eba" containerName="glance-log" Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.117392 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1864ef9-8a9e-4352-ad90-c70027c40eba" containerName="glance-log" Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.117733 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1864ef9-8a9e-4352-ad90-c70027c40eba" containerName="glance-log" Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.117859 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1864ef9-8a9e-4352-ad90-c70027c40eba" containerName="glance-httpd" Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.119227 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.123722 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.140384 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.180271 4750 scope.go:117] "RemoveContainer" containerID="87a1063fe84363e123e00c1fe76647141b81e9f3728c360d5e438d4be886530b" Oct 08 19:44:22 crc kubenswrapper[4750]: E1008 19:44:22.186939 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87a1063fe84363e123e00c1fe76647141b81e9f3728c360d5e438d4be886530b\": container with ID starting with 87a1063fe84363e123e00c1fe76647141b81e9f3728c360d5e438d4be886530b not found: ID does not exist" containerID="87a1063fe84363e123e00c1fe76647141b81e9f3728c360d5e438d4be886530b" Oct 08 
19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.187004 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87a1063fe84363e123e00c1fe76647141b81e9f3728c360d5e438d4be886530b"} err="failed to get container status \"87a1063fe84363e123e00c1fe76647141b81e9f3728c360d5e438d4be886530b\": rpc error: code = NotFound desc = could not find container \"87a1063fe84363e123e00c1fe76647141b81e9f3728c360d5e438d4be886530b\": container with ID starting with 87a1063fe84363e123e00c1fe76647141b81e9f3728c360d5e438d4be886530b not found: ID does not exist" Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.187045 4750 scope.go:117] "RemoveContainer" containerID="5f1ba1f59c107b906e54623700d89b2618c1893a0682f1679436839bd4b3d23c" Oct 08 19:44:22 crc kubenswrapper[4750]: E1008 19:44:22.188755 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f1ba1f59c107b906e54623700d89b2618c1893a0682f1679436839bd4b3d23c\": container with ID starting with 5f1ba1f59c107b906e54623700d89b2618c1893a0682f1679436839bd4b3d23c not found: ID does not exist" containerID="5f1ba1f59c107b906e54623700d89b2618c1893a0682f1679436839bd4b3d23c" Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.188815 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f1ba1f59c107b906e54623700d89b2618c1893a0682f1679436839bd4b3d23c"} err="failed to get container status \"5f1ba1f59c107b906e54623700d89b2618c1893a0682f1679436839bd4b3d23c\": rpc error: code = NotFound desc = could not find container \"5f1ba1f59c107b906e54623700d89b2618c1893a0682f1679436839bd4b3d23c\": container with ID starting with 5f1ba1f59c107b906e54623700d89b2618c1893a0682f1679436839bd4b3d23c not found: ID does not exist" Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.188848 4750 scope.go:117] "RemoveContainer" 
containerID="87a1063fe84363e123e00c1fe76647141b81e9f3728c360d5e438d4be886530b" Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.189781 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87a1063fe84363e123e00c1fe76647141b81e9f3728c360d5e438d4be886530b"} err="failed to get container status \"87a1063fe84363e123e00c1fe76647141b81e9f3728c360d5e438d4be886530b\": rpc error: code = NotFound desc = could not find container \"87a1063fe84363e123e00c1fe76647141b81e9f3728c360d5e438d4be886530b\": container with ID starting with 87a1063fe84363e123e00c1fe76647141b81e9f3728c360d5e438d4be886530b not found: ID does not exist" Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.189807 4750 scope.go:117] "RemoveContainer" containerID="5f1ba1f59c107b906e54623700d89b2618c1893a0682f1679436839bd4b3d23c" Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.190314 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f1ba1f59c107b906e54623700d89b2618c1893a0682f1679436839bd4b3d23c"} err="failed to get container status \"5f1ba1f59c107b906e54623700d89b2618c1893a0682f1679436839bd4b3d23c\": rpc error: code = NotFound desc = could not find container \"5f1ba1f59c107b906e54623700d89b2618c1893a0682f1679436839bd4b3d23c\": container with ID starting with 5f1ba1f59c107b906e54623700d89b2618c1893a0682f1679436839bd4b3d23c not found: ID does not exist" Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.255468 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bb76270-9267-460b-80fa-810d41aeb7fb-logs\") pod \"glance-default-internal-api-0\" (UID: \"4bb76270-9267-460b-80fa-810d41aeb7fb\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.255726 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/4bb76270-9267-460b-80fa-810d41aeb7fb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4bb76270-9267-460b-80fa-810d41aeb7fb\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.255885 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4bb76270-9267-460b-80fa-810d41aeb7fb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4bb76270-9267-460b-80fa-810d41aeb7fb\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.256215 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4bb76270-9267-460b-80fa-810d41aeb7fb-ceph\") pod \"glance-default-internal-api-0\" (UID: \"4bb76270-9267-460b-80fa-810d41aeb7fb\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.256363 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5mrx\" (UniqueName: \"kubernetes.io/projected/4bb76270-9267-460b-80fa-810d41aeb7fb-kube-api-access-g5mrx\") pod \"glance-default-internal-api-0\" (UID: \"4bb76270-9267-460b-80fa-810d41aeb7fb\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.256506 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bb76270-9267-460b-80fa-810d41aeb7fb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4bb76270-9267-460b-80fa-810d41aeb7fb\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.256690 4750 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bb76270-9267-460b-80fa-810d41aeb7fb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4bb76270-9267-460b-80fa-810d41aeb7fb\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.358216 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bb76270-9267-460b-80fa-810d41aeb7fb-logs\") pod \"glance-default-internal-api-0\" (UID: \"4bb76270-9267-460b-80fa-810d41aeb7fb\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.358285 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bb76270-9267-460b-80fa-810d41aeb7fb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4bb76270-9267-460b-80fa-810d41aeb7fb\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.358315 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4bb76270-9267-460b-80fa-810d41aeb7fb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4bb76270-9267-460b-80fa-810d41aeb7fb\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.358366 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4bb76270-9267-460b-80fa-810d41aeb7fb-ceph\") pod \"glance-default-internal-api-0\" (UID: \"4bb76270-9267-460b-80fa-810d41aeb7fb\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.358394 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5mrx\" (UniqueName: 
\"kubernetes.io/projected/4bb76270-9267-460b-80fa-810d41aeb7fb-kube-api-access-g5mrx\") pod \"glance-default-internal-api-0\" (UID: \"4bb76270-9267-460b-80fa-810d41aeb7fb\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.358434 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bb76270-9267-460b-80fa-810d41aeb7fb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4bb76270-9267-460b-80fa-810d41aeb7fb\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.358466 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bb76270-9267-460b-80fa-810d41aeb7fb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4bb76270-9267-460b-80fa-810d41aeb7fb\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.359264 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4bb76270-9267-460b-80fa-810d41aeb7fb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4bb76270-9267-460b-80fa-810d41aeb7fb\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.359610 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bb76270-9267-460b-80fa-810d41aeb7fb-logs\") pod \"glance-default-internal-api-0\" (UID: \"4bb76270-9267-460b-80fa-810d41aeb7fb\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.369340 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4bb76270-9267-460b-80fa-810d41aeb7fb-ceph\") pod 
\"glance-default-internal-api-0\" (UID: \"4bb76270-9267-460b-80fa-810d41aeb7fb\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.369340 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bb76270-9267-460b-80fa-810d41aeb7fb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4bb76270-9267-460b-80fa-810d41aeb7fb\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.369521 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bb76270-9267-460b-80fa-810d41aeb7fb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4bb76270-9267-460b-80fa-810d41aeb7fb\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.370378 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bb76270-9267-460b-80fa-810d41aeb7fb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4bb76270-9267-460b-80fa-810d41aeb7fb\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.379211 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5mrx\" (UniqueName: \"kubernetes.io/projected/4bb76270-9267-460b-80fa-810d41aeb7fb-kube-api-access-g5mrx\") pod \"glance-default-internal-api-0\" (UID: \"4bb76270-9267-460b-80fa-810d41aeb7fb\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.437732 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 19:44:22 crc kubenswrapper[4750]: I1008 19:44:22.745288 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1864ef9-8a9e-4352-ad90-c70027c40eba" path="/var/lib/kubelet/pods/d1864ef9-8a9e-4352-ad90-c70027c40eba/volumes" Oct 08 19:44:23 crc kubenswrapper[4750]: I1008 19:44:23.075466 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 19:44:23 crc kubenswrapper[4750]: I1008 19:44:23.077016 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"67d6b591-2df8-40b3-8420-a20514693983","Type":"ContainerStarted","Data":"7b3a0f3d51e5a8df0638b1b46f09c8aa6d87e23ec16a26de860a29a8f3f6c675"} Oct 08 19:44:23 crc kubenswrapper[4750]: W1008 19:44:23.084071 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4bb76270_9267_460b_80fa_810d41aeb7fb.slice/crio-83825ab29d79bd312410b7f70af3b1b89a1685b23c923142773371f95917db98 WatchSource:0}: Error finding container 83825ab29d79bd312410b7f70af3b1b89a1685b23c923142773371f95917db98: Status 404 returned error can't find the container with id 83825ab29d79bd312410b7f70af3b1b89a1685b23c923142773371f95917db98 Oct 08 19:44:23 crc kubenswrapper[4750]: I1008 19:44:23.105436 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.105413838 podStartE2EDuration="3.105413838s" podCreationTimestamp="2025-10-08 19:44:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:44:23.102385762 +0000 UTC m=+5619.015356785" watchObservedRunningTime="2025-10-08 19:44:23.105413838 +0000 UTC m=+5619.018384851" Oct 08 19:44:24 crc kubenswrapper[4750]: I1008 19:44:24.091872 4750 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4bb76270-9267-460b-80fa-810d41aeb7fb","Type":"ContainerStarted","Data":"0d2add7b7120c41614ab5f8f0b06197e17ae4119e1741f351fc99786e2fbbdf3"} Oct 08 19:44:24 crc kubenswrapper[4750]: I1008 19:44:24.092988 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4bb76270-9267-460b-80fa-810d41aeb7fb","Type":"ContainerStarted","Data":"83825ab29d79bd312410b7f70af3b1b89a1685b23c923142773371f95917db98"} Oct 08 19:44:25 crc kubenswrapper[4750]: I1008 19:44:25.109944 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4bb76270-9267-460b-80fa-810d41aeb7fb","Type":"ContainerStarted","Data":"c95f2218282619d05a7047a6fb4d8e2a9e9ade9e2329f6cecc789c7fee5230c4"} Oct 08 19:44:26 crc kubenswrapper[4750]: I1008 19:44:26.680007 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8646f84cbc-d7cc8" Oct 08 19:44:26 crc kubenswrapper[4750]: I1008 19:44:26.701788 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.701758052 podStartE2EDuration="4.701758052s" podCreationTimestamp="2025-10-08 19:44:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:44:25.143898579 +0000 UTC m=+5621.056869592" watchObservedRunningTime="2025-10-08 19:44:26.701758052 +0000 UTC m=+5622.614729065" Oct 08 19:44:26 crc kubenswrapper[4750]: I1008 19:44:26.754977 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dd4959d75-22kgk"] Oct 08 19:44:26 crc kubenswrapper[4750]: I1008 19:44:26.755308 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5dd4959d75-22kgk" 
podUID="c146308f-d2f5-4ae6-92b9-289dd99922ad" containerName="dnsmasq-dns" containerID="cri-o://29ee4b013b9e27d657267adbbc929c79032c1184cba1c5f7bf9f7c489916a04c" gracePeriod=10
Oct 08 19:44:27 crc kubenswrapper[4750]: I1008 19:44:27.162313 4750 generic.go:334] "Generic (PLEG): container finished" podID="c146308f-d2f5-4ae6-92b9-289dd99922ad" containerID="29ee4b013b9e27d657267adbbc929c79032c1184cba1c5f7bf9f7c489916a04c" exitCode=0
Oct 08 19:44:27 crc kubenswrapper[4750]: I1008 19:44:27.163054 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dd4959d75-22kgk" event={"ID":"c146308f-d2f5-4ae6-92b9-289dd99922ad","Type":"ContainerDied","Data":"29ee4b013b9e27d657267adbbc929c79032c1184cba1c5f7bf9f7c489916a04c"}
Oct 08 19:44:27 crc kubenswrapper[4750]: I1008 19:44:27.321466 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dd4959d75-22kgk"
Oct 08 19:44:27 crc kubenswrapper[4750]: I1008 19:44:27.475719 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c146308f-d2f5-4ae6-92b9-289dd99922ad-ovsdbserver-sb\") pod \"c146308f-d2f5-4ae6-92b9-289dd99922ad\" (UID: \"c146308f-d2f5-4ae6-92b9-289dd99922ad\") "
Oct 08 19:44:27 crc kubenswrapper[4750]: I1008 19:44:27.475843 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c146308f-d2f5-4ae6-92b9-289dd99922ad-dns-svc\") pod \"c146308f-d2f5-4ae6-92b9-289dd99922ad\" (UID: \"c146308f-d2f5-4ae6-92b9-289dd99922ad\") "
Oct 08 19:44:27 crc kubenswrapper[4750]: I1008 19:44:27.476017 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87nkp\" (UniqueName: \"kubernetes.io/projected/c146308f-d2f5-4ae6-92b9-289dd99922ad-kube-api-access-87nkp\") pod \"c146308f-d2f5-4ae6-92b9-289dd99922ad\" (UID: \"c146308f-d2f5-4ae6-92b9-289dd99922ad\") "
Oct 08 19:44:27 crc kubenswrapper[4750]: I1008 19:44:27.476099 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c146308f-d2f5-4ae6-92b9-289dd99922ad-ovsdbserver-nb\") pod \"c146308f-d2f5-4ae6-92b9-289dd99922ad\" (UID: \"c146308f-d2f5-4ae6-92b9-289dd99922ad\") "
Oct 08 19:44:27 crc kubenswrapper[4750]: I1008 19:44:27.476218 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c146308f-d2f5-4ae6-92b9-289dd99922ad-config\") pod \"c146308f-d2f5-4ae6-92b9-289dd99922ad\" (UID: \"c146308f-d2f5-4ae6-92b9-289dd99922ad\") "
Oct 08 19:44:27 crc kubenswrapper[4750]: I1008 19:44:27.487965 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c146308f-d2f5-4ae6-92b9-289dd99922ad-kube-api-access-87nkp" (OuterVolumeSpecName: "kube-api-access-87nkp") pod "c146308f-d2f5-4ae6-92b9-289dd99922ad" (UID: "c146308f-d2f5-4ae6-92b9-289dd99922ad"). InnerVolumeSpecName "kube-api-access-87nkp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 19:44:27 crc kubenswrapper[4750]: I1008 19:44:27.530110 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c146308f-d2f5-4ae6-92b9-289dd99922ad-config" (OuterVolumeSpecName: "config") pod "c146308f-d2f5-4ae6-92b9-289dd99922ad" (UID: "c146308f-d2f5-4ae6-92b9-289dd99922ad"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 19:44:27 crc kubenswrapper[4750]: I1008 19:44:27.544574 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c146308f-d2f5-4ae6-92b9-289dd99922ad-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c146308f-d2f5-4ae6-92b9-289dd99922ad" (UID: "c146308f-d2f5-4ae6-92b9-289dd99922ad"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 19:44:27 crc kubenswrapper[4750]: I1008 19:44:27.546543 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c146308f-d2f5-4ae6-92b9-289dd99922ad-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c146308f-d2f5-4ae6-92b9-289dd99922ad" (UID: "c146308f-d2f5-4ae6-92b9-289dd99922ad"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 19:44:27 crc kubenswrapper[4750]: I1008 19:44:27.550015 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c146308f-d2f5-4ae6-92b9-289dd99922ad-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c146308f-d2f5-4ae6-92b9-289dd99922ad" (UID: "c146308f-d2f5-4ae6-92b9-289dd99922ad"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 19:44:27 crc kubenswrapper[4750]: I1008 19:44:27.578455 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87nkp\" (UniqueName: \"kubernetes.io/projected/c146308f-d2f5-4ae6-92b9-289dd99922ad-kube-api-access-87nkp\") on node \"crc\" DevicePath \"\""
Oct 08 19:44:27 crc kubenswrapper[4750]: I1008 19:44:27.578492 4750 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c146308f-d2f5-4ae6-92b9-289dd99922ad-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 08 19:44:27 crc kubenswrapper[4750]: I1008 19:44:27.578504 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c146308f-d2f5-4ae6-92b9-289dd99922ad-config\") on node \"crc\" DevicePath \"\""
Oct 08 19:44:27 crc kubenswrapper[4750]: I1008 19:44:27.578513 4750 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c146308f-d2f5-4ae6-92b9-289dd99922ad-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 08 19:44:27 crc kubenswrapper[4750]: I1008 19:44:27.578524 4750 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c146308f-d2f5-4ae6-92b9-289dd99922ad-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 08 19:44:28 crc kubenswrapper[4750]: I1008 19:44:28.177820 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dd4959d75-22kgk" event={"ID":"c146308f-d2f5-4ae6-92b9-289dd99922ad","Type":"ContainerDied","Data":"e0da70fda9afa5fccafaa4202374fbfa03c108f342c6eef60d934351f37c700e"}
Oct 08 19:44:28 crc kubenswrapper[4750]: I1008 19:44:28.177912 4750 scope.go:117] "RemoveContainer" containerID="29ee4b013b9e27d657267adbbc929c79032c1184cba1c5f7bf9f7c489916a04c"
Oct 08 19:44:28 crc kubenswrapper[4750]: I1008 19:44:28.177972 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dd4959d75-22kgk"
Oct 08 19:44:28 crc kubenswrapper[4750]: I1008 19:44:28.208697 4750 scope.go:117] "RemoveContainer" containerID="116317564a520144e196ebb5c07616deaf7fdf6e10ad027224fb0f77947b76ea"
Oct 08 19:44:28 crc kubenswrapper[4750]: I1008 19:44:28.238217 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dd4959d75-22kgk"]
Oct 08 19:44:28 crc kubenswrapper[4750]: I1008 19:44:28.247780 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5dd4959d75-22kgk"]
Oct 08 19:44:28 crc kubenswrapper[4750]: I1008 19:44:28.755686 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c146308f-d2f5-4ae6-92b9-289dd99922ad" path="/var/lib/kubelet/pods/c146308f-d2f5-4ae6-92b9-289dd99922ad/volumes"
Oct 08 19:44:29 crc kubenswrapper[4750]: I1008 19:44:29.707018 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 08 19:44:29 crc kubenswrapper[4750]: I1008 19:44:29.707106 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 08 19:44:29 crc kubenswrapper[4750]: I1008 19:44:29.707171 4750 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-grddb"
Oct 08 19:44:29 crc kubenswrapper[4750]: I1008 19:44:29.708270 4750 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1499c5346a2a6057a03d90e92d9fde974f96055257fa4f56cf44bb72d0e5bf8e"} pod="openshift-machine-config-operator/machine-config-daemon-grddb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 08 19:44:29 crc kubenswrapper[4750]: I1008 19:44:29.708369 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" containerID="cri-o://1499c5346a2a6057a03d90e92d9fde974f96055257fa4f56cf44bb72d0e5bf8e" gracePeriod=600
Oct 08 19:44:30 crc kubenswrapper[4750]: I1008 19:44:30.207184 4750 generic.go:334] "Generic (PLEG): container finished" podID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerID="1499c5346a2a6057a03d90e92d9fde974f96055257fa4f56cf44bb72d0e5bf8e" exitCode=0
Oct 08 19:44:30 crc kubenswrapper[4750]: I1008 19:44:30.207275 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerDied","Data":"1499c5346a2a6057a03d90e92d9fde974f96055257fa4f56cf44bb72d0e5bf8e"}
Oct 08 19:44:30 crc kubenswrapper[4750]: I1008 19:44:30.207640 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerStarted","Data":"fcf13ef2112fe840107d0203594fc62839dd38db4382291c73d378bd35584c76"}
Oct 08 19:44:30 crc kubenswrapper[4750]: I1008 19:44:30.207683 4750 scope.go:117] "RemoveContainer" containerID="42c2a7a6ce3026c9ad8c8eff781fbf4cba179ca631dc8f4fcdd0691647588e7a"
Oct 08 19:44:30 crc kubenswrapper[4750]: I1008 19:44:30.417907 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Oct 08 19:44:30 crc kubenswrapper[4750]: I1008 19:44:30.417977 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Oct 08 19:44:30 crc kubenswrapper[4750]: I1008 19:44:30.469963 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Oct 08 19:44:30 crc kubenswrapper[4750]: I1008 19:44:30.477810 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Oct 08 19:44:31 crc kubenswrapper[4750]: I1008 19:44:31.228840 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Oct 08 19:44:31 crc kubenswrapper[4750]: I1008 19:44:31.228916 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Oct 08 19:44:32 crc kubenswrapper[4750]: I1008 19:44:32.438840 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Oct 08 19:44:32 crc kubenswrapper[4750]: I1008 19:44:32.442277 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Oct 08 19:44:32 crc kubenswrapper[4750]: I1008 19:44:32.470931 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Oct 08 19:44:32 crc kubenswrapper[4750]: I1008 19:44:32.496592 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Oct 08 19:44:33 crc kubenswrapper[4750]: I1008 19:44:33.251867 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Oct 08 19:44:33 crc kubenswrapper[4750]: I1008 19:44:33.251966 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Oct 08 19:44:33 crc kubenswrapper[4750]: I1008 19:44:33.411787 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Oct 08 19:44:33 crc kubenswrapper[4750]: I1008 19:44:33.411940 4750 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Oct 08 19:44:33 crc kubenswrapper[4750]: I1008 19:44:33.420616 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Oct 08 19:44:35 crc kubenswrapper[4750]: I1008 19:44:35.458519 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Oct 08 19:44:35 crc kubenswrapper[4750]: I1008 19:44:35.458889 4750 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Oct 08 19:44:35 crc kubenswrapper[4750]: I1008 19:44:35.491618 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Oct 08 19:44:42 crc kubenswrapper[4750]: I1008 19:44:42.040002 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-pjwpg"]
Oct 08 19:44:42 crc kubenswrapper[4750]: E1008 19:44:42.041902 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c146308f-d2f5-4ae6-92b9-289dd99922ad" containerName="dnsmasq-dns"
Oct 08 19:44:42 crc kubenswrapper[4750]: I1008 19:44:42.041918 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="c146308f-d2f5-4ae6-92b9-289dd99922ad" containerName="dnsmasq-dns"
Oct 08 19:44:42 crc kubenswrapper[4750]: E1008 19:44:42.041942 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c146308f-d2f5-4ae6-92b9-289dd99922ad" containerName="init"
Oct 08 19:44:42 crc kubenswrapper[4750]: I1008 19:44:42.041948 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="c146308f-d2f5-4ae6-92b9-289dd99922ad" containerName="init"
Oct 08 19:44:42 crc kubenswrapper[4750]: I1008 19:44:42.042133 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="c146308f-d2f5-4ae6-92b9-289dd99922ad" containerName="dnsmasq-dns"
Oct 08 19:44:42 crc kubenswrapper[4750]: I1008 19:44:42.042787 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-pjwpg"
Oct 08 19:44:42 crc kubenswrapper[4750]: I1008 19:44:42.050808 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-pjwpg"]
Oct 08 19:44:42 crc kubenswrapper[4750]: I1008 19:44:42.186291 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsbzj\" (UniqueName: \"kubernetes.io/projected/7698a080-0919-4c32-af78-fc68c8366657-kube-api-access-wsbzj\") pod \"placement-db-create-pjwpg\" (UID: \"7698a080-0919-4c32-af78-fc68c8366657\") " pod="openstack/placement-db-create-pjwpg"
Oct 08 19:44:42 crc kubenswrapper[4750]: I1008 19:44:42.289082 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsbzj\" (UniqueName: \"kubernetes.io/projected/7698a080-0919-4c32-af78-fc68c8366657-kube-api-access-wsbzj\") pod \"placement-db-create-pjwpg\" (UID: \"7698a080-0919-4c32-af78-fc68c8366657\") " pod="openstack/placement-db-create-pjwpg"
Oct 08 19:44:42 crc kubenswrapper[4750]: I1008 19:44:42.310908 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsbzj\" (UniqueName: \"kubernetes.io/projected/7698a080-0919-4c32-af78-fc68c8366657-kube-api-access-wsbzj\") pod \"placement-db-create-pjwpg\" (UID: \"7698a080-0919-4c32-af78-fc68c8366657\") " pod="openstack/placement-db-create-pjwpg"
Oct 08 19:44:42 crc kubenswrapper[4750]: I1008 19:44:42.364171 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-pjwpg"
Oct 08 19:44:42 crc kubenswrapper[4750]: I1008 19:44:42.822465 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-pjwpg"]
Oct 08 19:44:43 crc kubenswrapper[4750]: I1008 19:44:43.351368 4750 generic.go:334] "Generic (PLEG): container finished" podID="7698a080-0919-4c32-af78-fc68c8366657" containerID="39994094c13130c8f2f1ef3a4a7bb987fa44919de60ae130958a98428ffe21d6" exitCode=0
Oct 08 19:44:43 crc kubenswrapper[4750]: I1008 19:44:43.351429 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-pjwpg" event={"ID":"7698a080-0919-4c32-af78-fc68c8366657","Type":"ContainerDied","Data":"39994094c13130c8f2f1ef3a4a7bb987fa44919de60ae130958a98428ffe21d6"}
Oct 08 19:44:43 crc kubenswrapper[4750]: I1008 19:44:43.351465 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-pjwpg" event={"ID":"7698a080-0919-4c32-af78-fc68c8366657","Type":"ContainerStarted","Data":"6108143cecd80f0235e6c0ef698331fb03e199a9fe246b0309d799fc69047908"}
Oct 08 19:44:44 crc kubenswrapper[4750]: I1008 19:44:44.743594 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-pjwpg"
Oct 08 19:44:44 crc kubenswrapper[4750]: I1008 19:44:44.845513 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsbzj\" (UniqueName: \"kubernetes.io/projected/7698a080-0919-4c32-af78-fc68c8366657-kube-api-access-wsbzj\") pod \"7698a080-0919-4c32-af78-fc68c8366657\" (UID: \"7698a080-0919-4c32-af78-fc68c8366657\") "
Oct 08 19:44:44 crc kubenswrapper[4750]: I1008 19:44:44.856159 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7698a080-0919-4c32-af78-fc68c8366657-kube-api-access-wsbzj" (OuterVolumeSpecName: "kube-api-access-wsbzj") pod "7698a080-0919-4c32-af78-fc68c8366657" (UID: "7698a080-0919-4c32-af78-fc68c8366657"). InnerVolumeSpecName "kube-api-access-wsbzj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 19:44:44 crc kubenswrapper[4750]: I1008 19:44:44.948756 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsbzj\" (UniqueName: \"kubernetes.io/projected/7698a080-0919-4c32-af78-fc68c8366657-kube-api-access-wsbzj\") on node \"crc\" DevicePath \"\""
Oct 08 19:44:45 crc kubenswrapper[4750]: I1008 19:44:45.370819 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-pjwpg" event={"ID":"7698a080-0919-4c32-af78-fc68c8366657","Type":"ContainerDied","Data":"6108143cecd80f0235e6c0ef698331fb03e199a9fe246b0309d799fc69047908"}
Oct 08 19:44:45 crc kubenswrapper[4750]: I1008 19:44:45.370873 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6108143cecd80f0235e6c0ef698331fb03e199a9fe246b0309d799fc69047908"
Oct 08 19:44:45 crc kubenswrapper[4750]: I1008 19:44:45.371333 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-pjwpg"
Oct 08 19:44:52 crc kubenswrapper[4750]: I1008 19:44:52.188184 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-d0fe-account-create-rj8v8"]
Oct 08 19:44:52 crc kubenswrapper[4750]: E1008 19:44:52.189371 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7698a080-0919-4c32-af78-fc68c8366657" containerName="mariadb-database-create"
Oct 08 19:44:52 crc kubenswrapper[4750]: I1008 19:44:52.189387 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="7698a080-0919-4c32-af78-fc68c8366657" containerName="mariadb-database-create"
Oct 08 19:44:52 crc kubenswrapper[4750]: I1008 19:44:52.189613 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="7698a080-0919-4c32-af78-fc68c8366657" containerName="mariadb-database-create"
Oct 08 19:44:52 crc kubenswrapper[4750]: I1008 19:44:52.190416 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d0fe-account-create-rj8v8"
Oct 08 19:44:52 crc kubenswrapper[4750]: I1008 19:44:52.195737 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Oct 08 19:44:52 crc kubenswrapper[4750]: I1008 19:44:52.202337 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d0fe-account-create-rj8v8"]
Oct 08 19:44:52 crc kubenswrapper[4750]: I1008 19:44:52.303575 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97kpv\" (UniqueName: \"kubernetes.io/projected/a5b608fd-2863-4703-8c27-d35888393aeb-kube-api-access-97kpv\") pod \"placement-d0fe-account-create-rj8v8\" (UID: \"a5b608fd-2863-4703-8c27-d35888393aeb\") " pod="openstack/placement-d0fe-account-create-rj8v8"
Oct 08 19:44:52 crc kubenswrapper[4750]: I1008 19:44:52.405883 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97kpv\" (UniqueName: \"kubernetes.io/projected/a5b608fd-2863-4703-8c27-d35888393aeb-kube-api-access-97kpv\") pod \"placement-d0fe-account-create-rj8v8\" (UID: \"a5b608fd-2863-4703-8c27-d35888393aeb\") " pod="openstack/placement-d0fe-account-create-rj8v8"
Oct 08 19:44:52 crc kubenswrapper[4750]: I1008 19:44:52.428133 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97kpv\" (UniqueName: \"kubernetes.io/projected/a5b608fd-2863-4703-8c27-d35888393aeb-kube-api-access-97kpv\") pod \"placement-d0fe-account-create-rj8v8\" (UID: \"a5b608fd-2863-4703-8c27-d35888393aeb\") " pod="openstack/placement-d0fe-account-create-rj8v8"
Oct 08 19:44:52 crc kubenswrapper[4750]: I1008 19:44:52.529642 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d0fe-account-create-rj8v8"
Oct 08 19:44:53 crc kubenswrapper[4750]: I1008 19:44:53.105176 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d0fe-account-create-rj8v8"]
Oct 08 19:44:53 crc kubenswrapper[4750]: I1008 19:44:53.490803 4750 generic.go:334] "Generic (PLEG): container finished" podID="a5b608fd-2863-4703-8c27-d35888393aeb" containerID="b1ea89a72a627d646a4a35429444710d584da64d9074c7d079cb546b47a28223" exitCode=0
Oct 08 19:44:53 crc kubenswrapper[4750]: I1008 19:44:53.490885 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d0fe-account-create-rj8v8" event={"ID":"a5b608fd-2863-4703-8c27-d35888393aeb","Type":"ContainerDied","Data":"b1ea89a72a627d646a4a35429444710d584da64d9074c7d079cb546b47a28223"}
Oct 08 19:44:53 crc kubenswrapper[4750]: I1008 19:44:53.490937 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d0fe-account-create-rj8v8" event={"ID":"a5b608fd-2863-4703-8c27-d35888393aeb","Type":"ContainerStarted","Data":"26f0a93420d2a71bba3ce9ed96d8b3c878c702711e35d788ef2638ae313a4083"}
Oct 08 19:44:54 crc kubenswrapper[4750]: I1008 19:44:54.861674 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d0fe-account-create-rj8v8"
Oct 08 19:44:54 crc kubenswrapper[4750]: I1008 19:44:54.964787 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97kpv\" (UniqueName: \"kubernetes.io/projected/a5b608fd-2863-4703-8c27-d35888393aeb-kube-api-access-97kpv\") pod \"a5b608fd-2863-4703-8c27-d35888393aeb\" (UID: \"a5b608fd-2863-4703-8c27-d35888393aeb\") "
Oct 08 19:44:54 crc kubenswrapper[4750]: I1008 19:44:54.971762 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5b608fd-2863-4703-8c27-d35888393aeb-kube-api-access-97kpv" (OuterVolumeSpecName: "kube-api-access-97kpv") pod "a5b608fd-2863-4703-8c27-d35888393aeb" (UID: "a5b608fd-2863-4703-8c27-d35888393aeb"). InnerVolumeSpecName "kube-api-access-97kpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 19:44:55 crc kubenswrapper[4750]: I1008 19:44:55.068244 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97kpv\" (UniqueName: \"kubernetes.io/projected/a5b608fd-2863-4703-8c27-d35888393aeb-kube-api-access-97kpv\") on node \"crc\" DevicePath \"\""
Oct 08 19:44:55 crc kubenswrapper[4750]: I1008 19:44:55.517864 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d0fe-account-create-rj8v8" event={"ID":"a5b608fd-2863-4703-8c27-d35888393aeb","Type":"ContainerDied","Data":"26f0a93420d2a71bba3ce9ed96d8b3c878c702711e35d788ef2638ae313a4083"}
Oct 08 19:44:55 crc kubenswrapper[4750]: I1008 19:44:55.517937 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26f0a93420d2a71bba3ce9ed96d8b3c878c702711e35d788ef2638ae313a4083"
Oct 08 19:44:55 crc kubenswrapper[4750]: I1008 19:44:55.517943 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d0fe-account-create-rj8v8"
Oct 08 19:44:57 crc kubenswrapper[4750]: I1008 19:44:57.539905 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f476f79f5-78nfg"]
Oct 08 19:44:57 crc kubenswrapper[4750]: E1008 19:44:57.540984 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5b608fd-2863-4703-8c27-d35888393aeb" containerName="mariadb-account-create"
Oct 08 19:44:57 crc kubenswrapper[4750]: I1008 19:44:57.541006 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5b608fd-2863-4703-8c27-d35888393aeb" containerName="mariadb-account-create"
Oct 08 19:44:57 crc kubenswrapper[4750]: I1008 19:44:57.541422 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5b608fd-2863-4703-8c27-d35888393aeb" containerName="mariadb-account-create"
Oct 08 19:44:57 crc kubenswrapper[4750]: I1008 19:44:57.543510 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f476f79f5-78nfg"
Oct 08 19:44:57 crc kubenswrapper[4750]: I1008 19:44:57.562702 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-69hsp"]
Oct 08 19:44:57 crc kubenswrapper[4750]: I1008 19:44:57.566589 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-69hsp"
Oct 08 19:44:57 crc kubenswrapper[4750]: I1008 19:44:57.592560 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f476f79f5-78nfg"]
Oct 08 19:44:57 crc kubenswrapper[4750]: I1008 19:44:57.593621 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-nm5tl"
Oct 08 19:44:57 crc kubenswrapper[4750]: I1008 19:44:57.594845 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Oct 08 19:44:57 crc kubenswrapper[4750]: I1008 19:44:57.596202 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Oct 08 19:44:57 crc kubenswrapper[4750]: I1008 19:44:57.623072 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee-scripts\") pod \"placement-db-sync-69hsp\" (UID: \"aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee\") " pod="openstack/placement-db-sync-69hsp"
Oct 08 19:44:57 crc kubenswrapper[4750]: I1008 19:44:57.623120 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee-logs\") pod \"placement-db-sync-69hsp\" (UID: \"aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee\") " pod="openstack/placement-db-sync-69hsp"
Oct 08 19:44:57 crc kubenswrapper[4750]: I1008 19:44:57.623147 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/217c91ff-d9a5-4349-a5a8-36e593581c92-ovsdbserver-sb\") pod \"dnsmasq-dns-6f476f79f5-78nfg\" (UID: \"217c91ff-d9a5-4349-a5a8-36e593581c92\") " pod="openstack/dnsmasq-dns-6f476f79f5-78nfg"
Oct 08 19:44:57 crc kubenswrapper[4750]: I1008 19:44:57.623183 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/217c91ff-d9a5-4349-a5a8-36e593581c92-dns-svc\") pod \"dnsmasq-dns-6f476f79f5-78nfg\" (UID: \"217c91ff-d9a5-4349-a5a8-36e593581c92\") " pod="openstack/dnsmasq-dns-6f476f79f5-78nfg"
Oct 08 19:44:57 crc kubenswrapper[4750]: I1008 19:44:57.623222 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee-config-data\") pod \"placement-db-sync-69hsp\" (UID: \"aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee\") " pod="openstack/placement-db-sync-69hsp"
Oct 08 19:44:57 crc kubenswrapper[4750]: I1008 19:44:57.623240 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnjns\" (UniqueName: \"kubernetes.io/projected/aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee-kube-api-access-nnjns\") pod \"placement-db-sync-69hsp\" (UID: \"aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee\") " pod="openstack/placement-db-sync-69hsp"
Oct 08 19:44:57 crc kubenswrapper[4750]: I1008 19:44:57.623258 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/217c91ff-d9a5-4349-a5a8-36e593581c92-config\") pod \"dnsmasq-dns-6f476f79f5-78nfg\" (UID: \"217c91ff-d9a5-4349-a5a8-36e593581c92\") " pod="openstack/dnsmasq-dns-6f476f79f5-78nfg"
Oct 08 19:44:57 crc kubenswrapper[4750]: I1008 19:44:57.623384 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frtvh\" (UniqueName: \"kubernetes.io/projected/217c91ff-d9a5-4349-a5a8-36e593581c92-kube-api-access-frtvh\") pod \"dnsmasq-dns-6f476f79f5-78nfg\" (UID: \"217c91ff-d9a5-4349-a5a8-36e593581c92\") " pod="openstack/dnsmasq-dns-6f476f79f5-78nfg"
Oct 08 19:44:57 crc kubenswrapper[4750]: I1008 19:44:57.623407 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee-combined-ca-bundle\") pod \"placement-db-sync-69hsp\" (UID: \"aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee\") " pod="openstack/placement-db-sync-69hsp"
Oct 08 19:44:57 crc kubenswrapper[4750]: I1008 19:44:57.623437 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/217c91ff-d9a5-4349-a5a8-36e593581c92-ovsdbserver-nb\") pod \"dnsmasq-dns-6f476f79f5-78nfg\" (UID: \"217c91ff-d9a5-4349-a5a8-36e593581c92\") " pod="openstack/dnsmasq-dns-6f476f79f5-78nfg"
Oct 08 19:44:57 crc kubenswrapper[4750]: I1008 19:44:57.631848 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-69hsp"]
Oct 08 19:44:57 crc kubenswrapper[4750]: I1008 19:44:57.726435 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee-config-data\") pod \"placement-db-sync-69hsp\" (UID: \"aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee\") " pod="openstack/placement-db-sync-69hsp"
Oct 08 19:44:57 crc kubenswrapper[4750]: I1008 19:44:57.726497 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnjns\" (UniqueName: \"kubernetes.io/projected/aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee-kube-api-access-nnjns\") pod \"placement-db-sync-69hsp\" (UID: \"aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee\") " pod="openstack/placement-db-sync-69hsp"
Oct 08 19:44:57 crc kubenswrapper[4750]: I1008 19:44:57.726531 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/217c91ff-d9a5-4349-a5a8-36e593581c92-config\") pod \"dnsmasq-dns-6f476f79f5-78nfg\" (UID: \"217c91ff-d9a5-4349-a5a8-36e593581c92\") " pod="openstack/dnsmasq-dns-6f476f79f5-78nfg"
Oct 08 19:44:57 crc kubenswrapper[4750]: I1008 19:44:57.726657 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frtvh\" (UniqueName: \"kubernetes.io/projected/217c91ff-d9a5-4349-a5a8-36e593581c92-kube-api-access-frtvh\") pod \"dnsmasq-dns-6f476f79f5-78nfg\" (UID: \"217c91ff-d9a5-4349-a5a8-36e593581c92\") " pod="openstack/dnsmasq-dns-6f476f79f5-78nfg"
Oct 08 19:44:57 crc kubenswrapper[4750]: I1008 19:44:57.726705 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee-combined-ca-bundle\") pod \"placement-db-sync-69hsp\" (UID: \"aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee\") " pod="openstack/placement-db-sync-69hsp"
Oct 08 19:44:57 crc kubenswrapper[4750]: I1008 19:44:57.726780 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/217c91ff-d9a5-4349-a5a8-36e593581c92-ovsdbserver-nb\") pod \"dnsmasq-dns-6f476f79f5-78nfg\" (UID: \"217c91ff-d9a5-4349-a5a8-36e593581c92\") " pod="openstack/dnsmasq-dns-6f476f79f5-78nfg"
Oct 08 19:44:57 crc kubenswrapper[4750]: I1008 19:44:57.726925 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee-scripts\") pod \"placement-db-sync-69hsp\" (UID: \"aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee\") " pod="openstack/placement-db-sync-69hsp"
Oct 08 19:44:57 crc kubenswrapper[4750]: I1008 19:44:57.726958 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee-logs\") pod \"placement-db-sync-69hsp\" (UID: \"aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee\") " pod="openstack/placement-db-sync-69hsp"
Oct 08 19:44:57 crc kubenswrapper[4750]: I1008 19:44:57.727002 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/217c91ff-d9a5-4349-a5a8-36e593581c92-ovsdbserver-sb\") pod \"dnsmasq-dns-6f476f79f5-78nfg\" (UID: \"217c91ff-d9a5-4349-a5a8-36e593581c92\") " pod="openstack/dnsmasq-dns-6f476f79f5-78nfg"
Oct 08 19:44:57 crc kubenswrapper[4750]: I1008 19:44:57.727087 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/217c91ff-d9a5-4349-a5a8-36e593581c92-dns-svc\") pod \"dnsmasq-dns-6f476f79f5-78nfg\" (UID: \"217c91ff-d9a5-4349-a5a8-36e593581c92\") " pod="openstack/dnsmasq-dns-6f476f79f5-78nfg"
Oct 08 19:44:57 crc kubenswrapper[4750]: I1008 19:44:57.728285 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/217c91ff-d9a5-4349-a5a8-36e593581c92-dns-svc\") pod \"dnsmasq-dns-6f476f79f5-78nfg\" (UID: \"217c91ff-d9a5-4349-a5a8-36e593581c92\") " pod="openstack/dnsmasq-dns-6f476f79f5-78nfg"
Oct 08 19:44:57 crc kubenswrapper[4750]: I1008 19:44:57.729338 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/217c91ff-d9a5-4349-a5a8-36e593581c92-config\") pod \"dnsmasq-dns-6f476f79f5-78nfg\" (UID: \"217c91ff-d9a5-4349-a5a8-36e593581c92\") " pod="openstack/dnsmasq-dns-6f476f79f5-78nfg"
Oct 08 19:44:57 crc kubenswrapper[4750]: I1008 19:44:57.730749 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/217c91ff-d9a5-4349-a5a8-36e593581c92-ovsdbserver-nb\") pod \"dnsmasq-dns-6f476f79f5-78nfg\" (UID: \"217c91ff-d9a5-4349-a5a8-36e593581c92\") " pod="openstack/dnsmasq-dns-6f476f79f5-78nfg"
Oct 08 19:44:57 crc kubenswrapper[4750]: I1008 19:44:57.730745 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee-logs\") pod \"placement-db-sync-69hsp\" (UID: \"aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee\") " pod="openstack/placement-db-sync-69hsp"
Oct 08 19:44:57 crc kubenswrapper[4750]: I1008 19:44:57.731774 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/217c91ff-d9a5-4349-a5a8-36e593581c92-ovsdbserver-sb\") pod \"dnsmasq-dns-6f476f79f5-78nfg\" (UID: \"217c91ff-d9a5-4349-a5a8-36e593581c92\") " pod="openstack/dnsmasq-dns-6f476f79f5-78nfg"
Oct 08 19:44:57 crc kubenswrapper[4750]: I1008 19:44:57.735721 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee-config-data\") pod \"placement-db-sync-69hsp\" (UID: \"aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee\") " pod="openstack/placement-db-sync-69hsp"
Oct 08 19:44:57 crc kubenswrapper[4750]: I1008 19:44:57.736006 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee-combined-ca-bundle\") pod \"placement-db-sync-69hsp\" (UID: \"aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee\") " pod="openstack/placement-db-sync-69hsp"
Oct 08 19:44:57 crc kubenswrapper[4750]: I1008 19:44:57.746216 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee-scripts\") pod \"placement-db-sync-69hsp\" (UID: \"aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee\") " pod="openstack/placement-db-sync-69hsp"
Oct 08 19:44:57 crc kubenswrapper[4750]: I1008 19:44:57.746810 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnjns\" (UniqueName: \"kubernetes.io/projected/aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee-kube-api-access-nnjns\") pod \"placement-db-sync-69hsp\" (UID: 
\"aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee\") " pod="openstack/placement-db-sync-69hsp" Oct 08 19:44:57 crc kubenswrapper[4750]: I1008 19:44:57.757761 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frtvh\" (UniqueName: \"kubernetes.io/projected/217c91ff-d9a5-4349-a5a8-36e593581c92-kube-api-access-frtvh\") pod \"dnsmasq-dns-6f476f79f5-78nfg\" (UID: \"217c91ff-d9a5-4349-a5a8-36e593581c92\") " pod="openstack/dnsmasq-dns-6f476f79f5-78nfg" Oct 08 19:44:57 crc kubenswrapper[4750]: I1008 19:44:57.902141 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f476f79f5-78nfg" Oct 08 19:44:57 crc kubenswrapper[4750]: I1008 19:44:57.921771 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-69hsp" Oct 08 19:44:58 crc kubenswrapper[4750]: I1008 19:44:58.394360 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f476f79f5-78nfg"] Oct 08 19:44:58 crc kubenswrapper[4750]: I1008 19:44:58.467899 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-69hsp"] Oct 08 19:44:58 crc kubenswrapper[4750]: W1008 19:44:58.470765 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaadb2ce3_1c9b_4af1_b4ad_3e310b2778ee.slice/crio-8229a7a5c8724f2dbd39f7a9b3cfafd706d7356260b97b1f0f6521e6b8de3415 WatchSource:0}: Error finding container 8229a7a5c8724f2dbd39f7a9b3cfafd706d7356260b97b1f0f6521e6b8de3415: Status 404 returned error can't find the container with id 8229a7a5c8724f2dbd39f7a9b3cfafd706d7356260b97b1f0f6521e6b8de3415 Oct 08 19:44:58 crc kubenswrapper[4750]: I1008 19:44:58.576711 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-69hsp" 
event={"ID":"aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee","Type":"ContainerStarted","Data":"8229a7a5c8724f2dbd39f7a9b3cfafd706d7356260b97b1f0f6521e6b8de3415"} Oct 08 19:44:58 crc kubenswrapper[4750]: I1008 19:44:58.578298 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f476f79f5-78nfg" event={"ID":"217c91ff-d9a5-4349-a5a8-36e593581c92","Type":"ContainerStarted","Data":"d3b1eed7e1b4580cecf94cfac7dc2215a66b42450ed2f7a0df731ce4c617d37d"} Oct 08 19:44:59 crc kubenswrapper[4750]: I1008 19:44:59.591928 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-69hsp" event={"ID":"aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee","Type":"ContainerStarted","Data":"877da3de27e88bdb0a05e13b6550d49dd3a15973b1604f80d0fd3a6f246d0243"} Oct 08 19:44:59 crc kubenswrapper[4750]: I1008 19:44:59.594474 4750 generic.go:334] "Generic (PLEG): container finished" podID="217c91ff-d9a5-4349-a5a8-36e593581c92" containerID="9fdba9f656ebebf2ae6febcd687a5d74d71a878b9b3bdedb83276d7cadb4c96b" exitCode=0 Oct 08 19:44:59 crc kubenswrapper[4750]: I1008 19:44:59.594518 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f476f79f5-78nfg" event={"ID":"217c91ff-d9a5-4349-a5a8-36e593581c92","Type":"ContainerDied","Data":"9fdba9f656ebebf2ae6febcd687a5d74d71a878b9b3bdedb83276d7cadb4c96b"} Oct 08 19:44:59 crc kubenswrapper[4750]: I1008 19:44:59.626064 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-69hsp" podStartSLOduration=2.62603282 podStartE2EDuration="2.62603282s" podCreationTimestamp="2025-10-08 19:44:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:44:59.621930988 +0000 UTC m=+5655.534902001" watchObservedRunningTime="2025-10-08 19:44:59.62603282 +0000 UTC m=+5655.539003843" Oct 08 19:45:00 crc kubenswrapper[4750]: I1008 19:45:00.170699 4750 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332545-qrdt2"] Oct 08 19:45:00 crc kubenswrapper[4750]: I1008 19:45:00.175060 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332545-qrdt2" Oct 08 19:45:00 crc kubenswrapper[4750]: I1008 19:45:00.178101 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 08 19:45:00 crc kubenswrapper[4750]: I1008 19:45:00.179183 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 08 19:45:00 crc kubenswrapper[4750]: I1008 19:45:00.187840 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332545-qrdt2"] Oct 08 19:45:00 crc kubenswrapper[4750]: I1008 19:45:00.304843 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh6qj\" (UniqueName: \"kubernetes.io/projected/1c05aa01-f19d-4035-8bc4-d0f4ff0c9cea-kube-api-access-rh6qj\") pod \"collect-profiles-29332545-qrdt2\" (UID: \"1c05aa01-f19d-4035-8bc4-d0f4ff0c9cea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332545-qrdt2" Oct 08 19:45:00 crc kubenswrapper[4750]: I1008 19:45:00.305369 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1c05aa01-f19d-4035-8bc4-d0f4ff0c9cea-secret-volume\") pod \"collect-profiles-29332545-qrdt2\" (UID: \"1c05aa01-f19d-4035-8bc4-d0f4ff0c9cea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332545-qrdt2" Oct 08 19:45:00 crc kubenswrapper[4750]: I1008 19:45:00.305427 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" 
(UniqueName: \"kubernetes.io/configmap/1c05aa01-f19d-4035-8bc4-d0f4ff0c9cea-config-volume\") pod \"collect-profiles-29332545-qrdt2\" (UID: \"1c05aa01-f19d-4035-8bc4-d0f4ff0c9cea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332545-qrdt2" Oct 08 19:45:00 crc kubenswrapper[4750]: I1008 19:45:00.407456 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1c05aa01-f19d-4035-8bc4-d0f4ff0c9cea-secret-volume\") pod \"collect-profiles-29332545-qrdt2\" (UID: \"1c05aa01-f19d-4035-8bc4-d0f4ff0c9cea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332545-qrdt2" Oct 08 19:45:00 crc kubenswrapper[4750]: I1008 19:45:00.407563 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c05aa01-f19d-4035-8bc4-d0f4ff0c9cea-config-volume\") pod \"collect-profiles-29332545-qrdt2\" (UID: \"1c05aa01-f19d-4035-8bc4-d0f4ff0c9cea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332545-qrdt2" Oct 08 19:45:00 crc kubenswrapper[4750]: I1008 19:45:00.407638 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rh6qj\" (UniqueName: \"kubernetes.io/projected/1c05aa01-f19d-4035-8bc4-d0f4ff0c9cea-kube-api-access-rh6qj\") pod \"collect-profiles-29332545-qrdt2\" (UID: \"1c05aa01-f19d-4035-8bc4-d0f4ff0c9cea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332545-qrdt2" Oct 08 19:45:00 crc kubenswrapper[4750]: I1008 19:45:00.408835 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c05aa01-f19d-4035-8bc4-d0f4ff0c9cea-config-volume\") pod \"collect-profiles-29332545-qrdt2\" (UID: \"1c05aa01-f19d-4035-8bc4-d0f4ff0c9cea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332545-qrdt2" Oct 08 19:45:00 crc kubenswrapper[4750]: I1008 
19:45:00.412114 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1c05aa01-f19d-4035-8bc4-d0f4ff0c9cea-secret-volume\") pod \"collect-profiles-29332545-qrdt2\" (UID: \"1c05aa01-f19d-4035-8bc4-d0f4ff0c9cea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332545-qrdt2" Oct 08 19:45:00 crc kubenswrapper[4750]: I1008 19:45:00.434449 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh6qj\" (UniqueName: \"kubernetes.io/projected/1c05aa01-f19d-4035-8bc4-d0f4ff0c9cea-kube-api-access-rh6qj\") pod \"collect-profiles-29332545-qrdt2\" (UID: \"1c05aa01-f19d-4035-8bc4-d0f4ff0c9cea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332545-qrdt2" Oct 08 19:45:00 crc kubenswrapper[4750]: I1008 19:45:00.502494 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332545-qrdt2" Oct 08 19:45:00 crc kubenswrapper[4750]: I1008 19:45:00.639439 4750 generic.go:334] "Generic (PLEG): container finished" podID="aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee" containerID="877da3de27e88bdb0a05e13b6550d49dd3a15973b1604f80d0fd3a6f246d0243" exitCode=0 Oct 08 19:45:00 crc kubenswrapper[4750]: I1008 19:45:00.639654 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-69hsp" event={"ID":"aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee","Type":"ContainerDied","Data":"877da3de27e88bdb0a05e13b6550d49dd3a15973b1604f80d0fd3a6f246d0243"} Oct 08 19:45:00 crc kubenswrapper[4750]: I1008 19:45:00.643101 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f476f79f5-78nfg" event={"ID":"217c91ff-d9a5-4349-a5a8-36e593581c92","Type":"ContainerStarted","Data":"b62c940475668778ef6696dae355e3efec8f1f3092dfa5566b06bbfe9eee04d7"} Oct 08 19:45:00 crc kubenswrapper[4750]: I1008 19:45:00.643275 4750 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/dnsmasq-dns-6f476f79f5-78nfg" Oct 08 19:45:00 crc kubenswrapper[4750]: I1008 19:45:00.695593 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6f476f79f5-78nfg" podStartSLOduration=3.695541714 podStartE2EDuration="3.695541714s" podCreationTimestamp="2025-10-08 19:44:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:45:00.688395856 +0000 UTC m=+5656.601366899" watchObservedRunningTime="2025-10-08 19:45:00.695541714 +0000 UTC m=+5656.608512767" Oct 08 19:45:00 crc kubenswrapper[4750]: I1008 19:45:00.976828 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332545-qrdt2"] Oct 08 19:45:00 crc kubenswrapper[4750]: W1008 19:45:00.994625 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c05aa01_f19d_4035_8bc4_d0f4ff0c9cea.slice/crio-9917477b8bd591adbaf4122ccc6b16233332c812f59e22146a27fc8e5f897ab4 WatchSource:0}: Error finding container 9917477b8bd591adbaf4122ccc6b16233332c812f59e22146a27fc8e5f897ab4: Status 404 returned error can't find the container with id 9917477b8bd591adbaf4122ccc6b16233332c812f59e22146a27fc8e5f897ab4 Oct 08 19:45:01 crc kubenswrapper[4750]: I1008 19:45:01.655537 4750 generic.go:334] "Generic (PLEG): container finished" podID="1c05aa01-f19d-4035-8bc4-d0f4ff0c9cea" containerID="059f934c1a316501130d1d73a2c099cb2a834a707c5e8e6ff24ca5201d5218e0" exitCode=0 Oct 08 19:45:01 crc kubenswrapper[4750]: I1008 19:45:01.655675 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332545-qrdt2" event={"ID":"1c05aa01-f19d-4035-8bc4-d0f4ff0c9cea","Type":"ContainerDied","Data":"059f934c1a316501130d1d73a2c099cb2a834a707c5e8e6ff24ca5201d5218e0"} Oct 08 19:45:01 crc 
kubenswrapper[4750]: I1008 19:45:01.656176 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332545-qrdt2" event={"ID":"1c05aa01-f19d-4035-8bc4-d0f4ff0c9cea","Type":"ContainerStarted","Data":"9917477b8bd591adbaf4122ccc6b16233332c812f59e22146a27fc8e5f897ab4"} Oct 08 19:45:02 crc kubenswrapper[4750]: I1008 19:45:02.068920 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-69hsp" Oct 08 19:45:02 crc kubenswrapper[4750]: I1008 19:45:02.144750 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee-logs\") pod \"aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee\" (UID: \"aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee\") " Oct 08 19:45:02 crc kubenswrapper[4750]: I1008 19:45:02.145021 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee-config-data\") pod \"aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee\" (UID: \"aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee\") " Oct 08 19:45:02 crc kubenswrapper[4750]: I1008 19:45:02.145120 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee-combined-ca-bundle\") pod \"aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee\" (UID: \"aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee\") " Oct 08 19:45:02 crc kubenswrapper[4750]: I1008 19:45:02.145579 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee-logs" (OuterVolumeSpecName: "logs") pod "aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee" (UID: "aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:45:02 crc kubenswrapper[4750]: I1008 19:45:02.146139 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee-scripts\") pod \"aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee\" (UID: \"aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee\") " Oct 08 19:45:02 crc kubenswrapper[4750]: I1008 19:45:02.146299 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnjns\" (UniqueName: \"kubernetes.io/projected/aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee-kube-api-access-nnjns\") pod \"aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee\" (UID: \"aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee\") " Oct 08 19:45:02 crc kubenswrapper[4750]: I1008 19:45:02.146855 4750 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee-logs\") on node \"crc\" DevicePath \"\"" Oct 08 19:45:02 crc kubenswrapper[4750]: I1008 19:45:02.153434 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee-kube-api-access-nnjns" (OuterVolumeSpecName: "kube-api-access-nnjns") pod "aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee" (UID: "aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee"). InnerVolumeSpecName "kube-api-access-nnjns". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:45:02 crc kubenswrapper[4750]: I1008 19:45:02.164327 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee-scripts" (OuterVolumeSpecName: "scripts") pod "aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee" (UID: "aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:45:02 crc kubenswrapper[4750]: I1008 19:45:02.177853 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee" (UID: "aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:45:02 crc kubenswrapper[4750]: I1008 19:45:02.179367 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee-config-data" (OuterVolumeSpecName: "config-data") pod "aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee" (UID: "aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:45:02 crc kubenswrapper[4750]: I1008 19:45:02.249283 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnjns\" (UniqueName: \"kubernetes.io/projected/aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee-kube-api-access-nnjns\") on node \"crc\" DevicePath \"\"" Oct 08 19:45:02 crc kubenswrapper[4750]: I1008 19:45:02.249347 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 19:45:02 crc kubenswrapper[4750]: I1008 19:45:02.249362 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 19:45:02 crc kubenswrapper[4750]: I1008 19:45:02.249378 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 
19:45:02 crc kubenswrapper[4750]: I1008 19:45:02.668130 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-69hsp" Oct 08 19:45:02 crc kubenswrapper[4750]: I1008 19:45:02.668126 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-69hsp" event={"ID":"aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee","Type":"ContainerDied","Data":"8229a7a5c8724f2dbd39f7a9b3cfafd706d7356260b97b1f0f6521e6b8de3415"} Oct 08 19:45:02 crc kubenswrapper[4750]: I1008 19:45:02.668420 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8229a7a5c8724f2dbd39f7a9b3cfafd706d7356260b97b1f0f6521e6b8de3415" Oct 08 19:45:02 crc kubenswrapper[4750]: I1008 19:45:02.759815 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5ffd6988b6-bt9qf"] Oct 08 19:45:02 crc kubenswrapper[4750]: E1008 19:45:02.760405 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee" containerName="placement-db-sync" Oct 08 19:45:02 crc kubenswrapper[4750]: I1008 19:45:02.760427 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee" containerName="placement-db-sync" Oct 08 19:45:02 crc kubenswrapper[4750]: I1008 19:45:02.760651 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee" containerName="placement-db-sync" Oct 08 19:45:02 crc kubenswrapper[4750]: I1008 19:45:02.761838 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5ffd6988b6-bt9qf" Oct 08 19:45:02 crc kubenswrapper[4750]: I1008 19:45:02.764123 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-nm5tl" Oct 08 19:45:02 crc kubenswrapper[4750]: I1008 19:45:02.765130 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 08 19:45:02 crc kubenswrapper[4750]: I1008 19:45:02.765207 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 08 19:45:02 crc kubenswrapper[4750]: I1008 19:45:02.773363 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5ffd6988b6-bt9qf"] Oct 08 19:45:02 crc kubenswrapper[4750]: I1008 19:45:02.867905 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9bb90a8-f0b9-48aa-94af-133c5ca6a3da-config-data\") pod \"placement-5ffd6988b6-bt9qf\" (UID: \"e9bb90a8-f0b9-48aa-94af-133c5ca6a3da\") " pod="openstack/placement-5ffd6988b6-bt9qf" Oct 08 19:45:02 crc kubenswrapper[4750]: I1008 19:45:02.868497 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9bb90a8-f0b9-48aa-94af-133c5ca6a3da-scripts\") pod \"placement-5ffd6988b6-bt9qf\" (UID: \"e9bb90a8-f0b9-48aa-94af-133c5ca6a3da\") " pod="openstack/placement-5ffd6988b6-bt9qf" Oct 08 19:45:02 crc kubenswrapper[4750]: I1008 19:45:02.868523 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9bb90a8-f0b9-48aa-94af-133c5ca6a3da-logs\") pod \"placement-5ffd6988b6-bt9qf\" (UID: \"e9bb90a8-f0b9-48aa-94af-133c5ca6a3da\") " pod="openstack/placement-5ffd6988b6-bt9qf" Oct 08 19:45:02 crc kubenswrapper[4750]: I1008 19:45:02.868586 4750 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9bb90a8-f0b9-48aa-94af-133c5ca6a3da-combined-ca-bundle\") pod \"placement-5ffd6988b6-bt9qf\" (UID: \"e9bb90a8-f0b9-48aa-94af-133c5ca6a3da\") " pod="openstack/placement-5ffd6988b6-bt9qf" Oct 08 19:45:02 crc kubenswrapper[4750]: I1008 19:45:02.868654 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pdkx\" (UniqueName: \"kubernetes.io/projected/e9bb90a8-f0b9-48aa-94af-133c5ca6a3da-kube-api-access-5pdkx\") pod \"placement-5ffd6988b6-bt9qf\" (UID: \"e9bb90a8-f0b9-48aa-94af-133c5ca6a3da\") " pod="openstack/placement-5ffd6988b6-bt9qf" Oct 08 19:45:02 crc kubenswrapper[4750]: I1008 19:45:02.970017 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pdkx\" (UniqueName: \"kubernetes.io/projected/e9bb90a8-f0b9-48aa-94af-133c5ca6a3da-kube-api-access-5pdkx\") pod \"placement-5ffd6988b6-bt9qf\" (UID: \"e9bb90a8-f0b9-48aa-94af-133c5ca6a3da\") " pod="openstack/placement-5ffd6988b6-bt9qf" Oct 08 19:45:02 crc kubenswrapper[4750]: I1008 19:45:02.970103 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9bb90a8-f0b9-48aa-94af-133c5ca6a3da-config-data\") pod \"placement-5ffd6988b6-bt9qf\" (UID: \"e9bb90a8-f0b9-48aa-94af-133c5ca6a3da\") " pod="openstack/placement-5ffd6988b6-bt9qf" Oct 08 19:45:02 crc kubenswrapper[4750]: I1008 19:45:02.970168 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9bb90a8-f0b9-48aa-94af-133c5ca6a3da-scripts\") pod \"placement-5ffd6988b6-bt9qf\" (UID: \"e9bb90a8-f0b9-48aa-94af-133c5ca6a3da\") " pod="openstack/placement-5ffd6988b6-bt9qf" Oct 08 19:45:02 crc kubenswrapper[4750]: I1008 19:45:02.970188 4750 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9bb90a8-f0b9-48aa-94af-133c5ca6a3da-logs\") pod \"placement-5ffd6988b6-bt9qf\" (UID: \"e9bb90a8-f0b9-48aa-94af-133c5ca6a3da\") " pod="openstack/placement-5ffd6988b6-bt9qf" Oct 08 19:45:02 crc kubenswrapper[4750]: I1008 19:45:02.970244 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9bb90a8-f0b9-48aa-94af-133c5ca6a3da-combined-ca-bundle\") pod \"placement-5ffd6988b6-bt9qf\" (UID: \"e9bb90a8-f0b9-48aa-94af-133c5ca6a3da\") " pod="openstack/placement-5ffd6988b6-bt9qf" Oct 08 19:45:02 crc kubenswrapper[4750]: I1008 19:45:02.970838 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9bb90a8-f0b9-48aa-94af-133c5ca6a3da-logs\") pod \"placement-5ffd6988b6-bt9qf\" (UID: \"e9bb90a8-f0b9-48aa-94af-133c5ca6a3da\") " pod="openstack/placement-5ffd6988b6-bt9qf" Oct 08 19:45:02 crc kubenswrapper[4750]: I1008 19:45:02.976785 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9bb90a8-f0b9-48aa-94af-133c5ca6a3da-combined-ca-bundle\") pod \"placement-5ffd6988b6-bt9qf\" (UID: \"e9bb90a8-f0b9-48aa-94af-133c5ca6a3da\") " pod="openstack/placement-5ffd6988b6-bt9qf" Oct 08 19:45:02 crc kubenswrapper[4750]: I1008 19:45:02.976899 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9bb90a8-f0b9-48aa-94af-133c5ca6a3da-config-data\") pod \"placement-5ffd6988b6-bt9qf\" (UID: \"e9bb90a8-f0b9-48aa-94af-133c5ca6a3da\") " pod="openstack/placement-5ffd6988b6-bt9qf" Oct 08 19:45:02 crc kubenswrapper[4750]: I1008 19:45:02.983055 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9bb90a8-f0b9-48aa-94af-133c5ca6a3da-scripts\") 
pod \"placement-5ffd6988b6-bt9qf\" (UID: \"e9bb90a8-f0b9-48aa-94af-133c5ca6a3da\") " pod="openstack/placement-5ffd6988b6-bt9qf" Oct 08 19:45:02 crc kubenswrapper[4750]: I1008 19:45:02.988347 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pdkx\" (UniqueName: \"kubernetes.io/projected/e9bb90a8-f0b9-48aa-94af-133c5ca6a3da-kube-api-access-5pdkx\") pod \"placement-5ffd6988b6-bt9qf\" (UID: \"e9bb90a8-f0b9-48aa-94af-133c5ca6a3da\") " pod="openstack/placement-5ffd6988b6-bt9qf" Oct 08 19:45:03 crc kubenswrapper[4750]: I1008 19:45:03.078011 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332545-qrdt2" Oct 08 19:45:03 crc kubenswrapper[4750]: I1008 19:45:03.102803 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5ffd6988b6-bt9qf" Oct 08 19:45:03 crc kubenswrapper[4750]: I1008 19:45:03.174592 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c05aa01-f19d-4035-8bc4-d0f4ff0c9cea-config-volume\") pod \"1c05aa01-f19d-4035-8bc4-d0f4ff0c9cea\" (UID: \"1c05aa01-f19d-4035-8bc4-d0f4ff0c9cea\") " Oct 08 19:45:03 crc kubenswrapper[4750]: I1008 19:45:03.174811 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1c05aa01-f19d-4035-8bc4-d0f4ff0c9cea-secret-volume\") pod \"1c05aa01-f19d-4035-8bc4-d0f4ff0c9cea\" (UID: \"1c05aa01-f19d-4035-8bc4-d0f4ff0c9cea\") " Oct 08 19:45:03 crc kubenswrapper[4750]: I1008 19:45:03.174923 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rh6qj\" (UniqueName: \"kubernetes.io/projected/1c05aa01-f19d-4035-8bc4-d0f4ff0c9cea-kube-api-access-rh6qj\") pod \"1c05aa01-f19d-4035-8bc4-d0f4ff0c9cea\" (UID: \"1c05aa01-f19d-4035-8bc4-d0f4ff0c9cea\") " Oct 08 
19:45:03 crc kubenswrapper[4750]: I1008 19:45:03.177263 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c05aa01-f19d-4035-8bc4-d0f4ff0c9cea-config-volume" (OuterVolumeSpecName: "config-volume") pod "1c05aa01-f19d-4035-8bc4-d0f4ff0c9cea" (UID: "1c05aa01-f19d-4035-8bc4-d0f4ff0c9cea"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:45:03 crc kubenswrapper[4750]: I1008 19:45:03.180750 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c05aa01-f19d-4035-8bc4-d0f4ff0c9cea-kube-api-access-rh6qj" (OuterVolumeSpecName: "kube-api-access-rh6qj") pod "1c05aa01-f19d-4035-8bc4-d0f4ff0c9cea" (UID: "1c05aa01-f19d-4035-8bc4-d0f4ff0c9cea"). InnerVolumeSpecName "kube-api-access-rh6qj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:45:03 crc kubenswrapper[4750]: I1008 19:45:03.182718 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c05aa01-f19d-4035-8bc4-d0f4ff0c9cea-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1c05aa01-f19d-4035-8bc4-d0f4ff0c9cea" (UID: "1c05aa01-f19d-4035-8bc4-d0f4ff0c9cea"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:45:03 crc kubenswrapper[4750]: I1008 19:45:03.277211 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rh6qj\" (UniqueName: \"kubernetes.io/projected/1c05aa01-f19d-4035-8bc4-d0f4ff0c9cea-kube-api-access-rh6qj\") on node \"crc\" DevicePath \"\"" Oct 08 19:45:03 crc kubenswrapper[4750]: I1008 19:45:03.277258 4750 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c05aa01-f19d-4035-8bc4-d0f4ff0c9cea-config-volume\") on node \"crc\" DevicePath \"\"" Oct 08 19:45:03 crc kubenswrapper[4750]: I1008 19:45:03.277268 4750 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1c05aa01-f19d-4035-8bc4-d0f4ff0c9cea-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 08 19:45:03 crc kubenswrapper[4750]: I1008 19:45:03.410711 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5ffd6988b6-bt9qf"] Oct 08 19:45:03 crc kubenswrapper[4750]: W1008 19:45:03.416596 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9bb90a8_f0b9_48aa_94af_133c5ca6a3da.slice/crio-710251c2325db94097bd9d91859ee923cbb11913ab082c2db2cfdaba2367a316 WatchSource:0}: Error finding container 710251c2325db94097bd9d91859ee923cbb11913ab082c2db2cfdaba2367a316: Status 404 returned error can't find the container with id 710251c2325db94097bd9d91859ee923cbb11913ab082c2db2cfdaba2367a316 Oct 08 19:45:03 crc kubenswrapper[4750]: I1008 19:45:03.682371 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332545-qrdt2" Oct 08 19:45:03 crc kubenswrapper[4750]: I1008 19:45:03.682364 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332545-qrdt2" event={"ID":"1c05aa01-f19d-4035-8bc4-d0f4ff0c9cea","Type":"ContainerDied","Data":"9917477b8bd591adbaf4122ccc6b16233332c812f59e22146a27fc8e5f897ab4"} Oct 08 19:45:03 crc kubenswrapper[4750]: I1008 19:45:03.682623 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9917477b8bd591adbaf4122ccc6b16233332c812f59e22146a27fc8e5f897ab4" Oct 08 19:45:03 crc kubenswrapper[4750]: I1008 19:45:03.690090 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5ffd6988b6-bt9qf" event={"ID":"e9bb90a8-f0b9-48aa-94af-133c5ca6a3da","Type":"ContainerStarted","Data":"ae9a26087e28cffb1ac4c355d48a94aa32be137f25bbc83b570906cf275ceb85"} Oct 08 19:45:03 crc kubenswrapper[4750]: I1008 19:45:03.690147 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5ffd6988b6-bt9qf" event={"ID":"e9bb90a8-f0b9-48aa-94af-133c5ca6a3da","Type":"ContainerStarted","Data":"710251c2325db94097bd9d91859ee923cbb11913ab082c2db2cfdaba2367a316"} Oct 08 19:45:04 crc kubenswrapper[4750]: I1008 19:45:04.159158 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332500-qrbpq"] Oct 08 19:45:04 crc kubenswrapper[4750]: I1008 19:45:04.166134 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332500-qrbpq"] Oct 08 19:45:04 crc kubenswrapper[4750]: I1008 19:45:04.704666 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5ffd6988b6-bt9qf" event={"ID":"e9bb90a8-f0b9-48aa-94af-133c5ca6a3da","Type":"ContainerStarted","Data":"ec3d514ff619ebece791863abc6c9a7c2357c017ec08f8d9529bcd42369b8c46"} Oct 08 
19:45:04 crc kubenswrapper[4750]: I1008 19:45:04.705005 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5ffd6988b6-bt9qf" Oct 08 19:45:04 crc kubenswrapper[4750]: I1008 19:45:04.705242 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5ffd6988b6-bt9qf" Oct 08 19:45:04 crc kubenswrapper[4750]: I1008 19:45:04.733038 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5ffd6988b6-bt9qf" podStartSLOduration=2.7330090030000003 podStartE2EDuration="2.733009003s" podCreationTimestamp="2025-10-08 19:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:45:04.729184828 +0000 UTC m=+5660.642155851" watchObservedRunningTime="2025-10-08 19:45:04.733009003 +0000 UTC m=+5660.645980026" Oct 08 19:45:04 crc kubenswrapper[4750]: I1008 19:45:04.749159 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8befc693-d921-4d73-88f0-67fde14ff01a" path="/var/lib/kubelet/pods/8befc693-d921-4d73-88f0-67fde14ff01a/volumes" Oct 08 19:45:07 crc kubenswrapper[4750]: I1008 19:45:07.904038 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6f476f79f5-78nfg" Oct 08 19:45:07 crc kubenswrapper[4750]: I1008 19:45:07.988295 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8646f84cbc-d7cc8"] Oct 08 19:45:07 crc kubenswrapper[4750]: I1008 19:45:07.988780 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8646f84cbc-d7cc8" podUID="78e70311-917b-4069-bf21-c1764c8237e4" containerName="dnsmasq-dns" containerID="cri-o://e209a43efb15ffc079ad99c8e4b6b37aceb9f6699f73d2ccf43cf7b0ac58d2ab" gracePeriod=10 Oct 08 19:45:08 crc kubenswrapper[4750]: I1008 19:45:08.600355 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8646f84cbc-d7cc8" Oct 08 19:45:08 crc kubenswrapper[4750]: I1008 19:45:08.701956 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5l2cd\" (UniqueName: \"kubernetes.io/projected/78e70311-917b-4069-bf21-c1764c8237e4-kube-api-access-5l2cd\") pod \"78e70311-917b-4069-bf21-c1764c8237e4\" (UID: \"78e70311-917b-4069-bf21-c1764c8237e4\") " Oct 08 19:45:08 crc kubenswrapper[4750]: I1008 19:45:08.702098 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/78e70311-917b-4069-bf21-c1764c8237e4-ovsdbserver-nb\") pod \"78e70311-917b-4069-bf21-c1764c8237e4\" (UID: \"78e70311-917b-4069-bf21-c1764c8237e4\") " Oct 08 19:45:08 crc kubenswrapper[4750]: I1008 19:45:08.702171 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/78e70311-917b-4069-bf21-c1764c8237e4-ovsdbserver-sb\") pod \"78e70311-917b-4069-bf21-c1764c8237e4\" (UID: \"78e70311-917b-4069-bf21-c1764c8237e4\") " Oct 08 19:45:08 crc kubenswrapper[4750]: I1008 19:45:08.702210 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78e70311-917b-4069-bf21-c1764c8237e4-config\") pod \"78e70311-917b-4069-bf21-c1764c8237e4\" (UID: \"78e70311-917b-4069-bf21-c1764c8237e4\") " Oct 08 19:45:08 crc kubenswrapper[4750]: I1008 19:45:08.702242 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78e70311-917b-4069-bf21-c1764c8237e4-dns-svc\") pod \"78e70311-917b-4069-bf21-c1764c8237e4\" (UID: \"78e70311-917b-4069-bf21-c1764c8237e4\") " Oct 08 19:45:08 crc kubenswrapper[4750]: I1008 19:45:08.723067 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/78e70311-917b-4069-bf21-c1764c8237e4-kube-api-access-5l2cd" (OuterVolumeSpecName: "kube-api-access-5l2cd") pod "78e70311-917b-4069-bf21-c1764c8237e4" (UID: "78e70311-917b-4069-bf21-c1764c8237e4"). InnerVolumeSpecName "kube-api-access-5l2cd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:45:08 crc kubenswrapper[4750]: I1008 19:45:08.749512 4750 generic.go:334] "Generic (PLEG): container finished" podID="78e70311-917b-4069-bf21-c1764c8237e4" containerID="e209a43efb15ffc079ad99c8e4b6b37aceb9f6699f73d2ccf43cf7b0ac58d2ab" exitCode=0 Oct 08 19:45:08 crc kubenswrapper[4750]: I1008 19:45:08.749635 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8646f84cbc-d7cc8" Oct 08 19:45:08 crc kubenswrapper[4750]: I1008 19:45:08.756959 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78e70311-917b-4069-bf21-c1764c8237e4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "78e70311-917b-4069-bf21-c1764c8237e4" (UID: "78e70311-917b-4069-bf21-c1764c8237e4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:45:08 crc kubenswrapper[4750]: I1008 19:45:08.758756 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78e70311-917b-4069-bf21-c1764c8237e4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "78e70311-917b-4069-bf21-c1764c8237e4" (UID: "78e70311-917b-4069-bf21-c1764c8237e4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:45:08 crc kubenswrapper[4750]: I1008 19:45:08.769168 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78e70311-917b-4069-bf21-c1764c8237e4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "78e70311-917b-4069-bf21-c1764c8237e4" (UID: "78e70311-917b-4069-bf21-c1764c8237e4"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:45:08 crc kubenswrapper[4750]: I1008 19:45:08.771228 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78e70311-917b-4069-bf21-c1764c8237e4-config" (OuterVolumeSpecName: "config") pod "78e70311-917b-4069-bf21-c1764c8237e4" (UID: "78e70311-917b-4069-bf21-c1764c8237e4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:45:08 crc kubenswrapper[4750]: I1008 19:45:08.804285 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5l2cd\" (UniqueName: \"kubernetes.io/projected/78e70311-917b-4069-bf21-c1764c8237e4-kube-api-access-5l2cd\") on node \"crc\" DevicePath \"\"" Oct 08 19:45:08 crc kubenswrapper[4750]: I1008 19:45:08.804323 4750 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/78e70311-917b-4069-bf21-c1764c8237e4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 19:45:08 crc kubenswrapper[4750]: I1008 19:45:08.804335 4750 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/78e70311-917b-4069-bf21-c1764c8237e4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 19:45:08 crc kubenswrapper[4750]: I1008 19:45:08.804347 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78e70311-917b-4069-bf21-c1764c8237e4-config\") on node \"crc\" DevicePath \"\"" Oct 08 19:45:08 crc kubenswrapper[4750]: I1008 19:45:08.804360 4750 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78e70311-917b-4069-bf21-c1764c8237e4-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 19:45:08 crc kubenswrapper[4750]: I1008 19:45:08.804449 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8646f84cbc-d7cc8" 
event={"ID":"78e70311-917b-4069-bf21-c1764c8237e4","Type":"ContainerDied","Data":"e209a43efb15ffc079ad99c8e4b6b37aceb9f6699f73d2ccf43cf7b0ac58d2ab"} Oct 08 19:45:08 crc kubenswrapper[4750]: I1008 19:45:08.804531 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8646f84cbc-d7cc8" event={"ID":"78e70311-917b-4069-bf21-c1764c8237e4","Type":"ContainerDied","Data":"842047b4e8271de33bf19e265dddd68317143b38c1d675751f471b06cf200412"} Oct 08 19:45:08 crc kubenswrapper[4750]: I1008 19:45:08.804671 4750 scope.go:117] "RemoveContainer" containerID="e209a43efb15ffc079ad99c8e4b6b37aceb9f6699f73d2ccf43cf7b0ac58d2ab" Oct 08 19:45:08 crc kubenswrapper[4750]: I1008 19:45:08.829596 4750 scope.go:117] "RemoveContainer" containerID="cfd19482770f8f0ef7978d861d97c7f2255feff73bd4e75a5a58985f12ff5a73" Oct 08 19:45:08 crc kubenswrapper[4750]: I1008 19:45:08.856847 4750 scope.go:117] "RemoveContainer" containerID="e209a43efb15ffc079ad99c8e4b6b37aceb9f6699f73d2ccf43cf7b0ac58d2ab" Oct 08 19:45:08 crc kubenswrapper[4750]: E1008 19:45:08.857586 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e209a43efb15ffc079ad99c8e4b6b37aceb9f6699f73d2ccf43cf7b0ac58d2ab\": container with ID starting with e209a43efb15ffc079ad99c8e4b6b37aceb9f6699f73d2ccf43cf7b0ac58d2ab not found: ID does not exist" containerID="e209a43efb15ffc079ad99c8e4b6b37aceb9f6699f73d2ccf43cf7b0ac58d2ab" Oct 08 19:45:08 crc kubenswrapper[4750]: I1008 19:45:08.857628 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e209a43efb15ffc079ad99c8e4b6b37aceb9f6699f73d2ccf43cf7b0ac58d2ab"} err="failed to get container status \"e209a43efb15ffc079ad99c8e4b6b37aceb9f6699f73d2ccf43cf7b0ac58d2ab\": rpc error: code = NotFound desc = could not find container \"e209a43efb15ffc079ad99c8e4b6b37aceb9f6699f73d2ccf43cf7b0ac58d2ab\": container with ID starting with 
e209a43efb15ffc079ad99c8e4b6b37aceb9f6699f73d2ccf43cf7b0ac58d2ab not found: ID does not exist" Oct 08 19:45:08 crc kubenswrapper[4750]: I1008 19:45:08.857661 4750 scope.go:117] "RemoveContainer" containerID="cfd19482770f8f0ef7978d861d97c7f2255feff73bd4e75a5a58985f12ff5a73" Oct 08 19:45:08 crc kubenswrapper[4750]: E1008 19:45:08.858253 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfd19482770f8f0ef7978d861d97c7f2255feff73bd4e75a5a58985f12ff5a73\": container with ID starting with cfd19482770f8f0ef7978d861d97c7f2255feff73bd4e75a5a58985f12ff5a73 not found: ID does not exist" containerID="cfd19482770f8f0ef7978d861d97c7f2255feff73bd4e75a5a58985f12ff5a73" Oct 08 19:45:08 crc kubenswrapper[4750]: I1008 19:45:08.858323 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfd19482770f8f0ef7978d861d97c7f2255feff73bd4e75a5a58985f12ff5a73"} err="failed to get container status \"cfd19482770f8f0ef7978d861d97c7f2255feff73bd4e75a5a58985f12ff5a73\": rpc error: code = NotFound desc = could not find container \"cfd19482770f8f0ef7978d861d97c7f2255feff73bd4e75a5a58985f12ff5a73\": container with ID starting with cfd19482770f8f0ef7978d861d97c7f2255feff73bd4e75a5a58985f12ff5a73 not found: ID does not exist" Oct 08 19:45:09 crc kubenswrapper[4750]: I1008 19:45:09.084364 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8646f84cbc-d7cc8"] Oct 08 19:45:09 crc kubenswrapper[4750]: I1008 19:45:09.096018 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8646f84cbc-d7cc8"] Oct 08 19:45:10 crc kubenswrapper[4750]: I1008 19:45:10.753033 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78e70311-917b-4069-bf21-c1764c8237e4" path="/var/lib/kubelet/pods/78e70311-917b-4069-bf21-c1764c8237e4/volumes" Oct 08 19:45:34 crc kubenswrapper[4750]: I1008 19:45:34.200609 4750 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5ffd6988b6-bt9qf" Oct 08 19:45:34 crc kubenswrapper[4750]: I1008 19:45:34.202761 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5ffd6988b6-bt9qf" Oct 08 19:45:56 crc kubenswrapper[4750]: I1008 19:45:56.482405 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7kjsn"] Oct 08 19:45:56 crc kubenswrapper[4750]: E1008 19:45:56.483806 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78e70311-917b-4069-bf21-c1764c8237e4" containerName="init" Oct 08 19:45:56 crc kubenswrapper[4750]: I1008 19:45:56.483825 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="78e70311-917b-4069-bf21-c1764c8237e4" containerName="init" Oct 08 19:45:56 crc kubenswrapper[4750]: E1008 19:45:56.483857 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78e70311-917b-4069-bf21-c1764c8237e4" containerName="dnsmasq-dns" Oct 08 19:45:56 crc kubenswrapper[4750]: I1008 19:45:56.483866 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="78e70311-917b-4069-bf21-c1764c8237e4" containerName="dnsmasq-dns" Oct 08 19:45:56 crc kubenswrapper[4750]: E1008 19:45:56.483936 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c05aa01-f19d-4035-8bc4-d0f4ff0c9cea" containerName="collect-profiles" Oct 08 19:45:56 crc kubenswrapper[4750]: I1008 19:45:56.483947 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c05aa01-f19d-4035-8bc4-d0f4ff0c9cea" containerName="collect-profiles" Oct 08 19:45:56 crc kubenswrapper[4750]: I1008 19:45:56.484175 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="78e70311-917b-4069-bf21-c1764c8237e4" containerName="dnsmasq-dns" Oct 08 19:45:56 crc kubenswrapper[4750]: I1008 19:45:56.484203 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c05aa01-f19d-4035-8bc4-d0f4ff0c9cea" 
containerName="collect-profiles" Oct 08 19:45:56 crc kubenswrapper[4750]: I1008 19:45:56.485876 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7kjsn" Oct 08 19:45:56 crc kubenswrapper[4750]: I1008 19:45:56.492038 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7kjsn"] Oct 08 19:45:56 crc kubenswrapper[4750]: I1008 19:45:56.670258 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q46cs\" (UniqueName: \"kubernetes.io/projected/1059cf31-4768-4ae6-8f03-d9c1fa125717-kube-api-access-q46cs\") pod \"redhat-marketplace-7kjsn\" (UID: \"1059cf31-4768-4ae6-8f03-d9c1fa125717\") " pod="openshift-marketplace/redhat-marketplace-7kjsn" Oct 08 19:45:56 crc kubenswrapper[4750]: I1008 19:45:56.670331 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1059cf31-4768-4ae6-8f03-d9c1fa125717-utilities\") pod \"redhat-marketplace-7kjsn\" (UID: \"1059cf31-4768-4ae6-8f03-d9c1fa125717\") " pod="openshift-marketplace/redhat-marketplace-7kjsn" Oct 08 19:45:56 crc kubenswrapper[4750]: I1008 19:45:56.670384 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1059cf31-4768-4ae6-8f03-d9c1fa125717-catalog-content\") pod \"redhat-marketplace-7kjsn\" (UID: \"1059cf31-4768-4ae6-8f03-d9c1fa125717\") " pod="openshift-marketplace/redhat-marketplace-7kjsn" Oct 08 19:45:56 crc kubenswrapper[4750]: I1008 19:45:56.772151 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q46cs\" (UniqueName: \"kubernetes.io/projected/1059cf31-4768-4ae6-8f03-d9c1fa125717-kube-api-access-q46cs\") pod \"redhat-marketplace-7kjsn\" (UID: \"1059cf31-4768-4ae6-8f03-d9c1fa125717\") " 
pod="openshift-marketplace/redhat-marketplace-7kjsn" Oct 08 19:45:56 crc kubenswrapper[4750]: I1008 19:45:56.772229 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1059cf31-4768-4ae6-8f03-d9c1fa125717-utilities\") pod \"redhat-marketplace-7kjsn\" (UID: \"1059cf31-4768-4ae6-8f03-d9c1fa125717\") " pod="openshift-marketplace/redhat-marketplace-7kjsn" Oct 08 19:45:56 crc kubenswrapper[4750]: I1008 19:45:56.772270 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1059cf31-4768-4ae6-8f03-d9c1fa125717-catalog-content\") pod \"redhat-marketplace-7kjsn\" (UID: \"1059cf31-4768-4ae6-8f03-d9c1fa125717\") " pod="openshift-marketplace/redhat-marketplace-7kjsn" Oct 08 19:45:56 crc kubenswrapper[4750]: I1008 19:45:56.772985 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1059cf31-4768-4ae6-8f03-d9c1fa125717-utilities\") pod \"redhat-marketplace-7kjsn\" (UID: \"1059cf31-4768-4ae6-8f03-d9c1fa125717\") " pod="openshift-marketplace/redhat-marketplace-7kjsn" Oct 08 19:45:56 crc kubenswrapper[4750]: I1008 19:45:56.773000 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1059cf31-4768-4ae6-8f03-d9c1fa125717-catalog-content\") pod \"redhat-marketplace-7kjsn\" (UID: \"1059cf31-4768-4ae6-8f03-d9c1fa125717\") " pod="openshift-marketplace/redhat-marketplace-7kjsn" Oct 08 19:45:56 crc kubenswrapper[4750]: I1008 19:45:56.796475 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q46cs\" (UniqueName: \"kubernetes.io/projected/1059cf31-4768-4ae6-8f03-d9c1fa125717-kube-api-access-q46cs\") pod \"redhat-marketplace-7kjsn\" (UID: \"1059cf31-4768-4ae6-8f03-d9c1fa125717\") " pod="openshift-marketplace/redhat-marketplace-7kjsn" Oct 08 
19:45:56 crc kubenswrapper[4750]: I1008 19:45:56.823472 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7kjsn" Oct 08 19:45:57 crc kubenswrapper[4750]: I1008 19:45:57.344196 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7kjsn"] Oct 08 19:45:57 crc kubenswrapper[4750]: I1008 19:45:57.376254 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7kjsn" event={"ID":"1059cf31-4768-4ae6-8f03-d9c1fa125717","Type":"ContainerStarted","Data":"63e3d742289e7e4643f7a036f909fb407a5b0f365acb4222103f582e801dcab8"} Oct 08 19:45:58 crc kubenswrapper[4750]: I1008 19:45:58.392820 4750 generic.go:334] "Generic (PLEG): container finished" podID="1059cf31-4768-4ae6-8f03-d9c1fa125717" containerID="69274abc32b44ce5be2ac485909b911f2e6578bbfa3cc973c037f8dcbe74087a" exitCode=0 Oct 08 19:45:58 crc kubenswrapper[4750]: I1008 19:45:58.393009 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7kjsn" event={"ID":"1059cf31-4768-4ae6-8f03-d9c1fa125717","Type":"ContainerDied","Data":"69274abc32b44ce5be2ac485909b911f2e6578bbfa3cc973c037f8dcbe74087a"} Oct 08 19:45:59 crc kubenswrapper[4750]: I1008 19:45:59.008639 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-9flkn"] Oct 08 19:45:59 crc kubenswrapper[4750]: I1008 19:45:59.009951 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-9flkn" Oct 08 19:45:59 crc kubenswrapper[4750]: I1008 19:45:59.025936 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-9flkn"] Oct 08 19:45:59 crc kubenswrapper[4750]: I1008 19:45:59.115713 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-gxx8b"] Oct 08 19:45:59 crc kubenswrapper[4750]: I1008 19:45:59.117348 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-gxx8b" Oct 08 19:45:59 crc kubenswrapper[4750]: I1008 19:45:59.123221 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdxvs\" (UniqueName: \"kubernetes.io/projected/7a5bf2cb-af64-48d8-bad7-1f32371f7f55-kube-api-access-gdxvs\") pod \"nova-api-db-create-9flkn\" (UID: \"7a5bf2cb-af64-48d8-bad7-1f32371f7f55\") " pod="openstack/nova-api-db-create-9flkn" Oct 08 19:45:59 crc kubenswrapper[4750]: I1008 19:45:59.136308 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-gxx8b"] Oct 08 19:45:59 crc kubenswrapper[4750]: I1008 19:45:59.225649 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfg87\" (UniqueName: \"kubernetes.io/projected/13f76397-084b-4a8b-ba05-8e8f819fcc7f-kube-api-access-rfg87\") pod \"nova-cell0-db-create-gxx8b\" (UID: \"13f76397-084b-4a8b-ba05-8e8f819fcc7f\") " pod="openstack/nova-cell0-db-create-gxx8b" Oct 08 19:45:59 crc kubenswrapper[4750]: I1008 19:45:59.226177 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdxvs\" (UniqueName: \"kubernetes.io/projected/7a5bf2cb-af64-48d8-bad7-1f32371f7f55-kube-api-access-gdxvs\") pod \"nova-api-db-create-9flkn\" (UID: \"7a5bf2cb-af64-48d8-bad7-1f32371f7f55\") " pod="openstack/nova-api-db-create-9flkn" Oct 08 19:45:59 crc kubenswrapper[4750]: 
I1008 19:45:59.252249 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdxvs\" (UniqueName: \"kubernetes.io/projected/7a5bf2cb-af64-48d8-bad7-1f32371f7f55-kube-api-access-gdxvs\") pod \"nova-api-db-create-9flkn\" (UID: \"7a5bf2cb-af64-48d8-bad7-1f32371f7f55\") " pod="openstack/nova-api-db-create-9flkn" Oct 08 19:45:59 crc kubenswrapper[4750]: I1008 19:45:59.330887 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfg87\" (UniqueName: \"kubernetes.io/projected/13f76397-084b-4a8b-ba05-8e8f819fcc7f-kube-api-access-rfg87\") pod \"nova-cell0-db-create-gxx8b\" (UID: \"13f76397-084b-4a8b-ba05-8e8f819fcc7f\") " pod="openstack/nova-cell0-db-create-gxx8b" Oct 08 19:45:59 crc kubenswrapper[4750]: I1008 19:45:59.338716 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-9flkn" Oct 08 19:45:59 crc kubenswrapper[4750]: I1008 19:45:59.347097 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-sk9r7"] Oct 08 19:45:59 crc kubenswrapper[4750]: I1008 19:45:59.349211 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-sk9r7" Oct 08 19:45:59 crc kubenswrapper[4750]: I1008 19:45:59.365685 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-sk9r7"] Oct 08 19:45:59 crc kubenswrapper[4750]: I1008 19:45:59.372297 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfg87\" (UniqueName: \"kubernetes.io/projected/13f76397-084b-4a8b-ba05-8e8f819fcc7f-kube-api-access-rfg87\") pod \"nova-cell0-db-create-gxx8b\" (UID: \"13f76397-084b-4a8b-ba05-8e8f819fcc7f\") " pod="openstack/nova-cell0-db-create-gxx8b" Oct 08 19:45:59 crc kubenswrapper[4750]: I1008 19:45:59.434413 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9778b\" (UniqueName: \"kubernetes.io/projected/d8288f7a-ec5f-40eb-bff9-25d97137f1b1-kube-api-access-9778b\") pod \"nova-cell1-db-create-sk9r7\" (UID: \"d8288f7a-ec5f-40eb-bff9-25d97137f1b1\") " pod="openstack/nova-cell1-db-create-sk9r7" Oct 08 19:45:59 crc kubenswrapper[4750]: I1008 19:45:59.443839 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-gxx8b" Oct 08 19:45:59 crc kubenswrapper[4750]: I1008 19:45:59.536424 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9778b\" (UniqueName: \"kubernetes.io/projected/d8288f7a-ec5f-40eb-bff9-25d97137f1b1-kube-api-access-9778b\") pod \"nova-cell1-db-create-sk9r7\" (UID: \"d8288f7a-ec5f-40eb-bff9-25d97137f1b1\") " pod="openstack/nova-cell1-db-create-sk9r7" Oct 08 19:45:59 crc kubenswrapper[4750]: I1008 19:45:59.559407 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9778b\" (UniqueName: \"kubernetes.io/projected/d8288f7a-ec5f-40eb-bff9-25d97137f1b1-kube-api-access-9778b\") pod \"nova-cell1-db-create-sk9r7\" (UID: \"d8288f7a-ec5f-40eb-bff9-25d97137f1b1\") " pod="openstack/nova-cell1-db-create-sk9r7" Oct 08 19:45:59 crc kubenswrapper[4750]: I1008 19:45:59.833878 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-sk9r7" Oct 08 19:45:59 crc kubenswrapper[4750]: I1008 19:45:59.903685 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-9flkn"] Oct 08 19:45:59 crc kubenswrapper[4750]: W1008 19:45:59.918236 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a5bf2cb_af64_48d8_bad7_1f32371f7f55.slice/crio-7c5625a6de33583cd7d78a22e88e08c4674ea175b2897284bf4db9c4439c54ab WatchSource:0}: Error finding container 7c5625a6de33583cd7d78a22e88e08c4674ea175b2897284bf4db9c4439c54ab: Status 404 returned error can't find the container with id 7c5625a6de33583cd7d78a22e88e08c4674ea175b2897284bf4db9c4439c54ab Oct 08 19:45:59 crc kubenswrapper[4750]: I1008 19:45:59.974088 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-gxx8b"] Oct 08 19:46:00 crc kubenswrapper[4750]: I1008 19:46:00.103633 4750 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/nova-cell1-db-create-sk9r7"] Oct 08 19:46:00 crc kubenswrapper[4750]: W1008 19:46:00.114619 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8288f7a_ec5f_40eb_bff9_25d97137f1b1.slice/crio-c3a80034080c27f3ef65278703ea56b896e99f78e9d866cf3c291125939d7150 WatchSource:0}: Error finding container c3a80034080c27f3ef65278703ea56b896e99f78e9d866cf3c291125939d7150: Status 404 returned error can't find the container with id c3a80034080c27f3ef65278703ea56b896e99f78e9d866cf3c291125939d7150 Oct 08 19:46:00 crc kubenswrapper[4750]: I1008 19:46:00.425325 4750 generic.go:334] "Generic (PLEG): container finished" podID="1059cf31-4768-4ae6-8f03-d9c1fa125717" containerID="3b6b48d84723c710ff9a09b4c4ff301828edb97b4f9a9bc542fbd7fbc671a15d" exitCode=0 Oct 08 19:46:00 crc kubenswrapper[4750]: I1008 19:46:00.425415 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7kjsn" event={"ID":"1059cf31-4768-4ae6-8f03-d9c1fa125717","Type":"ContainerDied","Data":"3b6b48d84723c710ff9a09b4c4ff301828edb97b4f9a9bc542fbd7fbc671a15d"} Oct 08 19:46:00 crc kubenswrapper[4750]: I1008 19:46:00.429259 4750 generic.go:334] "Generic (PLEG): container finished" podID="13f76397-084b-4a8b-ba05-8e8f819fcc7f" containerID="d54612685bf264478b11d6f2878fbde935d91a8b0ce838303bce27b8c6261a15" exitCode=0 Oct 08 19:46:00 crc kubenswrapper[4750]: I1008 19:46:00.429465 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-gxx8b" event={"ID":"13f76397-084b-4a8b-ba05-8e8f819fcc7f","Type":"ContainerDied","Data":"d54612685bf264478b11d6f2878fbde935d91a8b0ce838303bce27b8c6261a15"} Oct 08 19:46:00 crc kubenswrapper[4750]: I1008 19:46:00.429498 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-gxx8b" 
event={"ID":"13f76397-084b-4a8b-ba05-8e8f819fcc7f","Type":"ContainerStarted","Data":"de5c44d95ec3da61ce799b84065ed632676508733424236fbc69f43ff0cae0f3"} Oct 08 19:46:00 crc kubenswrapper[4750]: I1008 19:46:00.433091 4750 generic.go:334] "Generic (PLEG): container finished" podID="d8288f7a-ec5f-40eb-bff9-25d97137f1b1" containerID="d366321564a926e3aad97d53b52ac89757b5bae7dedda73a052135c8db69116c" exitCode=0 Oct 08 19:46:00 crc kubenswrapper[4750]: I1008 19:46:00.433175 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-sk9r7" event={"ID":"d8288f7a-ec5f-40eb-bff9-25d97137f1b1","Type":"ContainerDied","Data":"d366321564a926e3aad97d53b52ac89757b5bae7dedda73a052135c8db69116c"} Oct 08 19:46:00 crc kubenswrapper[4750]: I1008 19:46:00.433200 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-sk9r7" event={"ID":"d8288f7a-ec5f-40eb-bff9-25d97137f1b1","Type":"ContainerStarted","Data":"c3a80034080c27f3ef65278703ea56b896e99f78e9d866cf3c291125939d7150"} Oct 08 19:46:00 crc kubenswrapper[4750]: I1008 19:46:00.436438 4750 generic.go:334] "Generic (PLEG): container finished" podID="7a5bf2cb-af64-48d8-bad7-1f32371f7f55" containerID="7cc9359ec838553026bf6121708fa2ce6467821a82ed8550c0f6d28d4bd0ae14" exitCode=0 Oct 08 19:46:00 crc kubenswrapper[4750]: I1008 19:46:00.436524 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9flkn" event={"ID":"7a5bf2cb-af64-48d8-bad7-1f32371f7f55","Type":"ContainerDied","Data":"7cc9359ec838553026bf6121708fa2ce6467821a82ed8550c0f6d28d4bd0ae14"} Oct 08 19:46:00 crc kubenswrapper[4750]: I1008 19:46:00.436584 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9flkn" event={"ID":"7a5bf2cb-af64-48d8-bad7-1f32371f7f55","Type":"ContainerStarted","Data":"7c5625a6de33583cd7d78a22e88e08c4674ea175b2897284bf4db9c4439c54ab"} Oct 08 19:46:01 crc kubenswrapper[4750]: I1008 19:46:01.453295 4750 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7kjsn" event={"ID":"1059cf31-4768-4ae6-8f03-d9c1fa125717","Type":"ContainerStarted","Data":"d7aab40c9852758c89cfee20502eaa47218aec77471cc32a9636372ea0c13094"} Oct 08 19:46:01 crc kubenswrapper[4750]: I1008 19:46:01.484264 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7kjsn" podStartSLOduration=3.039181495 podStartE2EDuration="5.484230469s" podCreationTimestamp="2025-10-08 19:45:56 +0000 UTC" firstStartedPulling="2025-10-08 19:45:58.395030472 +0000 UTC m=+5714.308001475" lastFinishedPulling="2025-10-08 19:46:00.840079436 +0000 UTC m=+5716.753050449" observedRunningTime="2025-10-08 19:46:01.480908577 +0000 UTC m=+5717.393879620" watchObservedRunningTime="2025-10-08 19:46:01.484230469 +0000 UTC m=+5717.397201522" Oct 08 19:46:01 crc kubenswrapper[4750]: I1008 19:46:01.922371 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-gxx8b" Oct 08 19:46:01 crc kubenswrapper[4750]: I1008 19:46:01.929200 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-9flkn" Oct 08 19:46:01 crc kubenswrapper[4750]: I1008 19:46:01.935990 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-sk9r7" Oct 08 19:46:02 crc kubenswrapper[4750]: I1008 19:46:02.009161 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdxvs\" (UniqueName: \"kubernetes.io/projected/7a5bf2cb-af64-48d8-bad7-1f32371f7f55-kube-api-access-gdxvs\") pod \"7a5bf2cb-af64-48d8-bad7-1f32371f7f55\" (UID: \"7a5bf2cb-af64-48d8-bad7-1f32371f7f55\") " Oct 08 19:46:02 crc kubenswrapper[4750]: I1008 19:46:02.009571 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfg87\" (UniqueName: \"kubernetes.io/projected/13f76397-084b-4a8b-ba05-8e8f819fcc7f-kube-api-access-rfg87\") pod \"13f76397-084b-4a8b-ba05-8e8f819fcc7f\" (UID: \"13f76397-084b-4a8b-ba05-8e8f819fcc7f\") " Oct 08 19:46:02 crc kubenswrapper[4750]: I1008 19:46:02.009611 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9778b\" (UniqueName: \"kubernetes.io/projected/d8288f7a-ec5f-40eb-bff9-25d97137f1b1-kube-api-access-9778b\") pod \"d8288f7a-ec5f-40eb-bff9-25d97137f1b1\" (UID: \"d8288f7a-ec5f-40eb-bff9-25d97137f1b1\") " Oct 08 19:46:02 crc kubenswrapper[4750]: I1008 19:46:02.017769 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a5bf2cb-af64-48d8-bad7-1f32371f7f55-kube-api-access-gdxvs" (OuterVolumeSpecName: "kube-api-access-gdxvs") pod "7a5bf2cb-af64-48d8-bad7-1f32371f7f55" (UID: "7a5bf2cb-af64-48d8-bad7-1f32371f7f55"). InnerVolumeSpecName "kube-api-access-gdxvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:46:02 crc kubenswrapper[4750]: I1008 19:46:02.017887 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8288f7a-ec5f-40eb-bff9-25d97137f1b1-kube-api-access-9778b" (OuterVolumeSpecName: "kube-api-access-9778b") pod "d8288f7a-ec5f-40eb-bff9-25d97137f1b1" (UID: "d8288f7a-ec5f-40eb-bff9-25d97137f1b1"). 
InnerVolumeSpecName "kube-api-access-9778b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:46:02 crc kubenswrapper[4750]: I1008 19:46:02.017961 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13f76397-084b-4a8b-ba05-8e8f819fcc7f-kube-api-access-rfg87" (OuterVolumeSpecName: "kube-api-access-rfg87") pod "13f76397-084b-4a8b-ba05-8e8f819fcc7f" (UID: "13f76397-084b-4a8b-ba05-8e8f819fcc7f"). InnerVolumeSpecName "kube-api-access-rfg87". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:46:02 crc kubenswrapper[4750]: I1008 19:46:02.111515 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfg87\" (UniqueName: \"kubernetes.io/projected/13f76397-084b-4a8b-ba05-8e8f819fcc7f-kube-api-access-rfg87\") on node \"crc\" DevicePath \"\"" Oct 08 19:46:02 crc kubenswrapper[4750]: I1008 19:46:02.111946 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9778b\" (UniqueName: \"kubernetes.io/projected/d8288f7a-ec5f-40eb-bff9-25d97137f1b1-kube-api-access-9778b\") on node \"crc\" DevicePath \"\"" Oct 08 19:46:02 crc kubenswrapper[4750]: I1008 19:46:02.111966 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdxvs\" (UniqueName: \"kubernetes.io/projected/7a5bf2cb-af64-48d8-bad7-1f32371f7f55-kube-api-access-gdxvs\") on node \"crc\" DevicePath \"\"" Oct 08 19:46:02 crc kubenswrapper[4750]: I1008 19:46:02.467262 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-sk9r7" event={"ID":"d8288f7a-ec5f-40eb-bff9-25d97137f1b1","Type":"ContainerDied","Data":"c3a80034080c27f3ef65278703ea56b896e99f78e9d866cf3c291125939d7150"} Oct 08 19:46:02 crc kubenswrapper[4750]: I1008 19:46:02.467327 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3a80034080c27f3ef65278703ea56b896e99f78e9d866cf3c291125939d7150" Oct 08 19:46:02 crc kubenswrapper[4750]: I1008 
19:46:02.467298 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-sk9r7" Oct 08 19:46:02 crc kubenswrapper[4750]: I1008 19:46:02.470792 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9flkn" event={"ID":"7a5bf2cb-af64-48d8-bad7-1f32371f7f55","Type":"ContainerDied","Data":"7c5625a6de33583cd7d78a22e88e08c4674ea175b2897284bf4db9c4439c54ab"} Oct 08 19:46:02 crc kubenswrapper[4750]: I1008 19:46:02.470816 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-9flkn" Oct 08 19:46:02 crc kubenswrapper[4750]: I1008 19:46:02.470841 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c5625a6de33583cd7d78a22e88e08c4674ea175b2897284bf4db9c4439c54ab" Oct 08 19:46:02 crc kubenswrapper[4750]: I1008 19:46:02.473343 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-gxx8b" event={"ID":"13f76397-084b-4a8b-ba05-8e8f819fcc7f","Type":"ContainerDied","Data":"de5c44d95ec3da61ce799b84065ed632676508733424236fbc69f43ff0cae0f3"} Oct 08 19:46:02 crc kubenswrapper[4750]: I1008 19:46:02.473658 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de5c44d95ec3da61ce799b84065ed632676508733424236fbc69f43ff0cae0f3" Oct 08 19:46:02 crc kubenswrapper[4750]: I1008 19:46:02.473391 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-gxx8b" Oct 08 19:46:03 crc kubenswrapper[4750]: I1008 19:46:03.876162 4750 scope.go:117] "RemoveContainer" containerID="30072bff486fa535b27f17325f43675ac18fd6c5ae3424ff79cec63176f583d0" Oct 08 19:46:03 crc kubenswrapper[4750]: I1008 19:46:03.918637 4750 scope.go:117] "RemoveContainer" containerID="53fc4a2a8d58658dfdfbc5b400f01bbbe6f45ba1c383a52dc5a6b732549c1561" Oct 08 19:46:06 crc kubenswrapper[4750]: I1008 19:46:06.824860 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7kjsn" Oct 08 19:46:06 crc kubenswrapper[4750]: I1008 19:46:06.825400 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7kjsn" Oct 08 19:46:06 crc kubenswrapper[4750]: I1008 19:46:06.902288 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7kjsn" Oct 08 19:46:07 crc kubenswrapper[4750]: I1008 19:46:07.616126 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7kjsn" Oct 08 19:46:07 crc kubenswrapper[4750]: I1008 19:46:07.679842 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7kjsn"] Oct 08 19:46:09 crc kubenswrapper[4750]: I1008 19:46:09.283918 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-d413-account-create-4crz5"] Oct 08 19:46:09 crc kubenswrapper[4750]: E1008 19:46:09.285104 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13f76397-084b-4a8b-ba05-8e8f819fcc7f" containerName="mariadb-database-create" Oct 08 19:46:09 crc kubenswrapper[4750]: I1008 19:46:09.285126 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="13f76397-084b-4a8b-ba05-8e8f819fcc7f" containerName="mariadb-database-create" Oct 08 19:46:09 crc kubenswrapper[4750]: E1008 19:46:09.285150 
4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8288f7a-ec5f-40eb-bff9-25d97137f1b1" containerName="mariadb-database-create" Oct 08 19:46:09 crc kubenswrapper[4750]: I1008 19:46:09.285159 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8288f7a-ec5f-40eb-bff9-25d97137f1b1" containerName="mariadb-database-create" Oct 08 19:46:09 crc kubenswrapper[4750]: E1008 19:46:09.285197 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a5bf2cb-af64-48d8-bad7-1f32371f7f55" containerName="mariadb-database-create" Oct 08 19:46:09 crc kubenswrapper[4750]: I1008 19:46:09.285208 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a5bf2cb-af64-48d8-bad7-1f32371f7f55" containerName="mariadb-database-create" Oct 08 19:46:09 crc kubenswrapper[4750]: I1008 19:46:09.285481 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8288f7a-ec5f-40eb-bff9-25d97137f1b1" containerName="mariadb-database-create" Oct 08 19:46:09 crc kubenswrapper[4750]: I1008 19:46:09.285500 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="13f76397-084b-4a8b-ba05-8e8f819fcc7f" containerName="mariadb-database-create" Oct 08 19:46:09 crc kubenswrapper[4750]: I1008 19:46:09.285522 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a5bf2cb-af64-48d8-bad7-1f32371f7f55" containerName="mariadb-database-create" Oct 08 19:46:09 crc kubenswrapper[4750]: I1008 19:46:09.286393 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-d413-account-create-4crz5" Oct 08 19:46:09 crc kubenswrapper[4750]: I1008 19:46:09.288829 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 08 19:46:09 crc kubenswrapper[4750]: I1008 19:46:09.304702 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-d413-account-create-4crz5"] Oct 08 19:46:09 crc kubenswrapper[4750]: I1008 19:46:09.394142 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5n8c\" (UniqueName: \"kubernetes.io/projected/8e7843c8-fa26-46fb-a193-ee70486938ce-kube-api-access-d5n8c\") pod \"nova-api-d413-account-create-4crz5\" (UID: \"8e7843c8-fa26-46fb-a193-ee70486938ce\") " pod="openstack/nova-api-d413-account-create-4crz5" Oct 08 19:46:09 crc kubenswrapper[4750]: I1008 19:46:09.482715 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-4e16-account-create-t7vrg"] Oct 08 19:46:09 crc kubenswrapper[4750]: I1008 19:46:09.484153 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-4e16-account-create-t7vrg" Oct 08 19:46:09 crc kubenswrapper[4750]: I1008 19:46:09.486777 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 08 19:46:09 crc kubenswrapper[4750]: I1008 19:46:09.490952 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-4e16-account-create-t7vrg"] Oct 08 19:46:09 crc kubenswrapper[4750]: I1008 19:46:09.496474 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5n8c\" (UniqueName: \"kubernetes.io/projected/8e7843c8-fa26-46fb-a193-ee70486938ce-kube-api-access-d5n8c\") pod \"nova-api-d413-account-create-4crz5\" (UID: \"8e7843c8-fa26-46fb-a193-ee70486938ce\") " pod="openstack/nova-api-d413-account-create-4crz5" Oct 08 19:46:09 crc kubenswrapper[4750]: I1008 19:46:09.520634 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5n8c\" (UniqueName: \"kubernetes.io/projected/8e7843c8-fa26-46fb-a193-ee70486938ce-kube-api-access-d5n8c\") pod \"nova-api-d413-account-create-4crz5\" (UID: \"8e7843c8-fa26-46fb-a193-ee70486938ce\") " pod="openstack/nova-api-d413-account-create-4crz5" Oct 08 19:46:09 crc kubenswrapper[4750]: I1008 19:46:09.568652 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7kjsn" podUID="1059cf31-4768-4ae6-8f03-d9c1fa125717" containerName="registry-server" containerID="cri-o://d7aab40c9852758c89cfee20502eaa47218aec77471cc32a9636372ea0c13094" gracePeriod=2 Oct 08 19:46:09 crc kubenswrapper[4750]: I1008 19:46:09.598275 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh4nl\" (UniqueName: \"kubernetes.io/projected/36644d4b-9c0f-4d40-bea3-eab9bab01579-kube-api-access-xh4nl\") pod \"nova-cell0-4e16-account-create-t7vrg\" (UID: \"36644d4b-9c0f-4d40-bea3-eab9bab01579\") " 
pod="openstack/nova-cell0-4e16-account-create-t7vrg" Oct 08 19:46:09 crc kubenswrapper[4750]: I1008 19:46:09.613280 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-d413-account-create-4crz5" Oct 08 19:46:09 crc kubenswrapper[4750]: I1008 19:46:09.676620 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-9f8d-account-create-vlx88"] Oct 08 19:46:09 crc kubenswrapper[4750]: I1008 19:46:09.678272 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9f8d-account-create-vlx88" Oct 08 19:46:09 crc kubenswrapper[4750]: I1008 19:46:09.680959 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 08 19:46:09 crc kubenswrapper[4750]: I1008 19:46:09.689447 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-9f8d-account-create-vlx88"] Oct 08 19:46:09 crc kubenswrapper[4750]: I1008 19:46:09.701423 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xh4nl\" (UniqueName: \"kubernetes.io/projected/36644d4b-9c0f-4d40-bea3-eab9bab01579-kube-api-access-xh4nl\") pod \"nova-cell0-4e16-account-create-t7vrg\" (UID: \"36644d4b-9c0f-4d40-bea3-eab9bab01579\") " pod="openstack/nova-cell0-4e16-account-create-t7vrg" Oct 08 19:46:09 crc kubenswrapper[4750]: I1008 19:46:09.724492 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xh4nl\" (UniqueName: \"kubernetes.io/projected/36644d4b-9c0f-4d40-bea3-eab9bab01579-kube-api-access-xh4nl\") pod \"nova-cell0-4e16-account-create-t7vrg\" (UID: \"36644d4b-9c0f-4d40-bea3-eab9bab01579\") " pod="openstack/nova-cell0-4e16-account-create-t7vrg" Oct 08 19:46:09 crc kubenswrapper[4750]: I1008 19:46:09.803916 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r68b\" (UniqueName: 
\"kubernetes.io/projected/82614568-a654-433f-8008-026d4e9b1951-kube-api-access-9r68b\") pod \"nova-cell1-9f8d-account-create-vlx88\" (UID: \"82614568-a654-433f-8008-026d4e9b1951\") " pod="openstack/nova-cell1-9f8d-account-create-vlx88" Oct 08 19:46:09 crc kubenswrapper[4750]: I1008 19:46:09.810871 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4e16-account-create-t7vrg" Oct 08 19:46:09 crc kubenswrapper[4750]: I1008 19:46:09.907456 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9r68b\" (UniqueName: \"kubernetes.io/projected/82614568-a654-433f-8008-026d4e9b1951-kube-api-access-9r68b\") pod \"nova-cell1-9f8d-account-create-vlx88\" (UID: \"82614568-a654-433f-8008-026d4e9b1951\") " pod="openstack/nova-cell1-9f8d-account-create-vlx88" Oct 08 19:46:09 crc kubenswrapper[4750]: I1008 19:46:09.928038 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r68b\" (UniqueName: \"kubernetes.io/projected/82614568-a654-433f-8008-026d4e9b1951-kube-api-access-9r68b\") pod \"nova-cell1-9f8d-account-create-vlx88\" (UID: \"82614568-a654-433f-8008-026d4e9b1951\") " pod="openstack/nova-cell1-9f8d-account-create-vlx88" Oct 08 19:46:10 crc kubenswrapper[4750]: I1008 19:46:10.030862 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7kjsn" Oct 08 19:46:10 crc kubenswrapper[4750]: I1008 19:46:10.119080 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-9f8d-account-create-vlx88" Oct 08 19:46:10 crc kubenswrapper[4750]: I1008 19:46:10.175179 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-d413-account-create-4crz5"] Oct 08 19:46:10 crc kubenswrapper[4750]: I1008 19:46:10.216028 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1059cf31-4768-4ae6-8f03-d9c1fa125717-catalog-content\") pod \"1059cf31-4768-4ae6-8f03-d9c1fa125717\" (UID: \"1059cf31-4768-4ae6-8f03-d9c1fa125717\") " Oct 08 19:46:10 crc kubenswrapper[4750]: I1008 19:46:10.216194 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1059cf31-4768-4ae6-8f03-d9c1fa125717-utilities\") pod \"1059cf31-4768-4ae6-8f03-d9c1fa125717\" (UID: \"1059cf31-4768-4ae6-8f03-d9c1fa125717\") " Oct 08 19:46:10 crc kubenswrapper[4750]: I1008 19:46:10.216249 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q46cs\" (UniqueName: \"kubernetes.io/projected/1059cf31-4768-4ae6-8f03-d9c1fa125717-kube-api-access-q46cs\") pod \"1059cf31-4768-4ae6-8f03-d9c1fa125717\" (UID: \"1059cf31-4768-4ae6-8f03-d9c1fa125717\") " Oct 08 19:46:10 crc kubenswrapper[4750]: I1008 19:46:10.262431 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1059cf31-4768-4ae6-8f03-d9c1fa125717-utilities" (OuterVolumeSpecName: "utilities") pod "1059cf31-4768-4ae6-8f03-d9c1fa125717" (UID: "1059cf31-4768-4ae6-8f03-d9c1fa125717"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:46:10 crc kubenswrapper[4750]: I1008 19:46:10.273713 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1059cf31-4768-4ae6-8f03-d9c1fa125717-kube-api-access-q46cs" (OuterVolumeSpecName: "kube-api-access-q46cs") pod "1059cf31-4768-4ae6-8f03-d9c1fa125717" (UID: "1059cf31-4768-4ae6-8f03-d9c1fa125717"). InnerVolumeSpecName "kube-api-access-q46cs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:46:10 crc kubenswrapper[4750]: I1008 19:46:10.279351 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1059cf31-4768-4ae6-8f03-d9c1fa125717-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1059cf31-4768-4ae6-8f03-d9c1fa125717" (UID: "1059cf31-4768-4ae6-8f03-d9c1fa125717"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:46:10 crc kubenswrapper[4750]: I1008 19:46:10.313814 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-4e16-account-create-t7vrg"] Oct 08 19:46:10 crc kubenswrapper[4750]: I1008 19:46:10.318975 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1059cf31-4768-4ae6-8f03-d9c1fa125717-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 19:46:10 crc kubenswrapper[4750]: I1008 19:46:10.319023 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q46cs\" (UniqueName: \"kubernetes.io/projected/1059cf31-4768-4ae6-8f03-d9c1fa125717-kube-api-access-q46cs\") on node \"crc\" DevicePath \"\"" Oct 08 19:46:10 crc kubenswrapper[4750]: I1008 19:46:10.319039 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1059cf31-4768-4ae6-8f03-d9c1fa125717-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 19:46:10 crc kubenswrapper[4750]: W1008 
19:46:10.325946 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36644d4b_9c0f_4d40_bea3_eab9bab01579.slice/crio-ebb662003c7aef579c3c92b105d9be7159159cc5527a872375dbffe95cdeae3a WatchSource:0}: Error finding container ebb662003c7aef579c3c92b105d9be7159159cc5527a872375dbffe95cdeae3a: Status 404 returned error can't find the container with id ebb662003c7aef579c3c92b105d9be7159159cc5527a872375dbffe95cdeae3a Oct 08 19:46:10 crc kubenswrapper[4750]: I1008 19:46:10.585087 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4e16-account-create-t7vrg" event={"ID":"36644d4b-9c0f-4d40-bea3-eab9bab01579","Type":"ContainerStarted","Data":"ebb662003c7aef579c3c92b105d9be7159159cc5527a872375dbffe95cdeae3a"} Oct 08 19:46:10 crc kubenswrapper[4750]: I1008 19:46:10.588762 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d413-account-create-4crz5" event={"ID":"8e7843c8-fa26-46fb-a193-ee70486938ce","Type":"ContainerStarted","Data":"f1712fc41341bb794bce25a8677c4ac1279e0de48eaf686ee882e35d55762642"} Oct 08 19:46:10 crc kubenswrapper[4750]: I1008 19:46:10.590683 4750 generic.go:334] "Generic (PLEG): container finished" podID="1059cf31-4768-4ae6-8f03-d9c1fa125717" containerID="d7aab40c9852758c89cfee20502eaa47218aec77471cc32a9636372ea0c13094" exitCode=0 Oct 08 19:46:10 crc kubenswrapper[4750]: I1008 19:46:10.590726 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7kjsn" event={"ID":"1059cf31-4768-4ae6-8f03-d9c1fa125717","Type":"ContainerDied","Data":"d7aab40c9852758c89cfee20502eaa47218aec77471cc32a9636372ea0c13094"} Oct 08 19:46:10 crc kubenswrapper[4750]: I1008 19:46:10.590761 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7kjsn" 
event={"ID":"1059cf31-4768-4ae6-8f03-d9c1fa125717","Type":"ContainerDied","Data":"63e3d742289e7e4643f7a036f909fb407a5b0f365acb4222103f582e801dcab8"} Oct 08 19:46:10 crc kubenswrapper[4750]: I1008 19:46:10.590781 4750 scope.go:117] "RemoveContainer" containerID="d7aab40c9852758c89cfee20502eaa47218aec77471cc32a9636372ea0c13094" Oct 08 19:46:10 crc kubenswrapper[4750]: I1008 19:46:10.590989 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7kjsn" Oct 08 19:46:10 crc kubenswrapper[4750]: I1008 19:46:10.615239 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-9f8d-account-create-vlx88"] Oct 08 19:46:10 crc kubenswrapper[4750]: I1008 19:46:10.625518 4750 scope.go:117] "RemoveContainer" containerID="3b6b48d84723c710ff9a09b4c4ff301828edb97b4f9a9bc542fbd7fbc671a15d" Oct 08 19:46:10 crc kubenswrapper[4750]: I1008 19:46:10.633315 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7kjsn"] Oct 08 19:46:10 crc kubenswrapper[4750]: I1008 19:46:10.643245 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7kjsn"] Oct 08 19:46:10 crc kubenswrapper[4750]: I1008 19:46:10.688415 4750 scope.go:117] "RemoveContainer" containerID="69274abc32b44ce5be2ac485909b911f2e6578bbfa3cc973c037f8dcbe74087a" Oct 08 19:46:10 crc kubenswrapper[4750]: I1008 19:46:10.747827 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1059cf31-4768-4ae6-8f03-d9c1fa125717" path="/var/lib/kubelet/pods/1059cf31-4768-4ae6-8f03-d9c1fa125717/volumes" Oct 08 19:46:10 crc kubenswrapper[4750]: I1008 19:46:10.757655 4750 scope.go:117] "RemoveContainer" containerID="d7aab40c9852758c89cfee20502eaa47218aec77471cc32a9636372ea0c13094" Oct 08 19:46:10 crc kubenswrapper[4750]: E1008 19:46:10.758334 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"d7aab40c9852758c89cfee20502eaa47218aec77471cc32a9636372ea0c13094\": container with ID starting with d7aab40c9852758c89cfee20502eaa47218aec77471cc32a9636372ea0c13094 not found: ID does not exist" containerID="d7aab40c9852758c89cfee20502eaa47218aec77471cc32a9636372ea0c13094" Oct 08 19:46:10 crc kubenswrapper[4750]: I1008 19:46:10.758384 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7aab40c9852758c89cfee20502eaa47218aec77471cc32a9636372ea0c13094"} err="failed to get container status \"d7aab40c9852758c89cfee20502eaa47218aec77471cc32a9636372ea0c13094\": rpc error: code = NotFound desc = could not find container \"d7aab40c9852758c89cfee20502eaa47218aec77471cc32a9636372ea0c13094\": container with ID starting with d7aab40c9852758c89cfee20502eaa47218aec77471cc32a9636372ea0c13094 not found: ID does not exist" Oct 08 19:46:10 crc kubenswrapper[4750]: I1008 19:46:10.758416 4750 scope.go:117] "RemoveContainer" containerID="3b6b48d84723c710ff9a09b4c4ff301828edb97b4f9a9bc542fbd7fbc671a15d" Oct 08 19:46:10 crc kubenswrapper[4750]: E1008 19:46:10.758917 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b6b48d84723c710ff9a09b4c4ff301828edb97b4f9a9bc542fbd7fbc671a15d\": container with ID starting with 3b6b48d84723c710ff9a09b4c4ff301828edb97b4f9a9bc542fbd7fbc671a15d not found: ID does not exist" containerID="3b6b48d84723c710ff9a09b4c4ff301828edb97b4f9a9bc542fbd7fbc671a15d" Oct 08 19:46:10 crc kubenswrapper[4750]: I1008 19:46:10.758969 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b6b48d84723c710ff9a09b4c4ff301828edb97b4f9a9bc542fbd7fbc671a15d"} err="failed to get container status \"3b6b48d84723c710ff9a09b4c4ff301828edb97b4f9a9bc542fbd7fbc671a15d\": rpc error: code = NotFound desc = could not find container \"3b6b48d84723c710ff9a09b4c4ff301828edb97b4f9a9bc542fbd7fbc671a15d\": 
container with ID starting with 3b6b48d84723c710ff9a09b4c4ff301828edb97b4f9a9bc542fbd7fbc671a15d not found: ID does not exist" Oct 08 19:46:10 crc kubenswrapper[4750]: I1008 19:46:10.758995 4750 scope.go:117] "RemoveContainer" containerID="69274abc32b44ce5be2ac485909b911f2e6578bbfa3cc973c037f8dcbe74087a" Oct 08 19:46:10 crc kubenswrapper[4750]: E1008 19:46:10.759688 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69274abc32b44ce5be2ac485909b911f2e6578bbfa3cc973c037f8dcbe74087a\": container with ID starting with 69274abc32b44ce5be2ac485909b911f2e6578bbfa3cc973c037f8dcbe74087a not found: ID does not exist" containerID="69274abc32b44ce5be2ac485909b911f2e6578bbfa3cc973c037f8dcbe74087a" Oct 08 19:46:10 crc kubenswrapper[4750]: I1008 19:46:10.759718 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69274abc32b44ce5be2ac485909b911f2e6578bbfa3cc973c037f8dcbe74087a"} err="failed to get container status \"69274abc32b44ce5be2ac485909b911f2e6578bbfa3cc973c037f8dcbe74087a\": rpc error: code = NotFound desc = could not find container \"69274abc32b44ce5be2ac485909b911f2e6578bbfa3cc973c037f8dcbe74087a\": container with ID starting with 69274abc32b44ce5be2ac485909b911f2e6578bbfa3cc973c037f8dcbe74087a not found: ID does not exist" Oct 08 19:46:11 crc kubenswrapper[4750]: I1008 19:46:11.603288 4750 generic.go:334] "Generic (PLEG): container finished" podID="82614568-a654-433f-8008-026d4e9b1951" containerID="a4e75abf5fbb6ba61cf0d3963a1254410baab6f8973ab1bd60d544fc60704364" exitCode=0 Oct 08 19:46:11 crc kubenswrapper[4750]: I1008 19:46:11.603391 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9f8d-account-create-vlx88" event={"ID":"82614568-a654-433f-8008-026d4e9b1951","Type":"ContainerDied","Data":"a4e75abf5fbb6ba61cf0d3963a1254410baab6f8973ab1bd60d544fc60704364"} Oct 08 19:46:11 crc kubenswrapper[4750]: I1008 
19:46:11.603909 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9f8d-account-create-vlx88" event={"ID":"82614568-a654-433f-8008-026d4e9b1951","Type":"ContainerStarted","Data":"6bf8eb50f5c223e6fdd12151cfa01db55bc1236bf047f70843d13ce8d1f1e639"} Oct 08 19:46:11 crc kubenswrapper[4750]: I1008 19:46:11.606595 4750 generic.go:334] "Generic (PLEG): container finished" podID="8e7843c8-fa26-46fb-a193-ee70486938ce" containerID="b53cbbe43cd6f2ad27daa9736fa749a6da798ee14914e6fc0fa0fbd56a62bcba" exitCode=0 Oct 08 19:46:11 crc kubenswrapper[4750]: I1008 19:46:11.606748 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d413-account-create-4crz5" event={"ID":"8e7843c8-fa26-46fb-a193-ee70486938ce","Type":"ContainerDied","Data":"b53cbbe43cd6f2ad27daa9736fa749a6da798ee14914e6fc0fa0fbd56a62bcba"} Oct 08 19:46:11 crc kubenswrapper[4750]: I1008 19:46:11.610931 4750 generic.go:334] "Generic (PLEG): container finished" podID="36644d4b-9c0f-4d40-bea3-eab9bab01579" containerID="072e06fd3a3dd64f96031b2b0b6bfffe55ecca2ef79371dcad9ce43cd3e4c581" exitCode=0 Oct 08 19:46:11 crc kubenswrapper[4750]: I1008 19:46:11.610986 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4e16-account-create-t7vrg" event={"ID":"36644d4b-9c0f-4d40-bea3-eab9bab01579","Type":"ContainerDied","Data":"072e06fd3a3dd64f96031b2b0b6bfffe55ecca2ef79371dcad9ce43cd3e4c581"} Oct 08 19:46:13 crc kubenswrapper[4750]: I1008 19:46:13.103683 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9f8d-account-create-vlx88" Oct 08 19:46:13 crc kubenswrapper[4750]: I1008 19:46:13.114718 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-d413-account-create-4crz5" Oct 08 19:46:13 crc kubenswrapper[4750]: I1008 19:46:13.129721 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-4e16-account-create-t7vrg" Oct 08 19:46:13 crc kubenswrapper[4750]: I1008 19:46:13.178720 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5n8c\" (UniqueName: \"kubernetes.io/projected/8e7843c8-fa26-46fb-a193-ee70486938ce-kube-api-access-d5n8c\") pod \"8e7843c8-fa26-46fb-a193-ee70486938ce\" (UID: \"8e7843c8-fa26-46fb-a193-ee70486938ce\") " Oct 08 19:46:13 crc kubenswrapper[4750]: I1008 19:46:13.178829 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9r68b\" (UniqueName: \"kubernetes.io/projected/82614568-a654-433f-8008-026d4e9b1951-kube-api-access-9r68b\") pod \"82614568-a654-433f-8008-026d4e9b1951\" (UID: \"82614568-a654-433f-8008-026d4e9b1951\") " Oct 08 19:46:13 crc kubenswrapper[4750]: I1008 19:46:13.179010 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xh4nl\" (UniqueName: \"kubernetes.io/projected/36644d4b-9c0f-4d40-bea3-eab9bab01579-kube-api-access-xh4nl\") pod \"36644d4b-9c0f-4d40-bea3-eab9bab01579\" (UID: \"36644d4b-9c0f-4d40-bea3-eab9bab01579\") " Oct 08 19:46:13 crc kubenswrapper[4750]: I1008 19:46:13.185248 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e7843c8-fa26-46fb-a193-ee70486938ce-kube-api-access-d5n8c" (OuterVolumeSpecName: "kube-api-access-d5n8c") pod "8e7843c8-fa26-46fb-a193-ee70486938ce" (UID: "8e7843c8-fa26-46fb-a193-ee70486938ce"). InnerVolumeSpecName "kube-api-access-d5n8c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:46:13 crc kubenswrapper[4750]: I1008 19:46:13.190648 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36644d4b-9c0f-4d40-bea3-eab9bab01579-kube-api-access-xh4nl" (OuterVolumeSpecName: "kube-api-access-xh4nl") pod "36644d4b-9c0f-4d40-bea3-eab9bab01579" (UID: "36644d4b-9c0f-4d40-bea3-eab9bab01579"). InnerVolumeSpecName "kube-api-access-xh4nl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:46:13 crc kubenswrapper[4750]: I1008 19:46:13.190676 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82614568-a654-433f-8008-026d4e9b1951-kube-api-access-9r68b" (OuterVolumeSpecName: "kube-api-access-9r68b") pod "82614568-a654-433f-8008-026d4e9b1951" (UID: "82614568-a654-433f-8008-026d4e9b1951"). InnerVolumeSpecName "kube-api-access-9r68b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:46:13 crc kubenswrapper[4750]: I1008 19:46:13.281062 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5n8c\" (UniqueName: \"kubernetes.io/projected/8e7843c8-fa26-46fb-a193-ee70486938ce-kube-api-access-d5n8c\") on node \"crc\" DevicePath \"\"" Oct 08 19:46:13 crc kubenswrapper[4750]: I1008 19:46:13.281114 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9r68b\" (UniqueName: \"kubernetes.io/projected/82614568-a654-433f-8008-026d4e9b1951-kube-api-access-9r68b\") on node \"crc\" DevicePath \"\"" Oct 08 19:46:13 crc kubenswrapper[4750]: I1008 19:46:13.281131 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xh4nl\" (UniqueName: \"kubernetes.io/projected/36644d4b-9c0f-4d40-bea3-eab9bab01579-kube-api-access-xh4nl\") on node \"crc\" DevicePath \"\"" Oct 08 19:46:13 crc kubenswrapper[4750]: I1008 19:46:13.639635 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9f8d-account-create-vlx88" 
event={"ID":"82614568-a654-433f-8008-026d4e9b1951","Type":"ContainerDied","Data":"6bf8eb50f5c223e6fdd12151cfa01db55bc1236bf047f70843d13ce8d1f1e639"} Oct 08 19:46:13 crc kubenswrapper[4750]: I1008 19:46:13.639686 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6bf8eb50f5c223e6fdd12151cfa01db55bc1236bf047f70843d13ce8d1f1e639" Oct 08 19:46:13 crc kubenswrapper[4750]: I1008 19:46:13.639647 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9f8d-account-create-vlx88" Oct 08 19:46:13 crc kubenswrapper[4750]: I1008 19:46:13.641844 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d413-account-create-4crz5" event={"ID":"8e7843c8-fa26-46fb-a193-ee70486938ce","Type":"ContainerDied","Data":"f1712fc41341bb794bce25a8677c4ac1279e0de48eaf686ee882e35d55762642"} Oct 08 19:46:13 crc kubenswrapper[4750]: I1008 19:46:13.641896 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1712fc41341bb794bce25a8677c4ac1279e0de48eaf686ee882e35d55762642" Oct 08 19:46:13 crc kubenswrapper[4750]: I1008 19:46:13.641936 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-d413-account-create-4crz5" Oct 08 19:46:13 crc kubenswrapper[4750]: I1008 19:46:13.644940 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4e16-account-create-t7vrg" event={"ID":"36644d4b-9c0f-4d40-bea3-eab9bab01579","Type":"ContainerDied","Data":"ebb662003c7aef579c3c92b105d9be7159159cc5527a872375dbffe95cdeae3a"} Oct 08 19:46:13 crc kubenswrapper[4750]: I1008 19:46:13.644982 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebb662003c7aef579c3c92b105d9be7159159cc5527a872375dbffe95cdeae3a" Oct 08 19:46:13 crc kubenswrapper[4750]: I1008 19:46:13.644990 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-4e16-account-create-t7vrg" Oct 08 19:46:13 crc kubenswrapper[4750]: E1008 19:46:13.816429 4750 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82614568_a654_433f_8008_026d4e9b1951.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36644d4b_9c0f_4d40_bea3_eab9bab01579.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e7843c8_fa26_46fb_a193_ee70486938ce.slice/crio-f1712fc41341bb794bce25a8677c4ac1279e0de48eaf686ee882e35d55762642\": RecentStats: unable to find data in memory cache]" Oct 08 19:46:14 crc kubenswrapper[4750]: I1008 19:46:14.708740 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cdkkb"] Oct 08 19:46:14 crc kubenswrapper[4750]: E1008 19:46:14.709665 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1059cf31-4768-4ae6-8f03-d9c1fa125717" containerName="extract-utilities" Oct 08 19:46:14 crc kubenswrapper[4750]: I1008 19:46:14.709683 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="1059cf31-4768-4ae6-8f03-d9c1fa125717" containerName="extract-utilities" Oct 08 19:46:14 crc kubenswrapper[4750]: E1008 19:46:14.709721 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1059cf31-4768-4ae6-8f03-d9c1fa125717" containerName="extract-content" Oct 08 19:46:14 crc kubenswrapper[4750]: I1008 19:46:14.709730 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="1059cf31-4768-4ae6-8f03-d9c1fa125717" containerName="extract-content" Oct 08 19:46:14 crc kubenswrapper[4750]: E1008 19:46:14.709743 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36644d4b-9c0f-4d40-bea3-eab9bab01579" containerName="mariadb-account-create" Oct 
08 19:46:14 crc kubenswrapper[4750]: I1008 19:46:14.709752 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="36644d4b-9c0f-4d40-bea3-eab9bab01579" containerName="mariadb-account-create" Oct 08 19:46:14 crc kubenswrapper[4750]: E1008 19:46:14.709763 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1059cf31-4768-4ae6-8f03-d9c1fa125717" containerName="registry-server" Oct 08 19:46:14 crc kubenswrapper[4750]: I1008 19:46:14.709769 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="1059cf31-4768-4ae6-8f03-d9c1fa125717" containerName="registry-server" Oct 08 19:46:14 crc kubenswrapper[4750]: E1008 19:46:14.709778 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82614568-a654-433f-8008-026d4e9b1951" containerName="mariadb-account-create" Oct 08 19:46:14 crc kubenswrapper[4750]: I1008 19:46:14.709784 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="82614568-a654-433f-8008-026d4e9b1951" containerName="mariadb-account-create" Oct 08 19:46:14 crc kubenswrapper[4750]: E1008 19:46:14.709798 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e7843c8-fa26-46fb-a193-ee70486938ce" containerName="mariadb-account-create" Oct 08 19:46:14 crc kubenswrapper[4750]: I1008 19:46:14.709804 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e7843c8-fa26-46fb-a193-ee70486938ce" containerName="mariadb-account-create" Oct 08 19:46:14 crc kubenswrapper[4750]: I1008 19:46:14.710135 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="1059cf31-4768-4ae6-8f03-d9c1fa125717" containerName="registry-server" Oct 08 19:46:14 crc kubenswrapper[4750]: I1008 19:46:14.710148 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="82614568-a654-433f-8008-026d4e9b1951" containerName="mariadb-account-create" Oct 08 19:46:14 crc kubenswrapper[4750]: I1008 19:46:14.710166 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e7843c8-fa26-46fb-a193-ee70486938ce" 
containerName="mariadb-account-create" Oct 08 19:46:14 crc kubenswrapper[4750]: I1008 19:46:14.710181 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="36644d4b-9c0f-4d40-bea3-eab9bab01579" containerName="mariadb-account-create" Oct 08 19:46:14 crc kubenswrapper[4750]: I1008 19:46:14.711058 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cdkkb" Oct 08 19:46:14 crc kubenswrapper[4750]: I1008 19:46:14.714598 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 08 19:46:14 crc kubenswrapper[4750]: I1008 19:46:14.714663 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 08 19:46:14 crc kubenswrapper[4750]: I1008 19:46:14.714692 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-6qk7m" Oct 08 19:46:14 crc kubenswrapper[4750]: I1008 19:46:14.753756 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cdkkb"] Oct 08 19:46:14 crc kubenswrapper[4750]: I1008 19:46:14.812336 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xktx8\" (UniqueName: \"kubernetes.io/projected/6024d989-90d4-44cf-bb56-539b3926d6a6-kube-api-access-xktx8\") pod \"nova-cell0-conductor-db-sync-cdkkb\" (UID: \"6024d989-90d4-44cf-bb56-539b3926d6a6\") " pod="openstack/nova-cell0-conductor-db-sync-cdkkb" Oct 08 19:46:14 crc kubenswrapper[4750]: I1008 19:46:14.812475 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6024d989-90d4-44cf-bb56-539b3926d6a6-config-data\") pod \"nova-cell0-conductor-db-sync-cdkkb\" (UID: \"6024d989-90d4-44cf-bb56-539b3926d6a6\") " pod="openstack/nova-cell0-conductor-db-sync-cdkkb" Oct 08 19:46:14 crc 
kubenswrapper[4750]: I1008 19:46:14.812588 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6024d989-90d4-44cf-bb56-539b3926d6a6-scripts\") pod \"nova-cell0-conductor-db-sync-cdkkb\" (UID: \"6024d989-90d4-44cf-bb56-539b3926d6a6\") " pod="openstack/nova-cell0-conductor-db-sync-cdkkb" Oct 08 19:46:14 crc kubenswrapper[4750]: I1008 19:46:14.812658 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6024d989-90d4-44cf-bb56-539b3926d6a6-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-cdkkb\" (UID: \"6024d989-90d4-44cf-bb56-539b3926d6a6\") " pod="openstack/nova-cell0-conductor-db-sync-cdkkb" Oct 08 19:46:14 crc kubenswrapper[4750]: I1008 19:46:14.914766 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xktx8\" (UniqueName: \"kubernetes.io/projected/6024d989-90d4-44cf-bb56-539b3926d6a6-kube-api-access-xktx8\") pod \"nova-cell0-conductor-db-sync-cdkkb\" (UID: \"6024d989-90d4-44cf-bb56-539b3926d6a6\") " pod="openstack/nova-cell0-conductor-db-sync-cdkkb" Oct 08 19:46:14 crc kubenswrapper[4750]: I1008 19:46:14.914852 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6024d989-90d4-44cf-bb56-539b3926d6a6-config-data\") pod \"nova-cell0-conductor-db-sync-cdkkb\" (UID: \"6024d989-90d4-44cf-bb56-539b3926d6a6\") " pod="openstack/nova-cell0-conductor-db-sync-cdkkb" Oct 08 19:46:14 crc kubenswrapper[4750]: I1008 19:46:14.914901 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6024d989-90d4-44cf-bb56-539b3926d6a6-scripts\") pod \"nova-cell0-conductor-db-sync-cdkkb\" (UID: \"6024d989-90d4-44cf-bb56-539b3926d6a6\") " pod="openstack/nova-cell0-conductor-db-sync-cdkkb" 
Oct 08 19:46:14 crc kubenswrapper[4750]: I1008 19:46:14.914932 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6024d989-90d4-44cf-bb56-539b3926d6a6-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-cdkkb\" (UID: \"6024d989-90d4-44cf-bb56-539b3926d6a6\") " pod="openstack/nova-cell0-conductor-db-sync-cdkkb" Oct 08 19:46:14 crc kubenswrapper[4750]: I1008 19:46:14.920758 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6024d989-90d4-44cf-bb56-539b3926d6a6-scripts\") pod \"nova-cell0-conductor-db-sync-cdkkb\" (UID: \"6024d989-90d4-44cf-bb56-539b3926d6a6\") " pod="openstack/nova-cell0-conductor-db-sync-cdkkb" Oct 08 19:46:14 crc kubenswrapper[4750]: I1008 19:46:14.921122 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6024d989-90d4-44cf-bb56-539b3926d6a6-config-data\") pod \"nova-cell0-conductor-db-sync-cdkkb\" (UID: \"6024d989-90d4-44cf-bb56-539b3926d6a6\") " pod="openstack/nova-cell0-conductor-db-sync-cdkkb" Oct 08 19:46:14 crc kubenswrapper[4750]: I1008 19:46:14.921773 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6024d989-90d4-44cf-bb56-539b3926d6a6-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-cdkkb\" (UID: \"6024d989-90d4-44cf-bb56-539b3926d6a6\") " pod="openstack/nova-cell0-conductor-db-sync-cdkkb" Oct 08 19:46:14 crc kubenswrapper[4750]: I1008 19:46:14.933166 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xktx8\" (UniqueName: \"kubernetes.io/projected/6024d989-90d4-44cf-bb56-539b3926d6a6-kube-api-access-xktx8\") pod \"nova-cell0-conductor-db-sync-cdkkb\" (UID: \"6024d989-90d4-44cf-bb56-539b3926d6a6\") " pod="openstack/nova-cell0-conductor-db-sync-cdkkb" Oct 08 19:46:15 crc 
kubenswrapper[4750]: I1008 19:46:15.035281 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cdkkb" Oct 08 19:46:15 crc kubenswrapper[4750]: I1008 19:46:15.520274 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cdkkb"] Oct 08 19:46:15 crc kubenswrapper[4750]: I1008 19:46:15.668822 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cdkkb" event={"ID":"6024d989-90d4-44cf-bb56-539b3926d6a6","Type":"ContainerStarted","Data":"2c5461069339d6b33d6a789e901e9674d0646c8ebafc87ec304d5de8f6f73cb5"} Oct 08 19:46:16 crc kubenswrapper[4750]: I1008 19:46:16.685625 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cdkkb" event={"ID":"6024d989-90d4-44cf-bb56-539b3926d6a6","Type":"ContainerStarted","Data":"3d0c56b5c34bf95f116aad313597dacc8d4dffe0962da310d7a7827c5a69cd6d"} Oct 08 19:46:16 crc kubenswrapper[4750]: I1008 19:46:16.721124 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-cdkkb" podStartSLOduration=2.721088182 podStartE2EDuration="2.721088182s" podCreationTimestamp="2025-10-08 19:46:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:46:16.712214111 +0000 UTC m=+5732.625185164" watchObservedRunningTime="2025-10-08 19:46:16.721088182 +0000 UTC m=+5732.634059275" Oct 08 19:46:21 crc kubenswrapper[4750]: I1008 19:46:21.745971 4750 generic.go:334] "Generic (PLEG): container finished" podID="6024d989-90d4-44cf-bb56-539b3926d6a6" containerID="3d0c56b5c34bf95f116aad313597dacc8d4dffe0962da310d7a7827c5a69cd6d" exitCode=0 Oct 08 19:46:21 crc kubenswrapper[4750]: I1008 19:46:21.746113 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cdkkb" 
event={"ID":"6024d989-90d4-44cf-bb56-539b3926d6a6","Type":"ContainerDied","Data":"3d0c56b5c34bf95f116aad313597dacc8d4dffe0962da310d7a7827c5a69cd6d"} Oct 08 19:46:23 crc kubenswrapper[4750]: I1008 19:46:23.270117 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cdkkb" Oct 08 19:46:23 crc kubenswrapper[4750]: I1008 19:46:23.326683 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6024d989-90d4-44cf-bb56-539b3926d6a6-combined-ca-bundle\") pod \"6024d989-90d4-44cf-bb56-539b3926d6a6\" (UID: \"6024d989-90d4-44cf-bb56-539b3926d6a6\") " Oct 08 19:46:23 crc kubenswrapper[4750]: I1008 19:46:23.326803 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6024d989-90d4-44cf-bb56-539b3926d6a6-scripts\") pod \"6024d989-90d4-44cf-bb56-539b3926d6a6\" (UID: \"6024d989-90d4-44cf-bb56-539b3926d6a6\") " Oct 08 19:46:23 crc kubenswrapper[4750]: I1008 19:46:23.326846 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6024d989-90d4-44cf-bb56-539b3926d6a6-config-data\") pod \"6024d989-90d4-44cf-bb56-539b3926d6a6\" (UID: \"6024d989-90d4-44cf-bb56-539b3926d6a6\") " Oct 08 19:46:23 crc kubenswrapper[4750]: I1008 19:46:23.326925 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xktx8\" (UniqueName: \"kubernetes.io/projected/6024d989-90d4-44cf-bb56-539b3926d6a6-kube-api-access-xktx8\") pod \"6024d989-90d4-44cf-bb56-539b3926d6a6\" (UID: \"6024d989-90d4-44cf-bb56-539b3926d6a6\") " Oct 08 19:46:23 crc kubenswrapper[4750]: I1008 19:46:23.340853 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6024d989-90d4-44cf-bb56-539b3926d6a6-kube-api-access-xktx8" 
(OuterVolumeSpecName: "kube-api-access-xktx8") pod "6024d989-90d4-44cf-bb56-539b3926d6a6" (UID: "6024d989-90d4-44cf-bb56-539b3926d6a6"). InnerVolumeSpecName "kube-api-access-xktx8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:46:23 crc kubenswrapper[4750]: I1008 19:46:23.340952 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6024d989-90d4-44cf-bb56-539b3926d6a6-scripts" (OuterVolumeSpecName: "scripts") pod "6024d989-90d4-44cf-bb56-539b3926d6a6" (UID: "6024d989-90d4-44cf-bb56-539b3926d6a6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:46:23 crc kubenswrapper[4750]: I1008 19:46:23.361430 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6024d989-90d4-44cf-bb56-539b3926d6a6-config-data" (OuterVolumeSpecName: "config-data") pod "6024d989-90d4-44cf-bb56-539b3926d6a6" (UID: "6024d989-90d4-44cf-bb56-539b3926d6a6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:46:23 crc kubenswrapper[4750]: I1008 19:46:23.369945 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6024d989-90d4-44cf-bb56-539b3926d6a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6024d989-90d4-44cf-bb56-539b3926d6a6" (UID: "6024d989-90d4-44cf-bb56-539b3926d6a6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:46:23 crc kubenswrapper[4750]: I1008 19:46:23.429490 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6024d989-90d4-44cf-bb56-539b3926d6a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 19:46:23 crc kubenswrapper[4750]: I1008 19:46:23.429566 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6024d989-90d4-44cf-bb56-539b3926d6a6-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 19:46:23 crc kubenswrapper[4750]: I1008 19:46:23.429590 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6024d989-90d4-44cf-bb56-539b3926d6a6-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 19:46:23 crc kubenswrapper[4750]: I1008 19:46:23.429609 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xktx8\" (UniqueName: \"kubernetes.io/projected/6024d989-90d4-44cf-bb56-539b3926d6a6-kube-api-access-xktx8\") on node \"crc\" DevicePath \"\"" Oct 08 19:46:23 crc kubenswrapper[4750]: I1008 19:46:23.795164 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cdkkb" event={"ID":"6024d989-90d4-44cf-bb56-539b3926d6a6","Type":"ContainerDied","Data":"2c5461069339d6b33d6a789e901e9674d0646c8ebafc87ec304d5de8f6f73cb5"} Oct 08 19:46:23 crc kubenswrapper[4750]: I1008 19:46:23.795214 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c5461069339d6b33d6a789e901e9674d0646c8ebafc87ec304d5de8f6f73cb5" Oct 08 19:46:23 crc kubenswrapper[4750]: I1008 19:46:23.795322 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cdkkb" Oct 08 19:46:23 crc kubenswrapper[4750]: I1008 19:46:23.887816 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 08 19:46:23 crc kubenswrapper[4750]: E1008 19:46:23.888570 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6024d989-90d4-44cf-bb56-539b3926d6a6" containerName="nova-cell0-conductor-db-sync" Oct 08 19:46:23 crc kubenswrapper[4750]: I1008 19:46:23.888604 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="6024d989-90d4-44cf-bb56-539b3926d6a6" containerName="nova-cell0-conductor-db-sync" Oct 08 19:46:23 crc kubenswrapper[4750]: I1008 19:46:23.888869 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="6024d989-90d4-44cf-bb56-539b3926d6a6" containerName="nova-cell0-conductor-db-sync" Oct 08 19:46:23 crc kubenswrapper[4750]: I1008 19:46:23.889797 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 08 19:46:23 crc kubenswrapper[4750]: I1008 19:46:23.893284 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 08 19:46:23 crc kubenswrapper[4750]: I1008 19:46:23.893711 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-6qk7m" Oct 08 19:46:23 crc kubenswrapper[4750]: I1008 19:46:23.913829 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 08 19:46:23 crc kubenswrapper[4750]: I1008 19:46:23.940451 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc2df1e2-9676-4737-ba39-8769738c1c67-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"cc2df1e2-9676-4737-ba39-8769738c1c67\") " pod="openstack/nova-cell0-conductor-0" Oct 08 19:46:23 crc kubenswrapper[4750]: I1008 
19:46:23.940875 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc2df1e2-9676-4737-ba39-8769738c1c67-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"cc2df1e2-9676-4737-ba39-8769738c1c67\") " pod="openstack/nova-cell0-conductor-0" Oct 08 19:46:23 crc kubenswrapper[4750]: I1008 19:46:23.940951 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl475\" (UniqueName: \"kubernetes.io/projected/cc2df1e2-9676-4737-ba39-8769738c1c67-kube-api-access-hl475\") pod \"nova-cell0-conductor-0\" (UID: \"cc2df1e2-9676-4737-ba39-8769738c1c67\") " pod="openstack/nova-cell0-conductor-0" Oct 08 19:46:24 crc kubenswrapper[4750]: I1008 19:46:24.042041 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc2df1e2-9676-4737-ba39-8769738c1c67-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"cc2df1e2-9676-4737-ba39-8769738c1c67\") " pod="openstack/nova-cell0-conductor-0" Oct 08 19:46:24 crc kubenswrapper[4750]: I1008 19:46:24.042146 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc2df1e2-9676-4737-ba39-8769738c1c67-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"cc2df1e2-9676-4737-ba39-8769738c1c67\") " pod="openstack/nova-cell0-conductor-0" Oct 08 19:46:24 crc kubenswrapper[4750]: I1008 19:46:24.042260 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl475\" (UniqueName: \"kubernetes.io/projected/cc2df1e2-9676-4737-ba39-8769738c1c67-kube-api-access-hl475\") pod \"nova-cell0-conductor-0\" (UID: \"cc2df1e2-9676-4737-ba39-8769738c1c67\") " pod="openstack/nova-cell0-conductor-0" Oct 08 19:46:24 crc kubenswrapper[4750]: I1008 19:46:24.050336 4750 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc2df1e2-9676-4737-ba39-8769738c1c67-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"cc2df1e2-9676-4737-ba39-8769738c1c67\") " pod="openstack/nova-cell0-conductor-0" Oct 08 19:46:24 crc kubenswrapper[4750]: I1008 19:46:24.059490 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl475\" (UniqueName: \"kubernetes.io/projected/cc2df1e2-9676-4737-ba39-8769738c1c67-kube-api-access-hl475\") pod \"nova-cell0-conductor-0\" (UID: \"cc2df1e2-9676-4737-ba39-8769738c1c67\") " pod="openstack/nova-cell0-conductor-0" Oct 08 19:46:24 crc kubenswrapper[4750]: I1008 19:46:24.062591 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc2df1e2-9676-4737-ba39-8769738c1c67-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"cc2df1e2-9676-4737-ba39-8769738c1c67\") " pod="openstack/nova-cell0-conductor-0" Oct 08 19:46:24 crc kubenswrapper[4750]: I1008 19:46:24.227238 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 08 19:46:24 crc kubenswrapper[4750]: I1008 19:46:24.776404 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 08 19:46:24 crc kubenswrapper[4750]: I1008 19:46:24.807281 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"cc2df1e2-9676-4737-ba39-8769738c1c67","Type":"ContainerStarted","Data":"334feecd3069e241f128e19cb978323f18947c29bac81eead1911cbd364495d1"} Oct 08 19:46:25 crc kubenswrapper[4750]: I1008 19:46:25.822747 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"cc2df1e2-9676-4737-ba39-8769738c1c67","Type":"ContainerStarted","Data":"38bad996873ec5c7c4003b92ea64ed74b06e9b8a1f8e60564b58d4c843f27b53"} Oct 08 19:46:25 crc kubenswrapper[4750]: I1008 19:46:25.824423 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 08 19:46:25 crc kubenswrapper[4750]: I1008 19:46:25.855595 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.855569656 podStartE2EDuration="2.855569656s" podCreationTimestamp="2025-10-08 19:46:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:46:25.849899915 +0000 UTC m=+5741.762870928" watchObservedRunningTime="2025-10-08 19:46:25.855569656 +0000 UTC m=+5741.768540669" Oct 08 19:46:29 crc kubenswrapper[4750]: I1008 19:46:29.266805 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 08 19:46:29 crc kubenswrapper[4750]: I1008 19:46:29.997599 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-x9l8h"] Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.001768 4750 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-x9l8h" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.005010 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.005684 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.066361 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-x9l8h"] Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.097906 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23216996-77d2-40e7-a29c-b43247b1fb15-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-x9l8h\" (UID: \"23216996-77d2-40e7-a29c-b43247b1fb15\") " pod="openstack/nova-cell0-cell-mapping-x9l8h" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.098096 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23216996-77d2-40e7-a29c-b43247b1fb15-scripts\") pod \"nova-cell0-cell-mapping-x9l8h\" (UID: \"23216996-77d2-40e7-a29c-b43247b1fb15\") " pod="openstack/nova-cell0-cell-mapping-x9l8h" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.098245 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jhxl\" (UniqueName: \"kubernetes.io/projected/23216996-77d2-40e7-a29c-b43247b1fb15-kube-api-access-6jhxl\") pod \"nova-cell0-cell-mapping-x9l8h\" (UID: \"23216996-77d2-40e7-a29c-b43247b1fb15\") " pod="openstack/nova-cell0-cell-mapping-x9l8h" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.098364 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23216996-77d2-40e7-a29c-b43247b1fb15-config-data\") pod \"nova-cell0-cell-mapping-x9l8h\" (UID: \"23216996-77d2-40e7-a29c-b43247b1fb15\") " pod="openstack/nova-cell0-cell-mapping-x9l8h" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.125674 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.127636 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.131259 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.144105 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.200864 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23216996-77d2-40e7-a29c-b43247b1fb15-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-x9l8h\" (UID: \"23216996-77d2-40e7-a29c-b43247b1fb15\") " pod="openstack/nova-cell0-cell-mapping-x9l8h" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.201228 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23216996-77d2-40e7-a29c-b43247b1fb15-scripts\") pod \"nova-cell0-cell-mapping-x9l8h\" (UID: \"23216996-77d2-40e7-a29c-b43247b1fb15\") " pod="openstack/nova-cell0-cell-mapping-x9l8h" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.201358 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jhxl\" (UniqueName: \"kubernetes.io/projected/23216996-77d2-40e7-a29c-b43247b1fb15-kube-api-access-6jhxl\") pod \"nova-cell0-cell-mapping-x9l8h\" (UID: \"23216996-77d2-40e7-a29c-b43247b1fb15\") " 
pod="openstack/nova-cell0-cell-mapping-x9l8h" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.201453 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d0e5c35-3529-4151-8585-cd42d5b114af-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2d0e5c35-3529-4151-8585-cd42d5b114af\") " pod="openstack/nova-api-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.201539 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d0e5c35-3529-4151-8585-cd42d5b114af-logs\") pod \"nova-api-0\" (UID: \"2d0e5c35-3529-4151-8585-cd42d5b114af\") " pod="openstack/nova-api-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.201661 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23216996-77d2-40e7-a29c-b43247b1fb15-config-data\") pod \"nova-cell0-cell-mapping-x9l8h\" (UID: \"23216996-77d2-40e7-a29c-b43247b1fb15\") " pod="openstack/nova-cell0-cell-mapping-x9l8h" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.201780 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d0e5c35-3529-4151-8585-cd42d5b114af-config-data\") pod \"nova-api-0\" (UID: \"2d0e5c35-3529-4151-8585-cd42d5b114af\") " pod="openstack/nova-api-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.201860 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6r7c\" (UniqueName: \"kubernetes.io/projected/2d0e5c35-3529-4151-8585-cd42d5b114af-kube-api-access-k6r7c\") pod \"nova-api-0\" (UID: \"2d0e5c35-3529-4151-8585-cd42d5b114af\") " pod="openstack/nova-api-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.229585 4750 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23216996-77d2-40e7-a29c-b43247b1fb15-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-x9l8h\" (UID: \"23216996-77d2-40e7-a29c-b43247b1fb15\") " pod="openstack/nova-cell0-cell-mapping-x9l8h" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.231158 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23216996-77d2-40e7-a29c-b43247b1fb15-scripts\") pod \"nova-cell0-cell-mapping-x9l8h\" (UID: \"23216996-77d2-40e7-a29c-b43247b1fb15\") " pod="openstack/nova-cell0-cell-mapping-x9l8h" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.231927 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23216996-77d2-40e7-a29c-b43247b1fb15-config-data\") pod \"nova-cell0-cell-mapping-x9l8h\" (UID: \"23216996-77d2-40e7-a29c-b43247b1fb15\") " pod="openstack/nova-cell0-cell-mapping-x9l8h" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.239742 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.241855 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.247747 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.248213 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jhxl\" (UniqueName: \"kubernetes.io/projected/23216996-77d2-40e7-a29c-b43247b1fb15-kube-api-access-6jhxl\") pod \"nova-cell0-cell-mapping-x9l8h\" (UID: \"23216996-77d2-40e7-a29c-b43247b1fb15\") " pod="openstack/nova-cell0-cell-mapping-x9l8h" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.274845 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.304573 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e13efc0-b986-4638-ac34-35f3cddc6a02-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e13efc0-b986-4638-ac34-35f3cddc6a02\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.304681 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d0e5c35-3529-4151-8585-cd42d5b114af-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2d0e5c35-3529-4151-8585-cd42d5b114af\") " pod="openstack/nova-api-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.304704 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d0e5c35-3529-4151-8585-cd42d5b114af-logs\") pod \"nova-api-0\" (UID: \"2d0e5c35-3529-4151-8585-cd42d5b114af\") " pod="openstack/nova-api-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.304726 4750 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d77sr\" (UniqueName: \"kubernetes.io/projected/3e13efc0-b986-4638-ac34-35f3cddc6a02-kube-api-access-d77sr\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e13efc0-b986-4638-ac34-35f3cddc6a02\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.304785 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e13efc0-b986-4638-ac34-35f3cddc6a02-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e13efc0-b986-4638-ac34-35f3cddc6a02\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.304822 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d0e5c35-3529-4151-8585-cd42d5b114af-config-data\") pod \"nova-api-0\" (UID: \"2d0e5c35-3529-4151-8585-cd42d5b114af\") " pod="openstack/nova-api-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.304845 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6r7c\" (UniqueName: \"kubernetes.io/projected/2d0e5c35-3529-4151-8585-cd42d5b114af-kube-api-access-k6r7c\") pod \"nova-api-0\" (UID: \"2d0e5c35-3529-4151-8585-cd42d5b114af\") " pod="openstack/nova-api-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.305166 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.305605 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d0e5c35-3529-4151-8585-cd42d5b114af-logs\") pod \"nova-api-0\" (UID: \"2d0e5c35-3529-4151-8585-cd42d5b114af\") " pod="openstack/nova-api-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.307515 4750 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.317935 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.319900 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d0e5c35-3529-4151-8585-cd42d5b114af-config-data\") pod \"nova-api-0\" (UID: \"2d0e5c35-3529-4151-8585-cd42d5b114af\") " pod="openstack/nova-api-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.326445 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d0e5c35-3529-4151-8585-cd42d5b114af-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2d0e5c35-3529-4151-8585-cd42d5b114af\") " pod="openstack/nova-api-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.336847 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6r7c\" (UniqueName: \"kubernetes.io/projected/2d0e5c35-3529-4151-8585-cd42d5b114af-kube-api-access-k6r7c\") pod \"nova-api-0\" (UID: \"2d0e5c35-3529-4151-8585-cd42d5b114af\") " pod="openstack/nova-api-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.349804 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-x9l8h" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.355716 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.402422 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.404371 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.407202 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26823f02-cb94-46e2-ac86-0d0a2ee50c11-logs\") pod \"nova-metadata-0\" (UID: \"26823f02-cb94-46e2-ac86-0d0a2ee50c11\") " pod="openstack/nova-metadata-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.407296 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e13efc0-b986-4638-ac34-35f3cddc6a02-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e13efc0-b986-4638-ac34-35f3cddc6a02\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.407344 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d77sr\" (UniqueName: \"kubernetes.io/projected/3e13efc0-b986-4638-ac34-35f3cddc6a02-kube-api-access-d77sr\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e13efc0-b986-4638-ac34-35f3cddc6a02\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.407414 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26823f02-cb94-46e2-ac86-0d0a2ee50c11-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"26823f02-cb94-46e2-ac86-0d0a2ee50c11\") " pod="openstack/nova-metadata-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.407443 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e13efc0-b986-4638-ac34-35f3cddc6a02-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e13efc0-b986-4638-ac34-35f3cddc6a02\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 19:46:30 crc kubenswrapper[4750]: 
I1008 19:46:30.407485 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26823f02-cb94-46e2-ac86-0d0a2ee50c11-config-data\") pod \"nova-metadata-0\" (UID: \"26823f02-cb94-46e2-ac86-0d0a2ee50c11\") " pod="openstack/nova-metadata-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.407509 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjrkx\" (UniqueName: \"kubernetes.io/projected/26823f02-cb94-46e2-ac86-0d0a2ee50c11-kube-api-access-tjrkx\") pod \"nova-metadata-0\" (UID: \"26823f02-cb94-46e2-ac86-0d0a2ee50c11\") " pod="openstack/nova-metadata-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.407709 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.413263 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e13efc0-b986-4638-ac34-35f3cddc6a02-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e13efc0-b986-4638-ac34-35f3cddc6a02\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.432253 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e13efc0-b986-4638-ac34-35f3cddc6a02-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e13efc0-b986-4638-ac34-35f3cddc6a02\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.440224 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.450397 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d77sr\" (UniqueName: 
\"kubernetes.io/projected/3e13efc0-b986-4638-ac34-35f3cddc6a02-kube-api-access-d77sr\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e13efc0-b986-4638-ac34-35f3cddc6a02\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.456746 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.468774 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-864fd79b75-qq86v"] Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.470429 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-864fd79b75-qq86v" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.509157 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-864fd79b75-qq86v"] Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.514849 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26823f02-cb94-46e2-ac86-0d0a2ee50c11-logs\") pod \"nova-metadata-0\" (UID: \"26823f02-cb94-46e2-ac86-0d0a2ee50c11\") " pod="openstack/nova-metadata-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.515822 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c959de61-3bdb-4787-976f-4da974423d1a-config-data\") pod \"nova-scheduler-0\" (UID: \"c959de61-3bdb-4787-976f-4da974423d1a\") " pod="openstack/nova-scheduler-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.515933 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2zjc\" (UniqueName: \"kubernetes.io/projected/11f2aae2-99f9-43da-b706-e000f1414558-kube-api-access-q2zjc\") pod \"dnsmasq-dns-864fd79b75-qq86v\" (UID: \"11f2aae2-99f9-43da-b706-e000f1414558\") " 
pod="openstack/dnsmasq-dns-864fd79b75-qq86v" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.516067 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k586h\" (UniqueName: \"kubernetes.io/projected/c959de61-3bdb-4787-976f-4da974423d1a-kube-api-access-k586h\") pod \"nova-scheduler-0\" (UID: \"c959de61-3bdb-4787-976f-4da974423d1a\") " pod="openstack/nova-scheduler-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.516144 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c959de61-3bdb-4787-976f-4da974423d1a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c959de61-3bdb-4787-976f-4da974423d1a\") " pod="openstack/nova-scheduler-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.516237 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11f2aae2-99f9-43da-b706-e000f1414558-dns-svc\") pod \"dnsmasq-dns-864fd79b75-qq86v\" (UID: \"11f2aae2-99f9-43da-b706-e000f1414558\") " pod="openstack/dnsmasq-dns-864fd79b75-qq86v" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.516321 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11f2aae2-99f9-43da-b706-e000f1414558-ovsdbserver-sb\") pod \"dnsmasq-dns-864fd79b75-qq86v\" (UID: \"11f2aae2-99f9-43da-b706-e000f1414558\") " pod="openstack/dnsmasq-dns-864fd79b75-qq86v" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.516396 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11f2aae2-99f9-43da-b706-e000f1414558-ovsdbserver-nb\") pod \"dnsmasq-dns-864fd79b75-qq86v\" (UID: \"11f2aae2-99f9-43da-b706-e000f1414558\") " 
pod="openstack/dnsmasq-dns-864fd79b75-qq86v" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.516500 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26823f02-cb94-46e2-ac86-0d0a2ee50c11-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"26823f02-cb94-46e2-ac86-0d0a2ee50c11\") " pod="openstack/nova-metadata-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.516611 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11f2aae2-99f9-43da-b706-e000f1414558-config\") pod \"dnsmasq-dns-864fd79b75-qq86v\" (UID: \"11f2aae2-99f9-43da-b706-e000f1414558\") " pod="openstack/dnsmasq-dns-864fd79b75-qq86v" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.516722 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26823f02-cb94-46e2-ac86-0d0a2ee50c11-config-data\") pod \"nova-metadata-0\" (UID: \"26823f02-cb94-46e2-ac86-0d0a2ee50c11\") " pod="openstack/nova-metadata-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.516798 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjrkx\" (UniqueName: \"kubernetes.io/projected/26823f02-cb94-46e2-ac86-0d0a2ee50c11-kube-api-access-tjrkx\") pod \"nova-metadata-0\" (UID: \"26823f02-cb94-46e2-ac86-0d0a2ee50c11\") " pod="openstack/nova-metadata-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.517735 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26823f02-cb94-46e2-ac86-0d0a2ee50c11-logs\") pod \"nova-metadata-0\" (UID: \"26823f02-cb94-46e2-ac86-0d0a2ee50c11\") " pod="openstack/nova-metadata-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.523523 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26823f02-cb94-46e2-ac86-0d0a2ee50c11-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"26823f02-cb94-46e2-ac86-0d0a2ee50c11\") " pod="openstack/nova-metadata-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.535399 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26823f02-cb94-46e2-ac86-0d0a2ee50c11-config-data\") pod \"nova-metadata-0\" (UID: \"26823f02-cb94-46e2-ac86-0d0a2ee50c11\") " pod="openstack/nova-metadata-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.547776 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjrkx\" (UniqueName: \"kubernetes.io/projected/26823f02-cb94-46e2-ac86-0d0a2ee50c11-kube-api-access-tjrkx\") pod \"nova-metadata-0\" (UID: \"26823f02-cb94-46e2-ac86-0d0a2ee50c11\") " pod="openstack/nova-metadata-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.609275 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.619477 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c959de61-3bdb-4787-976f-4da974423d1a-config-data\") pod \"nova-scheduler-0\" (UID: \"c959de61-3bdb-4787-976f-4da974423d1a\") " pod="openstack/nova-scheduler-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.619523 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2zjc\" (UniqueName: \"kubernetes.io/projected/11f2aae2-99f9-43da-b706-e000f1414558-kube-api-access-q2zjc\") pod \"dnsmasq-dns-864fd79b75-qq86v\" (UID: \"11f2aae2-99f9-43da-b706-e000f1414558\") " pod="openstack/dnsmasq-dns-864fd79b75-qq86v" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.619599 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k586h\" (UniqueName: \"kubernetes.io/projected/c959de61-3bdb-4787-976f-4da974423d1a-kube-api-access-k586h\") pod \"nova-scheduler-0\" (UID: \"c959de61-3bdb-4787-976f-4da974423d1a\") " pod="openstack/nova-scheduler-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.619618 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c959de61-3bdb-4787-976f-4da974423d1a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c959de61-3bdb-4787-976f-4da974423d1a\") " pod="openstack/nova-scheduler-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.619649 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11f2aae2-99f9-43da-b706-e000f1414558-dns-svc\") pod \"dnsmasq-dns-864fd79b75-qq86v\" (UID: \"11f2aae2-99f9-43da-b706-e000f1414558\") " pod="openstack/dnsmasq-dns-864fd79b75-qq86v" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 
19:46:30.619675 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11f2aae2-99f9-43da-b706-e000f1414558-ovsdbserver-sb\") pod \"dnsmasq-dns-864fd79b75-qq86v\" (UID: \"11f2aae2-99f9-43da-b706-e000f1414558\") " pod="openstack/dnsmasq-dns-864fd79b75-qq86v" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.619696 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11f2aae2-99f9-43da-b706-e000f1414558-ovsdbserver-nb\") pod \"dnsmasq-dns-864fd79b75-qq86v\" (UID: \"11f2aae2-99f9-43da-b706-e000f1414558\") " pod="openstack/dnsmasq-dns-864fd79b75-qq86v" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.619766 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11f2aae2-99f9-43da-b706-e000f1414558-config\") pod \"dnsmasq-dns-864fd79b75-qq86v\" (UID: \"11f2aae2-99f9-43da-b706-e000f1414558\") " pod="openstack/dnsmasq-dns-864fd79b75-qq86v" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.621504 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11f2aae2-99f9-43da-b706-e000f1414558-config\") pod \"dnsmasq-dns-864fd79b75-qq86v\" (UID: \"11f2aae2-99f9-43da-b706-e000f1414558\") " pod="openstack/dnsmasq-dns-864fd79b75-qq86v" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.622325 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11f2aae2-99f9-43da-b706-e000f1414558-dns-svc\") pod \"dnsmasq-dns-864fd79b75-qq86v\" (UID: \"11f2aae2-99f9-43da-b706-e000f1414558\") " pod="openstack/dnsmasq-dns-864fd79b75-qq86v" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.623131 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/11f2aae2-99f9-43da-b706-e000f1414558-ovsdbserver-sb\") pod \"dnsmasq-dns-864fd79b75-qq86v\" (UID: \"11f2aae2-99f9-43da-b706-e000f1414558\") " pod="openstack/dnsmasq-dns-864fd79b75-qq86v" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.630651 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c959de61-3bdb-4787-976f-4da974423d1a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c959de61-3bdb-4787-976f-4da974423d1a\") " pod="openstack/nova-scheduler-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.634043 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c959de61-3bdb-4787-976f-4da974423d1a-config-data\") pod \"nova-scheduler-0\" (UID: \"c959de61-3bdb-4787-976f-4da974423d1a\") " pod="openstack/nova-scheduler-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.639776 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11f2aae2-99f9-43da-b706-e000f1414558-ovsdbserver-nb\") pod \"dnsmasq-dns-864fd79b75-qq86v\" (UID: \"11f2aae2-99f9-43da-b706-e000f1414558\") " pod="openstack/dnsmasq-dns-864fd79b75-qq86v" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.645685 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k586h\" (UniqueName: \"kubernetes.io/projected/c959de61-3bdb-4787-976f-4da974423d1a-kube-api-access-k586h\") pod \"nova-scheduler-0\" (UID: \"c959de61-3bdb-4787-976f-4da974423d1a\") " pod="openstack/nova-scheduler-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.656449 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2zjc\" (UniqueName: \"kubernetes.io/projected/11f2aae2-99f9-43da-b706-e000f1414558-kube-api-access-q2zjc\") pod \"dnsmasq-dns-864fd79b75-qq86v\" (UID: 
\"11f2aae2-99f9-43da-b706-e000f1414558\") " pod="openstack/dnsmasq-dns-864fd79b75-qq86v" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.771055 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.809093 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 19:46:30 crc kubenswrapper[4750]: I1008 19:46:30.819895 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-864fd79b75-qq86v" Oct 08 19:46:31 crc kubenswrapper[4750]: I1008 19:46:31.088257 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-x9l8h"] Oct 08 19:46:31 crc kubenswrapper[4750]: I1008 19:46:31.108792 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 19:46:31 crc kubenswrapper[4750]: W1008 19:46:31.153160 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d0e5c35_3529_4151_8585_cd42d5b114af.slice/crio-a26c0332905ab4d0cc107b36937c4debce9c12c753d490793c07e17da1f179eb WatchSource:0}: Error finding container a26c0332905ab4d0cc107b36937c4debce9c12c753d490793c07e17da1f179eb: Status 404 returned error can't find the container with id a26c0332905ab4d0cc107b36937c4debce9c12c753d490793c07e17da1f179eb Oct 08 19:46:31 crc kubenswrapper[4750]: I1008 19:46:31.161291 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 19:46:31 crc kubenswrapper[4750]: W1008 19:46:31.182401 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e13efc0_b986_4638_ac34_35f3cddc6a02.slice/crio-d7752f0494ec8643887e37ee355d475081ac4a67339ef3177e01013a1c027374 WatchSource:0}: Error finding container 
d7752f0494ec8643887e37ee355d475081ac4a67339ef3177e01013a1c027374: Status 404 returned error can't find the container with id d7752f0494ec8643887e37ee355d475081ac4a67339ef3177e01013a1c027374 Oct 08 19:46:31 crc kubenswrapper[4750]: I1008 19:46:31.633804 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 19:46:31 crc kubenswrapper[4750]: I1008 19:46:31.650976 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-864fd79b75-qq86v"] Oct 08 19:46:31 crc kubenswrapper[4750]: I1008 19:46:31.660100 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 19:46:31 crc kubenswrapper[4750]: W1008 19:46:31.668693 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc959de61_3bdb_4787_976f_4da974423d1a.slice/crio-3760b5c4940a7aa5c8453d5174fd387f4527c7fe1f62ae00b2fbb39cd9e1a35f WatchSource:0}: Error finding container 3760b5c4940a7aa5c8453d5174fd387f4527c7fe1f62ae00b2fbb39cd9e1a35f: Status 404 returned error can't find the container with id 3760b5c4940a7aa5c8453d5174fd387f4527c7fe1f62ae00b2fbb39cd9e1a35f Oct 08 19:46:31 crc kubenswrapper[4750]: I1008 19:46:31.854007 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-284vj"] Oct 08 19:46:31 crc kubenswrapper[4750]: I1008 19:46:31.855340 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-284vj" Oct 08 19:46:31 crc kubenswrapper[4750]: I1008 19:46:31.860078 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 08 19:46:31 crc kubenswrapper[4750]: I1008 19:46:31.860631 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 08 19:46:31 crc kubenswrapper[4750]: I1008 19:46:31.885308 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-284vj"] Oct 08 19:46:31 crc kubenswrapper[4750]: I1008 19:46:31.951892 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864fd79b75-qq86v" event={"ID":"11f2aae2-99f9-43da-b706-e000f1414558","Type":"ContainerStarted","Data":"eb2150661a24702de7e2c7a20de2fc1c6d18be853e17a514797082508ea2a286"} Oct 08 19:46:31 crc kubenswrapper[4750]: I1008 19:46:31.958097 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z65xl\" (UniqueName: \"kubernetes.io/projected/d4231f7e-ad8e-42f5-bbd9-b8d6df082070-kube-api-access-z65xl\") pod \"nova-cell1-conductor-db-sync-284vj\" (UID: \"d4231f7e-ad8e-42f5-bbd9-b8d6df082070\") " pod="openstack/nova-cell1-conductor-db-sync-284vj" Oct 08 19:46:31 crc kubenswrapper[4750]: I1008 19:46:31.958215 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4231f7e-ad8e-42f5-bbd9-b8d6df082070-scripts\") pod \"nova-cell1-conductor-db-sync-284vj\" (UID: \"d4231f7e-ad8e-42f5-bbd9-b8d6df082070\") " pod="openstack/nova-cell1-conductor-db-sync-284vj" Oct 08 19:46:31 crc kubenswrapper[4750]: I1008 19:46:31.958253 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d4231f7e-ad8e-42f5-bbd9-b8d6df082070-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-284vj\" (UID: \"d4231f7e-ad8e-42f5-bbd9-b8d6df082070\") " pod="openstack/nova-cell1-conductor-db-sync-284vj" Oct 08 19:46:31 crc kubenswrapper[4750]: I1008 19:46:31.958748 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4231f7e-ad8e-42f5-bbd9-b8d6df082070-config-data\") pod \"nova-cell1-conductor-db-sync-284vj\" (UID: \"d4231f7e-ad8e-42f5-bbd9-b8d6df082070\") " pod="openstack/nova-cell1-conductor-db-sync-284vj" Oct 08 19:46:31 crc kubenswrapper[4750]: I1008 19:46:31.960349 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3e13efc0-b986-4638-ac34-35f3cddc6a02","Type":"ContainerStarted","Data":"f368b41476e2bd13663b0886c95c633fbcda9f0712ecd69eb945e9c8de2742c7"} Oct 08 19:46:31 crc kubenswrapper[4750]: I1008 19:46:31.960411 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3e13efc0-b986-4638-ac34-35f3cddc6a02","Type":"ContainerStarted","Data":"d7752f0494ec8643887e37ee355d475081ac4a67339ef3177e01013a1c027374"} Oct 08 19:46:31 crc kubenswrapper[4750]: I1008 19:46:31.987174 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2d0e5c35-3529-4151-8585-cd42d5b114af","Type":"ContainerStarted","Data":"9499795309e3b647fc4b973f30ddf29378d5b21e0eb4a39a4eb41ccda70480a6"} Oct 08 19:46:31 crc kubenswrapper[4750]: I1008 19:46:31.987236 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2d0e5c35-3529-4151-8585-cd42d5b114af","Type":"ContainerStarted","Data":"811a8d3fe25b3dcba2201b9e995712622660b450539d0c9e157e172dfdabb292"} Oct 08 19:46:31 crc kubenswrapper[4750]: I1008 19:46:31.987246 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"2d0e5c35-3529-4151-8585-cd42d5b114af","Type":"ContainerStarted","Data":"a26c0332905ab4d0cc107b36937c4debce9c12c753d490793c07e17da1f179eb"} Oct 08 19:46:32 crc kubenswrapper[4750]: I1008 19:46:32.000530 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.000470974 podStartE2EDuration="2.000470974s" podCreationTimestamp="2025-10-08 19:46:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:46:31.985354328 +0000 UTC m=+5747.898325341" watchObservedRunningTime="2025-10-08 19:46:32.000470974 +0000 UTC m=+5747.913441997" Oct 08 19:46:32 crc kubenswrapper[4750]: I1008 19:46:32.002156 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-x9l8h" event={"ID":"23216996-77d2-40e7-a29c-b43247b1fb15","Type":"ContainerStarted","Data":"0f734e41bc4c2c9759590ec271334e797ef292637184bc6f1b6393a5002293fd"} Oct 08 19:46:32 crc kubenswrapper[4750]: I1008 19:46:32.002239 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-x9l8h" event={"ID":"23216996-77d2-40e7-a29c-b43247b1fb15","Type":"ContainerStarted","Data":"e6968771561b8666726c3d82f6ef9fe775052e7a559af6a428479a078a91b4e3"} Oct 08 19:46:32 crc kubenswrapper[4750]: I1008 19:46:32.028325 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c959de61-3bdb-4787-976f-4da974423d1a","Type":"ContainerStarted","Data":"3760b5c4940a7aa5c8453d5174fd387f4527c7fe1f62ae00b2fbb39cd9e1a35f"} Oct 08 19:46:32 crc kubenswrapper[4750]: I1008 19:46:32.040301 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"26823f02-cb94-46e2-ac86-0d0a2ee50c11","Type":"ContainerStarted","Data":"2ee27e7d46ed6497053e272ee09c78c203913d016606de7fb85c8ff5f223884e"} Oct 08 19:46:32 crc kubenswrapper[4750]: 
I1008 19:46:32.040361 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"26823f02-cb94-46e2-ac86-0d0a2ee50c11","Type":"ContainerStarted","Data":"6757359bdcdf5779add44842e4883cc689403e158aab7d3241f9b73459e18d1a"} Oct 08 19:46:32 crc kubenswrapper[4750]: I1008 19:46:32.046853 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.046827579 podStartE2EDuration="2.046827579s" podCreationTimestamp="2025-10-08 19:46:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:46:32.024473723 +0000 UTC m=+5747.937444736" watchObservedRunningTime="2025-10-08 19:46:32.046827579 +0000 UTC m=+5747.959798592" Oct 08 19:46:32 crc kubenswrapper[4750]: I1008 19:46:32.052067 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-x9l8h" podStartSLOduration=3.052048439 podStartE2EDuration="3.052048439s" podCreationTimestamp="2025-10-08 19:46:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:46:32.04321809 +0000 UTC m=+5747.956189113" watchObservedRunningTime="2025-10-08 19:46:32.052048439 +0000 UTC m=+5747.965019452" Oct 08 19:46:32 crc kubenswrapper[4750]: I1008 19:46:32.061133 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4231f7e-ad8e-42f5-bbd9-b8d6df082070-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-284vj\" (UID: \"d4231f7e-ad8e-42f5-bbd9-b8d6df082070\") " pod="openstack/nova-cell1-conductor-db-sync-284vj" Oct 08 19:46:32 crc kubenswrapper[4750]: I1008 19:46:32.061412 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d4231f7e-ad8e-42f5-bbd9-b8d6df082070-config-data\") pod \"nova-cell1-conductor-db-sync-284vj\" (UID: \"d4231f7e-ad8e-42f5-bbd9-b8d6df082070\") " pod="openstack/nova-cell1-conductor-db-sync-284vj" Oct 08 19:46:32 crc kubenswrapper[4750]: I1008 19:46:32.061646 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z65xl\" (UniqueName: \"kubernetes.io/projected/d4231f7e-ad8e-42f5-bbd9-b8d6df082070-kube-api-access-z65xl\") pod \"nova-cell1-conductor-db-sync-284vj\" (UID: \"d4231f7e-ad8e-42f5-bbd9-b8d6df082070\") " pod="openstack/nova-cell1-conductor-db-sync-284vj" Oct 08 19:46:32 crc kubenswrapper[4750]: I1008 19:46:32.061837 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4231f7e-ad8e-42f5-bbd9-b8d6df082070-scripts\") pod \"nova-cell1-conductor-db-sync-284vj\" (UID: \"d4231f7e-ad8e-42f5-bbd9-b8d6df082070\") " pod="openstack/nova-cell1-conductor-db-sync-284vj" Oct 08 19:46:32 crc kubenswrapper[4750]: I1008 19:46:32.070993 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4231f7e-ad8e-42f5-bbd9-b8d6df082070-scripts\") pod \"nova-cell1-conductor-db-sync-284vj\" (UID: \"d4231f7e-ad8e-42f5-bbd9-b8d6df082070\") " pod="openstack/nova-cell1-conductor-db-sync-284vj" Oct 08 19:46:32 crc kubenswrapper[4750]: I1008 19:46:32.071382 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4231f7e-ad8e-42f5-bbd9-b8d6df082070-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-284vj\" (UID: \"d4231f7e-ad8e-42f5-bbd9-b8d6df082070\") " pod="openstack/nova-cell1-conductor-db-sync-284vj" Oct 08 19:46:32 crc kubenswrapper[4750]: I1008 19:46:32.075760 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d4231f7e-ad8e-42f5-bbd9-b8d6df082070-config-data\") pod \"nova-cell1-conductor-db-sync-284vj\" (UID: \"d4231f7e-ad8e-42f5-bbd9-b8d6df082070\") " pod="openstack/nova-cell1-conductor-db-sync-284vj" Oct 08 19:46:32 crc kubenswrapper[4750]: I1008 19:46:32.078948 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.078932499 podStartE2EDuration="2.078932499s" podCreationTimestamp="2025-10-08 19:46:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:46:32.057560557 +0000 UTC m=+5747.970531570" watchObservedRunningTime="2025-10-08 19:46:32.078932499 +0000 UTC m=+5747.991903512" Oct 08 19:46:32 crc kubenswrapper[4750]: I1008 19:46:32.083879 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z65xl\" (UniqueName: \"kubernetes.io/projected/d4231f7e-ad8e-42f5-bbd9-b8d6df082070-kube-api-access-z65xl\") pod \"nova-cell1-conductor-db-sync-284vj\" (UID: \"d4231f7e-ad8e-42f5-bbd9-b8d6df082070\") " pod="openstack/nova-cell1-conductor-db-sync-284vj" Oct 08 19:46:32 crc kubenswrapper[4750]: I1008 19:46:32.197451 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-284vj" Oct 08 19:46:32 crc kubenswrapper[4750]: I1008 19:46:32.681913 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-284vj"] Oct 08 19:46:33 crc kubenswrapper[4750]: I1008 19:46:33.057284 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"26823f02-cb94-46e2-ac86-0d0a2ee50c11","Type":"ContainerStarted","Data":"9176b275306b1e8bb39a9f52fac5376040a1d0d1d803fc22cd35a5b5302a09f1"} Oct 08 19:46:33 crc kubenswrapper[4750]: I1008 19:46:33.063642 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-284vj" event={"ID":"d4231f7e-ad8e-42f5-bbd9-b8d6df082070","Type":"ContainerStarted","Data":"b59449656c49771bb5a6ac16ab013ef0c4a8f47dad5d85c8cd08a99d59b1f166"} Oct 08 19:46:33 crc kubenswrapper[4750]: I1008 19:46:33.063699 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-284vj" event={"ID":"d4231f7e-ad8e-42f5-bbd9-b8d6df082070","Type":"ContainerStarted","Data":"02a46cf01d6cf4b37b01c2eb5d02094c3c6376c8f5770eca4f1bff071f3f6efb"} Oct 08 19:46:33 crc kubenswrapper[4750]: I1008 19:46:33.077007 4750 generic.go:334] "Generic (PLEG): container finished" podID="11f2aae2-99f9-43da-b706-e000f1414558" containerID="fb9628811df103bd14e9eeb2b06a8f6c4c1b89ed3cc5d0f159b96c35aaf66367" exitCode=0 Oct 08 19:46:33 crc kubenswrapper[4750]: I1008 19:46:33.077171 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864fd79b75-qq86v" event={"ID":"11f2aae2-99f9-43da-b706-e000f1414558","Type":"ContainerDied","Data":"fb9628811df103bd14e9eeb2b06a8f6c4c1b89ed3cc5d0f159b96c35aaf66367"} Oct 08 19:46:33 crc kubenswrapper[4750]: I1008 19:46:33.097945 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"c959de61-3bdb-4787-976f-4da974423d1a","Type":"ContainerStarted","Data":"bfde87cd9a19b621248bae72102ce50a5f07677e744ae7efc514d9a06cb2ea13"} Oct 08 19:46:33 crc kubenswrapper[4750]: I1008 19:46:33.099928 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.099899254 podStartE2EDuration="3.099899254s" podCreationTimestamp="2025-10-08 19:46:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:46:33.090437958 +0000 UTC m=+5749.003408991" watchObservedRunningTime="2025-10-08 19:46:33.099899254 +0000 UTC m=+5749.012870267" Oct 08 19:46:33 crc kubenswrapper[4750]: I1008 19:46:33.125507 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-284vj" podStartSLOduration=2.1254813710000002 podStartE2EDuration="2.125481371s" podCreationTimestamp="2025-10-08 19:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:46:33.117719967 +0000 UTC m=+5749.030690990" watchObservedRunningTime="2025-10-08 19:46:33.125481371 +0000 UTC m=+5749.038452394" Oct 08 19:46:34 crc kubenswrapper[4750]: I1008 19:46:34.109302 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864fd79b75-qq86v" event={"ID":"11f2aae2-99f9-43da-b706-e000f1414558","Type":"ContainerStarted","Data":"67d87465196007bd22f34da4339127e60e1abd2ec687ebf984f89098e900d038"} Oct 08 19:46:34 crc kubenswrapper[4750]: I1008 19:46:34.111894 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-864fd79b75-qq86v" Oct 08 19:46:34 crc kubenswrapper[4750]: I1008 19:46:34.142804 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-864fd79b75-qq86v" podStartSLOduration=4.142780533 
podStartE2EDuration="4.142780533s" podCreationTimestamp="2025-10-08 19:46:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:46:34.137926683 +0000 UTC m=+5750.050897716" watchObservedRunningTime="2025-10-08 19:46:34.142780533 +0000 UTC m=+5750.055751546" Oct 08 19:46:35 crc kubenswrapper[4750]: I1008 19:46:35.610316 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 08 19:46:35 crc kubenswrapper[4750]: I1008 19:46:35.772349 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 08 19:46:35 crc kubenswrapper[4750]: I1008 19:46:35.772933 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 08 19:46:35 crc kubenswrapper[4750]: I1008 19:46:35.810765 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 08 19:46:36 crc kubenswrapper[4750]: I1008 19:46:36.134086 4750 generic.go:334] "Generic (PLEG): container finished" podID="d4231f7e-ad8e-42f5-bbd9-b8d6df082070" containerID="b59449656c49771bb5a6ac16ab013ef0c4a8f47dad5d85c8cd08a99d59b1f166" exitCode=0 Oct 08 19:46:36 crc kubenswrapper[4750]: I1008 19:46:36.134144 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-284vj" event={"ID":"d4231f7e-ad8e-42f5-bbd9-b8d6df082070","Type":"ContainerDied","Data":"b59449656c49771bb5a6ac16ab013ef0c4a8f47dad5d85c8cd08a99d59b1f166"} Oct 08 19:46:37 crc kubenswrapper[4750]: I1008 19:46:37.149779 4750 generic.go:334] "Generic (PLEG): container finished" podID="23216996-77d2-40e7-a29c-b43247b1fb15" containerID="0f734e41bc4c2c9759590ec271334e797ef292637184bc6f1b6393a5002293fd" exitCode=0 Oct 08 19:46:37 crc kubenswrapper[4750]: I1008 19:46:37.149936 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-cell-mapping-x9l8h" event={"ID":"23216996-77d2-40e7-a29c-b43247b1fb15","Type":"ContainerDied","Data":"0f734e41bc4c2c9759590ec271334e797ef292637184bc6f1b6393a5002293fd"} Oct 08 19:46:37 crc kubenswrapper[4750]: I1008 19:46:37.628379 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-284vj" Oct 08 19:46:37 crc kubenswrapper[4750]: I1008 19:46:37.726767 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4231f7e-ad8e-42f5-bbd9-b8d6df082070-scripts\") pod \"d4231f7e-ad8e-42f5-bbd9-b8d6df082070\" (UID: \"d4231f7e-ad8e-42f5-bbd9-b8d6df082070\") " Oct 08 19:46:37 crc kubenswrapper[4750]: I1008 19:46:37.726865 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4231f7e-ad8e-42f5-bbd9-b8d6df082070-combined-ca-bundle\") pod \"d4231f7e-ad8e-42f5-bbd9-b8d6df082070\" (UID: \"d4231f7e-ad8e-42f5-bbd9-b8d6df082070\") " Oct 08 19:46:37 crc kubenswrapper[4750]: I1008 19:46:37.727078 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z65xl\" (UniqueName: \"kubernetes.io/projected/d4231f7e-ad8e-42f5-bbd9-b8d6df082070-kube-api-access-z65xl\") pod \"d4231f7e-ad8e-42f5-bbd9-b8d6df082070\" (UID: \"d4231f7e-ad8e-42f5-bbd9-b8d6df082070\") " Oct 08 19:46:37 crc kubenswrapper[4750]: I1008 19:46:37.727150 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4231f7e-ad8e-42f5-bbd9-b8d6df082070-config-data\") pod \"d4231f7e-ad8e-42f5-bbd9-b8d6df082070\" (UID: \"d4231f7e-ad8e-42f5-bbd9-b8d6df082070\") " Oct 08 19:46:37 crc kubenswrapper[4750]: I1008 19:46:37.735604 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/d4231f7e-ad8e-42f5-bbd9-b8d6df082070-scripts" (OuterVolumeSpecName: "scripts") pod "d4231f7e-ad8e-42f5-bbd9-b8d6df082070" (UID: "d4231f7e-ad8e-42f5-bbd9-b8d6df082070"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:46:37 crc kubenswrapper[4750]: I1008 19:46:37.736503 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4231f7e-ad8e-42f5-bbd9-b8d6df082070-kube-api-access-z65xl" (OuterVolumeSpecName: "kube-api-access-z65xl") pod "d4231f7e-ad8e-42f5-bbd9-b8d6df082070" (UID: "d4231f7e-ad8e-42f5-bbd9-b8d6df082070"). InnerVolumeSpecName "kube-api-access-z65xl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:46:37 crc kubenswrapper[4750]: I1008 19:46:37.762877 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4231f7e-ad8e-42f5-bbd9-b8d6df082070-config-data" (OuterVolumeSpecName: "config-data") pod "d4231f7e-ad8e-42f5-bbd9-b8d6df082070" (UID: "d4231f7e-ad8e-42f5-bbd9-b8d6df082070"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:46:37 crc kubenswrapper[4750]: I1008 19:46:37.765185 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4231f7e-ad8e-42f5-bbd9-b8d6df082070-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d4231f7e-ad8e-42f5-bbd9-b8d6df082070" (UID: "d4231f7e-ad8e-42f5-bbd9-b8d6df082070"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:46:37 crc kubenswrapper[4750]: I1008 19:46:37.829709 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4231f7e-ad8e-42f5-bbd9-b8d6df082070-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 19:46:37 crc kubenswrapper[4750]: I1008 19:46:37.829779 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4231f7e-ad8e-42f5-bbd9-b8d6df082070-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 19:46:37 crc kubenswrapper[4750]: I1008 19:46:37.829794 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z65xl\" (UniqueName: \"kubernetes.io/projected/d4231f7e-ad8e-42f5-bbd9-b8d6df082070-kube-api-access-z65xl\") on node \"crc\" DevicePath \"\"" Oct 08 19:46:37 crc kubenswrapper[4750]: I1008 19:46:37.829806 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4231f7e-ad8e-42f5-bbd9-b8d6df082070-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 19:46:38 crc kubenswrapper[4750]: I1008 19:46:38.165856 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-284vj" Oct 08 19:46:38 crc kubenswrapper[4750]: I1008 19:46:38.166225 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-284vj" event={"ID":"d4231f7e-ad8e-42f5-bbd9-b8d6df082070","Type":"ContainerDied","Data":"02a46cf01d6cf4b37b01c2eb5d02094c3c6376c8f5770eca4f1bff071f3f6efb"} Oct 08 19:46:38 crc kubenswrapper[4750]: I1008 19:46:38.166535 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02a46cf01d6cf4b37b01c2eb5d02094c3c6376c8f5770eca4f1bff071f3f6efb" Oct 08 19:46:38 crc kubenswrapper[4750]: I1008 19:46:38.271875 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 08 19:46:38 crc kubenswrapper[4750]: E1008 19:46:38.272570 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4231f7e-ad8e-42f5-bbd9-b8d6df082070" containerName="nova-cell1-conductor-db-sync" Oct 08 19:46:38 crc kubenswrapper[4750]: I1008 19:46:38.272596 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4231f7e-ad8e-42f5-bbd9-b8d6df082070" containerName="nova-cell1-conductor-db-sync" Oct 08 19:46:38 crc kubenswrapper[4750]: I1008 19:46:38.272900 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4231f7e-ad8e-42f5-bbd9-b8d6df082070" containerName="nova-cell1-conductor-db-sync" Oct 08 19:46:38 crc kubenswrapper[4750]: I1008 19:46:38.273820 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 08 19:46:38 crc kubenswrapper[4750]: I1008 19:46:38.276217 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 08 19:46:38 crc kubenswrapper[4750]: I1008 19:46:38.291189 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 08 19:46:38 crc kubenswrapper[4750]: I1008 19:46:38.342910 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a8c7c05-0b87-4059-9184-111d44c1e83b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"5a8c7c05-0b87-4059-9184-111d44c1e83b\") " pod="openstack/nova-cell1-conductor-0" Oct 08 19:46:38 crc kubenswrapper[4750]: I1008 19:46:38.342975 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhqqn\" (UniqueName: \"kubernetes.io/projected/5a8c7c05-0b87-4059-9184-111d44c1e83b-kube-api-access-mhqqn\") pod \"nova-cell1-conductor-0\" (UID: \"5a8c7c05-0b87-4059-9184-111d44c1e83b\") " pod="openstack/nova-cell1-conductor-0" Oct 08 19:46:38 crc kubenswrapper[4750]: I1008 19:46:38.343323 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a8c7c05-0b87-4059-9184-111d44c1e83b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"5a8c7c05-0b87-4059-9184-111d44c1e83b\") " pod="openstack/nova-cell1-conductor-0" Oct 08 19:46:38 crc kubenswrapper[4750]: I1008 19:46:38.446305 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a8c7c05-0b87-4059-9184-111d44c1e83b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"5a8c7c05-0b87-4059-9184-111d44c1e83b\") " pod="openstack/nova-cell1-conductor-0" Oct 08 19:46:38 crc 
kubenswrapper[4750]: I1008 19:46:38.446380 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhqqn\" (UniqueName: \"kubernetes.io/projected/5a8c7c05-0b87-4059-9184-111d44c1e83b-kube-api-access-mhqqn\") pod \"nova-cell1-conductor-0\" (UID: \"5a8c7c05-0b87-4059-9184-111d44c1e83b\") " pod="openstack/nova-cell1-conductor-0" Oct 08 19:46:38 crc kubenswrapper[4750]: I1008 19:46:38.446481 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a8c7c05-0b87-4059-9184-111d44c1e83b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"5a8c7c05-0b87-4059-9184-111d44c1e83b\") " pod="openstack/nova-cell1-conductor-0" Oct 08 19:46:38 crc kubenswrapper[4750]: I1008 19:46:38.456695 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a8c7c05-0b87-4059-9184-111d44c1e83b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"5a8c7c05-0b87-4059-9184-111d44c1e83b\") " pod="openstack/nova-cell1-conductor-0" Oct 08 19:46:38 crc kubenswrapper[4750]: I1008 19:46:38.460317 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a8c7c05-0b87-4059-9184-111d44c1e83b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"5a8c7c05-0b87-4059-9184-111d44c1e83b\") " pod="openstack/nova-cell1-conductor-0" Oct 08 19:46:38 crc kubenswrapper[4750]: I1008 19:46:38.479367 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhqqn\" (UniqueName: \"kubernetes.io/projected/5a8c7c05-0b87-4059-9184-111d44c1e83b-kube-api-access-mhqqn\") pod \"nova-cell1-conductor-0\" (UID: \"5a8c7c05-0b87-4059-9184-111d44c1e83b\") " pod="openstack/nova-cell1-conductor-0" Oct 08 19:46:38 crc kubenswrapper[4750]: I1008 19:46:38.592666 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 08 19:46:38 crc kubenswrapper[4750]: I1008 19:46:38.724805 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-x9l8h" Oct 08 19:46:38 crc kubenswrapper[4750]: I1008 19:46:38.854162 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23216996-77d2-40e7-a29c-b43247b1fb15-combined-ca-bundle\") pod \"23216996-77d2-40e7-a29c-b43247b1fb15\" (UID: \"23216996-77d2-40e7-a29c-b43247b1fb15\") " Oct 08 19:46:38 crc kubenswrapper[4750]: I1008 19:46:38.854490 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jhxl\" (UniqueName: \"kubernetes.io/projected/23216996-77d2-40e7-a29c-b43247b1fb15-kube-api-access-6jhxl\") pod \"23216996-77d2-40e7-a29c-b43247b1fb15\" (UID: \"23216996-77d2-40e7-a29c-b43247b1fb15\") " Oct 08 19:46:38 crc kubenswrapper[4750]: I1008 19:46:38.854660 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23216996-77d2-40e7-a29c-b43247b1fb15-config-data\") pod \"23216996-77d2-40e7-a29c-b43247b1fb15\" (UID: \"23216996-77d2-40e7-a29c-b43247b1fb15\") " Oct 08 19:46:38 crc kubenswrapper[4750]: I1008 19:46:38.854821 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23216996-77d2-40e7-a29c-b43247b1fb15-scripts\") pod \"23216996-77d2-40e7-a29c-b43247b1fb15\" (UID: \"23216996-77d2-40e7-a29c-b43247b1fb15\") " Oct 08 19:46:38 crc kubenswrapper[4750]: I1008 19:46:38.863901 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23216996-77d2-40e7-a29c-b43247b1fb15-scripts" (OuterVolumeSpecName: "scripts") pod "23216996-77d2-40e7-a29c-b43247b1fb15" (UID: "23216996-77d2-40e7-a29c-b43247b1fb15"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:46:38 crc kubenswrapper[4750]: I1008 19:46:38.878904 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23216996-77d2-40e7-a29c-b43247b1fb15-kube-api-access-6jhxl" (OuterVolumeSpecName: "kube-api-access-6jhxl") pod "23216996-77d2-40e7-a29c-b43247b1fb15" (UID: "23216996-77d2-40e7-a29c-b43247b1fb15"). InnerVolumeSpecName "kube-api-access-6jhxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:46:38 crc kubenswrapper[4750]: I1008 19:46:38.893453 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23216996-77d2-40e7-a29c-b43247b1fb15-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "23216996-77d2-40e7-a29c-b43247b1fb15" (UID: "23216996-77d2-40e7-a29c-b43247b1fb15"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:46:38 crc kubenswrapper[4750]: I1008 19:46:38.917216 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23216996-77d2-40e7-a29c-b43247b1fb15-config-data" (OuterVolumeSpecName: "config-data") pod "23216996-77d2-40e7-a29c-b43247b1fb15" (UID: "23216996-77d2-40e7-a29c-b43247b1fb15"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:46:38 crc kubenswrapper[4750]: I1008 19:46:38.957856 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jhxl\" (UniqueName: \"kubernetes.io/projected/23216996-77d2-40e7-a29c-b43247b1fb15-kube-api-access-6jhxl\") on node \"crc\" DevicePath \"\"" Oct 08 19:46:38 crc kubenswrapper[4750]: I1008 19:46:38.957905 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23216996-77d2-40e7-a29c-b43247b1fb15-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 19:46:38 crc kubenswrapper[4750]: I1008 19:46:38.957919 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23216996-77d2-40e7-a29c-b43247b1fb15-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 19:46:38 crc kubenswrapper[4750]: I1008 19:46:38.957933 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23216996-77d2-40e7-a29c-b43247b1fb15-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 19:46:39 crc kubenswrapper[4750]: I1008 19:46:39.085589 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 08 19:46:39 crc kubenswrapper[4750]: W1008 19:46:39.091603 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a8c7c05_0b87_4059_9184_111d44c1e83b.slice/crio-1c2dfede709f7ba1eaf80ec8dce636fe541e4a9b67234b74c4b3054a9a9a54f4 WatchSource:0}: Error finding container 1c2dfede709f7ba1eaf80ec8dce636fe541e4a9b67234b74c4b3054a9a9a54f4: Status 404 returned error can't find the container with id 1c2dfede709f7ba1eaf80ec8dce636fe541e4a9b67234b74c4b3054a9a9a54f4 Oct 08 19:46:39 crc kubenswrapper[4750]: I1008 19:46:39.180742 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"5a8c7c05-0b87-4059-9184-111d44c1e83b","Type":"ContainerStarted","Data":"1c2dfede709f7ba1eaf80ec8dce636fe541e4a9b67234b74c4b3054a9a9a54f4"} Oct 08 19:46:39 crc kubenswrapper[4750]: I1008 19:46:39.189767 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-x9l8h" event={"ID":"23216996-77d2-40e7-a29c-b43247b1fb15","Type":"ContainerDied","Data":"e6968771561b8666726c3d82f6ef9fe775052e7a559af6a428479a078a91b4e3"} Oct 08 19:46:39 crc kubenswrapper[4750]: I1008 19:46:39.189819 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6968771561b8666726c3d82f6ef9fe775052e7a559af6a428479a078a91b4e3" Oct 08 19:46:39 crc kubenswrapper[4750]: I1008 19:46:39.189846 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-x9l8h" Oct 08 19:46:39 crc kubenswrapper[4750]: I1008 19:46:39.396435 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 08 19:46:39 crc kubenswrapper[4750]: I1008 19:46:39.400341 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2d0e5c35-3529-4151-8585-cd42d5b114af" containerName="nova-api-log" containerID="cri-o://811a8d3fe25b3dcba2201b9e995712622660b450539d0c9e157e172dfdabb292" gracePeriod=30 Oct 08 19:46:39 crc kubenswrapper[4750]: I1008 19:46:39.400466 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2d0e5c35-3529-4151-8585-cd42d5b114af" containerName="nova-api-api" containerID="cri-o://9499795309e3b647fc4b973f30ddf29378d5b21e0eb4a39a4eb41ccda70480a6" gracePeriod=30 Oct 08 19:46:39 crc kubenswrapper[4750]: I1008 19:46:39.461187 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 19:46:39 crc kubenswrapper[4750]: I1008 19:46:39.461467 4750 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-scheduler-0" podUID="c959de61-3bdb-4787-976f-4da974423d1a" containerName="nova-scheduler-scheduler" containerID="cri-o://bfde87cd9a19b621248bae72102ce50a5f07677e744ae7efc514d9a06cb2ea13" gracePeriod=30 Oct 08 19:46:39 crc kubenswrapper[4750]: I1008 19:46:39.477893 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 19:46:39 crc kubenswrapper[4750]: I1008 19:46:39.478254 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="26823f02-cb94-46e2-ac86-0d0a2ee50c11" containerName="nova-metadata-log" containerID="cri-o://2ee27e7d46ed6497053e272ee09c78c203913d016606de7fb85c8ff5f223884e" gracePeriod=30 Oct 08 19:46:39 crc kubenswrapper[4750]: I1008 19:46:39.478372 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="26823f02-cb94-46e2-ac86-0d0a2ee50c11" containerName="nova-metadata-metadata" containerID="cri-o://9176b275306b1e8bb39a9f52fac5376040a1d0d1d803fc22cd35a5b5302a09f1" gracePeriod=30 Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.000142 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.014771 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.089881 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d0e5c35-3529-4151-8585-cd42d5b114af-logs\") pod \"2d0e5c35-3529-4151-8585-cd42d5b114af\" (UID: \"2d0e5c35-3529-4151-8585-cd42d5b114af\") " Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.089972 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d0e5c35-3529-4151-8585-cd42d5b114af-combined-ca-bundle\") pod \"2d0e5c35-3529-4151-8585-cd42d5b114af\" (UID: \"2d0e5c35-3529-4151-8585-cd42d5b114af\") " Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.090026 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26823f02-cb94-46e2-ac86-0d0a2ee50c11-config-data\") pod \"26823f02-cb94-46e2-ac86-0d0a2ee50c11\" (UID: \"26823f02-cb94-46e2-ac86-0d0a2ee50c11\") " Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.090120 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d0e5c35-3529-4151-8585-cd42d5b114af-config-data\") pod \"2d0e5c35-3529-4151-8585-cd42d5b114af\" (UID: \"2d0e5c35-3529-4151-8585-cd42d5b114af\") " Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.090167 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26823f02-cb94-46e2-ac86-0d0a2ee50c11-logs\") pod \"26823f02-cb94-46e2-ac86-0d0a2ee50c11\" (UID: \"26823f02-cb94-46e2-ac86-0d0a2ee50c11\") " Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.090250 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjrkx\" (UniqueName: 
\"kubernetes.io/projected/26823f02-cb94-46e2-ac86-0d0a2ee50c11-kube-api-access-tjrkx\") pod \"26823f02-cb94-46e2-ac86-0d0a2ee50c11\" (UID: \"26823f02-cb94-46e2-ac86-0d0a2ee50c11\") " Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.090304 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26823f02-cb94-46e2-ac86-0d0a2ee50c11-combined-ca-bundle\") pod \"26823f02-cb94-46e2-ac86-0d0a2ee50c11\" (UID: \"26823f02-cb94-46e2-ac86-0d0a2ee50c11\") " Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.090413 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6r7c\" (UniqueName: \"kubernetes.io/projected/2d0e5c35-3529-4151-8585-cd42d5b114af-kube-api-access-k6r7c\") pod \"2d0e5c35-3529-4151-8585-cd42d5b114af\" (UID: \"2d0e5c35-3529-4151-8585-cd42d5b114af\") " Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.093919 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26823f02-cb94-46e2-ac86-0d0a2ee50c11-logs" (OuterVolumeSpecName: "logs") pod "26823f02-cb94-46e2-ac86-0d0a2ee50c11" (UID: "26823f02-cb94-46e2-ac86-0d0a2ee50c11"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.093931 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d0e5c35-3529-4151-8585-cd42d5b114af-logs" (OuterVolumeSpecName: "logs") pod "2d0e5c35-3529-4151-8585-cd42d5b114af" (UID: "2d0e5c35-3529-4151-8585-cd42d5b114af"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.099640 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26823f02-cb94-46e2-ac86-0d0a2ee50c11-kube-api-access-tjrkx" (OuterVolumeSpecName: "kube-api-access-tjrkx") pod "26823f02-cb94-46e2-ac86-0d0a2ee50c11" (UID: "26823f02-cb94-46e2-ac86-0d0a2ee50c11"). InnerVolumeSpecName "kube-api-access-tjrkx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.099759 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d0e5c35-3529-4151-8585-cd42d5b114af-kube-api-access-k6r7c" (OuterVolumeSpecName: "kube-api-access-k6r7c") pod "2d0e5c35-3529-4151-8585-cd42d5b114af" (UID: "2d0e5c35-3529-4151-8585-cd42d5b114af"). InnerVolumeSpecName "kube-api-access-k6r7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.123859 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26823f02-cb94-46e2-ac86-0d0a2ee50c11-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "26823f02-cb94-46e2-ac86-0d0a2ee50c11" (UID: "26823f02-cb94-46e2-ac86-0d0a2ee50c11"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.125675 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26823f02-cb94-46e2-ac86-0d0a2ee50c11-config-data" (OuterVolumeSpecName: "config-data") pod "26823f02-cb94-46e2-ac86-0d0a2ee50c11" (UID: "26823f02-cb94-46e2-ac86-0d0a2ee50c11"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.125970 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d0e5c35-3529-4151-8585-cd42d5b114af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d0e5c35-3529-4151-8585-cd42d5b114af" (UID: "2d0e5c35-3529-4151-8585-cd42d5b114af"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.127602 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d0e5c35-3529-4151-8585-cd42d5b114af-config-data" (OuterVolumeSpecName: "config-data") pod "2d0e5c35-3529-4151-8585-cd42d5b114af" (UID: "2d0e5c35-3529-4151-8585-cd42d5b114af"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.193401 4750 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26823f02-cb94-46e2-ac86-0d0a2ee50c11-logs\") on node \"crc\" DevicePath \"\"" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.193446 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjrkx\" (UniqueName: \"kubernetes.io/projected/26823f02-cb94-46e2-ac86-0d0a2ee50c11-kube-api-access-tjrkx\") on node \"crc\" DevicePath \"\"" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.193457 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26823f02-cb94-46e2-ac86-0d0a2ee50c11-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.193467 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6r7c\" (UniqueName: \"kubernetes.io/projected/2d0e5c35-3529-4151-8585-cd42d5b114af-kube-api-access-k6r7c\") on node \"crc\" 
DevicePath \"\"" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.193476 4750 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d0e5c35-3529-4151-8585-cd42d5b114af-logs\") on node \"crc\" DevicePath \"\"" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.193488 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d0e5c35-3529-4151-8585-cd42d5b114af-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.193498 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26823f02-cb94-46e2-ac86-0d0a2ee50c11-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.193509 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d0e5c35-3529-4151-8585-cd42d5b114af-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.202149 4750 generic.go:334] "Generic (PLEG): container finished" podID="26823f02-cb94-46e2-ac86-0d0a2ee50c11" containerID="9176b275306b1e8bb39a9f52fac5376040a1d0d1d803fc22cd35a5b5302a09f1" exitCode=0 Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.202194 4750 generic.go:334] "Generic (PLEG): container finished" podID="26823f02-cb94-46e2-ac86-0d0a2ee50c11" containerID="2ee27e7d46ed6497053e272ee09c78c203913d016606de7fb85c8ff5f223884e" exitCode=143 Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.202264 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"26823f02-cb94-46e2-ac86-0d0a2ee50c11","Type":"ContainerDied","Data":"9176b275306b1e8bb39a9f52fac5376040a1d0d1d803fc22cd35a5b5302a09f1"} Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.202296 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.202333 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"26823f02-cb94-46e2-ac86-0d0a2ee50c11","Type":"ContainerDied","Data":"2ee27e7d46ed6497053e272ee09c78c203913d016606de7fb85c8ff5f223884e"} Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.202360 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"26823f02-cb94-46e2-ac86-0d0a2ee50c11","Type":"ContainerDied","Data":"6757359bdcdf5779add44842e4883cc689403e158aab7d3241f9b73459e18d1a"} Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.202400 4750 scope.go:117] "RemoveContainer" containerID="9176b275306b1e8bb39a9f52fac5376040a1d0d1d803fc22cd35a5b5302a09f1" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.204139 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"5a8c7c05-0b87-4059-9184-111d44c1e83b","Type":"ContainerStarted","Data":"e1a1a8b77e3fdff45ea66080f6f9034e48092169ea6217c14a1ce1ac9916b5d7"} Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.205576 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.208612 4750 generic.go:334] "Generic (PLEG): container finished" podID="2d0e5c35-3529-4151-8585-cd42d5b114af" containerID="9499795309e3b647fc4b973f30ddf29378d5b21e0eb4a39a4eb41ccda70480a6" exitCode=0 Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.208640 4750 generic.go:334] "Generic (PLEG): container finished" podID="2d0e5c35-3529-4151-8585-cd42d5b114af" containerID="811a8d3fe25b3dcba2201b9e995712622660b450539d0c9e157e172dfdabb292" exitCode=143 Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.208660 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"2d0e5c35-3529-4151-8585-cd42d5b114af","Type":"ContainerDied","Data":"9499795309e3b647fc4b973f30ddf29378d5b21e0eb4a39a4eb41ccda70480a6"} Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.208727 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2d0e5c35-3529-4151-8585-cd42d5b114af","Type":"ContainerDied","Data":"811a8d3fe25b3dcba2201b9e995712622660b450539d0c9e157e172dfdabb292"} Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.208746 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2d0e5c35-3529-4151-8585-cd42d5b114af","Type":"ContainerDied","Data":"a26c0332905ab4d0cc107b36937c4debce9c12c753d490793c07e17da1f179eb"} Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.208849 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.231333 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.231305698 podStartE2EDuration="2.231305698s" podCreationTimestamp="2025-10-08 19:46:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:46:40.22295569 +0000 UTC m=+5756.135926723" watchObservedRunningTime="2025-10-08 19:46:40.231305698 +0000 UTC m=+5756.144276711" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.259747 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.276732 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.286675 4750 scope.go:117] "RemoveContainer" containerID="2ee27e7d46ed6497053e272ee09c78c203913d016606de7fb85c8ff5f223884e" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 
19:46:40.335755 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 08 19:46:40 crc kubenswrapper[4750]: E1008 19:46:40.337406 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d0e5c35-3529-4151-8585-cd42d5b114af" containerName="nova-api-api" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.337449 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d0e5c35-3529-4151-8585-cd42d5b114af" containerName="nova-api-api" Oct 08 19:46:40 crc kubenswrapper[4750]: E1008 19:46:40.337478 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d0e5c35-3529-4151-8585-cd42d5b114af" containerName="nova-api-log" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.337486 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d0e5c35-3529-4151-8585-cd42d5b114af" containerName="nova-api-log" Oct 08 19:46:40 crc kubenswrapper[4750]: E1008 19:46:40.337532 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26823f02-cb94-46e2-ac86-0d0a2ee50c11" containerName="nova-metadata-metadata" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.337540 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="26823f02-cb94-46e2-ac86-0d0a2ee50c11" containerName="nova-metadata-metadata" Oct 08 19:46:40 crc kubenswrapper[4750]: E1008 19:46:40.337574 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26823f02-cb94-46e2-ac86-0d0a2ee50c11" containerName="nova-metadata-log" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.337585 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="26823f02-cb94-46e2-ac86-0d0a2ee50c11" containerName="nova-metadata-log" Oct 08 19:46:40 crc kubenswrapper[4750]: E1008 19:46:40.337615 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23216996-77d2-40e7-a29c-b43247b1fb15" containerName="nova-manage" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.337621 4750 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="23216996-77d2-40e7-a29c-b43247b1fb15" containerName="nova-manage" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.338049 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="26823f02-cb94-46e2-ac86-0d0a2ee50c11" containerName="nova-metadata-metadata" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.338090 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="23216996-77d2-40e7-a29c-b43247b1fb15" containerName="nova-manage" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.338099 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d0e5c35-3529-4151-8585-cd42d5b114af" containerName="nova-api-log" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.338108 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d0e5c35-3529-4151-8585-cd42d5b114af" containerName="nova-api-api" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.338134 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="26823f02-cb94-46e2-ac86-0d0a2ee50c11" containerName="nova-metadata-log" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.340290 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.343928 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.372312 4750 scope.go:117] "RemoveContainer" containerID="9176b275306b1e8bb39a9f52fac5376040a1d0d1d803fc22cd35a5b5302a09f1" Oct 08 19:46:40 crc kubenswrapper[4750]: E1008 19:46:40.373634 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9176b275306b1e8bb39a9f52fac5376040a1d0d1d803fc22cd35a5b5302a09f1\": container with ID starting with 9176b275306b1e8bb39a9f52fac5376040a1d0d1d803fc22cd35a5b5302a09f1 not found: ID does not exist" containerID="9176b275306b1e8bb39a9f52fac5376040a1d0d1d803fc22cd35a5b5302a09f1" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.373926 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9176b275306b1e8bb39a9f52fac5376040a1d0d1d803fc22cd35a5b5302a09f1"} err="failed to get container status \"9176b275306b1e8bb39a9f52fac5376040a1d0d1d803fc22cd35a5b5302a09f1\": rpc error: code = NotFound desc = could not find container \"9176b275306b1e8bb39a9f52fac5376040a1d0d1d803fc22cd35a5b5302a09f1\": container with ID starting with 9176b275306b1e8bb39a9f52fac5376040a1d0d1d803fc22cd35a5b5302a09f1 not found: ID does not exist" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.373957 4750 scope.go:117] "RemoveContainer" containerID="2ee27e7d46ed6497053e272ee09c78c203913d016606de7fb85c8ff5f223884e" Oct 08 19:46:40 crc kubenswrapper[4750]: E1008 19:46:40.374615 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ee27e7d46ed6497053e272ee09c78c203913d016606de7fb85c8ff5f223884e\": container with ID starting with 2ee27e7d46ed6497053e272ee09c78c203913d016606de7fb85c8ff5f223884e 
not found: ID does not exist" containerID="2ee27e7d46ed6497053e272ee09c78c203913d016606de7fb85c8ff5f223884e" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.374653 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ee27e7d46ed6497053e272ee09c78c203913d016606de7fb85c8ff5f223884e"} err="failed to get container status \"2ee27e7d46ed6497053e272ee09c78c203913d016606de7fb85c8ff5f223884e\": rpc error: code = NotFound desc = could not find container \"2ee27e7d46ed6497053e272ee09c78c203913d016606de7fb85c8ff5f223884e\": container with ID starting with 2ee27e7d46ed6497053e272ee09c78c203913d016606de7fb85c8ff5f223884e not found: ID does not exist" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.374671 4750 scope.go:117] "RemoveContainer" containerID="9176b275306b1e8bb39a9f52fac5376040a1d0d1d803fc22cd35a5b5302a09f1" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.374935 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9176b275306b1e8bb39a9f52fac5376040a1d0d1d803fc22cd35a5b5302a09f1"} err="failed to get container status \"9176b275306b1e8bb39a9f52fac5376040a1d0d1d803fc22cd35a5b5302a09f1\": rpc error: code = NotFound desc = could not find container \"9176b275306b1e8bb39a9f52fac5376040a1d0d1d803fc22cd35a5b5302a09f1\": container with ID starting with 9176b275306b1e8bb39a9f52fac5376040a1d0d1d803fc22cd35a5b5302a09f1 not found: ID does not exist" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.374954 4750 scope.go:117] "RemoveContainer" containerID="2ee27e7d46ed6497053e272ee09c78c203913d016606de7fb85c8ff5f223884e" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.375185 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ee27e7d46ed6497053e272ee09c78c203913d016606de7fb85c8ff5f223884e"} err="failed to get container status \"2ee27e7d46ed6497053e272ee09c78c203913d016606de7fb85c8ff5f223884e\": rpc 
error: code = NotFound desc = could not find container \"2ee27e7d46ed6497053e272ee09c78c203913d016606de7fb85c8ff5f223884e\": container with ID starting with 2ee27e7d46ed6497053e272ee09c78c203913d016606de7fb85c8ff5f223884e not found: ID does not exist" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.375203 4750 scope.go:117] "RemoveContainer" containerID="9499795309e3b647fc4b973f30ddf29378d5b21e0eb4a39a4eb41ccda70480a6" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.375347 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.391695 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.403005 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.410996 4750 scope.go:117] "RemoveContainer" containerID="811a8d3fe25b3dcba2201b9e995712622660b450539d0c9e157e172dfdabb292" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.411907 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.413698 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.419119 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.421838 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01da7f48-bcbb-45fc-8c00-249c99acb4e3-config-data\") pod \"nova-metadata-0\" (UID: \"01da7f48-bcbb-45fc-8c00-249c99acb4e3\") " pod="openstack/nova-metadata-0" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.421981 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2t7z\" (UniqueName: \"kubernetes.io/projected/01da7f48-bcbb-45fc-8c00-249c99acb4e3-kube-api-access-g2t7z\") pod \"nova-metadata-0\" (UID: \"01da7f48-bcbb-45fc-8c00-249c99acb4e3\") " pod="openstack/nova-metadata-0" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.422024 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01da7f48-bcbb-45fc-8c00-249c99acb4e3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"01da7f48-bcbb-45fc-8c00-249c99acb4e3\") " pod="openstack/nova-metadata-0" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.422080 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01da7f48-bcbb-45fc-8c00-249c99acb4e3-logs\") pod \"nova-metadata-0\" (UID: \"01da7f48-bcbb-45fc-8c00-249c99acb4e3\") " pod="openstack/nova-metadata-0" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.423603 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.437671 4750 scope.go:117] "RemoveContainer" 
containerID="9499795309e3b647fc4b973f30ddf29378d5b21e0eb4a39a4eb41ccda70480a6" Oct 08 19:46:40 crc kubenswrapper[4750]: E1008 19:46:40.438080 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9499795309e3b647fc4b973f30ddf29378d5b21e0eb4a39a4eb41ccda70480a6\": container with ID starting with 9499795309e3b647fc4b973f30ddf29378d5b21e0eb4a39a4eb41ccda70480a6 not found: ID does not exist" containerID="9499795309e3b647fc4b973f30ddf29378d5b21e0eb4a39a4eb41ccda70480a6" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.438117 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9499795309e3b647fc4b973f30ddf29378d5b21e0eb4a39a4eb41ccda70480a6"} err="failed to get container status \"9499795309e3b647fc4b973f30ddf29378d5b21e0eb4a39a4eb41ccda70480a6\": rpc error: code = NotFound desc = could not find container \"9499795309e3b647fc4b973f30ddf29378d5b21e0eb4a39a4eb41ccda70480a6\": container with ID starting with 9499795309e3b647fc4b973f30ddf29378d5b21e0eb4a39a4eb41ccda70480a6 not found: ID does not exist" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.438144 4750 scope.go:117] "RemoveContainer" containerID="811a8d3fe25b3dcba2201b9e995712622660b450539d0c9e157e172dfdabb292" Oct 08 19:46:40 crc kubenswrapper[4750]: E1008 19:46:40.438358 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"811a8d3fe25b3dcba2201b9e995712622660b450539d0c9e157e172dfdabb292\": container with ID starting with 811a8d3fe25b3dcba2201b9e995712622660b450539d0c9e157e172dfdabb292 not found: ID does not exist" containerID="811a8d3fe25b3dcba2201b9e995712622660b450539d0c9e157e172dfdabb292" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.438385 4750 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"811a8d3fe25b3dcba2201b9e995712622660b450539d0c9e157e172dfdabb292"} err="failed to get container status \"811a8d3fe25b3dcba2201b9e995712622660b450539d0c9e157e172dfdabb292\": rpc error: code = NotFound desc = could not find container \"811a8d3fe25b3dcba2201b9e995712622660b450539d0c9e157e172dfdabb292\": container with ID starting with 811a8d3fe25b3dcba2201b9e995712622660b450539d0c9e157e172dfdabb292 not found: ID does not exist" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.438401 4750 scope.go:117] "RemoveContainer" containerID="9499795309e3b647fc4b973f30ddf29378d5b21e0eb4a39a4eb41ccda70480a6" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.438617 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9499795309e3b647fc4b973f30ddf29378d5b21e0eb4a39a4eb41ccda70480a6"} err="failed to get container status \"9499795309e3b647fc4b973f30ddf29378d5b21e0eb4a39a4eb41ccda70480a6\": rpc error: code = NotFound desc = could not find container \"9499795309e3b647fc4b973f30ddf29378d5b21e0eb4a39a4eb41ccda70480a6\": container with ID starting with 9499795309e3b647fc4b973f30ddf29378d5b21e0eb4a39a4eb41ccda70480a6 not found: ID does not exist" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.438636 4750 scope.go:117] "RemoveContainer" containerID="811a8d3fe25b3dcba2201b9e995712622660b450539d0c9e157e172dfdabb292" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.438851 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"811a8d3fe25b3dcba2201b9e995712622660b450539d0c9e157e172dfdabb292"} err="failed to get container status \"811a8d3fe25b3dcba2201b9e995712622660b450539d0c9e157e172dfdabb292\": rpc error: code = NotFound desc = could not find container \"811a8d3fe25b3dcba2201b9e995712622660b450539d0c9e157e172dfdabb292\": container with ID starting with 811a8d3fe25b3dcba2201b9e995712622660b450539d0c9e157e172dfdabb292 not found: ID does not 
exist" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.523591 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01da7f48-bcbb-45fc-8c00-249c99acb4e3-config-data\") pod \"nova-metadata-0\" (UID: \"01da7f48-bcbb-45fc-8c00-249c99acb4e3\") " pod="openstack/nova-metadata-0" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.523666 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01131122-c4cf-4683-a2ad-18028e12e3cb-config-data\") pod \"nova-api-0\" (UID: \"01131122-c4cf-4683-a2ad-18028e12e3cb\") " pod="openstack/nova-api-0" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.523703 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2t7z\" (UniqueName: \"kubernetes.io/projected/01da7f48-bcbb-45fc-8c00-249c99acb4e3-kube-api-access-g2t7z\") pod \"nova-metadata-0\" (UID: \"01da7f48-bcbb-45fc-8c00-249c99acb4e3\") " pod="openstack/nova-metadata-0" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.523728 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01da7f48-bcbb-45fc-8c00-249c99acb4e3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"01da7f48-bcbb-45fc-8c00-249c99acb4e3\") " pod="openstack/nova-metadata-0" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.523774 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01da7f48-bcbb-45fc-8c00-249c99acb4e3-logs\") pod \"nova-metadata-0\" (UID: \"01da7f48-bcbb-45fc-8c00-249c99acb4e3\") " pod="openstack/nova-metadata-0" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.523796 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-4rbs7\" (UniqueName: \"kubernetes.io/projected/01131122-c4cf-4683-a2ad-18028e12e3cb-kube-api-access-4rbs7\") pod \"nova-api-0\" (UID: \"01131122-c4cf-4683-a2ad-18028e12e3cb\") " pod="openstack/nova-api-0" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.523825 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01131122-c4cf-4683-a2ad-18028e12e3cb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"01131122-c4cf-4683-a2ad-18028e12e3cb\") " pod="openstack/nova-api-0" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.523846 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01131122-c4cf-4683-a2ad-18028e12e3cb-logs\") pod \"nova-api-0\" (UID: \"01131122-c4cf-4683-a2ad-18028e12e3cb\") " pod="openstack/nova-api-0" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.525336 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01da7f48-bcbb-45fc-8c00-249c99acb4e3-logs\") pod \"nova-metadata-0\" (UID: \"01da7f48-bcbb-45fc-8c00-249c99acb4e3\") " pod="openstack/nova-metadata-0" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.534614 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01da7f48-bcbb-45fc-8c00-249c99acb4e3-config-data\") pod \"nova-metadata-0\" (UID: \"01da7f48-bcbb-45fc-8c00-249c99acb4e3\") " pod="openstack/nova-metadata-0" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.537475 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01da7f48-bcbb-45fc-8c00-249c99acb4e3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"01da7f48-bcbb-45fc-8c00-249c99acb4e3\") " pod="openstack/nova-metadata-0" Oct 08 
19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.543127 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2t7z\" (UniqueName: \"kubernetes.io/projected/01da7f48-bcbb-45fc-8c00-249c99acb4e3-kube-api-access-g2t7z\") pod \"nova-metadata-0\" (UID: \"01da7f48-bcbb-45fc-8c00-249c99acb4e3\") " pod="openstack/nova-metadata-0" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.609570 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.625463 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.625532 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01131122-c4cf-4683-a2ad-18028e12e3cb-config-data\") pod \"nova-api-0\" (UID: \"01131122-c4cf-4683-a2ad-18028e12e3cb\") " pod="openstack/nova-api-0" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.625620 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rbs7\" (UniqueName: \"kubernetes.io/projected/01131122-c4cf-4683-a2ad-18028e12e3cb-kube-api-access-4rbs7\") pod \"nova-api-0\" (UID: \"01131122-c4cf-4683-a2ad-18028e12e3cb\") " pod="openstack/nova-api-0" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.625655 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01131122-c4cf-4683-a2ad-18028e12e3cb-logs\") pod \"nova-api-0\" (UID: \"01131122-c4cf-4683-a2ad-18028e12e3cb\") " pod="openstack/nova-api-0" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.625671 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/01131122-c4cf-4683-a2ad-18028e12e3cb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"01131122-c4cf-4683-a2ad-18028e12e3cb\") " pod="openstack/nova-api-0" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.626706 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01131122-c4cf-4683-a2ad-18028e12e3cb-logs\") pod \"nova-api-0\" (UID: \"01131122-c4cf-4683-a2ad-18028e12e3cb\") " pod="openstack/nova-api-0" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.630981 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01131122-c4cf-4683-a2ad-18028e12e3cb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"01131122-c4cf-4683-a2ad-18028e12e3cb\") " pod="openstack/nova-api-0" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.633379 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01131122-c4cf-4683-a2ad-18028e12e3cb-config-data\") pod \"nova-api-0\" (UID: \"01131122-c4cf-4683-a2ad-18028e12e3cb\") " pod="openstack/nova-api-0" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.646958 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rbs7\" (UniqueName: \"kubernetes.io/projected/01131122-c4cf-4683-a2ad-18028e12e3cb-kube-api-access-4rbs7\") pod \"nova-api-0\" (UID: \"01131122-c4cf-4683-a2ad-18028e12e3cb\") " pod="openstack/nova-api-0" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.674807 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.737544 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.755313 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26823f02-cb94-46e2-ac86-0d0a2ee50c11" path="/var/lib/kubelet/pods/26823f02-cb94-46e2-ac86-0d0a2ee50c11/volumes" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.756440 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d0e5c35-3529-4151-8585-cd42d5b114af" path="/var/lib/kubelet/pods/2d0e5c35-3529-4151-8585-cd42d5b114af/volumes" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.824915 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-864fd79b75-qq86v" Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.907744 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f476f79f5-78nfg"] Oct 08 19:46:40 crc kubenswrapper[4750]: I1008 19:46:40.908390 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6f476f79f5-78nfg" podUID="217c91ff-d9a5-4349-a5a8-36e593581c92" containerName="dnsmasq-dns" containerID="cri-o://b62c940475668778ef6696dae355e3efec8f1f3092dfa5566b06bbfe9eee04d7" gracePeriod=10 Oct 08 19:46:41 crc kubenswrapper[4750]: I1008 19:46:41.236488 4750 generic.go:334] "Generic (PLEG): container finished" podID="217c91ff-d9a5-4349-a5a8-36e593581c92" containerID="b62c940475668778ef6696dae355e3efec8f1f3092dfa5566b06bbfe9eee04d7" exitCode=0 Oct 08 19:46:41 crc kubenswrapper[4750]: I1008 19:46:41.236656 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f476f79f5-78nfg" event={"ID":"217c91ff-d9a5-4349-a5a8-36e593581c92","Type":"ContainerDied","Data":"b62c940475668778ef6696dae355e3efec8f1f3092dfa5566b06bbfe9eee04d7"} Oct 08 19:46:41 crc kubenswrapper[4750]: I1008 19:46:41.272992 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-cell1-novncproxy-0" Oct 08 19:46:41 crc kubenswrapper[4750]: I1008 19:46:41.275069 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 19:46:41 crc kubenswrapper[4750]: W1008 19:46:41.277400 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01da7f48_bcbb_45fc_8c00_249c99acb4e3.slice/crio-de2d03b2c598aadb19345179fbd3880aced9d75857af1478cc296e8220ed7700 WatchSource:0}: Error finding container de2d03b2c598aadb19345179fbd3880aced9d75857af1478cc296e8220ed7700: Status 404 returned error can't find the container with id de2d03b2c598aadb19345179fbd3880aced9d75857af1478cc296e8220ed7700 Oct 08 19:46:41 crc kubenswrapper[4750]: I1008 19:46:41.342865 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 19:46:41 crc kubenswrapper[4750]: I1008 19:46:41.414741 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f476f79f5-78nfg" Oct 08 19:46:41 crc kubenswrapper[4750]: I1008 19:46:41.558777 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/217c91ff-d9a5-4349-a5a8-36e593581c92-ovsdbserver-nb\") pod \"217c91ff-d9a5-4349-a5a8-36e593581c92\" (UID: \"217c91ff-d9a5-4349-a5a8-36e593581c92\") " Oct 08 19:46:41 crc kubenswrapper[4750]: I1008 19:46:41.558892 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/217c91ff-d9a5-4349-a5a8-36e593581c92-config\") pod \"217c91ff-d9a5-4349-a5a8-36e593581c92\" (UID: \"217c91ff-d9a5-4349-a5a8-36e593581c92\") " Oct 08 19:46:41 crc kubenswrapper[4750]: I1008 19:46:41.558938 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/217c91ff-d9a5-4349-a5a8-36e593581c92-dns-svc\") pod \"217c91ff-d9a5-4349-a5a8-36e593581c92\" (UID: \"217c91ff-d9a5-4349-a5a8-36e593581c92\") " Oct 08 19:46:41 crc kubenswrapper[4750]: I1008 19:46:41.559119 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/217c91ff-d9a5-4349-a5a8-36e593581c92-ovsdbserver-sb\") pod \"217c91ff-d9a5-4349-a5a8-36e593581c92\" (UID: \"217c91ff-d9a5-4349-a5a8-36e593581c92\") " Oct 08 19:46:41 crc kubenswrapper[4750]: I1008 19:46:41.559163 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frtvh\" (UniqueName: \"kubernetes.io/projected/217c91ff-d9a5-4349-a5a8-36e593581c92-kube-api-access-frtvh\") pod \"217c91ff-d9a5-4349-a5a8-36e593581c92\" (UID: \"217c91ff-d9a5-4349-a5a8-36e593581c92\") " Oct 08 19:46:41 crc kubenswrapper[4750]: I1008 19:46:41.575980 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/217c91ff-d9a5-4349-a5a8-36e593581c92-kube-api-access-frtvh" (OuterVolumeSpecName: "kube-api-access-frtvh") pod "217c91ff-d9a5-4349-a5a8-36e593581c92" (UID: "217c91ff-d9a5-4349-a5a8-36e593581c92"). InnerVolumeSpecName "kube-api-access-frtvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:46:41 crc kubenswrapper[4750]: I1008 19:46:41.621411 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/217c91ff-d9a5-4349-a5a8-36e593581c92-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "217c91ff-d9a5-4349-a5a8-36e593581c92" (UID: "217c91ff-d9a5-4349-a5a8-36e593581c92"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:46:41 crc kubenswrapper[4750]: I1008 19:46:41.631500 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/217c91ff-d9a5-4349-a5a8-36e593581c92-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "217c91ff-d9a5-4349-a5a8-36e593581c92" (UID: "217c91ff-d9a5-4349-a5a8-36e593581c92"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:46:41 crc kubenswrapper[4750]: I1008 19:46:41.637954 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/217c91ff-d9a5-4349-a5a8-36e593581c92-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "217c91ff-d9a5-4349-a5a8-36e593581c92" (UID: "217c91ff-d9a5-4349-a5a8-36e593581c92"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:46:41 crc kubenswrapper[4750]: I1008 19:46:41.643921 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/217c91ff-d9a5-4349-a5a8-36e593581c92-config" (OuterVolumeSpecName: "config") pod "217c91ff-d9a5-4349-a5a8-36e593581c92" (UID: "217c91ff-d9a5-4349-a5a8-36e593581c92"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:46:41 crc kubenswrapper[4750]: I1008 19:46:41.661105 4750 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/217c91ff-d9a5-4349-a5a8-36e593581c92-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 19:46:41 crc kubenswrapper[4750]: I1008 19:46:41.661133 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/217c91ff-d9a5-4349-a5a8-36e593581c92-config\") on node \"crc\" DevicePath \"\"" Oct 08 19:46:41 crc kubenswrapper[4750]: I1008 19:46:41.661147 4750 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/217c91ff-d9a5-4349-a5a8-36e593581c92-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 19:46:41 crc kubenswrapper[4750]: I1008 19:46:41.661159 4750 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/217c91ff-d9a5-4349-a5a8-36e593581c92-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 19:46:41 crc kubenswrapper[4750]: I1008 19:46:41.661169 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frtvh\" (UniqueName: \"kubernetes.io/projected/217c91ff-d9a5-4349-a5a8-36e593581c92-kube-api-access-frtvh\") on node \"crc\" DevicePath \"\"" Oct 08 19:46:42 crc kubenswrapper[4750]: I1008 19:46:42.274980 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"01da7f48-bcbb-45fc-8c00-249c99acb4e3","Type":"ContainerStarted","Data":"8621aaea0c2fa7ca034ce60567dff027080c675ced293c9e44c92408eb82f5d2"} Oct 08 19:46:42 crc kubenswrapper[4750]: I1008 19:46:42.275402 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"01da7f48-bcbb-45fc-8c00-249c99acb4e3","Type":"ContainerStarted","Data":"182cd54501c4c35a817de1654b7e962511887393a2543aec729456825d4ce4fc"} Oct 08 19:46:42 
crc kubenswrapper[4750]: I1008 19:46:42.275418 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"01da7f48-bcbb-45fc-8c00-249c99acb4e3","Type":"ContainerStarted","Data":"de2d03b2c598aadb19345179fbd3880aced9d75857af1478cc296e8220ed7700"} Oct 08 19:46:42 crc kubenswrapper[4750]: I1008 19:46:42.282387 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"01131122-c4cf-4683-a2ad-18028e12e3cb","Type":"ContainerStarted","Data":"0aa24ec9e0bc7c6f41e356d7132035f0e2d693a2ac20925a1be05bc1ef73c287"} Oct 08 19:46:42 crc kubenswrapper[4750]: I1008 19:46:42.282475 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"01131122-c4cf-4683-a2ad-18028e12e3cb","Type":"ContainerStarted","Data":"86cf375b71e1150ccb8112424dfe5db2531779bf0b0d1d5eaa87b9b1f3e48ef0"} Oct 08 19:46:42 crc kubenswrapper[4750]: I1008 19:46:42.282508 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"01131122-c4cf-4683-a2ad-18028e12e3cb","Type":"ContainerStarted","Data":"626aae64b2170f958d2724055ce3e8662780e8dd8581338e02c021eea535d931"} Oct 08 19:46:42 crc kubenswrapper[4750]: I1008 19:46:42.286296 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f476f79f5-78nfg" Oct 08 19:46:42 crc kubenswrapper[4750]: I1008 19:46:42.286464 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f476f79f5-78nfg" event={"ID":"217c91ff-d9a5-4349-a5a8-36e593581c92","Type":"ContainerDied","Data":"d3b1eed7e1b4580cecf94cfac7dc2215a66b42450ed2f7a0df731ce4c617d37d"} Oct 08 19:46:42 crc kubenswrapper[4750]: I1008 19:46:42.286512 4750 scope.go:117] "RemoveContainer" containerID="b62c940475668778ef6696dae355e3efec8f1f3092dfa5566b06bbfe9eee04d7" Oct 08 19:46:42 crc kubenswrapper[4750]: I1008 19:46:42.313578 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.313532689 podStartE2EDuration="2.313532689s" podCreationTimestamp="2025-10-08 19:46:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:46:42.299410168 +0000 UTC m=+5758.212381181" watchObservedRunningTime="2025-10-08 19:46:42.313532689 +0000 UTC m=+5758.226503702" Oct 08 19:46:42 crc kubenswrapper[4750]: I1008 19:46:42.343421 4750 scope.go:117] "RemoveContainer" containerID="9fdba9f656ebebf2ae6febcd687a5d74d71a878b9b3bdedb83276d7cadb4c96b" Oct 08 19:46:42 crc kubenswrapper[4750]: I1008 19:46:42.364644 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.364605483 podStartE2EDuration="2.364605483s" podCreationTimestamp="2025-10-08 19:46:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:46:42.329396984 +0000 UTC m=+5758.242367997" watchObservedRunningTime="2025-10-08 19:46:42.364605483 +0000 UTC m=+5758.277576596" Oct 08 19:46:42 crc kubenswrapper[4750]: I1008 19:46:42.380375 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-6f476f79f5-78nfg"] Oct 08 19:46:42 crc kubenswrapper[4750]: I1008 19:46:42.389806 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f476f79f5-78nfg"] Oct 08 19:46:42 crc kubenswrapper[4750]: I1008 19:46:42.773453 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="217c91ff-d9a5-4349-a5a8-36e593581c92" path="/var/lib/kubelet/pods/217c91ff-d9a5-4349-a5a8-36e593581c92/volumes" Oct 08 19:46:44 crc kubenswrapper[4750]: I1008 19:46:44.016838 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 19:46:44 crc kubenswrapper[4750]: I1008 19:46:44.121213 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c959de61-3bdb-4787-976f-4da974423d1a-combined-ca-bundle\") pod \"c959de61-3bdb-4787-976f-4da974423d1a\" (UID: \"c959de61-3bdb-4787-976f-4da974423d1a\") " Oct 08 19:46:44 crc kubenswrapper[4750]: I1008 19:46:44.121356 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k586h\" (UniqueName: \"kubernetes.io/projected/c959de61-3bdb-4787-976f-4da974423d1a-kube-api-access-k586h\") pod \"c959de61-3bdb-4787-976f-4da974423d1a\" (UID: \"c959de61-3bdb-4787-976f-4da974423d1a\") " Oct 08 19:46:44 crc kubenswrapper[4750]: I1008 19:46:44.121616 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c959de61-3bdb-4787-976f-4da974423d1a-config-data\") pod \"c959de61-3bdb-4787-976f-4da974423d1a\" (UID: \"c959de61-3bdb-4787-976f-4da974423d1a\") " Oct 08 19:46:44 crc kubenswrapper[4750]: I1008 19:46:44.128404 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c959de61-3bdb-4787-976f-4da974423d1a-kube-api-access-k586h" (OuterVolumeSpecName: "kube-api-access-k586h") pod 
"c959de61-3bdb-4787-976f-4da974423d1a" (UID: "c959de61-3bdb-4787-976f-4da974423d1a"). InnerVolumeSpecName "kube-api-access-k586h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:46:44 crc kubenswrapper[4750]: I1008 19:46:44.148488 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c959de61-3bdb-4787-976f-4da974423d1a-config-data" (OuterVolumeSpecName: "config-data") pod "c959de61-3bdb-4787-976f-4da974423d1a" (UID: "c959de61-3bdb-4787-976f-4da974423d1a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:46:44 crc kubenswrapper[4750]: I1008 19:46:44.166031 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c959de61-3bdb-4787-976f-4da974423d1a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c959de61-3bdb-4787-976f-4da974423d1a" (UID: "c959de61-3bdb-4787-976f-4da974423d1a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:46:44 crc kubenswrapper[4750]: I1008 19:46:44.225294 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c959de61-3bdb-4787-976f-4da974423d1a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 19:46:44 crc kubenswrapper[4750]: I1008 19:46:44.225360 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k586h\" (UniqueName: \"kubernetes.io/projected/c959de61-3bdb-4787-976f-4da974423d1a-kube-api-access-k586h\") on node \"crc\" DevicePath \"\"" Oct 08 19:46:44 crc kubenswrapper[4750]: I1008 19:46:44.225384 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c959de61-3bdb-4787-976f-4da974423d1a-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 19:46:44 crc kubenswrapper[4750]: I1008 19:46:44.317946 4750 generic.go:334] "Generic (PLEG): container finished" podID="c959de61-3bdb-4787-976f-4da974423d1a" containerID="bfde87cd9a19b621248bae72102ce50a5f07677e744ae7efc514d9a06cb2ea13" exitCode=0 Oct 08 19:46:44 crc kubenswrapper[4750]: I1008 19:46:44.318022 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c959de61-3bdb-4787-976f-4da974423d1a","Type":"ContainerDied","Data":"bfde87cd9a19b621248bae72102ce50a5f07677e744ae7efc514d9a06cb2ea13"} Oct 08 19:46:44 crc kubenswrapper[4750]: I1008 19:46:44.318067 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c959de61-3bdb-4787-976f-4da974423d1a","Type":"ContainerDied","Data":"3760b5c4940a7aa5c8453d5174fd387f4527c7fe1f62ae00b2fbb39cd9e1a35f"} Oct 08 19:46:44 crc kubenswrapper[4750]: I1008 19:46:44.318101 4750 scope.go:117] "RemoveContainer" containerID="bfde87cd9a19b621248bae72102ce50a5f07677e744ae7efc514d9a06cb2ea13" Oct 08 19:46:44 crc kubenswrapper[4750]: I1008 19:46:44.318122 4750 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 19:46:44 crc kubenswrapper[4750]: I1008 19:46:44.363011 4750 scope.go:117] "RemoveContainer" containerID="bfde87cd9a19b621248bae72102ce50a5f07677e744ae7efc514d9a06cb2ea13" Oct 08 19:46:44 crc kubenswrapper[4750]: E1008 19:46:44.363974 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfde87cd9a19b621248bae72102ce50a5f07677e744ae7efc514d9a06cb2ea13\": container with ID starting with bfde87cd9a19b621248bae72102ce50a5f07677e744ae7efc514d9a06cb2ea13 not found: ID does not exist" containerID="bfde87cd9a19b621248bae72102ce50a5f07677e744ae7efc514d9a06cb2ea13" Oct 08 19:46:44 crc kubenswrapper[4750]: I1008 19:46:44.364039 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfde87cd9a19b621248bae72102ce50a5f07677e744ae7efc514d9a06cb2ea13"} err="failed to get container status \"bfde87cd9a19b621248bae72102ce50a5f07677e744ae7efc514d9a06cb2ea13\": rpc error: code = NotFound desc = could not find container \"bfde87cd9a19b621248bae72102ce50a5f07677e744ae7efc514d9a06cb2ea13\": container with ID starting with bfde87cd9a19b621248bae72102ce50a5f07677e744ae7efc514d9a06cb2ea13 not found: ID does not exist" Oct 08 19:46:44 crc kubenswrapper[4750]: I1008 19:46:44.387187 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 19:46:44 crc kubenswrapper[4750]: I1008 19:46:44.401677 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 19:46:44 crc kubenswrapper[4750]: I1008 19:46:44.423762 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 19:46:44 crc kubenswrapper[4750]: E1008 19:46:44.424304 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c959de61-3bdb-4787-976f-4da974423d1a" containerName="nova-scheduler-scheduler" Oct 08 19:46:44 
crc kubenswrapper[4750]: I1008 19:46:44.424329 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="c959de61-3bdb-4787-976f-4da974423d1a" containerName="nova-scheduler-scheduler" Oct 08 19:46:44 crc kubenswrapper[4750]: E1008 19:46:44.424386 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="217c91ff-d9a5-4349-a5a8-36e593581c92" containerName="init" Oct 08 19:46:44 crc kubenswrapper[4750]: I1008 19:46:44.424396 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="217c91ff-d9a5-4349-a5a8-36e593581c92" containerName="init" Oct 08 19:46:44 crc kubenswrapper[4750]: E1008 19:46:44.424408 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="217c91ff-d9a5-4349-a5a8-36e593581c92" containerName="dnsmasq-dns" Oct 08 19:46:44 crc kubenswrapper[4750]: I1008 19:46:44.424419 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="217c91ff-d9a5-4349-a5a8-36e593581c92" containerName="dnsmasq-dns" Oct 08 19:46:44 crc kubenswrapper[4750]: I1008 19:46:44.424674 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="c959de61-3bdb-4787-976f-4da974423d1a" containerName="nova-scheduler-scheduler" Oct 08 19:46:44 crc kubenswrapper[4750]: I1008 19:46:44.424713 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="217c91ff-d9a5-4349-a5a8-36e593581c92" containerName="dnsmasq-dns" Oct 08 19:46:44 crc kubenswrapper[4750]: I1008 19:46:44.425634 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 19:46:44 crc kubenswrapper[4750]: I1008 19:46:44.428538 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 08 19:46:44 crc kubenswrapper[4750]: I1008 19:46:44.490522 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 19:46:44 crc kubenswrapper[4750]: I1008 19:46:44.532679 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ba54c0d-9ca9-4b59-a97b-ecfb026b624f-config-data\") pod \"nova-scheduler-0\" (UID: \"9ba54c0d-9ca9-4b59-a97b-ecfb026b624f\") " pod="openstack/nova-scheduler-0" Oct 08 19:46:44 crc kubenswrapper[4750]: I1008 19:46:44.532731 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ba54c0d-9ca9-4b59-a97b-ecfb026b624f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9ba54c0d-9ca9-4b59-a97b-ecfb026b624f\") " pod="openstack/nova-scheduler-0" Oct 08 19:46:44 crc kubenswrapper[4750]: I1008 19:46:44.533329 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mp7j2\" (UniqueName: \"kubernetes.io/projected/9ba54c0d-9ca9-4b59-a97b-ecfb026b624f-kube-api-access-mp7j2\") pod \"nova-scheduler-0\" (UID: \"9ba54c0d-9ca9-4b59-a97b-ecfb026b624f\") " pod="openstack/nova-scheduler-0" Oct 08 19:46:44 crc kubenswrapper[4750]: I1008 19:46:44.636134 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ba54c0d-9ca9-4b59-a97b-ecfb026b624f-config-data\") pod \"nova-scheduler-0\" (UID: \"9ba54c0d-9ca9-4b59-a97b-ecfb026b624f\") " pod="openstack/nova-scheduler-0" Oct 08 19:46:44 crc kubenswrapper[4750]: I1008 19:46:44.636218 4750 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ba54c0d-9ca9-4b59-a97b-ecfb026b624f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9ba54c0d-9ca9-4b59-a97b-ecfb026b624f\") " pod="openstack/nova-scheduler-0" Oct 08 19:46:44 crc kubenswrapper[4750]: I1008 19:46:44.636616 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mp7j2\" (UniqueName: \"kubernetes.io/projected/9ba54c0d-9ca9-4b59-a97b-ecfb026b624f-kube-api-access-mp7j2\") pod \"nova-scheduler-0\" (UID: \"9ba54c0d-9ca9-4b59-a97b-ecfb026b624f\") " pod="openstack/nova-scheduler-0" Oct 08 19:46:44 crc kubenswrapper[4750]: I1008 19:46:44.641358 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ba54c0d-9ca9-4b59-a97b-ecfb026b624f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9ba54c0d-9ca9-4b59-a97b-ecfb026b624f\") " pod="openstack/nova-scheduler-0" Oct 08 19:46:44 crc kubenswrapper[4750]: I1008 19:46:44.656780 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mp7j2\" (UniqueName: \"kubernetes.io/projected/9ba54c0d-9ca9-4b59-a97b-ecfb026b624f-kube-api-access-mp7j2\") pod \"nova-scheduler-0\" (UID: \"9ba54c0d-9ca9-4b59-a97b-ecfb026b624f\") " pod="openstack/nova-scheduler-0" Oct 08 19:46:44 crc kubenswrapper[4750]: I1008 19:46:44.658574 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ba54c0d-9ca9-4b59-a97b-ecfb026b624f-config-data\") pod \"nova-scheduler-0\" (UID: \"9ba54c0d-9ca9-4b59-a97b-ecfb026b624f\") " pod="openstack/nova-scheduler-0" Oct 08 19:46:44 crc kubenswrapper[4750]: I1008 19:46:44.748050 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c959de61-3bdb-4787-976f-4da974423d1a" path="/var/lib/kubelet/pods/c959de61-3bdb-4787-976f-4da974423d1a/volumes" Oct 08 
19:46:44 crc kubenswrapper[4750]: I1008 19:46:44.801538 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 19:46:45 crc kubenswrapper[4750]: I1008 19:46:45.279126 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 19:46:45 crc kubenswrapper[4750]: W1008 19:46:45.280057 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ba54c0d_9ca9_4b59_a97b_ecfb026b624f.slice/crio-211284acfb8d5feef1f67643de050181fad09d5873f352958d693658f3a759c7 WatchSource:0}: Error finding container 211284acfb8d5feef1f67643de050181fad09d5873f352958d693658f3a759c7: Status 404 returned error can't find the container with id 211284acfb8d5feef1f67643de050181fad09d5873f352958d693658f3a759c7 Oct 08 19:46:45 crc kubenswrapper[4750]: I1008 19:46:45.332928 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9ba54c0d-9ca9-4b59-a97b-ecfb026b624f","Type":"ContainerStarted","Data":"211284acfb8d5feef1f67643de050181fad09d5873f352958d693658f3a759c7"} Oct 08 19:46:45 crc kubenswrapper[4750]: I1008 19:46:45.675323 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 08 19:46:45 crc kubenswrapper[4750]: I1008 19:46:45.676035 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 08 19:46:46 crc kubenswrapper[4750]: I1008 19:46:46.353711 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9ba54c0d-9ca9-4b59-a97b-ecfb026b624f","Type":"ContainerStarted","Data":"a74d070dca3b5dafc830305fad25f00491233540cfe76bdd0cc6e038e76fed4f"} Oct 08 19:46:46 crc kubenswrapper[4750]: I1008 19:46:46.395670 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.3956355289999998 
podStartE2EDuration="2.395635529s" podCreationTimestamp="2025-10-08 19:46:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:46:46.379658291 +0000 UTC m=+5762.292629374" watchObservedRunningTime="2025-10-08 19:46:46.395635529 +0000 UTC m=+5762.308606572" Oct 08 19:46:48 crc kubenswrapper[4750]: I1008 19:46:48.643229 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 08 19:46:49 crc kubenswrapper[4750]: I1008 19:46:49.423133 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-cdhgn"] Oct 08 19:46:49 crc kubenswrapper[4750]: I1008 19:46:49.426532 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-cdhgn" Oct 08 19:46:49 crc kubenswrapper[4750]: I1008 19:46:49.429963 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 08 19:46:49 crc kubenswrapper[4750]: I1008 19:46:49.430319 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 08 19:46:49 crc kubenswrapper[4750]: I1008 19:46:49.432536 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-cdhgn"] Oct 08 19:46:49 crc kubenswrapper[4750]: I1008 19:46:49.453858 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-cdhgn\" (UID: \"56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60\") " pod="openstack/nova-cell1-cell-mapping-cdhgn" Oct 08 19:46:49 crc kubenswrapper[4750]: I1008 19:46:49.454045 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8g48\" 
(UniqueName: \"kubernetes.io/projected/56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60-kube-api-access-q8g48\") pod \"nova-cell1-cell-mapping-cdhgn\" (UID: \"56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60\") " pod="openstack/nova-cell1-cell-mapping-cdhgn" Oct 08 19:46:49 crc kubenswrapper[4750]: I1008 19:46:49.454164 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60-scripts\") pod \"nova-cell1-cell-mapping-cdhgn\" (UID: \"56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60\") " pod="openstack/nova-cell1-cell-mapping-cdhgn" Oct 08 19:46:49 crc kubenswrapper[4750]: I1008 19:46:49.454211 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60-config-data\") pod \"nova-cell1-cell-mapping-cdhgn\" (UID: \"56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60\") " pod="openstack/nova-cell1-cell-mapping-cdhgn" Oct 08 19:46:49 crc kubenswrapper[4750]: I1008 19:46:49.556111 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8g48\" (UniqueName: \"kubernetes.io/projected/56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60-kube-api-access-q8g48\") pod \"nova-cell1-cell-mapping-cdhgn\" (UID: \"56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60\") " pod="openstack/nova-cell1-cell-mapping-cdhgn" Oct 08 19:46:49 crc kubenswrapper[4750]: I1008 19:46:49.556332 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60-scripts\") pod \"nova-cell1-cell-mapping-cdhgn\" (UID: \"56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60\") " pod="openstack/nova-cell1-cell-mapping-cdhgn" Oct 08 19:46:49 crc kubenswrapper[4750]: I1008 19:46:49.556390 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60-config-data\") pod \"nova-cell1-cell-mapping-cdhgn\" (UID: \"56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60\") " pod="openstack/nova-cell1-cell-mapping-cdhgn" Oct 08 19:46:49 crc kubenswrapper[4750]: I1008 19:46:49.556450 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-cdhgn\" (UID: \"56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60\") " pod="openstack/nova-cell1-cell-mapping-cdhgn" Oct 08 19:46:49 crc kubenswrapper[4750]: I1008 19:46:49.574385 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60-scripts\") pod \"nova-cell1-cell-mapping-cdhgn\" (UID: \"56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60\") " pod="openstack/nova-cell1-cell-mapping-cdhgn" Oct 08 19:46:49 crc kubenswrapper[4750]: I1008 19:46:49.575518 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-cdhgn\" (UID: \"56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60\") " pod="openstack/nova-cell1-cell-mapping-cdhgn" Oct 08 19:46:49 crc kubenswrapper[4750]: I1008 19:46:49.577881 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60-config-data\") pod \"nova-cell1-cell-mapping-cdhgn\" (UID: \"56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60\") " pod="openstack/nova-cell1-cell-mapping-cdhgn" Oct 08 19:46:49 crc kubenswrapper[4750]: I1008 19:46:49.579173 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8g48\" (UniqueName: \"kubernetes.io/projected/56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60-kube-api-access-q8g48\") pod 
\"nova-cell1-cell-mapping-cdhgn\" (UID: \"56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60\") " pod="openstack/nova-cell1-cell-mapping-cdhgn" Oct 08 19:46:49 crc kubenswrapper[4750]: I1008 19:46:49.766611 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-cdhgn" Oct 08 19:46:49 crc kubenswrapper[4750]: I1008 19:46:49.803478 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 08 19:46:50 crc kubenswrapper[4750]: I1008 19:46:50.248411 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-cdhgn"] Oct 08 19:46:50 crc kubenswrapper[4750]: I1008 19:46:50.407288 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-cdhgn" event={"ID":"56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60","Type":"ContainerStarted","Data":"5da70d6962e24c3fa30aa6edb9834f23339c9a1e35efb88a2c09ab32b6f55b83"} Oct 08 19:46:50 crc kubenswrapper[4750]: I1008 19:46:50.675600 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 08 19:46:50 crc kubenswrapper[4750]: I1008 19:46:50.675664 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 08 19:46:50 crc kubenswrapper[4750]: I1008 19:46:50.746624 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 08 19:46:50 crc kubenswrapper[4750]: I1008 19:46:50.746669 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 08 19:46:51 crc kubenswrapper[4750]: I1008 19:46:51.424157 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-cdhgn" event={"ID":"56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60","Type":"ContainerStarted","Data":"b1c02d86ae5972fbca6fb66bed397cead160ab86189a95ecb7adc6f6132119ac"} Oct 08 19:46:51 crc kubenswrapper[4750]: I1008 
19:46:51.457816 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-cdhgn" podStartSLOduration=2.457780529 podStartE2EDuration="2.457780529s" podCreationTimestamp="2025-10-08 19:46:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:46:51.45141551 +0000 UTC m=+5767.364386533" watchObservedRunningTime="2025-10-08 19:46:51.457780529 +0000 UTC m=+5767.370751552" Oct 08 19:46:51 crc kubenswrapper[4750]: I1008 19:46:51.757954 4750 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="01da7f48-bcbb-45fc-8c00-249c99acb4e3" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.72:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 08 19:46:51 crc kubenswrapper[4750]: I1008 19:46:51.758290 4750 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="01da7f48-bcbb-45fc-8c00-249c99acb4e3" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.72:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 08 19:46:51 crc kubenswrapper[4750]: I1008 19:46:51.840001 4750 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="01131122-c4cf-4683-a2ad-18028e12e3cb" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.73:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 08 19:46:51 crc kubenswrapper[4750]: I1008 19:46:51.840221 4750 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="01131122-c4cf-4683-a2ad-18028e12e3cb" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.73:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 08 
19:46:54 crc kubenswrapper[4750]: I1008 19:46:54.803191 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 08 19:46:54 crc kubenswrapper[4750]: I1008 19:46:54.858522 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 08 19:46:55 crc kubenswrapper[4750]: I1008 19:46:55.474700 4750 generic.go:334] "Generic (PLEG): container finished" podID="56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60" containerID="b1c02d86ae5972fbca6fb66bed397cead160ab86189a95ecb7adc6f6132119ac" exitCode=0 Oct 08 19:46:55 crc kubenswrapper[4750]: I1008 19:46:55.475397 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-cdhgn" event={"ID":"56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60","Type":"ContainerDied","Data":"b1c02d86ae5972fbca6fb66bed397cead160ab86189a95ecb7adc6f6132119ac"} Oct 08 19:46:55 crc kubenswrapper[4750]: I1008 19:46:55.522637 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 08 19:46:56 crc kubenswrapper[4750]: I1008 19:46:56.861512 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-cdhgn" Oct 08 19:46:56 crc kubenswrapper[4750]: I1008 19:46:56.919409 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60-scripts\") pod \"56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60\" (UID: \"56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60\") " Oct 08 19:46:56 crc kubenswrapper[4750]: I1008 19:46:56.919603 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60-config-data\") pod \"56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60\" (UID: \"56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60\") " Oct 08 19:46:56 crc kubenswrapper[4750]: I1008 19:46:56.919651 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8g48\" (UniqueName: \"kubernetes.io/projected/56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60-kube-api-access-q8g48\") pod \"56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60\" (UID: \"56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60\") " Oct 08 19:46:56 crc kubenswrapper[4750]: I1008 19:46:56.919671 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60-combined-ca-bundle\") pod \"56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60\" (UID: \"56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60\") " Oct 08 19:46:56 crc kubenswrapper[4750]: I1008 19:46:56.927323 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60-scripts" (OuterVolumeSpecName: "scripts") pod "56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60" (UID: "56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:46:56 crc kubenswrapper[4750]: I1008 19:46:56.929775 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60-kube-api-access-q8g48" (OuterVolumeSpecName: "kube-api-access-q8g48") pod "56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60" (UID: "56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60"). InnerVolumeSpecName "kube-api-access-q8g48". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:46:57 crc kubenswrapper[4750]: I1008 19:46:57.010886 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60" (UID: "56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:46:57 crc kubenswrapper[4750]: I1008 19:46:57.010996 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60-config-data" (OuterVolumeSpecName: "config-data") pod "56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60" (UID: "56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:46:57 crc kubenswrapper[4750]: I1008 19:46:57.023899 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 19:46:57 crc kubenswrapper[4750]: I1008 19:46:57.023926 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 19:46:57 crc kubenswrapper[4750]: I1008 19:46:57.023941 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8g48\" (UniqueName: \"kubernetes.io/projected/56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60-kube-api-access-q8g48\") on node \"crc\" DevicePath \"\"" Oct 08 19:46:57 crc kubenswrapper[4750]: I1008 19:46:57.023951 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 19:46:57 crc kubenswrapper[4750]: I1008 19:46:57.499587 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-cdhgn" event={"ID":"56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60","Type":"ContainerDied","Data":"5da70d6962e24c3fa30aa6edb9834f23339c9a1e35efb88a2c09ab32b6f55b83"} Oct 08 19:46:57 crc kubenswrapper[4750]: I1008 19:46:57.499633 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5da70d6962e24c3fa30aa6edb9834f23339c9a1e35efb88a2c09ab32b6f55b83" Oct 08 19:46:57 crc kubenswrapper[4750]: I1008 19:46:57.499658 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-cdhgn" Oct 08 19:46:57 crc kubenswrapper[4750]: I1008 19:46:57.762717 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 08 19:46:57 crc kubenswrapper[4750]: I1008 19:46:57.763162 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="01131122-c4cf-4683-a2ad-18028e12e3cb" containerName="nova-api-log" containerID="cri-o://86cf375b71e1150ccb8112424dfe5db2531779bf0b0d1d5eaa87b9b1f3e48ef0" gracePeriod=30 Oct 08 19:46:57 crc kubenswrapper[4750]: I1008 19:46:57.763197 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="01131122-c4cf-4683-a2ad-18028e12e3cb" containerName="nova-api-api" containerID="cri-o://0aa24ec9e0bc7c6f41e356d7132035f0e2d693a2ac20925a1be05bc1ef73c287" gracePeriod=30 Oct 08 19:46:57 crc kubenswrapper[4750]: I1008 19:46:57.776447 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 19:46:57 crc kubenswrapper[4750]: I1008 19:46:57.776711 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="9ba54c0d-9ca9-4b59-a97b-ecfb026b624f" containerName="nova-scheduler-scheduler" containerID="cri-o://a74d070dca3b5dafc830305fad25f00491233540cfe76bdd0cc6e038e76fed4f" gracePeriod=30 Oct 08 19:46:57 crc kubenswrapper[4750]: I1008 19:46:57.795458 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 19:46:57 crc kubenswrapper[4750]: I1008 19:46:57.795822 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="01da7f48-bcbb-45fc-8c00-249c99acb4e3" containerName="nova-metadata-log" containerID="cri-o://182cd54501c4c35a817de1654b7e962511887393a2543aec729456825d4ce4fc" gracePeriod=30 Oct 08 19:46:57 crc kubenswrapper[4750]: I1008 19:46:57.795935 4750 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="01da7f48-bcbb-45fc-8c00-249c99acb4e3" containerName="nova-metadata-metadata" containerID="cri-o://8621aaea0c2fa7ca034ce60567dff027080c675ced293c9e44c92408eb82f5d2" gracePeriod=30 Oct 08 19:46:58 crc kubenswrapper[4750]: I1008 19:46:58.511614 4750 generic.go:334] "Generic (PLEG): container finished" podID="01da7f48-bcbb-45fc-8c00-249c99acb4e3" containerID="182cd54501c4c35a817de1654b7e962511887393a2543aec729456825d4ce4fc" exitCode=143 Oct 08 19:46:58 crc kubenswrapper[4750]: I1008 19:46:58.511710 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"01da7f48-bcbb-45fc-8c00-249c99acb4e3","Type":"ContainerDied","Data":"182cd54501c4c35a817de1654b7e962511887393a2543aec729456825d4ce4fc"} Oct 08 19:46:58 crc kubenswrapper[4750]: I1008 19:46:58.515347 4750 generic.go:334] "Generic (PLEG): container finished" podID="01131122-c4cf-4683-a2ad-18028e12e3cb" containerID="86cf375b71e1150ccb8112424dfe5db2531779bf0b0d1d5eaa87b9b1f3e48ef0" exitCode=143 Oct 08 19:46:58 crc kubenswrapper[4750]: I1008 19:46:58.515407 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"01131122-c4cf-4683-a2ad-18028e12e3cb","Type":"ContainerDied","Data":"86cf375b71e1150ccb8112424dfe5db2531779bf0b0d1d5eaa87b9b1f3e48ef0"} Oct 08 19:46:59 crc kubenswrapper[4750]: I1008 19:46:59.706763 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 19:46:59 crc kubenswrapper[4750]: I1008 19:46:59.707145 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 19:46:59 crc kubenswrapper[4750]: E1008 19:46:59.805660 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a74d070dca3b5dafc830305fad25f00491233540cfe76bdd0cc6e038e76fed4f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 08 19:46:59 crc kubenswrapper[4750]: E1008 19:46:59.807118 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a74d070dca3b5dafc830305fad25f00491233540cfe76bdd0cc6e038e76fed4f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 08 19:46:59 crc kubenswrapper[4750]: E1008 19:46:59.808906 4750 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a74d070dca3b5dafc830305fad25f00491233540cfe76bdd0cc6e038e76fed4f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 08 19:46:59 crc kubenswrapper[4750]: E1008 19:46:59.808998 4750 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="9ba54c0d-9ca9-4b59-a97b-ecfb026b624f" containerName="nova-scheduler-scheduler" Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.475263 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.484579 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.553526 4750 generic.go:334] "Generic (PLEG): container finished" podID="01131122-c4cf-4683-a2ad-18028e12e3cb" containerID="0aa24ec9e0bc7c6f41e356d7132035f0e2d693a2ac20925a1be05bc1ef73c287" exitCode=0 Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.553625 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"01131122-c4cf-4683-a2ad-18028e12e3cb","Type":"ContainerDied","Data":"0aa24ec9e0bc7c6f41e356d7132035f0e2d693a2ac20925a1be05bc1ef73c287"} Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.553700 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"01131122-c4cf-4683-a2ad-18028e12e3cb","Type":"ContainerDied","Data":"626aae64b2170f958d2724055ce3e8662780e8dd8581338e02c021eea535d931"} Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.553729 4750 scope.go:117] "RemoveContainer" containerID="0aa24ec9e0bc7c6f41e356d7132035f0e2d693a2ac20925a1be05bc1ef73c287" Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.553645 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.556739 4750 generic.go:334] "Generic (PLEG): container finished" podID="01da7f48-bcbb-45fc-8c00-249c99acb4e3" containerID="8621aaea0c2fa7ca034ce60567dff027080c675ced293c9e44c92408eb82f5d2" exitCode=0 Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.556806 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"01da7f48-bcbb-45fc-8c00-249c99acb4e3","Type":"ContainerDied","Data":"8621aaea0c2fa7ca034ce60567dff027080c675ced293c9e44c92408eb82f5d2"} Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.556859 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"01da7f48-bcbb-45fc-8c00-249c99acb4e3","Type":"ContainerDied","Data":"de2d03b2c598aadb19345179fbd3880aced9d75857af1478cc296e8220ed7700"} Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.556952 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.587498 4750 scope.go:117] "RemoveContainer" containerID="86cf375b71e1150ccb8112424dfe5db2531779bf0b0d1d5eaa87b9b1f3e48ef0" Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.615868 4750 scope.go:117] "RemoveContainer" containerID="0aa24ec9e0bc7c6f41e356d7132035f0e2d693a2ac20925a1be05bc1ef73c287" Oct 08 19:47:01 crc kubenswrapper[4750]: E1008 19:47:01.616421 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0aa24ec9e0bc7c6f41e356d7132035f0e2d693a2ac20925a1be05bc1ef73c287\": container with ID starting with 0aa24ec9e0bc7c6f41e356d7132035f0e2d693a2ac20925a1be05bc1ef73c287 not found: ID does not exist" containerID="0aa24ec9e0bc7c6f41e356d7132035f0e2d693a2ac20925a1be05bc1ef73c287" Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.616458 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0aa24ec9e0bc7c6f41e356d7132035f0e2d693a2ac20925a1be05bc1ef73c287"} err="failed to get container status \"0aa24ec9e0bc7c6f41e356d7132035f0e2d693a2ac20925a1be05bc1ef73c287\": rpc error: code = NotFound desc = could not find container \"0aa24ec9e0bc7c6f41e356d7132035f0e2d693a2ac20925a1be05bc1ef73c287\": container with ID starting with 0aa24ec9e0bc7c6f41e356d7132035f0e2d693a2ac20925a1be05bc1ef73c287 not found: ID does not exist" Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.616491 4750 scope.go:117] "RemoveContainer" containerID="86cf375b71e1150ccb8112424dfe5db2531779bf0b0d1d5eaa87b9b1f3e48ef0" Oct 08 19:47:01 crc kubenswrapper[4750]: E1008 19:47:01.616771 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86cf375b71e1150ccb8112424dfe5db2531779bf0b0d1d5eaa87b9b1f3e48ef0\": container with ID starting with 
86cf375b71e1150ccb8112424dfe5db2531779bf0b0d1d5eaa87b9b1f3e48ef0 not found: ID does not exist" containerID="86cf375b71e1150ccb8112424dfe5db2531779bf0b0d1d5eaa87b9b1f3e48ef0" Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.616796 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86cf375b71e1150ccb8112424dfe5db2531779bf0b0d1d5eaa87b9b1f3e48ef0"} err="failed to get container status \"86cf375b71e1150ccb8112424dfe5db2531779bf0b0d1d5eaa87b9b1f3e48ef0\": rpc error: code = NotFound desc = could not find container \"86cf375b71e1150ccb8112424dfe5db2531779bf0b0d1d5eaa87b9b1f3e48ef0\": container with ID starting with 86cf375b71e1150ccb8112424dfe5db2531779bf0b0d1d5eaa87b9b1f3e48ef0 not found: ID does not exist" Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.616814 4750 scope.go:117] "RemoveContainer" containerID="8621aaea0c2fa7ca034ce60567dff027080c675ced293c9e44c92408eb82f5d2" Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.635794 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01da7f48-bcbb-45fc-8c00-249c99acb4e3-logs\") pod \"01da7f48-bcbb-45fc-8c00-249c99acb4e3\" (UID: \"01da7f48-bcbb-45fc-8c00-249c99acb4e3\") " Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.636142 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01da7f48-bcbb-45fc-8c00-249c99acb4e3-config-data\") pod \"01da7f48-bcbb-45fc-8c00-249c99acb4e3\" (UID: \"01da7f48-bcbb-45fc-8c00-249c99acb4e3\") " Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.636451 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01131122-c4cf-4683-a2ad-18028e12e3cb-combined-ca-bundle\") pod \"01131122-c4cf-4683-a2ad-18028e12e3cb\" (UID: \"01131122-c4cf-4683-a2ad-18028e12e3cb\") " Oct 08 19:47:01 
crc kubenswrapper[4750]: I1008 19:47:01.636638 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01da7f48-bcbb-45fc-8c00-249c99acb4e3-combined-ca-bundle\") pod \"01da7f48-bcbb-45fc-8c00-249c99acb4e3\" (UID: \"01da7f48-bcbb-45fc-8c00-249c99acb4e3\") " Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.636778 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01131122-c4cf-4683-a2ad-18028e12e3cb-logs\") pod \"01131122-c4cf-4683-a2ad-18028e12e3cb\" (UID: \"01131122-c4cf-4683-a2ad-18028e12e3cb\") " Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.636920 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01131122-c4cf-4683-a2ad-18028e12e3cb-config-data\") pod \"01131122-c4cf-4683-a2ad-18028e12e3cb\" (UID: \"01131122-c4cf-4683-a2ad-18028e12e3cb\") " Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.637099 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rbs7\" (UniqueName: \"kubernetes.io/projected/01131122-c4cf-4683-a2ad-18028e12e3cb-kube-api-access-4rbs7\") pod \"01131122-c4cf-4683-a2ad-18028e12e3cb\" (UID: \"01131122-c4cf-4683-a2ad-18028e12e3cb\") " Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.637862 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2t7z\" (UniqueName: \"kubernetes.io/projected/01da7f48-bcbb-45fc-8c00-249c99acb4e3-kube-api-access-g2t7z\") pod \"01da7f48-bcbb-45fc-8c00-249c99acb4e3\" (UID: \"01da7f48-bcbb-45fc-8c00-249c99acb4e3\") " Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.637883 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01da7f48-bcbb-45fc-8c00-249c99acb4e3-logs" (OuterVolumeSpecName: "logs") pod 
"01da7f48-bcbb-45fc-8c00-249c99acb4e3" (UID: "01da7f48-bcbb-45fc-8c00-249c99acb4e3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.638504 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01131122-c4cf-4683-a2ad-18028e12e3cb-logs" (OuterVolumeSpecName: "logs") pod "01131122-c4cf-4683-a2ad-18028e12e3cb" (UID: "01131122-c4cf-4683-a2ad-18028e12e3cb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.639809 4750 scope.go:117] "RemoveContainer" containerID="182cd54501c4c35a817de1654b7e962511887393a2543aec729456825d4ce4fc" Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.639990 4750 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01131122-c4cf-4683-a2ad-18028e12e3cb-logs\") on node \"crc\" DevicePath \"\"" Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.640089 4750 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01da7f48-bcbb-45fc-8c00-249c99acb4e3-logs\") on node \"crc\" DevicePath \"\"" Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.647775 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01da7f48-bcbb-45fc-8c00-249c99acb4e3-kube-api-access-g2t7z" (OuterVolumeSpecName: "kube-api-access-g2t7z") pod "01da7f48-bcbb-45fc-8c00-249c99acb4e3" (UID: "01da7f48-bcbb-45fc-8c00-249c99acb4e3"). InnerVolumeSpecName "kube-api-access-g2t7z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.657111 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01131122-c4cf-4683-a2ad-18028e12e3cb-kube-api-access-4rbs7" (OuterVolumeSpecName: "kube-api-access-4rbs7") pod "01131122-c4cf-4683-a2ad-18028e12e3cb" (UID: "01131122-c4cf-4683-a2ad-18028e12e3cb"). InnerVolumeSpecName "kube-api-access-4rbs7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.671684 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01131122-c4cf-4683-a2ad-18028e12e3cb-config-data" (OuterVolumeSpecName: "config-data") pod "01131122-c4cf-4683-a2ad-18028e12e3cb" (UID: "01131122-c4cf-4683-a2ad-18028e12e3cb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.673662 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01da7f48-bcbb-45fc-8c00-249c99acb4e3-config-data" (OuterVolumeSpecName: "config-data") pod "01da7f48-bcbb-45fc-8c00-249c99acb4e3" (UID: "01da7f48-bcbb-45fc-8c00-249c99acb4e3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.674614 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01131122-c4cf-4683-a2ad-18028e12e3cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01131122-c4cf-4683-a2ad-18028e12e3cb" (UID: "01131122-c4cf-4683-a2ad-18028e12e3cb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.691789 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01da7f48-bcbb-45fc-8c00-249c99acb4e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01da7f48-bcbb-45fc-8c00-249c99acb4e3" (UID: "01da7f48-bcbb-45fc-8c00-249c99acb4e3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.739476 4750 scope.go:117] "RemoveContainer" containerID="8621aaea0c2fa7ca034ce60567dff027080c675ced293c9e44c92408eb82f5d2" Oct 08 19:47:01 crc kubenswrapper[4750]: E1008 19:47:01.740206 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8621aaea0c2fa7ca034ce60567dff027080c675ced293c9e44c92408eb82f5d2\": container with ID starting with 8621aaea0c2fa7ca034ce60567dff027080c675ced293c9e44c92408eb82f5d2 not found: ID does not exist" containerID="8621aaea0c2fa7ca034ce60567dff027080c675ced293c9e44c92408eb82f5d2" Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.740272 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8621aaea0c2fa7ca034ce60567dff027080c675ced293c9e44c92408eb82f5d2"} err="failed to get container status \"8621aaea0c2fa7ca034ce60567dff027080c675ced293c9e44c92408eb82f5d2\": rpc error: code = NotFound desc = could not find container \"8621aaea0c2fa7ca034ce60567dff027080c675ced293c9e44c92408eb82f5d2\": container with ID starting with 8621aaea0c2fa7ca034ce60567dff027080c675ced293c9e44c92408eb82f5d2 not found: ID does not exist" Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.740303 4750 scope.go:117] "RemoveContainer" containerID="182cd54501c4c35a817de1654b7e962511887393a2543aec729456825d4ce4fc" Oct 08 19:47:01 crc kubenswrapper[4750]: E1008 19:47:01.740891 4750 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"182cd54501c4c35a817de1654b7e962511887393a2543aec729456825d4ce4fc\": container with ID starting with 182cd54501c4c35a817de1654b7e962511887393a2543aec729456825d4ce4fc not found: ID does not exist" containerID="182cd54501c4c35a817de1654b7e962511887393a2543aec729456825d4ce4fc" Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.741054 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"182cd54501c4c35a817de1654b7e962511887393a2543aec729456825d4ce4fc"} err="failed to get container status \"182cd54501c4c35a817de1654b7e962511887393a2543aec729456825d4ce4fc\": rpc error: code = NotFound desc = could not find container \"182cd54501c4c35a817de1654b7e962511887393a2543aec729456825d4ce4fc\": container with ID starting with 182cd54501c4c35a817de1654b7e962511887393a2543aec729456825d4ce4fc not found: ID does not exist" Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.741415 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01131122-c4cf-4683-a2ad-18028e12e3cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.741464 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01da7f48-bcbb-45fc-8c00-249c99acb4e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.741482 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01131122-c4cf-4683-a2ad-18028e12e3cb-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.741500 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rbs7\" (UniqueName: 
\"kubernetes.io/projected/01131122-c4cf-4683-a2ad-18028e12e3cb-kube-api-access-4rbs7\") on node \"crc\" DevicePath \"\"" Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.741521 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2t7z\" (UniqueName: \"kubernetes.io/projected/01da7f48-bcbb-45fc-8c00-249c99acb4e3-kube-api-access-g2t7z\") on node \"crc\" DevicePath \"\"" Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.741538 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01da7f48-bcbb-45fc-8c00-249c99acb4e3-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.921696 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.944466 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.962344 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.987090 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.995790 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 08 19:47:01 crc kubenswrapper[4750]: E1008 19:47:01.996508 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01da7f48-bcbb-45fc-8c00-249c99acb4e3" containerName="nova-metadata-log" Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.996537 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="01da7f48-bcbb-45fc-8c00-249c99acb4e3" containerName="nova-metadata-log" Oct 08 19:47:01 crc kubenswrapper[4750]: E1008 19:47:01.996622 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01131122-c4cf-4683-a2ad-18028e12e3cb" 
containerName="nova-api-api" Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.996633 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="01131122-c4cf-4683-a2ad-18028e12e3cb" containerName="nova-api-api" Oct 08 19:47:01 crc kubenswrapper[4750]: E1008 19:47:01.996650 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60" containerName="nova-manage" Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.996658 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60" containerName="nova-manage" Oct 08 19:47:01 crc kubenswrapper[4750]: E1008 19:47:01.996687 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01da7f48-bcbb-45fc-8c00-249c99acb4e3" containerName="nova-metadata-metadata" Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.996696 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="01da7f48-bcbb-45fc-8c00-249c99acb4e3" containerName="nova-metadata-metadata" Oct 08 19:47:01 crc kubenswrapper[4750]: E1008 19:47:01.996720 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01131122-c4cf-4683-a2ad-18028e12e3cb" containerName="nova-api-log" Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.996727 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="01131122-c4cf-4683-a2ad-18028e12e3cb" containerName="nova-api-log" Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.996969 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="01da7f48-bcbb-45fc-8c00-249c99acb4e3" containerName="nova-metadata-metadata" Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.996986 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="01131122-c4cf-4683-a2ad-18028e12e3cb" containerName="nova-api-api" Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.996997 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60" 
containerName="nova-manage" Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.997008 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="01131122-c4cf-4683-a2ad-18028e12e3cb" containerName="nova-api-log" Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.997029 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="01da7f48-bcbb-45fc-8c00-249c99acb4e3" containerName="nova-metadata-log" Oct 08 19:47:01 crc kubenswrapper[4750]: I1008 19:47:01.998673 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.003207 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.011102 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.014771 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.019055 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.028097 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.047094 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b26f9c3-60a6-4a77-beef-6436bfcfeee7-logs\") pod \"nova-metadata-0\" (UID: \"2b26f9c3-60a6-4a77-beef-6436bfcfeee7\") " pod="openstack/nova-metadata-0" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.047210 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b26f9c3-60a6-4a77-beef-6436bfcfeee7-config-data\") pod \"nova-metadata-0\" (UID: \"2b26f9c3-60a6-4a77-beef-6436bfcfeee7\") " pod="openstack/nova-metadata-0" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.047238 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clc58\" (UniqueName: \"kubernetes.io/projected/2b26f9c3-60a6-4a77-beef-6436bfcfeee7-kube-api-access-clc58\") pod \"nova-metadata-0\" (UID: \"2b26f9c3-60a6-4a77-beef-6436bfcfeee7\") " pod="openstack/nova-metadata-0" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.047817 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b26f9c3-60a6-4a77-beef-6436bfcfeee7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2b26f9c3-60a6-4a77-beef-6436bfcfeee7\") " pod="openstack/nova-metadata-0" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.050518 4750 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/nova-metadata-0"] Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.151447 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e51cf2e9-5e27-40d9-b708-f8efd4f6447e-config-data\") pod \"nova-api-0\" (UID: \"e51cf2e9-5e27-40d9-b708-f8efd4f6447e\") " pod="openstack/nova-api-0" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.152299 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b26f9c3-60a6-4a77-beef-6436bfcfeee7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2b26f9c3-60a6-4a77-beef-6436bfcfeee7\") " pod="openstack/nova-metadata-0" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.152368 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e51cf2e9-5e27-40d9-b708-f8efd4f6447e-logs\") pod \"nova-api-0\" (UID: \"e51cf2e9-5e27-40d9-b708-f8efd4f6447e\") " pod="openstack/nova-api-0" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.152492 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw9zx\" (UniqueName: \"kubernetes.io/projected/e51cf2e9-5e27-40d9-b708-f8efd4f6447e-kube-api-access-mw9zx\") pod \"nova-api-0\" (UID: \"e51cf2e9-5e27-40d9-b708-f8efd4f6447e\") " pod="openstack/nova-api-0" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.152643 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e51cf2e9-5e27-40d9-b708-f8efd4f6447e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e51cf2e9-5e27-40d9-b708-f8efd4f6447e\") " pod="openstack/nova-api-0" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.152795 4750 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b26f9c3-60a6-4a77-beef-6436bfcfeee7-logs\") pod \"nova-metadata-0\" (UID: \"2b26f9c3-60a6-4a77-beef-6436bfcfeee7\") " pod="openstack/nova-metadata-0" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.153032 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b26f9c3-60a6-4a77-beef-6436bfcfeee7-config-data\") pod \"nova-metadata-0\" (UID: \"2b26f9c3-60a6-4a77-beef-6436bfcfeee7\") " pod="openstack/nova-metadata-0" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.153064 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clc58\" (UniqueName: \"kubernetes.io/projected/2b26f9c3-60a6-4a77-beef-6436bfcfeee7-kube-api-access-clc58\") pod \"nova-metadata-0\" (UID: \"2b26f9c3-60a6-4a77-beef-6436bfcfeee7\") " pod="openstack/nova-metadata-0" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.153622 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b26f9c3-60a6-4a77-beef-6436bfcfeee7-logs\") pod \"nova-metadata-0\" (UID: \"2b26f9c3-60a6-4a77-beef-6436bfcfeee7\") " pod="openstack/nova-metadata-0" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.160376 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b26f9c3-60a6-4a77-beef-6436bfcfeee7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2b26f9c3-60a6-4a77-beef-6436bfcfeee7\") " pod="openstack/nova-metadata-0" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.165856 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b26f9c3-60a6-4a77-beef-6436bfcfeee7-config-data\") pod \"nova-metadata-0\" (UID: \"2b26f9c3-60a6-4a77-beef-6436bfcfeee7\") " 
pod="openstack/nova-metadata-0" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.175156 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clc58\" (UniqueName: \"kubernetes.io/projected/2b26f9c3-60a6-4a77-beef-6436bfcfeee7-kube-api-access-clc58\") pod \"nova-metadata-0\" (UID: \"2b26f9c3-60a6-4a77-beef-6436bfcfeee7\") " pod="openstack/nova-metadata-0" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.254562 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e51cf2e9-5e27-40d9-b708-f8efd4f6447e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e51cf2e9-5e27-40d9-b708-f8efd4f6447e\") " pod="openstack/nova-api-0" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.254648 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e51cf2e9-5e27-40d9-b708-f8efd4f6447e-config-data\") pod \"nova-api-0\" (UID: \"e51cf2e9-5e27-40d9-b708-f8efd4f6447e\") " pod="openstack/nova-api-0" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.254728 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e51cf2e9-5e27-40d9-b708-f8efd4f6447e-logs\") pod \"nova-api-0\" (UID: \"e51cf2e9-5e27-40d9-b708-f8efd4f6447e\") " pod="openstack/nova-api-0" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.254789 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mw9zx\" (UniqueName: \"kubernetes.io/projected/e51cf2e9-5e27-40d9-b708-f8efd4f6447e-kube-api-access-mw9zx\") pod \"nova-api-0\" (UID: \"e51cf2e9-5e27-40d9-b708-f8efd4f6447e\") " pod="openstack/nova-api-0" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.255841 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e51cf2e9-5e27-40d9-b708-f8efd4f6447e-logs\") pod \"nova-api-0\" (UID: \"e51cf2e9-5e27-40d9-b708-f8efd4f6447e\") " pod="openstack/nova-api-0" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.260014 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e51cf2e9-5e27-40d9-b708-f8efd4f6447e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e51cf2e9-5e27-40d9-b708-f8efd4f6447e\") " pod="openstack/nova-api-0" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.262211 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e51cf2e9-5e27-40d9-b708-f8efd4f6447e-config-data\") pod \"nova-api-0\" (UID: \"e51cf2e9-5e27-40d9-b708-f8efd4f6447e\") " pod="openstack/nova-api-0" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.278247 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw9zx\" (UniqueName: \"kubernetes.io/projected/e51cf2e9-5e27-40d9-b708-f8efd4f6447e-kube-api-access-mw9zx\") pod \"nova-api-0\" (UID: \"e51cf2e9-5e27-40d9-b708-f8efd4f6447e\") " pod="openstack/nova-api-0" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.348775 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.356518 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ba54c0d-9ca9-4b59-a97b-ecfb026b624f-config-data\") pod \"9ba54c0d-9ca9-4b59-a97b-ecfb026b624f\" (UID: \"9ba54c0d-9ca9-4b59-a97b-ecfb026b624f\") " Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.356783 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mp7j2\" (UniqueName: \"kubernetes.io/projected/9ba54c0d-9ca9-4b59-a97b-ecfb026b624f-kube-api-access-mp7j2\") pod \"9ba54c0d-9ca9-4b59-a97b-ecfb026b624f\" (UID: \"9ba54c0d-9ca9-4b59-a97b-ecfb026b624f\") " Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.357032 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ba54c0d-9ca9-4b59-a97b-ecfb026b624f-combined-ca-bundle\") pod \"9ba54c0d-9ca9-4b59-a97b-ecfb026b624f\" (UID: \"9ba54c0d-9ca9-4b59-a97b-ecfb026b624f\") " Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.362694 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ba54c0d-9ca9-4b59-a97b-ecfb026b624f-kube-api-access-mp7j2" (OuterVolumeSpecName: "kube-api-access-mp7j2") pod "9ba54c0d-9ca9-4b59-a97b-ecfb026b624f" (UID: "9ba54c0d-9ca9-4b59-a97b-ecfb026b624f"). InnerVolumeSpecName "kube-api-access-mp7j2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.388177 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.402832 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ba54c0d-9ca9-4b59-a97b-ecfb026b624f-config-data" (OuterVolumeSpecName: "config-data") pod "9ba54c0d-9ca9-4b59-a97b-ecfb026b624f" (UID: "9ba54c0d-9ca9-4b59-a97b-ecfb026b624f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.403674 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.405725 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ba54c0d-9ca9-4b59-a97b-ecfb026b624f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ba54c0d-9ca9-4b59-a97b-ecfb026b624f" (UID: "9ba54c0d-9ca9-4b59-a97b-ecfb026b624f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.459206 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ba54c0d-9ca9-4b59-a97b-ecfb026b624f-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.459253 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mp7j2\" (UniqueName: \"kubernetes.io/projected/9ba54c0d-9ca9-4b59-a97b-ecfb026b624f-kube-api-access-mp7j2\") on node \"crc\" DevicePath \"\"" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.459287 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ba54c0d-9ca9-4b59-a97b-ecfb026b624f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.588056 4750 generic.go:334] "Generic (PLEG): container finished" podID="9ba54c0d-9ca9-4b59-a97b-ecfb026b624f" containerID="a74d070dca3b5dafc830305fad25f00491233540cfe76bdd0cc6e038e76fed4f" exitCode=0 Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.588520 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9ba54c0d-9ca9-4b59-a97b-ecfb026b624f","Type":"ContainerDied","Data":"a74d070dca3b5dafc830305fad25f00491233540cfe76bdd0cc6e038e76fed4f"} Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.588587 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9ba54c0d-9ca9-4b59-a97b-ecfb026b624f","Type":"ContainerDied","Data":"211284acfb8d5feef1f67643de050181fad09d5873f352958d693658f3a759c7"} Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.588625 4750 scope.go:117] "RemoveContainer" containerID="a74d070dca3b5dafc830305fad25f00491233540cfe76bdd0cc6e038e76fed4f" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.588840 4750 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.643730 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.662612 4750 scope.go:117] "RemoveContainer" containerID="a74d070dca3b5dafc830305fad25f00491233540cfe76bdd0cc6e038e76fed4f" Oct 08 19:47:02 crc kubenswrapper[4750]: E1008 19:47:02.664686 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a74d070dca3b5dafc830305fad25f00491233540cfe76bdd0cc6e038e76fed4f\": container with ID starting with a74d070dca3b5dafc830305fad25f00491233540cfe76bdd0cc6e038e76fed4f not found: ID does not exist" containerID="a74d070dca3b5dafc830305fad25f00491233540cfe76bdd0cc6e038e76fed4f" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.664721 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a74d070dca3b5dafc830305fad25f00491233540cfe76bdd0cc6e038e76fed4f"} err="failed to get container status \"a74d070dca3b5dafc830305fad25f00491233540cfe76bdd0cc6e038e76fed4f\": rpc error: code = NotFound desc = could not find container \"a74d070dca3b5dafc830305fad25f00491233540cfe76bdd0cc6e038e76fed4f\": container with ID starting with a74d070dca3b5dafc830305fad25f00491233540cfe76bdd0cc6e038e76fed4f not found: ID does not exist" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.670896 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.680895 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 19:47:02 crc kubenswrapper[4750]: E1008 19:47:02.681870 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ba54c0d-9ca9-4b59-a97b-ecfb026b624f" containerName="nova-scheduler-scheduler" Oct 08 19:47:02 
crc kubenswrapper[4750]: I1008 19:47:02.681899 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ba54c0d-9ca9-4b59-a97b-ecfb026b624f" containerName="nova-scheduler-scheduler" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.682132 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ba54c0d-9ca9-4b59-a97b-ecfb026b624f" containerName="nova-scheduler-scheduler" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.683301 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.686019 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.694725 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.751225 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01131122-c4cf-4683-a2ad-18028e12e3cb" path="/var/lib/kubelet/pods/01131122-c4cf-4683-a2ad-18028e12e3cb/volumes" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.752678 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01da7f48-bcbb-45fc-8c00-249c99acb4e3" path="/var/lib/kubelet/pods/01da7f48-bcbb-45fc-8c00-249c99acb4e3/volumes" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.753406 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ba54c0d-9ca9-4b59-a97b-ecfb026b624f" path="/var/lib/kubelet/pods/9ba54c0d-9ca9-4b59-a97b-ecfb026b624f/volumes" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.768164 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bjj5\" (UniqueName: \"kubernetes.io/projected/131a8b9f-6bd7-4bbc-ba88-7947ec0fed82-kube-api-access-7bjj5\") pod \"nova-scheduler-0\" (UID: 
\"131a8b9f-6bd7-4bbc-ba88-7947ec0fed82\") " pod="openstack/nova-scheduler-0" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.768273 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131a8b9f-6bd7-4bbc-ba88-7947ec0fed82-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"131a8b9f-6bd7-4bbc-ba88-7947ec0fed82\") " pod="openstack/nova-scheduler-0" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.768337 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/131a8b9f-6bd7-4bbc-ba88-7947ec0fed82-config-data\") pod \"nova-scheduler-0\" (UID: \"131a8b9f-6bd7-4bbc-ba88-7947ec0fed82\") " pod="openstack/nova-scheduler-0" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.871587 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bjj5\" (UniqueName: \"kubernetes.io/projected/131a8b9f-6bd7-4bbc-ba88-7947ec0fed82-kube-api-access-7bjj5\") pod \"nova-scheduler-0\" (UID: \"131a8b9f-6bd7-4bbc-ba88-7947ec0fed82\") " pod="openstack/nova-scheduler-0" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.871681 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131a8b9f-6bd7-4bbc-ba88-7947ec0fed82-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"131a8b9f-6bd7-4bbc-ba88-7947ec0fed82\") " pod="openstack/nova-scheduler-0" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.871724 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/131a8b9f-6bd7-4bbc-ba88-7947ec0fed82-config-data\") pod \"nova-scheduler-0\" (UID: \"131a8b9f-6bd7-4bbc-ba88-7947ec0fed82\") " pod="openstack/nova-scheduler-0" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 
19:47:02.878267 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131a8b9f-6bd7-4bbc-ba88-7947ec0fed82-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"131a8b9f-6bd7-4bbc-ba88-7947ec0fed82\") " pod="openstack/nova-scheduler-0" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.878321 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/131a8b9f-6bd7-4bbc-ba88-7947ec0fed82-config-data\") pod \"nova-scheduler-0\" (UID: \"131a8b9f-6bd7-4bbc-ba88-7947ec0fed82\") " pod="openstack/nova-scheduler-0" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.896126 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bjj5\" (UniqueName: \"kubernetes.io/projected/131a8b9f-6bd7-4bbc-ba88-7947ec0fed82-kube-api-access-7bjj5\") pod \"nova-scheduler-0\" (UID: \"131a8b9f-6bd7-4bbc-ba88-7947ec0fed82\") " pod="openstack/nova-scheduler-0" Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.958708 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 19:47:02 crc kubenswrapper[4750]: I1008 19:47:02.967484 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 19:47:02 crc kubenswrapper[4750]: W1008 19:47:02.978276 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode51cf2e9_5e27_40d9_b708_f8efd4f6447e.slice/crio-e395ee5c75fb3c927c4e642b3ac8c326978ec31623771e785923fd6f82518b7d WatchSource:0}: Error finding container e395ee5c75fb3c927c4e642b3ac8c326978ec31623771e785923fd6f82518b7d: Status 404 returned error can't find the container with id e395ee5c75fb3c927c4e642b3ac8c326978ec31623771e785923fd6f82518b7d Oct 08 19:47:02 crc kubenswrapper[4750]: W1008 19:47:02.982339 4750 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b26f9c3_60a6_4a77_beef_6436bfcfeee7.slice/crio-f561a813331f508a21923c4b4024fe3e17b25a9e5a9473dad2cd494631b48dce WatchSource:0}: Error finding container f561a813331f508a21923c4b4024fe3e17b25a9e5a9473dad2cd494631b48dce: Status 404 returned error can't find the container with id f561a813331f508a21923c4b4024fe3e17b25a9e5a9473dad2cd494631b48dce Oct 08 19:47:03 crc kubenswrapper[4750]: I1008 19:47:03.004215 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 19:47:03 crc kubenswrapper[4750]: I1008 19:47:03.485014 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 19:47:03 crc kubenswrapper[4750]: W1008 19:47:03.499138 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod131a8b9f_6bd7_4bbc_ba88_7947ec0fed82.slice/crio-3d88aaa9069f76f38ef5c586f1bdbe34135c8f080387db5bc5222ad96510a851 WatchSource:0}: Error finding container 3d88aaa9069f76f38ef5c586f1bdbe34135c8f080387db5bc5222ad96510a851: Status 404 returned error can't find the container with id 3d88aaa9069f76f38ef5c586f1bdbe34135c8f080387db5bc5222ad96510a851 Oct 08 19:47:03 crc kubenswrapper[4750]: I1008 19:47:03.610951 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e51cf2e9-5e27-40d9-b708-f8efd4f6447e","Type":"ContainerStarted","Data":"024f06888bf9b7e21309520201ad1ff497fa95d13c42cf2b279e1846c9c9f1be"} Oct 08 19:47:03 crc kubenswrapper[4750]: I1008 19:47:03.611775 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e51cf2e9-5e27-40d9-b708-f8efd4f6447e","Type":"ContainerStarted","Data":"fd068ac7427a6fe7ef857ef184d0317fbc51eff0b5dccf026f0e15a5b6d93f56"} Oct 08 19:47:03 crc kubenswrapper[4750]: I1008 19:47:03.611818 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"e51cf2e9-5e27-40d9-b708-f8efd4f6447e","Type":"ContainerStarted","Data":"e395ee5c75fb3c927c4e642b3ac8c326978ec31623771e785923fd6f82518b7d"} Oct 08 19:47:03 crc kubenswrapper[4750]: I1008 19:47:03.616789 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2b26f9c3-60a6-4a77-beef-6436bfcfeee7","Type":"ContainerStarted","Data":"f2a3a9bf13471707dace7f6ba15d27874dcd6a0cfd1dfc9bac584ae342df6cfe"} Oct 08 19:47:03 crc kubenswrapper[4750]: I1008 19:47:03.616826 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2b26f9c3-60a6-4a77-beef-6436bfcfeee7","Type":"ContainerStarted","Data":"17db7b7d62c5405d816a7c6605095697f528b9aa5f53c481f52b6a83c0e97d16"} Oct 08 19:47:03 crc kubenswrapper[4750]: I1008 19:47:03.616838 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2b26f9c3-60a6-4a77-beef-6436bfcfeee7","Type":"ContainerStarted","Data":"f561a813331f508a21923c4b4024fe3e17b25a9e5a9473dad2cd494631b48dce"} Oct 08 19:47:03 crc kubenswrapper[4750]: I1008 19:47:03.619382 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"131a8b9f-6bd7-4bbc-ba88-7947ec0fed82","Type":"ContainerStarted","Data":"3d88aaa9069f76f38ef5c586f1bdbe34135c8f080387db5bc5222ad96510a851"} Oct 08 19:47:03 crc kubenswrapper[4750]: I1008 19:47:03.637220 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.637194536 podStartE2EDuration="2.637194536s" podCreationTimestamp="2025-10-08 19:47:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:47:03.637169336 +0000 UTC m=+5779.550140359" watchObservedRunningTime="2025-10-08 19:47:03.637194536 +0000 UTC m=+5779.550165549" Oct 08 19:47:03 crc kubenswrapper[4750]: I1008 
19:47:03.665490 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.665463879 podStartE2EDuration="2.665463879s" podCreationTimestamp="2025-10-08 19:47:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:47:03.660924836 +0000 UTC m=+5779.573895839" watchObservedRunningTime="2025-10-08 19:47:03.665463879 +0000 UTC m=+5779.578434892" Oct 08 19:47:04 crc kubenswrapper[4750]: I1008 19:47:04.639058 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"131a8b9f-6bd7-4bbc-ba88-7947ec0fed82","Type":"ContainerStarted","Data":"995c24e1ee0c39cf6c2eda91dd02c41c602b88b5e54d2b42a226c04af9f4a19e"} Oct 08 19:47:04 crc kubenswrapper[4750]: I1008 19:47:04.685699 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.685653459 podStartE2EDuration="2.685653459s" podCreationTimestamp="2025-10-08 19:47:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:47:04.657455717 +0000 UTC m=+5780.570426830" watchObservedRunningTime="2025-10-08 19:47:04.685653459 +0000 UTC m=+5780.598624552" Oct 08 19:47:07 crc kubenswrapper[4750]: I1008 19:47:07.405223 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 08 19:47:07 crc kubenswrapper[4750]: I1008 19:47:07.405638 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 08 19:47:08 crc kubenswrapper[4750]: I1008 19:47:08.004819 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 08 19:47:12 crc kubenswrapper[4750]: I1008 19:47:12.391436 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/nova-api-0" Oct 08 19:47:12 crc kubenswrapper[4750]: I1008 19:47:12.392366 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 08 19:47:12 crc kubenswrapper[4750]: I1008 19:47:12.405940 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 08 19:47:12 crc kubenswrapper[4750]: I1008 19:47:12.406012 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 08 19:47:13 crc kubenswrapper[4750]: I1008 19:47:13.004504 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 08 19:47:13 crc kubenswrapper[4750]: I1008 19:47:13.056783 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 08 19:47:13 crc kubenswrapper[4750]: I1008 19:47:13.439854 4750 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e51cf2e9-5e27-40d9-b708-f8efd4f6447e" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.76:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 08 19:47:13 crc kubenswrapper[4750]: I1008 19:47:13.481077 4750 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e51cf2e9-5e27-40d9-b708-f8efd4f6447e" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.76:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 08 19:47:13 crc kubenswrapper[4750]: I1008 19:47:13.521910 4750 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2b26f9c3-60a6-4a77-beef-6436bfcfeee7" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.77:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 08 
19:47:13 crc kubenswrapper[4750]: I1008 19:47:13.562863 4750 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2b26f9c3-60a6-4a77-beef-6436bfcfeee7" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.77:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 08 19:47:13 crc kubenswrapper[4750]: I1008 19:47:13.815178 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 08 19:47:22 crc kubenswrapper[4750]: I1008 19:47:22.404196 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 08 19:47:22 crc kubenswrapper[4750]: I1008 19:47:22.405133 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 08 19:47:22 crc kubenswrapper[4750]: I1008 19:47:22.412009 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 08 19:47:22 crc kubenswrapper[4750]: I1008 19:47:22.412078 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 08 19:47:22 crc kubenswrapper[4750]: I1008 19:47:22.412113 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 08 19:47:22 crc kubenswrapper[4750]: I1008 19:47:22.414535 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 08 19:47:22 crc kubenswrapper[4750]: I1008 19:47:22.415465 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 08 19:47:22 crc kubenswrapper[4750]: I1008 19:47:22.417656 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 08 19:47:22 crc kubenswrapper[4750]: I1008 19:47:22.891369 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-api-0" Oct 08 19:47:22 crc kubenswrapper[4750]: I1008 19:47:22.895790 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 08 19:47:23 crc kubenswrapper[4750]: I1008 19:47:23.107529 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-847586d7c9-bf5px"] Oct 08 19:47:23 crc kubenswrapper[4750]: I1008 19:47:23.109375 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847586d7c9-bf5px" Oct 08 19:47:23 crc kubenswrapper[4750]: I1008 19:47:23.129604 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847586d7c9-bf5px"] Oct 08 19:47:23 crc kubenswrapper[4750]: I1008 19:47:23.167845 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cd2ef1d-a118-41dc-a3df-40701656fce1-ovsdbserver-nb\") pod \"dnsmasq-dns-847586d7c9-bf5px\" (UID: \"4cd2ef1d-a118-41dc-a3df-40701656fce1\") " pod="openstack/dnsmasq-dns-847586d7c9-bf5px" Oct 08 19:47:23 crc kubenswrapper[4750]: I1008 19:47:23.168290 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cd2ef1d-a118-41dc-a3df-40701656fce1-config\") pod \"dnsmasq-dns-847586d7c9-bf5px\" (UID: \"4cd2ef1d-a118-41dc-a3df-40701656fce1\") " pod="openstack/dnsmasq-dns-847586d7c9-bf5px" Oct 08 19:47:23 crc kubenswrapper[4750]: I1008 19:47:23.168359 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cd2ef1d-a118-41dc-a3df-40701656fce1-ovsdbserver-sb\") pod \"dnsmasq-dns-847586d7c9-bf5px\" (UID: \"4cd2ef1d-a118-41dc-a3df-40701656fce1\") " pod="openstack/dnsmasq-dns-847586d7c9-bf5px" Oct 08 19:47:23 crc kubenswrapper[4750]: I1008 19:47:23.168382 4750 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbw25\" (UniqueName: \"kubernetes.io/projected/4cd2ef1d-a118-41dc-a3df-40701656fce1-kube-api-access-nbw25\") pod \"dnsmasq-dns-847586d7c9-bf5px\" (UID: \"4cd2ef1d-a118-41dc-a3df-40701656fce1\") " pod="openstack/dnsmasq-dns-847586d7c9-bf5px" Oct 08 19:47:23 crc kubenswrapper[4750]: I1008 19:47:23.168406 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cd2ef1d-a118-41dc-a3df-40701656fce1-dns-svc\") pod \"dnsmasq-dns-847586d7c9-bf5px\" (UID: \"4cd2ef1d-a118-41dc-a3df-40701656fce1\") " pod="openstack/dnsmasq-dns-847586d7c9-bf5px" Oct 08 19:47:23 crc kubenswrapper[4750]: I1008 19:47:23.271148 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cd2ef1d-a118-41dc-a3df-40701656fce1-ovsdbserver-sb\") pod \"dnsmasq-dns-847586d7c9-bf5px\" (UID: \"4cd2ef1d-a118-41dc-a3df-40701656fce1\") " pod="openstack/dnsmasq-dns-847586d7c9-bf5px" Oct 08 19:47:23 crc kubenswrapper[4750]: I1008 19:47:23.271213 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbw25\" (UniqueName: \"kubernetes.io/projected/4cd2ef1d-a118-41dc-a3df-40701656fce1-kube-api-access-nbw25\") pod \"dnsmasq-dns-847586d7c9-bf5px\" (UID: \"4cd2ef1d-a118-41dc-a3df-40701656fce1\") " pod="openstack/dnsmasq-dns-847586d7c9-bf5px" Oct 08 19:47:23 crc kubenswrapper[4750]: I1008 19:47:23.271244 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cd2ef1d-a118-41dc-a3df-40701656fce1-dns-svc\") pod \"dnsmasq-dns-847586d7c9-bf5px\" (UID: \"4cd2ef1d-a118-41dc-a3df-40701656fce1\") " pod="openstack/dnsmasq-dns-847586d7c9-bf5px" Oct 08 19:47:23 crc kubenswrapper[4750]: I1008 19:47:23.271303 4750 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cd2ef1d-a118-41dc-a3df-40701656fce1-ovsdbserver-nb\") pod \"dnsmasq-dns-847586d7c9-bf5px\" (UID: \"4cd2ef1d-a118-41dc-a3df-40701656fce1\") " pod="openstack/dnsmasq-dns-847586d7c9-bf5px" Oct 08 19:47:23 crc kubenswrapper[4750]: I1008 19:47:23.271393 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cd2ef1d-a118-41dc-a3df-40701656fce1-config\") pod \"dnsmasq-dns-847586d7c9-bf5px\" (UID: \"4cd2ef1d-a118-41dc-a3df-40701656fce1\") " pod="openstack/dnsmasq-dns-847586d7c9-bf5px" Oct 08 19:47:23 crc kubenswrapper[4750]: I1008 19:47:23.272484 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cd2ef1d-a118-41dc-a3df-40701656fce1-ovsdbserver-sb\") pod \"dnsmasq-dns-847586d7c9-bf5px\" (UID: \"4cd2ef1d-a118-41dc-a3df-40701656fce1\") " pod="openstack/dnsmasq-dns-847586d7c9-bf5px" Oct 08 19:47:23 crc kubenswrapper[4750]: I1008 19:47:23.272714 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cd2ef1d-a118-41dc-a3df-40701656fce1-config\") pod \"dnsmasq-dns-847586d7c9-bf5px\" (UID: \"4cd2ef1d-a118-41dc-a3df-40701656fce1\") " pod="openstack/dnsmasq-dns-847586d7c9-bf5px" Oct 08 19:47:23 crc kubenswrapper[4750]: I1008 19:47:23.274043 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cd2ef1d-a118-41dc-a3df-40701656fce1-ovsdbserver-nb\") pod \"dnsmasq-dns-847586d7c9-bf5px\" (UID: \"4cd2ef1d-a118-41dc-a3df-40701656fce1\") " pod="openstack/dnsmasq-dns-847586d7c9-bf5px" Oct 08 19:47:23 crc kubenswrapper[4750]: I1008 19:47:23.274264 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/4cd2ef1d-a118-41dc-a3df-40701656fce1-dns-svc\") pod \"dnsmasq-dns-847586d7c9-bf5px\" (UID: \"4cd2ef1d-a118-41dc-a3df-40701656fce1\") " pod="openstack/dnsmasq-dns-847586d7c9-bf5px" Oct 08 19:47:23 crc kubenswrapper[4750]: I1008 19:47:23.295746 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbw25\" (UniqueName: \"kubernetes.io/projected/4cd2ef1d-a118-41dc-a3df-40701656fce1-kube-api-access-nbw25\") pod \"dnsmasq-dns-847586d7c9-bf5px\" (UID: \"4cd2ef1d-a118-41dc-a3df-40701656fce1\") " pod="openstack/dnsmasq-dns-847586d7c9-bf5px" Oct 08 19:47:23 crc kubenswrapper[4750]: I1008 19:47:23.445721 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847586d7c9-bf5px" Oct 08 19:47:23 crc kubenswrapper[4750]: I1008 19:47:23.990939 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847586d7c9-bf5px"] Oct 08 19:47:24 crc kubenswrapper[4750]: I1008 19:47:24.950651 4750 generic.go:334] "Generic (PLEG): container finished" podID="4cd2ef1d-a118-41dc-a3df-40701656fce1" containerID="b4293cf124e3869fc0d4ad744d74f4fb6d0d211c48caf921fcad856b0cdc95e1" exitCode=0 Oct 08 19:47:24 crc kubenswrapper[4750]: I1008 19:47:24.950950 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847586d7c9-bf5px" event={"ID":"4cd2ef1d-a118-41dc-a3df-40701656fce1","Type":"ContainerDied","Data":"b4293cf124e3869fc0d4ad744d74f4fb6d0d211c48caf921fcad856b0cdc95e1"} Oct 08 19:47:24 crc kubenswrapper[4750]: I1008 19:47:24.951200 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847586d7c9-bf5px" event={"ID":"4cd2ef1d-a118-41dc-a3df-40701656fce1","Type":"ContainerStarted","Data":"799957562c6a684644ceaa691b22eefde6268ba84a08d8aba084adaabca64444"} Oct 08 19:47:25 crc kubenswrapper[4750]: I1008 19:47:25.965594 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847586d7c9-bf5px" 
event={"ID":"4cd2ef1d-a118-41dc-a3df-40701656fce1","Type":"ContainerStarted","Data":"b0cb2e1a762bf19dbfc5a739ed367c430fbf6ae8fa4dc03aa090ac33fbb7caad"} Oct 08 19:47:25 crc kubenswrapper[4750]: I1008 19:47:25.966419 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-847586d7c9-bf5px" Oct 08 19:47:25 crc kubenswrapper[4750]: I1008 19:47:25.990249 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-847586d7c9-bf5px" podStartSLOduration=2.99021911 podStartE2EDuration="2.99021911s" podCreationTimestamp="2025-10-08 19:47:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:47:25.986263061 +0000 UTC m=+5801.899234084" watchObservedRunningTime="2025-10-08 19:47:25.99021911 +0000 UTC m=+5801.903190163" Oct 08 19:47:27 crc kubenswrapper[4750]: I1008 19:47:27.606411 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-swf4w"] Oct 08 19:47:27 crc kubenswrapper[4750]: I1008 19:47:27.610393 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-swf4w" Oct 08 19:47:27 crc kubenswrapper[4750]: I1008 19:47:27.620467 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-swf4w"] Oct 08 19:47:27 crc kubenswrapper[4750]: I1008 19:47:27.690431 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28c20807-82d0-48f8-8919-cfb646a260e1-catalog-content\") pod \"community-operators-swf4w\" (UID: \"28c20807-82d0-48f8-8919-cfb646a260e1\") " pod="openshift-marketplace/community-operators-swf4w" Oct 08 19:47:27 crc kubenswrapper[4750]: I1008 19:47:27.690522 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pczg4\" (UniqueName: \"kubernetes.io/projected/28c20807-82d0-48f8-8919-cfb646a260e1-kube-api-access-pczg4\") pod \"community-operators-swf4w\" (UID: \"28c20807-82d0-48f8-8919-cfb646a260e1\") " pod="openshift-marketplace/community-operators-swf4w" Oct 08 19:47:27 crc kubenswrapper[4750]: I1008 19:47:27.690567 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28c20807-82d0-48f8-8919-cfb646a260e1-utilities\") pod \"community-operators-swf4w\" (UID: \"28c20807-82d0-48f8-8919-cfb646a260e1\") " pod="openshift-marketplace/community-operators-swf4w" Oct 08 19:47:27 crc kubenswrapper[4750]: I1008 19:47:27.793941 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28c20807-82d0-48f8-8919-cfb646a260e1-catalog-content\") pod \"community-operators-swf4w\" (UID: \"28c20807-82d0-48f8-8919-cfb646a260e1\") " pod="openshift-marketplace/community-operators-swf4w" Oct 08 19:47:27 crc kubenswrapper[4750]: I1008 19:47:27.794497 4750 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pczg4\" (UniqueName: \"kubernetes.io/projected/28c20807-82d0-48f8-8919-cfb646a260e1-kube-api-access-pczg4\") pod \"community-operators-swf4w\" (UID: \"28c20807-82d0-48f8-8919-cfb646a260e1\") " pod="openshift-marketplace/community-operators-swf4w" Oct 08 19:47:27 crc kubenswrapper[4750]: I1008 19:47:27.794600 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28c20807-82d0-48f8-8919-cfb646a260e1-catalog-content\") pod \"community-operators-swf4w\" (UID: \"28c20807-82d0-48f8-8919-cfb646a260e1\") " pod="openshift-marketplace/community-operators-swf4w" Oct 08 19:47:27 crc kubenswrapper[4750]: I1008 19:47:27.794749 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28c20807-82d0-48f8-8919-cfb646a260e1-utilities\") pod \"community-operators-swf4w\" (UID: \"28c20807-82d0-48f8-8919-cfb646a260e1\") " pod="openshift-marketplace/community-operators-swf4w" Oct 08 19:47:27 crc kubenswrapper[4750]: I1008 19:47:27.795292 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28c20807-82d0-48f8-8919-cfb646a260e1-utilities\") pod \"community-operators-swf4w\" (UID: \"28c20807-82d0-48f8-8919-cfb646a260e1\") " pod="openshift-marketplace/community-operators-swf4w" Oct 08 19:47:27 crc kubenswrapper[4750]: I1008 19:47:27.816363 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pczg4\" (UniqueName: \"kubernetes.io/projected/28c20807-82d0-48f8-8919-cfb646a260e1-kube-api-access-pczg4\") pod \"community-operators-swf4w\" (UID: \"28c20807-82d0-48f8-8919-cfb646a260e1\") " pod="openshift-marketplace/community-operators-swf4w" Oct 08 19:47:27 crc kubenswrapper[4750]: I1008 19:47:27.934286 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-swf4w" Oct 08 19:47:28 crc kubenswrapper[4750]: I1008 19:47:28.521403 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-swf4w"] Oct 08 19:47:28 crc kubenswrapper[4750]: W1008 19:47:28.528278 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28c20807_82d0_48f8_8919_cfb646a260e1.slice/crio-3da98f4a12d88dc85948f329fd31ac6dc5cea203745051d3b94da16eda20608b WatchSource:0}: Error finding container 3da98f4a12d88dc85948f329fd31ac6dc5cea203745051d3b94da16eda20608b: Status 404 returned error can't find the container with id 3da98f4a12d88dc85948f329fd31ac6dc5cea203745051d3b94da16eda20608b Oct 08 19:47:29 crc kubenswrapper[4750]: I1008 19:47:29.009568 4750 generic.go:334] "Generic (PLEG): container finished" podID="28c20807-82d0-48f8-8919-cfb646a260e1" containerID="1cb3d19e92ba3dc152e1b31d90eaee8ef0e2d52ad90a5d47bcde58244b329636" exitCode=0 Oct 08 19:47:29 crc kubenswrapper[4750]: I1008 19:47:29.009687 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-swf4w" event={"ID":"28c20807-82d0-48f8-8919-cfb646a260e1","Type":"ContainerDied","Data":"1cb3d19e92ba3dc152e1b31d90eaee8ef0e2d52ad90a5d47bcde58244b329636"} Oct 08 19:47:29 crc kubenswrapper[4750]: I1008 19:47:29.010002 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-swf4w" event={"ID":"28c20807-82d0-48f8-8919-cfb646a260e1","Type":"ContainerStarted","Data":"3da98f4a12d88dc85948f329fd31ac6dc5cea203745051d3b94da16eda20608b"} Oct 08 19:47:29 crc kubenswrapper[4750]: I1008 19:47:29.707238 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 19:47:29 crc kubenswrapper[4750]: I1008 19:47:29.707304 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 19:47:31 crc kubenswrapper[4750]: I1008 19:47:31.045857 4750 generic.go:334] "Generic (PLEG): container finished" podID="28c20807-82d0-48f8-8919-cfb646a260e1" containerID="ffcf7afa2180be9e93eed8ff83139c498a9ea9a80a35d6e0bf1d5a8fa47153fd" exitCode=0 Oct 08 19:47:31 crc kubenswrapper[4750]: I1008 19:47:31.046365 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-swf4w" event={"ID":"28c20807-82d0-48f8-8919-cfb646a260e1","Type":"ContainerDied","Data":"ffcf7afa2180be9e93eed8ff83139c498a9ea9a80a35d6e0bf1d5a8fa47153fd"} Oct 08 19:47:32 crc kubenswrapper[4750]: I1008 19:47:32.062658 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-swf4w" event={"ID":"28c20807-82d0-48f8-8919-cfb646a260e1","Type":"ContainerStarted","Data":"d7465beee16bd9bdd58dd892c1f0bda244310b9d2f39c8bdd89a0c2d8220694e"} Oct 08 19:47:32 crc kubenswrapper[4750]: I1008 19:47:32.086454 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-swf4w" podStartSLOduration=2.641116727 podStartE2EDuration="5.086427595s" podCreationTimestamp="2025-10-08 19:47:27 +0000 UTC" firstStartedPulling="2025-10-08 19:47:29.012849043 +0000 UTC m=+5804.925820056" lastFinishedPulling="2025-10-08 19:47:31.458159911 +0000 UTC m=+5807.371130924" observedRunningTime="2025-10-08 19:47:32.085519362 +0000 UTC m=+5807.998490385" watchObservedRunningTime="2025-10-08 19:47:32.086427595 +0000 UTC m=+5807.999398608" Oct 08 
19:47:33 crc kubenswrapper[4750]: I1008 19:47:33.447799 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-847586d7c9-bf5px" Oct 08 19:47:33 crc kubenswrapper[4750]: I1008 19:47:33.526644 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-864fd79b75-qq86v"] Oct 08 19:47:33 crc kubenswrapper[4750]: I1008 19:47:33.526958 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-864fd79b75-qq86v" podUID="11f2aae2-99f9-43da-b706-e000f1414558" containerName="dnsmasq-dns" containerID="cri-o://67d87465196007bd22f34da4339127e60e1abd2ec687ebf984f89098e900d038" gracePeriod=10 Oct 08 19:47:33 crc kubenswrapper[4750]: I1008 19:47:33.982341 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zpfw6"] Oct 08 19:47:33 crc kubenswrapper[4750]: I1008 19:47:33.987087 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zpfw6" Oct 08 19:47:34 crc kubenswrapper[4750]: I1008 19:47:34.009274 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zpfw6"] Oct 08 19:47:34 crc kubenswrapper[4750]: I1008 19:47:34.057766 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbe99043-6818-4d72-af07-c8e1c379d505-utilities\") pod \"redhat-operators-zpfw6\" (UID: \"bbe99043-6818-4d72-af07-c8e1c379d505\") " pod="openshift-marketplace/redhat-operators-zpfw6" Oct 08 19:47:34 crc kubenswrapper[4750]: I1008 19:47:34.057817 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbe99043-6818-4d72-af07-c8e1c379d505-catalog-content\") pod \"redhat-operators-zpfw6\" (UID: \"bbe99043-6818-4d72-af07-c8e1c379d505\") " 
pod="openshift-marketplace/redhat-operators-zpfw6" Oct 08 19:47:34 crc kubenswrapper[4750]: I1008 19:47:34.057881 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rdbq\" (UniqueName: \"kubernetes.io/projected/bbe99043-6818-4d72-af07-c8e1c379d505-kube-api-access-9rdbq\") pod \"redhat-operators-zpfw6\" (UID: \"bbe99043-6818-4d72-af07-c8e1c379d505\") " pod="openshift-marketplace/redhat-operators-zpfw6" Oct 08 19:47:34 crc kubenswrapper[4750]: I1008 19:47:34.111320 4750 generic.go:334] "Generic (PLEG): container finished" podID="11f2aae2-99f9-43da-b706-e000f1414558" containerID="67d87465196007bd22f34da4339127e60e1abd2ec687ebf984f89098e900d038" exitCode=0 Oct 08 19:47:34 crc kubenswrapper[4750]: I1008 19:47:34.111372 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864fd79b75-qq86v" event={"ID":"11f2aae2-99f9-43da-b706-e000f1414558","Type":"ContainerDied","Data":"67d87465196007bd22f34da4339127e60e1abd2ec687ebf984f89098e900d038"} Oct 08 19:47:34 crc kubenswrapper[4750]: I1008 19:47:34.111404 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864fd79b75-qq86v" event={"ID":"11f2aae2-99f9-43da-b706-e000f1414558","Type":"ContainerDied","Data":"eb2150661a24702de7e2c7a20de2fc1c6d18be853e17a514797082508ea2a286"} Oct 08 19:47:34 crc kubenswrapper[4750]: I1008 19:47:34.111422 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb2150661a24702de7e2c7a20de2fc1c6d18be853e17a514797082508ea2a286" Oct 08 19:47:34 crc kubenswrapper[4750]: I1008 19:47:34.158008 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-864fd79b75-qq86v" Oct 08 19:47:34 crc kubenswrapper[4750]: I1008 19:47:34.159210 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbe99043-6818-4d72-af07-c8e1c379d505-utilities\") pod \"redhat-operators-zpfw6\" (UID: \"bbe99043-6818-4d72-af07-c8e1c379d505\") " pod="openshift-marketplace/redhat-operators-zpfw6" Oct 08 19:47:34 crc kubenswrapper[4750]: I1008 19:47:34.159249 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbe99043-6818-4d72-af07-c8e1c379d505-catalog-content\") pod \"redhat-operators-zpfw6\" (UID: \"bbe99043-6818-4d72-af07-c8e1c379d505\") " pod="openshift-marketplace/redhat-operators-zpfw6" Oct 08 19:47:34 crc kubenswrapper[4750]: I1008 19:47:34.159313 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rdbq\" (UniqueName: \"kubernetes.io/projected/bbe99043-6818-4d72-af07-c8e1c379d505-kube-api-access-9rdbq\") pod \"redhat-operators-zpfw6\" (UID: \"bbe99043-6818-4d72-af07-c8e1c379d505\") " pod="openshift-marketplace/redhat-operators-zpfw6" Oct 08 19:47:34 crc kubenswrapper[4750]: I1008 19:47:34.160024 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbe99043-6818-4d72-af07-c8e1c379d505-utilities\") pod \"redhat-operators-zpfw6\" (UID: \"bbe99043-6818-4d72-af07-c8e1c379d505\") " pod="openshift-marketplace/redhat-operators-zpfw6" Oct 08 19:47:34 crc kubenswrapper[4750]: I1008 19:47:34.160082 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbe99043-6818-4d72-af07-c8e1c379d505-catalog-content\") pod \"redhat-operators-zpfw6\" (UID: \"bbe99043-6818-4d72-af07-c8e1c379d505\") " pod="openshift-marketplace/redhat-operators-zpfw6" 
Oct 08 19:47:34 crc kubenswrapper[4750]: I1008 19:47:34.180429 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rdbq\" (UniqueName: \"kubernetes.io/projected/bbe99043-6818-4d72-af07-c8e1c379d505-kube-api-access-9rdbq\") pod \"redhat-operators-zpfw6\" (UID: \"bbe99043-6818-4d72-af07-c8e1c379d505\") " pod="openshift-marketplace/redhat-operators-zpfw6" Oct 08 19:47:34 crc kubenswrapper[4750]: I1008 19:47:34.261009 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11f2aae2-99f9-43da-b706-e000f1414558-ovsdbserver-nb\") pod \"11f2aae2-99f9-43da-b706-e000f1414558\" (UID: \"11f2aae2-99f9-43da-b706-e000f1414558\") " Oct 08 19:47:34 crc kubenswrapper[4750]: I1008 19:47:34.261527 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11f2aae2-99f9-43da-b706-e000f1414558-config\") pod \"11f2aae2-99f9-43da-b706-e000f1414558\" (UID: \"11f2aae2-99f9-43da-b706-e000f1414558\") " Oct 08 19:47:34 crc kubenswrapper[4750]: I1008 19:47:34.261581 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2zjc\" (UniqueName: \"kubernetes.io/projected/11f2aae2-99f9-43da-b706-e000f1414558-kube-api-access-q2zjc\") pod \"11f2aae2-99f9-43da-b706-e000f1414558\" (UID: \"11f2aae2-99f9-43da-b706-e000f1414558\") " Oct 08 19:47:34 crc kubenswrapper[4750]: I1008 19:47:34.261676 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11f2aae2-99f9-43da-b706-e000f1414558-dns-svc\") pod \"11f2aae2-99f9-43da-b706-e000f1414558\" (UID: \"11f2aae2-99f9-43da-b706-e000f1414558\") " Oct 08 19:47:34 crc kubenswrapper[4750]: I1008 19:47:34.261762 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/11f2aae2-99f9-43da-b706-e000f1414558-ovsdbserver-sb\") pod \"11f2aae2-99f9-43da-b706-e000f1414558\" (UID: \"11f2aae2-99f9-43da-b706-e000f1414558\") " Oct 08 19:47:34 crc kubenswrapper[4750]: I1008 19:47:34.270306 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11f2aae2-99f9-43da-b706-e000f1414558-kube-api-access-q2zjc" (OuterVolumeSpecName: "kube-api-access-q2zjc") pod "11f2aae2-99f9-43da-b706-e000f1414558" (UID: "11f2aae2-99f9-43da-b706-e000f1414558"). InnerVolumeSpecName "kube-api-access-q2zjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:47:34 crc kubenswrapper[4750]: I1008 19:47:34.312541 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11f2aae2-99f9-43da-b706-e000f1414558-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "11f2aae2-99f9-43da-b706-e000f1414558" (UID: "11f2aae2-99f9-43da-b706-e000f1414558"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:47:34 crc kubenswrapper[4750]: I1008 19:47:34.318764 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11f2aae2-99f9-43da-b706-e000f1414558-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "11f2aae2-99f9-43da-b706-e000f1414558" (UID: "11f2aae2-99f9-43da-b706-e000f1414558"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:47:34 crc kubenswrapper[4750]: I1008 19:47:34.318838 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11f2aae2-99f9-43da-b706-e000f1414558-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "11f2aae2-99f9-43da-b706-e000f1414558" (UID: "11f2aae2-99f9-43da-b706-e000f1414558"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:47:34 crc kubenswrapper[4750]: I1008 19:47:34.323382 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11f2aae2-99f9-43da-b706-e000f1414558-config" (OuterVolumeSpecName: "config") pod "11f2aae2-99f9-43da-b706-e000f1414558" (UID: "11f2aae2-99f9-43da-b706-e000f1414558"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:47:34 crc kubenswrapper[4750]: I1008 19:47:34.330741 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zpfw6" Oct 08 19:47:34 crc kubenswrapper[4750]: I1008 19:47:34.364097 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11f2aae2-99f9-43da-b706-e000f1414558-config\") on node \"crc\" DevicePath \"\"" Oct 08 19:47:34 crc kubenswrapper[4750]: I1008 19:47:34.364143 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2zjc\" (UniqueName: \"kubernetes.io/projected/11f2aae2-99f9-43da-b706-e000f1414558-kube-api-access-q2zjc\") on node \"crc\" DevicePath \"\"" Oct 08 19:47:34 crc kubenswrapper[4750]: I1008 19:47:34.364157 4750 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11f2aae2-99f9-43da-b706-e000f1414558-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 19:47:34 crc kubenswrapper[4750]: I1008 19:47:34.364167 4750 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11f2aae2-99f9-43da-b706-e000f1414558-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 19:47:34 crc kubenswrapper[4750]: I1008 19:47:34.364178 4750 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11f2aae2-99f9-43da-b706-e000f1414558-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 
19:47:34 crc kubenswrapper[4750]: I1008 19:47:34.888437 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zpfw6"] Oct 08 19:47:35 crc kubenswrapper[4750]: I1008 19:47:35.128016 4750 generic.go:334] "Generic (PLEG): container finished" podID="bbe99043-6818-4d72-af07-c8e1c379d505" containerID="8de7b593e3d792162edbc96024aa28b96dbb9ee9cb425784e7078e0571910c52" exitCode=0 Oct 08 19:47:35 crc kubenswrapper[4750]: I1008 19:47:35.128086 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zpfw6" event={"ID":"bbe99043-6818-4d72-af07-c8e1c379d505","Type":"ContainerDied","Data":"8de7b593e3d792162edbc96024aa28b96dbb9ee9cb425784e7078e0571910c52"} Oct 08 19:47:35 crc kubenswrapper[4750]: I1008 19:47:35.128133 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-864fd79b75-qq86v" Oct 08 19:47:35 crc kubenswrapper[4750]: I1008 19:47:35.128151 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zpfw6" event={"ID":"bbe99043-6818-4d72-af07-c8e1c379d505","Type":"ContainerStarted","Data":"1ee42dc247e539b51082b2f4db4b9b2a973a6f94ba6fc5d04832576204c9c2f5"} Oct 08 19:47:35 crc kubenswrapper[4750]: I1008 19:47:35.189323 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-864fd79b75-qq86v"] Oct 08 19:47:35 crc kubenswrapper[4750]: I1008 19:47:35.197453 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-864fd79b75-qq86v"] Oct 08 19:47:36 crc kubenswrapper[4750]: I1008 19:47:36.755761 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11f2aae2-99f9-43da-b706-e000f1414558" path="/var/lib/kubelet/pods/11f2aae2-99f9-43da-b706-e000f1414558/volumes" Oct 08 19:47:37 crc kubenswrapper[4750]: I1008 19:47:37.154364 4750 generic.go:334] "Generic (PLEG): container finished" podID="bbe99043-6818-4d72-af07-c8e1c379d505" 
containerID="36b19e2bed68ca2a167ee5df1f3182e27b05bd7fee8823879869514d6dc684be" exitCode=0 Oct 08 19:47:37 crc kubenswrapper[4750]: I1008 19:47:37.154427 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zpfw6" event={"ID":"bbe99043-6818-4d72-af07-c8e1c379d505","Type":"ContainerDied","Data":"36b19e2bed68ca2a167ee5df1f3182e27b05bd7fee8823879869514d6dc684be"} Oct 08 19:47:37 crc kubenswrapper[4750]: I1008 19:47:37.934786 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-swf4w" Oct 08 19:47:37 crc kubenswrapper[4750]: I1008 19:47:37.934870 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-swf4w" Oct 08 19:47:37 crc kubenswrapper[4750]: I1008 19:47:37.999470 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-swf4w" Oct 08 19:47:38 crc kubenswrapper[4750]: I1008 19:47:38.230593 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-swf4w" Oct 08 19:47:39 crc kubenswrapper[4750]: I1008 19:47:39.165715 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-swf4w"] Oct 08 19:47:39 crc kubenswrapper[4750]: I1008 19:47:39.188057 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zpfw6" event={"ID":"bbe99043-6818-4d72-af07-c8e1c379d505","Type":"ContainerStarted","Data":"c174b00ab960afb24e175cce4a234e9ee46aaac92d2ba56b69019b0fd528a624"} Oct 08 19:47:39 crc kubenswrapper[4750]: I1008 19:47:39.221511 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zpfw6" podStartSLOduration=2.862874535 podStartE2EDuration="6.221479883s" podCreationTimestamp="2025-10-08 19:47:33 +0000 UTC" 
firstStartedPulling="2025-10-08 19:47:35.130304717 +0000 UTC m=+5811.043275740" lastFinishedPulling="2025-10-08 19:47:38.488910075 +0000 UTC m=+5814.401881088" observedRunningTime="2025-10-08 19:47:39.21859521 +0000 UTC m=+5815.131566263" watchObservedRunningTime="2025-10-08 19:47:39.221479883 +0000 UTC m=+5815.134450926" Oct 08 19:47:40 crc kubenswrapper[4750]: I1008 19:47:40.198392 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-swf4w" podUID="28c20807-82d0-48f8-8919-cfb646a260e1" containerName="registry-server" containerID="cri-o://d7465beee16bd9bdd58dd892c1f0bda244310b9d2f39c8bdd89a0c2d8220694e" gracePeriod=2 Oct 08 19:47:40 crc kubenswrapper[4750]: I1008 19:47:40.652135 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-6dlz8"] Oct 08 19:47:40 crc kubenswrapper[4750]: E1008 19:47:40.652907 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11f2aae2-99f9-43da-b706-e000f1414558" containerName="dnsmasq-dns" Oct 08 19:47:40 crc kubenswrapper[4750]: I1008 19:47:40.652927 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="11f2aae2-99f9-43da-b706-e000f1414558" containerName="dnsmasq-dns" Oct 08 19:47:40 crc kubenswrapper[4750]: E1008 19:47:40.652947 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11f2aae2-99f9-43da-b706-e000f1414558" containerName="init" Oct 08 19:47:40 crc kubenswrapper[4750]: I1008 19:47:40.652955 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="11f2aae2-99f9-43da-b706-e000f1414558" containerName="init" Oct 08 19:47:40 crc kubenswrapper[4750]: I1008 19:47:40.653156 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="11f2aae2-99f9-43da-b706-e000f1414558" containerName="dnsmasq-dns" Oct 08 19:47:40 crc kubenswrapper[4750]: I1008 19:47:40.665225 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-6dlz8" Oct 08 19:47:40 crc kubenswrapper[4750]: I1008 19:47:40.692410 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-6dlz8"] Oct 08 19:47:40 crc kubenswrapper[4750]: I1008 19:47:40.723213 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j884w\" (UniqueName: \"kubernetes.io/projected/e30d5ba5-d856-47b4-8b0d-aeb153eee24d-kube-api-access-j884w\") pod \"cinder-db-create-6dlz8\" (UID: \"e30d5ba5-d856-47b4-8b0d-aeb153eee24d\") " pod="openstack/cinder-db-create-6dlz8" Oct 08 19:47:40 crc kubenswrapper[4750]: I1008 19:47:40.825929 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j884w\" (UniqueName: \"kubernetes.io/projected/e30d5ba5-d856-47b4-8b0d-aeb153eee24d-kube-api-access-j884w\") pod \"cinder-db-create-6dlz8\" (UID: \"e30d5ba5-d856-47b4-8b0d-aeb153eee24d\") " pod="openstack/cinder-db-create-6dlz8" Oct 08 19:47:40 crc kubenswrapper[4750]: I1008 19:47:40.839331 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-swf4w" Oct 08 19:47:40 crc kubenswrapper[4750]: I1008 19:47:40.875815 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j884w\" (UniqueName: \"kubernetes.io/projected/e30d5ba5-d856-47b4-8b0d-aeb153eee24d-kube-api-access-j884w\") pod \"cinder-db-create-6dlz8\" (UID: \"e30d5ba5-d856-47b4-8b0d-aeb153eee24d\") " pod="openstack/cinder-db-create-6dlz8" Oct 08 19:47:41 crc kubenswrapper[4750]: I1008 19:47:41.029426 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pczg4\" (UniqueName: \"kubernetes.io/projected/28c20807-82d0-48f8-8919-cfb646a260e1-kube-api-access-pczg4\") pod \"28c20807-82d0-48f8-8919-cfb646a260e1\" (UID: \"28c20807-82d0-48f8-8919-cfb646a260e1\") " Oct 08 19:47:41 crc kubenswrapper[4750]: I1008 19:47:41.029584 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28c20807-82d0-48f8-8919-cfb646a260e1-catalog-content\") pod \"28c20807-82d0-48f8-8919-cfb646a260e1\" (UID: \"28c20807-82d0-48f8-8919-cfb646a260e1\") " Oct 08 19:47:41 crc kubenswrapper[4750]: I1008 19:47:41.029672 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28c20807-82d0-48f8-8919-cfb646a260e1-utilities\") pod \"28c20807-82d0-48f8-8919-cfb646a260e1\" (UID: \"28c20807-82d0-48f8-8919-cfb646a260e1\") " Oct 08 19:47:41 crc kubenswrapper[4750]: I1008 19:47:41.033621 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28c20807-82d0-48f8-8919-cfb646a260e1-utilities" (OuterVolumeSpecName: "utilities") pod "28c20807-82d0-48f8-8919-cfb646a260e1" (UID: "28c20807-82d0-48f8-8919-cfb646a260e1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:47:41 crc kubenswrapper[4750]: I1008 19:47:41.034142 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28c20807-82d0-48f8-8919-cfb646a260e1-kube-api-access-pczg4" (OuterVolumeSpecName: "kube-api-access-pczg4") pod "28c20807-82d0-48f8-8919-cfb646a260e1" (UID: "28c20807-82d0-48f8-8919-cfb646a260e1"). InnerVolumeSpecName "kube-api-access-pczg4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:47:41 crc kubenswrapper[4750]: I1008 19:47:41.083527 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28c20807-82d0-48f8-8919-cfb646a260e1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "28c20807-82d0-48f8-8919-cfb646a260e1" (UID: "28c20807-82d0-48f8-8919-cfb646a260e1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:47:41 crc kubenswrapper[4750]: I1008 19:47:41.132516 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28c20807-82d0-48f8-8919-cfb646a260e1-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 19:47:41 crc kubenswrapper[4750]: I1008 19:47:41.132572 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pczg4\" (UniqueName: \"kubernetes.io/projected/28c20807-82d0-48f8-8919-cfb646a260e1-kube-api-access-pczg4\") on node \"crc\" DevicePath \"\"" Oct 08 19:47:41 crc kubenswrapper[4750]: I1008 19:47:41.132586 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28c20807-82d0-48f8-8919-cfb646a260e1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 19:47:41 crc kubenswrapper[4750]: I1008 19:47:41.132718 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-6dlz8" Oct 08 19:47:41 crc kubenswrapper[4750]: I1008 19:47:41.213280 4750 generic.go:334] "Generic (PLEG): container finished" podID="28c20807-82d0-48f8-8919-cfb646a260e1" containerID="d7465beee16bd9bdd58dd892c1f0bda244310b9d2f39c8bdd89a0c2d8220694e" exitCode=0 Oct 08 19:47:41 crc kubenswrapper[4750]: I1008 19:47:41.213335 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-swf4w" event={"ID":"28c20807-82d0-48f8-8919-cfb646a260e1","Type":"ContainerDied","Data":"d7465beee16bd9bdd58dd892c1f0bda244310b9d2f39c8bdd89a0c2d8220694e"} Oct 08 19:47:41 crc kubenswrapper[4750]: I1008 19:47:41.213371 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-swf4w" event={"ID":"28c20807-82d0-48f8-8919-cfb646a260e1","Type":"ContainerDied","Data":"3da98f4a12d88dc85948f329fd31ac6dc5cea203745051d3b94da16eda20608b"} Oct 08 19:47:41 crc kubenswrapper[4750]: I1008 19:47:41.213392 4750 scope.go:117] "RemoveContainer" containerID="d7465beee16bd9bdd58dd892c1f0bda244310b9d2f39c8bdd89a0c2d8220694e" Oct 08 19:47:41 crc kubenswrapper[4750]: I1008 19:47:41.213407 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-swf4w" Oct 08 19:47:41 crc kubenswrapper[4750]: I1008 19:47:41.251011 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-swf4w"] Oct 08 19:47:41 crc kubenswrapper[4750]: I1008 19:47:41.259937 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-swf4w"] Oct 08 19:47:41 crc kubenswrapper[4750]: I1008 19:47:41.266012 4750 scope.go:117] "RemoveContainer" containerID="ffcf7afa2180be9e93eed8ff83139c498a9ea9a80a35d6e0bf1d5a8fa47153fd" Oct 08 19:47:41 crc kubenswrapper[4750]: I1008 19:47:41.310616 4750 scope.go:117] "RemoveContainer" containerID="1cb3d19e92ba3dc152e1b31d90eaee8ef0e2d52ad90a5d47bcde58244b329636" Oct 08 19:47:41 crc kubenswrapper[4750]: I1008 19:47:41.443367 4750 scope.go:117] "RemoveContainer" containerID="d7465beee16bd9bdd58dd892c1f0bda244310b9d2f39c8bdd89a0c2d8220694e" Oct 08 19:47:41 crc kubenswrapper[4750]: E1008 19:47:41.444021 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7465beee16bd9bdd58dd892c1f0bda244310b9d2f39c8bdd89a0c2d8220694e\": container with ID starting with d7465beee16bd9bdd58dd892c1f0bda244310b9d2f39c8bdd89a0c2d8220694e not found: ID does not exist" containerID="d7465beee16bd9bdd58dd892c1f0bda244310b9d2f39c8bdd89a0c2d8220694e" Oct 08 19:47:41 crc kubenswrapper[4750]: I1008 19:47:41.444061 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7465beee16bd9bdd58dd892c1f0bda244310b9d2f39c8bdd89a0c2d8220694e"} err="failed to get container status \"d7465beee16bd9bdd58dd892c1f0bda244310b9d2f39c8bdd89a0c2d8220694e\": rpc error: code = NotFound desc = could not find container \"d7465beee16bd9bdd58dd892c1f0bda244310b9d2f39c8bdd89a0c2d8220694e\": container with ID starting with d7465beee16bd9bdd58dd892c1f0bda244310b9d2f39c8bdd89a0c2d8220694e not 
found: ID does not exist" Oct 08 19:47:41 crc kubenswrapper[4750]: I1008 19:47:41.444091 4750 scope.go:117] "RemoveContainer" containerID="ffcf7afa2180be9e93eed8ff83139c498a9ea9a80a35d6e0bf1d5a8fa47153fd" Oct 08 19:47:41 crc kubenswrapper[4750]: E1008 19:47:41.444328 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffcf7afa2180be9e93eed8ff83139c498a9ea9a80a35d6e0bf1d5a8fa47153fd\": container with ID starting with ffcf7afa2180be9e93eed8ff83139c498a9ea9a80a35d6e0bf1d5a8fa47153fd not found: ID does not exist" containerID="ffcf7afa2180be9e93eed8ff83139c498a9ea9a80a35d6e0bf1d5a8fa47153fd" Oct 08 19:47:41 crc kubenswrapper[4750]: I1008 19:47:41.444355 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffcf7afa2180be9e93eed8ff83139c498a9ea9a80a35d6e0bf1d5a8fa47153fd"} err="failed to get container status \"ffcf7afa2180be9e93eed8ff83139c498a9ea9a80a35d6e0bf1d5a8fa47153fd\": rpc error: code = NotFound desc = could not find container \"ffcf7afa2180be9e93eed8ff83139c498a9ea9a80a35d6e0bf1d5a8fa47153fd\": container with ID starting with ffcf7afa2180be9e93eed8ff83139c498a9ea9a80a35d6e0bf1d5a8fa47153fd not found: ID does not exist" Oct 08 19:47:41 crc kubenswrapper[4750]: I1008 19:47:41.444375 4750 scope.go:117] "RemoveContainer" containerID="1cb3d19e92ba3dc152e1b31d90eaee8ef0e2d52ad90a5d47bcde58244b329636" Oct 08 19:47:41 crc kubenswrapper[4750]: E1008 19:47:41.445717 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cb3d19e92ba3dc152e1b31d90eaee8ef0e2d52ad90a5d47bcde58244b329636\": container with ID starting with 1cb3d19e92ba3dc152e1b31d90eaee8ef0e2d52ad90a5d47bcde58244b329636 not found: ID does not exist" containerID="1cb3d19e92ba3dc152e1b31d90eaee8ef0e2d52ad90a5d47bcde58244b329636" Oct 08 19:47:41 crc kubenswrapper[4750]: I1008 19:47:41.446952 4750 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cb3d19e92ba3dc152e1b31d90eaee8ef0e2d52ad90a5d47bcde58244b329636"} err="failed to get container status \"1cb3d19e92ba3dc152e1b31d90eaee8ef0e2d52ad90a5d47bcde58244b329636\": rpc error: code = NotFound desc = could not find container \"1cb3d19e92ba3dc152e1b31d90eaee8ef0e2d52ad90a5d47bcde58244b329636\": container with ID starting with 1cb3d19e92ba3dc152e1b31d90eaee8ef0e2d52ad90a5d47bcde58244b329636 not found: ID does not exist" Oct 08 19:47:41 crc kubenswrapper[4750]: I1008 19:47:41.507473 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-6dlz8"] Oct 08 19:47:42 crc kubenswrapper[4750]: I1008 19:47:42.225834 4750 generic.go:334] "Generic (PLEG): container finished" podID="e30d5ba5-d856-47b4-8b0d-aeb153eee24d" containerID="7f0511dc6da9cfa214336414c466dbad8936913782bd5527a4d15e1f0bd855b4" exitCode=0 Oct 08 19:47:42 crc kubenswrapper[4750]: I1008 19:47:42.225926 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6dlz8" event={"ID":"e30d5ba5-d856-47b4-8b0d-aeb153eee24d","Type":"ContainerDied","Data":"7f0511dc6da9cfa214336414c466dbad8936913782bd5527a4d15e1f0bd855b4"} Oct 08 19:47:42 crc kubenswrapper[4750]: I1008 19:47:42.225961 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6dlz8" event={"ID":"e30d5ba5-d856-47b4-8b0d-aeb153eee24d","Type":"ContainerStarted","Data":"ab642664544c5dec877901985abf2b8f5333bdbb5998e00dc1330a49f49d5cbc"} Oct 08 19:47:42 crc kubenswrapper[4750]: I1008 19:47:42.752431 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28c20807-82d0-48f8-8919-cfb646a260e1" path="/var/lib/kubelet/pods/28c20807-82d0-48f8-8919-cfb646a260e1/volumes" Oct 08 19:47:43 crc kubenswrapper[4750]: I1008 19:47:43.646163 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-6dlz8" Oct 08 19:47:43 crc kubenswrapper[4750]: I1008 19:47:43.795745 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j884w\" (UniqueName: \"kubernetes.io/projected/e30d5ba5-d856-47b4-8b0d-aeb153eee24d-kube-api-access-j884w\") pod \"e30d5ba5-d856-47b4-8b0d-aeb153eee24d\" (UID: \"e30d5ba5-d856-47b4-8b0d-aeb153eee24d\") " Oct 08 19:47:43 crc kubenswrapper[4750]: I1008 19:47:43.803011 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e30d5ba5-d856-47b4-8b0d-aeb153eee24d-kube-api-access-j884w" (OuterVolumeSpecName: "kube-api-access-j884w") pod "e30d5ba5-d856-47b4-8b0d-aeb153eee24d" (UID: "e30d5ba5-d856-47b4-8b0d-aeb153eee24d"). InnerVolumeSpecName "kube-api-access-j884w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:47:43 crc kubenswrapper[4750]: I1008 19:47:43.898500 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j884w\" (UniqueName: \"kubernetes.io/projected/e30d5ba5-d856-47b4-8b0d-aeb153eee24d-kube-api-access-j884w\") on node \"crc\" DevicePath \"\"" Oct 08 19:47:44 crc kubenswrapper[4750]: I1008 19:47:44.248642 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6dlz8" event={"ID":"e30d5ba5-d856-47b4-8b0d-aeb153eee24d","Type":"ContainerDied","Data":"ab642664544c5dec877901985abf2b8f5333bdbb5998e00dc1330a49f49d5cbc"} Oct 08 19:47:44 crc kubenswrapper[4750]: I1008 19:47:44.248691 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab642664544c5dec877901985abf2b8f5333bdbb5998e00dc1330a49f49d5cbc" Oct 08 19:47:44 crc kubenswrapper[4750]: I1008 19:47:44.248713 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-6dlz8" Oct 08 19:47:44 crc kubenswrapper[4750]: I1008 19:47:44.332066 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zpfw6" Oct 08 19:47:44 crc kubenswrapper[4750]: I1008 19:47:44.332143 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zpfw6" Oct 08 19:47:44 crc kubenswrapper[4750]: I1008 19:47:44.386163 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zpfw6" Oct 08 19:47:45 crc kubenswrapper[4750]: I1008 19:47:45.339707 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zpfw6" Oct 08 19:47:45 crc kubenswrapper[4750]: I1008 19:47:45.399634 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zpfw6"] Oct 08 19:47:47 crc kubenswrapper[4750]: I1008 19:47:47.292687 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zpfw6" podUID="bbe99043-6818-4d72-af07-c8e1c379d505" containerName="registry-server" containerID="cri-o://c174b00ab960afb24e175cce4a234e9ee46aaac92d2ba56b69019b0fd528a624" gracePeriod=2 Oct 08 19:47:49 crc kubenswrapper[4750]: I1008 19:47:49.321167 4750 generic.go:334] "Generic (PLEG): container finished" podID="bbe99043-6818-4d72-af07-c8e1c379d505" containerID="c174b00ab960afb24e175cce4a234e9ee46aaac92d2ba56b69019b0fd528a624" exitCode=0 Oct 08 19:47:49 crc kubenswrapper[4750]: I1008 19:47:49.321380 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zpfw6" event={"ID":"bbe99043-6818-4d72-af07-c8e1c379d505","Type":"ContainerDied","Data":"c174b00ab960afb24e175cce4a234e9ee46aaac92d2ba56b69019b0fd528a624"} Oct 08 19:47:49 crc kubenswrapper[4750]: I1008 19:47:49.672752 4750 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zpfw6" Oct 08 19:47:49 crc kubenswrapper[4750]: I1008 19:47:49.748664 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbe99043-6818-4d72-af07-c8e1c379d505-catalog-content\") pod \"bbe99043-6818-4d72-af07-c8e1c379d505\" (UID: \"bbe99043-6818-4d72-af07-c8e1c379d505\") " Oct 08 19:47:49 crc kubenswrapper[4750]: I1008 19:47:49.748863 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbe99043-6818-4d72-af07-c8e1c379d505-utilities\") pod \"bbe99043-6818-4d72-af07-c8e1c379d505\" (UID: \"bbe99043-6818-4d72-af07-c8e1c379d505\") " Oct 08 19:47:49 crc kubenswrapper[4750]: I1008 19:47:49.749117 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rdbq\" (UniqueName: \"kubernetes.io/projected/bbe99043-6818-4d72-af07-c8e1c379d505-kube-api-access-9rdbq\") pod \"bbe99043-6818-4d72-af07-c8e1c379d505\" (UID: \"bbe99043-6818-4d72-af07-c8e1c379d505\") " Oct 08 19:47:49 crc kubenswrapper[4750]: I1008 19:47:49.750976 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbe99043-6818-4d72-af07-c8e1c379d505-utilities" (OuterVolumeSpecName: "utilities") pod "bbe99043-6818-4d72-af07-c8e1c379d505" (UID: "bbe99043-6818-4d72-af07-c8e1c379d505"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:47:49 crc kubenswrapper[4750]: I1008 19:47:49.759748 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbe99043-6818-4d72-af07-c8e1c379d505-kube-api-access-9rdbq" (OuterVolumeSpecName: "kube-api-access-9rdbq") pod "bbe99043-6818-4d72-af07-c8e1c379d505" (UID: "bbe99043-6818-4d72-af07-c8e1c379d505"). 
InnerVolumeSpecName "kube-api-access-9rdbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:47:49 crc kubenswrapper[4750]: I1008 19:47:49.852476 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rdbq\" (UniqueName: \"kubernetes.io/projected/bbe99043-6818-4d72-af07-c8e1c379d505-kube-api-access-9rdbq\") on node \"crc\" DevicePath \"\"" Oct 08 19:47:49 crc kubenswrapper[4750]: I1008 19:47:49.852523 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbe99043-6818-4d72-af07-c8e1c379d505-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 19:47:49 crc kubenswrapper[4750]: I1008 19:47:49.868477 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbe99043-6818-4d72-af07-c8e1c379d505-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bbe99043-6818-4d72-af07-c8e1c379d505" (UID: "bbe99043-6818-4d72-af07-c8e1c379d505"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:47:49 crc kubenswrapper[4750]: I1008 19:47:49.954611 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbe99043-6818-4d72-af07-c8e1c379d505-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 19:47:50 crc kubenswrapper[4750]: I1008 19:47:50.343794 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zpfw6" event={"ID":"bbe99043-6818-4d72-af07-c8e1c379d505","Type":"ContainerDied","Data":"1ee42dc247e539b51082b2f4db4b9b2a973a6f94ba6fc5d04832576204c9c2f5"} Oct 08 19:47:50 crc kubenswrapper[4750]: I1008 19:47:50.343877 4750 scope.go:117] "RemoveContainer" containerID="c174b00ab960afb24e175cce4a234e9ee46aaac92d2ba56b69019b0fd528a624" Oct 08 19:47:50 crc kubenswrapper[4750]: I1008 19:47:50.343985 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zpfw6" Oct 08 19:47:50 crc kubenswrapper[4750]: I1008 19:47:50.374444 4750 scope.go:117] "RemoveContainer" containerID="36b19e2bed68ca2a167ee5df1f3182e27b05bd7fee8823879869514d6dc684be" Oct 08 19:47:50 crc kubenswrapper[4750]: I1008 19:47:50.398622 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zpfw6"] Oct 08 19:47:50 crc kubenswrapper[4750]: I1008 19:47:50.406696 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zpfw6"] Oct 08 19:47:50 crc kubenswrapper[4750]: I1008 19:47:50.414199 4750 scope.go:117] "RemoveContainer" containerID="8de7b593e3d792162edbc96024aa28b96dbb9ee9cb425784e7078e0571910c52" Oct 08 19:47:50 crc kubenswrapper[4750]: I1008 19:47:50.640577 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-a9d3-account-create-jhghv"] Oct 08 19:47:50 crc kubenswrapper[4750]: E1008 19:47:50.641247 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e30d5ba5-d856-47b4-8b0d-aeb153eee24d" containerName="mariadb-database-create" Oct 08 19:47:50 crc kubenswrapper[4750]: I1008 19:47:50.641272 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="e30d5ba5-d856-47b4-8b0d-aeb153eee24d" containerName="mariadb-database-create" Oct 08 19:47:50 crc kubenswrapper[4750]: E1008 19:47:50.641290 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28c20807-82d0-48f8-8919-cfb646a260e1" containerName="registry-server" Oct 08 19:47:50 crc kubenswrapper[4750]: I1008 19:47:50.641298 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="28c20807-82d0-48f8-8919-cfb646a260e1" containerName="registry-server" Oct 08 19:47:50 crc kubenswrapper[4750]: E1008 19:47:50.641326 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbe99043-6818-4d72-af07-c8e1c379d505" containerName="registry-server" Oct 08 19:47:50 crc kubenswrapper[4750]: 
I1008 19:47:50.641334 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbe99043-6818-4d72-af07-c8e1c379d505" containerName="registry-server" Oct 08 19:47:50 crc kubenswrapper[4750]: E1008 19:47:50.641360 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbe99043-6818-4d72-af07-c8e1c379d505" containerName="extract-utilities" Oct 08 19:47:50 crc kubenswrapper[4750]: I1008 19:47:50.641368 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbe99043-6818-4d72-af07-c8e1c379d505" containerName="extract-utilities" Oct 08 19:47:50 crc kubenswrapper[4750]: E1008 19:47:50.641383 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28c20807-82d0-48f8-8919-cfb646a260e1" containerName="extract-content" Oct 08 19:47:50 crc kubenswrapper[4750]: I1008 19:47:50.641390 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="28c20807-82d0-48f8-8919-cfb646a260e1" containerName="extract-content" Oct 08 19:47:50 crc kubenswrapper[4750]: E1008 19:47:50.641413 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbe99043-6818-4d72-af07-c8e1c379d505" containerName="extract-content" Oct 08 19:47:50 crc kubenswrapper[4750]: I1008 19:47:50.641422 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbe99043-6818-4d72-af07-c8e1c379d505" containerName="extract-content" Oct 08 19:47:50 crc kubenswrapper[4750]: E1008 19:47:50.641443 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28c20807-82d0-48f8-8919-cfb646a260e1" containerName="extract-utilities" Oct 08 19:47:50 crc kubenswrapper[4750]: I1008 19:47:50.641450 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="28c20807-82d0-48f8-8919-cfb646a260e1" containerName="extract-utilities" Oct 08 19:47:50 crc kubenswrapper[4750]: I1008 19:47:50.641755 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="28c20807-82d0-48f8-8919-cfb646a260e1" containerName="registry-server" Oct 08 19:47:50 crc kubenswrapper[4750]: I1008 
19:47:50.641773 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="e30d5ba5-d856-47b4-8b0d-aeb153eee24d" containerName="mariadb-database-create" Oct 08 19:47:50 crc kubenswrapper[4750]: I1008 19:47:50.641792 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbe99043-6818-4d72-af07-c8e1c379d505" containerName="registry-server" Oct 08 19:47:50 crc kubenswrapper[4750]: I1008 19:47:50.642725 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a9d3-account-create-jhghv" Oct 08 19:47:50 crc kubenswrapper[4750]: I1008 19:47:50.645111 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 08 19:47:50 crc kubenswrapper[4750]: I1008 19:47:50.654450 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a9d3-account-create-jhghv"] Oct 08 19:47:50 crc kubenswrapper[4750]: I1008 19:47:50.671165 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gxcc\" (UniqueName: \"kubernetes.io/projected/5252b4f6-3b9c-471f-b919-ac87889a95cb-kube-api-access-8gxcc\") pod \"cinder-a9d3-account-create-jhghv\" (UID: \"5252b4f6-3b9c-471f-b919-ac87889a95cb\") " pod="openstack/cinder-a9d3-account-create-jhghv" Oct 08 19:47:50 crc kubenswrapper[4750]: I1008 19:47:50.748320 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbe99043-6818-4d72-af07-c8e1c379d505" path="/var/lib/kubelet/pods/bbe99043-6818-4d72-af07-c8e1c379d505/volumes" Oct 08 19:47:50 crc kubenswrapper[4750]: I1008 19:47:50.773055 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gxcc\" (UniqueName: \"kubernetes.io/projected/5252b4f6-3b9c-471f-b919-ac87889a95cb-kube-api-access-8gxcc\") pod \"cinder-a9d3-account-create-jhghv\" (UID: \"5252b4f6-3b9c-471f-b919-ac87889a95cb\") " pod="openstack/cinder-a9d3-account-create-jhghv" Oct 08 19:47:50 crc 
kubenswrapper[4750]: I1008 19:47:50.802224 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gxcc\" (UniqueName: \"kubernetes.io/projected/5252b4f6-3b9c-471f-b919-ac87889a95cb-kube-api-access-8gxcc\") pod \"cinder-a9d3-account-create-jhghv\" (UID: \"5252b4f6-3b9c-471f-b919-ac87889a95cb\") " pod="openstack/cinder-a9d3-account-create-jhghv" Oct 08 19:47:50 crc kubenswrapper[4750]: I1008 19:47:50.974855 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a9d3-account-create-jhghv" Oct 08 19:47:51 crc kubenswrapper[4750]: I1008 19:47:51.514184 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a9d3-account-create-jhghv"] Oct 08 19:47:51 crc kubenswrapper[4750]: W1008 19:47:51.519296 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5252b4f6_3b9c_471f_b919_ac87889a95cb.slice/crio-de7907173d90c1f6a4123a64a5e87f08ce59e9fd42a575469550941088a51921 WatchSource:0}: Error finding container de7907173d90c1f6a4123a64a5e87f08ce59e9fd42a575469550941088a51921: Status 404 returned error can't find the container with id de7907173d90c1f6a4123a64a5e87f08ce59e9fd42a575469550941088a51921 Oct 08 19:47:52 crc kubenswrapper[4750]: I1008 19:47:52.419258 4750 generic.go:334] "Generic (PLEG): container finished" podID="5252b4f6-3b9c-471f-b919-ac87889a95cb" containerID="666bcd48683c2fa3cc15aee36c899e12b20a9c3914b7cce1af304b608a27ad58" exitCode=0 Oct 08 19:47:52 crc kubenswrapper[4750]: I1008 19:47:52.419358 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a9d3-account-create-jhghv" event={"ID":"5252b4f6-3b9c-471f-b919-ac87889a95cb","Type":"ContainerDied","Data":"666bcd48683c2fa3cc15aee36c899e12b20a9c3914b7cce1af304b608a27ad58"} Oct 08 19:47:52 crc kubenswrapper[4750]: I1008 19:47:52.419846 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-a9d3-account-create-jhghv" event={"ID":"5252b4f6-3b9c-471f-b919-ac87889a95cb","Type":"ContainerStarted","Data":"de7907173d90c1f6a4123a64a5e87f08ce59e9fd42a575469550941088a51921"} Oct 08 19:47:53 crc kubenswrapper[4750]: I1008 19:47:53.873148 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a9d3-account-create-jhghv" Oct 08 19:47:54 crc kubenswrapper[4750]: I1008 19:47:54.049998 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gxcc\" (UniqueName: \"kubernetes.io/projected/5252b4f6-3b9c-471f-b919-ac87889a95cb-kube-api-access-8gxcc\") pod \"5252b4f6-3b9c-471f-b919-ac87889a95cb\" (UID: \"5252b4f6-3b9c-471f-b919-ac87889a95cb\") " Oct 08 19:47:54 crc kubenswrapper[4750]: I1008 19:47:54.059130 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5252b4f6-3b9c-471f-b919-ac87889a95cb-kube-api-access-8gxcc" (OuterVolumeSpecName: "kube-api-access-8gxcc") pod "5252b4f6-3b9c-471f-b919-ac87889a95cb" (UID: "5252b4f6-3b9c-471f-b919-ac87889a95cb"). InnerVolumeSpecName "kube-api-access-8gxcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:47:54 crc kubenswrapper[4750]: I1008 19:47:54.152988 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gxcc\" (UniqueName: \"kubernetes.io/projected/5252b4f6-3b9c-471f-b919-ac87889a95cb-kube-api-access-8gxcc\") on node \"crc\" DevicePath \"\"" Oct 08 19:47:54 crc kubenswrapper[4750]: I1008 19:47:54.443676 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-a9d3-account-create-jhghv" Oct 08 19:47:54 crc kubenswrapper[4750]: I1008 19:47:54.443709 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a9d3-account-create-jhghv" event={"ID":"5252b4f6-3b9c-471f-b919-ac87889a95cb","Type":"ContainerDied","Data":"de7907173d90c1f6a4123a64a5e87f08ce59e9fd42a575469550941088a51921"} Oct 08 19:47:54 crc kubenswrapper[4750]: I1008 19:47:54.443785 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de7907173d90c1f6a4123a64a5e87f08ce59e9fd42a575469550941088a51921" Oct 08 19:47:55 crc kubenswrapper[4750]: I1008 19:47:55.820876 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-mtvkc"] Oct 08 19:47:55 crc kubenswrapper[4750]: E1008 19:47:55.821593 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5252b4f6-3b9c-471f-b919-ac87889a95cb" containerName="mariadb-account-create" Oct 08 19:47:55 crc kubenswrapper[4750]: I1008 19:47:55.821620 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="5252b4f6-3b9c-471f-b919-ac87889a95cb" containerName="mariadb-account-create" Oct 08 19:47:55 crc kubenswrapper[4750]: I1008 19:47:55.821974 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="5252b4f6-3b9c-471f-b919-ac87889a95cb" containerName="mariadb-account-create" Oct 08 19:47:55 crc kubenswrapper[4750]: I1008 19:47:55.823035 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-mtvkc" Oct 08 19:47:55 crc kubenswrapper[4750]: I1008 19:47:55.827506 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-f4rw5" Oct 08 19:47:55 crc kubenswrapper[4750]: I1008 19:47:55.827853 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 08 19:47:55 crc kubenswrapper[4750]: I1008 19:47:55.829438 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 08 19:47:55 crc kubenswrapper[4750]: I1008 19:47:55.841675 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-mtvkc"] Oct 08 19:47:55 crc kubenswrapper[4750]: I1008 19:47:55.994258 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aeb13a0-48b6-4a84-998d-3fb8a64d84ac-config-data\") pod \"cinder-db-sync-mtvkc\" (UID: \"3aeb13a0-48b6-4a84-998d-3fb8a64d84ac\") " pod="openstack/cinder-db-sync-mtvkc" Oct 08 19:47:55 crc kubenswrapper[4750]: I1008 19:47:55.994317 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3aeb13a0-48b6-4a84-998d-3fb8a64d84ac-scripts\") pod \"cinder-db-sync-mtvkc\" (UID: \"3aeb13a0-48b6-4a84-998d-3fb8a64d84ac\") " pod="openstack/cinder-db-sync-mtvkc" Oct 08 19:47:55 crc kubenswrapper[4750]: I1008 19:47:55.994776 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aeb13a0-48b6-4a84-998d-3fb8a64d84ac-combined-ca-bundle\") pod \"cinder-db-sync-mtvkc\" (UID: \"3aeb13a0-48b6-4a84-998d-3fb8a64d84ac\") " pod="openstack/cinder-db-sync-mtvkc" Oct 08 19:47:55 crc kubenswrapper[4750]: I1008 19:47:55.995144 4750 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8ck5\" (UniqueName: \"kubernetes.io/projected/3aeb13a0-48b6-4a84-998d-3fb8a64d84ac-kube-api-access-p8ck5\") pod \"cinder-db-sync-mtvkc\" (UID: \"3aeb13a0-48b6-4a84-998d-3fb8a64d84ac\") " pod="openstack/cinder-db-sync-mtvkc" Oct 08 19:47:55 crc kubenswrapper[4750]: I1008 19:47:55.995318 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3aeb13a0-48b6-4a84-998d-3fb8a64d84ac-db-sync-config-data\") pod \"cinder-db-sync-mtvkc\" (UID: \"3aeb13a0-48b6-4a84-998d-3fb8a64d84ac\") " pod="openstack/cinder-db-sync-mtvkc" Oct 08 19:47:55 crc kubenswrapper[4750]: I1008 19:47:55.995825 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3aeb13a0-48b6-4a84-998d-3fb8a64d84ac-etc-machine-id\") pod \"cinder-db-sync-mtvkc\" (UID: \"3aeb13a0-48b6-4a84-998d-3fb8a64d84ac\") " pod="openstack/cinder-db-sync-mtvkc" Oct 08 19:47:56 crc kubenswrapper[4750]: I1008 19:47:56.098168 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8ck5\" (UniqueName: \"kubernetes.io/projected/3aeb13a0-48b6-4a84-998d-3fb8a64d84ac-kube-api-access-p8ck5\") pod \"cinder-db-sync-mtvkc\" (UID: \"3aeb13a0-48b6-4a84-998d-3fb8a64d84ac\") " pod="openstack/cinder-db-sync-mtvkc" Oct 08 19:47:56 crc kubenswrapper[4750]: I1008 19:47:56.098233 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3aeb13a0-48b6-4a84-998d-3fb8a64d84ac-db-sync-config-data\") pod \"cinder-db-sync-mtvkc\" (UID: \"3aeb13a0-48b6-4a84-998d-3fb8a64d84ac\") " pod="openstack/cinder-db-sync-mtvkc" Oct 08 19:47:56 crc kubenswrapper[4750]: I1008 19:47:56.098301 4750 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3aeb13a0-48b6-4a84-998d-3fb8a64d84ac-etc-machine-id\") pod \"cinder-db-sync-mtvkc\" (UID: \"3aeb13a0-48b6-4a84-998d-3fb8a64d84ac\") " pod="openstack/cinder-db-sync-mtvkc" Oct 08 19:47:56 crc kubenswrapper[4750]: I1008 19:47:56.098336 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aeb13a0-48b6-4a84-998d-3fb8a64d84ac-config-data\") pod \"cinder-db-sync-mtvkc\" (UID: \"3aeb13a0-48b6-4a84-998d-3fb8a64d84ac\") " pod="openstack/cinder-db-sync-mtvkc" Oct 08 19:47:56 crc kubenswrapper[4750]: I1008 19:47:56.098357 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3aeb13a0-48b6-4a84-998d-3fb8a64d84ac-scripts\") pod \"cinder-db-sync-mtvkc\" (UID: \"3aeb13a0-48b6-4a84-998d-3fb8a64d84ac\") " pod="openstack/cinder-db-sync-mtvkc" Oct 08 19:47:56 crc kubenswrapper[4750]: I1008 19:47:56.098450 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aeb13a0-48b6-4a84-998d-3fb8a64d84ac-combined-ca-bundle\") pod \"cinder-db-sync-mtvkc\" (UID: \"3aeb13a0-48b6-4a84-998d-3fb8a64d84ac\") " pod="openstack/cinder-db-sync-mtvkc" Oct 08 19:47:56 crc kubenswrapper[4750]: I1008 19:47:56.099451 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3aeb13a0-48b6-4a84-998d-3fb8a64d84ac-etc-machine-id\") pod \"cinder-db-sync-mtvkc\" (UID: \"3aeb13a0-48b6-4a84-998d-3fb8a64d84ac\") " pod="openstack/cinder-db-sync-mtvkc" Oct 08 19:47:56 crc kubenswrapper[4750]: I1008 19:47:56.104343 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3aeb13a0-48b6-4a84-998d-3fb8a64d84ac-scripts\") pod \"cinder-db-sync-mtvkc\" (UID: 
\"3aeb13a0-48b6-4a84-998d-3fb8a64d84ac\") " pod="openstack/cinder-db-sync-mtvkc" Oct 08 19:47:56 crc kubenswrapper[4750]: I1008 19:47:56.104761 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3aeb13a0-48b6-4a84-998d-3fb8a64d84ac-db-sync-config-data\") pod \"cinder-db-sync-mtvkc\" (UID: \"3aeb13a0-48b6-4a84-998d-3fb8a64d84ac\") " pod="openstack/cinder-db-sync-mtvkc" Oct 08 19:47:56 crc kubenswrapper[4750]: I1008 19:47:56.106510 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aeb13a0-48b6-4a84-998d-3fb8a64d84ac-config-data\") pod \"cinder-db-sync-mtvkc\" (UID: \"3aeb13a0-48b6-4a84-998d-3fb8a64d84ac\") " pod="openstack/cinder-db-sync-mtvkc" Oct 08 19:47:56 crc kubenswrapper[4750]: I1008 19:47:56.109483 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aeb13a0-48b6-4a84-998d-3fb8a64d84ac-combined-ca-bundle\") pod \"cinder-db-sync-mtvkc\" (UID: \"3aeb13a0-48b6-4a84-998d-3fb8a64d84ac\") " pod="openstack/cinder-db-sync-mtvkc" Oct 08 19:47:56 crc kubenswrapper[4750]: I1008 19:47:56.131693 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8ck5\" (UniqueName: \"kubernetes.io/projected/3aeb13a0-48b6-4a84-998d-3fb8a64d84ac-kube-api-access-p8ck5\") pod \"cinder-db-sync-mtvkc\" (UID: \"3aeb13a0-48b6-4a84-998d-3fb8a64d84ac\") " pod="openstack/cinder-db-sync-mtvkc" Oct 08 19:47:56 crc kubenswrapper[4750]: I1008 19:47:56.190051 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-mtvkc" Oct 08 19:47:56 crc kubenswrapper[4750]: I1008 19:47:56.491783 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-mtvkc"] Oct 08 19:47:57 crc kubenswrapper[4750]: I1008 19:47:57.491462 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-mtvkc" event={"ID":"3aeb13a0-48b6-4a84-998d-3fb8a64d84ac","Type":"ContainerStarted","Data":"1c2c66d67b4aa603efba88fbdf492bf052470c6c732a4ee7a4806bdcfb5c2078"} Oct 08 19:47:57 crc kubenswrapper[4750]: I1008 19:47:57.491858 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-mtvkc" event={"ID":"3aeb13a0-48b6-4a84-998d-3fb8a64d84ac","Type":"ContainerStarted","Data":"1a886f0afc3230326d611c68ee5be3ed2f87e2f7d23f5654c5e29702902cfca5"} Oct 08 19:47:57 crc kubenswrapper[4750]: I1008 19:47:57.522642 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-mtvkc" podStartSLOduration=2.522617479 podStartE2EDuration="2.522617479s" podCreationTimestamp="2025-10-08 19:47:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:47:57.522363313 +0000 UTC m=+5833.435334406" watchObservedRunningTime="2025-10-08 19:47:57.522617479 +0000 UTC m=+5833.435588502" Oct 08 19:47:59 crc kubenswrapper[4750]: I1008 19:47:59.706737 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 19:47:59 crc kubenswrapper[4750]: I1008 19:47:59.707094 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 19:47:59 crc kubenswrapper[4750]: I1008 19:47:59.707144 4750 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-grddb" Oct 08 19:47:59 crc kubenswrapper[4750]: I1008 19:47:59.707956 4750 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fcf13ef2112fe840107d0203594fc62839dd38db4382291c73d378bd35584c76"} pod="openshift-machine-config-operator/machine-config-daemon-grddb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 19:47:59 crc kubenswrapper[4750]: I1008 19:47:59.708013 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" containerID="cri-o://fcf13ef2112fe840107d0203594fc62839dd38db4382291c73d378bd35584c76" gracePeriod=600 Oct 08 19:47:59 crc kubenswrapper[4750]: E1008 19:47:59.846242 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:48:00 crc kubenswrapper[4750]: I1008 19:48:00.551646 4750 generic.go:334] "Generic (PLEG): container finished" podID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerID="fcf13ef2112fe840107d0203594fc62839dd38db4382291c73d378bd35584c76" exitCode=0 Oct 08 19:48:00 crc kubenswrapper[4750]: I1008 
19:48:00.551722 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerDied","Data":"fcf13ef2112fe840107d0203594fc62839dd38db4382291c73d378bd35584c76"} Oct 08 19:48:00 crc kubenswrapper[4750]: I1008 19:48:00.551771 4750 scope.go:117] "RemoveContainer" containerID="1499c5346a2a6057a03d90e92d9fde974f96055257fa4f56cf44bb72d0e5bf8e" Oct 08 19:48:00 crc kubenswrapper[4750]: I1008 19:48:00.553053 4750 scope.go:117] "RemoveContainer" containerID="fcf13ef2112fe840107d0203594fc62839dd38db4382291c73d378bd35584c76" Oct 08 19:48:00 crc kubenswrapper[4750]: E1008 19:48:00.553544 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:48:00 crc kubenswrapper[4750]: I1008 19:48:00.553711 4750 generic.go:334] "Generic (PLEG): container finished" podID="3aeb13a0-48b6-4a84-998d-3fb8a64d84ac" containerID="1c2c66d67b4aa603efba88fbdf492bf052470c6c732a4ee7a4806bdcfb5c2078" exitCode=0 Oct 08 19:48:00 crc kubenswrapper[4750]: I1008 19:48:00.553744 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-mtvkc" event={"ID":"3aeb13a0-48b6-4a84-998d-3fb8a64d84ac","Type":"ContainerDied","Data":"1c2c66d67b4aa603efba88fbdf492bf052470c6c732a4ee7a4806bdcfb5c2078"} Oct 08 19:48:01 crc kubenswrapper[4750]: I1008 19:48:01.894344 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-mtvkc" Oct 08 19:48:01 crc kubenswrapper[4750]: I1008 19:48:01.969539 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3aeb13a0-48b6-4a84-998d-3fb8a64d84ac-etc-machine-id\") pod \"3aeb13a0-48b6-4a84-998d-3fb8a64d84ac\" (UID: \"3aeb13a0-48b6-4a84-998d-3fb8a64d84ac\") " Oct 08 19:48:01 crc kubenswrapper[4750]: I1008 19:48:01.969686 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aeb13a0-48b6-4a84-998d-3fb8a64d84ac-config-data\") pod \"3aeb13a0-48b6-4a84-998d-3fb8a64d84ac\" (UID: \"3aeb13a0-48b6-4a84-998d-3fb8a64d84ac\") " Oct 08 19:48:01 crc kubenswrapper[4750]: I1008 19:48:01.969737 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3aeb13a0-48b6-4a84-998d-3fb8a64d84ac-db-sync-config-data\") pod \"3aeb13a0-48b6-4a84-998d-3fb8a64d84ac\" (UID: \"3aeb13a0-48b6-4a84-998d-3fb8a64d84ac\") " Oct 08 19:48:01 crc kubenswrapper[4750]: I1008 19:48:01.969782 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3aeb13a0-48b6-4a84-998d-3fb8a64d84ac-scripts\") pod \"3aeb13a0-48b6-4a84-998d-3fb8a64d84ac\" (UID: \"3aeb13a0-48b6-4a84-998d-3fb8a64d84ac\") " Oct 08 19:48:01 crc kubenswrapper[4750]: I1008 19:48:01.969803 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aeb13a0-48b6-4a84-998d-3fb8a64d84ac-combined-ca-bundle\") pod \"3aeb13a0-48b6-4a84-998d-3fb8a64d84ac\" (UID: \"3aeb13a0-48b6-4a84-998d-3fb8a64d84ac\") " Oct 08 19:48:01 crc kubenswrapper[4750]: I1008 19:48:01.969818 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/3aeb13a0-48b6-4a84-998d-3fb8a64d84ac-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3aeb13a0-48b6-4a84-998d-3fb8a64d84ac" (UID: "3aeb13a0-48b6-4a84-998d-3fb8a64d84ac"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 19:48:01 crc kubenswrapper[4750]: I1008 19:48:01.969881 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8ck5\" (UniqueName: \"kubernetes.io/projected/3aeb13a0-48b6-4a84-998d-3fb8a64d84ac-kube-api-access-p8ck5\") pod \"3aeb13a0-48b6-4a84-998d-3fb8a64d84ac\" (UID: \"3aeb13a0-48b6-4a84-998d-3fb8a64d84ac\") " Oct 08 19:48:01 crc kubenswrapper[4750]: I1008 19:48:01.971295 4750 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3aeb13a0-48b6-4a84-998d-3fb8a64d84ac-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 08 19:48:01 crc kubenswrapper[4750]: I1008 19:48:01.977045 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aeb13a0-48b6-4a84-998d-3fb8a64d84ac-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3aeb13a0-48b6-4a84-998d-3fb8a64d84ac" (UID: "3aeb13a0-48b6-4a84-998d-3fb8a64d84ac"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:48:01 crc kubenswrapper[4750]: I1008 19:48:01.977087 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aeb13a0-48b6-4a84-998d-3fb8a64d84ac-scripts" (OuterVolumeSpecName: "scripts") pod "3aeb13a0-48b6-4a84-998d-3fb8a64d84ac" (UID: "3aeb13a0-48b6-4a84-998d-3fb8a64d84ac"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:48:01 crc kubenswrapper[4750]: I1008 19:48:01.981896 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3aeb13a0-48b6-4a84-998d-3fb8a64d84ac-kube-api-access-p8ck5" (OuterVolumeSpecName: "kube-api-access-p8ck5") pod "3aeb13a0-48b6-4a84-998d-3fb8a64d84ac" (UID: "3aeb13a0-48b6-4a84-998d-3fb8a64d84ac"). InnerVolumeSpecName "kube-api-access-p8ck5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:48:02 crc kubenswrapper[4750]: I1008 19:48:02.006036 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aeb13a0-48b6-4a84-998d-3fb8a64d84ac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3aeb13a0-48b6-4a84-998d-3fb8a64d84ac" (UID: "3aeb13a0-48b6-4a84-998d-3fb8a64d84ac"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:48:02 crc kubenswrapper[4750]: I1008 19:48:02.033952 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aeb13a0-48b6-4a84-998d-3fb8a64d84ac-config-data" (OuterVolumeSpecName: "config-data") pod "3aeb13a0-48b6-4a84-998d-3fb8a64d84ac" (UID: "3aeb13a0-48b6-4a84-998d-3fb8a64d84ac"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:48:02 crc kubenswrapper[4750]: I1008 19:48:02.073505 4750 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3aeb13a0-48b6-4a84-998d-3fb8a64d84ac-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 19:48:02 crc kubenswrapper[4750]: I1008 19:48:02.073576 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3aeb13a0-48b6-4a84-998d-3fb8a64d84ac-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 19:48:02 crc kubenswrapper[4750]: I1008 19:48:02.073595 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aeb13a0-48b6-4a84-998d-3fb8a64d84ac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 19:48:02 crc kubenswrapper[4750]: I1008 19:48:02.073608 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8ck5\" (UniqueName: \"kubernetes.io/projected/3aeb13a0-48b6-4a84-998d-3fb8a64d84ac-kube-api-access-p8ck5\") on node \"crc\" DevicePath \"\"" Oct 08 19:48:02 crc kubenswrapper[4750]: I1008 19:48:02.073620 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aeb13a0-48b6-4a84-998d-3fb8a64d84ac-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 19:48:02 crc kubenswrapper[4750]: I1008 19:48:02.582629 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-mtvkc" event={"ID":"3aeb13a0-48b6-4a84-998d-3fb8a64d84ac","Type":"ContainerDied","Data":"1a886f0afc3230326d611c68ee5be3ed2f87e2f7d23f5654c5e29702902cfca5"} Oct 08 19:48:02 crc kubenswrapper[4750]: I1008 19:48:02.582691 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a886f0afc3230326d611c68ee5be3ed2f87e2f7d23f5654c5e29702902cfca5" Oct 08 19:48:02 crc kubenswrapper[4750]: I1008 19:48:02.582693 4750 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-mtvkc" Oct 08 19:48:02 crc kubenswrapper[4750]: I1008 19:48:02.974908 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86f86b88dc-8vbxj"] Oct 08 19:48:02 crc kubenswrapper[4750]: E1008 19:48:02.975382 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aeb13a0-48b6-4a84-998d-3fb8a64d84ac" containerName="cinder-db-sync" Oct 08 19:48:02 crc kubenswrapper[4750]: I1008 19:48:02.975403 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aeb13a0-48b6-4a84-998d-3fb8a64d84ac" containerName="cinder-db-sync" Oct 08 19:48:02 crc kubenswrapper[4750]: I1008 19:48:02.975648 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="3aeb13a0-48b6-4a84-998d-3fb8a64d84ac" containerName="cinder-db-sync" Oct 08 19:48:02 crc kubenswrapper[4750]: I1008 19:48:02.977066 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86f86b88dc-8vbxj" Oct 08 19:48:03 crc kubenswrapper[4750]: I1008 19:48:03.003468 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86f86b88dc-8vbxj"] Oct 08 19:48:03 crc kubenswrapper[4750]: I1008 19:48:03.095306 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6-dns-svc\") pod \"dnsmasq-dns-86f86b88dc-8vbxj\" (UID: \"eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6\") " pod="openstack/dnsmasq-dns-86f86b88dc-8vbxj" Oct 08 19:48:03 crc kubenswrapper[4750]: I1008 19:48:03.095542 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6-config\") pod \"dnsmasq-dns-86f86b88dc-8vbxj\" (UID: \"eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6\") " pod="openstack/dnsmasq-dns-86f86b88dc-8vbxj" Oct 08 
19:48:03 crc kubenswrapper[4750]: I1008 19:48:03.095725 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6-ovsdbserver-nb\") pod \"dnsmasq-dns-86f86b88dc-8vbxj\" (UID: \"eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6\") " pod="openstack/dnsmasq-dns-86f86b88dc-8vbxj" Oct 08 19:48:03 crc kubenswrapper[4750]: I1008 19:48:03.095810 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ldlk\" (UniqueName: \"kubernetes.io/projected/eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6-kube-api-access-7ldlk\") pod \"dnsmasq-dns-86f86b88dc-8vbxj\" (UID: \"eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6\") " pod="openstack/dnsmasq-dns-86f86b88dc-8vbxj" Oct 08 19:48:03 crc kubenswrapper[4750]: I1008 19:48:03.095905 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6-ovsdbserver-sb\") pod \"dnsmasq-dns-86f86b88dc-8vbxj\" (UID: \"eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6\") " pod="openstack/dnsmasq-dns-86f86b88dc-8vbxj" Oct 08 19:48:03 crc kubenswrapper[4750]: I1008 19:48:03.198621 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6-dns-svc\") pod \"dnsmasq-dns-86f86b88dc-8vbxj\" (UID: \"eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6\") " pod="openstack/dnsmasq-dns-86f86b88dc-8vbxj" Oct 08 19:48:03 crc kubenswrapper[4750]: I1008 19:48:03.198723 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6-config\") pod \"dnsmasq-dns-86f86b88dc-8vbxj\" (UID: \"eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6\") " pod="openstack/dnsmasq-dns-86f86b88dc-8vbxj" Oct 08 19:48:03 crc 
kubenswrapper[4750]: I1008 19:48:03.198792 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6-ovsdbserver-nb\") pod \"dnsmasq-dns-86f86b88dc-8vbxj\" (UID: \"eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6\") " pod="openstack/dnsmasq-dns-86f86b88dc-8vbxj" Oct 08 19:48:03 crc kubenswrapper[4750]: I1008 19:48:03.198838 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ldlk\" (UniqueName: \"kubernetes.io/projected/eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6-kube-api-access-7ldlk\") pod \"dnsmasq-dns-86f86b88dc-8vbxj\" (UID: \"eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6\") " pod="openstack/dnsmasq-dns-86f86b88dc-8vbxj" Oct 08 19:48:03 crc kubenswrapper[4750]: I1008 19:48:03.198884 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6-ovsdbserver-sb\") pod \"dnsmasq-dns-86f86b88dc-8vbxj\" (UID: \"eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6\") " pod="openstack/dnsmasq-dns-86f86b88dc-8vbxj" Oct 08 19:48:03 crc kubenswrapper[4750]: I1008 19:48:03.199970 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6-ovsdbserver-nb\") pod \"dnsmasq-dns-86f86b88dc-8vbxj\" (UID: \"eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6\") " pod="openstack/dnsmasq-dns-86f86b88dc-8vbxj" Oct 08 19:48:03 crc kubenswrapper[4750]: I1008 19:48:03.199970 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6-dns-svc\") pod \"dnsmasq-dns-86f86b88dc-8vbxj\" (UID: \"eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6\") " pod="openstack/dnsmasq-dns-86f86b88dc-8vbxj" Oct 08 19:48:03 crc kubenswrapper[4750]: I1008 19:48:03.200236 4750 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6-ovsdbserver-sb\") pod \"dnsmasq-dns-86f86b88dc-8vbxj\" (UID: \"eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6\") " pod="openstack/dnsmasq-dns-86f86b88dc-8vbxj" Oct 08 19:48:03 crc kubenswrapper[4750]: I1008 19:48:03.200237 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6-config\") pod \"dnsmasq-dns-86f86b88dc-8vbxj\" (UID: \"eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6\") " pod="openstack/dnsmasq-dns-86f86b88dc-8vbxj" Oct 08 19:48:03 crc kubenswrapper[4750]: I1008 19:48:03.226827 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ldlk\" (UniqueName: \"kubernetes.io/projected/eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6-kube-api-access-7ldlk\") pod \"dnsmasq-dns-86f86b88dc-8vbxj\" (UID: \"eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6\") " pod="openstack/dnsmasq-dns-86f86b88dc-8vbxj" Oct 08 19:48:03 crc kubenswrapper[4750]: I1008 19:48:03.291658 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 08 19:48:03 crc kubenswrapper[4750]: I1008 19:48:03.293417 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 08 19:48:03 crc kubenswrapper[4750]: I1008 19:48:03.296068 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 08 19:48:03 crc kubenswrapper[4750]: I1008 19:48:03.296128 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 08 19:48:03 crc kubenswrapper[4750]: I1008 19:48:03.300034 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86f86b88dc-8vbxj" Oct 08 19:48:03 crc kubenswrapper[4750]: I1008 19:48:03.306234 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 08 19:48:03 crc kubenswrapper[4750]: I1008 19:48:03.306375 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-f4rw5" Oct 08 19:48:03 crc kubenswrapper[4750]: I1008 19:48:03.306439 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 08 19:48:03 crc kubenswrapper[4750]: I1008 19:48:03.402664 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f-config-data\") pod \"cinder-api-0\" (UID: \"9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f\") " pod="openstack/cinder-api-0" Oct 08 19:48:03 crc kubenswrapper[4750]: I1008 19:48:03.403072 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkvfv\" (UniqueName: \"kubernetes.io/projected/9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f-kube-api-access-gkvfv\") pod \"cinder-api-0\" (UID: \"9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f\") " pod="openstack/cinder-api-0" Oct 08 19:48:03 crc kubenswrapper[4750]: I1008 19:48:03.403183 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f\") " pod="openstack/cinder-api-0" Oct 08 19:48:03 crc kubenswrapper[4750]: I1008 19:48:03.403305 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f-config-data-custom\") pod \"cinder-api-0\" (UID: 
\"9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f\") " pod="openstack/cinder-api-0" Oct 08 19:48:03 crc kubenswrapper[4750]: I1008 19:48:03.403401 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f\") " pod="openstack/cinder-api-0" Oct 08 19:48:03 crc kubenswrapper[4750]: I1008 19:48:03.403541 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f-scripts\") pod \"cinder-api-0\" (UID: \"9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f\") " pod="openstack/cinder-api-0" Oct 08 19:48:03 crc kubenswrapper[4750]: I1008 19:48:03.403738 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f-logs\") pod \"cinder-api-0\" (UID: \"9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f\") " pod="openstack/cinder-api-0" Oct 08 19:48:03 crc kubenswrapper[4750]: I1008 19:48:03.505962 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f-config-data-custom\") pod \"cinder-api-0\" (UID: \"9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f\") " pod="openstack/cinder-api-0" Oct 08 19:48:03 crc kubenswrapper[4750]: I1008 19:48:03.506031 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f\") " pod="openstack/cinder-api-0" Oct 08 19:48:03 crc kubenswrapper[4750]: I1008 19:48:03.506103 4750 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f-scripts\") pod \"cinder-api-0\" (UID: \"9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f\") " pod="openstack/cinder-api-0" Oct 08 19:48:03 crc kubenswrapper[4750]: I1008 19:48:03.506135 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f-logs\") pod \"cinder-api-0\" (UID: \"9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f\") " pod="openstack/cinder-api-0" Oct 08 19:48:03 crc kubenswrapper[4750]: I1008 19:48:03.506177 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f-config-data\") pod \"cinder-api-0\" (UID: \"9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f\") " pod="openstack/cinder-api-0" Oct 08 19:48:03 crc kubenswrapper[4750]: I1008 19:48:03.506233 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkvfv\" (UniqueName: \"kubernetes.io/projected/9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f-kube-api-access-gkvfv\") pod \"cinder-api-0\" (UID: \"9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f\") " pod="openstack/cinder-api-0" Oct 08 19:48:03 crc kubenswrapper[4750]: I1008 19:48:03.506267 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f\") " pod="openstack/cinder-api-0" Oct 08 19:48:03 crc kubenswrapper[4750]: I1008 19:48:03.506381 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f\") " 
pod="openstack/cinder-api-0" Oct 08 19:48:03 crc kubenswrapper[4750]: I1008 19:48:03.507919 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f-logs\") pod \"cinder-api-0\" (UID: \"9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f\") " pod="openstack/cinder-api-0" Oct 08 19:48:03 crc kubenswrapper[4750]: I1008 19:48:03.514437 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f\") " pod="openstack/cinder-api-0" Oct 08 19:48:03 crc kubenswrapper[4750]: I1008 19:48:03.516467 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f-config-data-custom\") pod \"cinder-api-0\" (UID: \"9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f\") " pod="openstack/cinder-api-0" Oct 08 19:48:03 crc kubenswrapper[4750]: I1008 19:48:03.529520 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f-scripts\") pod \"cinder-api-0\" (UID: \"9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f\") " pod="openstack/cinder-api-0" Oct 08 19:48:03 crc kubenswrapper[4750]: I1008 19:48:03.533588 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkvfv\" (UniqueName: \"kubernetes.io/projected/9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f-kube-api-access-gkvfv\") pod \"cinder-api-0\" (UID: \"9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f\") " pod="openstack/cinder-api-0" Oct 08 19:48:03 crc kubenswrapper[4750]: I1008 19:48:03.534784 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f-config-data\") pod \"cinder-api-0\" (UID: \"9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f\") " pod="openstack/cinder-api-0" Oct 08 19:48:03 crc kubenswrapper[4750]: I1008 19:48:03.616769 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 08 19:48:03 crc kubenswrapper[4750]: I1008 19:48:03.637281 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86f86b88dc-8vbxj"] Oct 08 19:48:04 crc kubenswrapper[4750]: I1008 19:48:04.155675 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 08 19:48:04 crc kubenswrapper[4750]: I1008 19:48:04.630304 4750 generic.go:334] "Generic (PLEG): container finished" podID="eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6" containerID="ccc362b1bd4253f2596be75db638b5fe771ce9a1236308de3cdfa210f77c4669" exitCode=0 Oct 08 19:48:04 crc kubenswrapper[4750]: I1008 19:48:04.630388 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86f86b88dc-8vbxj" event={"ID":"eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6","Type":"ContainerDied","Data":"ccc362b1bd4253f2596be75db638b5fe771ce9a1236308de3cdfa210f77c4669"} Oct 08 19:48:04 crc kubenswrapper[4750]: I1008 19:48:04.630424 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86f86b88dc-8vbxj" event={"ID":"eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6","Type":"ContainerStarted","Data":"d41ea90c5355c4fb5fa742310a88b11868b56351d33f85eee7f04702ae5bd9dc"} Oct 08 19:48:04 crc kubenswrapper[4750]: I1008 19:48:04.633945 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f","Type":"ContainerStarted","Data":"df538dcf37472e9c76eba5d9486784f08077935fbfb4aafb50a3bb2c2e55a2bd"} Oct 08 19:48:05 crc kubenswrapper[4750]: I1008 19:48:05.646725 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86f86b88dc-8vbxj" 
event={"ID":"eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6","Type":"ContainerStarted","Data":"c6a3294ff68af21a75f4f00b56dcc5ba00d7d9a217bcd8728813c58e619f41ce"} Oct 08 19:48:05 crc kubenswrapper[4750]: I1008 19:48:05.647611 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86f86b88dc-8vbxj" Oct 08 19:48:05 crc kubenswrapper[4750]: I1008 19:48:05.650515 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f","Type":"ContainerStarted","Data":"760787f2a7393e53d392188cfba8f1c3d77516f530e0b9979c7156585219ec0a"} Oct 08 19:48:05 crc kubenswrapper[4750]: I1008 19:48:05.650612 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f","Type":"ContainerStarted","Data":"84a9507cde76a2079b8a19a1b40e6b9cd11fa97461eb63c1bdb39adae8192973"} Oct 08 19:48:05 crc kubenswrapper[4750]: I1008 19:48:05.650767 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 08 19:48:05 crc kubenswrapper[4750]: I1008 19:48:05.675231 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86f86b88dc-8vbxj" podStartSLOduration=3.675206131 podStartE2EDuration="3.675206131s" podCreationTimestamp="2025-10-08 19:48:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:48:05.664739521 +0000 UTC m=+5841.577710604" watchObservedRunningTime="2025-10-08 19:48:05.675206131 +0000 UTC m=+5841.588177144" Oct 08 19:48:05 crc kubenswrapper[4750]: I1008 19:48:05.686492 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.686471191 podStartE2EDuration="2.686471191s" podCreationTimestamp="2025-10-08 19:48:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:48:05.684996154 +0000 UTC m=+5841.597967187" watchObservedRunningTime="2025-10-08 19:48:05.686471191 +0000 UTC m=+5841.599442214" Oct 08 19:48:11 crc kubenswrapper[4750]: I1008 19:48:11.735497 4750 scope.go:117] "RemoveContainer" containerID="fcf13ef2112fe840107d0203594fc62839dd38db4382291c73d378bd35584c76" Oct 08 19:48:11 crc kubenswrapper[4750]: E1008 19:48:11.736723 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:48:13 crc kubenswrapper[4750]: I1008 19:48:13.303868 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86f86b88dc-8vbxj" Oct 08 19:48:13 crc kubenswrapper[4750]: I1008 19:48:13.381923 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847586d7c9-bf5px"] Oct 08 19:48:13 crc kubenswrapper[4750]: I1008 19:48:13.382317 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-847586d7c9-bf5px" podUID="4cd2ef1d-a118-41dc-a3df-40701656fce1" containerName="dnsmasq-dns" containerID="cri-o://b0cb2e1a762bf19dbfc5a739ed367c430fbf6ae8fa4dc03aa090ac33fbb7caad" gracePeriod=10 Oct 08 19:48:13 crc kubenswrapper[4750]: I1008 19:48:13.446743 4750 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-847586d7c9-bf5px" podUID="4cd2ef1d-a118-41dc-a3df-40701656fce1" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.79:5353: connect: connection refused" Oct 08 19:48:13 crc kubenswrapper[4750]: I1008 19:48:13.759119 4750 
generic.go:334] "Generic (PLEG): container finished" podID="4cd2ef1d-a118-41dc-a3df-40701656fce1" containerID="b0cb2e1a762bf19dbfc5a739ed367c430fbf6ae8fa4dc03aa090ac33fbb7caad" exitCode=0 Oct 08 19:48:13 crc kubenswrapper[4750]: I1008 19:48:13.759169 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847586d7c9-bf5px" event={"ID":"4cd2ef1d-a118-41dc-a3df-40701656fce1","Type":"ContainerDied","Data":"b0cb2e1a762bf19dbfc5a739ed367c430fbf6ae8fa4dc03aa090ac33fbb7caad"} Oct 08 19:48:13 crc kubenswrapper[4750]: I1008 19:48:13.923067 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847586d7c9-bf5px" Oct 08 19:48:13 crc kubenswrapper[4750]: I1008 19:48:13.989501 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cd2ef1d-a118-41dc-a3df-40701656fce1-config\") pod \"4cd2ef1d-a118-41dc-a3df-40701656fce1\" (UID: \"4cd2ef1d-a118-41dc-a3df-40701656fce1\") " Oct 08 19:48:13 crc kubenswrapper[4750]: I1008 19:48:13.989598 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cd2ef1d-a118-41dc-a3df-40701656fce1-ovsdbserver-sb\") pod \"4cd2ef1d-a118-41dc-a3df-40701656fce1\" (UID: \"4cd2ef1d-a118-41dc-a3df-40701656fce1\") " Oct 08 19:48:13 crc kubenswrapper[4750]: I1008 19:48:13.989673 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbw25\" (UniqueName: \"kubernetes.io/projected/4cd2ef1d-a118-41dc-a3df-40701656fce1-kube-api-access-nbw25\") pod \"4cd2ef1d-a118-41dc-a3df-40701656fce1\" (UID: \"4cd2ef1d-a118-41dc-a3df-40701656fce1\") " Oct 08 19:48:13 crc kubenswrapper[4750]: I1008 19:48:13.989752 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cd2ef1d-a118-41dc-a3df-40701656fce1-dns-svc\") pod 
\"4cd2ef1d-a118-41dc-a3df-40701656fce1\" (UID: \"4cd2ef1d-a118-41dc-a3df-40701656fce1\") " Oct 08 19:48:13 crc kubenswrapper[4750]: I1008 19:48:13.989936 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cd2ef1d-a118-41dc-a3df-40701656fce1-ovsdbserver-nb\") pod \"4cd2ef1d-a118-41dc-a3df-40701656fce1\" (UID: \"4cd2ef1d-a118-41dc-a3df-40701656fce1\") " Oct 08 19:48:14 crc kubenswrapper[4750]: I1008 19:48:14.000780 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cd2ef1d-a118-41dc-a3df-40701656fce1-kube-api-access-nbw25" (OuterVolumeSpecName: "kube-api-access-nbw25") pod "4cd2ef1d-a118-41dc-a3df-40701656fce1" (UID: "4cd2ef1d-a118-41dc-a3df-40701656fce1"). InnerVolumeSpecName "kube-api-access-nbw25". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:48:14 crc kubenswrapper[4750]: I1008 19:48:14.041671 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cd2ef1d-a118-41dc-a3df-40701656fce1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4cd2ef1d-a118-41dc-a3df-40701656fce1" (UID: "4cd2ef1d-a118-41dc-a3df-40701656fce1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:48:14 crc kubenswrapper[4750]: I1008 19:48:14.051178 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cd2ef1d-a118-41dc-a3df-40701656fce1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4cd2ef1d-a118-41dc-a3df-40701656fce1" (UID: "4cd2ef1d-a118-41dc-a3df-40701656fce1"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:48:14 crc kubenswrapper[4750]: I1008 19:48:14.073321 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cd2ef1d-a118-41dc-a3df-40701656fce1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4cd2ef1d-a118-41dc-a3df-40701656fce1" (UID: "4cd2ef1d-a118-41dc-a3df-40701656fce1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:48:14 crc kubenswrapper[4750]: I1008 19:48:14.076233 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cd2ef1d-a118-41dc-a3df-40701656fce1-config" (OuterVolumeSpecName: "config") pod "4cd2ef1d-a118-41dc-a3df-40701656fce1" (UID: "4cd2ef1d-a118-41dc-a3df-40701656fce1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:48:14 crc kubenswrapper[4750]: I1008 19:48:14.092294 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cd2ef1d-a118-41dc-a3df-40701656fce1-config\") on node \"crc\" DevicePath \"\"" Oct 08 19:48:14 crc kubenswrapper[4750]: I1008 19:48:14.092339 4750 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cd2ef1d-a118-41dc-a3df-40701656fce1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 19:48:14 crc kubenswrapper[4750]: I1008 19:48:14.092360 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbw25\" (UniqueName: \"kubernetes.io/projected/4cd2ef1d-a118-41dc-a3df-40701656fce1-kube-api-access-nbw25\") on node \"crc\" DevicePath \"\"" Oct 08 19:48:14 crc kubenswrapper[4750]: I1008 19:48:14.092370 4750 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cd2ef1d-a118-41dc-a3df-40701656fce1-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 19:48:14 crc 
kubenswrapper[4750]: I1008 19:48:14.092379 4750 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cd2ef1d-a118-41dc-a3df-40701656fce1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 19:48:14 crc kubenswrapper[4750]: I1008 19:48:14.780006 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847586d7c9-bf5px" event={"ID":"4cd2ef1d-a118-41dc-a3df-40701656fce1","Type":"ContainerDied","Data":"799957562c6a684644ceaa691b22eefde6268ba84a08d8aba084adaabca64444"} Oct 08 19:48:14 crc kubenswrapper[4750]: I1008 19:48:14.780077 4750 scope.go:117] "RemoveContainer" containerID="b0cb2e1a762bf19dbfc5a739ed367c430fbf6ae8fa4dc03aa090ac33fbb7caad" Oct 08 19:48:14 crc kubenswrapper[4750]: I1008 19:48:14.780151 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847586d7c9-bf5px" Oct 08 19:48:14 crc kubenswrapper[4750]: I1008 19:48:14.818660 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847586d7c9-bf5px"] Oct 08 19:48:14 crc kubenswrapper[4750]: I1008 19:48:14.823298 4750 scope.go:117] "RemoveContainer" containerID="b4293cf124e3869fc0d4ad744d74f4fb6d0d211c48caf921fcad856b0cdc95e1" Oct 08 19:48:14 crc kubenswrapper[4750]: I1008 19:48:14.826991 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847586d7c9-bf5px"] Oct 08 19:48:15 crc kubenswrapper[4750]: I1008 19:48:15.447728 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 08 19:48:15 crc kubenswrapper[4750]: I1008 19:48:15.448444 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e51cf2e9-5e27-40d9-b708-f8efd4f6447e" containerName="nova-api-log" containerID="cri-o://fd068ac7427a6fe7ef857ef184d0317fbc51eff0b5dccf026f0e15a5b6d93f56" gracePeriod=30 Oct 08 19:48:15 crc kubenswrapper[4750]: I1008 19:48:15.449064 4750 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e51cf2e9-5e27-40d9-b708-f8efd4f6447e" containerName="nova-api-api" containerID="cri-o://024f06888bf9b7e21309520201ad1ff497fa95d13c42cf2b279e1846c9c9f1be" gracePeriod=30 Oct 08 19:48:15 crc kubenswrapper[4750]: I1008 19:48:15.480518 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 19:48:15 crc kubenswrapper[4750]: I1008 19:48:15.480897 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="131a8b9f-6bd7-4bbc-ba88-7947ec0fed82" containerName="nova-scheduler-scheduler" containerID="cri-o://995c24e1ee0c39cf6c2eda91dd02c41c602b88b5e54d2b42a226c04af9f4a19e" gracePeriod=30 Oct 08 19:48:15 crc kubenswrapper[4750]: I1008 19:48:15.509242 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 19:48:15 crc kubenswrapper[4750]: I1008 19:48:15.509521 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="3e13efc0-b986-4638-ac34-35f3cddc6a02" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://f368b41476e2bd13663b0886c95c633fbcda9f0712ecd69eb945e9c8de2742c7" gracePeriod=30 Oct 08 19:48:15 crc kubenswrapper[4750]: I1008 19:48:15.528748 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 08 19:48:15 crc kubenswrapper[4750]: I1008 19:48:15.529049 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="cc2df1e2-9676-4737-ba39-8769738c1c67" containerName="nova-cell0-conductor-conductor" containerID="cri-o://38bad996873ec5c7c4003b92ea64ed74b06e9b8a1f8e60564b58d4c843f27b53" gracePeriod=30 Oct 08 19:48:15 crc kubenswrapper[4750]: I1008 19:48:15.547411 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-metadata-0"] Oct 08 19:48:15 crc kubenswrapper[4750]: I1008 19:48:15.547701 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2b26f9c3-60a6-4a77-beef-6436bfcfeee7" containerName="nova-metadata-log" containerID="cri-o://17db7b7d62c5405d816a7c6605095697f528b9aa5f53c481f52b6a83c0e97d16" gracePeriod=30 Oct 08 19:48:15 crc kubenswrapper[4750]: I1008 19:48:15.547891 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2b26f9c3-60a6-4a77-beef-6436bfcfeee7" containerName="nova-metadata-metadata" containerID="cri-o://f2a3a9bf13471707dace7f6ba15d27874dcd6a0cfd1dfc9bac584ae342df6cfe" gracePeriod=30 Oct 08 19:48:15 crc kubenswrapper[4750]: I1008 19:48:15.625257 4750 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-cell1-novncproxy-0" podUID="3e13efc0-b986-4638-ac34-35f3cddc6a02" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"http://10.217.1.66:6080/vnc_lite.html\": dial tcp 10.217.1.66:6080: connect: connection refused" Oct 08 19:48:15 crc kubenswrapper[4750]: I1008 19:48:15.800529 4750 generic.go:334] "Generic (PLEG): container finished" podID="e51cf2e9-5e27-40d9-b708-f8efd4f6447e" containerID="fd068ac7427a6fe7ef857ef184d0317fbc51eff0b5dccf026f0e15a5b6d93f56" exitCode=143 Oct 08 19:48:15 crc kubenswrapper[4750]: I1008 19:48:15.800601 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e51cf2e9-5e27-40d9-b708-f8efd4f6447e","Type":"ContainerDied","Data":"fd068ac7427a6fe7ef857ef184d0317fbc51eff0b5dccf026f0e15a5b6d93f56"} Oct 08 19:48:15 crc kubenswrapper[4750]: I1008 19:48:15.813478 4750 generic.go:334] "Generic (PLEG): container finished" podID="2b26f9c3-60a6-4a77-beef-6436bfcfeee7" containerID="17db7b7d62c5405d816a7c6605095697f528b9aa5f53c481f52b6a83c0e97d16" exitCode=143 Oct 08 19:48:15 crc kubenswrapper[4750]: I1008 
19:48:15.813536 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2b26f9c3-60a6-4a77-beef-6436bfcfeee7","Type":"ContainerDied","Data":"17db7b7d62c5405d816a7c6605095697f528b9aa5f53c481f52b6a83c0e97d16"} Oct 08 19:48:15 crc kubenswrapper[4750]: I1008 19:48:15.944288 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 08 19:48:16 crc kubenswrapper[4750]: I1008 19:48:16.559965 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 08 19:48:16 crc kubenswrapper[4750]: I1008 19:48:16.746478 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d77sr\" (UniqueName: \"kubernetes.io/projected/3e13efc0-b986-4638-ac34-35f3cddc6a02-kube-api-access-d77sr\") pod \"3e13efc0-b986-4638-ac34-35f3cddc6a02\" (UID: \"3e13efc0-b986-4638-ac34-35f3cddc6a02\") " Oct 08 19:48:16 crc kubenswrapper[4750]: I1008 19:48:16.746580 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e13efc0-b986-4638-ac34-35f3cddc6a02-combined-ca-bundle\") pod \"3e13efc0-b986-4638-ac34-35f3cddc6a02\" (UID: \"3e13efc0-b986-4638-ac34-35f3cddc6a02\") " Oct 08 19:48:16 crc kubenswrapper[4750]: I1008 19:48:16.746738 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e13efc0-b986-4638-ac34-35f3cddc6a02-config-data\") pod \"3e13efc0-b986-4638-ac34-35f3cddc6a02\" (UID: \"3e13efc0-b986-4638-ac34-35f3cddc6a02\") " Oct 08 19:48:16 crc kubenswrapper[4750]: I1008 19:48:16.749086 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cd2ef1d-a118-41dc-a3df-40701656fce1" path="/var/lib/kubelet/pods/4cd2ef1d-a118-41dc-a3df-40701656fce1/volumes" Oct 08 19:48:16 crc kubenswrapper[4750]: I1008 19:48:16.754290 4750 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e13efc0-b986-4638-ac34-35f3cddc6a02-kube-api-access-d77sr" (OuterVolumeSpecName: "kube-api-access-d77sr") pod "3e13efc0-b986-4638-ac34-35f3cddc6a02" (UID: "3e13efc0-b986-4638-ac34-35f3cddc6a02"). InnerVolumeSpecName "kube-api-access-d77sr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:48:16 crc kubenswrapper[4750]: I1008 19:48:16.784846 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e13efc0-b986-4638-ac34-35f3cddc6a02-config-data" (OuterVolumeSpecName: "config-data") pod "3e13efc0-b986-4638-ac34-35f3cddc6a02" (UID: "3e13efc0-b986-4638-ac34-35f3cddc6a02"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:48:16 crc kubenswrapper[4750]: I1008 19:48:16.790743 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e13efc0-b986-4638-ac34-35f3cddc6a02-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e13efc0-b986-4638-ac34-35f3cddc6a02" (UID: "3e13efc0-b986-4638-ac34-35f3cddc6a02"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:48:16 crc kubenswrapper[4750]: I1008 19:48:16.830374 4750 generic.go:334] "Generic (PLEG): container finished" podID="3e13efc0-b986-4638-ac34-35f3cddc6a02" containerID="f368b41476e2bd13663b0886c95c633fbcda9f0712ecd69eb945e9c8de2742c7" exitCode=0 Oct 08 19:48:16 crc kubenswrapper[4750]: I1008 19:48:16.830435 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3e13efc0-b986-4638-ac34-35f3cddc6a02","Type":"ContainerDied","Data":"f368b41476e2bd13663b0886c95c633fbcda9f0712ecd69eb945e9c8de2742c7"} Oct 08 19:48:16 crc kubenswrapper[4750]: I1008 19:48:16.830487 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3e13efc0-b986-4638-ac34-35f3cddc6a02","Type":"ContainerDied","Data":"d7752f0494ec8643887e37ee355d475081ac4a67339ef3177e01013a1c027374"} Oct 08 19:48:16 crc kubenswrapper[4750]: I1008 19:48:16.830481 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 08 19:48:16 crc kubenswrapper[4750]: I1008 19:48:16.830531 4750 scope.go:117] "RemoveContainer" containerID="f368b41476e2bd13663b0886c95c633fbcda9f0712ecd69eb945e9c8de2742c7" Oct 08 19:48:16 crc kubenswrapper[4750]: I1008 19:48:16.849896 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d77sr\" (UniqueName: \"kubernetes.io/projected/3e13efc0-b986-4638-ac34-35f3cddc6a02-kube-api-access-d77sr\") on node \"crc\" DevicePath \"\"" Oct 08 19:48:16 crc kubenswrapper[4750]: I1008 19:48:16.849959 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e13efc0-b986-4638-ac34-35f3cddc6a02-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 19:48:16 crc kubenswrapper[4750]: I1008 19:48:16.850168 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e13efc0-b986-4638-ac34-35f3cddc6a02-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 19:48:16 crc kubenswrapper[4750]: I1008 19:48:16.894101 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 19:48:16 crc kubenswrapper[4750]: I1008 19:48:16.897012 4750 scope.go:117] "RemoveContainer" containerID="f368b41476e2bd13663b0886c95c633fbcda9f0712ecd69eb945e9c8de2742c7" Oct 08 19:48:16 crc kubenswrapper[4750]: E1008 19:48:16.897748 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f368b41476e2bd13663b0886c95c633fbcda9f0712ecd69eb945e9c8de2742c7\": container with ID starting with f368b41476e2bd13663b0886c95c633fbcda9f0712ecd69eb945e9c8de2742c7 not found: ID does not exist" containerID="f368b41476e2bd13663b0886c95c633fbcda9f0712ecd69eb945e9c8de2742c7" Oct 08 19:48:16 crc kubenswrapper[4750]: I1008 19:48:16.897798 4750 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f368b41476e2bd13663b0886c95c633fbcda9f0712ecd69eb945e9c8de2742c7"} err="failed to get container status \"f368b41476e2bd13663b0886c95c633fbcda9f0712ecd69eb945e9c8de2742c7\": rpc error: code = NotFound desc = could not find container \"f368b41476e2bd13663b0886c95c633fbcda9f0712ecd69eb945e9c8de2742c7\": container with ID starting with f368b41476e2bd13663b0886c95c633fbcda9f0712ecd69eb945e9c8de2742c7 not found: ID does not exist" Oct 08 19:48:16 crc kubenswrapper[4750]: I1008 19:48:16.914165 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 19:48:16 crc kubenswrapper[4750]: I1008 19:48:16.925297 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 19:48:16 crc kubenswrapper[4750]: E1008 19:48:16.925938 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cd2ef1d-a118-41dc-a3df-40701656fce1" containerName="dnsmasq-dns" Oct 08 19:48:16 crc kubenswrapper[4750]: I1008 19:48:16.925966 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cd2ef1d-a118-41dc-a3df-40701656fce1" containerName="dnsmasq-dns" Oct 08 19:48:16 crc kubenswrapper[4750]: E1008 19:48:16.926009 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e13efc0-b986-4638-ac34-35f3cddc6a02" containerName="nova-cell1-novncproxy-novncproxy" Oct 08 19:48:16 crc kubenswrapper[4750]: I1008 19:48:16.926018 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e13efc0-b986-4638-ac34-35f3cddc6a02" containerName="nova-cell1-novncproxy-novncproxy" Oct 08 19:48:16 crc kubenswrapper[4750]: E1008 19:48:16.926053 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cd2ef1d-a118-41dc-a3df-40701656fce1" containerName="init" Oct 08 19:48:16 crc kubenswrapper[4750]: I1008 19:48:16.926062 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cd2ef1d-a118-41dc-a3df-40701656fce1" containerName="init" Oct 08 19:48:16 crc 
kubenswrapper[4750]: I1008 19:48:16.926314 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cd2ef1d-a118-41dc-a3df-40701656fce1" containerName="dnsmasq-dns" Oct 08 19:48:16 crc kubenswrapper[4750]: I1008 19:48:16.926361 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e13efc0-b986-4638-ac34-35f3cddc6a02" containerName="nova-cell1-novncproxy-novncproxy" Oct 08 19:48:16 crc kubenswrapper[4750]: I1008 19:48:16.927328 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 08 19:48:16 crc kubenswrapper[4750]: I1008 19:48:16.931010 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 08 19:48:16 crc kubenswrapper[4750]: I1008 19:48:16.931795 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 19:48:17 crc kubenswrapper[4750]: I1008 19:48:17.054028 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5g46\" (UniqueName: \"kubernetes.io/projected/181c616c-4e29-44e5-bd5b-23754e802000-kube-api-access-j5g46\") pod \"nova-cell1-novncproxy-0\" (UID: \"181c616c-4e29-44e5-bd5b-23754e802000\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 19:48:17 crc kubenswrapper[4750]: I1008 19:48:17.054129 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/181c616c-4e29-44e5-bd5b-23754e802000-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"181c616c-4e29-44e5-bd5b-23754e802000\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 19:48:17 crc kubenswrapper[4750]: I1008 19:48:17.054163 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/181c616c-4e29-44e5-bd5b-23754e802000-combined-ca-bundle\") 
pod \"nova-cell1-novncproxy-0\" (UID: \"181c616c-4e29-44e5-bd5b-23754e802000\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 19:48:17 crc kubenswrapper[4750]: I1008 19:48:17.155687 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/181c616c-4e29-44e5-bd5b-23754e802000-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"181c616c-4e29-44e5-bd5b-23754e802000\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 19:48:17 crc kubenswrapper[4750]: I1008 19:48:17.155725 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/181c616c-4e29-44e5-bd5b-23754e802000-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"181c616c-4e29-44e5-bd5b-23754e802000\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 19:48:17 crc kubenswrapper[4750]: I1008 19:48:17.155869 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5g46\" (UniqueName: \"kubernetes.io/projected/181c616c-4e29-44e5-bd5b-23754e802000-kube-api-access-j5g46\") pod \"nova-cell1-novncproxy-0\" (UID: \"181c616c-4e29-44e5-bd5b-23754e802000\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 19:48:17 crc kubenswrapper[4750]: I1008 19:48:17.162164 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/181c616c-4e29-44e5-bd5b-23754e802000-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"181c616c-4e29-44e5-bd5b-23754e802000\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 19:48:17 crc kubenswrapper[4750]: I1008 19:48:17.164622 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/181c616c-4e29-44e5-bd5b-23754e802000-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"181c616c-4e29-44e5-bd5b-23754e802000\") " 
pod="openstack/nova-cell1-novncproxy-0" Oct 08 19:48:17 crc kubenswrapper[4750]: I1008 19:48:17.172383 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5g46\" (UniqueName: \"kubernetes.io/projected/181c616c-4e29-44e5-bd5b-23754e802000-kube-api-access-j5g46\") pod \"nova-cell1-novncproxy-0\" (UID: \"181c616c-4e29-44e5-bd5b-23754e802000\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 19:48:17 crc kubenswrapper[4750]: I1008 19:48:17.373898 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 08 19:48:17 crc kubenswrapper[4750]: I1008 19:48:17.388226 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 19:48:17 crc kubenswrapper[4750]: I1008 19:48:17.565198 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131a8b9f-6bd7-4bbc-ba88-7947ec0fed82-combined-ca-bundle\") pod \"131a8b9f-6bd7-4bbc-ba88-7947ec0fed82\" (UID: \"131a8b9f-6bd7-4bbc-ba88-7947ec0fed82\") " Oct 08 19:48:17 crc kubenswrapper[4750]: I1008 19:48:17.565251 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/131a8b9f-6bd7-4bbc-ba88-7947ec0fed82-config-data\") pod \"131a8b9f-6bd7-4bbc-ba88-7947ec0fed82\" (UID: \"131a8b9f-6bd7-4bbc-ba88-7947ec0fed82\") " Oct 08 19:48:17 crc kubenswrapper[4750]: I1008 19:48:17.565454 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bjj5\" (UniqueName: \"kubernetes.io/projected/131a8b9f-6bd7-4bbc-ba88-7947ec0fed82-kube-api-access-7bjj5\") pod \"131a8b9f-6bd7-4bbc-ba88-7947ec0fed82\" (UID: \"131a8b9f-6bd7-4bbc-ba88-7947ec0fed82\") " Oct 08 19:48:17 crc kubenswrapper[4750]: I1008 19:48:17.573668 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/131a8b9f-6bd7-4bbc-ba88-7947ec0fed82-kube-api-access-7bjj5" (OuterVolumeSpecName: "kube-api-access-7bjj5") pod "131a8b9f-6bd7-4bbc-ba88-7947ec0fed82" (UID: "131a8b9f-6bd7-4bbc-ba88-7947ec0fed82"). InnerVolumeSpecName "kube-api-access-7bjj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:48:17 crc kubenswrapper[4750]: I1008 19:48:17.597879 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/131a8b9f-6bd7-4bbc-ba88-7947ec0fed82-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "131a8b9f-6bd7-4bbc-ba88-7947ec0fed82" (UID: "131a8b9f-6bd7-4bbc-ba88-7947ec0fed82"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:48:17 crc kubenswrapper[4750]: I1008 19:48:17.599494 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/131a8b9f-6bd7-4bbc-ba88-7947ec0fed82-config-data" (OuterVolumeSpecName: "config-data") pod "131a8b9f-6bd7-4bbc-ba88-7947ec0fed82" (UID: "131a8b9f-6bd7-4bbc-ba88-7947ec0fed82"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:48:17 crc kubenswrapper[4750]: I1008 19:48:17.669311 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131a8b9f-6bd7-4bbc-ba88-7947ec0fed82-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 19:48:17 crc kubenswrapper[4750]: I1008 19:48:17.669366 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/131a8b9f-6bd7-4bbc-ba88-7947ec0fed82-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 19:48:17 crc kubenswrapper[4750]: I1008 19:48:17.669380 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bjj5\" (UniqueName: \"kubernetes.io/projected/131a8b9f-6bd7-4bbc-ba88-7947ec0fed82-kube-api-access-7bjj5\") on node \"crc\" DevicePath \"\"" Oct 08 19:48:17 crc kubenswrapper[4750]: I1008 19:48:17.853181 4750 generic.go:334] "Generic (PLEG): container finished" podID="131a8b9f-6bd7-4bbc-ba88-7947ec0fed82" containerID="995c24e1ee0c39cf6c2eda91dd02c41c602b88b5e54d2b42a226c04af9f4a19e" exitCode=0 Oct 08 19:48:17 crc kubenswrapper[4750]: I1008 19:48:17.853240 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"131a8b9f-6bd7-4bbc-ba88-7947ec0fed82","Type":"ContainerDied","Data":"995c24e1ee0c39cf6c2eda91dd02c41c602b88b5e54d2b42a226c04af9f4a19e"} Oct 08 19:48:17 crc kubenswrapper[4750]: I1008 19:48:17.853273 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"131a8b9f-6bd7-4bbc-ba88-7947ec0fed82","Type":"ContainerDied","Data":"3d88aaa9069f76f38ef5c586f1bdbe34135c8f080387db5bc5222ad96510a851"} Oct 08 19:48:17 crc kubenswrapper[4750]: I1008 19:48:17.853290 4750 scope.go:117] "RemoveContainer" containerID="995c24e1ee0c39cf6c2eda91dd02c41c602b88b5e54d2b42a226c04af9f4a19e" Oct 08 19:48:17 crc kubenswrapper[4750]: I1008 19:48:17.853438 4750 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 19:48:17 crc kubenswrapper[4750]: I1008 19:48:17.886673 4750 scope.go:117] "RemoveContainer" containerID="995c24e1ee0c39cf6c2eda91dd02c41c602b88b5e54d2b42a226c04af9f4a19e" Oct 08 19:48:17 crc kubenswrapper[4750]: E1008 19:48:17.887189 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"995c24e1ee0c39cf6c2eda91dd02c41c602b88b5e54d2b42a226c04af9f4a19e\": container with ID starting with 995c24e1ee0c39cf6c2eda91dd02c41c602b88b5e54d2b42a226c04af9f4a19e not found: ID does not exist" containerID="995c24e1ee0c39cf6c2eda91dd02c41c602b88b5e54d2b42a226c04af9f4a19e" Oct 08 19:48:17 crc kubenswrapper[4750]: I1008 19:48:17.887245 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"995c24e1ee0c39cf6c2eda91dd02c41c602b88b5e54d2b42a226c04af9f4a19e"} err="failed to get container status \"995c24e1ee0c39cf6c2eda91dd02c41c602b88b5e54d2b42a226c04af9f4a19e\": rpc error: code = NotFound desc = could not find container \"995c24e1ee0c39cf6c2eda91dd02c41c602b88b5e54d2b42a226c04af9f4a19e\": container with ID starting with 995c24e1ee0c39cf6c2eda91dd02c41c602b88b5e54d2b42a226c04af9f4a19e not found: ID does not exist" Oct 08 19:48:17 crc kubenswrapper[4750]: I1008 19:48:17.899536 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 19:48:17 crc kubenswrapper[4750]: I1008 19:48:17.914004 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 19:48:17 crc kubenswrapper[4750]: I1008 19:48:17.938648 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 19:48:17 crc kubenswrapper[4750]: I1008 19:48:17.982800 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 19:48:17 crc kubenswrapper[4750]: E1008 19:48:17.984205 4750 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="131a8b9f-6bd7-4bbc-ba88-7947ec0fed82" containerName="nova-scheduler-scheduler" Oct 08 19:48:17 crc kubenswrapper[4750]: I1008 19:48:17.984309 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="131a8b9f-6bd7-4bbc-ba88-7947ec0fed82" containerName="nova-scheduler-scheduler" Oct 08 19:48:17 crc kubenswrapper[4750]: I1008 19:48:17.985686 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="131a8b9f-6bd7-4bbc-ba88-7947ec0fed82" containerName="nova-scheduler-scheduler" Oct 08 19:48:17 crc kubenswrapper[4750]: I1008 19:48:17.988444 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 19:48:17 crc kubenswrapper[4750]: I1008 19:48:17.992676 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 08 19:48:18 crc kubenswrapper[4750]: I1008 19:48:18.000851 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 19:48:18 crc kubenswrapper[4750]: I1008 19:48:18.183496 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f7030ea-8b14-4b45-a722-a49d1eb31294-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5f7030ea-8b14-4b45-a722-a49d1eb31294\") " pod="openstack/nova-scheduler-0" Oct 08 19:48:18 crc kubenswrapper[4750]: I1008 19:48:18.183945 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f7030ea-8b14-4b45-a722-a49d1eb31294-config-data\") pod \"nova-scheduler-0\" (UID: \"5f7030ea-8b14-4b45-a722-a49d1eb31294\") " pod="openstack/nova-scheduler-0" Oct 08 19:48:18 crc kubenswrapper[4750]: I1008 19:48:18.183972 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-gsv96\" (UniqueName: \"kubernetes.io/projected/5f7030ea-8b14-4b45-a722-a49d1eb31294-kube-api-access-gsv96\") pod \"nova-scheduler-0\" (UID: \"5f7030ea-8b14-4b45-a722-a49d1eb31294\") " pod="openstack/nova-scheduler-0" Oct 08 19:48:18 crc kubenswrapper[4750]: I1008 19:48:18.287040 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f7030ea-8b14-4b45-a722-a49d1eb31294-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5f7030ea-8b14-4b45-a722-a49d1eb31294\") " pod="openstack/nova-scheduler-0" Oct 08 19:48:18 crc kubenswrapper[4750]: I1008 19:48:18.287122 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f7030ea-8b14-4b45-a722-a49d1eb31294-config-data\") pod \"nova-scheduler-0\" (UID: \"5f7030ea-8b14-4b45-a722-a49d1eb31294\") " pod="openstack/nova-scheduler-0" Oct 08 19:48:18 crc kubenswrapper[4750]: I1008 19:48:18.287152 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsv96\" (UniqueName: \"kubernetes.io/projected/5f7030ea-8b14-4b45-a722-a49d1eb31294-kube-api-access-gsv96\") pod \"nova-scheduler-0\" (UID: \"5f7030ea-8b14-4b45-a722-a49d1eb31294\") " pod="openstack/nova-scheduler-0" Oct 08 19:48:18 crc kubenswrapper[4750]: I1008 19:48:18.297039 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f7030ea-8b14-4b45-a722-a49d1eb31294-config-data\") pod \"nova-scheduler-0\" (UID: \"5f7030ea-8b14-4b45-a722-a49d1eb31294\") " pod="openstack/nova-scheduler-0" Oct 08 19:48:18 crc kubenswrapper[4750]: I1008 19:48:18.298776 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f7030ea-8b14-4b45-a722-a49d1eb31294-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"5f7030ea-8b14-4b45-a722-a49d1eb31294\") " pod="openstack/nova-scheduler-0" Oct 08 19:48:18 crc kubenswrapper[4750]: I1008 19:48:18.321986 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsv96\" (UniqueName: \"kubernetes.io/projected/5f7030ea-8b14-4b45-a722-a49d1eb31294-kube-api-access-gsv96\") pod \"nova-scheduler-0\" (UID: \"5f7030ea-8b14-4b45-a722-a49d1eb31294\") " pod="openstack/nova-scheduler-0" Oct 08 19:48:18 crc kubenswrapper[4750]: I1008 19:48:18.396470 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 19:48:18 crc kubenswrapper[4750]: I1008 19:48:18.685942 4750 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="2b26f9c3-60a6-4a77-beef-6436bfcfeee7" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.77:8775/\": read tcp 10.217.0.2:39578->10.217.1.77:8775: read: connection reset by peer" Oct 08 19:48:18 crc kubenswrapper[4750]: I1008 19:48:18.686626 4750 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="2b26f9c3-60a6-4a77-beef-6436bfcfeee7" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.77:8775/\": read tcp 10.217.0.2:39564->10.217.1.77:8775: read: connection reset by peer" Oct 08 19:48:18 crc kubenswrapper[4750]: I1008 19:48:18.693348 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 08 19:48:18 crc kubenswrapper[4750]: I1008 19:48:18.699816 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc2df1e2-9676-4737-ba39-8769738c1c67-combined-ca-bundle\") pod \"cc2df1e2-9676-4737-ba39-8769738c1c67\" (UID: \"cc2df1e2-9676-4737-ba39-8769738c1c67\") " Oct 08 19:48:18 crc kubenswrapper[4750]: I1008 19:48:18.744973 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc2df1e2-9676-4737-ba39-8769738c1c67-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cc2df1e2-9676-4737-ba39-8769738c1c67" (UID: "cc2df1e2-9676-4737-ba39-8769738c1c67"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:48:18 crc kubenswrapper[4750]: I1008 19:48:18.751102 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="131a8b9f-6bd7-4bbc-ba88-7947ec0fed82" path="/var/lib/kubelet/pods/131a8b9f-6bd7-4bbc-ba88-7947ec0fed82/volumes" Oct 08 19:48:18 crc kubenswrapper[4750]: I1008 19:48:18.751879 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e13efc0-b986-4638-ac34-35f3cddc6a02" path="/var/lib/kubelet/pods/3e13efc0-b986-4638-ac34-35f3cddc6a02/volumes" Oct 08 19:48:18 crc kubenswrapper[4750]: I1008 19:48:18.801276 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc2df1e2-9676-4737-ba39-8769738c1c67-config-data\") pod \"cc2df1e2-9676-4737-ba39-8769738c1c67\" (UID: \"cc2df1e2-9676-4737-ba39-8769738c1c67\") " Oct 08 19:48:18 crc kubenswrapper[4750]: I1008 19:48:18.801430 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hl475\" (UniqueName: \"kubernetes.io/projected/cc2df1e2-9676-4737-ba39-8769738c1c67-kube-api-access-hl475\") pod 
\"cc2df1e2-9676-4737-ba39-8769738c1c67\" (UID: \"cc2df1e2-9676-4737-ba39-8769738c1c67\") " Oct 08 19:48:18 crc kubenswrapper[4750]: I1008 19:48:18.802725 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc2df1e2-9676-4737-ba39-8769738c1c67-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 19:48:18 crc kubenswrapper[4750]: I1008 19:48:18.812199 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc2df1e2-9676-4737-ba39-8769738c1c67-kube-api-access-hl475" (OuterVolumeSpecName: "kube-api-access-hl475") pod "cc2df1e2-9676-4737-ba39-8769738c1c67" (UID: "cc2df1e2-9676-4737-ba39-8769738c1c67"). InnerVolumeSpecName "kube-api-access-hl475". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:48:18 crc kubenswrapper[4750]: I1008 19:48:18.829605 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc2df1e2-9676-4737-ba39-8769738c1c67-config-data" (OuterVolumeSpecName: "config-data") pod "cc2df1e2-9676-4737-ba39-8769738c1c67" (UID: "cc2df1e2-9676-4737-ba39-8769738c1c67"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:48:18 crc kubenswrapper[4750]: I1008 19:48:18.880976 4750 generic.go:334] "Generic (PLEG): container finished" podID="2b26f9c3-60a6-4a77-beef-6436bfcfeee7" containerID="f2a3a9bf13471707dace7f6ba15d27874dcd6a0cfd1dfc9bac584ae342df6cfe" exitCode=0 Oct 08 19:48:18 crc kubenswrapper[4750]: I1008 19:48:18.884161 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2b26f9c3-60a6-4a77-beef-6436bfcfeee7","Type":"ContainerDied","Data":"f2a3a9bf13471707dace7f6ba15d27874dcd6a0cfd1dfc9bac584ae342df6cfe"} Oct 08 19:48:18 crc kubenswrapper[4750]: I1008 19:48:18.885695 4750 generic.go:334] "Generic (PLEG): container finished" podID="cc2df1e2-9676-4737-ba39-8769738c1c67" containerID="38bad996873ec5c7c4003b92ea64ed74b06e9b8a1f8e60564b58d4c843f27b53" exitCode=0 Oct 08 19:48:18 crc kubenswrapper[4750]: I1008 19:48:18.885780 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"cc2df1e2-9676-4737-ba39-8769738c1c67","Type":"ContainerDied","Data":"38bad996873ec5c7c4003b92ea64ed74b06e9b8a1f8e60564b58d4c843f27b53"} Oct 08 19:48:18 crc kubenswrapper[4750]: I1008 19:48:18.885820 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"cc2df1e2-9676-4737-ba39-8769738c1c67","Type":"ContainerDied","Data":"334feecd3069e241f128e19cb978323f18947c29bac81eead1911cbd364495d1"} Oct 08 19:48:18 crc kubenswrapper[4750]: I1008 19:48:18.885846 4750 scope.go:117] "RemoveContainer" containerID="38bad996873ec5c7c4003b92ea64ed74b06e9b8a1f8e60564b58d4c843f27b53" Oct 08 19:48:18 crc kubenswrapper[4750]: I1008 19:48:18.885889 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 08 19:48:18 crc kubenswrapper[4750]: I1008 19:48:18.892224 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"181c616c-4e29-44e5-bd5b-23754e802000","Type":"ContainerStarted","Data":"2dbfa9f3ae347504d9d3e2f2153796f2e85592c388436490b8e150bfd3a0268f"} Oct 08 19:48:18 crc kubenswrapper[4750]: I1008 19:48:18.892284 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"181c616c-4e29-44e5-bd5b-23754e802000","Type":"ContainerStarted","Data":"345d2136948035d542f60bff622f7cad14046c62be059068a35fe784db677afa"} Oct 08 19:48:18 crc kubenswrapper[4750]: I1008 19:48:18.905593 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc2df1e2-9676-4737-ba39-8769738c1c67-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 19:48:18 crc kubenswrapper[4750]: I1008 19:48:18.905617 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hl475\" (UniqueName: \"kubernetes.io/projected/cc2df1e2-9676-4737-ba39-8769738c1c67-kube-api-access-hl475\") on node \"crc\" DevicePath \"\"" Oct 08 19:48:18 crc kubenswrapper[4750]: I1008 19:48:18.928669 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 19:48:18 crc kubenswrapper[4750]: I1008 19:48:18.941301 4750 scope.go:117] "RemoveContainer" containerID="38bad996873ec5c7c4003b92ea64ed74b06e9b8a1f8e60564b58d4c843f27b53" Oct 08 19:48:18 crc kubenswrapper[4750]: E1008 19:48:18.944202 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38bad996873ec5c7c4003b92ea64ed74b06e9b8a1f8e60564b58d4c843f27b53\": container with ID starting with 38bad996873ec5c7c4003b92ea64ed74b06e9b8a1f8e60564b58d4c843f27b53 not found: ID does not exist" 
containerID="38bad996873ec5c7c4003b92ea64ed74b06e9b8a1f8e60564b58d4c843f27b53" Oct 08 19:48:18 crc kubenswrapper[4750]: I1008 19:48:18.944256 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38bad996873ec5c7c4003b92ea64ed74b06e9b8a1f8e60564b58d4c843f27b53"} err="failed to get container status \"38bad996873ec5c7c4003b92ea64ed74b06e9b8a1f8e60564b58d4c843f27b53\": rpc error: code = NotFound desc = could not find container \"38bad996873ec5c7c4003b92ea64ed74b06e9b8a1f8e60564b58d4c843f27b53\": container with ID starting with 38bad996873ec5c7c4003b92ea64ed74b06e9b8a1f8e60564b58d4c843f27b53 not found: ID does not exist" Oct 08 19:48:18 crc kubenswrapper[4750]: I1008 19:48:18.943897 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.943862983 podStartE2EDuration="2.943862983s" podCreationTimestamp="2025-10-08 19:48:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:48:18.919718993 +0000 UTC m=+5854.832690006" watchObservedRunningTime="2025-10-08 19:48:18.943862983 +0000 UTC m=+5854.856834006" Oct 08 19:48:18 crc kubenswrapper[4750]: W1008 19:48:18.958246 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f7030ea_8b14_4b45_a722_a49d1eb31294.slice/crio-6e5372e5344ec901911bad7b00fecb2b4d7637325944d600032aa26fcdcaadc0 WatchSource:0}: Error finding container 6e5372e5344ec901911bad7b00fecb2b4d7637325944d600032aa26fcdcaadc0: Status 404 returned error can't find the container with id 6e5372e5344ec901911bad7b00fecb2b4d7637325944d600032aa26fcdcaadc0 Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.012672 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.013298 
4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="5a8c7c05-0b87-4059-9184-111d44c1e83b" containerName="nova-cell1-conductor-conductor" containerID="cri-o://e1a1a8b77e3fdff45ea66080f6f9034e48092169ea6217c14a1ce1ac9916b5d7" gracePeriod=30 Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.212635 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.226099 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.238054 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.241947 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 08 19:48:19 crc kubenswrapper[4750]: E1008 19:48:19.242496 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b26f9c3-60a6-4a77-beef-6436bfcfeee7" containerName="nova-metadata-log" Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.242514 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b26f9c3-60a6-4a77-beef-6436bfcfeee7" containerName="nova-metadata-log" Oct 08 19:48:19 crc kubenswrapper[4750]: E1008 19:48:19.242533 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b26f9c3-60a6-4a77-beef-6436bfcfeee7" containerName="nova-metadata-metadata" Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.242539 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b26f9c3-60a6-4a77-beef-6436bfcfeee7" containerName="nova-metadata-metadata" Oct 08 19:48:19 crc kubenswrapper[4750]: E1008 19:48:19.242599 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc2df1e2-9676-4737-ba39-8769738c1c67" containerName="nova-cell0-conductor-conductor" Oct 08 
19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.242606 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc2df1e2-9676-4737-ba39-8769738c1c67" containerName="nova-cell0-conductor-conductor" Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.242795 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b26f9c3-60a6-4a77-beef-6436bfcfeee7" containerName="nova-metadata-metadata" Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.242819 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b26f9c3-60a6-4a77-beef-6436bfcfeee7" containerName="nova-metadata-log" Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.242831 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc2df1e2-9676-4737-ba39-8769738c1c67" containerName="nova-cell0-conductor-conductor" Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.243599 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.248296 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.262589 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.322179 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b26f9c3-60a6-4a77-beef-6436bfcfeee7-config-data\") pod \"2b26f9c3-60a6-4a77-beef-6436bfcfeee7\" (UID: \"2b26f9c3-60a6-4a77-beef-6436bfcfeee7\") " Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.322605 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b26f9c3-60a6-4a77-beef-6436bfcfeee7-logs\") pod \"2b26f9c3-60a6-4a77-beef-6436bfcfeee7\" (UID: 
\"2b26f9c3-60a6-4a77-beef-6436bfcfeee7\") " Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.322736 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b26f9c3-60a6-4a77-beef-6436bfcfeee7-combined-ca-bundle\") pod \"2b26f9c3-60a6-4a77-beef-6436bfcfeee7\" (UID: \"2b26f9c3-60a6-4a77-beef-6436bfcfeee7\") " Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.322935 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clc58\" (UniqueName: \"kubernetes.io/projected/2b26f9c3-60a6-4a77-beef-6436bfcfeee7-kube-api-access-clc58\") pod \"2b26f9c3-60a6-4a77-beef-6436bfcfeee7\" (UID: \"2b26f9c3-60a6-4a77-beef-6436bfcfeee7\") " Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.325331 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b26f9c3-60a6-4a77-beef-6436bfcfeee7-logs" (OuterVolumeSpecName: "logs") pod "2b26f9c3-60a6-4a77-beef-6436bfcfeee7" (UID: "2b26f9c3-60a6-4a77-beef-6436bfcfeee7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.329075 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b26f9c3-60a6-4a77-beef-6436bfcfeee7-kube-api-access-clc58" (OuterVolumeSpecName: "kube-api-access-clc58") pod "2b26f9c3-60a6-4a77-beef-6436bfcfeee7" (UID: "2b26f9c3-60a6-4a77-beef-6436bfcfeee7"). InnerVolumeSpecName "kube-api-access-clc58". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.420640 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b26f9c3-60a6-4a77-beef-6436bfcfeee7-config-data" (OuterVolumeSpecName: "config-data") pod "2b26f9c3-60a6-4a77-beef-6436bfcfeee7" (UID: "2b26f9c3-60a6-4a77-beef-6436bfcfeee7"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.425118 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9490ebbd-98b4-45dc-9ce1-afd9fc3c179c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9490ebbd-98b4-45dc-9ce1-afd9fc3c179c\") " pod="openstack/nova-cell0-conductor-0" Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.425223 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rqqn\" (UniqueName: \"kubernetes.io/projected/9490ebbd-98b4-45dc-9ce1-afd9fc3c179c-kube-api-access-5rqqn\") pod \"nova-cell0-conductor-0\" (UID: \"9490ebbd-98b4-45dc-9ce1-afd9fc3c179c\") " pod="openstack/nova-cell0-conductor-0" Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.425257 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9490ebbd-98b4-45dc-9ce1-afd9fc3c179c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9490ebbd-98b4-45dc-9ce1-afd9fc3c179c\") " pod="openstack/nova-cell0-conductor-0" Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.425344 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clc58\" (UniqueName: \"kubernetes.io/projected/2b26f9c3-60a6-4a77-beef-6436bfcfeee7-kube-api-access-clc58\") on node \"crc\" DevicePath \"\"" Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.425362 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b26f9c3-60a6-4a77-beef-6436bfcfeee7-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.425374 4750 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/2b26f9c3-60a6-4a77-beef-6436bfcfeee7-logs\") on node \"crc\" DevicePath \"\"" Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.471432 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b26f9c3-60a6-4a77-beef-6436bfcfeee7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b26f9c3-60a6-4a77-beef-6436bfcfeee7" (UID: "2b26f9c3-60a6-4a77-beef-6436bfcfeee7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.493140 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.531279 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9490ebbd-98b4-45dc-9ce1-afd9fc3c179c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9490ebbd-98b4-45dc-9ce1-afd9fc3c179c\") " pod="openstack/nova-cell0-conductor-0" Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.531401 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rqqn\" (UniqueName: \"kubernetes.io/projected/9490ebbd-98b4-45dc-9ce1-afd9fc3c179c-kube-api-access-5rqqn\") pod \"nova-cell0-conductor-0\" (UID: \"9490ebbd-98b4-45dc-9ce1-afd9fc3c179c\") " pod="openstack/nova-cell0-conductor-0" Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.531445 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9490ebbd-98b4-45dc-9ce1-afd9fc3c179c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9490ebbd-98b4-45dc-9ce1-afd9fc3c179c\") " pod="openstack/nova-cell0-conductor-0" Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.531571 4750 reconciler_common.go:293] "Volume detached for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b26f9c3-60a6-4a77-beef-6436bfcfeee7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.546305 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9490ebbd-98b4-45dc-9ce1-afd9fc3c179c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9490ebbd-98b4-45dc-9ce1-afd9fc3c179c\") " pod="openstack/nova-cell0-conductor-0" Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.548844 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9490ebbd-98b4-45dc-9ce1-afd9fc3c179c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9490ebbd-98b4-45dc-9ce1-afd9fc3c179c\") " pod="openstack/nova-cell0-conductor-0" Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.563448 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rqqn\" (UniqueName: \"kubernetes.io/projected/9490ebbd-98b4-45dc-9ce1-afd9fc3c179c-kube-api-access-5rqqn\") pod \"nova-cell0-conductor-0\" (UID: \"9490ebbd-98b4-45dc-9ce1-afd9fc3c179c\") " pod="openstack/nova-cell0-conductor-0" Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.626002 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.632324 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e51cf2e9-5e27-40d9-b708-f8efd4f6447e-config-data\") pod \"e51cf2e9-5e27-40d9-b708-f8efd4f6447e\" (UID: \"e51cf2e9-5e27-40d9-b708-f8efd4f6447e\") " Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.632474 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e51cf2e9-5e27-40d9-b708-f8efd4f6447e-combined-ca-bundle\") pod \"e51cf2e9-5e27-40d9-b708-f8efd4f6447e\" (UID: \"e51cf2e9-5e27-40d9-b708-f8efd4f6447e\") " Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.632670 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e51cf2e9-5e27-40d9-b708-f8efd4f6447e-logs\") pod \"e51cf2e9-5e27-40d9-b708-f8efd4f6447e\" (UID: \"e51cf2e9-5e27-40d9-b708-f8efd4f6447e\") " Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.632694 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mw9zx\" (UniqueName: \"kubernetes.io/projected/e51cf2e9-5e27-40d9-b708-f8efd4f6447e-kube-api-access-mw9zx\") pod \"e51cf2e9-5e27-40d9-b708-f8efd4f6447e\" (UID: \"e51cf2e9-5e27-40d9-b708-f8efd4f6447e\") " Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.637304 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e51cf2e9-5e27-40d9-b708-f8efd4f6447e-logs" (OuterVolumeSpecName: "logs") pod "e51cf2e9-5e27-40d9-b708-f8efd4f6447e" (UID: "e51cf2e9-5e27-40d9-b708-f8efd4f6447e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.637477 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e51cf2e9-5e27-40d9-b708-f8efd4f6447e-kube-api-access-mw9zx" (OuterVolumeSpecName: "kube-api-access-mw9zx") pod "e51cf2e9-5e27-40d9-b708-f8efd4f6447e" (UID: "e51cf2e9-5e27-40d9-b708-f8efd4f6447e"). InnerVolumeSpecName "kube-api-access-mw9zx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.671809 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e51cf2e9-5e27-40d9-b708-f8efd4f6447e-config-data" (OuterVolumeSpecName: "config-data") pod "e51cf2e9-5e27-40d9-b708-f8efd4f6447e" (UID: "e51cf2e9-5e27-40d9-b708-f8efd4f6447e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.684627 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e51cf2e9-5e27-40d9-b708-f8efd4f6447e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e51cf2e9-5e27-40d9-b708-f8efd4f6447e" (UID: "e51cf2e9-5e27-40d9-b708-f8efd4f6447e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.734955 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e51cf2e9-5e27-40d9-b708-f8efd4f6447e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.735006 4750 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e51cf2e9-5e27-40d9-b708-f8efd4f6447e-logs\") on node \"crc\" DevicePath \"\"" Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.735018 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mw9zx\" (UniqueName: \"kubernetes.io/projected/e51cf2e9-5e27-40d9-b708-f8efd4f6447e-kube-api-access-mw9zx\") on node \"crc\" DevicePath \"\"" Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.735030 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e51cf2e9-5e27-40d9-b708-f8efd4f6447e-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.909013 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5f7030ea-8b14-4b45-a722-a49d1eb31294","Type":"ContainerStarted","Data":"c13239fb5195b91713cba945910ec813853653878ffc84a62a604f3cbf39f70a"} Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.909509 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5f7030ea-8b14-4b45-a722-a49d1eb31294","Type":"ContainerStarted","Data":"6e5372e5344ec901911bad7b00fecb2b4d7637325944d600032aa26fcdcaadc0"} Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.912497 4750 generic.go:334] "Generic (PLEG): container finished" podID="e51cf2e9-5e27-40d9-b708-f8efd4f6447e" containerID="024f06888bf9b7e21309520201ad1ff497fa95d13c42cf2b279e1846c9c9f1be" exitCode=0 Oct 08 19:48:19 crc 
kubenswrapper[4750]: I1008 19:48:19.912659 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e51cf2e9-5e27-40d9-b708-f8efd4f6447e","Type":"ContainerDied","Data":"024f06888bf9b7e21309520201ad1ff497fa95d13c42cf2b279e1846c9c9f1be"} Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.912658 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.912702 4750 scope.go:117] "RemoveContainer" containerID="024f06888bf9b7e21309520201ad1ff497fa95d13c42cf2b279e1846c9c9f1be" Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.912689 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e51cf2e9-5e27-40d9-b708-f8efd4f6447e","Type":"ContainerDied","Data":"e395ee5c75fb3c927c4e642b3ac8c326978ec31623771e785923fd6f82518b7d"} Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.916212 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2b26f9c3-60a6-4a77-beef-6436bfcfeee7","Type":"ContainerDied","Data":"f561a813331f508a21923c4b4024fe3e17b25a9e5a9473dad2cd494631b48dce"} Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.916434 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.946353 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.946331391 podStartE2EDuration="2.946331391s" podCreationTimestamp="2025-10-08 19:48:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:48:19.940419275 +0000 UTC m=+5855.853390288" watchObservedRunningTime="2025-10-08 19:48:19.946331391 +0000 UTC m=+5855.859302404" Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.949404 4750 scope.go:117] "RemoveContainer" containerID="fd068ac7427a6fe7ef857ef184d0317fbc51eff0b5dccf026f0e15a5b6d93f56" Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.988445 4750 scope.go:117] "RemoveContainer" containerID="024f06888bf9b7e21309520201ad1ff497fa95d13c42cf2b279e1846c9c9f1be" Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.988657 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 19:48:19 crc kubenswrapper[4750]: E1008 19:48:19.991048 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"024f06888bf9b7e21309520201ad1ff497fa95d13c42cf2b279e1846c9c9f1be\": container with ID starting with 024f06888bf9b7e21309520201ad1ff497fa95d13c42cf2b279e1846c9c9f1be not found: ID does not exist" containerID="024f06888bf9b7e21309520201ad1ff497fa95d13c42cf2b279e1846c9c9f1be" Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.991081 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"024f06888bf9b7e21309520201ad1ff497fa95d13c42cf2b279e1846c9c9f1be"} err="failed to get container status \"024f06888bf9b7e21309520201ad1ff497fa95d13c42cf2b279e1846c9c9f1be\": rpc error: code = NotFound desc = could not find container 
\"024f06888bf9b7e21309520201ad1ff497fa95d13c42cf2b279e1846c9c9f1be\": container with ID starting with 024f06888bf9b7e21309520201ad1ff497fa95d13c42cf2b279e1846c9c9f1be not found: ID does not exist" Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.991104 4750 scope.go:117] "RemoveContainer" containerID="fd068ac7427a6fe7ef857ef184d0317fbc51eff0b5dccf026f0e15a5b6d93f56" Oct 08 19:48:19 crc kubenswrapper[4750]: E1008 19:48:19.993978 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd068ac7427a6fe7ef857ef184d0317fbc51eff0b5dccf026f0e15a5b6d93f56\": container with ID starting with fd068ac7427a6fe7ef857ef184d0317fbc51eff0b5dccf026f0e15a5b6d93f56 not found: ID does not exist" containerID="fd068ac7427a6fe7ef857ef184d0317fbc51eff0b5dccf026f0e15a5b6d93f56" Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.994004 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd068ac7427a6fe7ef857ef184d0317fbc51eff0b5dccf026f0e15a5b6d93f56"} err="failed to get container status \"fd068ac7427a6fe7ef857ef184d0317fbc51eff0b5dccf026f0e15a5b6d93f56\": rpc error: code = NotFound desc = could not find container \"fd068ac7427a6fe7ef857ef184d0317fbc51eff0b5dccf026f0e15a5b6d93f56\": container with ID starting with fd068ac7427a6fe7ef857ef184d0317fbc51eff0b5dccf026f0e15a5b6d93f56 not found: ID does not exist" Oct 08 19:48:19 crc kubenswrapper[4750]: I1008 19:48:19.994019 4750 scope.go:117] "RemoveContainer" containerID="f2a3a9bf13471707dace7f6ba15d27874dcd6a0cfd1dfc9bac584ae342df6cfe" Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 19:48:20.033986 4750 scope.go:117] "RemoveContainer" containerID="17db7b7d62c5405d816a7c6605095697f528b9aa5f53c481f52b6a83c0e97d16" Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 19:48:20.055625 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 
19:48:20.066281 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 19:48:20.072143 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 19:48:20.081768 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 08 19:48:20 crc kubenswrapper[4750]: E1008 19:48:20.082356 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e51cf2e9-5e27-40d9-b708-f8efd4f6447e" containerName="nova-api-api" Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 19:48:20.082377 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="e51cf2e9-5e27-40d9-b708-f8efd4f6447e" containerName="nova-api-api" Oct 08 19:48:20 crc kubenswrapper[4750]: E1008 19:48:20.082398 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e51cf2e9-5e27-40d9-b708-f8efd4f6447e" containerName="nova-api-log" Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 19:48:20.082405 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="e51cf2e9-5e27-40d9-b708-f8efd4f6447e" containerName="nova-api-log" Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 19:48:20.082673 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="e51cf2e9-5e27-40d9-b708-f8efd4f6447e" containerName="nova-api-api" Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 19:48:20.082699 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="e51cf2e9-5e27-40d9-b708-f8efd4f6447e" containerName="nova-api-log" Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 19:48:20.084001 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 19:48:20.088958 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 19:48:20.090410 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 19:48:20.100026 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 19:48:20.102167 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 19:48:20.105933 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 19:48:20.160512 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 19:48:20.234289 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 19:48:20.254019 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04268d95-5a3e-416f-b0c4-2b730bbba40f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"04268d95-5a3e-416f-b0c4-2b730bbba40f\") " pod="openstack/nova-api-0" Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 19:48:20.254083 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c14ecf98-f4b1-4a8c-9057-db39c2cbdd5b-logs\") pod \"nova-metadata-0\" (UID: \"c14ecf98-f4b1-4a8c-9057-db39c2cbdd5b\") " pod="openstack/nova-metadata-0" Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 19:48:20.254272 4750 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c14ecf98-f4b1-4a8c-9057-db39c2cbdd5b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c14ecf98-f4b1-4a8c-9057-db39c2cbdd5b\") " pod="openstack/nova-metadata-0" Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 19:48:20.254379 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brs5b\" (UniqueName: \"kubernetes.io/projected/04268d95-5a3e-416f-b0c4-2b730bbba40f-kube-api-access-brs5b\") pod \"nova-api-0\" (UID: \"04268d95-5a3e-416f-b0c4-2b730bbba40f\") " pod="openstack/nova-api-0" Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 19:48:20.254424 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04268d95-5a3e-416f-b0c4-2b730bbba40f-config-data\") pod \"nova-api-0\" (UID: \"04268d95-5a3e-416f-b0c4-2b730bbba40f\") " pod="openstack/nova-api-0" Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 19:48:20.254571 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04268d95-5a3e-416f-b0c4-2b730bbba40f-logs\") pod \"nova-api-0\" (UID: \"04268d95-5a3e-416f-b0c4-2b730bbba40f\") " pod="openstack/nova-api-0" Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 19:48:20.254689 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88tnt\" (UniqueName: \"kubernetes.io/projected/c14ecf98-f4b1-4a8c-9057-db39c2cbdd5b-kube-api-access-88tnt\") pod \"nova-metadata-0\" (UID: \"c14ecf98-f4b1-4a8c-9057-db39c2cbdd5b\") " pod="openstack/nova-metadata-0" Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 19:48:20.254755 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c14ecf98-f4b1-4a8c-9057-db39c2cbdd5b-config-data\") pod \"nova-metadata-0\" (UID: \"c14ecf98-f4b1-4a8c-9057-db39c2cbdd5b\") " pod="openstack/nova-metadata-0" Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 19:48:20.359880 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88tnt\" (UniqueName: \"kubernetes.io/projected/c14ecf98-f4b1-4a8c-9057-db39c2cbdd5b-kube-api-access-88tnt\") pod \"nova-metadata-0\" (UID: \"c14ecf98-f4b1-4a8c-9057-db39c2cbdd5b\") " pod="openstack/nova-metadata-0" Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 19:48:20.360286 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c14ecf98-f4b1-4a8c-9057-db39c2cbdd5b-config-data\") pod \"nova-metadata-0\" (UID: \"c14ecf98-f4b1-4a8c-9057-db39c2cbdd5b\") " pod="openstack/nova-metadata-0" Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 19:48:20.360330 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c14ecf98-f4b1-4a8c-9057-db39c2cbdd5b-logs\") pod \"nova-metadata-0\" (UID: \"c14ecf98-f4b1-4a8c-9057-db39c2cbdd5b\") " pod="openstack/nova-metadata-0" Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 19:48:20.360355 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04268d95-5a3e-416f-b0c4-2b730bbba40f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"04268d95-5a3e-416f-b0c4-2b730bbba40f\") " pod="openstack/nova-api-0" Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 19:48:20.360431 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c14ecf98-f4b1-4a8c-9057-db39c2cbdd5b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c14ecf98-f4b1-4a8c-9057-db39c2cbdd5b\") " pod="openstack/nova-metadata-0" Oct 08 19:48:20 
crc kubenswrapper[4750]: I1008 19:48:20.360486 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brs5b\" (UniqueName: \"kubernetes.io/projected/04268d95-5a3e-416f-b0c4-2b730bbba40f-kube-api-access-brs5b\") pod \"nova-api-0\" (UID: \"04268d95-5a3e-416f-b0c4-2b730bbba40f\") " pod="openstack/nova-api-0" Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 19:48:20.360515 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04268d95-5a3e-416f-b0c4-2b730bbba40f-config-data\") pod \"nova-api-0\" (UID: \"04268d95-5a3e-416f-b0c4-2b730bbba40f\") " pod="openstack/nova-api-0" Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 19:48:20.360535 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04268d95-5a3e-416f-b0c4-2b730bbba40f-logs\") pod \"nova-api-0\" (UID: \"04268d95-5a3e-416f-b0c4-2b730bbba40f\") " pod="openstack/nova-api-0" Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 19:48:20.361126 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04268d95-5a3e-416f-b0c4-2b730bbba40f-logs\") pod \"nova-api-0\" (UID: \"04268d95-5a3e-416f-b0c4-2b730bbba40f\") " pod="openstack/nova-api-0" Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 19:48:20.362145 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c14ecf98-f4b1-4a8c-9057-db39c2cbdd5b-logs\") pod \"nova-metadata-0\" (UID: \"c14ecf98-f4b1-4a8c-9057-db39c2cbdd5b\") " pod="openstack/nova-metadata-0" Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 19:48:20.367766 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04268d95-5a3e-416f-b0c4-2b730bbba40f-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"04268d95-5a3e-416f-b0c4-2b730bbba40f\") " pod="openstack/nova-api-0" Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 19:48:20.368264 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04268d95-5a3e-416f-b0c4-2b730bbba40f-config-data\") pod \"nova-api-0\" (UID: \"04268d95-5a3e-416f-b0c4-2b730bbba40f\") " pod="openstack/nova-api-0" Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 19:48:20.369112 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c14ecf98-f4b1-4a8c-9057-db39c2cbdd5b-config-data\") pod \"nova-metadata-0\" (UID: \"c14ecf98-f4b1-4a8c-9057-db39c2cbdd5b\") " pod="openstack/nova-metadata-0" Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 19:48:20.369823 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c14ecf98-f4b1-4a8c-9057-db39c2cbdd5b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c14ecf98-f4b1-4a8c-9057-db39c2cbdd5b\") " pod="openstack/nova-metadata-0" Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 19:48:20.375811 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88tnt\" (UniqueName: \"kubernetes.io/projected/c14ecf98-f4b1-4a8c-9057-db39c2cbdd5b-kube-api-access-88tnt\") pod \"nova-metadata-0\" (UID: \"c14ecf98-f4b1-4a8c-9057-db39c2cbdd5b\") " pod="openstack/nova-metadata-0" Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 19:48:20.383963 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brs5b\" (UniqueName: \"kubernetes.io/projected/04268d95-5a3e-416f-b0c4-2b730bbba40f-kube-api-access-brs5b\") pod \"nova-api-0\" (UID: \"04268d95-5a3e-416f-b0c4-2b730bbba40f\") " pod="openstack/nova-api-0" Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 19:48:20.404626 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 19:48:20.447907 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 19:48:20.751776 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b26f9c3-60a6-4a77-beef-6436bfcfeee7" path="/var/lib/kubelet/pods/2b26f9c3-60a6-4a77-beef-6436bfcfeee7/volumes" Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 19:48:20.753234 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc2df1e2-9676-4737-ba39-8769738c1c67" path="/var/lib/kubelet/pods/cc2df1e2-9676-4737-ba39-8769738c1c67/volumes" Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 19:48:20.753881 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e51cf2e9-5e27-40d9-b708-f8efd4f6447e" path="/var/lib/kubelet/pods/e51cf2e9-5e27-40d9-b708-f8efd4f6447e/volumes" Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 19:48:20.938903 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 19:48:20 crc kubenswrapper[4750]: W1008 19:48:20.940286 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc14ecf98_f4b1_4a8c_9057_db39c2cbdd5b.slice/crio-aecf7b44a8118202ef7f96416f7d1726a7c81615cb46cfbb36fafda4e68fe1e0 WatchSource:0}: Error finding container aecf7b44a8118202ef7f96416f7d1726a7c81615cb46cfbb36fafda4e68fe1e0: Status 404 returned error can't find the container with id aecf7b44a8118202ef7f96416f7d1726a7c81615cb46cfbb36fafda4e68fe1e0 Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 19:48:20.960944 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9490ebbd-98b4-45dc-9ce1-afd9fc3c179c","Type":"ContainerStarted","Data":"b2a598f4c0e63acbf8d55a84f5806a78689693e15041d47970b9841887de7e1a"} Oct 08 19:48:20 crc 
kubenswrapper[4750]: I1008 19:48:20.961009 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9490ebbd-98b4-45dc-9ce1-afd9fc3c179c","Type":"ContainerStarted","Data":"8c8e4f91257cab988d402b2ccec302ca909bb9f9a020d4df4e0c1440c3ae7317"} Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 19:48:20.961312 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 08 19:48:20 crc kubenswrapper[4750]: I1008 19:48:20.979858 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=1.979835452 podStartE2EDuration="1.979835452s" podCreationTimestamp="2025-10-08 19:48:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:48:20.975797791 +0000 UTC m=+5856.888768814" watchObservedRunningTime="2025-10-08 19:48:20.979835452 +0000 UTC m=+5856.892806485" Oct 08 19:48:21 crc kubenswrapper[4750]: I1008 19:48:21.022642 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 19:48:21 crc kubenswrapper[4750]: W1008 19:48:21.036018 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04268d95_5a3e_416f_b0c4_2b730bbba40f.slice/crio-d60547ffed9d6a404b6bbd33ac2d96f9bbdfe85cef6f3ceb7d0b2a9e6c87d67d WatchSource:0}: Error finding container d60547ffed9d6a404b6bbd33ac2d96f9bbdfe85cef6f3ceb7d0b2a9e6c87d67d: Status 404 returned error can't find the container with id d60547ffed9d6a404b6bbd33ac2d96f9bbdfe85cef6f3ceb7d0b2a9e6c87d67d Oct 08 19:48:21 crc kubenswrapper[4750]: I1008 19:48:21.973000 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"c14ecf98-f4b1-4a8c-9057-db39c2cbdd5b","Type":"ContainerStarted","Data":"288f5a319e5ea42e677d2b21f74e1d2ead48d53937104fcaa09f88ff1ba55884"} Oct 08 19:48:21 crc kubenswrapper[4750]: I1008 19:48:21.973617 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c14ecf98-f4b1-4a8c-9057-db39c2cbdd5b","Type":"ContainerStarted","Data":"4c41a654425ca2ba901f4ebeee0b0003128be4608a8aecac24a12411b7e8845a"} Oct 08 19:48:21 crc kubenswrapper[4750]: I1008 19:48:21.973635 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c14ecf98-f4b1-4a8c-9057-db39c2cbdd5b","Type":"ContainerStarted","Data":"aecf7b44a8118202ef7f96416f7d1726a7c81615cb46cfbb36fafda4e68fe1e0"} Oct 08 19:48:21 crc kubenswrapper[4750]: I1008 19:48:21.975689 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"04268d95-5a3e-416f-b0c4-2b730bbba40f","Type":"ContainerStarted","Data":"b3d37da14ac0dff94f0d44fe719585733673cb1f40516118481cacb3567aecf2"} Oct 08 19:48:21 crc kubenswrapper[4750]: I1008 19:48:21.975735 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"04268d95-5a3e-416f-b0c4-2b730bbba40f","Type":"ContainerStarted","Data":"deea53c051bc70a746687297fa607c2eda4a60e3fe4cd8d75b029ff81f81b1bc"} Oct 08 19:48:21 crc kubenswrapper[4750]: I1008 19:48:21.975749 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"04268d95-5a3e-416f-b0c4-2b730bbba40f","Type":"ContainerStarted","Data":"d60547ffed9d6a404b6bbd33ac2d96f9bbdfe85cef6f3ceb7d0b2a9e6c87d67d"} Oct 08 19:48:22 crc kubenswrapper[4750]: I1008 19:48:22.014313 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.014291376 podStartE2EDuration="3.014291376s" podCreationTimestamp="2025-10-08 19:48:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:48:22.003168079 +0000 UTC m=+5857.916139092" watchObservedRunningTime="2025-10-08 19:48:22.014291376 +0000 UTC m=+5857.927262389" Oct 08 19:48:22 crc kubenswrapper[4750]: I1008 19:48:22.031936 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.031907833 podStartE2EDuration="3.031907833s" podCreationTimestamp="2025-10-08 19:48:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:48:22.026512179 +0000 UTC m=+5857.939483182" watchObservedRunningTime="2025-10-08 19:48:22.031907833 +0000 UTC m=+5857.944878846" Oct 08 19:48:22 crc kubenswrapper[4750]: I1008 19:48:22.374929 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 08 19:48:22 crc kubenswrapper[4750]: I1008 19:48:22.740069 4750 scope.go:117] "RemoveContainer" containerID="fcf13ef2112fe840107d0203594fc62839dd38db4382291c73d378bd35584c76" Oct 08 19:48:22 crc kubenswrapper[4750]: E1008 19:48:22.740670 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:48:23 crc kubenswrapper[4750]: I1008 19:48:23.015807 4750 generic.go:334] "Generic (PLEG): container finished" podID="5a8c7c05-0b87-4059-9184-111d44c1e83b" containerID="e1a1a8b77e3fdff45ea66080f6f9034e48092169ea6217c14a1ce1ac9916b5d7" exitCode=0 Oct 08 19:48:23 crc kubenswrapper[4750]: I1008 19:48:23.015941 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-conductor-0" event={"ID":"5a8c7c05-0b87-4059-9184-111d44c1e83b","Type":"ContainerDied","Data":"e1a1a8b77e3fdff45ea66080f6f9034e48092169ea6217c14a1ce1ac9916b5d7"} Oct 08 19:48:23 crc kubenswrapper[4750]: I1008 19:48:23.184018 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 08 19:48:23 crc kubenswrapper[4750]: I1008 19:48:23.333663 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhqqn\" (UniqueName: \"kubernetes.io/projected/5a8c7c05-0b87-4059-9184-111d44c1e83b-kube-api-access-mhqqn\") pod \"5a8c7c05-0b87-4059-9184-111d44c1e83b\" (UID: \"5a8c7c05-0b87-4059-9184-111d44c1e83b\") " Oct 08 19:48:23 crc kubenswrapper[4750]: I1008 19:48:23.333720 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a8c7c05-0b87-4059-9184-111d44c1e83b-combined-ca-bundle\") pod \"5a8c7c05-0b87-4059-9184-111d44c1e83b\" (UID: \"5a8c7c05-0b87-4059-9184-111d44c1e83b\") " Oct 08 19:48:23 crc kubenswrapper[4750]: I1008 19:48:23.333790 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a8c7c05-0b87-4059-9184-111d44c1e83b-config-data\") pod \"5a8c7c05-0b87-4059-9184-111d44c1e83b\" (UID: \"5a8c7c05-0b87-4059-9184-111d44c1e83b\") " Oct 08 19:48:23 crc kubenswrapper[4750]: I1008 19:48:23.343270 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a8c7c05-0b87-4059-9184-111d44c1e83b-kube-api-access-mhqqn" (OuterVolumeSpecName: "kube-api-access-mhqqn") pod "5a8c7c05-0b87-4059-9184-111d44c1e83b" (UID: "5a8c7c05-0b87-4059-9184-111d44c1e83b"). InnerVolumeSpecName "kube-api-access-mhqqn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:48:23 crc kubenswrapper[4750]: I1008 19:48:23.362516 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a8c7c05-0b87-4059-9184-111d44c1e83b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5a8c7c05-0b87-4059-9184-111d44c1e83b" (UID: "5a8c7c05-0b87-4059-9184-111d44c1e83b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:48:23 crc kubenswrapper[4750]: I1008 19:48:23.372440 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a8c7c05-0b87-4059-9184-111d44c1e83b-config-data" (OuterVolumeSpecName: "config-data") pod "5a8c7c05-0b87-4059-9184-111d44c1e83b" (UID: "5a8c7c05-0b87-4059-9184-111d44c1e83b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:48:23 crc kubenswrapper[4750]: I1008 19:48:23.397031 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 08 19:48:23 crc kubenswrapper[4750]: I1008 19:48:23.436686 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhqqn\" (UniqueName: \"kubernetes.io/projected/5a8c7c05-0b87-4059-9184-111d44c1e83b-kube-api-access-mhqqn\") on node \"crc\" DevicePath \"\"" Oct 08 19:48:23 crc kubenswrapper[4750]: I1008 19:48:23.436723 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a8c7c05-0b87-4059-9184-111d44c1e83b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 19:48:23 crc kubenswrapper[4750]: I1008 19:48:23.436733 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a8c7c05-0b87-4059-9184-111d44c1e83b-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 19:48:24 crc kubenswrapper[4750]: I1008 19:48:24.035429 4750 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"5a8c7c05-0b87-4059-9184-111d44c1e83b","Type":"ContainerDied","Data":"1c2dfede709f7ba1eaf80ec8dce636fe541e4a9b67234b74c4b3054a9a9a54f4"} Oct 08 19:48:24 crc kubenswrapper[4750]: I1008 19:48:24.036407 4750 scope.go:117] "RemoveContainer" containerID="e1a1a8b77e3fdff45ea66080f6f9034e48092169ea6217c14a1ce1ac9916b5d7" Oct 08 19:48:24 crc kubenswrapper[4750]: I1008 19:48:24.035691 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 08 19:48:24 crc kubenswrapper[4750]: I1008 19:48:24.096733 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 08 19:48:24 crc kubenswrapper[4750]: I1008 19:48:24.115031 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 08 19:48:24 crc kubenswrapper[4750]: I1008 19:48:24.141657 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 08 19:48:24 crc kubenswrapper[4750]: E1008 19:48:24.142236 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a8c7c05-0b87-4059-9184-111d44c1e83b" containerName="nova-cell1-conductor-conductor" Oct 08 19:48:24 crc kubenswrapper[4750]: I1008 19:48:24.142256 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a8c7c05-0b87-4059-9184-111d44c1e83b" containerName="nova-cell1-conductor-conductor" Oct 08 19:48:24 crc kubenswrapper[4750]: I1008 19:48:24.142515 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a8c7c05-0b87-4059-9184-111d44c1e83b" containerName="nova-cell1-conductor-conductor" Oct 08 19:48:24 crc kubenswrapper[4750]: I1008 19:48:24.143415 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 08 19:48:24 crc kubenswrapper[4750]: I1008 19:48:24.143582 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 08 19:48:24 crc kubenswrapper[4750]: I1008 19:48:24.157298 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 08 19:48:24 crc kubenswrapper[4750]: I1008 19:48:24.261084 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41ac8be4-8bbb-4e70-ba9a-4e5a995da828-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"41ac8be4-8bbb-4e70-ba9a-4e5a995da828\") " pod="openstack/nova-cell1-conductor-0" Oct 08 19:48:24 crc kubenswrapper[4750]: I1008 19:48:24.261446 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41ac8be4-8bbb-4e70-ba9a-4e5a995da828-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"41ac8be4-8bbb-4e70-ba9a-4e5a995da828\") " pod="openstack/nova-cell1-conductor-0" Oct 08 19:48:24 crc kubenswrapper[4750]: I1008 19:48:24.261846 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjcz8\" (UniqueName: \"kubernetes.io/projected/41ac8be4-8bbb-4e70-ba9a-4e5a995da828-kube-api-access-sjcz8\") pod \"nova-cell1-conductor-0\" (UID: \"41ac8be4-8bbb-4e70-ba9a-4e5a995da828\") " pod="openstack/nova-cell1-conductor-0" Oct 08 19:48:24 crc kubenswrapper[4750]: I1008 19:48:24.365431 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41ac8be4-8bbb-4e70-ba9a-4e5a995da828-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"41ac8be4-8bbb-4e70-ba9a-4e5a995da828\") " pod="openstack/nova-cell1-conductor-0" Oct 08 19:48:24 crc kubenswrapper[4750]: I1008 19:48:24.365838 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/41ac8be4-8bbb-4e70-ba9a-4e5a995da828-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"41ac8be4-8bbb-4e70-ba9a-4e5a995da828\") " pod="openstack/nova-cell1-conductor-0" Oct 08 19:48:24 crc kubenswrapper[4750]: I1008 19:48:24.366008 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjcz8\" (UniqueName: \"kubernetes.io/projected/41ac8be4-8bbb-4e70-ba9a-4e5a995da828-kube-api-access-sjcz8\") pod \"nova-cell1-conductor-0\" (UID: \"41ac8be4-8bbb-4e70-ba9a-4e5a995da828\") " pod="openstack/nova-cell1-conductor-0" Oct 08 19:48:24 crc kubenswrapper[4750]: I1008 19:48:24.386345 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41ac8be4-8bbb-4e70-ba9a-4e5a995da828-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"41ac8be4-8bbb-4e70-ba9a-4e5a995da828\") " pod="openstack/nova-cell1-conductor-0" Oct 08 19:48:24 crc kubenswrapper[4750]: I1008 19:48:24.387691 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41ac8be4-8bbb-4e70-ba9a-4e5a995da828-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"41ac8be4-8bbb-4e70-ba9a-4e5a995da828\") " pod="openstack/nova-cell1-conductor-0" Oct 08 19:48:24 crc kubenswrapper[4750]: I1008 19:48:24.391056 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjcz8\" (UniqueName: \"kubernetes.io/projected/41ac8be4-8bbb-4e70-ba9a-4e5a995da828-kube-api-access-sjcz8\") pod \"nova-cell1-conductor-0\" (UID: \"41ac8be4-8bbb-4e70-ba9a-4e5a995da828\") " pod="openstack/nova-cell1-conductor-0" Oct 08 19:48:24 crc kubenswrapper[4750]: I1008 19:48:24.472397 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 08 19:48:24 crc kubenswrapper[4750]: I1008 19:48:24.750076 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a8c7c05-0b87-4059-9184-111d44c1e83b" path="/var/lib/kubelet/pods/5a8c7c05-0b87-4059-9184-111d44c1e83b/volumes" Oct 08 19:48:24 crc kubenswrapper[4750]: I1008 19:48:24.982577 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 08 19:48:25 crc kubenswrapper[4750]: W1008 19:48:25.000933 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41ac8be4_8bbb_4e70_ba9a_4e5a995da828.slice/crio-b04d298acb0189bb84763732ad40f5e3da78c6eedb3c1387abf165d9387ac6c9 WatchSource:0}: Error finding container b04d298acb0189bb84763732ad40f5e3da78c6eedb3c1387abf165d9387ac6c9: Status 404 returned error can't find the container with id b04d298acb0189bb84763732ad40f5e3da78c6eedb3c1387abf165d9387ac6c9 Oct 08 19:48:25 crc kubenswrapper[4750]: I1008 19:48:25.051660 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"41ac8be4-8bbb-4e70-ba9a-4e5a995da828","Type":"ContainerStarted","Data":"b04d298acb0189bb84763732ad40f5e3da78c6eedb3c1387abf165d9387ac6c9"} Oct 08 19:48:25 crc kubenswrapper[4750]: I1008 19:48:25.404871 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 08 19:48:25 crc kubenswrapper[4750]: I1008 19:48:25.405938 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 08 19:48:26 crc kubenswrapper[4750]: I1008 19:48:26.066178 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"41ac8be4-8bbb-4e70-ba9a-4e5a995da828","Type":"ContainerStarted","Data":"1a1794070b76f91054c0db38e9732abb36ace5ceb85705e0e79451846e46a30c"} Oct 08 19:48:26 crc kubenswrapper[4750]: 
I1008 19:48:26.084228 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.084205313 podStartE2EDuration="2.084205313s" podCreationTimestamp="2025-10-08 19:48:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:48:26.081878525 +0000 UTC m=+5861.994849588" watchObservedRunningTime="2025-10-08 19:48:26.084205313 +0000 UTC m=+5861.997176336" Oct 08 19:48:27 crc kubenswrapper[4750]: I1008 19:48:27.081742 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 08 19:48:27 crc kubenswrapper[4750]: I1008 19:48:27.375400 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 08 19:48:27 crc kubenswrapper[4750]: I1008 19:48:27.398101 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 08 19:48:28 crc kubenswrapper[4750]: I1008 19:48:28.110652 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 08 19:48:28 crc kubenswrapper[4750]: I1008 19:48:28.397886 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 08 19:48:28 crc kubenswrapper[4750]: I1008 19:48:28.475492 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 08 19:48:29 crc kubenswrapper[4750]: I1008 19:48:29.144274 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 08 19:48:29 crc kubenswrapper[4750]: I1008 19:48:29.666284 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 08 19:48:30 crc kubenswrapper[4750]: I1008 19:48:30.405751 4750 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 08 19:48:30 crc kubenswrapper[4750]: I1008 19:48:30.405821 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 08 19:48:30 crc kubenswrapper[4750]: I1008 19:48:30.449201 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 08 19:48:30 crc kubenswrapper[4750]: I1008 19:48:30.449593 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 08 19:48:31 crc kubenswrapper[4750]: I1008 19:48:31.488839 4750 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c14ecf98-f4b1-4a8c-9057-db39c2cbdd5b" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.90:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 08 19:48:31 crc kubenswrapper[4750]: I1008 19:48:31.488855 4750 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c14ecf98-f4b1-4a8c-9057-db39c2cbdd5b" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.90:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 08 19:48:31 crc kubenswrapper[4750]: I1008 19:48:31.570841 4750 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="04268d95-5a3e-416f-b0c4-2b730bbba40f" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.91:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 08 19:48:31 crc kubenswrapper[4750]: I1008 19:48:31.571622 4750 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="04268d95-5a3e-416f-b0c4-2b730bbba40f" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.91:8774/\": context deadline 
exceeded (Client.Timeout exceeded while awaiting headers)" Oct 08 19:48:34 crc kubenswrapper[4750]: I1008 19:48:34.503956 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 08 19:48:34 crc kubenswrapper[4750]: I1008 19:48:34.751041 4750 scope.go:117] "RemoveContainer" containerID="fcf13ef2112fe840107d0203594fc62839dd38db4382291c73d378bd35584c76" Oct 08 19:48:34 crc kubenswrapper[4750]: E1008 19:48:34.752757 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:48:35 crc kubenswrapper[4750]: I1008 19:48:35.857132 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 19:48:35 crc kubenswrapper[4750]: I1008 19:48:35.858890 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 08 19:48:35 crc kubenswrapper[4750]: I1008 19:48:35.871307 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 19:48:35 crc kubenswrapper[4750]: I1008 19:48:35.898889 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 08 19:48:35 crc kubenswrapper[4750]: I1008 19:48:35.938975 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e640d7c4-5b73-40af-ad2c-96fbfebe2956-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e640d7c4-5b73-40af-ad2c-96fbfebe2956\") " pod="openstack/cinder-scheduler-0" Oct 08 19:48:35 crc kubenswrapper[4750]: I1008 19:48:35.939463 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e640d7c4-5b73-40af-ad2c-96fbfebe2956-scripts\") pod \"cinder-scheduler-0\" (UID: \"e640d7c4-5b73-40af-ad2c-96fbfebe2956\") " pod="openstack/cinder-scheduler-0" Oct 08 19:48:35 crc kubenswrapper[4750]: I1008 19:48:35.939497 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e640d7c4-5b73-40af-ad2c-96fbfebe2956-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e640d7c4-5b73-40af-ad2c-96fbfebe2956\") " pod="openstack/cinder-scheduler-0" Oct 08 19:48:35 crc kubenswrapper[4750]: I1008 19:48:35.939756 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e640d7c4-5b73-40af-ad2c-96fbfebe2956-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e640d7c4-5b73-40af-ad2c-96fbfebe2956\") " pod="openstack/cinder-scheduler-0" Oct 08 19:48:35 crc kubenswrapper[4750]: I1008 19:48:35.939882 4750 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e640d7c4-5b73-40af-ad2c-96fbfebe2956-config-data\") pod \"cinder-scheduler-0\" (UID: \"e640d7c4-5b73-40af-ad2c-96fbfebe2956\") " pod="openstack/cinder-scheduler-0" Oct 08 19:48:35 crc kubenswrapper[4750]: I1008 19:48:35.939909 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b57lq\" (UniqueName: \"kubernetes.io/projected/e640d7c4-5b73-40af-ad2c-96fbfebe2956-kube-api-access-b57lq\") pod \"cinder-scheduler-0\" (UID: \"e640d7c4-5b73-40af-ad2c-96fbfebe2956\") " pod="openstack/cinder-scheduler-0" Oct 08 19:48:36 crc kubenswrapper[4750]: I1008 19:48:36.041827 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e640d7c4-5b73-40af-ad2c-96fbfebe2956-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e640d7c4-5b73-40af-ad2c-96fbfebe2956\") " pod="openstack/cinder-scheduler-0" Oct 08 19:48:36 crc kubenswrapper[4750]: I1008 19:48:36.041943 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e640d7c4-5b73-40af-ad2c-96fbfebe2956-scripts\") pod \"cinder-scheduler-0\" (UID: \"e640d7c4-5b73-40af-ad2c-96fbfebe2956\") " pod="openstack/cinder-scheduler-0" Oct 08 19:48:36 crc kubenswrapper[4750]: I1008 19:48:36.042026 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e640d7c4-5b73-40af-ad2c-96fbfebe2956-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e640d7c4-5b73-40af-ad2c-96fbfebe2956\") " pod="openstack/cinder-scheduler-0" Oct 08 19:48:36 crc kubenswrapper[4750]: I1008 19:48:36.042103 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/e640d7c4-5b73-40af-ad2c-96fbfebe2956-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e640d7c4-5b73-40af-ad2c-96fbfebe2956\") " pod="openstack/cinder-scheduler-0" Oct 08 19:48:36 crc kubenswrapper[4750]: I1008 19:48:36.042187 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e640d7c4-5b73-40af-ad2c-96fbfebe2956-config-data\") pod \"cinder-scheduler-0\" (UID: \"e640d7c4-5b73-40af-ad2c-96fbfebe2956\") " pod="openstack/cinder-scheduler-0" Oct 08 19:48:36 crc kubenswrapper[4750]: I1008 19:48:36.042234 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b57lq\" (UniqueName: \"kubernetes.io/projected/e640d7c4-5b73-40af-ad2c-96fbfebe2956-kube-api-access-b57lq\") pod \"cinder-scheduler-0\" (UID: \"e640d7c4-5b73-40af-ad2c-96fbfebe2956\") " pod="openstack/cinder-scheduler-0" Oct 08 19:48:36 crc kubenswrapper[4750]: I1008 19:48:36.043339 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e640d7c4-5b73-40af-ad2c-96fbfebe2956-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e640d7c4-5b73-40af-ad2c-96fbfebe2956\") " pod="openstack/cinder-scheduler-0" Oct 08 19:48:36 crc kubenswrapper[4750]: I1008 19:48:36.050340 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e640d7c4-5b73-40af-ad2c-96fbfebe2956-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e640d7c4-5b73-40af-ad2c-96fbfebe2956\") " pod="openstack/cinder-scheduler-0" Oct 08 19:48:36 crc kubenswrapper[4750]: I1008 19:48:36.050479 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e640d7c4-5b73-40af-ad2c-96fbfebe2956-config-data\") pod \"cinder-scheduler-0\" (UID: \"e640d7c4-5b73-40af-ad2c-96fbfebe2956\") " 
pod="openstack/cinder-scheduler-0" Oct 08 19:48:36 crc kubenswrapper[4750]: I1008 19:48:36.058276 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e640d7c4-5b73-40af-ad2c-96fbfebe2956-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e640d7c4-5b73-40af-ad2c-96fbfebe2956\") " pod="openstack/cinder-scheduler-0" Oct 08 19:48:36 crc kubenswrapper[4750]: I1008 19:48:36.059802 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e640d7c4-5b73-40af-ad2c-96fbfebe2956-scripts\") pod \"cinder-scheduler-0\" (UID: \"e640d7c4-5b73-40af-ad2c-96fbfebe2956\") " pod="openstack/cinder-scheduler-0" Oct 08 19:48:36 crc kubenswrapper[4750]: I1008 19:48:36.061884 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b57lq\" (UniqueName: \"kubernetes.io/projected/e640d7c4-5b73-40af-ad2c-96fbfebe2956-kube-api-access-b57lq\") pod \"cinder-scheduler-0\" (UID: \"e640d7c4-5b73-40af-ad2c-96fbfebe2956\") " pod="openstack/cinder-scheduler-0" Oct 08 19:48:36 crc kubenswrapper[4750]: I1008 19:48:36.233466 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 08 19:48:36 crc kubenswrapper[4750]: I1008 19:48:36.767247 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 19:48:36 crc kubenswrapper[4750]: W1008 19:48:36.775378 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode640d7c4_5b73_40af_ad2c_96fbfebe2956.slice/crio-7f24ddc65ebaabb12e7ef51aab5fe3caa79ce1fef6c369dd055a2fd55a7c62ca WatchSource:0}: Error finding container 7f24ddc65ebaabb12e7ef51aab5fe3caa79ce1fef6c369dd055a2fd55a7c62ca: Status 404 returned error can't find the container with id 7f24ddc65ebaabb12e7ef51aab5fe3caa79ce1fef6c369dd055a2fd55a7c62ca Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.095207 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.095875 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f" containerName="cinder-api-log" containerID="cri-o://84a9507cde76a2079b8a19a1b40e6b9cd11fa97461eb63c1bdb39adae8192973" gracePeriod=30 Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.096385 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f" containerName="cinder-api" containerID="cri-o://760787f2a7393e53d392188cfba8f1c3d77516f530e0b9979c7156585219ec0a" gracePeriod=30 Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.192921 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e640d7c4-5b73-40af-ad2c-96fbfebe2956","Type":"ContainerStarted","Data":"7f24ddc65ebaabb12e7ef51aab5fe3caa79ce1fef6c369dd055a2fd55a7c62ca"} Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.721338 4750 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.724033 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.726346 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.745747 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.779467 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3f4a0250-42a7-43df-99f6-71bfe6696278-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"3f4a0250-42a7-43df-99f6-71bfe6696278\") " pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.779583 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3f4a0250-42a7-43df-99f6-71bfe6696278-run\") pod \"cinder-volume-volume1-0\" (UID: \"3f4a0250-42a7-43df-99f6-71bfe6696278\") " pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.779602 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f4a0250-42a7-43df-99f6-71bfe6696278-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"3f4a0250-42a7-43df-99f6-71bfe6696278\") " pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.779624 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3f4a0250-42a7-43df-99f6-71bfe6696278-etc-iscsi\") pod 
\"cinder-volume-volume1-0\" (UID: \"3f4a0250-42a7-43df-99f6-71bfe6696278\") " pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.779652 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f4a0250-42a7-43df-99f6-71bfe6696278-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"3f4a0250-42a7-43df-99f6-71bfe6696278\") " pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.779678 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3f4a0250-42a7-43df-99f6-71bfe6696278-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"3f4a0250-42a7-43df-99f6-71bfe6696278\") " pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.779696 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/3f4a0250-42a7-43df-99f6-71bfe6696278-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"3f4a0250-42a7-43df-99f6-71bfe6696278\") " pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.779722 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3f4a0250-42a7-43df-99f6-71bfe6696278-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"3f4a0250-42a7-43df-99f6-71bfe6696278\") " pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.779739 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3f4a0250-42a7-43df-99f6-71bfe6696278-dev\") pod \"cinder-volume-volume1-0\" (UID: 
\"3f4a0250-42a7-43df-99f6-71bfe6696278\") " pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.779763 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3f4a0250-42a7-43df-99f6-71bfe6696278-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"3f4a0250-42a7-43df-99f6-71bfe6696278\") " pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.779778 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f4a0250-42a7-43df-99f6-71bfe6696278-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"3f4a0250-42a7-43df-99f6-71bfe6696278\") " pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.779808 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3f4a0250-42a7-43df-99f6-71bfe6696278-sys\") pod \"cinder-volume-volume1-0\" (UID: \"3f4a0250-42a7-43df-99f6-71bfe6696278\") " pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.779823 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f4a0250-42a7-43df-99f6-71bfe6696278-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"3f4a0250-42a7-43df-99f6-71bfe6696278\") " pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.779854 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/3f4a0250-42a7-43df-99f6-71bfe6696278-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: 
\"3f4a0250-42a7-43df-99f6-71bfe6696278\") " pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.779878 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg4hn\" (UniqueName: \"kubernetes.io/projected/3f4a0250-42a7-43df-99f6-71bfe6696278-kube-api-access-rg4hn\") pod \"cinder-volume-volume1-0\" (UID: \"3f4a0250-42a7-43df-99f6-71bfe6696278\") " pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.779902 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3f4a0250-42a7-43df-99f6-71bfe6696278-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"3f4a0250-42a7-43df-99f6-71bfe6696278\") " pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.883320 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3f4a0250-42a7-43df-99f6-71bfe6696278-run\") pod \"cinder-volume-volume1-0\" (UID: \"3f4a0250-42a7-43df-99f6-71bfe6696278\") " pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.883364 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f4a0250-42a7-43df-99f6-71bfe6696278-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"3f4a0250-42a7-43df-99f6-71bfe6696278\") " pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.883849 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3f4a0250-42a7-43df-99f6-71bfe6696278-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"3f4a0250-42a7-43df-99f6-71bfe6696278\") " pod="openstack/cinder-volume-volume1-0" Oct 08 
19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.884005 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f4a0250-42a7-43df-99f6-71bfe6696278-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"3f4a0250-42a7-43df-99f6-71bfe6696278\") " pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.884094 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3f4a0250-42a7-43df-99f6-71bfe6696278-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"3f4a0250-42a7-43df-99f6-71bfe6696278\") " pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.884116 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3f4a0250-42a7-43df-99f6-71bfe6696278-run\") pod \"cinder-volume-volume1-0\" (UID: \"3f4a0250-42a7-43df-99f6-71bfe6696278\") " pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.885752 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3f4a0250-42a7-43df-99f6-71bfe6696278-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"3f4a0250-42a7-43df-99f6-71bfe6696278\") " pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.885809 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/3f4a0250-42a7-43df-99f6-71bfe6696278-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"3f4a0250-42a7-43df-99f6-71bfe6696278\") " pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.885863 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/3f4a0250-42a7-43df-99f6-71bfe6696278-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"3f4a0250-42a7-43df-99f6-71bfe6696278\") " pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.885905 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3f4a0250-42a7-43df-99f6-71bfe6696278-dev\") pod \"cinder-volume-volume1-0\" (UID: \"3f4a0250-42a7-43df-99f6-71bfe6696278\") " pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.885931 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3f4a0250-42a7-43df-99f6-71bfe6696278-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"3f4a0250-42a7-43df-99f6-71bfe6696278\") " pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.885955 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f4a0250-42a7-43df-99f6-71bfe6696278-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"3f4a0250-42a7-43df-99f6-71bfe6696278\") " pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.886049 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3f4a0250-42a7-43df-99f6-71bfe6696278-sys\") pod \"cinder-volume-volume1-0\" (UID: \"3f4a0250-42a7-43df-99f6-71bfe6696278\") " pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.886101 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3f4a0250-42a7-43df-99f6-71bfe6696278-sys\") pod \"cinder-volume-volume1-0\" (UID: \"3f4a0250-42a7-43df-99f6-71bfe6696278\") " 
pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.886150 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f4a0250-42a7-43df-99f6-71bfe6696278-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"3f4a0250-42a7-43df-99f6-71bfe6696278\") " pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.886200 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/3f4a0250-42a7-43df-99f6-71bfe6696278-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"3f4a0250-42a7-43df-99f6-71bfe6696278\") " pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.886259 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rg4hn\" (UniqueName: \"kubernetes.io/projected/3f4a0250-42a7-43df-99f6-71bfe6696278-kube-api-access-rg4hn\") pod \"cinder-volume-volume1-0\" (UID: \"3f4a0250-42a7-43df-99f6-71bfe6696278\") " pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.886338 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3f4a0250-42a7-43df-99f6-71bfe6696278-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"3f4a0250-42a7-43df-99f6-71bfe6696278\") " pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.886461 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3f4a0250-42a7-43df-99f6-71bfe6696278-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"3f4a0250-42a7-43df-99f6-71bfe6696278\") " pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.886485 
4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3f4a0250-42a7-43df-99f6-71bfe6696278-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"3f4a0250-42a7-43df-99f6-71bfe6696278\") " pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.886604 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/3f4a0250-42a7-43df-99f6-71bfe6696278-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"3f4a0250-42a7-43df-99f6-71bfe6696278\") " pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.886609 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3f4a0250-42a7-43df-99f6-71bfe6696278-dev\") pod \"cinder-volume-volume1-0\" (UID: \"3f4a0250-42a7-43df-99f6-71bfe6696278\") " pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.886854 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3f4a0250-42a7-43df-99f6-71bfe6696278-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"3f4a0250-42a7-43df-99f6-71bfe6696278\") " pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.888008 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f4a0250-42a7-43df-99f6-71bfe6696278-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"3f4a0250-42a7-43df-99f6-71bfe6696278\") " pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.888119 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: 
\"kubernetes.io/host-path/3f4a0250-42a7-43df-99f6-71bfe6696278-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"3f4a0250-42a7-43df-99f6-71bfe6696278\") " pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.888332 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3f4a0250-42a7-43df-99f6-71bfe6696278-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"3f4a0250-42a7-43df-99f6-71bfe6696278\") " pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.888615 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3f4a0250-42a7-43df-99f6-71bfe6696278-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"3f4a0250-42a7-43df-99f6-71bfe6696278\") " pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.888619 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f4a0250-42a7-43df-99f6-71bfe6696278-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"3f4a0250-42a7-43df-99f6-71bfe6696278\") " pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.890412 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f4a0250-42a7-43df-99f6-71bfe6696278-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"3f4a0250-42a7-43df-99f6-71bfe6696278\") " pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.891890 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f4a0250-42a7-43df-99f6-71bfe6696278-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: 
\"3f4a0250-42a7-43df-99f6-71bfe6696278\") " pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.892644 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3f4a0250-42a7-43df-99f6-71bfe6696278-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"3f4a0250-42a7-43df-99f6-71bfe6696278\") " pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:37 crc kubenswrapper[4750]: I1008 19:48:37.908218 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg4hn\" (UniqueName: \"kubernetes.io/projected/3f4a0250-42a7-43df-99f6-71bfe6696278-kube-api-access-rg4hn\") pod \"cinder-volume-volume1-0\" (UID: \"3f4a0250-42a7-43df-99f6-71bfe6696278\") " pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.052678 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.216917 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e640d7c4-5b73-40af-ad2c-96fbfebe2956","Type":"ContainerStarted","Data":"da84b1b5434fd0a47bbd6d7b42d6978e96ea105eed66c6a7f9148aa47a3136d2"} Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.217463 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e640d7c4-5b73-40af-ad2c-96fbfebe2956","Type":"ContainerStarted","Data":"5557b3fe256cbb9e4f581eb823c73c351f79c58cc04a52d12b13d1505ba9a7db"} Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.234000 4750 generic.go:334] "Generic (PLEG): container finished" podID="9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f" containerID="84a9507cde76a2079b8a19a1b40e6b9cd11fa97461eb63c1bdb39adae8192973" exitCode=143 Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.234085 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-api-0" event={"ID":"9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f","Type":"ContainerDied","Data":"84a9507cde76a2079b8a19a1b40e6b9cd11fa97461eb63c1bdb39adae8192973"} Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.241376 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.241352725 podStartE2EDuration="3.241352725s" podCreationTimestamp="2025-10-08 19:48:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:48:38.238797522 +0000 UTC m=+5874.151768535" watchObservedRunningTime="2025-10-08 19:48:38.241352725 +0000 UTC m=+5874.154323758" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.458821 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 08 19:48:38 crc kubenswrapper[4750]: W1008 19:48:38.470319 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f4a0250_42a7_43df_99f6_71bfe6696278.slice/crio-0f6b9c2d857337a48402e6760a2eb747d5c45b4c9b6a3cb938f8d3876b1f91d9 WatchSource:0}: Error finding container 0f6b9c2d857337a48402e6760a2eb747d5c45b4c9b6a3cb938f8d3876b1f91d9: Status 404 returned error can't find the container with id 0f6b9c2d857337a48402e6760a2eb747d5c45b4c9b6a3cb938f8d3876b1f91d9 Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.473336 4750 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.526439 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.531981 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.538002 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.550458 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.609606 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e7b130b0-4e14-4a52-b944-1788e309b0ce-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"e7b130b0-4e14-4a52-b944-1788e309b0ce\") " pod="openstack/cinder-backup-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.609669 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e7b130b0-4e14-4a52-b944-1788e309b0ce-dev\") pod \"cinder-backup-0\" (UID: \"e7b130b0-4e14-4a52-b944-1788e309b0ce\") " pod="openstack/cinder-backup-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.609710 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e7b130b0-4e14-4a52-b944-1788e309b0ce-etc-nvme\") pod \"cinder-backup-0\" (UID: \"e7b130b0-4e14-4a52-b944-1788e309b0ce\") " pod="openstack/cinder-backup-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.609732 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7b130b0-4e14-4a52-b944-1788e309b0ce-scripts\") pod \"cinder-backup-0\" (UID: \"e7b130b0-4e14-4a52-b944-1788e309b0ce\") " pod="openstack/cinder-backup-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.609764 4750 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w476q\" (UniqueName: \"kubernetes.io/projected/e7b130b0-4e14-4a52-b944-1788e309b0ce-kube-api-access-w476q\") pod \"cinder-backup-0\" (UID: \"e7b130b0-4e14-4a52-b944-1788e309b0ce\") " pod="openstack/cinder-backup-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.609859 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e7b130b0-4e14-4a52-b944-1788e309b0ce-lib-modules\") pod \"cinder-backup-0\" (UID: \"e7b130b0-4e14-4a52-b944-1788e309b0ce\") " pod="openstack/cinder-backup-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.609890 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e7b130b0-4e14-4a52-b944-1788e309b0ce-run\") pod \"cinder-backup-0\" (UID: \"e7b130b0-4e14-4a52-b944-1788e309b0ce\") " pod="openstack/cinder-backup-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.609912 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e7b130b0-4e14-4a52-b944-1788e309b0ce-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"e7b130b0-4e14-4a52-b944-1788e309b0ce\") " pod="openstack/cinder-backup-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.609933 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7b130b0-4e14-4a52-b944-1788e309b0ce-config-data-custom\") pod \"cinder-backup-0\" (UID: \"e7b130b0-4e14-4a52-b944-1788e309b0ce\") " pod="openstack/cinder-backup-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.609973 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/e7b130b0-4e14-4a52-b944-1788e309b0ce-sys\") pod \"cinder-backup-0\" (UID: \"e7b130b0-4e14-4a52-b944-1788e309b0ce\") " pod="openstack/cinder-backup-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.609992 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7b130b0-4e14-4a52-b944-1788e309b0ce-config-data\") pod \"cinder-backup-0\" (UID: \"e7b130b0-4e14-4a52-b944-1788e309b0ce\") " pod="openstack/cinder-backup-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.610025 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e7b130b0-4e14-4a52-b944-1788e309b0ce-ceph\") pod \"cinder-backup-0\" (UID: \"e7b130b0-4e14-4a52-b944-1788e309b0ce\") " pod="openstack/cinder-backup-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.610047 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e7b130b0-4e14-4a52-b944-1788e309b0ce-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"e7b130b0-4e14-4a52-b944-1788e309b0ce\") " pod="openstack/cinder-backup-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.610076 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e7b130b0-4e14-4a52-b944-1788e309b0ce-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"e7b130b0-4e14-4a52-b944-1788e309b0ce\") " pod="openstack/cinder-backup-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.610120 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e7b130b0-4e14-4a52-b944-1788e309b0ce-etc-iscsi\") pod \"cinder-backup-0\" (UID: 
\"e7b130b0-4e14-4a52-b944-1788e309b0ce\") " pod="openstack/cinder-backup-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.610148 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7b130b0-4e14-4a52-b944-1788e309b0ce-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"e7b130b0-4e14-4a52-b944-1788e309b0ce\") " pod="openstack/cinder-backup-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.712391 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e7b130b0-4e14-4a52-b944-1788e309b0ce-lib-modules\") pod \"cinder-backup-0\" (UID: \"e7b130b0-4e14-4a52-b944-1788e309b0ce\") " pod="openstack/cinder-backup-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.712461 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e7b130b0-4e14-4a52-b944-1788e309b0ce-run\") pod \"cinder-backup-0\" (UID: \"e7b130b0-4e14-4a52-b944-1788e309b0ce\") " pod="openstack/cinder-backup-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.712488 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e7b130b0-4e14-4a52-b944-1788e309b0ce-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"e7b130b0-4e14-4a52-b944-1788e309b0ce\") " pod="openstack/cinder-backup-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.712530 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7b130b0-4e14-4a52-b944-1788e309b0ce-config-data-custom\") pod \"cinder-backup-0\" (UID: \"e7b130b0-4e14-4a52-b944-1788e309b0ce\") " pod="openstack/cinder-backup-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.712595 4750 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e7b130b0-4e14-4a52-b944-1788e309b0ce-sys\") pod \"cinder-backup-0\" (UID: \"e7b130b0-4e14-4a52-b944-1788e309b0ce\") " pod="openstack/cinder-backup-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.712618 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7b130b0-4e14-4a52-b944-1788e309b0ce-config-data\") pod \"cinder-backup-0\" (UID: \"e7b130b0-4e14-4a52-b944-1788e309b0ce\") " pod="openstack/cinder-backup-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.712646 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e7b130b0-4e14-4a52-b944-1788e309b0ce-ceph\") pod \"cinder-backup-0\" (UID: \"e7b130b0-4e14-4a52-b944-1788e309b0ce\") " pod="openstack/cinder-backup-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.712654 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e7b130b0-4e14-4a52-b944-1788e309b0ce-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"e7b130b0-4e14-4a52-b944-1788e309b0ce\") " pod="openstack/cinder-backup-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.712667 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e7b130b0-4e14-4a52-b944-1788e309b0ce-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"e7b130b0-4e14-4a52-b944-1788e309b0ce\") " pod="openstack/cinder-backup-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.712700 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e7b130b0-4e14-4a52-b944-1788e309b0ce-run\") pod \"cinder-backup-0\" (UID: \"e7b130b0-4e14-4a52-b944-1788e309b0ce\") " pod="openstack/cinder-backup-0" 
Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.712786 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e7b130b0-4e14-4a52-b944-1788e309b0ce-sys\") pod \"cinder-backup-0\" (UID: \"e7b130b0-4e14-4a52-b944-1788e309b0ce\") " pod="openstack/cinder-backup-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.712762 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e7b130b0-4e14-4a52-b944-1788e309b0ce-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"e7b130b0-4e14-4a52-b944-1788e309b0ce\") " pod="openstack/cinder-backup-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.712796 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e7b130b0-4e14-4a52-b944-1788e309b0ce-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"e7b130b0-4e14-4a52-b944-1788e309b0ce\") " pod="openstack/cinder-backup-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.712863 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e7b130b0-4e14-4a52-b944-1788e309b0ce-lib-modules\") pod \"cinder-backup-0\" (UID: \"e7b130b0-4e14-4a52-b944-1788e309b0ce\") " pod="openstack/cinder-backup-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.712932 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e7b130b0-4e14-4a52-b944-1788e309b0ce-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"e7b130b0-4e14-4a52-b944-1788e309b0ce\") " pod="openstack/cinder-backup-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.712990 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/e7b130b0-4e14-4a52-b944-1788e309b0ce-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"e7b130b0-4e14-4a52-b944-1788e309b0ce\") " pod="openstack/cinder-backup-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.713022 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7b130b0-4e14-4a52-b944-1788e309b0ce-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"e7b130b0-4e14-4a52-b944-1788e309b0ce\") " pod="openstack/cinder-backup-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.713047 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e7b130b0-4e14-4a52-b944-1788e309b0ce-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"e7b130b0-4e14-4a52-b944-1788e309b0ce\") " pod="openstack/cinder-backup-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.713070 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e7b130b0-4e14-4a52-b944-1788e309b0ce-dev\") pod \"cinder-backup-0\" (UID: \"e7b130b0-4e14-4a52-b944-1788e309b0ce\") " pod="openstack/cinder-backup-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.713099 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e7b130b0-4e14-4a52-b944-1788e309b0ce-etc-nvme\") pod \"cinder-backup-0\" (UID: \"e7b130b0-4e14-4a52-b944-1788e309b0ce\") " pod="openstack/cinder-backup-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.713099 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e7b130b0-4e14-4a52-b944-1788e309b0ce-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"e7b130b0-4e14-4a52-b944-1788e309b0ce\") " pod="openstack/cinder-backup-0" Oct 08 19:48:38 crc kubenswrapper[4750]: 
I1008 19:48:38.713115 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7b130b0-4e14-4a52-b944-1788e309b0ce-scripts\") pod \"cinder-backup-0\" (UID: \"e7b130b0-4e14-4a52-b944-1788e309b0ce\") " pod="openstack/cinder-backup-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.713187 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w476q\" (UniqueName: \"kubernetes.io/projected/e7b130b0-4e14-4a52-b944-1788e309b0ce-kube-api-access-w476q\") pod \"cinder-backup-0\" (UID: \"e7b130b0-4e14-4a52-b944-1788e309b0ce\") " pod="openstack/cinder-backup-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.713188 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e7b130b0-4e14-4a52-b944-1788e309b0ce-dev\") pod \"cinder-backup-0\" (UID: \"e7b130b0-4e14-4a52-b944-1788e309b0ce\") " pod="openstack/cinder-backup-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.713282 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e7b130b0-4e14-4a52-b944-1788e309b0ce-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"e7b130b0-4e14-4a52-b944-1788e309b0ce\") " pod="openstack/cinder-backup-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.713503 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e7b130b0-4e14-4a52-b944-1788e309b0ce-etc-nvme\") pod \"cinder-backup-0\" (UID: \"e7b130b0-4e14-4a52-b944-1788e309b0ce\") " pod="openstack/cinder-backup-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.722400 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7b130b0-4e14-4a52-b944-1788e309b0ce-config-data\") pod \"cinder-backup-0\" (UID: 
\"e7b130b0-4e14-4a52-b944-1788e309b0ce\") " pod="openstack/cinder-backup-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.723107 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e7b130b0-4e14-4a52-b944-1788e309b0ce-ceph\") pod \"cinder-backup-0\" (UID: \"e7b130b0-4e14-4a52-b944-1788e309b0ce\") " pod="openstack/cinder-backup-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.725626 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7b130b0-4e14-4a52-b944-1788e309b0ce-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"e7b130b0-4e14-4a52-b944-1788e309b0ce\") " pod="openstack/cinder-backup-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.727323 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7b130b0-4e14-4a52-b944-1788e309b0ce-config-data-custom\") pod \"cinder-backup-0\" (UID: \"e7b130b0-4e14-4a52-b944-1788e309b0ce\") " pod="openstack/cinder-backup-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.733247 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7b130b0-4e14-4a52-b944-1788e309b0ce-scripts\") pod \"cinder-backup-0\" (UID: \"e7b130b0-4e14-4a52-b944-1788e309b0ce\") " pod="openstack/cinder-backup-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.735019 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w476q\" (UniqueName: \"kubernetes.io/projected/e7b130b0-4e14-4a52-b944-1788e309b0ce-kube-api-access-w476q\") pod \"cinder-backup-0\" (UID: \"e7b130b0-4e14-4a52-b944-1788e309b0ce\") " pod="openstack/cinder-backup-0" Oct 08 19:48:38 crc kubenswrapper[4750]: I1008 19:48:38.857741 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Oct 08 19:48:39 crc kubenswrapper[4750]: I1008 19:48:39.261829 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"3f4a0250-42a7-43df-99f6-71bfe6696278","Type":"ContainerStarted","Data":"0f6b9c2d857337a48402e6760a2eb747d5c45b4c9b6a3cb938f8d3876b1f91d9"} Oct 08 19:48:39 crc kubenswrapper[4750]: I1008 19:48:39.548526 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Oct 08 19:48:40 crc kubenswrapper[4750]: I1008 19:48:40.247746 4750 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.1.86:8776/healthcheck\": read tcp 10.217.0.2:40662->10.217.1.86:8776: read: connection reset by peer" Oct 08 19:48:40 crc kubenswrapper[4750]: I1008 19:48:40.276079 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"3f4a0250-42a7-43df-99f6-71bfe6696278","Type":"ContainerStarted","Data":"c19acf42a9f1e73ddf2192e980c1840754ead3cdcbd7e1377c27a3a0aa394872"} Oct 08 19:48:40 crc kubenswrapper[4750]: I1008 19:48:40.276598 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"3f4a0250-42a7-43df-99f6-71bfe6696278","Type":"ContainerStarted","Data":"4f35e48c0bf561260e71f2b65baae34383365f5430f9e9d8cd1aff2c9e70526b"} Oct 08 19:48:40 crc kubenswrapper[4750]: I1008 19:48:40.283020 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"e7b130b0-4e14-4a52-b944-1788e309b0ce","Type":"ContainerStarted","Data":"e38bb423ed263f10123781968c3153533b728c18034dfdc26ae4e4cfd216a5ee"} Oct 08 19:48:40 crc kubenswrapper[4750]: I1008 19:48:40.315080 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" 
podStartSLOduration=2.085257591 podStartE2EDuration="3.315056412s" podCreationTimestamp="2025-10-08 19:48:37 +0000 UTC" firstStartedPulling="2025-10-08 19:48:38.472989376 +0000 UTC m=+5874.385960499" lastFinishedPulling="2025-10-08 19:48:39.702788307 +0000 UTC m=+5875.615759320" observedRunningTime="2025-10-08 19:48:40.305971787 +0000 UTC m=+5876.218942800" watchObservedRunningTime="2025-10-08 19:48:40.315056412 +0000 UTC m=+5876.228027425" Oct 08 19:48:40 crc kubenswrapper[4750]: I1008 19:48:40.409034 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 08 19:48:40 crc kubenswrapper[4750]: I1008 19:48:40.409416 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 08 19:48:40 crc kubenswrapper[4750]: I1008 19:48:40.453368 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 08 19:48:40 crc kubenswrapper[4750]: I1008 19:48:40.454168 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 08 19:48:40 crc kubenswrapper[4750]: I1008 19:48:40.458087 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 08 19:48:40 crc kubenswrapper[4750]: I1008 19:48:40.460717 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 08 19:48:40 crc kubenswrapper[4750]: I1008 19:48:40.481237 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 08 19:48:40 crc kubenswrapper[4750]: I1008 19:48:40.783936 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 08 19:48:40 crc kubenswrapper[4750]: I1008 19:48:40.873618 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkvfv\" (UniqueName: \"kubernetes.io/projected/9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f-kube-api-access-gkvfv\") pod \"9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f\" (UID: \"9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f\") " Oct 08 19:48:40 crc kubenswrapper[4750]: I1008 19:48:40.873795 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f-combined-ca-bundle\") pod \"9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f\" (UID: \"9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f\") " Oct 08 19:48:40 crc kubenswrapper[4750]: I1008 19:48:40.873821 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f-scripts\") pod \"9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f\" (UID: \"9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f\") " Oct 08 19:48:40 crc kubenswrapper[4750]: I1008 19:48:40.873872 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f-etc-machine-id\") pod \"9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f\" (UID: \"9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f\") " Oct 08 19:48:40 crc kubenswrapper[4750]: I1008 19:48:40.873944 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f-logs\") pod \"9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f\" (UID: \"9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f\") " Oct 08 19:48:40 crc kubenswrapper[4750]: I1008 19:48:40.874017 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f-config-data-custom\") pod \"9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f\" (UID: \"9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f\") " Oct 08 19:48:40 crc kubenswrapper[4750]: I1008 19:48:40.874073 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f-config-data\") pod \"9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f\" (UID: \"9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f\") " Oct 08 19:48:40 crc kubenswrapper[4750]: I1008 19:48:40.874659 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f" (UID: "9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 19:48:40 crc kubenswrapper[4750]: I1008 19:48:40.875424 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f-logs" (OuterVolumeSpecName: "logs") pod "9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f" (UID: "9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:48:40 crc kubenswrapper[4750]: I1008 19:48:40.879855 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f-kube-api-access-gkvfv" (OuterVolumeSpecName: "kube-api-access-gkvfv") pod "9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f" (UID: "9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f"). InnerVolumeSpecName "kube-api-access-gkvfv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:48:40 crc kubenswrapper[4750]: I1008 19:48:40.880053 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f-scripts" (OuterVolumeSpecName: "scripts") pod "9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f" (UID: "9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:48:40 crc kubenswrapper[4750]: I1008 19:48:40.881458 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f" (UID: "9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:48:40 crc kubenswrapper[4750]: I1008 19:48:40.910306 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f" (UID: "9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:48:40 crc kubenswrapper[4750]: I1008 19:48:40.961269 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f-config-data" (OuterVolumeSpecName: "config-data") pod "9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f" (UID: "9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:48:40 crc kubenswrapper[4750]: I1008 19:48:40.977244 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkvfv\" (UniqueName: \"kubernetes.io/projected/9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f-kube-api-access-gkvfv\") on node \"crc\" DevicePath \"\"" Oct 08 19:48:40 crc kubenswrapper[4750]: I1008 19:48:40.977706 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 19:48:40 crc kubenswrapper[4750]: I1008 19:48:40.977722 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 19:48:40 crc kubenswrapper[4750]: I1008 19:48:40.977734 4750 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 08 19:48:40 crc kubenswrapper[4750]: I1008 19:48:40.977747 4750 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f-logs\") on node \"crc\" DevicePath \"\"" Oct 08 19:48:40 crc kubenswrapper[4750]: I1008 19:48:40.977761 4750 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 08 19:48:40 crc kubenswrapper[4750]: I1008 19:48:40.977772 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 19:48:41.235194 4750 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 19:48:41.303990 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"e7b130b0-4e14-4a52-b944-1788e309b0ce","Type":"ContainerStarted","Data":"05d433ae95648b10dc8583cae47b0f82fe4562df69e1b15fdf036188934c9023"} Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 19:48:41.304535 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"e7b130b0-4e14-4a52-b944-1788e309b0ce","Type":"ContainerStarted","Data":"d617cbfb0f10db2656021e7083c3359cd3cfc835af8b128360b8ba5a814fd815"} Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 19:48:41.308189 4750 generic.go:334] "Generic (PLEG): container finished" podID="9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f" containerID="760787f2a7393e53d392188cfba8f1c3d77516f530e0b9979c7156585219ec0a" exitCode=0 Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 19:48:41.308266 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 19:48:41.308303 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f","Type":"ContainerDied","Data":"760787f2a7393e53d392188cfba8f1c3d77516f530e0b9979c7156585219ec0a"} Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 19:48:41.308334 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f","Type":"ContainerDied","Data":"df538dcf37472e9c76eba5d9486784f08077935fbfb4aafb50a3bb2c2e55a2bd"} Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 19:48:41.308358 4750 scope.go:117] "RemoveContainer" containerID="760787f2a7393e53d392188cfba8f1c3d77516f530e0b9979c7156585219ec0a" Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 19:48:41.309387 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 19:48:41.315946 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 19:48:41.319846 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 19:48:41.336330 4750 scope.go:117] "RemoveContainer" containerID="84a9507cde76a2079b8a19a1b40e6b9cd11fa97461eb63c1bdb39adae8192973" Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 19:48:41.344459 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=2.2621796180000002 podStartE2EDuration="3.344431351s" podCreationTimestamp="2025-10-08 19:48:38 +0000 UTC" firstStartedPulling="2025-10-08 19:48:39.64138471 +0000 UTC m=+5875.554355723" lastFinishedPulling="2025-10-08 19:48:40.723636443 +0000 UTC m=+5876.636607456" observedRunningTime="2025-10-08 
19:48:41.329895829 +0000 UTC m=+5877.242866862" watchObservedRunningTime="2025-10-08 19:48:41.344431351 +0000 UTC m=+5877.257402364" Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 19:48:41.390169 4750 scope.go:117] "RemoveContainer" containerID="760787f2a7393e53d392188cfba8f1c3d77516f530e0b9979c7156585219ec0a" Oct 08 19:48:41 crc kubenswrapper[4750]: E1008 19:48:41.396347 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"760787f2a7393e53d392188cfba8f1c3d77516f530e0b9979c7156585219ec0a\": container with ID starting with 760787f2a7393e53d392188cfba8f1c3d77516f530e0b9979c7156585219ec0a not found: ID does not exist" containerID="760787f2a7393e53d392188cfba8f1c3d77516f530e0b9979c7156585219ec0a" Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 19:48:41.396427 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"760787f2a7393e53d392188cfba8f1c3d77516f530e0b9979c7156585219ec0a"} err="failed to get container status \"760787f2a7393e53d392188cfba8f1c3d77516f530e0b9979c7156585219ec0a\": rpc error: code = NotFound desc = could not find container \"760787f2a7393e53d392188cfba8f1c3d77516f530e0b9979c7156585219ec0a\": container with ID starting with 760787f2a7393e53d392188cfba8f1c3d77516f530e0b9979c7156585219ec0a not found: ID does not exist" Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 19:48:41.396461 4750 scope.go:117] "RemoveContainer" containerID="84a9507cde76a2079b8a19a1b40e6b9cd11fa97461eb63c1bdb39adae8192973" Oct 08 19:48:41 crc kubenswrapper[4750]: E1008 19:48:41.397317 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84a9507cde76a2079b8a19a1b40e6b9cd11fa97461eb63c1bdb39adae8192973\": container with ID starting with 84a9507cde76a2079b8a19a1b40e6b9cd11fa97461eb63c1bdb39adae8192973 not found: ID does not exist" 
containerID="84a9507cde76a2079b8a19a1b40e6b9cd11fa97461eb63c1bdb39adae8192973" Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 19:48:41.397370 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84a9507cde76a2079b8a19a1b40e6b9cd11fa97461eb63c1bdb39adae8192973"} err="failed to get container status \"84a9507cde76a2079b8a19a1b40e6b9cd11fa97461eb63c1bdb39adae8192973\": rpc error: code = NotFound desc = could not find container \"84a9507cde76a2079b8a19a1b40e6b9cd11fa97461eb63c1bdb39adae8192973\": container with ID starting with 84a9507cde76a2079b8a19a1b40e6b9cd11fa97461eb63c1bdb39adae8192973 not found: ID does not exist" Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 19:48:41.403140 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 19:48:41.430579 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 19:48:41.447957 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 08 19:48:41 crc kubenswrapper[4750]: E1008 19:48:41.448603 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f" containerName="cinder-api-log" Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 19:48:41.448624 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f" containerName="cinder-api-log" Oct 08 19:48:41 crc kubenswrapper[4750]: E1008 19:48:41.448679 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f" containerName="cinder-api" Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 19:48:41.448686 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f" containerName="cinder-api" Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 19:48:41.448916 4750 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f" containerName="cinder-api-log" Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 19:48:41.448937 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f" containerName="cinder-api" Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 19:48:41.450350 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 19:48:41.456035 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 19:48:41.481023 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 19:48:41.498903 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/764d6df3-c738-4f73-a3a6-3b502c9052a5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"764d6df3-c738-4f73-a3a6-3b502c9052a5\") " pod="openstack/cinder-api-0" Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 19:48:41.498992 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/764d6df3-c738-4f73-a3a6-3b502c9052a5-config-data-custom\") pod \"cinder-api-0\" (UID: \"764d6df3-c738-4f73-a3a6-3b502c9052a5\") " pod="openstack/cinder-api-0" Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 19:48:41.499028 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/764d6df3-c738-4f73-a3a6-3b502c9052a5-logs\") pod \"cinder-api-0\" (UID: \"764d6df3-c738-4f73-a3a6-3b502c9052a5\") " pod="openstack/cinder-api-0" Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 19:48:41.499072 4750 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/764d6df3-c738-4f73-a3a6-3b502c9052a5-scripts\") pod \"cinder-api-0\" (UID: \"764d6df3-c738-4f73-a3a6-3b502c9052a5\") " pod="openstack/cinder-api-0" Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 19:48:41.499104 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/764d6df3-c738-4f73-a3a6-3b502c9052a5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"764d6df3-c738-4f73-a3a6-3b502c9052a5\") " pod="openstack/cinder-api-0" Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 19:48:41.499147 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b89p\" (UniqueName: \"kubernetes.io/projected/764d6df3-c738-4f73-a3a6-3b502c9052a5-kube-api-access-5b89p\") pod \"cinder-api-0\" (UID: \"764d6df3-c738-4f73-a3a6-3b502c9052a5\") " pod="openstack/cinder-api-0" Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 19:48:41.499234 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/764d6df3-c738-4f73-a3a6-3b502c9052a5-config-data\") pod \"cinder-api-0\" (UID: \"764d6df3-c738-4f73-a3a6-3b502c9052a5\") " pod="openstack/cinder-api-0" Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 19:48:41.601181 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/764d6df3-c738-4f73-a3a6-3b502c9052a5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"764d6df3-c738-4f73-a3a6-3b502c9052a5\") " pod="openstack/cinder-api-0" Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 19:48:41.601274 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/764d6df3-c738-4f73-a3a6-3b502c9052a5-config-data-custom\") pod \"cinder-api-0\" (UID: \"764d6df3-c738-4f73-a3a6-3b502c9052a5\") " pod="openstack/cinder-api-0" Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 19:48:41.601304 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/764d6df3-c738-4f73-a3a6-3b502c9052a5-logs\") pod \"cinder-api-0\" (UID: \"764d6df3-c738-4f73-a3a6-3b502c9052a5\") " pod="openstack/cinder-api-0" Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 19:48:41.601405 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/764d6df3-c738-4f73-a3a6-3b502c9052a5-scripts\") pod \"cinder-api-0\" (UID: \"764d6df3-c738-4f73-a3a6-3b502c9052a5\") " pod="openstack/cinder-api-0" Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 19:48:41.601458 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/764d6df3-c738-4f73-a3a6-3b502c9052a5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"764d6df3-c738-4f73-a3a6-3b502c9052a5\") " pod="openstack/cinder-api-0" Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 19:48:41.601522 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5b89p\" (UniqueName: \"kubernetes.io/projected/764d6df3-c738-4f73-a3a6-3b502c9052a5-kube-api-access-5b89p\") pod \"cinder-api-0\" (UID: \"764d6df3-c738-4f73-a3a6-3b502c9052a5\") " pod="openstack/cinder-api-0" Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 19:48:41.601594 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/764d6df3-c738-4f73-a3a6-3b502c9052a5-config-data\") pod \"cinder-api-0\" (UID: \"764d6df3-c738-4f73-a3a6-3b502c9052a5\") " pod="openstack/cinder-api-0" Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 
19:48:41.602118 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/764d6df3-c738-4f73-a3a6-3b502c9052a5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"764d6df3-c738-4f73-a3a6-3b502c9052a5\") " pod="openstack/cinder-api-0" Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 19:48:41.602950 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/764d6df3-c738-4f73-a3a6-3b502c9052a5-logs\") pod \"cinder-api-0\" (UID: \"764d6df3-c738-4f73-a3a6-3b502c9052a5\") " pod="openstack/cinder-api-0" Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 19:48:41.609420 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/764d6df3-c738-4f73-a3a6-3b502c9052a5-config-data-custom\") pod \"cinder-api-0\" (UID: \"764d6df3-c738-4f73-a3a6-3b502c9052a5\") " pod="openstack/cinder-api-0" Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 19:48:41.611822 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/764d6df3-c738-4f73-a3a6-3b502c9052a5-config-data\") pod \"cinder-api-0\" (UID: \"764d6df3-c738-4f73-a3a6-3b502c9052a5\") " pod="openstack/cinder-api-0" Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 19:48:41.612087 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/764d6df3-c738-4f73-a3a6-3b502c9052a5-scripts\") pod \"cinder-api-0\" (UID: \"764d6df3-c738-4f73-a3a6-3b502c9052a5\") " pod="openstack/cinder-api-0" Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 19:48:41.622259 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/764d6df3-c738-4f73-a3a6-3b502c9052a5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"764d6df3-c738-4f73-a3a6-3b502c9052a5\") " 
pod="openstack/cinder-api-0" Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 19:48:41.630300 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5b89p\" (UniqueName: \"kubernetes.io/projected/764d6df3-c738-4f73-a3a6-3b502c9052a5-kube-api-access-5b89p\") pod \"cinder-api-0\" (UID: \"764d6df3-c738-4f73-a3a6-3b502c9052a5\") " pod="openstack/cinder-api-0" Oct 08 19:48:41 crc kubenswrapper[4750]: I1008 19:48:41.786517 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 08 19:48:42 crc kubenswrapper[4750]: I1008 19:48:42.341976 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 08 19:48:42 crc kubenswrapper[4750]: I1008 19:48:42.749736 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f" path="/var/lib/kubelet/pods/9e0fb9a8-bed9-4ab9-a98b-b380fc71a56f/volumes" Oct 08 19:48:43 crc kubenswrapper[4750]: I1008 19:48:43.053707 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:43 crc kubenswrapper[4750]: I1008 19:48:43.335312 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"764d6df3-c738-4f73-a3a6-3b502c9052a5","Type":"ContainerStarted","Data":"d3a59f9f584606bc9b74be036a855897d003d395da67bd4d47c30fc27e51020e"} Oct 08 19:48:43 crc kubenswrapper[4750]: I1008 19:48:43.335376 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"764d6df3-c738-4f73-a3a6-3b502c9052a5","Type":"ContainerStarted","Data":"0d334bc32904d2cca8a0c115d9cc22d9c0e701236f91d457be62b144edf17180"} Oct 08 19:48:43 crc kubenswrapper[4750]: I1008 19:48:43.858178 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Oct 08 19:48:44 crc kubenswrapper[4750]: I1008 19:48:44.349249 4750 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"764d6df3-c738-4f73-a3a6-3b502c9052a5","Type":"ContainerStarted","Data":"5bfeebb592560b3ba3fdc8e6eeebf6244d41a72e90ebac2512e17b4fbaf357c3"} Oct 08 19:48:44 crc kubenswrapper[4750]: I1008 19:48:44.349814 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 08 19:48:44 crc kubenswrapper[4750]: I1008 19:48:44.387075 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.387043421 podStartE2EDuration="3.387043421s" podCreationTimestamp="2025-10-08 19:48:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:48:44.378657183 +0000 UTC m=+5880.291628206" watchObservedRunningTime="2025-10-08 19:48:44.387043421 +0000 UTC m=+5880.300014444" Oct 08 19:48:46 crc kubenswrapper[4750]: I1008 19:48:46.511742 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 08 19:48:46 crc kubenswrapper[4750]: I1008 19:48:46.593591 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 19:48:46 crc kubenswrapper[4750]: I1008 19:48:46.736724 4750 scope.go:117] "RemoveContainer" containerID="fcf13ef2112fe840107d0203594fc62839dd38db4382291c73d378bd35584c76" Oct 08 19:48:46 crc kubenswrapper[4750]: E1008 19:48:46.737249 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:48:47 crc kubenswrapper[4750]: I1008 19:48:47.391082 4750 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="e640d7c4-5b73-40af-ad2c-96fbfebe2956" containerName="cinder-scheduler" containerID="cri-o://5557b3fe256cbb9e4f581eb823c73c351f79c58cc04a52d12b13d1505ba9a7db" gracePeriod=30 Oct 08 19:48:47 crc kubenswrapper[4750]: I1008 19:48:47.391726 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="e640d7c4-5b73-40af-ad2c-96fbfebe2956" containerName="probe" containerID="cri-o://da84b1b5434fd0a47bbd6d7b42d6978e96ea105eed66c6a7f9148aa47a3136d2" gracePeriod=30 Oct 08 19:48:48 crc kubenswrapper[4750]: E1008 19:48:48.112951 4750 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode640d7c4_5b73_40af_ad2c_96fbfebe2956.slice/crio-da84b1b5434fd0a47bbd6d7b42d6978e96ea105eed66c6a7f9148aa47a3136d2.scope\": RecentStats: unable to find data in memory cache]" Oct 08 19:48:48 crc kubenswrapper[4750]: I1008 19:48:48.300578 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Oct 08 19:48:48 crc kubenswrapper[4750]: I1008 19:48:48.407247 4750 generic.go:334] "Generic (PLEG): container finished" podID="e640d7c4-5b73-40af-ad2c-96fbfebe2956" containerID="da84b1b5434fd0a47bbd6d7b42d6978e96ea105eed66c6a7f9148aa47a3136d2" exitCode=0 Oct 08 19:48:48 crc kubenswrapper[4750]: I1008 19:48:48.407298 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e640d7c4-5b73-40af-ad2c-96fbfebe2956","Type":"ContainerDied","Data":"da84b1b5434fd0a47bbd6d7b42d6978e96ea105eed66c6a7f9148aa47a3136d2"} Oct 08 19:48:49 crc kubenswrapper[4750]: I1008 19:48:49.146963 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Oct 08 19:48:49 crc kubenswrapper[4750]: I1008 
19:48:49.429531 4750 generic.go:334] "Generic (PLEG): container finished" podID="e640d7c4-5b73-40af-ad2c-96fbfebe2956" containerID="5557b3fe256cbb9e4f581eb823c73c351f79c58cc04a52d12b13d1505ba9a7db" exitCode=0
Oct 08 19:48:49 crc kubenswrapper[4750]: I1008 19:48:49.429716 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e640d7c4-5b73-40af-ad2c-96fbfebe2956","Type":"ContainerDied","Data":"5557b3fe256cbb9e4f581eb823c73c351f79c58cc04a52d12b13d1505ba9a7db"}
Oct 08 19:48:49 crc kubenswrapper[4750]: I1008 19:48:49.544816 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 08 19:48:49 crc kubenswrapper[4750]: I1008 19:48:49.601235 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b57lq\" (UniqueName: \"kubernetes.io/projected/e640d7c4-5b73-40af-ad2c-96fbfebe2956-kube-api-access-b57lq\") pod \"e640d7c4-5b73-40af-ad2c-96fbfebe2956\" (UID: \"e640d7c4-5b73-40af-ad2c-96fbfebe2956\") "
Oct 08 19:48:49 crc kubenswrapper[4750]: I1008 19:48:49.601330 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e640d7c4-5b73-40af-ad2c-96fbfebe2956-scripts\") pod \"e640d7c4-5b73-40af-ad2c-96fbfebe2956\" (UID: \"e640d7c4-5b73-40af-ad2c-96fbfebe2956\") "
Oct 08 19:48:49 crc kubenswrapper[4750]: I1008 19:48:49.601371 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e640d7c4-5b73-40af-ad2c-96fbfebe2956-config-data\") pod \"e640d7c4-5b73-40af-ad2c-96fbfebe2956\" (UID: \"e640d7c4-5b73-40af-ad2c-96fbfebe2956\") "
Oct 08 19:48:49 crc kubenswrapper[4750]: I1008 19:48:49.601532 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e640d7c4-5b73-40af-ad2c-96fbfebe2956-etc-machine-id\") pod \"e640d7c4-5b73-40af-ad2c-96fbfebe2956\" (UID: \"e640d7c4-5b73-40af-ad2c-96fbfebe2956\") "
Oct 08 19:48:49 crc kubenswrapper[4750]: I1008 19:48:49.601593 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e640d7c4-5b73-40af-ad2c-96fbfebe2956-config-data-custom\") pod \"e640d7c4-5b73-40af-ad2c-96fbfebe2956\" (UID: \"e640d7c4-5b73-40af-ad2c-96fbfebe2956\") "
Oct 08 19:48:49 crc kubenswrapper[4750]: I1008 19:48:49.601741 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e640d7c4-5b73-40af-ad2c-96fbfebe2956-combined-ca-bundle\") pod \"e640d7c4-5b73-40af-ad2c-96fbfebe2956\" (UID: \"e640d7c4-5b73-40af-ad2c-96fbfebe2956\") "
Oct 08 19:48:49 crc kubenswrapper[4750]: I1008 19:48:49.602266 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e640d7c4-5b73-40af-ad2c-96fbfebe2956-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e640d7c4-5b73-40af-ad2c-96fbfebe2956" (UID: "e640d7c4-5b73-40af-ad2c-96fbfebe2956"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 08 19:48:49 crc kubenswrapper[4750]: I1008 19:48:49.610710 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e640d7c4-5b73-40af-ad2c-96fbfebe2956-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e640d7c4-5b73-40af-ad2c-96fbfebe2956" (UID: "e640d7c4-5b73-40af-ad2c-96fbfebe2956"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 19:48:49 crc kubenswrapper[4750]: I1008 19:48:49.627943 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e640d7c4-5b73-40af-ad2c-96fbfebe2956-kube-api-access-b57lq" (OuterVolumeSpecName: "kube-api-access-b57lq") pod "e640d7c4-5b73-40af-ad2c-96fbfebe2956" (UID: "e640d7c4-5b73-40af-ad2c-96fbfebe2956"). InnerVolumeSpecName "kube-api-access-b57lq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 19:48:49 crc kubenswrapper[4750]: I1008 19:48:49.629306 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e640d7c4-5b73-40af-ad2c-96fbfebe2956-scripts" (OuterVolumeSpecName: "scripts") pod "e640d7c4-5b73-40af-ad2c-96fbfebe2956" (UID: "e640d7c4-5b73-40af-ad2c-96fbfebe2956"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 19:48:49 crc kubenswrapper[4750]: I1008 19:48:49.687633 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e640d7c4-5b73-40af-ad2c-96fbfebe2956-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e640d7c4-5b73-40af-ad2c-96fbfebe2956" (UID: "e640d7c4-5b73-40af-ad2c-96fbfebe2956"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 19:48:49 crc kubenswrapper[4750]: I1008 19:48:49.704631 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b57lq\" (UniqueName: \"kubernetes.io/projected/e640d7c4-5b73-40af-ad2c-96fbfebe2956-kube-api-access-b57lq\") on node \"crc\" DevicePath \"\""
Oct 08 19:48:49 crc kubenswrapper[4750]: I1008 19:48:49.704680 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e640d7c4-5b73-40af-ad2c-96fbfebe2956-scripts\") on node \"crc\" DevicePath \"\""
Oct 08 19:48:49 crc kubenswrapper[4750]: I1008 19:48:49.704690 4750 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e640d7c4-5b73-40af-ad2c-96fbfebe2956-etc-machine-id\") on node \"crc\" DevicePath \"\""
Oct 08 19:48:49 crc kubenswrapper[4750]: I1008 19:48:49.704699 4750 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e640d7c4-5b73-40af-ad2c-96fbfebe2956-config-data-custom\") on node \"crc\" DevicePath \"\""
Oct 08 19:48:49 crc kubenswrapper[4750]: I1008 19:48:49.704709 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e640d7c4-5b73-40af-ad2c-96fbfebe2956-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 08 19:48:49 crc kubenswrapper[4750]: I1008 19:48:49.728507 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e640d7c4-5b73-40af-ad2c-96fbfebe2956-config-data" (OuterVolumeSpecName: "config-data") pod "e640d7c4-5b73-40af-ad2c-96fbfebe2956" (UID: "e640d7c4-5b73-40af-ad2c-96fbfebe2956"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 19:48:49 crc kubenswrapper[4750]: I1008 19:48:49.809295 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e640d7c4-5b73-40af-ad2c-96fbfebe2956-config-data\") on node \"crc\" DevicePath \"\""
Oct 08 19:48:50 crc kubenswrapper[4750]: I1008 19:48:50.449864 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e640d7c4-5b73-40af-ad2c-96fbfebe2956","Type":"ContainerDied","Data":"7f24ddc65ebaabb12e7ef51aab5fe3caa79ce1fef6c369dd055a2fd55a7c62ca"}
Oct 08 19:48:50 crc kubenswrapper[4750]: I1008 19:48:50.449966 4750 scope.go:117] "RemoveContainer" containerID="da84b1b5434fd0a47bbd6d7b42d6978e96ea105eed66c6a7f9148aa47a3136d2"
Oct 08 19:48:50 crc kubenswrapper[4750]: I1008 19:48:50.450033 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 08 19:48:50 crc kubenswrapper[4750]: I1008 19:48:50.497323 4750 scope.go:117] "RemoveContainer" containerID="5557b3fe256cbb9e4f581eb823c73c351f79c58cc04a52d12b13d1505ba9a7db"
Oct 08 19:48:50 crc kubenswrapper[4750]: I1008 19:48:50.497840 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 08 19:48:50 crc kubenswrapper[4750]: I1008 19:48:50.508483 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 08 19:48:50 crc kubenswrapper[4750]: I1008 19:48:50.563442 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 08 19:48:50 crc kubenswrapper[4750]: E1008 19:48:50.564439 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e640d7c4-5b73-40af-ad2c-96fbfebe2956" containerName="probe"
Oct 08 19:48:50 crc kubenswrapper[4750]: I1008 19:48:50.564470 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="e640d7c4-5b73-40af-ad2c-96fbfebe2956" containerName="probe"
Oct 08 19:48:50 crc kubenswrapper[4750]: E1008 19:48:50.564504 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e640d7c4-5b73-40af-ad2c-96fbfebe2956" containerName="cinder-scheduler"
Oct 08 19:48:50 crc kubenswrapper[4750]: I1008 19:48:50.564513 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="e640d7c4-5b73-40af-ad2c-96fbfebe2956" containerName="cinder-scheduler"
Oct 08 19:48:50 crc kubenswrapper[4750]: I1008 19:48:50.564959 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="e640d7c4-5b73-40af-ad2c-96fbfebe2956" containerName="probe"
Oct 08 19:48:50 crc kubenswrapper[4750]: I1008 19:48:50.565001 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="e640d7c4-5b73-40af-ad2c-96fbfebe2956" containerName="cinder-scheduler"
Oct 08 19:48:50 crc kubenswrapper[4750]: I1008 19:48:50.567331 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 08 19:48:50 crc kubenswrapper[4750]: I1008 19:48:50.570923 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Oct 08 19:48:50 crc kubenswrapper[4750]: I1008 19:48:50.583240 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 08 19:48:50 crc kubenswrapper[4750]: I1008 19:48:50.634703 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4f6bc2c-1192-4320-b0fe-97b8853a36b4-scripts\") pod \"cinder-scheduler-0\" (UID: \"e4f6bc2c-1192-4320-b0fe-97b8853a36b4\") " pod="openstack/cinder-scheduler-0"
Oct 08 19:48:50 crc kubenswrapper[4750]: I1008 19:48:50.634780 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4f6bc2c-1192-4320-b0fe-97b8853a36b4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e4f6bc2c-1192-4320-b0fe-97b8853a36b4\") " pod="openstack/cinder-scheduler-0"
Oct 08 19:48:50 crc kubenswrapper[4750]: I1008 19:48:50.634902 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4f6bc2c-1192-4320-b0fe-97b8853a36b4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e4f6bc2c-1192-4320-b0fe-97b8853a36b4\") " pod="openstack/cinder-scheduler-0"
Oct 08 19:48:50 crc kubenswrapper[4750]: I1008 19:48:50.634926 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rml2b\" (UniqueName: \"kubernetes.io/projected/e4f6bc2c-1192-4320-b0fe-97b8853a36b4-kube-api-access-rml2b\") pod \"cinder-scheduler-0\" (UID: \"e4f6bc2c-1192-4320-b0fe-97b8853a36b4\") " pod="openstack/cinder-scheduler-0"
Oct 08 19:48:50 crc kubenswrapper[4750]: I1008 19:48:50.634954 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4f6bc2c-1192-4320-b0fe-97b8853a36b4-config-data\") pod \"cinder-scheduler-0\" (UID: \"e4f6bc2c-1192-4320-b0fe-97b8853a36b4\") " pod="openstack/cinder-scheduler-0"
Oct 08 19:48:50 crc kubenswrapper[4750]: I1008 19:48:50.634971 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e4f6bc2c-1192-4320-b0fe-97b8853a36b4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e4f6bc2c-1192-4320-b0fe-97b8853a36b4\") " pod="openstack/cinder-scheduler-0"
Oct 08 19:48:50 crc kubenswrapper[4750]: I1008 19:48:50.736786 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4f6bc2c-1192-4320-b0fe-97b8853a36b4-scripts\") pod \"cinder-scheduler-0\" (UID: \"e4f6bc2c-1192-4320-b0fe-97b8853a36b4\") " pod="openstack/cinder-scheduler-0"
Oct 08 19:48:50 crc kubenswrapper[4750]: I1008 19:48:50.736880 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4f6bc2c-1192-4320-b0fe-97b8853a36b4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e4f6bc2c-1192-4320-b0fe-97b8853a36b4\") " pod="openstack/cinder-scheduler-0"
Oct 08 19:48:50 crc kubenswrapper[4750]: I1008 19:48:50.736985 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4f6bc2c-1192-4320-b0fe-97b8853a36b4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e4f6bc2c-1192-4320-b0fe-97b8853a36b4\") " pod="openstack/cinder-scheduler-0"
Oct 08 19:48:50 crc kubenswrapper[4750]: I1008 19:48:50.737007 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rml2b\" (UniqueName: \"kubernetes.io/projected/e4f6bc2c-1192-4320-b0fe-97b8853a36b4-kube-api-access-rml2b\") pod \"cinder-scheduler-0\" (UID: \"e4f6bc2c-1192-4320-b0fe-97b8853a36b4\") " pod="openstack/cinder-scheduler-0"
Oct 08 19:48:50 crc kubenswrapper[4750]: I1008 19:48:50.737034 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4f6bc2c-1192-4320-b0fe-97b8853a36b4-config-data\") pod \"cinder-scheduler-0\" (UID: \"e4f6bc2c-1192-4320-b0fe-97b8853a36b4\") " pod="openstack/cinder-scheduler-0"
Oct 08 19:48:50 crc kubenswrapper[4750]: I1008 19:48:50.737050 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e4f6bc2c-1192-4320-b0fe-97b8853a36b4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e4f6bc2c-1192-4320-b0fe-97b8853a36b4\") " pod="openstack/cinder-scheduler-0"
Oct 08 19:48:50 crc kubenswrapper[4750]: I1008 19:48:50.737147 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e4f6bc2c-1192-4320-b0fe-97b8853a36b4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e4f6bc2c-1192-4320-b0fe-97b8853a36b4\") " pod="openstack/cinder-scheduler-0"
Oct 08 19:48:50 crc kubenswrapper[4750]: I1008 19:48:50.748919 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4f6bc2c-1192-4320-b0fe-97b8853a36b4-config-data\") pod \"cinder-scheduler-0\" (UID: \"e4f6bc2c-1192-4320-b0fe-97b8853a36b4\") " pod="openstack/cinder-scheduler-0"
Oct 08 19:48:50 crc kubenswrapper[4750]: I1008 19:48:50.748930 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4f6bc2c-1192-4320-b0fe-97b8853a36b4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e4f6bc2c-1192-4320-b0fe-97b8853a36b4\") " pod="openstack/cinder-scheduler-0"
Oct 08 19:48:50 crc kubenswrapper[4750]: I1008 19:48:50.749799 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e640d7c4-5b73-40af-ad2c-96fbfebe2956" path="/var/lib/kubelet/pods/e640d7c4-5b73-40af-ad2c-96fbfebe2956/volumes"
Oct 08 19:48:50 crc kubenswrapper[4750]: I1008 19:48:50.749876 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4f6bc2c-1192-4320-b0fe-97b8853a36b4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e4f6bc2c-1192-4320-b0fe-97b8853a36b4\") " pod="openstack/cinder-scheduler-0"
Oct 08 19:48:50 crc kubenswrapper[4750]: I1008 19:48:50.750375 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4f6bc2c-1192-4320-b0fe-97b8853a36b4-scripts\") pod \"cinder-scheduler-0\" (UID: \"e4f6bc2c-1192-4320-b0fe-97b8853a36b4\") " pod="openstack/cinder-scheduler-0"
Oct 08 19:48:50 crc kubenswrapper[4750]: I1008 19:48:50.755035 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rml2b\" (UniqueName: \"kubernetes.io/projected/e4f6bc2c-1192-4320-b0fe-97b8853a36b4-kube-api-access-rml2b\") pod \"cinder-scheduler-0\" (UID: \"e4f6bc2c-1192-4320-b0fe-97b8853a36b4\") " pod="openstack/cinder-scheduler-0"
Oct 08 19:48:50 crc kubenswrapper[4750]: I1008 19:48:50.901840 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 08 19:48:51 crc kubenswrapper[4750]: I1008 19:48:51.494097 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 08 19:48:52 crc kubenswrapper[4750]: I1008 19:48:52.478009 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e4f6bc2c-1192-4320-b0fe-97b8853a36b4","Type":"ContainerStarted","Data":"b57d74d5fba38be7ad0825e9375a820280eb47fbc951651586e9f76963451823"}
Oct 08 19:48:52 crc kubenswrapper[4750]: I1008 19:48:52.479031 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e4f6bc2c-1192-4320-b0fe-97b8853a36b4","Type":"ContainerStarted","Data":"7e894041c3a02f06a1a13cfc3954df3a16a0acf26d18af6ba546a5446c1538c5"}
Oct 08 19:48:53 crc kubenswrapper[4750]: I1008 19:48:53.490421 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e4f6bc2c-1192-4320-b0fe-97b8853a36b4","Type":"ContainerStarted","Data":"85d90027c76693c6a222707c5414775d6e110dd2f4b6a0742179c32abf5e03c3"}
Oct 08 19:48:53 crc kubenswrapper[4750]: I1008 19:48:53.523344 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.523319843 podStartE2EDuration="3.523319843s" podCreationTimestamp="2025-10-08 19:48:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:48:53.517673053 +0000 UTC m=+5889.430644086" watchObservedRunningTime="2025-10-08 19:48:53.523319843 +0000 UTC m=+5889.436290856"
Oct 08 19:48:54 crc kubenswrapper[4750]: I1008 19:48:54.040998 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Oct 08 19:48:55 crc kubenswrapper[4750]: I1008 19:48:55.902023 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Oct 08 19:48:57 crc kubenswrapper[4750]: I1008 19:48:57.735125 4750 scope.go:117] "RemoveContainer" containerID="fcf13ef2112fe840107d0203594fc62839dd38db4382291c73d378bd35584c76"
Oct 08 19:48:57 crc kubenswrapper[4750]: E1008 19:48:57.735947 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4"
Oct 08 19:49:01 crc kubenswrapper[4750]: I1008 19:49:01.183430 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Oct 08 19:49:04 crc kubenswrapper[4750]: I1008 19:49:04.075642 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-8snxc"]
Oct 08 19:49:04 crc kubenswrapper[4750]: I1008 19:49:04.087605 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-8snxc"]
Oct 08 19:49:04 crc kubenswrapper[4750]: I1008 19:49:04.326320 4750 scope.go:117] "RemoveContainer" containerID="522b3161de3b7c218b9e4cf7c883e0256eab0b803c7415d22f4513858a819244"
Oct 08 19:49:04 crc kubenswrapper[4750]: I1008 19:49:04.368838 4750 scope.go:117] "RemoveContainer" containerID="c53045eca0cd6eb7ca4b9404c791b4c8077a2d1b1598174f5d2a930cedd8c68b"
Oct 08 19:49:04 crc kubenswrapper[4750]: I1008 19:49:04.762479 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f042f49a-6626-4f29-8f84-2da116657330" path="/var/lib/kubelet/pods/f042f49a-6626-4f29-8f84-2da116657330/volumes"
Oct 08 19:49:11 crc kubenswrapper[4750]: I1008 19:49:11.734673 4750 scope.go:117] "RemoveContainer" containerID="fcf13ef2112fe840107d0203594fc62839dd38db4382291c73d378bd35584c76"
Oct 08 19:49:11 crc kubenswrapper[4750]: E1008 19:49:11.735789 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4"
Oct 08 19:49:15 crc kubenswrapper[4750]: I1008 19:49:15.047219 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-269e-account-create-8kk82"]
Oct 08 19:49:15 crc kubenswrapper[4750]: I1008 19:49:15.063363 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-269e-account-create-8kk82"]
Oct 08 19:49:16 crc kubenswrapper[4750]: I1008 19:49:16.752867 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db277cda-d138-4a8e-a30c-b767ce163d5b" path="/var/lib/kubelet/pods/db277cda-d138-4a8e-a30c-b767ce163d5b/volumes"
Oct 08 19:49:21 crc kubenswrapper[4750]: I1008 19:49:21.043340 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-d2hls"]
Oct 08 19:49:21 crc kubenswrapper[4750]: I1008 19:49:21.054936 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-d2hls"]
Oct 08 19:49:22 crc kubenswrapper[4750]: I1008 19:49:22.759872 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e171e3a-c2b0-4b44-8a1b-7d345e7e9545" path="/var/lib/kubelet/pods/9e171e3a-c2b0-4b44-8a1b-7d345e7e9545/volumes"
Oct 08 19:49:23 crc kubenswrapper[4750]: I1008 19:49:23.734659 4750 scope.go:117] "RemoveContainer" containerID="fcf13ef2112fe840107d0203594fc62839dd38db4382291c73d378bd35584c76"
Oct 08 19:49:23 crc kubenswrapper[4750]: E1008 19:49:23.735099 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4"
Oct 08 19:49:35 crc kubenswrapper[4750]: I1008 19:49:35.067579 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-fv7rm"]
Oct 08 19:49:35 crc kubenswrapper[4750]: I1008 19:49:35.076148 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-fv7rm"]
Oct 08 19:49:35 crc kubenswrapper[4750]: I1008 19:49:35.735065 4750 scope.go:117] "RemoveContainer" containerID="fcf13ef2112fe840107d0203594fc62839dd38db4382291c73d378bd35584c76"
Oct 08 19:49:35 crc kubenswrapper[4750]: E1008 19:49:35.735477 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4"
Oct 08 19:49:36 crc kubenswrapper[4750]: I1008 19:49:36.750802 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="347b7991-7079-420d-a1a8-1506d1a3ff01" path="/var/lib/kubelet/pods/347b7991-7079-420d-a1a8-1506d1a3ff01/volumes"
Oct 08 19:49:47 crc kubenswrapper[4750]: I1008 19:49:47.735581 4750 scope.go:117] "RemoveContainer" containerID="fcf13ef2112fe840107d0203594fc62839dd38db4382291c73d378bd35584c76"
Oct 08 19:49:47 crc kubenswrapper[4750]: E1008 19:49:47.737342 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4"
Oct 08 19:49:58 crc kubenswrapper[4750]: I1008 19:49:58.734973 4750 scope.go:117] "RemoveContainer" containerID="fcf13ef2112fe840107d0203594fc62839dd38db4382291c73d378bd35584c76"
Oct 08 19:49:58 crc kubenswrapper[4750]: E1008 19:49:58.736187 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4"
Oct 08 19:50:04 crc kubenswrapper[4750]: I1008 19:50:04.626697 4750 scope.go:117] "RemoveContainer" containerID="5eaa609b6cd6658b2111b2d08a4af90b0ad1f2769220807377eaddc6bbb500f3"
Oct 08 19:50:04 crc kubenswrapper[4750]: I1008 19:50:04.674343 4750 scope.go:117] "RemoveContainer" containerID="5e78fa5b8a46197551b717a36e1a9a9846aec41994aa4b310cb064fb3d3ddb31"
Oct 08 19:50:04 crc kubenswrapper[4750]: I1008 19:50:04.720202 4750 scope.go:117] "RemoveContainer" containerID="134dec9c46b49f238cea8b903cbf59e9f6f06bb1e97ba812b4bcfbb71f17ea24"
Oct 08 19:50:04 crc kubenswrapper[4750]: I1008 19:50:04.775662 4750 scope.go:117] "RemoveContainer" containerID="e6d6f0860428ad7d7170a7ebca0a3d6260c7fdbee74938172fda4bfb0352625d"
Oct 08 19:50:13 crc kubenswrapper[4750]: I1008 19:50:13.734934 4750 scope.go:117] "RemoveContainer" containerID="fcf13ef2112fe840107d0203594fc62839dd38db4382291c73d378bd35584c76"
Oct 08 19:50:13 crc kubenswrapper[4750]: E1008 19:50:13.736702 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4"
Oct 08 19:50:28 crc kubenswrapper[4750]: I1008 19:50:28.745016 4750 scope.go:117] "RemoveContainer" containerID="fcf13ef2112fe840107d0203594fc62839dd38db4382291c73d378bd35584c76"
Oct 08 19:50:28 crc kubenswrapper[4750]: E1008 19:50:28.746589 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4"
Oct 08 19:50:39 crc kubenswrapper[4750]: I1008 19:50:39.734496 4750 scope.go:117] "RemoveContainer" containerID="fcf13ef2112fe840107d0203594fc62839dd38db4382291c73d378bd35584c76"
Oct 08 19:50:39 crc kubenswrapper[4750]: E1008 19:50:39.735456 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4"
Oct 08 19:50:44 crc kubenswrapper[4750]: I1008 19:50:44.367686 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-sxhxx"]
Oct 08 19:50:44 crc kubenswrapper[4750]: I1008 19:50:44.370143 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-sxhxx"
Oct 08 19:50:44 crc kubenswrapper[4750]: I1008 19:50:44.372334 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Oct 08 19:50:44 crc kubenswrapper[4750]: I1008 19:50:44.373479 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-8mlxk"
Oct 08 19:50:44 crc kubenswrapper[4750]: I1008 19:50:44.384929 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-sxhxx"]
Oct 08 19:50:44 crc kubenswrapper[4750]: I1008 19:50:44.420622 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-kwprk"]
Oct 08 19:50:44 crc kubenswrapper[4750]: I1008 19:50:44.422816 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-kwprk"
Oct 08 19:50:44 crc kubenswrapper[4750]: I1008 19:50:44.435592 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-kwprk"]
Oct 08 19:50:44 crc kubenswrapper[4750]: I1008 19:50:44.551939 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/dba47bae-b4f2-411f-9305-9c5e52fc5213-var-log-ovn\") pod \"ovn-controller-sxhxx\" (UID: \"dba47bae-b4f2-411f-9305-9c5e52fc5213\") " pod="openstack/ovn-controller-sxhxx"
Oct 08 19:50:44 crc kubenswrapper[4750]: I1008 19:50:44.551995 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9bd109d1-56e5-49b4-884f-2eb99d8a72f9-var-log\") pod \"ovn-controller-ovs-kwprk\" (UID: \"9bd109d1-56e5-49b4-884f-2eb99d8a72f9\") " pod="openstack/ovn-controller-ovs-kwprk"
Oct 08 19:50:44 crc kubenswrapper[4750]: I1008 19:50:44.552030 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9bd109d1-56e5-49b4-884f-2eb99d8a72f9-scripts\") pod \"ovn-controller-ovs-kwprk\" (UID: \"9bd109d1-56e5-49b4-884f-2eb99d8a72f9\") " pod="openstack/ovn-controller-ovs-kwprk"
Oct 08 19:50:44 crc kubenswrapper[4750]: I1008 19:50:44.552060 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktrf8\" (UniqueName: \"kubernetes.io/projected/dba47bae-b4f2-411f-9305-9c5e52fc5213-kube-api-access-ktrf8\") pod \"ovn-controller-sxhxx\" (UID: \"dba47bae-b4f2-411f-9305-9c5e52fc5213\") " pod="openstack/ovn-controller-sxhxx"
Oct 08 19:50:44 crc kubenswrapper[4750]: I1008 19:50:44.552079 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dba47bae-b4f2-411f-9305-9c5e52fc5213-scripts\") pod \"ovn-controller-sxhxx\" (UID: \"dba47bae-b4f2-411f-9305-9c5e52fc5213\") " pod="openstack/ovn-controller-sxhxx"
Oct 08 19:50:44 crc kubenswrapper[4750]: I1008 19:50:44.552097 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkdvz\" (UniqueName: \"kubernetes.io/projected/9bd109d1-56e5-49b4-884f-2eb99d8a72f9-kube-api-access-fkdvz\") pod \"ovn-controller-ovs-kwprk\" (UID: \"9bd109d1-56e5-49b4-884f-2eb99d8a72f9\") " pod="openstack/ovn-controller-ovs-kwprk"
Oct 08 19:50:44 crc kubenswrapper[4750]: I1008 19:50:44.552114 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9bd109d1-56e5-49b4-884f-2eb99d8a72f9-var-run\") pod \"ovn-controller-ovs-kwprk\" (UID: \"9bd109d1-56e5-49b4-884f-2eb99d8a72f9\") " pod="openstack/ovn-controller-ovs-kwprk"
Oct 08 19:50:44 crc kubenswrapper[4750]: I1008 19:50:44.552150 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/dba47bae-b4f2-411f-9305-9c5e52fc5213-var-run-ovn\") pod \"ovn-controller-sxhxx\" (UID: \"dba47bae-b4f2-411f-9305-9c5e52fc5213\") " pod="openstack/ovn-controller-sxhxx"
Oct 08 19:50:44 crc kubenswrapper[4750]: I1008 19:50:44.552183 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9bd109d1-56e5-49b4-884f-2eb99d8a72f9-var-lib\") pod \"ovn-controller-ovs-kwprk\" (UID: \"9bd109d1-56e5-49b4-884f-2eb99d8a72f9\") " pod="openstack/ovn-controller-ovs-kwprk"
Oct 08 19:50:44 crc kubenswrapper[4750]: I1008 19:50:44.552231 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dba47bae-b4f2-411f-9305-9c5e52fc5213-var-run\") pod \"ovn-controller-sxhxx\" (UID: \"dba47bae-b4f2-411f-9305-9c5e52fc5213\") " pod="openstack/ovn-controller-sxhxx"
Oct 08 19:50:44 crc kubenswrapper[4750]: I1008 19:50:44.552256 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9bd109d1-56e5-49b4-884f-2eb99d8a72f9-etc-ovs\") pod \"ovn-controller-ovs-kwprk\" (UID: \"9bd109d1-56e5-49b4-884f-2eb99d8a72f9\") " pod="openstack/ovn-controller-ovs-kwprk"
Oct 08 19:50:44 crc kubenswrapper[4750]: I1008 19:50:44.654166 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dba47bae-b4f2-411f-9305-9c5e52fc5213-var-run\") pod \"ovn-controller-sxhxx\" (UID: \"dba47bae-b4f2-411f-9305-9c5e52fc5213\") " pod="openstack/ovn-controller-sxhxx"
Oct 08 19:50:44 crc kubenswrapper[4750]: I1008 19:50:44.654540 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9bd109d1-56e5-49b4-884f-2eb99d8a72f9-etc-ovs\") pod \"ovn-controller-ovs-kwprk\" (UID: \"9bd109d1-56e5-49b4-884f-2eb99d8a72f9\") " pod="openstack/ovn-controller-ovs-kwprk"
Oct 08 19:50:44 crc kubenswrapper[4750]: I1008 19:50:44.654703 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/dba47bae-b4f2-411f-9305-9c5e52fc5213-var-log-ovn\") pod \"ovn-controller-sxhxx\" (UID: \"dba47bae-b4f2-411f-9305-9c5e52fc5213\") " pod="openstack/ovn-controller-sxhxx"
Oct 08 19:50:44 crc kubenswrapper[4750]: I1008 19:50:44.654763 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9bd109d1-56e5-49b4-884f-2eb99d8a72f9-etc-ovs\") pod \"ovn-controller-ovs-kwprk\" (UID: \"9bd109d1-56e5-49b4-884f-2eb99d8a72f9\") " pod="openstack/ovn-controller-ovs-kwprk"
Oct 08 19:50:44 crc kubenswrapper[4750]: I1008 19:50:44.654790 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9bd109d1-56e5-49b4-884f-2eb99d8a72f9-var-log\") pod \"ovn-controller-ovs-kwprk\" (UID: \"9bd109d1-56e5-49b4-884f-2eb99d8a72f9\") " pod="openstack/ovn-controller-ovs-kwprk"
Oct 08 19:50:44 crc kubenswrapper[4750]: I1008 19:50:44.654925 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9bd109d1-56e5-49b4-884f-2eb99d8a72f9-scripts\") pod \"ovn-controller-ovs-kwprk\" (UID: \"9bd109d1-56e5-49b4-884f-2eb99d8a72f9\") " pod="openstack/ovn-controller-ovs-kwprk"
Oct 08 19:50:44 crc kubenswrapper[4750]: I1008 19:50:44.655014 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktrf8\" (UniqueName: \"kubernetes.io/projected/dba47bae-b4f2-411f-9305-9c5e52fc5213-kube-api-access-ktrf8\") pod \"ovn-controller-sxhxx\" (UID: \"dba47bae-b4f2-411f-9305-9c5e52fc5213\") " pod="openstack/ovn-controller-sxhxx"
Oct 08 19:50:44 crc kubenswrapper[4750]: I1008 19:50:44.655044 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dba47bae-b4f2-411f-9305-9c5e52fc5213-scripts\") pod \"ovn-controller-sxhxx\" (UID: \"dba47bae-b4f2-411f-9305-9c5e52fc5213\") " pod="openstack/ovn-controller-sxhxx"
Oct 08 19:50:44 crc kubenswrapper[4750]: I1008 19:50:44.655073 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkdvz\" (UniqueName: \"kubernetes.io/projected/9bd109d1-56e5-49b4-884f-2eb99d8a72f9-kube-api-access-fkdvz\") pod \"ovn-controller-ovs-kwprk\" (UID: \"9bd109d1-56e5-49b4-884f-2eb99d8a72f9\") " pod="openstack/ovn-controller-ovs-kwprk"
Oct 08 19:50:44 crc kubenswrapper[4750]: I1008 19:50:44.655098 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9bd109d1-56e5-49b4-884f-2eb99d8a72f9-var-run\") pod \"ovn-controller-ovs-kwprk\" (UID: \"9bd109d1-56e5-49b4-884f-2eb99d8a72f9\") " pod="openstack/ovn-controller-ovs-kwprk"
Oct 08 19:50:44 crc kubenswrapper[4750]: I1008 19:50:44.655196 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/dba47bae-b4f2-411f-9305-9c5e52fc5213-var-run-ovn\") pod \"ovn-controller-sxhxx\" (UID: \"dba47bae-b4f2-411f-9305-9c5e52fc5213\") " pod="openstack/ovn-controller-sxhxx"
Oct 08 19:50:44 crc kubenswrapper[4750]: I1008 19:50:44.655272 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9bd109d1-56e5-49b4-884f-2eb99d8a72f9-var-log\") pod \"ovn-controller-ovs-kwprk\" (UID: \"9bd109d1-56e5-49b4-884f-2eb99d8a72f9\") " pod="openstack/ovn-controller-ovs-kwprk"
Oct 08 19:50:44 crc kubenswrapper[4750]: I1008 19:50:44.655332 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9bd109d1-56e5-49b4-884f-2eb99d8a72f9-var-lib\") pod \"ovn-controller-ovs-kwprk\" (UID: \"9bd109d1-56e5-49b4-884f-2eb99d8a72f9\") " pod="openstack/ovn-controller-ovs-kwprk"
Oct 08 19:50:44 crc kubenswrapper[4750]: I1008 19:50:44.655280 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9bd109d1-56e5-49b4-884f-2eb99d8a72f9-var-lib\") pod \"ovn-controller-ovs-kwprk\" (UID: \"9bd109d1-56e5-49b4-884f-2eb99d8a72f9\") " pod="openstack/ovn-controller-ovs-kwprk"
Oct 08 19:50:44 crc kubenswrapper[4750]: I1008 19:50:44.654574 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dba47bae-b4f2-411f-9305-9c5e52fc5213-var-run\") pod
\"ovn-controller-sxhxx\" (UID: \"dba47bae-b4f2-411f-9305-9c5e52fc5213\") " pod="openstack/ovn-controller-sxhxx" Oct 08 19:50:44 crc kubenswrapper[4750]: I1008 19:50:44.655772 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/dba47bae-b4f2-411f-9305-9c5e52fc5213-var-log-ovn\") pod \"ovn-controller-sxhxx\" (UID: \"dba47bae-b4f2-411f-9305-9c5e52fc5213\") " pod="openstack/ovn-controller-sxhxx" Oct 08 19:50:44 crc kubenswrapper[4750]: I1008 19:50:44.656131 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9bd109d1-56e5-49b4-884f-2eb99d8a72f9-var-run\") pod \"ovn-controller-ovs-kwprk\" (UID: \"9bd109d1-56e5-49b4-884f-2eb99d8a72f9\") " pod="openstack/ovn-controller-ovs-kwprk" Oct 08 19:50:44 crc kubenswrapper[4750]: I1008 19:50:44.656652 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/dba47bae-b4f2-411f-9305-9c5e52fc5213-var-run-ovn\") pod \"ovn-controller-sxhxx\" (UID: \"dba47bae-b4f2-411f-9305-9c5e52fc5213\") " pod="openstack/ovn-controller-sxhxx" Oct 08 19:50:44 crc kubenswrapper[4750]: I1008 19:50:44.657396 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9bd109d1-56e5-49b4-884f-2eb99d8a72f9-scripts\") pod \"ovn-controller-ovs-kwprk\" (UID: \"9bd109d1-56e5-49b4-884f-2eb99d8a72f9\") " pod="openstack/ovn-controller-ovs-kwprk" Oct 08 19:50:44 crc kubenswrapper[4750]: I1008 19:50:44.658111 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dba47bae-b4f2-411f-9305-9c5e52fc5213-scripts\") pod \"ovn-controller-sxhxx\" (UID: \"dba47bae-b4f2-411f-9305-9c5e52fc5213\") " pod="openstack/ovn-controller-sxhxx" Oct 08 19:50:44 crc kubenswrapper[4750]: I1008 19:50:44.694714 4750 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ktrf8\" (UniqueName: \"kubernetes.io/projected/dba47bae-b4f2-411f-9305-9c5e52fc5213-kube-api-access-ktrf8\") pod \"ovn-controller-sxhxx\" (UID: \"dba47bae-b4f2-411f-9305-9c5e52fc5213\") " pod="openstack/ovn-controller-sxhxx" Oct 08 19:50:44 crc kubenswrapper[4750]: I1008 19:50:44.695346 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-sxhxx" Oct 08 19:50:44 crc kubenswrapper[4750]: I1008 19:50:44.715179 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkdvz\" (UniqueName: \"kubernetes.io/projected/9bd109d1-56e5-49b4-884f-2eb99d8a72f9-kube-api-access-fkdvz\") pod \"ovn-controller-ovs-kwprk\" (UID: \"9bd109d1-56e5-49b4-884f-2eb99d8a72f9\") " pod="openstack/ovn-controller-ovs-kwprk" Oct 08 19:50:44 crc kubenswrapper[4750]: I1008 19:50:44.742363 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-kwprk" Oct 08 19:50:45 crc kubenswrapper[4750]: I1008 19:50:45.317892 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-sxhxx"] Oct 08 19:50:45 crc kubenswrapper[4750]: I1008 19:50:45.783858 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-kwprk"] Oct 08 19:50:45 crc kubenswrapper[4750]: I1008 19:50:45.891718 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-ds86v"] Oct 08 19:50:45 crc kubenswrapper[4750]: I1008 19:50:45.915876 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-ds86v"] Oct 08 19:50:45 crc kubenswrapper[4750]: I1008 19:50:45.916099 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-ds86v" Oct 08 19:50:45 crc kubenswrapper[4750]: I1008 19:50:45.927990 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 08 19:50:45 crc kubenswrapper[4750]: I1008 19:50:45.953788 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sxhxx" event={"ID":"dba47bae-b4f2-411f-9305-9c5e52fc5213","Type":"ContainerStarted","Data":"5d00fd7fb470ecb6e50deb21d59bb97a0a56fb4cc527b3bcba2bd69de8fe45d0"} Oct 08 19:50:45 crc kubenswrapper[4750]: I1008 19:50:45.953843 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sxhxx" event={"ID":"dba47bae-b4f2-411f-9305-9c5e52fc5213","Type":"ContainerStarted","Data":"f5c61e9219c58ea6fc6a1f25157a1684797f2b472cce17e4045468a84d070009"} Oct 08 19:50:45 crc kubenswrapper[4750]: I1008 19:50:45.954817 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-sxhxx" Oct 08 19:50:45 crc kubenswrapper[4750]: I1008 19:50:45.959825 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-kwprk" event={"ID":"9bd109d1-56e5-49b4-884f-2eb99d8a72f9","Type":"ContainerStarted","Data":"67f2d1654e5909e46800b6f8599cffdadd48acb965790871e6ca4baf0bbca7f3"} Oct 08 19:50:46 crc kubenswrapper[4750]: I1008 19:50:46.105344 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/5cd8ba2c-c86d-48e9-a94d-a6a4555de8c4-ovn-rundir\") pod \"ovn-controller-metrics-ds86v\" (UID: \"5cd8ba2c-c86d-48e9-a94d-a6a4555de8c4\") " pod="openstack/ovn-controller-metrics-ds86v" Oct 08 19:50:46 crc kubenswrapper[4750]: I1008 19:50:46.105510 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cd8ba2c-c86d-48e9-a94d-a6a4555de8c4-config\") 
pod \"ovn-controller-metrics-ds86v\" (UID: \"5cd8ba2c-c86d-48e9-a94d-a6a4555de8c4\") " pod="openstack/ovn-controller-metrics-ds86v" Oct 08 19:50:46 crc kubenswrapper[4750]: I1008 19:50:46.105583 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/5cd8ba2c-c86d-48e9-a94d-a6a4555de8c4-ovs-rundir\") pod \"ovn-controller-metrics-ds86v\" (UID: \"5cd8ba2c-c86d-48e9-a94d-a6a4555de8c4\") " pod="openstack/ovn-controller-metrics-ds86v" Oct 08 19:50:46 crc kubenswrapper[4750]: I1008 19:50:46.105694 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtwxc\" (UniqueName: \"kubernetes.io/projected/5cd8ba2c-c86d-48e9-a94d-a6a4555de8c4-kube-api-access-rtwxc\") pod \"ovn-controller-metrics-ds86v\" (UID: \"5cd8ba2c-c86d-48e9-a94d-a6a4555de8c4\") " pod="openstack/ovn-controller-metrics-ds86v" Oct 08 19:50:46 crc kubenswrapper[4750]: I1008 19:50:46.208536 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtwxc\" (UniqueName: \"kubernetes.io/projected/5cd8ba2c-c86d-48e9-a94d-a6a4555de8c4-kube-api-access-rtwxc\") pod \"ovn-controller-metrics-ds86v\" (UID: \"5cd8ba2c-c86d-48e9-a94d-a6a4555de8c4\") " pod="openstack/ovn-controller-metrics-ds86v" Oct 08 19:50:46 crc kubenswrapper[4750]: I1008 19:50:46.209658 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/5cd8ba2c-c86d-48e9-a94d-a6a4555de8c4-ovn-rundir\") pod \"ovn-controller-metrics-ds86v\" (UID: \"5cd8ba2c-c86d-48e9-a94d-a6a4555de8c4\") " pod="openstack/ovn-controller-metrics-ds86v" Oct 08 19:50:46 crc kubenswrapper[4750]: I1008 19:50:46.209711 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cd8ba2c-c86d-48e9-a94d-a6a4555de8c4-config\") pod 
\"ovn-controller-metrics-ds86v\" (UID: \"5cd8ba2c-c86d-48e9-a94d-a6a4555de8c4\") " pod="openstack/ovn-controller-metrics-ds86v" Oct 08 19:50:46 crc kubenswrapper[4750]: I1008 19:50:46.209738 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/5cd8ba2c-c86d-48e9-a94d-a6a4555de8c4-ovs-rundir\") pod \"ovn-controller-metrics-ds86v\" (UID: \"5cd8ba2c-c86d-48e9-a94d-a6a4555de8c4\") " pod="openstack/ovn-controller-metrics-ds86v" Oct 08 19:50:46 crc kubenswrapper[4750]: I1008 19:50:46.210129 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/5cd8ba2c-c86d-48e9-a94d-a6a4555de8c4-ovs-rundir\") pod \"ovn-controller-metrics-ds86v\" (UID: \"5cd8ba2c-c86d-48e9-a94d-a6a4555de8c4\") " pod="openstack/ovn-controller-metrics-ds86v" Oct 08 19:50:46 crc kubenswrapper[4750]: I1008 19:50:46.210184 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/5cd8ba2c-c86d-48e9-a94d-a6a4555de8c4-ovn-rundir\") pod \"ovn-controller-metrics-ds86v\" (UID: \"5cd8ba2c-c86d-48e9-a94d-a6a4555de8c4\") " pod="openstack/ovn-controller-metrics-ds86v" Oct 08 19:50:46 crc kubenswrapper[4750]: I1008 19:50:46.211010 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cd8ba2c-c86d-48e9-a94d-a6a4555de8c4-config\") pod \"ovn-controller-metrics-ds86v\" (UID: \"5cd8ba2c-c86d-48e9-a94d-a6a4555de8c4\") " pod="openstack/ovn-controller-metrics-ds86v" Oct 08 19:50:46 crc kubenswrapper[4750]: I1008 19:50:46.230321 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtwxc\" (UniqueName: \"kubernetes.io/projected/5cd8ba2c-c86d-48e9-a94d-a6a4555de8c4-kube-api-access-rtwxc\") pod \"ovn-controller-metrics-ds86v\" (UID: \"5cd8ba2c-c86d-48e9-a94d-a6a4555de8c4\") " 
pod="openstack/ovn-controller-metrics-ds86v" Oct 08 19:50:46 crc kubenswrapper[4750]: I1008 19:50:46.260511 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-ds86v" Oct 08 19:50:46 crc kubenswrapper[4750]: I1008 19:50:46.776782 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-sxhxx" podStartSLOduration=2.776755424 podStartE2EDuration="2.776755424s" podCreationTimestamp="2025-10-08 19:50:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:50:45.984974275 +0000 UTC m=+6001.897945288" watchObservedRunningTime="2025-10-08 19:50:46.776755424 +0000 UTC m=+6002.689726447" Oct 08 19:50:46 crc kubenswrapper[4750]: I1008 19:50:46.779360 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-ds86v"] Oct 08 19:50:46 crc kubenswrapper[4750]: W1008 19:50:46.784404 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5cd8ba2c_c86d_48e9_a94d_a6a4555de8c4.slice/crio-724bce521ab59fa826774b42860b7438616bb80a21748ae0e70eda4b356881bb WatchSource:0}: Error finding container 724bce521ab59fa826774b42860b7438616bb80a21748ae0e70eda4b356881bb: Status 404 returned error can't find the container with id 724bce521ab59fa826774b42860b7438616bb80a21748ae0e70eda4b356881bb Oct 08 19:50:46 crc kubenswrapper[4750]: I1008 19:50:46.985269 4750 generic.go:334] "Generic (PLEG): container finished" podID="9bd109d1-56e5-49b4-884f-2eb99d8a72f9" containerID="0dedaff9ea0d2acb4d8ab246f3d96ddf1559803984138282394523934585429d" exitCode=0 Oct 08 19:50:46 crc kubenswrapper[4750]: I1008 19:50:46.985436 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-kwprk" 
event={"ID":"9bd109d1-56e5-49b4-884f-2eb99d8a72f9","Type":"ContainerDied","Data":"0dedaff9ea0d2acb4d8ab246f3d96ddf1559803984138282394523934585429d"} Oct 08 19:50:46 crc kubenswrapper[4750]: I1008 19:50:46.990448 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-ds86v" event={"ID":"5cd8ba2c-c86d-48e9-a94d-a6a4555de8c4","Type":"ContainerStarted","Data":"724bce521ab59fa826774b42860b7438616bb80a21748ae0e70eda4b356881bb"} Oct 08 19:50:47 crc kubenswrapper[4750]: I1008 19:50:47.029301 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-create-lhk2f"] Oct 08 19:50:47 crc kubenswrapper[4750]: I1008 19:50:47.034815 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-lhk2f" Oct 08 19:50:47 crc kubenswrapper[4750]: I1008 19:50:47.047559 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-lhk2f"] Oct 08 19:50:47 crc kubenswrapper[4750]: I1008 19:50:47.131412 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dt79\" (UniqueName: \"kubernetes.io/projected/90b6efbc-634f-4414-ac4f-dbd04a75c31e-kube-api-access-6dt79\") pod \"octavia-db-create-lhk2f\" (UID: \"90b6efbc-634f-4414-ac4f-dbd04a75c31e\") " pod="openstack/octavia-db-create-lhk2f" Oct 08 19:50:47 crc kubenswrapper[4750]: I1008 19:50:47.234446 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dt79\" (UniqueName: \"kubernetes.io/projected/90b6efbc-634f-4414-ac4f-dbd04a75c31e-kube-api-access-6dt79\") pod \"octavia-db-create-lhk2f\" (UID: \"90b6efbc-634f-4414-ac4f-dbd04a75c31e\") " pod="openstack/octavia-db-create-lhk2f" Oct 08 19:50:47 crc kubenswrapper[4750]: I1008 19:50:47.258950 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dt79\" (UniqueName: 
\"kubernetes.io/projected/90b6efbc-634f-4414-ac4f-dbd04a75c31e-kube-api-access-6dt79\") pod \"octavia-db-create-lhk2f\" (UID: \"90b6efbc-634f-4414-ac4f-dbd04a75c31e\") " pod="openstack/octavia-db-create-lhk2f" Oct 08 19:50:47 crc kubenswrapper[4750]: I1008 19:50:47.362836 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-lhk2f" Oct 08 19:50:47 crc kubenswrapper[4750]: I1008 19:50:47.878134 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-lhk2f"] Oct 08 19:50:47 crc kubenswrapper[4750]: W1008 19:50:47.903455 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90b6efbc_634f_4414_ac4f_dbd04a75c31e.slice/crio-b376ff32ae682fb7c6fd2d2b18fef086e0dec3bbade15ad10f1528fc43f4f902 WatchSource:0}: Error finding container b376ff32ae682fb7c6fd2d2b18fef086e0dec3bbade15ad10f1528fc43f4f902: Status 404 returned error can't find the container with id b376ff32ae682fb7c6fd2d2b18fef086e0dec3bbade15ad10f1528fc43f4f902 Oct 08 19:50:48 crc kubenswrapper[4750]: I1008 19:50:48.007035 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-lhk2f" event={"ID":"90b6efbc-634f-4414-ac4f-dbd04a75c31e","Type":"ContainerStarted","Data":"b376ff32ae682fb7c6fd2d2b18fef086e0dec3bbade15ad10f1528fc43f4f902"} Oct 08 19:50:48 crc kubenswrapper[4750]: I1008 19:50:48.010525 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-kwprk" event={"ID":"9bd109d1-56e5-49b4-884f-2eb99d8a72f9","Type":"ContainerStarted","Data":"780f4285be4495b5a032ede506d9c2786a83098fd97ace905814d02f2dd62415"} Oct 08 19:50:48 crc kubenswrapper[4750]: I1008 19:50:48.010583 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-kwprk" 
event={"ID":"9bd109d1-56e5-49b4-884f-2eb99d8a72f9","Type":"ContainerStarted","Data":"29e06829eecd521dfb92666eca75e22539e35a64f53f97c114b62ed374d86d82"} Oct 08 19:50:48 crc kubenswrapper[4750]: I1008 19:50:48.011305 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-kwprk" Oct 08 19:50:48 crc kubenswrapper[4750]: I1008 19:50:48.011329 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-kwprk" Oct 08 19:50:48 crc kubenswrapper[4750]: I1008 19:50:48.013183 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-ds86v" event={"ID":"5cd8ba2c-c86d-48e9-a94d-a6a4555de8c4","Type":"ContainerStarted","Data":"6d2dbc2086224314cee62528ca78aa76336f6272835ec4b167a8c4d5c9b77032"} Oct 08 19:50:48 crc kubenswrapper[4750]: I1008 19:50:48.040935 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-kwprk" podStartSLOduration=4.040912829 podStartE2EDuration="4.040912829s" podCreationTimestamp="2025-10-08 19:50:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:50:48.034183291 +0000 UTC m=+6003.947154324" watchObservedRunningTime="2025-10-08 19:50:48.040912829 +0000 UTC m=+6003.953883832" Oct 08 19:50:48 crc kubenswrapper[4750]: I1008 19:50:48.060312 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-ds86v" podStartSLOduration=3.060292501 podStartE2EDuration="3.060292501s" podCreationTimestamp="2025-10-08 19:50:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:50:48.056871625 +0000 UTC m=+6003.969842638" watchObservedRunningTime="2025-10-08 19:50:48.060292501 +0000 UTC m=+6003.973263514" Oct 08 19:50:49 crc kubenswrapper[4750]: I1008 
19:50:49.031592 4750 generic.go:334] "Generic (PLEG): container finished" podID="90b6efbc-634f-4414-ac4f-dbd04a75c31e" containerID="c6ea7bf1e99e5d9a28419c90530b17bcce11497286fda1e5026545664bc288fd" exitCode=0 Oct 08 19:50:49 crc kubenswrapper[4750]: I1008 19:50:49.031744 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-lhk2f" event={"ID":"90b6efbc-634f-4414-ac4f-dbd04a75c31e","Type":"ContainerDied","Data":"c6ea7bf1e99e5d9a28419c90530b17bcce11497286fda1e5026545664bc288fd"} Oct 08 19:50:50 crc kubenswrapper[4750]: I1008 19:50:50.448713 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-lhk2f" Oct 08 19:50:50 crc kubenswrapper[4750]: I1008 19:50:50.614106 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dt79\" (UniqueName: \"kubernetes.io/projected/90b6efbc-634f-4414-ac4f-dbd04a75c31e-kube-api-access-6dt79\") pod \"90b6efbc-634f-4414-ac4f-dbd04a75c31e\" (UID: \"90b6efbc-634f-4414-ac4f-dbd04a75c31e\") " Oct 08 19:50:50 crc kubenswrapper[4750]: I1008 19:50:50.621166 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90b6efbc-634f-4414-ac4f-dbd04a75c31e-kube-api-access-6dt79" (OuterVolumeSpecName: "kube-api-access-6dt79") pod "90b6efbc-634f-4414-ac4f-dbd04a75c31e" (UID: "90b6efbc-634f-4414-ac4f-dbd04a75c31e"). InnerVolumeSpecName "kube-api-access-6dt79". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:50:50 crc kubenswrapper[4750]: I1008 19:50:50.718366 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dt79\" (UniqueName: \"kubernetes.io/projected/90b6efbc-634f-4414-ac4f-dbd04a75c31e-kube-api-access-6dt79\") on node \"crc\" DevicePath \"\"" Oct 08 19:50:50 crc kubenswrapper[4750]: I1008 19:50:50.734389 4750 scope.go:117] "RemoveContainer" containerID="fcf13ef2112fe840107d0203594fc62839dd38db4382291c73d378bd35584c76" Oct 08 19:50:50 crc kubenswrapper[4750]: E1008 19:50:50.734754 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:50:51 crc kubenswrapper[4750]: I1008 19:50:51.064674 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-lhk2f" event={"ID":"90b6efbc-634f-4414-ac4f-dbd04a75c31e","Type":"ContainerDied","Data":"b376ff32ae682fb7c6fd2d2b18fef086e0dec3bbade15ad10f1528fc43f4f902"} Oct 08 19:50:51 crc kubenswrapper[4750]: I1008 19:50:51.064740 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b376ff32ae682fb7c6fd2d2b18fef086e0dec3bbade15ad10f1528fc43f4f902" Oct 08 19:50:51 crc kubenswrapper[4750]: I1008 19:50:51.064824 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-lhk2f" Oct 08 19:50:58 crc kubenswrapper[4750]: I1008 19:50:58.966322 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-9f34-account-create-l8vfk"] Oct 08 19:50:58 crc kubenswrapper[4750]: E1008 19:50:58.967704 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90b6efbc-634f-4414-ac4f-dbd04a75c31e" containerName="mariadb-database-create" Oct 08 19:50:58 crc kubenswrapper[4750]: I1008 19:50:58.967727 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="90b6efbc-634f-4414-ac4f-dbd04a75c31e" containerName="mariadb-database-create" Oct 08 19:50:58 crc kubenswrapper[4750]: I1008 19:50:58.967945 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="90b6efbc-634f-4414-ac4f-dbd04a75c31e" containerName="mariadb-database-create" Oct 08 19:50:58 crc kubenswrapper[4750]: I1008 19:50:58.968770 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-9f34-account-create-l8vfk" Oct 08 19:50:58 crc kubenswrapper[4750]: I1008 19:50:58.971626 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-db-secret" Oct 08 19:50:58 crc kubenswrapper[4750]: I1008 19:50:58.978284 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-9f34-account-create-l8vfk"] Oct 08 19:50:59 crc kubenswrapper[4750]: I1008 19:50:59.141354 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td46q\" (UniqueName: \"kubernetes.io/projected/00a4a453-d3de-47f6-84e7-176daf84a18a-kube-api-access-td46q\") pod \"octavia-9f34-account-create-l8vfk\" (UID: \"00a4a453-d3de-47f6-84e7-176daf84a18a\") " pod="openstack/octavia-9f34-account-create-l8vfk" Oct 08 19:50:59 crc kubenswrapper[4750]: I1008 19:50:59.243939 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-td46q\" (UniqueName: 
\"kubernetes.io/projected/00a4a453-d3de-47f6-84e7-176daf84a18a-kube-api-access-td46q\") pod \"octavia-9f34-account-create-l8vfk\" (UID: \"00a4a453-d3de-47f6-84e7-176daf84a18a\") " pod="openstack/octavia-9f34-account-create-l8vfk" Oct 08 19:50:59 crc kubenswrapper[4750]: I1008 19:50:59.269464 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-td46q\" (UniqueName: \"kubernetes.io/projected/00a4a453-d3de-47f6-84e7-176daf84a18a-kube-api-access-td46q\") pod \"octavia-9f34-account-create-l8vfk\" (UID: \"00a4a453-d3de-47f6-84e7-176daf84a18a\") " pod="openstack/octavia-9f34-account-create-l8vfk" Oct 08 19:50:59 crc kubenswrapper[4750]: I1008 19:50:59.309208 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-9f34-account-create-l8vfk" Oct 08 19:50:59 crc kubenswrapper[4750]: I1008 19:50:59.809173 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-9f34-account-create-l8vfk"] Oct 08 19:50:59 crc kubenswrapper[4750]: W1008 19:50:59.822627 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00a4a453_d3de_47f6_84e7_176daf84a18a.slice/crio-2e2e578055ba4dc8c7140c0a1cde9944ba392e2f8fbebf3b6a078204307ccda3 WatchSource:0}: Error finding container 2e2e578055ba4dc8c7140c0a1cde9944ba392e2f8fbebf3b6a078204307ccda3: Status 404 returned error can't find the container with id 2e2e578055ba4dc8c7140c0a1cde9944ba392e2f8fbebf3b6a078204307ccda3 Oct 08 19:51:00 crc kubenswrapper[4750]: I1008 19:51:00.174824 4750 generic.go:334] "Generic (PLEG): container finished" podID="00a4a453-d3de-47f6-84e7-176daf84a18a" containerID="e49432ff7a9987b4e272e261789289b911e409afb19f4a12c4e8c3016d46be67" exitCode=0 Oct 08 19:51:00 crc kubenswrapper[4750]: I1008 19:51:00.174957 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-9f34-account-create-l8vfk" 
event={"ID":"00a4a453-d3de-47f6-84e7-176daf84a18a","Type":"ContainerDied","Data":"e49432ff7a9987b4e272e261789289b911e409afb19f4a12c4e8c3016d46be67"} Oct 08 19:51:00 crc kubenswrapper[4750]: I1008 19:51:00.175404 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-9f34-account-create-l8vfk" event={"ID":"00a4a453-d3de-47f6-84e7-176daf84a18a","Type":"ContainerStarted","Data":"2e2e578055ba4dc8c7140c0a1cde9944ba392e2f8fbebf3b6a078204307ccda3"} Oct 08 19:51:01 crc kubenswrapper[4750]: I1008 19:51:01.542284 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-9f34-account-create-l8vfk" Oct 08 19:51:01 crc kubenswrapper[4750]: I1008 19:51:01.703679 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-td46q\" (UniqueName: \"kubernetes.io/projected/00a4a453-d3de-47f6-84e7-176daf84a18a-kube-api-access-td46q\") pod \"00a4a453-d3de-47f6-84e7-176daf84a18a\" (UID: \"00a4a453-d3de-47f6-84e7-176daf84a18a\") " Oct 08 19:51:01 crc kubenswrapper[4750]: I1008 19:51:01.714639 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00a4a453-d3de-47f6-84e7-176daf84a18a-kube-api-access-td46q" (OuterVolumeSpecName: "kube-api-access-td46q") pod "00a4a453-d3de-47f6-84e7-176daf84a18a" (UID: "00a4a453-d3de-47f6-84e7-176daf84a18a"). InnerVolumeSpecName "kube-api-access-td46q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:51:01 crc kubenswrapper[4750]: I1008 19:51:01.735179 4750 scope.go:117] "RemoveContainer" containerID="fcf13ef2112fe840107d0203594fc62839dd38db4382291c73d378bd35584c76" Oct 08 19:51:01 crc kubenswrapper[4750]: E1008 19:51:01.735652 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:51:01 crc kubenswrapper[4750]: I1008 19:51:01.806375 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-td46q\" (UniqueName: \"kubernetes.io/projected/00a4a453-d3de-47f6-84e7-176daf84a18a-kube-api-access-td46q\") on node \"crc\" DevicePath \"\"" Oct 08 19:51:02 crc kubenswrapper[4750]: I1008 19:51:02.204927 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-9f34-account-create-l8vfk" event={"ID":"00a4a453-d3de-47f6-84e7-176daf84a18a","Type":"ContainerDied","Data":"2e2e578055ba4dc8c7140c0a1cde9944ba392e2f8fbebf3b6a078204307ccda3"} Oct 08 19:51:02 crc kubenswrapper[4750]: I1008 19:51:02.205492 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e2e578055ba4dc8c7140c0a1cde9944ba392e2f8fbebf3b6a078204307ccda3" Oct 08 19:51:02 crc kubenswrapper[4750]: I1008 19:51:02.205006 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-9f34-account-create-l8vfk" Oct 08 19:51:05 crc kubenswrapper[4750]: I1008 19:51:05.704288 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-persistence-db-create-jnvxp"] Oct 08 19:51:05 crc kubenswrapper[4750]: E1008 19:51:05.705320 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00a4a453-d3de-47f6-84e7-176daf84a18a" containerName="mariadb-account-create" Oct 08 19:51:05 crc kubenswrapper[4750]: I1008 19:51:05.705340 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="00a4a453-d3de-47f6-84e7-176daf84a18a" containerName="mariadb-account-create" Oct 08 19:51:05 crc kubenswrapper[4750]: I1008 19:51:05.705623 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="00a4a453-d3de-47f6-84e7-176daf84a18a" containerName="mariadb-account-create" Oct 08 19:51:05 crc kubenswrapper[4750]: I1008 19:51:05.706629 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-jnvxp" Oct 08 19:51:05 crc kubenswrapper[4750]: I1008 19:51:05.748713 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-jnvxp"] Oct 08 19:51:05 crc kubenswrapper[4750]: I1008 19:51:05.820841 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrf2g\" (UniqueName: \"kubernetes.io/projected/1d531061-ecfd-4e0b-b703-4354517d7cec-kube-api-access-wrf2g\") pod \"octavia-persistence-db-create-jnvxp\" (UID: \"1d531061-ecfd-4e0b-b703-4354517d7cec\") " pod="openstack/octavia-persistence-db-create-jnvxp" Oct 08 19:51:05 crc kubenswrapper[4750]: I1008 19:51:05.924350 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrf2g\" (UniqueName: \"kubernetes.io/projected/1d531061-ecfd-4e0b-b703-4354517d7cec-kube-api-access-wrf2g\") pod \"octavia-persistence-db-create-jnvxp\" (UID: 
\"1d531061-ecfd-4e0b-b703-4354517d7cec\") " pod="openstack/octavia-persistence-db-create-jnvxp" Oct 08 19:51:05 crc kubenswrapper[4750]: I1008 19:51:05.949987 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrf2g\" (UniqueName: \"kubernetes.io/projected/1d531061-ecfd-4e0b-b703-4354517d7cec-kube-api-access-wrf2g\") pod \"octavia-persistence-db-create-jnvxp\" (UID: \"1d531061-ecfd-4e0b-b703-4354517d7cec\") " pod="openstack/octavia-persistence-db-create-jnvxp" Oct 08 19:51:06 crc kubenswrapper[4750]: I1008 19:51:06.040334 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-jnvxp" Oct 08 19:51:06 crc kubenswrapper[4750]: W1008 19:51:06.631156 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d531061_ecfd_4e0b_b703_4354517d7cec.slice/crio-85aeaf6c46b1463e2fc897bfca4de00bc9f8980986c48a45fc60afab4f8725ef WatchSource:0}: Error finding container 85aeaf6c46b1463e2fc897bfca4de00bc9f8980986c48a45fc60afab4f8725ef: Status 404 returned error can't find the container with id 85aeaf6c46b1463e2fc897bfca4de00bc9f8980986c48a45fc60afab4f8725ef Oct 08 19:51:06 crc kubenswrapper[4750]: I1008 19:51:06.633273 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-jnvxp"] Oct 08 19:51:07 crc kubenswrapper[4750]: I1008 19:51:07.265994 4750 generic.go:334] "Generic (PLEG): container finished" podID="1d531061-ecfd-4e0b-b703-4354517d7cec" containerID="d93d5c7d187b7a4aa045a71dbb2e45491d77cb51aa51eec6d30c58e4e64ff9b2" exitCode=0 Oct 08 19:51:07 crc kubenswrapper[4750]: I1008 19:51:07.266067 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-jnvxp" event={"ID":"1d531061-ecfd-4e0b-b703-4354517d7cec","Type":"ContainerDied","Data":"d93d5c7d187b7a4aa045a71dbb2e45491d77cb51aa51eec6d30c58e4e64ff9b2"} Oct 08 19:51:07 
crc kubenswrapper[4750]: I1008 19:51:07.266543 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-jnvxp" event={"ID":"1d531061-ecfd-4e0b-b703-4354517d7cec","Type":"ContainerStarted","Data":"85aeaf6c46b1463e2fc897bfca4de00bc9f8980986c48a45fc60afab4f8725ef"} Oct 08 19:51:08 crc kubenswrapper[4750]: I1008 19:51:08.668937 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-jnvxp" Oct 08 19:51:08 crc kubenswrapper[4750]: I1008 19:51:08.790077 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrf2g\" (UniqueName: \"kubernetes.io/projected/1d531061-ecfd-4e0b-b703-4354517d7cec-kube-api-access-wrf2g\") pod \"1d531061-ecfd-4e0b-b703-4354517d7cec\" (UID: \"1d531061-ecfd-4e0b-b703-4354517d7cec\") " Oct 08 19:51:08 crc kubenswrapper[4750]: I1008 19:51:08.802401 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d531061-ecfd-4e0b-b703-4354517d7cec-kube-api-access-wrf2g" (OuterVolumeSpecName: "kube-api-access-wrf2g") pod "1d531061-ecfd-4e0b-b703-4354517d7cec" (UID: "1d531061-ecfd-4e0b-b703-4354517d7cec"). InnerVolumeSpecName "kube-api-access-wrf2g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:51:08 crc kubenswrapper[4750]: I1008 19:51:08.893522 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrf2g\" (UniqueName: \"kubernetes.io/projected/1d531061-ecfd-4e0b-b703-4354517d7cec-kube-api-access-wrf2g\") on node \"crc\" DevicePath \"\"" Oct 08 19:51:09 crc kubenswrapper[4750]: I1008 19:51:09.294815 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-jnvxp" event={"ID":"1d531061-ecfd-4e0b-b703-4354517d7cec","Type":"ContainerDied","Data":"85aeaf6c46b1463e2fc897bfca4de00bc9f8980986c48a45fc60afab4f8725ef"} Oct 08 19:51:09 crc kubenswrapper[4750]: I1008 19:51:09.294886 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85aeaf6c46b1463e2fc897bfca4de00bc9f8980986c48a45fc60afab4f8725ef" Oct 08 19:51:09 crc kubenswrapper[4750]: I1008 19:51:09.294907 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-jnvxp" Oct 08 19:51:13 crc kubenswrapper[4750]: I1008 19:51:13.734361 4750 scope.go:117] "RemoveContainer" containerID="fcf13ef2112fe840107d0203594fc62839dd38db4382291c73d378bd35584c76" Oct 08 19:51:13 crc kubenswrapper[4750]: E1008 19:51:13.735945 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:51:17 crc kubenswrapper[4750]: I1008 19:51:17.005669 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-cf34-account-create-nbmf6"] Oct 08 19:51:17 crc kubenswrapper[4750]: E1008 19:51:17.007056 4750 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="1d531061-ecfd-4e0b-b703-4354517d7cec" containerName="mariadb-database-create" Oct 08 19:51:17 crc kubenswrapper[4750]: I1008 19:51:17.007077 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d531061-ecfd-4e0b-b703-4354517d7cec" containerName="mariadb-database-create" Oct 08 19:51:17 crc kubenswrapper[4750]: I1008 19:51:17.007414 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d531061-ecfd-4e0b-b703-4354517d7cec" containerName="mariadb-database-create" Oct 08 19:51:17 crc kubenswrapper[4750]: I1008 19:51:17.008583 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-cf34-account-create-nbmf6" Oct 08 19:51:17 crc kubenswrapper[4750]: I1008 19:51:17.013443 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-persistence-db-secret" Oct 08 19:51:17 crc kubenswrapper[4750]: I1008 19:51:17.016347 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-cf34-account-create-nbmf6"] Oct 08 19:51:17 crc kubenswrapper[4750]: I1008 19:51:17.092044 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84k56\" (UniqueName: \"kubernetes.io/projected/7c54a1b6-2888-48e7-86d3-7e25103e7a6e-kube-api-access-84k56\") pod \"octavia-cf34-account-create-nbmf6\" (UID: \"7c54a1b6-2888-48e7-86d3-7e25103e7a6e\") " pod="openstack/octavia-cf34-account-create-nbmf6" Oct 08 19:51:17 crc kubenswrapper[4750]: I1008 19:51:17.194651 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84k56\" (UniqueName: \"kubernetes.io/projected/7c54a1b6-2888-48e7-86d3-7e25103e7a6e-kube-api-access-84k56\") pod \"octavia-cf34-account-create-nbmf6\" (UID: \"7c54a1b6-2888-48e7-86d3-7e25103e7a6e\") " pod="openstack/octavia-cf34-account-create-nbmf6" Oct 08 19:51:17 crc kubenswrapper[4750]: I1008 19:51:17.230267 4750 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84k56\" (UniqueName: \"kubernetes.io/projected/7c54a1b6-2888-48e7-86d3-7e25103e7a6e-kube-api-access-84k56\") pod \"octavia-cf34-account-create-nbmf6\" (UID: \"7c54a1b6-2888-48e7-86d3-7e25103e7a6e\") " pod="openstack/octavia-cf34-account-create-nbmf6" Oct 08 19:51:17 crc kubenswrapper[4750]: I1008 19:51:17.346936 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-cf34-account-create-nbmf6" Oct 08 19:51:17 crc kubenswrapper[4750]: I1008 19:51:17.878082 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-cf34-account-create-nbmf6"] Oct 08 19:51:18 crc kubenswrapper[4750]: I1008 19:51:18.454155 4750 generic.go:334] "Generic (PLEG): container finished" podID="7c54a1b6-2888-48e7-86d3-7e25103e7a6e" containerID="6b3c4c24f9072637cf10b61ba6d51475516c22664627b10151f4ffc18dab5dce" exitCode=0 Oct 08 19:51:18 crc kubenswrapper[4750]: I1008 19:51:18.454211 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-cf34-account-create-nbmf6" event={"ID":"7c54a1b6-2888-48e7-86d3-7e25103e7a6e","Type":"ContainerDied","Data":"6b3c4c24f9072637cf10b61ba6d51475516c22664627b10151f4ffc18dab5dce"} Oct 08 19:51:18 crc kubenswrapper[4750]: I1008 19:51:18.454243 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-cf34-account-create-nbmf6" event={"ID":"7c54a1b6-2888-48e7-86d3-7e25103e7a6e","Type":"ContainerStarted","Data":"23f3e12ad2898f1121d97a7c8bce7f487460f55db7089d9dca62d49c52910576"} Oct 08 19:51:19 crc kubenswrapper[4750]: I1008 19:51:19.746179 4750 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-sxhxx" podUID="dba47bae-b4f2-411f-9305-9c5e52fc5213" containerName="ovn-controller" probeResult="failure" output=< Oct 08 19:51:19 crc kubenswrapper[4750]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 08 19:51:19 
crc kubenswrapper[4750]: > Oct 08 19:51:19 crc kubenswrapper[4750]: I1008 19:51:19.792251 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-kwprk" Oct 08 19:51:19 crc kubenswrapper[4750]: I1008 19:51:19.801308 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-kwprk" Oct 08 19:51:19 crc kubenswrapper[4750]: I1008 19:51:19.931316 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-cf34-account-create-nbmf6" Oct 08 19:51:19 crc kubenswrapper[4750]: I1008 19:51:19.941875 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-sxhxx-config-45wwk"] Oct 08 19:51:19 crc kubenswrapper[4750]: E1008 19:51:19.943357 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c54a1b6-2888-48e7-86d3-7e25103e7a6e" containerName="mariadb-account-create" Oct 08 19:51:19 crc kubenswrapper[4750]: I1008 19:51:19.943390 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c54a1b6-2888-48e7-86d3-7e25103e7a6e" containerName="mariadb-account-create" Oct 08 19:51:19 crc kubenswrapper[4750]: I1008 19:51:19.945948 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c54a1b6-2888-48e7-86d3-7e25103e7a6e" containerName="mariadb-account-create" Oct 08 19:51:19 crc kubenswrapper[4750]: I1008 19:51:19.946986 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-sxhxx-config-45wwk" Oct 08 19:51:19 crc kubenswrapper[4750]: I1008 19:51:19.950308 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 08 19:51:19 crc kubenswrapper[4750]: I1008 19:51:19.953122 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-sxhxx-config-45wwk"] Oct 08 19:51:19 crc kubenswrapper[4750]: I1008 19:51:19.958980 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84k56\" (UniqueName: \"kubernetes.io/projected/7c54a1b6-2888-48e7-86d3-7e25103e7a6e-kube-api-access-84k56\") pod \"7c54a1b6-2888-48e7-86d3-7e25103e7a6e\" (UID: \"7c54a1b6-2888-48e7-86d3-7e25103e7a6e\") " Oct 08 19:51:19 crc kubenswrapper[4750]: I1008 19:51:19.966453 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c54a1b6-2888-48e7-86d3-7e25103e7a6e-kube-api-access-84k56" (OuterVolumeSpecName: "kube-api-access-84k56") pod "7c54a1b6-2888-48e7-86d3-7e25103e7a6e" (UID: "7c54a1b6-2888-48e7-86d3-7e25103e7a6e"). InnerVolumeSpecName "kube-api-access-84k56". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:51:20 crc kubenswrapper[4750]: I1008 19:51:20.061267 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld6gt\" (UniqueName: \"kubernetes.io/projected/b6f2c666-bd49-42b1-85e4-2eff524f285d-kube-api-access-ld6gt\") pod \"ovn-controller-sxhxx-config-45wwk\" (UID: \"b6f2c666-bd49-42b1-85e4-2eff524f285d\") " pod="openstack/ovn-controller-sxhxx-config-45wwk" Oct 08 19:51:20 crc kubenswrapper[4750]: I1008 19:51:20.061378 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b6f2c666-bd49-42b1-85e4-2eff524f285d-additional-scripts\") pod \"ovn-controller-sxhxx-config-45wwk\" (UID: \"b6f2c666-bd49-42b1-85e4-2eff524f285d\") " pod="openstack/ovn-controller-sxhxx-config-45wwk" Oct 08 19:51:20 crc kubenswrapper[4750]: I1008 19:51:20.061412 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b6f2c666-bd49-42b1-85e4-2eff524f285d-var-run-ovn\") pod \"ovn-controller-sxhxx-config-45wwk\" (UID: \"b6f2c666-bd49-42b1-85e4-2eff524f285d\") " pod="openstack/ovn-controller-sxhxx-config-45wwk" Oct 08 19:51:20 crc kubenswrapper[4750]: I1008 19:51:20.061441 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b6f2c666-bd49-42b1-85e4-2eff524f285d-var-log-ovn\") pod \"ovn-controller-sxhxx-config-45wwk\" (UID: \"b6f2c666-bd49-42b1-85e4-2eff524f285d\") " pod="openstack/ovn-controller-sxhxx-config-45wwk" Oct 08 19:51:20 crc kubenswrapper[4750]: I1008 19:51:20.061714 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b6f2c666-bd49-42b1-85e4-2eff524f285d-var-run\") 
pod \"ovn-controller-sxhxx-config-45wwk\" (UID: \"b6f2c666-bd49-42b1-85e4-2eff524f285d\") " pod="openstack/ovn-controller-sxhxx-config-45wwk" Oct 08 19:51:20 crc kubenswrapper[4750]: I1008 19:51:20.061819 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6f2c666-bd49-42b1-85e4-2eff524f285d-scripts\") pod \"ovn-controller-sxhxx-config-45wwk\" (UID: \"b6f2c666-bd49-42b1-85e4-2eff524f285d\") " pod="openstack/ovn-controller-sxhxx-config-45wwk" Oct 08 19:51:20 crc kubenswrapper[4750]: I1008 19:51:20.062330 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84k56\" (UniqueName: \"kubernetes.io/projected/7c54a1b6-2888-48e7-86d3-7e25103e7a6e-kube-api-access-84k56\") on node \"crc\" DevicePath \"\"" Oct 08 19:51:20 crc kubenswrapper[4750]: I1008 19:51:20.164865 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b6f2c666-bd49-42b1-85e4-2eff524f285d-additional-scripts\") pod \"ovn-controller-sxhxx-config-45wwk\" (UID: \"b6f2c666-bd49-42b1-85e4-2eff524f285d\") " pod="openstack/ovn-controller-sxhxx-config-45wwk" Oct 08 19:51:20 crc kubenswrapper[4750]: I1008 19:51:20.165265 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b6f2c666-bd49-42b1-85e4-2eff524f285d-var-run-ovn\") pod \"ovn-controller-sxhxx-config-45wwk\" (UID: \"b6f2c666-bd49-42b1-85e4-2eff524f285d\") " pod="openstack/ovn-controller-sxhxx-config-45wwk" Oct 08 19:51:20 crc kubenswrapper[4750]: I1008 19:51:20.165409 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b6f2c666-bd49-42b1-85e4-2eff524f285d-var-log-ovn\") pod \"ovn-controller-sxhxx-config-45wwk\" (UID: \"b6f2c666-bd49-42b1-85e4-2eff524f285d\") " 
pod="openstack/ovn-controller-sxhxx-config-45wwk" Oct 08 19:51:20 crc kubenswrapper[4750]: I1008 19:51:20.165609 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b6f2c666-bd49-42b1-85e4-2eff524f285d-var-run\") pod \"ovn-controller-sxhxx-config-45wwk\" (UID: \"b6f2c666-bd49-42b1-85e4-2eff524f285d\") " pod="openstack/ovn-controller-sxhxx-config-45wwk" Oct 08 19:51:20 crc kubenswrapper[4750]: I1008 19:51:20.165707 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b6f2c666-bd49-42b1-85e4-2eff524f285d-var-run-ovn\") pod \"ovn-controller-sxhxx-config-45wwk\" (UID: \"b6f2c666-bd49-42b1-85e4-2eff524f285d\") " pod="openstack/ovn-controller-sxhxx-config-45wwk" Oct 08 19:51:20 crc kubenswrapper[4750]: I1008 19:51:20.165817 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b6f2c666-bd49-42b1-85e4-2eff524f285d-additional-scripts\") pod \"ovn-controller-sxhxx-config-45wwk\" (UID: \"b6f2c666-bd49-42b1-85e4-2eff524f285d\") " pod="openstack/ovn-controller-sxhxx-config-45wwk" Oct 08 19:51:20 crc kubenswrapper[4750]: I1008 19:51:20.165844 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b6f2c666-bd49-42b1-85e4-2eff524f285d-var-run\") pod \"ovn-controller-sxhxx-config-45wwk\" (UID: \"b6f2c666-bd49-42b1-85e4-2eff524f285d\") " pod="openstack/ovn-controller-sxhxx-config-45wwk" Oct 08 19:51:20 crc kubenswrapper[4750]: I1008 19:51:20.165881 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b6f2c666-bd49-42b1-85e4-2eff524f285d-var-log-ovn\") pod \"ovn-controller-sxhxx-config-45wwk\" (UID: \"b6f2c666-bd49-42b1-85e4-2eff524f285d\") " pod="openstack/ovn-controller-sxhxx-config-45wwk" Oct 08 19:51:20 
crc kubenswrapper[4750]: I1008 19:51:20.166037 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6f2c666-bd49-42b1-85e4-2eff524f285d-scripts\") pod \"ovn-controller-sxhxx-config-45wwk\" (UID: \"b6f2c666-bd49-42b1-85e4-2eff524f285d\") " pod="openstack/ovn-controller-sxhxx-config-45wwk" Oct 08 19:51:20 crc kubenswrapper[4750]: I1008 19:51:20.166280 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ld6gt\" (UniqueName: \"kubernetes.io/projected/b6f2c666-bd49-42b1-85e4-2eff524f285d-kube-api-access-ld6gt\") pod \"ovn-controller-sxhxx-config-45wwk\" (UID: \"b6f2c666-bd49-42b1-85e4-2eff524f285d\") " pod="openstack/ovn-controller-sxhxx-config-45wwk" Oct 08 19:51:20 crc kubenswrapper[4750]: I1008 19:51:20.167788 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6f2c666-bd49-42b1-85e4-2eff524f285d-scripts\") pod \"ovn-controller-sxhxx-config-45wwk\" (UID: \"b6f2c666-bd49-42b1-85e4-2eff524f285d\") " pod="openstack/ovn-controller-sxhxx-config-45wwk" Oct 08 19:51:20 crc kubenswrapper[4750]: I1008 19:51:20.191344 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld6gt\" (UniqueName: \"kubernetes.io/projected/b6f2c666-bd49-42b1-85e4-2eff524f285d-kube-api-access-ld6gt\") pod \"ovn-controller-sxhxx-config-45wwk\" (UID: \"b6f2c666-bd49-42b1-85e4-2eff524f285d\") " pod="openstack/ovn-controller-sxhxx-config-45wwk" Oct 08 19:51:20 crc kubenswrapper[4750]: I1008 19:51:20.320220 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-sxhxx-config-45wwk" Oct 08 19:51:20 crc kubenswrapper[4750]: I1008 19:51:20.493355 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-cf34-account-create-nbmf6" event={"ID":"7c54a1b6-2888-48e7-86d3-7e25103e7a6e","Type":"ContainerDied","Data":"23f3e12ad2898f1121d97a7c8bce7f487460f55db7089d9dca62d49c52910576"} Oct 08 19:51:20 crc kubenswrapper[4750]: I1008 19:51:20.493807 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23f3e12ad2898f1121d97a7c8bce7f487460f55db7089d9dca62d49c52910576" Oct 08 19:51:20 crc kubenswrapper[4750]: I1008 19:51:20.493411 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-cf34-account-create-nbmf6" Oct 08 19:51:20 crc kubenswrapper[4750]: I1008 19:51:20.839591 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-sxhxx-config-45wwk"] Oct 08 19:51:20 crc kubenswrapper[4750]: W1008 19:51:20.840723 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6f2c666_bd49_42b1_85e4_2eff524f285d.slice/crio-e64f9c1b9c65835fd85d920d95299e0fc8b7ede4f7aa6b029ea94d7634586460 WatchSource:0}: Error finding container e64f9c1b9c65835fd85d920d95299e0fc8b7ede4f7aa6b029ea94d7634586460: Status 404 returned error can't find the container with id e64f9c1b9c65835fd85d920d95299e0fc8b7ede4f7aa6b029ea94d7634586460 Oct 08 19:51:21 crc kubenswrapper[4750]: I1008 19:51:21.506377 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sxhxx-config-45wwk" event={"ID":"b6f2c666-bd49-42b1-85e4-2eff524f285d","Type":"ContainerStarted","Data":"12a691284ef8f077cacb5c421b935acefa9df0db069e0f0c357e73a4f9f185f2"} Oct 08 19:51:21 crc kubenswrapper[4750]: I1008 19:51:21.507142 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-sxhxx-config-45wwk" event={"ID":"b6f2c666-bd49-42b1-85e4-2eff524f285d","Type":"ContainerStarted","Data":"e64f9c1b9c65835fd85d920d95299e0fc8b7ede4f7aa6b029ea94d7634586460"} Oct 08 19:51:21 crc kubenswrapper[4750]: I1008 19:51:21.539737 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-sxhxx-config-45wwk" podStartSLOduration=2.539713518 podStartE2EDuration="2.539713518s" podCreationTimestamp="2025-10-08 19:51:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:51:21.529224377 +0000 UTC m=+6037.442195400" watchObservedRunningTime="2025-10-08 19:51:21.539713518 +0000 UTC m=+6037.452684531" Oct 08 19:51:22 crc kubenswrapper[4750]: I1008 19:51:22.518984 4750 generic.go:334] "Generic (PLEG): container finished" podID="b6f2c666-bd49-42b1-85e4-2eff524f285d" containerID="12a691284ef8f077cacb5c421b935acefa9df0db069e0f0c357e73a4f9f185f2" exitCode=0 Oct 08 19:51:22 crc kubenswrapper[4750]: I1008 19:51:22.519060 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sxhxx-config-45wwk" event={"ID":"b6f2c666-bd49-42b1-85e4-2eff524f285d","Type":"ContainerDied","Data":"12a691284ef8f077cacb5c421b935acefa9df0db069e0f0c357e73a4f9f185f2"} Oct 08 19:51:23 crc kubenswrapper[4750]: I1008 19:51:23.841879 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-api-54fd9b66fd-67k7j"] Oct 08 19:51:23 crc kubenswrapper[4750]: I1008 19:51:23.849655 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-api-54fd9b66fd-67k7j" Oct 08 19:51:23 crc kubenswrapper[4750]: I1008 19:51:23.855407 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-config-data" Oct 08 19:51:23 crc kubenswrapper[4750]: I1008 19:51:23.855726 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-octavia-dockercfg-gzswd" Oct 08 19:51:23 crc kubenswrapper[4750]: I1008 19:51:23.855898 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-scripts" Oct 08 19:51:23 crc kubenswrapper[4750]: I1008 19:51:23.860999 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-54fd9b66fd-67k7j"] Oct 08 19:51:23 crc kubenswrapper[4750]: I1008 19:51:23.982026 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-sxhxx-config-45wwk" Oct 08 19:51:23 crc kubenswrapper[4750]: I1008 19:51:23.993654 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/341cf41b-5181-49d4-a574-f10076a59aa2-scripts\") pod \"octavia-api-54fd9b66fd-67k7j\" (UID: \"341cf41b-5181-49d4-a574-f10076a59aa2\") " pod="openstack/octavia-api-54fd9b66fd-67k7j" Oct 08 19:51:23 crc kubenswrapper[4750]: I1008 19:51:23.993712 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/341cf41b-5181-49d4-a574-f10076a59aa2-combined-ca-bundle\") pod \"octavia-api-54fd9b66fd-67k7j\" (UID: \"341cf41b-5181-49d4-a574-f10076a59aa2\") " pod="openstack/octavia-api-54fd9b66fd-67k7j" Oct 08 19:51:23 crc kubenswrapper[4750]: I1008 19:51:23.993797 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/341cf41b-5181-49d4-a574-f10076a59aa2-config-data\") pod \"octavia-api-54fd9b66fd-67k7j\" (UID: \"341cf41b-5181-49d4-a574-f10076a59aa2\") " pod="openstack/octavia-api-54fd9b66fd-67k7j" Oct 08 19:51:23 crc kubenswrapper[4750]: I1008 19:51:23.993832 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/341cf41b-5181-49d4-a574-f10076a59aa2-octavia-run\") pod \"octavia-api-54fd9b66fd-67k7j\" (UID: \"341cf41b-5181-49d4-a574-f10076a59aa2\") " pod="openstack/octavia-api-54fd9b66fd-67k7j" Oct 08 19:51:23 crc kubenswrapper[4750]: I1008 19:51:23.993882 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/341cf41b-5181-49d4-a574-f10076a59aa2-config-data-merged\") pod \"octavia-api-54fd9b66fd-67k7j\" (UID: \"341cf41b-5181-49d4-a574-f10076a59aa2\") " pod="openstack/octavia-api-54fd9b66fd-67k7j" Oct 08 19:51:24 crc kubenswrapper[4750]: I1008 19:51:24.095224 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b6f2c666-bd49-42b1-85e4-2eff524f285d-var-run\") pod \"b6f2c666-bd49-42b1-85e4-2eff524f285d\" (UID: \"b6f2c666-bd49-42b1-85e4-2eff524f285d\") " Oct 08 19:51:24 crc kubenswrapper[4750]: I1008 19:51:24.095276 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6f2c666-bd49-42b1-85e4-2eff524f285d-scripts\") pod \"b6f2c666-bd49-42b1-85e4-2eff524f285d\" (UID: \"b6f2c666-bd49-42b1-85e4-2eff524f285d\") " Oct 08 19:51:24 crc kubenswrapper[4750]: I1008 19:51:24.095354 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ld6gt\" (UniqueName: \"kubernetes.io/projected/b6f2c666-bd49-42b1-85e4-2eff524f285d-kube-api-access-ld6gt\") pod 
\"b6f2c666-bd49-42b1-85e4-2eff524f285d\" (UID: \"b6f2c666-bd49-42b1-85e4-2eff524f285d\") " Oct 08 19:51:24 crc kubenswrapper[4750]: I1008 19:51:24.095372 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b6f2c666-bd49-42b1-85e4-2eff524f285d-additional-scripts\") pod \"b6f2c666-bd49-42b1-85e4-2eff524f285d\" (UID: \"b6f2c666-bd49-42b1-85e4-2eff524f285d\") " Oct 08 19:51:24 crc kubenswrapper[4750]: I1008 19:51:24.095509 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b6f2c666-bd49-42b1-85e4-2eff524f285d-var-log-ovn\") pod \"b6f2c666-bd49-42b1-85e4-2eff524f285d\" (UID: \"b6f2c666-bd49-42b1-85e4-2eff524f285d\") " Oct 08 19:51:24 crc kubenswrapper[4750]: I1008 19:51:24.095540 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b6f2c666-bd49-42b1-85e4-2eff524f285d-var-run-ovn\") pod \"b6f2c666-bd49-42b1-85e4-2eff524f285d\" (UID: \"b6f2c666-bd49-42b1-85e4-2eff524f285d\") " Oct 08 19:51:24 crc kubenswrapper[4750]: I1008 19:51:24.095877 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/341cf41b-5181-49d4-a574-f10076a59aa2-config-data-merged\") pod \"octavia-api-54fd9b66fd-67k7j\" (UID: \"341cf41b-5181-49d4-a574-f10076a59aa2\") " pod="openstack/octavia-api-54fd9b66fd-67k7j" Oct 08 19:51:24 crc kubenswrapper[4750]: I1008 19:51:24.096000 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/341cf41b-5181-49d4-a574-f10076a59aa2-scripts\") pod \"octavia-api-54fd9b66fd-67k7j\" (UID: \"341cf41b-5181-49d4-a574-f10076a59aa2\") " pod="openstack/octavia-api-54fd9b66fd-67k7j" Oct 08 19:51:24 crc kubenswrapper[4750]: I1008 19:51:24.096035 4750 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/341cf41b-5181-49d4-a574-f10076a59aa2-combined-ca-bundle\") pod \"octavia-api-54fd9b66fd-67k7j\" (UID: \"341cf41b-5181-49d4-a574-f10076a59aa2\") " pod="openstack/octavia-api-54fd9b66fd-67k7j" Oct 08 19:51:24 crc kubenswrapper[4750]: I1008 19:51:24.096121 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/341cf41b-5181-49d4-a574-f10076a59aa2-config-data\") pod \"octavia-api-54fd9b66fd-67k7j\" (UID: \"341cf41b-5181-49d4-a574-f10076a59aa2\") " pod="openstack/octavia-api-54fd9b66fd-67k7j" Oct 08 19:51:24 crc kubenswrapper[4750]: I1008 19:51:24.096175 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/341cf41b-5181-49d4-a574-f10076a59aa2-octavia-run\") pod \"octavia-api-54fd9b66fd-67k7j\" (UID: \"341cf41b-5181-49d4-a574-f10076a59aa2\") " pod="openstack/octavia-api-54fd9b66fd-67k7j" Oct 08 19:51:24 crc kubenswrapper[4750]: I1008 19:51:24.096867 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/341cf41b-5181-49d4-a574-f10076a59aa2-octavia-run\") pod \"octavia-api-54fd9b66fd-67k7j\" (UID: \"341cf41b-5181-49d4-a574-f10076a59aa2\") " pod="openstack/octavia-api-54fd9b66fd-67k7j" Oct 08 19:51:24 crc kubenswrapper[4750]: I1008 19:51:24.096924 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b6f2c666-bd49-42b1-85e4-2eff524f285d-var-run" (OuterVolumeSpecName: "var-run") pod "b6f2c666-bd49-42b1-85e4-2eff524f285d" (UID: "b6f2c666-bd49-42b1-85e4-2eff524f285d"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 19:51:24 crc kubenswrapper[4750]: I1008 19:51:24.098065 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6f2c666-bd49-42b1-85e4-2eff524f285d-scripts" (OuterVolumeSpecName: "scripts") pod "b6f2c666-bd49-42b1-85e4-2eff524f285d" (UID: "b6f2c666-bd49-42b1-85e4-2eff524f285d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:51:24 crc kubenswrapper[4750]: I1008 19:51:24.099282 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b6f2c666-bd49-42b1-85e4-2eff524f285d-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "b6f2c666-bd49-42b1-85e4-2eff524f285d" (UID: "b6f2c666-bd49-42b1-85e4-2eff524f285d"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 19:51:24 crc kubenswrapper[4750]: I1008 19:51:24.099343 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/341cf41b-5181-49d4-a574-f10076a59aa2-config-data-merged\") pod \"octavia-api-54fd9b66fd-67k7j\" (UID: \"341cf41b-5181-49d4-a574-f10076a59aa2\") " pod="openstack/octavia-api-54fd9b66fd-67k7j" Oct 08 19:51:24 crc kubenswrapper[4750]: I1008 19:51:24.099377 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b6f2c666-bd49-42b1-85e4-2eff524f285d-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "b6f2c666-bd49-42b1-85e4-2eff524f285d" (UID: "b6f2c666-bd49-42b1-85e4-2eff524f285d"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 19:51:24 crc kubenswrapper[4750]: I1008 19:51:24.100544 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6f2c666-bd49-42b1-85e4-2eff524f285d-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "b6f2c666-bd49-42b1-85e4-2eff524f285d" (UID: "b6f2c666-bd49-42b1-85e4-2eff524f285d"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:51:24 crc kubenswrapper[4750]: I1008 19:51:24.106818 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/341cf41b-5181-49d4-a574-f10076a59aa2-scripts\") pod \"octavia-api-54fd9b66fd-67k7j\" (UID: \"341cf41b-5181-49d4-a574-f10076a59aa2\") " pod="openstack/octavia-api-54fd9b66fd-67k7j" Oct 08 19:51:24 crc kubenswrapper[4750]: I1008 19:51:24.107502 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/341cf41b-5181-49d4-a574-f10076a59aa2-config-data\") pod \"octavia-api-54fd9b66fd-67k7j\" (UID: \"341cf41b-5181-49d4-a574-f10076a59aa2\") " pod="openstack/octavia-api-54fd9b66fd-67k7j" Oct 08 19:51:24 crc kubenswrapper[4750]: I1008 19:51:24.107938 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6f2c666-bd49-42b1-85e4-2eff524f285d-kube-api-access-ld6gt" (OuterVolumeSpecName: "kube-api-access-ld6gt") pod "b6f2c666-bd49-42b1-85e4-2eff524f285d" (UID: "b6f2c666-bd49-42b1-85e4-2eff524f285d"). InnerVolumeSpecName "kube-api-access-ld6gt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:51:24 crc kubenswrapper[4750]: I1008 19:51:24.109172 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/341cf41b-5181-49d4-a574-f10076a59aa2-combined-ca-bundle\") pod \"octavia-api-54fd9b66fd-67k7j\" (UID: \"341cf41b-5181-49d4-a574-f10076a59aa2\") " pod="openstack/octavia-api-54fd9b66fd-67k7j" Oct 08 19:51:24 crc kubenswrapper[4750]: I1008 19:51:24.181815 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-54fd9b66fd-67k7j" Oct 08 19:51:24 crc kubenswrapper[4750]: I1008 19:51:24.198188 4750 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b6f2c666-bd49-42b1-85e4-2eff524f285d-var-run\") on node \"crc\" DevicePath \"\"" Oct 08 19:51:24 crc kubenswrapper[4750]: I1008 19:51:24.198221 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6f2c666-bd49-42b1-85e4-2eff524f285d-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 19:51:24 crc kubenswrapper[4750]: I1008 19:51:24.198232 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ld6gt\" (UniqueName: \"kubernetes.io/projected/b6f2c666-bd49-42b1-85e4-2eff524f285d-kube-api-access-ld6gt\") on node \"crc\" DevicePath \"\"" Oct 08 19:51:24 crc kubenswrapper[4750]: I1008 19:51:24.198245 4750 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b6f2c666-bd49-42b1-85e4-2eff524f285d-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 19:51:24 crc kubenswrapper[4750]: I1008 19:51:24.198254 4750 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b6f2c666-bd49-42b1-85e4-2eff524f285d-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 08 19:51:24 crc kubenswrapper[4750]: 
I1008 19:51:24.198262 4750 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b6f2c666-bd49-42b1-85e4-2eff524f285d-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 08 19:51:24 crc kubenswrapper[4750]: I1008 19:51:24.541863 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sxhxx-config-45wwk" event={"ID":"b6f2c666-bd49-42b1-85e4-2eff524f285d","Type":"ContainerDied","Data":"e64f9c1b9c65835fd85d920d95299e0fc8b7ede4f7aa6b029ea94d7634586460"} Oct 08 19:51:24 crc kubenswrapper[4750]: I1008 19:51:24.542269 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e64f9c1b9c65835fd85d920d95299e0fc8b7ede4f7aa6b029ea94d7634586460" Oct 08 19:51:24 crc kubenswrapper[4750]: I1008 19:51:24.541930 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-sxhxx-config-45wwk" Oct 08 19:51:24 crc kubenswrapper[4750]: I1008 19:51:24.636700 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-sxhxx-config-45wwk"] Oct 08 19:51:24 crc kubenswrapper[4750]: I1008 19:51:24.646185 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-sxhxx-config-45wwk"] Oct 08 19:51:24 crc kubenswrapper[4750]: I1008 19:51:24.707088 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-54fd9b66fd-67k7j"] Oct 08 19:51:24 crc kubenswrapper[4750]: I1008 19:51:24.752956 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6f2c666-bd49-42b1-85e4-2eff524f285d" path="/var/lib/kubelet/pods/b6f2c666-bd49-42b1-85e4-2eff524f285d/volumes" Oct 08 19:51:24 crc kubenswrapper[4750]: I1008 19:51:24.788539 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-sxhxx" Oct 08 19:51:25 crc kubenswrapper[4750]: I1008 19:51:25.557102 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/octavia-api-54fd9b66fd-67k7j" event={"ID":"341cf41b-5181-49d4-a574-f10076a59aa2","Type":"ContainerStarted","Data":"d7306af5f1e266095ebd8ade213ae62c8ccc9c52a4663b93906ab493c97ede0d"} Oct 08 19:51:25 crc kubenswrapper[4750]: I1008 19:51:25.734120 4750 scope.go:117] "RemoveContainer" containerID="fcf13ef2112fe840107d0203594fc62839dd38db4382291c73d378bd35584c76" Oct 08 19:51:25 crc kubenswrapper[4750]: E1008 19:51:25.734540 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:51:34 crc kubenswrapper[4750]: I1008 19:51:34.692445 4750 generic.go:334] "Generic (PLEG): container finished" podID="341cf41b-5181-49d4-a574-f10076a59aa2" containerID="926f6de4331be3f702dffcd92e113f41aaded1f313291111370dbb5a87b26d0d" exitCode=0 Oct 08 19:51:34 crc kubenswrapper[4750]: I1008 19:51:34.692606 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-54fd9b66fd-67k7j" event={"ID":"341cf41b-5181-49d4-a574-f10076a59aa2","Type":"ContainerDied","Data":"926f6de4331be3f702dffcd92e113f41aaded1f313291111370dbb5a87b26d0d"} Oct 08 19:51:35 crc kubenswrapper[4750]: I1008 19:51:35.707146 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-54fd9b66fd-67k7j" event={"ID":"341cf41b-5181-49d4-a574-f10076a59aa2","Type":"ContainerStarted","Data":"861a0d779032b4a964806a1dd7cdbbe1b7f4a5b2db796b4401a8fe930e67adf3"} Oct 08 19:51:35 crc kubenswrapper[4750]: I1008 19:51:35.707747 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-54fd9b66fd-67k7j" Oct 08 19:51:35 crc kubenswrapper[4750]: I1008 19:51:35.707765 4750 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-54fd9b66fd-67k7j" event={"ID":"341cf41b-5181-49d4-a574-f10076a59aa2","Type":"ContainerStarted","Data":"0b4adc0ed011ecde2857b69b17a8bf321dcbda8b45120bbfedf8437d9a35d862"} Oct 08 19:51:35 crc kubenswrapper[4750]: I1008 19:51:35.707782 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-54fd9b66fd-67k7j" Oct 08 19:51:35 crc kubenswrapper[4750]: I1008 19:51:35.738639 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-api-54fd9b66fd-67k7j" podStartSLOduration=3.950458749 podStartE2EDuration="12.738618114s" podCreationTimestamp="2025-10-08 19:51:23 +0000 UTC" firstStartedPulling="2025-10-08 19:51:24.71986327 +0000 UTC m=+6040.632834283" lastFinishedPulling="2025-10-08 19:51:33.508022635 +0000 UTC m=+6049.420993648" observedRunningTime="2025-10-08 19:51:35.736918392 +0000 UTC m=+6051.649889405" watchObservedRunningTime="2025-10-08 19:51:35.738618114 +0000 UTC m=+6051.651589127" Oct 08 19:51:37 crc kubenswrapper[4750]: I1008 19:51:37.736023 4750 scope.go:117] "RemoveContainer" containerID="fcf13ef2112fe840107d0203594fc62839dd38db4382291c73d378bd35584c76" Oct 08 19:51:37 crc kubenswrapper[4750]: E1008 19:51:37.737083 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:51:43 crc kubenswrapper[4750]: I1008 19:51:43.485027 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-rsyslog-tctqf"] Oct 08 19:51:43 crc kubenswrapper[4750]: E1008 19:51:43.487360 4750 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="b6f2c666-bd49-42b1-85e4-2eff524f285d" containerName="ovn-config" Oct 08 19:51:43 crc kubenswrapper[4750]: I1008 19:51:43.487438 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6f2c666-bd49-42b1-85e4-2eff524f285d" containerName="ovn-config" Oct 08 19:51:43 crc kubenswrapper[4750]: I1008 19:51:43.488428 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6f2c666-bd49-42b1-85e4-2eff524f285d" containerName="ovn-config" Oct 08 19:51:43 crc kubenswrapper[4750]: I1008 19:51:43.495218 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-tctqf" Oct 08 19:51:43 crc kubenswrapper[4750]: I1008 19:51:43.498911 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"octavia-hmport-map" Oct 08 19:51:43 crc kubenswrapper[4750]: I1008 19:51:43.499402 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-scripts" Oct 08 19:51:43 crc kubenswrapper[4750]: I1008 19:51:43.499888 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-config-data" Oct 08 19:51:43 crc kubenswrapper[4750]: I1008 19:51:43.506825 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-tctqf"] Oct 08 19:51:43 crc kubenswrapper[4750]: I1008 19:51:43.559955 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-54fd9b66fd-67k7j" Oct 08 19:51:43 crc kubenswrapper[4750]: I1008 19:51:43.631299 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f9fbe67-14ed-464f-b087-612a789d7d90-config-data\") pod \"octavia-rsyslog-tctqf\" (UID: \"2f9fbe67-14ed-464f-b087-612a789d7d90\") " pod="openstack/octavia-rsyslog-tctqf" Oct 08 19:51:43 crc kubenswrapper[4750]: I1008 19:51:43.631401 4750 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/2f9fbe67-14ed-464f-b087-612a789d7d90-hm-ports\") pod \"octavia-rsyslog-tctqf\" (UID: \"2f9fbe67-14ed-464f-b087-612a789d7d90\") " pod="openstack/octavia-rsyslog-tctqf" Oct 08 19:51:43 crc kubenswrapper[4750]: I1008 19:51:43.631578 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/2f9fbe67-14ed-464f-b087-612a789d7d90-config-data-merged\") pod \"octavia-rsyslog-tctqf\" (UID: \"2f9fbe67-14ed-464f-b087-612a789d7d90\") " pod="openstack/octavia-rsyslog-tctqf" Oct 08 19:51:43 crc kubenswrapper[4750]: I1008 19:51:43.631607 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f9fbe67-14ed-464f-b087-612a789d7d90-scripts\") pod \"octavia-rsyslog-tctqf\" (UID: \"2f9fbe67-14ed-464f-b087-612a789d7d90\") " pod="openstack/octavia-rsyslog-tctqf" Oct 08 19:51:43 crc kubenswrapper[4750]: I1008 19:51:43.734328 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/2f9fbe67-14ed-464f-b087-612a789d7d90-config-data-merged\") pod \"octavia-rsyslog-tctqf\" (UID: \"2f9fbe67-14ed-464f-b087-612a789d7d90\") " pod="openstack/octavia-rsyslog-tctqf" Oct 08 19:51:43 crc kubenswrapper[4750]: I1008 19:51:43.734418 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f9fbe67-14ed-464f-b087-612a789d7d90-scripts\") pod \"octavia-rsyslog-tctqf\" (UID: \"2f9fbe67-14ed-464f-b087-612a789d7d90\") " pod="openstack/octavia-rsyslog-tctqf" Oct 08 19:51:43 crc kubenswrapper[4750]: I1008 19:51:43.734525 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2f9fbe67-14ed-464f-b087-612a789d7d90-config-data\") pod \"octavia-rsyslog-tctqf\" (UID: \"2f9fbe67-14ed-464f-b087-612a789d7d90\") " pod="openstack/octavia-rsyslog-tctqf" Oct 08 19:51:43 crc kubenswrapper[4750]: I1008 19:51:43.734744 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/2f9fbe67-14ed-464f-b087-612a789d7d90-hm-ports\") pod \"octavia-rsyslog-tctqf\" (UID: \"2f9fbe67-14ed-464f-b087-612a789d7d90\") " pod="openstack/octavia-rsyslog-tctqf" Oct 08 19:51:43 crc kubenswrapper[4750]: I1008 19:51:43.734972 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/2f9fbe67-14ed-464f-b087-612a789d7d90-config-data-merged\") pod \"octavia-rsyslog-tctqf\" (UID: \"2f9fbe67-14ed-464f-b087-612a789d7d90\") " pod="openstack/octavia-rsyslog-tctqf" Oct 08 19:51:43 crc kubenswrapper[4750]: I1008 19:51:43.736603 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/2f9fbe67-14ed-464f-b087-612a789d7d90-hm-ports\") pod \"octavia-rsyslog-tctqf\" (UID: \"2f9fbe67-14ed-464f-b087-612a789d7d90\") " pod="openstack/octavia-rsyslog-tctqf" Oct 08 19:51:43 crc kubenswrapper[4750]: I1008 19:51:43.749699 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f9fbe67-14ed-464f-b087-612a789d7d90-scripts\") pod \"octavia-rsyslog-tctqf\" (UID: \"2f9fbe67-14ed-464f-b087-612a789d7d90\") " pod="openstack/octavia-rsyslog-tctqf" Oct 08 19:51:43 crc kubenswrapper[4750]: I1008 19:51:43.761966 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f9fbe67-14ed-464f-b087-612a789d7d90-config-data\") pod \"octavia-rsyslog-tctqf\" (UID: \"2f9fbe67-14ed-464f-b087-612a789d7d90\") " pod="openstack/octavia-rsyslog-tctqf" Oct 
08 19:51:43 crc kubenswrapper[4750]: I1008 19:51:43.834849 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-tctqf" Oct 08 19:51:44 crc kubenswrapper[4750]: I1008 19:51:44.511349 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-tctqf"] Oct 08 19:51:44 crc kubenswrapper[4750]: I1008 19:51:44.611003 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-678599687f-9f4vp"] Oct 08 19:51:44 crc kubenswrapper[4750]: I1008 19:51:44.613023 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-678599687f-9f4vp" Oct 08 19:51:44 crc kubenswrapper[4750]: I1008 19:51:44.621218 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Oct 08 19:51:44 crc kubenswrapper[4750]: I1008 19:51:44.628078 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-678599687f-9f4vp"] Oct 08 19:51:44 crc kubenswrapper[4750]: I1008 19:51:44.760101 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2bc718e0-7c35-4361-812b-7d67c978e425-httpd-config\") pod \"octavia-image-upload-678599687f-9f4vp\" (UID: \"2bc718e0-7c35-4361-812b-7d67c978e425\") " pod="openstack/octavia-image-upload-678599687f-9f4vp" Oct 08 19:51:44 crc kubenswrapper[4750]: I1008 19:51:44.761984 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/2bc718e0-7c35-4361-812b-7d67c978e425-amphora-image\") pod \"octavia-image-upload-678599687f-9f4vp\" (UID: \"2bc718e0-7c35-4361-812b-7d67c978e425\") " pod="openstack/octavia-image-upload-678599687f-9f4vp" Oct 08 19:51:44 crc kubenswrapper[4750]: I1008 19:51:44.842851 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/octavia-rsyslog-tctqf" event={"ID":"2f9fbe67-14ed-464f-b087-612a789d7d90","Type":"ContainerStarted","Data":"736ce33bab85603bccea53ccb20628795cc5692498e10eac5efd538bcb3d18e6"} Oct 08 19:51:44 crc kubenswrapper[4750]: I1008 19:51:44.864397 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2bc718e0-7c35-4361-812b-7d67c978e425-httpd-config\") pod \"octavia-image-upload-678599687f-9f4vp\" (UID: \"2bc718e0-7c35-4361-812b-7d67c978e425\") " pod="openstack/octavia-image-upload-678599687f-9f4vp" Oct 08 19:51:44 crc kubenswrapper[4750]: I1008 19:51:44.864509 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/2bc718e0-7c35-4361-812b-7d67c978e425-amphora-image\") pod \"octavia-image-upload-678599687f-9f4vp\" (UID: \"2bc718e0-7c35-4361-812b-7d67c978e425\") " pod="openstack/octavia-image-upload-678599687f-9f4vp" Oct 08 19:51:44 crc kubenswrapper[4750]: I1008 19:51:44.865107 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/2bc718e0-7c35-4361-812b-7d67c978e425-amphora-image\") pod \"octavia-image-upload-678599687f-9f4vp\" (UID: \"2bc718e0-7c35-4361-812b-7d67c978e425\") " pod="openstack/octavia-image-upload-678599687f-9f4vp" Oct 08 19:51:44 crc kubenswrapper[4750]: I1008 19:51:44.867949 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Oct 08 19:51:44 crc kubenswrapper[4750]: I1008 19:51:44.887671 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2bc718e0-7c35-4361-812b-7d67c978e425-httpd-config\") pod \"octavia-image-upload-678599687f-9f4vp\" (UID: \"2bc718e0-7c35-4361-812b-7d67c978e425\") " pod="openstack/octavia-image-upload-678599687f-9f4vp" Oct 08 19:51:44 crc kubenswrapper[4750]: I1008 
19:51:44.950183 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-678599687f-9f4vp" Oct 08 19:51:45 crc kubenswrapper[4750]: I1008 19:51:45.491049 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-678599687f-9f4vp"] Oct 08 19:51:45 crc kubenswrapper[4750]: W1008 19:51:45.508179 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bc718e0_7c35_4361_812b_7d67c978e425.slice/crio-dc70b13bb033943971f37a47b9b234109ba80948816e5bae2270ed4f9be146c1 WatchSource:0}: Error finding container dc70b13bb033943971f37a47b9b234109ba80948816e5bae2270ed4f9be146c1: Status 404 returned error can't find the container with id dc70b13bb033943971f37a47b9b234109ba80948816e5bae2270ed4f9be146c1 Oct 08 19:51:45 crc kubenswrapper[4750]: I1008 19:51:45.862864 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-678599687f-9f4vp" event={"ID":"2bc718e0-7c35-4361-812b-7d67c978e425","Type":"ContainerStarted","Data":"dc70b13bb033943971f37a47b9b234109ba80948816e5bae2270ed4f9be146c1"} Oct 08 19:51:47 crc kubenswrapper[4750]: I1008 19:51:47.906307 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-tctqf" event={"ID":"2f9fbe67-14ed-464f-b087-612a789d7d90","Type":"ContainerStarted","Data":"822e002553cb27d5a2016c941939812f0ce3a1cbf7b39e88e03bd34616236d46"} Oct 08 19:51:48 crc kubenswrapper[4750]: I1008 19:51:48.298411 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-54fd9b66fd-67k7j" Oct 08 19:51:49 crc kubenswrapper[4750]: I1008 19:51:49.931003 4750 generic.go:334] "Generic (PLEG): container finished" podID="2f9fbe67-14ed-464f-b087-612a789d7d90" containerID="822e002553cb27d5a2016c941939812f0ce3a1cbf7b39e88e03bd34616236d46" exitCode=0 Oct 08 19:51:49 crc kubenswrapper[4750]: I1008 19:51:49.931105 4750 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-tctqf" event={"ID":"2f9fbe67-14ed-464f-b087-612a789d7d90","Type":"ContainerDied","Data":"822e002553cb27d5a2016c941939812f0ce3a1cbf7b39e88e03bd34616236d46"} Oct 08 19:51:51 crc kubenswrapper[4750]: I1008 19:51:51.734325 4750 scope.go:117] "RemoveContainer" containerID="fcf13ef2112fe840107d0203594fc62839dd38db4382291c73d378bd35584c76" Oct 08 19:51:51 crc kubenswrapper[4750]: E1008 19:51:51.735042 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:51:53 crc kubenswrapper[4750]: I1008 19:51:53.664921 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-sync-gj8s8"] Oct 08 19:51:53 crc kubenswrapper[4750]: I1008 19:51:53.668518 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-gj8s8" Oct 08 19:51:53 crc kubenswrapper[4750]: I1008 19:51:53.672931 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-scripts" Oct 08 19:51:53 crc kubenswrapper[4750]: I1008 19:51:53.682224 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-gj8s8"] Oct 08 19:51:53 crc kubenswrapper[4750]: I1008 19:51:53.781858 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/707bf9bd-f37a-4af7-b6b0-e661956d4945-config-data\") pod \"octavia-db-sync-gj8s8\" (UID: \"707bf9bd-f37a-4af7-b6b0-e661956d4945\") " pod="openstack/octavia-db-sync-gj8s8" Oct 08 19:51:53 crc kubenswrapper[4750]: I1008 19:51:53.782030 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/707bf9bd-f37a-4af7-b6b0-e661956d4945-scripts\") pod \"octavia-db-sync-gj8s8\" (UID: \"707bf9bd-f37a-4af7-b6b0-e661956d4945\") " pod="openstack/octavia-db-sync-gj8s8" Oct 08 19:51:53 crc kubenswrapper[4750]: I1008 19:51:53.782093 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/707bf9bd-f37a-4af7-b6b0-e661956d4945-config-data-merged\") pod \"octavia-db-sync-gj8s8\" (UID: \"707bf9bd-f37a-4af7-b6b0-e661956d4945\") " pod="openstack/octavia-db-sync-gj8s8" Oct 08 19:51:53 crc kubenswrapper[4750]: I1008 19:51:53.782211 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/707bf9bd-f37a-4af7-b6b0-e661956d4945-combined-ca-bundle\") pod \"octavia-db-sync-gj8s8\" (UID: \"707bf9bd-f37a-4af7-b6b0-e661956d4945\") " pod="openstack/octavia-db-sync-gj8s8" Oct 08 19:51:53 crc kubenswrapper[4750]: I1008 
19:51:53.884420 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/707bf9bd-f37a-4af7-b6b0-e661956d4945-combined-ca-bundle\") pod \"octavia-db-sync-gj8s8\" (UID: \"707bf9bd-f37a-4af7-b6b0-e661956d4945\") " pod="openstack/octavia-db-sync-gj8s8" Oct 08 19:51:53 crc kubenswrapper[4750]: I1008 19:51:53.884629 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/707bf9bd-f37a-4af7-b6b0-e661956d4945-config-data\") pod \"octavia-db-sync-gj8s8\" (UID: \"707bf9bd-f37a-4af7-b6b0-e661956d4945\") " pod="openstack/octavia-db-sync-gj8s8" Oct 08 19:51:53 crc kubenswrapper[4750]: I1008 19:51:53.884703 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/707bf9bd-f37a-4af7-b6b0-e661956d4945-scripts\") pod \"octavia-db-sync-gj8s8\" (UID: \"707bf9bd-f37a-4af7-b6b0-e661956d4945\") " pod="openstack/octavia-db-sync-gj8s8" Oct 08 19:51:53 crc kubenswrapper[4750]: I1008 19:51:53.884735 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/707bf9bd-f37a-4af7-b6b0-e661956d4945-config-data-merged\") pod \"octavia-db-sync-gj8s8\" (UID: \"707bf9bd-f37a-4af7-b6b0-e661956d4945\") " pod="openstack/octavia-db-sync-gj8s8" Oct 08 19:51:53 crc kubenswrapper[4750]: I1008 19:51:53.885479 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/707bf9bd-f37a-4af7-b6b0-e661956d4945-config-data-merged\") pod \"octavia-db-sync-gj8s8\" (UID: \"707bf9bd-f37a-4af7-b6b0-e661956d4945\") " pod="openstack/octavia-db-sync-gj8s8" Oct 08 19:51:53 crc kubenswrapper[4750]: I1008 19:51:53.896027 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/707bf9bd-f37a-4af7-b6b0-e661956d4945-combined-ca-bundle\") pod \"octavia-db-sync-gj8s8\" (UID: \"707bf9bd-f37a-4af7-b6b0-e661956d4945\") " pod="openstack/octavia-db-sync-gj8s8" Oct 08 19:51:53 crc kubenswrapper[4750]: I1008 19:51:53.896115 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/707bf9bd-f37a-4af7-b6b0-e661956d4945-config-data\") pod \"octavia-db-sync-gj8s8\" (UID: \"707bf9bd-f37a-4af7-b6b0-e661956d4945\") " pod="openstack/octavia-db-sync-gj8s8" Oct 08 19:51:53 crc kubenswrapper[4750]: I1008 19:51:53.896137 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/707bf9bd-f37a-4af7-b6b0-e661956d4945-scripts\") pod \"octavia-db-sync-gj8s8\" (UID: \"707bf9bd-f37a-4af7-b6b0-e661956d4945\") " pod="openstack/octavia-db-sync-gj8s8" Oct 08 19:51:53 crc kubenswrapper[4750]: I1008 19:51:53.995840 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-gj8s8" Oct 08 19:51:55 crc kubenswrapper[4750]: I1008 19:51:55.347958 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-gj8s8"] Oct 08 19:51:56 crc kubenswrapper[4750]: I1008 19:51:56.010460 4750 generic.go:334] "Generic (PLEG): container finished" podID="707bf9bd-f37a-4af7-b6b0-e661956d4945" containerID="3cb41901116cdcdba3834086a11522d7fbbcffc8d56127e42dbf38ee4bf50e09" exitCode=0 Oct 08 19:51:56 crc kubenswrapper[4750]: I1008 19:51:56.010525 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-gj8s8" event={"ID":"707bf9bd-f37a-4af7-b6b0-e661956d4945","Type":"ContainerDied","Data":"3cb41901116cdcdba3834086a11522d7fbbcffc8d56127e42dbf38ee4bf50e09"} Oct 08 19:51:56 crc kubenswrapper[4750]: I1008 19:51:56.010964 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-gj8s8" event={"ID":"707bf9bd-f37a-4af7-b6b0-e661956d4945","Type":"ContainerStarted","Data":"728ac757cf1328b1961d435879392db860d31bfc51f29a652508b7ceb34e542f"} Oct 08 19:51:56 crc kubenswrapper[4750]: I1008 19:51:56.021242 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-678599687f-9f4vp" event={"ID":"2bc718e0-7c35-4361-812b-7d67c978e425","Type":"ContainerStarted","Data":"5569600d79869e1e130d4fa1358b611b2354449034484258563af2971dfd9cfb"} Oct 08 19:51:56 crc kubenswrapper[4750]: I1008 19:51:56.025771 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-tctqf" event={"ID":"2f9fbe67-14ed-464f-b087-612a789d7d90","Type":"ContainerStarted","Data":"ba1f0e26f17666f73f62197554c235e78309aa605c6085e6a0f671b9daab2863"} Oct 08 19:51:56 crc kubenswrapper[4750]: I1008 19:51:56.026453 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-rsyslog-tctqf" Oct 08 19:51:56 crc kubenswrapper[4750]: I1008 19:51:56.088506 4750 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/octavia-rsyslog-tctqf" podStartSLOduration=2.658304449 podStartE2EDuration="13.088471407s" podCreationTimestamp="2025-10-08 19:51:43 +0000 UTC" firstStartedPulling="2025-10-08 19:51:44.520929006 +0000 UTC m=+6060.433900029" lastFinishedPulling="2025-10-08 19:51:54.951095984 +0000 UTC m=+6070.864066987" observedRunningTime="2025-10-08 19:51:56.079627107 +0000 UTC m=+6071.992598120" watchObservedRunningTime="2025-10-08 19:51:56.088471407 +0000 UTC m=+6072.001442420" Oct 08 19:51:57 crc kubenswrapper[4750]: I1008 19:51:57.060194 4750 generic.go:334] "Generic (PLEG): container finished" podID="2bc718e0-7c35-4361-812b-7d67c978e425" containerID="5569600d79869e1e130d4fa1358b611b2354449034484258563af2971dfd9cfb" exitCode=0 Oct 08 19:51:57 crc kubenswrapper[4750]: I1008 19:51:57.060301 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-678599687f-9f4vp" event={"ID":"2bc718e0-7c35-4361-812b-7d67c978e425","Type":"ContainerDied","Data":"5569600d79869e1e130d4fa1358b611b2354449034484258563af2971dfd9cfb"} Oct 08 19:52:00 crc kubenswrapper[4750]: I1008 19:52:00.097055 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-gj8s8" event={"ID":"707bf9bd-f37a-4af7-b6b0-e661956d4945","Type":"ContainerStarted","Data":"f784d98fd6552e2a9d4b038c68601cfacf68577ae7977139543ace5cc72c8065"} Oct 08 19:52:00 crc kubenswrapper[4750]: I1008 19:52:00.101909 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-678599687f-9f4vp" event={"ID":"2bc718e0-7c35-4361-812b-7d67c978e425","Type":"ContainerStarted","Data":"64e93936098a337d2b18a4390a5ede48717bd8145b9bc0687538d7ed39ddd322"} Oct 08 19:52:00 crc kubenswrapper[4750]: I1008 19:52:00.135920 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-db-sync-gj8s8" podStartSLOduration=7.135891975 podStartE2EDuration="7.135891975s" podCreationTimestamp="2025-10-08 
19:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:52:00.118596525 +0000 UTC m=+6076.031567588" watchObservedRunningTime="2025-10-08 19:52:00.135891975 +0000 UTC m=+6076.048862988" Oct 08 19:52:00 crc kubenswrapper[4750]: I1008 19:52:00.151339 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-678599687f-9f4vp" podStartSLOduration=6.604315361 podStartE2EDuration="16.151307038s" podCreationTimestamp="2025-10-08 19:51:44 +0000 UTC" firstStartedPulling="2025-10-08 19:51:45.516922563 +0000 UTC m=+6061.429893576" lastFinishedPulling="2025-10-08 19:51:55.06391424 +0000 UTC m=+6070.976885253" observedRunningTime="2025-10-08 19:52:00.135143447 +0000 UTC m=+6076.048114520" watchObservedRunningTime="2025-10-08 19:52:00.151307038 +0000 UTC m=+6076.064278051" Oct 08 19:52:03 crc kubenswrapper[4750]: I1008 19:52:03.141337 4750 generic.go:334] "Generic (PLEG): container finished" podID="707bf9bd-f37a-4af7-b6b0-e661956d4945" containerID="f784d98fd6552e2a9d4b038c68601cfacf68577ae7977139543ace5cc72c8065" exitCode=0 Oct 08 19:52:03 crc kubenswrapper[4750]: I1008 19:52:03.141424 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-gj8s8" event={"ID":"707bf9bd-f37a-4af7-b6b0-e661956d4945","Type":"ContainerDied","Data":"f784d98fd6552e2a9d4b038c68601cfacf68577ae7977139543ace5cc72c8065"} Oct 08 19:52:04 crc kubenswrapper[4750]: I1008 19:52:04.593620 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-gj8s8" Oct 08 19:52:04 crc kubenswrapper[4750]: I1008 19:52:04.652243 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/707bf9bd-f37a-4af7-b6b0-e661956d4945-combined-ca-bundle\") pod \"707bf9bd-f37a-4af7-b6b0-e661956d4945\" (UID: \"707bf9bd-f37a-4af7-b6b0-e661956d4945\") " Oct 08 19:52:04 crc kubenswrapper[4750]: I1008 19:52:04.652632 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/707bf9bd-f37a-4af7-b6b0-e661956d4945-config-data\") pod \"707bf9bd-f37a-4af7-b6b0-e661956d4945\" (UID: \"707bf9bd-f37a-4af7-b6b0-e661956d4945\") " Oct 08 19:52:04 crc kubenswrapper[4750]: I1008 19:52:04.652843 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/707bf9bd-f37a-4af7-b6b0-e661956d4945-scripts\") pod \"707bf9bd-f37a-4af7-b6b0-e661956d4945\" (UID: \"707bf9bd-f37a-4af7-b6b0-e661956d4945\") " Oct 08 19:52:04 crc kubenswrapper[4750]: I1008 19:52:04.652976 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/707bf9bd-f37a-4af7-b6b0-e661956d4945-config-data-merged\") pod \"707bf9bd-f37a-4af7-b6b0-e661956d4945\" (UID: \"707bf9bd-f37a-4af7-b6b0-e661956d4945\") " Oct 08 19:52:04 crc kubenswrapper[4750]: I1008 19:52:04.662011 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/707bf9bd-f37a-4af7-b6b0-e661956d4945-config-data" (OuterVolumeSpecName: "config-data") pod "707bf9bd-f37a-4af7-b6b0-e661956d4945" (UID: "707bf9bd-f37a-4af7-b6b0-e661956d4945"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:52:04 crc kubenswrapper[4750]: I1008 19:52:04.666439 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/707bf9bd-f37a-4af7-b6b0-e661956d4945-scripts" (OuterVolumeSpecName: "scripts") pod "707bf9bd-f37a-4af7-b6b0-e661956d4945" (UID: "707bf9bd-f37a-4af7-b6b0-e661956d4945"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:52:04 crc kubenswrapper[4750]: I1008 19:52:04.688316 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/707bf9bd-f37a-4af7-b6b0-e661956d4945-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "707bf9bd-f37a-4af7-b6b0-e661956d4945" (UID: "707bf9bd-f37a-4af7-b6b0-e661956d4945"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:52:04 crc kubenswrapper[4750]: I1008 19:52:04.693118 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/707bf9bd-f37a-4af7-b6b0-e661956d4945-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "707bf9bd-f37a-4af7-b6b0-e661956d4945" (UID: "707bf9bd-f37a-4af7-b6b0-e661956d4945"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:52:04 crc kubenswrapper[4750]: I1008 19:52:04.757274 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/707bf9bd-f37a-4af7-b6b0-e661956d4945-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 19:52:04 crc kubenswrapper[4750]: I1008 19:52:04.757462 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/707bf9bd-f37a-4af7-b6b0-e661956d4945-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 19:52:04 crc kubenswrapper[4750]: I1008 19:52:04.757660 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/707bf9bd-f37a-4af7-b6b0-e661956d4945-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 19:52:04 crc kubenswrapper[4750]: I1008 19:52:04.757772 4750 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/707bf9bd-f37a-4af7-b6b0-e661956d4945-config-data-merged\") on node \"crc\" DevicePath \"\"" Oct 08 19:52:05 crc kubenswrapper[4750]: I1008 19:52:05.043541 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-gnlng"] Oct 08 19:52:05 crc kubenswrapper[4750]: I1008 19:52:05.053810 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-gnlng"] Oct 08 19:52:05 crc kubenswrapper[4750]: I1008 19:52:05.172892 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-gj8s8" event={"ID":"707bf9bd-f37a-4af7-b6b0-e661956d4945","Type":"ContainerDied","Data":"728ac757cf1328b1961d435879392db860d31bfc51f29a652508b7ceb34e542f"} Oct 08 19:52:05 crc kubenswrapper[4750]: I1008 19:52:05.172942 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="728ac757cf1328b1961d435879392db860d31bfc51f29a652508b7ceb34e542f" Oct 08 19:52:05 crc kubenswrapper[4750]: 
I1008 19:52:05.173022 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-gj8s8" Oct 08 19:52:06 crc kubenswrapper[4750]: I1008 19:52:06.734356 4750 scope.go:117] "RemoveContainer" containerID="fcf13ef2112fe840107d0203594fc62839dd38db4382291c73d378bd35584c76" Oct 08 19:52:06 crc kubenswrapper[4750]: E1008 19:52:06.736312 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:52:06 crc kubenswrapper[4750]: I1008 19:52:06.747284 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bf32b68-0825-4468-b0b5-e4eb2eb7d10f" path="/var/lib/kubelet/pods/4bf32b68-0825-4468-b0b5-e4eb2eb7d10f/volumes" Oct 08 19:52:13 crc kubenswrapper[4750]: I1008 19:52:13.874439 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-rsyslog-tctqf" Oct 08 19:52:15 crc kubenswrapper[4750]: I1008 19:52:15.046379 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-2123-account-create-n546k"] Oct 08 19:52:15 crc kubenswrapper[4750]: I1008 19:52:15.057596 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-2123-account-create-n546k"] Oct 08 19:52:16 crc kubenswrapper[4750]: I1008 19:52:16.759189 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97ab5e96-40ba-4396-99bf-f77ecb9489f2" path="/var/lib/kubelet/pods/97ab5e96-40ba-4396-99bf-f77ecb9489f2/volumes" Oct 08 19:52:17 crc kubenswrapper[4750]: I1008 19:52:17.734787 4750 scope.go:117] "RemoveContainer" 
containerID="fcf13ef2112fe840107d0203594fc62839dd38db4382291c73d378bd35584c76" Oct 08 19:52:17 crc kubenswrapper[4750]: E1008 19:52:17.735644 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:52:22 crc kubenswrapper[4750]: I1008 19:52:22.053889 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-fndk5"] Oct 08 19:52:22 crc kubenswrapper[4750]: I1008 19:52:22.068629 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-fndk5"] Oct 08 19:52:22 crc kubenswrapper[4750]: I1008 19:52:22.749647 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8eae594e-7b85-4ec1-b9ca-5ab025352efa" path="/var/lib/kubelet/pods/8eae594e-7b85-4ec1-b9ca-5ab025352efa/volumes" Oct 08 19:52:29 crc kubenswrapper[4750]: I1008 19:52:29.736767 4750 scope.go:117] "RemoveContainer" containerID="fcf13ef2112fe840107d0203594fc62839dd38db4382291c73d378bd35584c76" Oct 08 19:52:29 crc kubenswrapper[4750]: E1008 19:52:29.740152 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:52:36 crc kubenswrapper[4750]: I1008 19:52:36.693720 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-678599687f-9f4vp"] Oct 08 19:52:36 
crc kubenswrapper[4750]: I1008 19:52:36.694645 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-image-upload-678599687f-9f4vp" podUID="2bc718e0-7c35-4361-812b-7d67c978e425" containerName="octavia-amphora-httpd" containerID="cri-o://64e93936098a337d2b18a4390a5ede48717bd8145b9bc0687538d7ed39ddd322" gracePeriod=30 Oct 08 19:52:37 crc kubenswrapper[4750]: I1008 19:52:37.387003 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-678599687f-9f4vp" Oct 08 19:52:37 crc kubenswrapper[4750]: I1008 19:52:37.452267 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2bc718e0-7c35-4361-812b-7d67c978e425-httpd-config\") pod \"2bc718e0-7c35-4361-812b-7d67c978e425\" (UID: \"2bc718e0-7c35-4361-812b-7d67c978e425\") " Oct 08 19:52:37 crc kubenswrapper[4750]: I1008 19:52:37.452462 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/2bc718e0-7c35-4361-812b-7d67c978e425-amphora-image\") pod \"2bc718e0-7c35-4361-812b-7d67c978e425\" (UID: \"2bc718e0-7c35-4361-812b-7d67c978e425\") " Oct 08 19:52:37 crc kubenswrapper[4750]: I1008 19:52:37.488324 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bc718e0-7c35-4361-812b-7d67c978e425-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "2bc718e0-7c35-4361-812b-7d67c978e425" (UID: "2bc718e0-7c35-4361-812b-7d67c978e425"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:52:37 crc kubenswrapper[4750]: I1008 19:52:37.520999 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bc718e0-7c35-4361-812b-7d67c978e425-amphora-image" (OuterVolumeSpecName: "amphora-image") pod "2bc718e0-7c35-4361-812b-7d67c978e425" (UID: "2bc718e0-7c35-4361-812b-7d67c978e425"). InnerVolumeSpecName "amphora-image". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:52:37 crc kubenswrapper[4750]: I1008 19:52:37.555466 4750 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2bc718e0-7c35-4361-812b-7d67c978e425-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 08 19:52:37 crc kubenswrapper[4750]: I1008 19:52:37.555510 4750 reconciler_common.go:293] "Volume detached for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/2bc718e0-7c35-4361-812b-7d67c978e425-amphora-image\") on node \"crc\" DevicePath \"\"" Oct 08 19:52:37 crc kubenswrapper[4750]: I1008 19:52:37.580617 4750 generic.go:334] "Generic (PLEG): container finished" podID="2bc718e0-7c35-4361-812b-7d67c978e425" containerID="64e93936098a337d2b18a4390a5ede48717bd8145b9bc0687538d7ed39ddd322" exitCode=0 Oct 08 19:52:37 crc kubenswrapper[4750]: I1008 19:52:37.580666 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-678599687f-9f4vp" event={"ID":"2bc718e0-7c35-4361-812b-7d67c978e425","Type":"ContainerDied","Data":"64e93936098a337d2b18a4390a5ede48717bd8145b9bc0687538d7ed39ddd322"} Oct 08 19:52:37 crc kubenswrapper[4750]: I1008 19:52:37.580707 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-678599687f-9f4vp" event={"ID":"2bc718e0-7c35-4361-812b-7d67c978e425","Type":"ContainerDied","Data":"dc70b13bb033943971f37a47b9b234109ba80948816e5bae2270ed4f9be146c1"} Oct 08 19:52:37 crc kubenswrapper[4750]: I1008 19:52:37.580735 4750 scope.go:117] 
"RemoveContainer" containerID="64e93936098a337d2b18a4390a5ede48717bd8145b9bc0687538d7ed39ddd322" Oct 08 19:52:37 crc kubenswrapper[4750]: I1008 19:52:37.580766 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-678599687f-9f4vp" Oct 08 19:52:37 crc kubenswrapper[4750]: I1008 19:52:37.622047 4750 scope.go:117] "RemoveContainer" containerID="5569600d79869e1e130d4fa1358b611b2354449034484258563af2971dfd9cfb" Oct 08 19:52:37 crc kubenswrapper[4750]: I1008 19:52:37.623380 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-678599687f-9f4vp"] Oct 08 19:52:37 crc kubenswrapper[4750]: I1008 19:52:37.636730 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-image-upload-678599687f-9f4vp"] Oct 08 19:52:37 crc kubenswrapper[4750]: I1008 19:52:37.662903 4750 scope.go:117] "RemoveContainer" containerID="64e93936098a337d2b18a4390a5ede48717bd8145b9bc0687538d7ed39ddd322" Oct 08 19:52:37 crc kubenswrapper[4750]: E1008 19:52:37.663930 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64e93936098a337d2b18a4390a5ede48717bd8145b9bc0687538d7ed39ddd322\": container with ID starting with 64e93936098a337d2b18a4390a5ede48717bd8145b9bc0687538d7ed39ddd322 not found: ID does not exist" containerID="64e93936098a337d2b18a4390a5ede48717bd8145b9bc0687538d7ed39ddd322" Oct 08 19:52:37 crc kubenswrapper[4750]: I1008 19:52:37.663970 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64e93936098a337d2b18a4390a5ede48717bd8145b9bc0687538d7ed39ddd322"} err="failed to get container status \"64e93936098a337d2b18a4390a5ede48717bd8145b9bc0687538d7ed39ddd322\": rpc error: code = NotFound desc = could not find container \"64e93936098a337d2b18a4390a5ede48717bd8145b9bc0687538d7ed39ddd322\": container with ID starting with 
64e93936098a337d2b18a4390a5ede48717bd8145b9bc0687538d7ed39ddd322 not found: ID does not exist" Oct 08 19:52:37 crc kubenswrapper[4750]: I1008 19:52:37.664002 4750 scope.go:117] "RemoveContainer" containerID="5569600d79869e1e130d4fa1358b611b2354449034484258563af2971dfd9cfb" Oct 08 19:52:37 crc kubenswrapper[4750]: E1008 19:52:37.665242 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5569600d79869e1e130d4fa1358b611b2354449034484258563af2971dfd9cfb\": container with ID starting with 5569600d79869e1e130d4fa1358b611b2354449034484258563af2971dfd9cfb not found: ID does not exist" containerID="5569600d79869e1e130d4fa1358b611b2354449034484258563af2971dfd9cfb" Oct 08 19:52:37 crc kubenswrapper[4750]: I1008 19:52:37.665312 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5569600d79869e1e130d4fa1358b611b2354449034484258563af2971dfd9cfb"} err="failed to get container status \"5569600d79869e1e130d4fa1358b611b2354449034484258563af2971dfd9cfb\": rpc error: code = NotFound desc = could not find container \"5569600d79869e1e130d4fa1358b611b2354449034484258563af2971dfd9cfb\": container with ID starting with 5569600d79869e1e130d4fa1358b611b2354449034484258563af2971dfd9cfb not found: ID does not exist" Oct 08 19:52:38 crc kubenswrapper[4750]: I1008 19:52:38.751511 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bc718e0-7c35-4361-812b-7d67c978e425" path="/var/lib/kubelet/pods/2bc718e0-7c35-4361-812b-7d67c978e425/volumes" Oct 08 19:52:44 crc kubenswrapper[4750]: I1008 19:52:44.734096 4750 scope.go:117] "RemoveContainer" containerID="fcf13ef2112fe840107d0203594fc62839dd38db4382291c73d378bd35584c76" Oct 08 19:52:44 crc kubenswrapper[4750]: E1008 19:52:44.735130 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:52:46 crc kubenswrapper[4750]: I1008 19:52:46.139992 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-678599687f-7ktf9"] Oct 08 19:52:46 crc kubenswrapper[4750]: E1008 19:52:46.140966 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bc718e0-7c35-4361-812b-7d67c978e425" containerName="octavia-amphora-httpd" Oct 08 19:52:46 crc kubenswrapper[4750]: I1008 19:52:46.140987 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bc718e0-7c35-4361-812b-7d67c978e425" containerName="octavia-amphora-httpd" Oct 08 19:52:46 crc kubenswrapper[4750]: E1008 19:52:46.141008 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="707bf9bd-f37a-4af7-b6b0-e661956d4945" containerName="init" Oct 08 19:52:46 crc kubenswrapper[4750]: I1008 19:52:46.141017 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="707bf9bd-f37a-4af7-b6b0-e661956d4945" containerName="init" Oct 08 19:52:46 crc kubenswrapper[4750]: E1008 19:52:46.141029 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="707bf9bd-f37a-4af7-b6b0-e661956d4945" containerName="octavia-db-sync" Oct 08 19:52:46 crc kubenswrapper[4750]: I1008 19:52:46.141037 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="707bf9bd-f37a-4af7-b6b0-e661956d4945" containerName="octavia-db-sync" Oct 08 19:52:46 crc kubenswrapper[4750]: E1008 19:52:46.141068 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bc718e0-7c35-4361-812b-7d67c978e425" containerName="init" Oct 08 19:52:46 crc kubenswrapper[4750]: I1008 19:52:46.141075 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bc718e0-7c35-4361-812b-7d67c978e425" containerName="init" Oct 08 
19:52:46 crc kubenswrapper[4750]: I1008 19:52:46.141886 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="707bf9bd-f37a-4af7-b6b0-e661956d4945" containerName="octavia-db-sync" Oct 08 19:52:46 crc kubenswrapper[4750]: I1008 19:52:46.141913 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bc718e0-7c35-4361-812b-7d67c978e425" containerName="octavia-amphora-httpd" Oct 08 19:52:46 crc kubenswrapper[4750]: I1008 19:52:46.143348 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-678599687f-7ktf9" Oct 08 19:52:46 crc kubenswrapper[4750]: I1008 19:52:46.149408 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Oct 08 19:52:46 crc kubenswrapper[4750]: I1008 19:52:46.162276 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-678599687f-7ktf9"] Oct 08 19:52:46 crc kubenswrapper[4750]: I1008 19:52:46.263433 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fd290ec9-232e-4879-a525-01c6bfb72bc8-httpd-config\") pod \"octavia-image-upload-678599687f-7ktf9\" (UID: \"fd290ec9-232e-4879-a525-01c6bfb72bc8\") " pod="openstack/octavia-image-upload-678599687f-7ktf9" Oct 08 19:52:46 crc kubenswrapper[4750]: I1008 19:52:46.263935 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/fd290ec9-232e-4879-a525-01c6bfb72bc8-amphora-image\") pod \"octavia-image-upload-678599687f-7ktf9\" (UID: \"fd290ec9-232e-4879-a525-01c6bfb72bc8\") " pod="openstack/octavia-image-upload-678599687f-7ktf9" Oct 08 19:52:46 crc kubenswrapper[4750]: I1008 19:52:46.366923 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/fd290ec9-232e-4879-a525-01c6bfb72bc8-httpd-config\") pod \"octavia-image-upload-678599687f-7ktf9\" (UID: \"fd290ec9-232e-4879-a525-01c6bfb72bc8\") " pod="openstack/octavia-image-upload-678599687f-7ktf9" Oct 08 19:52:46 crc kubenswrapper[4750]: I1008 19:52:46.367172 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/fd290ec9-232e-4879-a525-01c6bfb72bc8-amphora-image\") pod \"octavia-image-upload-678599687f-7ktf9\" (UID: \"fd290ec9-232e-4879-a525-01c6bfb72bc8\") " pod="openstack/octavia-image-upload-678599687f-7ktf9" Oct 08 19:52:46 crc kubenswrapper[4750]: I1008 19:52:46.368090 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/fd290ec9-232e-4879-a525-01c6bfb72bc8-amphora-image\") pod \"octavia-image-upload-678599687f-7ktf9\" (UID: \"fd290ec9-232e-4879-a525-01c6bfb72bc8\") " pod="openstack/octavia-image-upload-678599687f-7ktf9" Oct 08 19:52:46 crc kubenswrapper[4750]: I1008 19:52:46.375663 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fd290ec9-232e-4879-a525-01c6bfb72bc8-httpd-config\") pod \"octavia-image-upload-678599687f-7ktf9\" (UID: \"fd290ec9-232e-4879-a525-01c6bfb72bc8\") " pod="openstack/octavia-image-upload-678599687f-7ktf9" Oct 08 19:52:46 crc kubenswrapper[4750]: I1008 19:52:46.470061 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-678599687f-7ktf9" Oct 08 19:52:46 crc kubenswrapper[4750]: I1008 19:52:46.974857 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-678599687f-7ktf9"] Oct 08 19:52:47 crc kubenswrapper[4750]: I1008 19:52:47.701493 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-678599687f-7ktf9" event={"ID":"fd290ec9-232e-4879-a525-01c6bfb72bc8","Type":"ContainerStarted","Data":"8871c728dcfe6d8712196a54c2b3d135b3ce6d9e392155053ea7cfd33c186101"} Oct 08 19:52:48 crc kubenswrapper[4750]: I1008 19:52:48.716285 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-678599687f-7ktf9" event={"ID":"fd290ec9-232e-4879-a525-01c6bfb72bc8","Type":"ContainerStarted","Data":"0febf73fb9e96db1aacb4a4a1a263aa21ef0394f2e72093d537705eb912e3b86"} Oct 08 19:52:51 crc kubenswrapper[4750]: I1008 19:52:51.527493 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-healthmanager-4qbrc"] Oct 08 19:52:51 crc kubenswrapper[4750]: I1008 19:52:51.530400 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-healthmanager-4qbrc" Oct 08 19:52:51 crc kubenswrapper[4750]: I1008 19:52:51.534441 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-certs-secret" Oct 08 19:52:51 crc kubenswrapper[4750]: I1008 19:52:51.534478 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-scripts" Oct 08 19:52:51 crc kubenswrapper[4750]: I1008 19:52:51.534792 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-config-data" Oct 08 19:52:51 crc kubenswrapper[4750]: I1008 19:52:51.546934 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-4qbrc"] Oct 08 19:52:51 crc kubenswrapper[4750]: I1008 19:52:51.591153 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5222986-32a6-4c3e-97d2-7037c64a08dc-scripts\") pod \"octavia-healthmanager-4qbrc\" (UID: \"b5222986-32a6-4c3e-97d2-7037c64a08dc\") " pod="openstack/octavia-healthmanager-4qbrc" Oct 08 19:52:51 crc kubenswrapper[4750]: I1008 19:52:51.591200 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/b5222986-32a6-4c3e-97d2-7037c64a08dc-config-data-merged\") pod \"octavia-healthmanager-4qbrc\" (UID: \"b5222986-32a6-4c3e-97d2-7037c64a08dc\") " pod="openstack/octavia-healthmanager-4qbrc" Oct 08 19:52:51 crc kubenswrapper[4750]: I1008 19:52:51.591400 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/b5222986-32a6-4c3e-97d2-7037c64a08dc-amphora-certs\") pod \"octavia-healthmanager-4qbrc\" (UID: \"b5222986-32a6-4c3e-97d2-7037c64a08dc\") " pod="openstack/octavia-healthmanager-4qbrc" Oct 08 19:52:51 crc kubenswrapper[4750]: I1008 
19:52:51.591574 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5222986-32a6-4c3e-97d2-7037c64a08dc-combined-ca-bundle\") pod \"octavia-healthmanager-4qbrc\" (UID: \"b5222986-32a6-4c3e-97d2-7037c64a08dc\") " pod="openstack/octavia-healthmanager-4qbrc" Oct 08 19:52:51 crc kubenswrapper[4750]: I1008 19:52:51.591604 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5222986-32a6-4c3e-97d2-7037c64a08dc-config-data\") pod \"octavia-healthmanager-4qbrc\" (UID: \"b5222986-32a6-4c3e-97d2-7037c64a08dc\") " pod="openstack/octavia-healthmanager-4qbrc" Oct 08 19:52:51 crc kubenswrapper[4750]: I1008 19:52:51.591669 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/b5222986-32a6-4c3e-97d2-7037c64a08dc-hm-ports\") pod \"octavia-healthmanager-4qbrc\" (UID: \"b5222986-32a6-4c3e-97d2-7037c64a08dc\") " pod="openstack/octavia-healthmanager-4qbrc" Oct 08 19:52:51 crc kubenswrapper[4750]: I1008 19:52:51.693989 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5222986-32a6-4c3e-97d2-7037c64a08dc-combined-ca-bundle\") pod \"octavia-healthmanager-4qbrc\" (UID: \"b5222986-32a6-4c3e-97d2-7037c64a08dc\") " pod="openstack/octavia-healthmanager-4qbrc" Oct 08 19:52:51 crc kubenswrapper[4750]: I1008 19:52:51.694075 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5222986-32a6-4c3e-97d2-7037c64a08dc-config-data\") pod \"octavia-healthmanager-4qbrc\" (UID: \"b5222986-32a6-4c3e-97d2-7037c64a08dc\") " pod="openstack/octavia-healthmanager-4qbrc" Oct 08 19:52:51 crc kubenswrapper[4750]: I1008 19:52:51.694164 4750 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/b5222986-32a6-4c3e-97d2-7037c64a08dc-hm-ports\") pod \"octavia-healthmanager-4qbrc\" (UID: \"b5222986-32a6-4c3e-97d2-7037c64a08dc\") " pod="openstack/octavia-healthmanager-4qbrc" Oct 08 19:52:51 crc kubenswrapper[4750]: I1008 19:52:51.694244 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5222986-32a6-4c3e-97d2-7037c64a08dc-scripts\") pod \"octavia-healthmanager-4qbrc\" (UID: \"b5222986-32a6-4c3e-97d2-7037c64a08dc\") " pod="openstack/octavia-healthmanager-4qbrc" Oct 08 19:52:51 crc kubenswrapper[4750]: I1008 19:52:51.694275 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/b5222986-32a6-4c3e-97d2-7037c64a08dc-config-data-merged\") pod \"octavia-healthmanager-4qbrc\" (UID: \"b5222986-32a6-4c3e-97d2-7037c64a08dc\") " pod="openstack/octavia-healthmanager-4qbrc" Oct 08 19:52:51 crc kubenswrapper[4750]: I1008 19:52:51.694345 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/b5222986-32a6-4c3e-97d2-7037c64a08dc-amphora-certs\") pod \"octavia-healthmanager-4qbrc\" (UID: \"b5222986-32a6-4c3e-97d2-7037c64a08dc\") " pod="openstack/octavia-healthmanager-4qbrc" Oct 08 19:52:51 crc kubenswrapper[4750]: I1008 19:52:51.696579 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/b5222986-32a6-4c3e-97d2-7037c64a08dc-config-data-merged\") pod \"octavia-healthmanager-4qbrc\" (UID: \"b5222986-32a6-4c3e-97d2-7037c64a08dc\") " pod="openstack/octavia-healthmanager-4qbrc" Oct 08 19:52:51 crc kubenswrapper[4750]: I1008 19:52:51.696737 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" 
(UniqueName: \"kubernetes.io/configmap/b5222986-32a6-4c3e-97d2-7037c64a08dc-hm-ports\") pod \"octavia-healthmanager-4qbrc\" (UID: \"b5222986-32a6-4c3e-97d2-7037c64a08dc\") " pod="openstack/octavia-healthmanager-4qbrc" Oct 08 19:52:51 crc kubenswrapper[4750]: I1008 19:52:51.704611 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5222986-32a6-4c3e-97d2-7037c64a08dc-scripts\") pod \"octavia-healthmanager-4qbrc\" (UID: \"b5222986-32a6-4c3e-97d2-7037c64a08dc\") " pod="openstack/octavia-healthmanager-4qbrc" Oct 08 19:52:51 crc kubenswrapper[4750]: I1008 19:52:51.705275 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5222986-32a6-4c3e-97d2-7037c64a08dc-combined-ca-bundle\") pod \"octavia-healthmanager-4qbrc\" (UID: \"b5222986-32a6-4c3e-97d2-7037c64a08dc\") " pod="openstack/octavia-healthmanager-4qbrc" Oct 08 19:52:51 crc kubenswrapper[4750]: I1008 19:52:51.710508 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5222986-32a6-4c3e-97d2-7037c64a08dc-config-data\") pod \"octavia-healthmanager-4qbrc\" (UID: \"b5222986-32a6-4c3e-97d2-7037c64a08dc\") " pod="openstack/octavia-healthmanager-4qbrc" Oct 08 19:52:51 crc kubenswrapper[4750]: I1008 19:52:51.714254 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/b5222986-32a6-4c3e-97d2-7037c64a08dc-amphora-certs\") pod \"octavia-healthmanager-4qbrc\" (UID: \"b5222986-32a6-4c3e-97d2-7037c64a08dc\") " pod="openstack/octavia-healthmanager-4qbrc" Oct 08 19:52:51 crc kubenswrapper[4750]: I1008 19:52:51.750185 4750 generic.go:334] "Generic (PLEG): container finished" podID="fd290ec9-232e-4879-a525-01c6bfb72bc8" containerID="0febf73fb9e96db1aacb4a4a1a263aa21ef0394f2e72093d537705eb912e3b86" exitCode=0 Oct 08 19:52:51 crc 
kubenswrapper[4750]: I1008 19:52:51.750246 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-678599687f-7ktf9" event={"ID":"fd290ec9-232e-4879-a525-01c6bfb72bc8","Type":"ContainerDied","Data":"0febf73fb9e96db1aacb4a4a1a263aa21ef0394f2e72093d537705eb912e3b86"} Oct 08 19:52:51 crc kubenswrapper[4750]: I1008 19:52:51.855217 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-4qbrc" Oct 08 19:52:52 crc kubenswrapper[4750]: I1008 19:52:52.494677 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-4qbrc"] Oct 08 19:52:52 crc kubenswrapper[4750]: I1008 19:52:52.762802 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-678599687f-7ktf9" event={"ID":"fd290ec9-232e-4879-a525-01c6bfb72bc8","Type":"ContainerStarted","Data":"63746ade7011bbb4be63b9350b9c80e404428bf2f44a7a561e24ea38321caa8a"} Oct 08 19:52:52 crc kubenswrapper[4750]: I1008 19:52:52.763901 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-4qbrc" event={"ID":"b5222986-32a6-4c3e-97d2-7037c64a08dc","Type":"ContainerStarted","Data":"acff75e562fd6ffdd940e03e225962bc076c5901be51e43c844134195ec4085a"} Oct 08 19:52:52 crc kubenswrapper[4750]: I1008 19:52:52.789806 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-678599687f-7ktf9" podStartSLOduration=6.246726391 podStartE2EDuration="6.789780356s" podCreationTimestamp="2025-10-08 19:52:46 +0000 UTC" firstStartedPulling="2025-10-08 19:52:46.986301811 +0000 UTC m=+6122.899272834" lastFinishedPulling="2025-10-08 19:52:47.529355786 +0000 UTC m=+6123.442326799" observedRunningTime="2025-10-08 19:52:52.7862747 +0000 UTC m=+6128.699245713" watchObservedRunningTime="2025-10-08 19:52:52.789780356 +0000 UTC m=+6128.702751379" Oct 08 19:52:54 crc kubenswrapper[4750]: I1008 19:52:54.047129 4750 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-rc4g4"] Oct 08 19:52:54 crc kubenswrapper[4750]: I1008 19:52:54.073500 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-rc4g4"] Oct 08 19:52:54 crc kubenswrapper[4750]: I1008 19:52:54.126256 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-housekeeping-ksff9"] Oct 08 19:52:54 crc kubenswrapper[4750]: I1008 19:52:54.128578 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-housekeeping-ksff9" Oct 08 19:52:54 crc kubenswrapper[4750]: I1008 19:52:54.131483 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-config-data" Oct 08 19:52:54 crc kubenswrapper[4750]: I1008 19:52:54.131726 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-scripts" Oct 08 19:52:54 crc kubenswrapper[4750]: I1008 19:52:54.160991 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-ksff9"] Oct 08 19:52:54 crc kubenswrapper[4750]: I1008 19:52:54.257562 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/56ea204e-1ef4-4393-890d-e772748890b3-config-data-merged\") pod \"octavia-housekeeping-ksff9\" (UID: \"56ea204e-1ef4-4393-890d-e772748890b3\") " pod="openstack/octavia-housekeeping-ksff9" Oct 08 19:52:54 crc kubenswrapper[4750]: I1008 19:52:54.257638 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56ea204e-1ef4-4393-890d-e772748890b3-combined-ca-bundle\") pod \"octavia-housekeeping-ksff9\" (UID: \"56ea204e-1ef4-4393-890d-e772748890b3\") " pod="openstack/octavia-housekeeping-ksff9" Oct 08 19:52:54 crc kubenswrapper[4750]: I1008 19:52:54.257732 4750 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/56ea204e-1ef4-4393-890d-e772748890b3-hm-ports\") pod \"octavia-housekeeping-ksff9\" (UID: \"56ea204e-1ef4-4393-890d-e772748890b3\") " pod="openstack/octavia-housekeeping-ksff9" Oct 08 19:52:54 crc kubenswrapper[4750]: I1008 19:52:54.257761 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/56ea204e-1ef4-4393-890d-e772748890b3-amphora-certs\") pod \"octavia-housekeeping-ksff9\" (UID: \"56ea204e-1ef4-4393-890d-e772748890b3\") " pod="openstack/octavia-housekeeping-ksff9" Oct 08 19:52:54 crc kubenswrapper[4750]: I1008 19:52:54.257782 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56ea204e-1ef4-4393-890d-e772748890b3-config-data\") pod \"octavia-housekeeping-ksff9\" (UID: \"56ea204e-1ef4-4393-890d-e772748890b3\") " pod="openstack/octavia-housekeeping-ksff9" Oct 08 19:52:54 crc kubenswrapper[4750]: I1008 19:52:54.257811 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56ea204e-1ef4-4393-890d-e772748890b3-scripts\") pod \"octavia-housekeeping-ksff9\" (UID: \"56ea204e-1ef4-4393-890d-e772748890b3\") " pod="openstack/octavia-housekeeping-ksff9" Oct 08 19:52:54 crc kubenswrapper[4750]: I1008 19:52:54.277444 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rhzfj"] Oct 08 19:52:54 crc kubenswrapper[4750]: I1008 19:52:54.287928 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rhzfj" Oct 08 19:52:54 crc kubenswrapper[4750]: I1008 19:52:54.292180 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rhzfj"] Oct 08 19:52:54 crc kubenswrapper[4750]: I1008 19:52:54.360135 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/56ea204e-1ef4-4393-890d-e772748890b3-hm-ports\") pod \"octavia-housekeeping-ksff9\" (UID: \"56ea204e-1ef4-4393-890d-e772748890b3\") " pod="openstack/octavia-housekeeping-ksff9" Oct 08 19:52:54 crc kubenswrapper[4750]: I1008 19:52:54.360196 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/56ea204e-1ef4-4393-890d-e772748890b3-amphora-certs\") pod \"octavia-housekeeping-ksff9\" (UID: \"56ea204e-1ef4-4393-890d-e772748890b3\") " pod="openstack/octavia-housekeeping-ksff9" Oct 08 19:52:54 crc kubenswrapper[4750]: I1008 19:52:54.360221 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56ea204e-1ef4-4393-890d-e772748890b3-config-data\") pod \"octavia-housekeeping-ksff9\" (UID: \"56ea204e-1ef4-4393-890d-e772748890b3\") " pod="openstack/octavia-housekeeping-ksff9" Oct 08 19:52:54 crc kubenswrapper[4750]: I1008 19:52:54.360249 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56ea204e-1ef4-4393-890d-e772748890b3-scripts\") pod \"octavia-housekeeping-ksff9\" (UID: \"56ea204e-1ef4-4393-890d-e772748890b3\") " pod="openstack/octavia-housekeeping-ksff9" Oct 08 19:52:54 crc kubenswrapper[4750]: I1008 19:52:54.360295 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/45a041c6-9181-488d-bc14-d46ec233f131-catalog-content\") pod \"certified-operators-rhzfj\" (UID: \"45a041c6-9181-488d-bc14-d46ec233f131\") " pod="openshift-marketplace/certified-operators-rhzfj" Oct 08 19:52:54 crc kubenswrapper[4750]: I1008 19:52:54.360326 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45a041c6-9181-488d-bc14-d46ec233f131-utilities\") pod \"certified-operators-rhzfj\" (UID: \"45a041c6-9181-488d-bc14-d46ec233f131\") " pod="openshift-marketplace/certified-operators-rhzfj" Oct 08 19:52:54 crc kubenswrapper[4750]: I1008 19:52:54.360401 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8c89\" (UniqueName: \"kubernetes.io/projected/45a041c6-9181-488d-bc14-d46ec233f131-kube-api-access-t8c89\") pod \"certified-operators-rhzfj\" (UID: \"45a041c6-9181-488d-bc14-d46ec233f131\") " pod="openshift-marketplace/certified-operators-rhzfj" Oct 08 19:52:54 crc kubenswrapper[4750]: I1008 19:52:54.360428 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/56ea204e-1ef4-4393-890d-e772748890b3-config-data-merged\") pod \"octavia-housekeeping-ksff9\" (UID: \"56ea204e-1ef4-4393-890d-e772748890b3\") " pod="openstack/octavia-housekeeping-ksff9" Oct 08 19:52:54 crc kubenswrapper[4750]: I1008 19:52:54.360456 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56ea204e-1ef4-4393-890d-e772748890b3-combined-ca-bundle\") pod \"octavia-housekeeping-ksff9\" (UID: \"56ea204e-1ef4-4393-890d-e772748890b3\") " pod="openstack/octavia-housekeeping-ksff9" Oct 08 19:52:54 crc kubenswrapper[4750]: I1008 19:52:54.362266 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" 
(UniqueName: \"kubernetes.io/empty-dir/56ea204e-1ef4-4393-890d-e772748890b3-config-data-merged\") pod \"octavia-housekeeping-ksff9\" (UID: \"56ea204e-1ef4-4393-890d-e772748890b3\") " pod="openstack/octavia-housekeeping-ksff9" Oct 08 19:52:54 crc kubenswrapper[4750]: I1008 19:52:54.362997 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/56ea204e-1ef4-4393-890d-e772748890b3-hm-ports\") pod \"octavia-housekeeping-ksff9\" (UID: \"56ea204e-1ef4-4393-890d-e772748890b3\") " pod="openstack/octavia-housekeeping-ksff9" Oct 08 19:52:54 crc kubenswrapper[4750]: I1008 19:52:54.369639 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56ea204e-1ef4-4393-890d-e772748890b3-scripts\") pod \"octavia-housekeeping-ksff9\" (UID: \"56ea204e-1ef4-4393-890d-e772748890b3\") " pod="openstack/octavia-housekeeping-ksff9" Oct 08 19:52:54 crc kubenswrapper[4750]: I1008 19:52:54.370518 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56ea204e-1ef4-4393-890d-e772748890b3-config-data\") pod \"octavia-housekeeping-ksff9\" (UID: \"56ea204e-1ef4-4393-890d-e772748890b3\") " pod="openstack/octavia-housekeeping-ksff9" Oct 08 19:52:54 crc kubenswrapper[4750]: I1008 19:52:54.371570 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/56ea204e-1ef4-4393-890d-e772748890b3-amphora-certs\") pod \"octavia-housekeeping-ksff9\" (UID: \"56ea204e-1ef4-4393-890d-e772748890b3\") " pod="openstack/octavia-housekeeping-ksff9" Oct 08 19:52:54 crc kubenswrapper[4750]: I1008 19:52:54.389125 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56ea204e-1ef4-4393-890d-e772748890b3-combined-ca-bundle\") pod \"octavia-housekeeping-ksff9\" (UID: 
\"56ea204e-1ef4-4393-890d-e772748890b3\") " pod="openstack/octavia-housekeeping-ksff9" Oct 08 19:52:54 crc kubenswrapper[4750]: I1008 19:52:54.462074 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8c89\" (UniqueName: \"kubernetes.io/projected/45a041c6-9181-488d-bc14-d46ec233f131-kube-api-access-t8c89\") pod \"certified-operators-rhzfj\" (UID: \"45a041c6-9181-488d-bc14-d46ec233f131\") " pod="openshift-marketplace/certified-operators-rhzfj" Oct 08 19:52:54 crc kubenswrapper[4750]: I1008 19:52:54.462228 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45a041c6-9181-488d-bc14-d46ec233f131-catalog-content\") pod \"certified-operators-rhzfj\" (UID: \"45a041c6-9181-488d-bc14-d46ec233f131\") " pod="openshift-marketplace/certified-operators-rhzfj" Oct 08 19:52:54 crc kubenswrapper[4750]: I1008 19:52:54.462261 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45a041c6-9181-488d-bc14-d46ec233f131-utilities\") pod \"certified-operators-rhzfj\" (UID: \"45a041c6-9181-488d-bc14-d46ec233f131\") " pod="openshift-marketplace/certified-operators-rhzfj" Oct 08 19:52:54 crc kubenswrapper[4750]: I1008 19:52:54.462813 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45a041c6-9181-488d-bc14-d46ec233f131-catalog-content\") pod \"certified-operators-rhzfj\" (UID: \"45a041c6-9181-488d-bc14-d46ec233f131\") " pod="openshift-marketplace/certified-operators-rhzfj" Oct 08 19:52:54 crc kubenswrapper[4750]: I1008 19:52:54.462833 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45a041c6-9181-488d-bc14-d46ec233f131-utilities\") pod \"certified-operators-rhzfj\" (UID: \"45a041c6-9181-488d-bc14-d46ec233f131\") " 
pod="openshift-marketplace/certified-operators-rhzfj" Oct 08 19:52:54 crc kubenswrapper[4750]: I1008 19:52:54.464314 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-housekeeping-ksff9" Oct 08 19:52:54 crc kubenswrapper[4750]: I1008 19:52:54.484882 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8c89\" (UniqueName: \"kubernetes.io/projected/45a041c6-9181-488d-bc14-d46ec233f131-kube-api-access-t8c89\") pod \"certified-operators-rhzfj\" (UID: \"45a041c6-9181-488d-bc14-d46ec233f131\") " pod="openshift-marketplace/certified-operators-rhzfj" Oct 08 19:52:54 crc kubenswrapper[4750]: I1008 19:52:54.616137 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rhzfj" Oct 08 19:52:54 crc kubenswrapper[4750]: I1008 19:52:54.812019 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2a86550-1abc-4058-98fb-aca6386d130b" path="/var/lib/kubelet/pods/c2a86550-1abc-4058-98fb-aca6386d130b/volumes" Oct 08 19:52:54 crc kubenswrapper[4750]: I1008 19:52:54.843854 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-4qbrc" event={"ID":"b5222986-32a6-4c3e-97d2-7037c64a08dc","Type":"ContainerStarted","Data":"dd1f10fd0e993210ce68691e80b8be4ee642de00c2dc88c9147dbc3d96897fc8"} Oct 08 19:52:55 crc kubenswrapper[4750]: I1008 19:52:55.218284 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-ksff9"] Oct 08 19:52:55 crc kubenswrapper[4750]: I1008 19:52:55.271319 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rhzfj"] Oct 08 19:52:55 crc kubenswrapper[4750]: I1008 19:52:55.857918 4750 generic.go:334] "Generic (PLEG): container finished" podID="45a041c6-9181-488d-bc14-d46ec233f131" containerID="380a1f9534c479dded229bfc0f0ac65d027d33e64c25f8af9fe9bc1c1e4eb512" exitCode=0 Oct 08 
19:52:55 crc kubenswrapper[4750]: I1008 19:52:55.858044 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rhzfj" event={"ID":"45a041c6-9181-488d-bc14-d46ec233f131","Type":"ContainerDied","Data":"380a1f9534c479dded229bfc0f0ac65d027d33e64c25f8af9fe9bc1c1e4eb512"} Oct 08 19:52:55 crc kubenswrapper[4750]: I1008 19:52:55.858318 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rhzfj" event={"ID":"45a041c6-9181-488d-bc14-d46ec233f131","Type":"ContainerStarted","Data":"0c7534d59c668e62567eccfdaef9c10cd2a33f42ce3a6a5b47f7dc2d75cbacb6"} Oct 08 19:52:55 crc kubenswrapper[4750]: I1008 19:52:55.861679 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-ksff9" event={"ID":"56ea204e-1ef4-4393-890d-e772748890b3","Type":"ContainerStarted","Data":"fde8df392f22d64fba9a965a112b1df6a472084746ec361e80174384c1000c19"} Oct 08 19:52:56 crc kubenswrapper[4750]: I1008 19:52:56.182448 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-worker-5flzb"] Oct 08 19:52:56 crc kubenswrapper[4750]: I1008 19:52:56.184802 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-worker-5flzb" Oct 08 19:52:56 crc kubenswrapper[4750]: I1008 19:52:56.188107 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-scripts" Oct 08 19:52:56 crc kubenswrapper[4750]: I1008 19:52:56.190534 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-config-data" Oct 08 19:52:56 crc kubenswrapper[4750]: I1008 19:52:56.198047 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-5flzb"] Oct 08 19:52:56 crc kubenswrapper[4750]: I1008 19:52:56.316371 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/8287d9ba-be09-4fc6-8d0d-c2ba26de1279-config-data-merged\") pod \"octavia-worker-5flzb\" (UID: \"8287d9ba-be09-4fc6-8d0d-c2ba26de1279\") " pod="openstack/octavia-worker-5flzb" Oct 08 19:52:56 crc kubenswrapper[4750]: I1008 19:52:56.316444 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8287d9ba-be09-4fc6-8d0d-c2ba26de1279-config-data\") pod \"octavia-worker-5flzb\" (UID: \"8287d9ba-be09-4fc6-8d0d-c2ba26de1279\") " pod="openstack/octavia-worker-5flzb" Oct 08 19:52:56 crc kubenswrapper[4750]: I1008 19:52:56.316626 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8287d9ba-be09-4fc6-8d0d-c2ba26de1279-scripts\") pod \"octavia-worker-5flzb\" (UID: \"8287d9ba-be09-4fc6-8d0d-c2ba26de1279\") " pod="openstack/octavia-worker-5flzb" Oct 08 19:52:56 crc kubenswrapper[4750]: I1008 19:52:56.316656 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/8287d9ba-be09-4fc6-8d0d-c2ba26de1279-hm-ports\") pod 
\"octavia-worker-5flzb\" (UID: \"8287d9ba-be09-4fc6-8d0d-c2ba26de1279\") " pod="openstack/octavia-worker-5flzb" Oct 08 19:52:56 crc kubenswrapper[4750]: I1008 19:52:56.316776 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/8287d9ba-be09-4fc6-8d0d-c2ba26de1279-amphora-certs\") pod \"octavia-worker-5flzb\" (UID: \"8287d9ba-be09-4fc6-8d0d-c2ba26de1279\") " pod="openstack/octavia-worker-5flzb" Oct 08 19:52:56 crc kubenswrapper[4750]: I1008 19:52:56.316818 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8287d9ba-be09-4fc6-8d0d-c2ba26de1279-combined-ca-bundle\") pod \"octavia-worker-5flzb\" (UID: \"8287d9ba-be09-4fc6-8d0d-c2ba26de1279\") " pod="openstack/octavia-worker-5flzb" Oct 08 19:52:56 crc kubenswrapper[4750]: I1008 19:52:56.419142 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8287d9ba-be09-4fc6-8d0d-c2ba26de1279-scripts\") pod \"octavia-worker-5flzb\" (UID: \"8287d9ba-be09-4fc6-8d0d-c2ba26de1279\") " pod="openstack/octavia-worker-5flzb" Oct 08 19:52:56 crc kubenswrapper[4750]: I1008 19:52:56.419593 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/8287d9ba-be09-4fc6-8d0d-c2ba26de1279-hm-ports\") pod \"octavia-worker-5flzb\" (UID: \"8287d9ba-be09-4fc6-8d0d-c2ba26de1279\") " pod="openstack/octavia-worker-5flzb" Oct 08 19:52:56 crc kubenswrapper[4750]: I1008 19:52:56.419713 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/8287d9ba-be09-4fc6-8d0d-c2ba26de1279-amphora-certs\") pod \"octavia-worker-5flzb\" (UID: \"8287d9ba-be09-4fc6-8d0d-c2ba26de1279\") " pod="openstack/octavia-worker-5flzb" Oct 08 19:52:56 
crc kubenswrapper[4750]: I1008 19:52:56.419743 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8287d9ba-be09-4fc6-8d0d-c2ba26de1279-combined-ca-bundle\") pod \"octavia-worker-5flzb\" (UID: \"8287d9ba-be09-4fc6-8d0d-c2ba26de1279\") " pod="openstack/octavia-worker-5flzb" Oct 08 19:52:56 crc kubenswrapper[4750]: I1008 19:52:56.419858 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/8287d9ba-be09-4fc6-8d0d-c2ba26de1279-config-data-merged\") pod \"octavia-worker-5flzb\" (UID: \"8287d9ba-be09-4fc6-8d0d-c2ba26de1279\") " pod="openstack/octavia-worker-5flzb" Oct 08 19:52:56 crc kubenswrapper[4750]: I1008 19:52:56.420726 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8287d9ba-be09-4fc6-8d0d-c2ba26de1279-config-data\") pod \"octavia-worker-5flzb\" (UID: \"8287d9ba-be09-4fc6-8d0d-c2ba26de1279\") " pod="openstack/octavia-worker-5flzb" Oct 08 19:52:56 crc kubenswrapper[4750]: I1008 19:52:56.420810 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/8287d9ba-be09-4fc6-8d0d-c2ba26de1279-hm-ports\") pod \"octavia-worker-5flzb\" (UID: \"8287d9ba-be09-4fc6-8d0d-c2ba26de1279\") " pod="openstack/octavia-worker-5flzb" Oct 08 19:52:56 crc kubenswrapper[4750]: I1008 19:52:56.421259 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/8287d9ba-be09-4fc6-8d0d-c2ba26de1279-config-data-merged\") pod \"octavia-worker-5flzb\" (UID: \"8287d9ba-be09-4fc6-8d0d-c2ba26de1279\") " pod="openstack/octavia-worker-5flzb" Oct 08 19:52:56 crc kubenswrapper[4750]: I1008 19:52:56.426931 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: 
\"kubernetes.io/secret/8287d9ba-be09-4fc6-8d0d-c2ba26de1279-amphora-certs\") pod \"octavia-worker-5flzb\" (UID: \"8287d9ba-be09-4fc6-8d0d-c2ba26de1279\") " pod="openstack/octavia-worker-5flzb" Oct 08 19:52:56 crc kubenswrapper[4750]: I1008 19:52:56.427482 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8287d9ba-be09-4fc6-8d0d-c2ba26de1279-combined-ca-bundle\") pod \"octavia-worker-5flzb\" (UID: \"8287d9ba-be09-4fc6-8d0d-c2ba26de1279\") " pod="openstack/octavia-worker-5flzb" Oct 08 19:52:56 crc kubenswrapper[4750]: I1008 19:52:56.429044 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8287d9ba-be09-4fc6-8d0d-c2ba26de1279-scripts\") pod \"octavia-worker-5flzb\" (UID: \"8287d9ba-be09-4fc6-8d0d-c2ba26de1279\") " pod="openstack/octavia-worker-5flzb" Oct 08 19:52:56 crc kubenswrapper[4750]: I1008 19:52:56.429439 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8287d9ba-be09-4fc6-8d0d-c2ba26de1279-config-data\") pod \"octavia-worker-5flzb\" (UID: \"8287d9ba-be09-4fc6-8d0d-c2ba26de1279\") " pod="openstack/octavia-worker-5flzb" Oct 08 19:52:56 crc kubenswrapper[4750]: I1008 19:52:56.511148 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-worker-5flzb" Oct 08 19:52:57 crc kubenswrapper[4750]: I1008 19:52:56.873587 4750 generic.go:334] "Generic (PLEG): container finished" podID="b5222986-32a6-4c3e-97d2-7037c64a08dc" containerID="dd1f10fd0e993210ce68691e80b8be4ee642de00c2dc88c9147dbc3d96897fc8" exitCode=0 Oct 08 19:52:57 crc kubenswrapper[4750]: I1008 19:52:56.873690 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-4qbrc" event={"ID":"b5222986-32a6-4c3e-97d2-7037c64a08dc","Type":"ContainerDied","Data":"dd1f10fd0e993210ce68691e80b8be4ee642de00c2dc88c9147dbc3d96897fc8"} Oct 08 19:52:57 crc kubenswrapper[4750]: I1008 19:52:57.736531 4750 scope.go:117] "RemoveContainer" containerID="fcf13ef2112fe840107d0203594fc62839dd38db4382291c73d378bd35584c76" Oct 08 19:52:57 crc kubenswrapper[4750]: E1008 19:52:57.737462 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:52:57 crc kubenswrapper[4750]: I1008 19:52:57.890867 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rhzfj" event={"ID":"45a041c6-9181-488d-bc14-d46ec233f131","Type":"ContainerStarted","Data":"65a653b4c0870bd6823d2483fcc4684a6bed95cdc8d80c03f0f2ca7026a60dba"} Oct 08 19:52:58 crc kubenswrapper[4750]: I1008 19:52:58.103359 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-5flzb"] Oct 08 19:52:58 crc kubenswrapper[4750]: W1008 19:52:58.133789 4750 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8287d9ba_be09_4fc6_8d0d_c2ba26de1279.slice/crio-e5b86c6ddd7bf47ec063807fdc6102e2e3ea5dcf980e9dbedf7dc96d4a36a517 WatchSource:0}: Error finding container e5b86c6ddd7bf47ec063807fdc6102e2e3ea5dcf980e9dbedf7dc96d4a36a517: Status 404 returned error can't find the container with id e5b86c6ddd7bf47ec063807fdc6102e2e3ea5dcf980e9dbedf7dc96d4a36a517 Oct 08 19:52:58 crc kubenswrapper[4750]: I1008 19:52:58.908333 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-ksff9" event={"ID":"56ea204e-1ef4-4393-890d-e772748890b3","Type":"ContainerStarted","Data":"294a4bd47f18bb95985a9f5ec39d921d157082094246a262c520e11f4d1aa0b4"} Oct 08 19:52:58 crc kubenswrapper[4750]: I1008 19:52:58.911778 4750 generic.go:334] "Generic (PLEG): container finished" podID="45a041c6-9181-488d-bc14-d46ec233f131" containerID="65a653b4c0870bd6823d2483fcc4684a6bed95cdc8d80c03f0f2ca7026a60dba" exitCode=0 Oct 08 19:52:58 crc kubenswrapper[4750]: I1008 19:52:58.911915 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rhzfj" event={"ID":"45a041c6-9181-488d-bc14-d46ec233f131","Type":"ContainerDied","Data":"65a653b4c0870bd6823d2483fcc4684a6bed95cdc8d80c03f0f2ca7026a60dba"} Oct 08 19:52:58 crc kubenswrapper[4750]: I1008 19:52:58.917337 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-5flzb" event={"ID":"8287d9ba-be09-4fc6-8d0d-c2ba26de1279","Type":"ContainerStarted","Data":"e5b86c6ddd7bf47ec063807fdc6102e2e3ea5dcf980e9dbedf7dc96d4a36a517"} Oct 08 19:52:58 crc kubenswrapper[4750]: I1008 19:52:58.924101 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-4qbrc" event={"ID":"b5222986-32a6-4c3e-97d2-7037c64a08dc","Type":"ContainerStarted","Data":"77bdca6a1c5b316db97beee4bee95bfdc06b12152d038359488cb07721eeacee"} Oct 08 19:52:59 crc kubenswrapper[4750]: I1008 19:52:59.941837 4750 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-healthmanager-4qbrc" Oct 08 19:53:00 crc kubenswrapper[4750]: I1008 19:53:00.014242 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-healthmanager-4qbrc" podStartSLOduration=9.014216828 podStartE2EDuration="9.014216828s" podCreationTimestamp="2025-10-08 19:52:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:52:59.985353489 +0000 UTC m=+6135.898324502" watchObservedRunningTime="2025-10-08 19:53:00.014216828 +0000 UTC m=+6135.927187841" Oct 08 19:53:00 crc kubenswrapper[4750]: I1008 19:53:00.955946 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rhzfj" event={"ID":"45a041c6-9181-488d-bc14-d46ec233f131","Type":"ContainerStarted","Data":"4c332e677c17319873c341c29f5426d47f8f46664dc9d21dcc5326c9d278d2e4"} Oct 08 19:53:00 crc kubenswrapper[4750]: I1008 19:53:00.963125 4750 generic.go:334] "Generic (PLEG): container finished" podID="56ea204e-1ef4-4393-890d-e772748890b3" containerID="294a4bd47f18bb95985a9f5ec39d921d157082094246a262c520e11f4d1aa0b4" exitCode=0 Oct 08 19:53:00 crc kubenswrapper[4750]: I1008 19:53:00.963226 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-ksff9" event={"ID":"56ea204e-1ef4-4393-890d-e772748890b3","Type":"ContainerDied","Data":"294a4bd47f18bb95985a9f5ec39d921d157082094246a262c520e11f4d1aa0b4"} Oct 08 19:53:01 crc kubenswrapper[4750]: I1008 19:53:01.012682 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rhzfj" podStartSLOduration=2.989984084 podStartE2EDuration="7.012658186s" podCreationTimestamp="2025-10-08 19:52:54 +0000 UTC" firstStartedPulling="2025-10-08 19:52:55.859865191 +0000 UTC m=+6131.772836234" lastFinishedPulling="2025-10-08 
19:52:59.882539323 +0000 UTC m=+6135.795510336" observedRunningTime="2025-10-08 19:53:00.986978228 +0000 UTC m=+6136.899949251" watchObservedRunningTime="2025-10-08 19:53:01.012658186 +0000 UTC m=+6136.925629199" Oct 08 19:53:01 crc kubenswrapper[4750]: I1008 19:53:01.982350 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-ksff9" event={"ID":"56ea204e-1ef4-4393-890d-e772748890b3","Type":"ContainerStarted","Data":"46fac604061bfccb4693e0f0215279be76ec33b966649a64624e5a111b8fb848"} Oct 08 19:53:01 crc kubenswrapper[4750]: I1008 19:53:01.983150 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-housekeeping-ksff9" Oct 08 19:53:01 crc kubenswrapper[4750]: I1008 19:53:01.990908 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-5flzb" event={"ID":"8287d9ba-be09-4fc6-8d0d-c2ba26de1279","Type":"ContainerStarted","Data":"6666ab43ff78812a7e4d9324a506cc5afadae51d35e91909904bffa5af354ed0"} Oct 08 19:53:02 crc kubenswrapper[4750]: I1008 19:53:02.010706 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-housekeeping-ksff9" podStartSLOduration=5.83575206 podStartE2EDuration="8.010682784s" podCreationTimestamp="2025-10-08 19:52:54 +0000 UTC" firstStartedPulling="2025-10-08 19:52:55.231972537 +0000 UTC m=+6131.144943560" lastFinishedPulling="2025-10-08 19:52:57.406903261 +0000 UTC m=+6133.319874284" observedRunningTime="2025-10-08 19:53:02.004598363 +0000 UTC m=+6137.917569386" watchObservedRunningTime="2025-10-08 19:53:02.010682784 +0000 UTC m=+6137.923653797" Oct 08 19:53:03 crc kubenswrapper[4750]: I1008 19:53:03.003735 4750 generic.go:334] "Generic (PLEG): container finished" podID="8287d9ba-be09-4fc6-8d0d-c2ba26de1279" containerID="6666ab43ff78812a7e4d9324a506cc5afadae51d35e91909904bffa5af354ed0" exitCode=0 Oct 08 19:53:03 crc kubenswrapper[4750]: I1008 19:53:03.005303 4750 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/octavia-worker-5flzb" event={"ID":"8287d9ba-be09-4fc6-8d0d-c2ba26de1279","Type":"ContainerDied","Data":"6666ab43ff78812a7e4d9324a506cc5afadae51d35e91909904bffa5af354ed0"} Oct 08 19:53:04 crc kubenswrapper[4750]: I1008 19:53:04.047210 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-5flzb" event={"ID":"8287d9ba-be09-4fc6-8d0d-c2ba26de1279","Type":"ContainerStarted","Data":"7089c8b52107ef5091ea3d369be5025627ddbdb4456b92b806b1a9f06111996c"} Oct 08 19:53:04 crc kubenswrapper[4750]: I1008 19:53:04.047704 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-worker-5flzb" Oct 08 19:53:04 crc kubenswrapper[4750]: I1008 19:53:04.051444 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-8c11-account-create-d29w4"] Oct 08 19:53:04 crc kubenswrapper[4750]: I1008 19:53:04.066866 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-8c11-account-create-d29w4"] Oct 08 19:53:04 crc kubenswrapper[4750]: I1008 19:53:04.081695 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-worker-5flzb" podStartSLOduration=5.495002039 podStartE2EDuration="8.081670543s" podCreationTimestamp="2025-10-08 19:52:56 +0000 UTC" firstStartedPulling="2025-10-08 19:52:58.138270678 +0000 UTC m=+6134.051241681" lastFinishedPulling="2025-10-08 19:53:00.724939172 +0000 UTC m=+6136.637910185" observedRunningTime="2025-10-08 19:53:04.07230876 +0000 UTC m=+6139.985279773" watchObservedRunningTime="2025-10-08 19:53:04.081670543 +0000 UTC m=+6139.994641556" Oct 08 19:53:04 crc kubenswrapper[4750]: I1008 19:53:04.617214 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rhzfj" Oct 08 19:53:04 crc kubenswrapper[4750]: I1008 19:53:04.617720 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-rhzfj" Oct 08 19:53:04 crc kubenswrapper[4750]: I1008 19:53:04.755259 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfc3046e-6de5-43ad-af62-252fa56a5c01" path="/var/lib/kubelet/pods/cfc3046e-6de5-43ad-af62-252fa56a5c01/volumes" Oct 08 19:53:04 crc kubenswrapper[4750]: I1008 19:53:04.950621 4750 scope.go:117] "RemoveContainer" containerID="11951490ed43f279736dc0e8be8a35da0371c45b7c7133ae81d656fe4f399c8b" Oct 08 19:53:04 crc kubenswrapper[4750]: I1008 19:53:04.979519 4750 scope.go:117] "RemoveContainer" containerID="345952453cb80c23c553ab767e9f173f3e6172cb77539869f6b5805ed5a53682" Oct 08 19:53:05 crc kubenswrapper[4750]: I1008 19:53:05.027643 4750 scope.go:117] "RemoveContainer" containerID="d99b74b3c8d304c97e5b923cfa35a88af1e44b60c0a35db6166dec217de8ae30" Oct 08 19:53:05 crc kubenswrapper[4750]: I1008 19:53:05.071036 4750 scope.go:117] "RemoveContainer" containerID="9848e83ea6f9a85bab38f4fceb1da300af516cddef7a398f1a26ee1270e3e863" Oct 08 19:53:05 crc kubenswrapper[4750]: I1008 19:53:05.127251 4750 scope.go:117] "RemoveContainer" containerID="fb9628811df103bd14e9eeb2b06a8f6c4c1b89ed3cc5d0f159b96c35aaf66367" Oct 08 19:53:05 crc kubenswrapper[4750]: I1008 19:53:05.160211 4750 scope.go:117] "RemoveContainer" containerID="67d87465196007bd22f34da4339127e60e1abd2ec687ebf984f89098e900d038" Oct 08 19:53:05 crc kubenswrapper[4750]: I1008 19:53:05.206414 4750 scope.go:117] "RemoveContainer" containerID="08772aaace6893b2ec900936b3a724a48803643bc4eca66c8a2d1539158254b1" Oct 08 19:53:05 crc kubenswrapper[4750]: I1008 19:53:05.669166 4750 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-rhzfj" podUID="45a041c6-9181-488d-bc14-d46ec233f131" containerName="registry-server" probeResult="failure" output=< Oct 08 19:53:05 crc kubenswrapper[4750]: timeout: failed to connect service ":50051" within 1s Oct 08 19:53:05 crc kubenswrapper[4750]: > Oct 08 19:53:06 crc 
kubenswrapper[4750]: I1008 19:53:06.898114 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-healthmanager-4qbrc" Oct 08 19:53:09 crc kubenswrapper[4750]: I1008 19:53:09.516799 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-housekeeping-ksff9" Oct 08 19:53:11 crc kubenswrapper[4750]: I1008 19:53:11.554373 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-worker-5flzb" Oct 08 19:53:12 crc kubenswrapper[4750]: I1008 19:53:12.734681 4750 scope.go:117] "RemoveContainer" containerID="fcf13ef2112fe840107d0203594fc62839dd38db4382291c73d378bd35584c76" Oct 08 19:53:13 crc kubenswrapper[4750]: I1008 19:53:13.160761 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerStarted","Data":"adabf6821006effe51695ae643fb86b44dd24b6d6a52aca3ce60c41928a0f63e"} Oct 08 19:53:14 crc kubenswrapper[4750]: I1008 19:53:14.037753 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-x7x4q"] Oct 08 19:53:14 crc kubenswrapper[4750]: I1008 19:53:14.049137 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-x7x4q"] Oct 08 19:53:14 crc kubenswrapper[4750]: I1008 19:53:14.676779 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rhzfj" Oct 08 19:53:14 crc kubenswrapper[4750]: I1008 19:53:14.749611 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6a65953-f112-4073-af48-43f88cff1bb9" path="/var/lib/kubelet/pods/e6a65953-f112-4073-af48-43f88cff1bb9/volumes" Oct 08 19:53:14 crc kubenswrapper[4750]: I1008 19:53:14.751128 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rhzfj" Oct 08 19:53:14 crc 
kubenswrapper[4750]: I1008 19:53:14.921219 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rhzfj"] Oct 08 19:53:16 crc kubenswrapper[4750]: I1008 19:53:16.195693 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rhzfj" podUID="45a041c6-9181-488d-bc14-d46ec233f131" containerName="registry-server" containerID="cri-o://4c332e677c17319873c341c29f5426d47f8f46664dc9d21dcc5326c9d278d2e4" gracePeriod=2 Oct 08 19:53:16 crc kubenswrapper[4750]: I1008 19:53:16.786041 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rhzfj" Oct 08 19:53:16 crc kubenswrapper[4750]: I1008 19:53:16.835900 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45a041c6-9181-488d-bc14-d46ec233f131-catalog-content\") pod \"45a041c6-9181-488d-bc14-d46ec233f131\" (UID: \"45a041c6-9181-488d-bc14-d46ec233f131\") " Oct 08 19:53:16 crc kubenswrapper[4750]: I1008 19:53:16.836037 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8c89\" (UniqueName: \"kubernetes.io/projected/45a041c6-9181-488d-bc14-d46ec233f131-kube-api-access-t8c89\") pod \"45a041c6-9181-488d-bc14-d46ec233f131\" (UID: \"45a041c6-9181-488d-bc14-d46ec233f131\") " Oct 08 19:53:16 crc kubenswrapper[4750]: I1008 19:53:16.836087 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45a041c6-9181-488d-bc14-d46ec233f131-utilities\") pod \"45a041c6-9181-488d-bc14-d46ec233f131\" (UID: \"45a041c6-9181-488d-bc14-d46ec233f131\") " Oct 08 19:53:16 crc kubenswrapper[4750]: I1008 19:53:16.837186 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/45a041c6-9181-488d-bc14-d46ec233f131-utilities" (OuterVolumeSpecName: "utilities") pod "45a041c6-9181-488d-bc14-d46ec233f131" (UID: "45a041c6-9181-488d-bc14-d46ec233f131"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:53:16 crc kubenswrapper[4750]: I1008 19:53:16.854261 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45a041c6-9181-488d-bc14-d46ec233f131-kube-api-access-t8c89" (OuterVolumeSpecName: "kube-api-access-t8c89") pod "45a041c6-9181-488d-bc14-d46ec233f131" (UID: "45a041c6-9181-488d-bc14-d46ec233f131"). InnerVolumeSpecName "kube-api-access-t8c89". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:53:16 crc kubenswrapper[4750]: I1008 19:53:16.885692 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45a041c6-9181-488d-bc14-d46ec233f131-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "45a041c6-9181-488d-bc14-d46ec233f131" (UID: "45a041c6-9181-488d-bc14-d46ec233f131"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:53:16 crc kubenswrapper[4750]: I1008 19:53:16.939694 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45a041c6-9181-488d-bc14-d46ec233f131-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 19:53:16 crc kubenswrapper[4750]: I1008 19:53:16.939753 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8c89\" (UniqueName: \"kubernetes.io/projected/45a041c6-9181-488d-bc14-d46ec233f131-kube-api-access-t8c89\") on node \"crc\" DevicePath \"\"" Oct 08 19:53:16 crc kubenswrapper[4750]: I1008 19:53:16.939772 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45a041c6-9181-488d-bc14-d46ec233f131-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 19:53:17 crc kubenswrapper[4750]: I1008 19:53:17.212659 4750 generic.go:334] "Generic (PLEG): container finished" podID="45a041c6-9181-488d-bc14-d46ec233f131" containerID="4c332e677c17319873c341c29f5426d47f8f46664dc9d21dcc5326c9d278d2e4" exitCode=0 Oct 08 19:53:17 crc kubenswrapper[4750]: I1008 19:53:17.212714 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rhzfj" event={"ID":"45a041c6-9181-488d-bc14-d46ec233f131","Type":"ContainerDied","Data":"4c332e677c17319873c341c29f5426d47f8f46664dc9d21dcc5326c9d278d2e4"} Oct 08 19:53:17 crc kubenswrapper[4750]: I1008 19:53:17.212765 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rhzfj" event={"ID":"45a041c6-9181-488d-bc14-d46ec233f131","Type":"ContainerDied","Data":"0c7534d59c668e62567eccfdaef9c10cd2a33f42ce3a6a5b47f7dc2d75cbacb6"} Oct 08 19:53:17 crc kubenswrapper[4750]: I1008 19:53:17.212782 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rhzfj" Oct 08 19:53:17 crc kubenswrapper[4750]: I1008 19:53:17.212793 4750 scope.go:117] "RemoveContainer" containerID="4c332e677c17319873c341c29f5426d47f8f46664dc9d21dcc5326c9d278d2e4" Oct 08 19:53:17 crc kubenswrapper[4750]: I1008 19:53:17.274229 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rhzfj"] Oct 08 19:53:17 crc kubenswrapper[4750]: I1008 19:53:17.274994 4750 scope.go:117] "RemoveContainer" containerID="65a653b4c0870bd6823d2483fcc4684a6bed95cdc8d80c03f0f2ca7026a60dba" Oct 08 19:53:17 crc kubenswrapper[4750]: I1008 19:53:17.294233 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rhzfj"] Oct 08 19:53:17 crc kubenswrapper[4750]: I1008 19:53:17.303592 4750 scope.go:117] "RemoveContainer" containerID="380a1f9534c479dded229bfc0f0ac65d027d33e64c25f8af9fe9bc1c1e4eb512" Oct 08 19:53:17 crc kubenswrapper[4750]: I1008 19:53:17.365976 4750 scope.go:117] "RemoveContainer" containerID="4c332e677c17319873c341c29f5426d47f8f46664dc9d21dcc5326c9d278d2e4" Oct 08 19:53:17 crc kubenswrapper[4750]: E1008 19:53:17.366702 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c332e677c17319873c341c29f5426d47f8f46664dc9d21dcc5326c9d278d2e4\": container with ID starting with 4c332e677c17319873c341c29f5426d47f8f46664dc9d21dcc5326c9d278d2e4 not found: ID does not exist" containerID="4c332e677c17319873c341c29f5426d47f8f46664dc9d21dcc5326c9d278d2e4" Oct 08 19:53:17 crc kubenswrapper[4750]: I1008 19:53:17.366779 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c332e677c17319873c341c29f5426d47f8f46664dc9d21dcc5326c9d278d2e4"} err="failed to get container status \"4c332e677c17319873c341c29f5426d47f8f46664dc9d21dcc5326c9d278d2e4\": rpc error: code = NotFound desc = could not find 
container \"4c332e677c17319873c341c29f5426d47f8f46664dc9d21dcc5326c9d278d2e4\": container with ID starting with 4c332e677c17319873c341c29f5426d47f8f46664dc9d21dcc5326c9d278d2e4 not found: ID does not exist" Oct 08 19:53:17 crc kubenswrapper[4750]: I1008 19:53:17.366825 4750 scope.go:117] "RemoveContainer" containerID="65a653b4c0870bd6823d2483fcc4684a6bed95cdc8d80c03f0f2ca7026a60dba" Oct 08 19:53:17 crc kubenswrapper[4750]: E1008 19:53:17.367581 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65a653b4c0870bd6823d2483fcc4684a6bed95cdc8d80c03f0f2ca7026a60dba\": container with ID starting with 65a653b4c0870bd6823d2483fcc4684a6bed95cdc8d80c03f0f2ca7026a60dba not found: ID does not exist" containerID="65a653b4c0870bd6823d2483fcc4684a6bed95cdc8d80c03f0f2ca7026a60dba" Oct 08 19:53:17 crc kubenswrapper[4750]: I1008 19:53:17.367620 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65a653b4c0870bd6823d2483fcc4684a6bed95cdc8d80c03f0f2ca7026a60dba"} err="failed to get container status \"65a653b4c0870bd6823d2483fcc4684a6bed95cdc8d80c03f0f2ca7026a60dba\": rpc error: code = NotFound desc = could not find container \"65a653b4c0870bd6823d2483fcc4684a6bed95cdc8d80c03f0f2ca7026a60dba\": container with ID starting with 65a653b4c0870bd6823d2483fcc4684a6bed95cdc8d80c03f0f2ca7026a60dba not found: ID does not exist" Oct 08 19:53:17 crc kubenswrapper[4750]: I1008 19:53:17.367654 4750 scope.go:117] "RemoveContainer" containerID="380a1f9534c479dded229bfc0f0ac65d027d33e64c25f8af9fe9bc1c1e4eb512" Oct 08 19:53:17 crc kubenswrapper[4750]: E1008 19:53:17.368132 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"380a1f9534c479dded229bfc0f0ac65d027d33e64c25f8af9fe9bc1c1e4eb512\": container with ID starting with 380a1f9534c479dded229bfc0f0ac65d027d33e64c25f8af9fe9bc1c1e4eb512 not found: ID does 
not exist" containerID="380a1f9534c479dded229bfc0f0ac65d027d33e64c25f8af9fe9bc1c1e4eb512" Oct 08 19:53:17 crc kubenswrapper[4750]: I1008 19:53:17.368158 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"380a1f9534c479dded229bfc0f0ac65d027d33e64c25f8af9fe9bc1c1e4eb512"} err="failed to get container status \"380a1f9534c479dded229bfc0f0ac65d027d33e64c25f8af9fe9bc1c1e4eb512\": rpc error: code = NotFound desc = could not find container \"380a1f9534c479dded229bfc0f0ac65d027d33e64c25f8af9fe9bc1c1e4eb512\": container with ID starting with 380a1f9534c479dded229bfc0f0ac65d027d33e64c25f8af9fe9bc1c1e4eb512 not found: ID does not exist" Oct 08 19:53:18 crc kubenswrapper[4750]: I1008 19:53:18.749162 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45a041c6-9181-488d-bc14-d46ec233f131" path="/var/lib/kubelet/pods/45a041c6-9181-488d-bc14-d46ec233f131/volumes" Oct 08 19:53:56 crc kubenswrapper[4750]: I1008 19:53:56.057353 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-9bnqf"] Oct 08 19:53:56 crc kubenswrapper[4750]: I1008 19:53:56.069338 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-9bnqf"] Oct 08 19:53:56 crc kubenswrapper[4750]: I1008 19:53:56.767637 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="860a7d97-a324-43f6-a675-13a2c7ee4189" path="/var/lib/kubelet/pods/860a7d97-a324-43f6-a675-13a2c7ee4189/volumes" Oct 08 19:54:05 crc kubenswrapper[4750]: I1008 19:54:05.421626 4750 scope.go:117] "RemoveContainer" containerID="3af76fafbe6a9cd3fb5d157bfdc03a0c491293cf8e9b961d4c26a90a83e39f6a" Oct 08 19:54:05 crc kubenswrapper[4750]: I1008 19:54:05.483033 4750 scope.go:117] "RemoveContainer" containerID="41a3e7e12edb562fe1e15fc62faebf51789a62d568c396396e919b3fb68d9ced" Oct 08 19:54:07 crc kubenswrapper[4750]: I1008 19:54:07.061436 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-37e5-account-create-rgh78"] Oct 08 19:54:07 crc kubenswrapper[4750]: I1008 19:54:07.074606 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-37e5-account-create-rgh78"] Oct 08 19:54:08 crc kubenswrapper[4750]: I1008 19:54:08.745673 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0221949-25f4-467f-9b45-d06d2e0057c1" path="/var/lib/kubelet/pods/a0221949-25f4-467f-9b45-d06d2e0057c1/volumes" Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.301219 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-85658469b7-rh2sk"] Oct 08 19:54:10 crc kubenswrapper[4750]: E1008 19:54:10.302662 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45a041c6-9181-488d-bc14-d46ec233f131" containerName="registry-server" Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.302684 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="45a041c6-9181-488d-bc14-d46ec233f131" containerName="registry-server" Oct 08 19:54:10 crc kubenswrapper[4750]: E1008 19:54:10.302742 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45a041c6-9181-488d-bc14-d46ec233f131" containerName="extract-utilities" Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.302752 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="45a041c6-9181-488d-bc14-d46ec233f131" containerName="extract-utilities" Oct 08 19:54:10 crc kubenswrapper[4750]: E1008 19:54:10.302782 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45a041c6-9181-488d-bc14-d46ec233f131" containerName="extract-content" Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.302791 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="45a041c6-9181-488d-bc14-d46ec233f131" containerName="extract-content" Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.303078 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="45a041c6-9181-488d-bc14-d46ec233f131" 
containerName="registry-server" Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.313832 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.314152 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="67d6b591-2df8-40b3-8420-a20514693983" containerName="glance-log" containerID="cri-o://20994929e1b89554a51abf0b62233d6f842765fe7f942a6c9ef3f20cca9600ba" gracePeriod=30 Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.314361 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-85658469b7-rh2sk" Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.316048 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="67d6b591-2df8-40b3-8420-a20514693983" containerName="glance-httpd" containerID="cri-o://7b3a0f3d51e5a8df0638b1b46f09c8aa6d87e23ec16a26de860a29a8f3f6c675" gracePeriod=30 Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.324447 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.324730 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-c4r7n" Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.324860 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.329036 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.340390 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6c6j\" (UniqueName: 
\"kubernetes.io/projected/74f32caf-1070-436f-a3ee-a05ff03a9040-kube-api-access-x6c6j\") pod \"horizon-85658469b7-rh2sk\" (UID: \"74f32caf-1070-436f-a3ee-a05ff03a9040\") " pod="openstack/horizon-85658469b7-rh2sk" Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.340459 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/74f32caf-1070-436f-a3ee-a05ff03a9040-config-data\") pod \"horizon-85658469b7-rh2sk\" (UID: \"74f32caf-1070-436f-a3ee-a05ff03a9040\") " pod="openstack/horizon-85658469b7-rh2sk" Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.340506 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/74f32caf-1070-436f-a3ee-a05ff03a9040-scripts\") pod \"horizon-85658469b7-rh2sk\" (UID: \"74f32caf-1070-436f-a3ee-a05ff03a9040\") " pod="openstack/horizon-85658469b7-rh2sk" Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.340592 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74f32caf-1070-436f-a3ee-a05ff03a9040-logs\") pod \"horizon-85658469b7-rh2sk\" (UID: \"74f32caf-1070-436f-a3ee-a05ff03a9040\") " pod="openstack/horizon-85658469b7-rh2sk" Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.340622 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/74f32caf-1070-436f-a3ee-a05ff03a9040-horizon-secret-key\") pod \"horizon-85658469b7-rh2sk\" (UID: \"74f32caf-1070-436f-a3ee-a05ff03a9040\") " pod="openstack/horizon-85658469b7-rh2sk" Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.350736 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-85658469b7-rh2sk"] Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.414897 
4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.415705 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="4bb76270-9267-460b-80fa-810d41aeb7fb" containerName="glance-log" containerID="cri-o://0d2add7b7120c41614ab5f8f0b06197e17ae4119e1741f351fc99786e2fbbdf3" gracePeriod=30 Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.416387 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="4bb76270-9267-460b-80fa-810d41aeb7fb" containerName="glance-httpd" containerID="cri-o://c95f2218282619d05a7047a6fb4d8e2a9e9ade9e2329f6cecc789c7fee5230c4" gracePeriod=30 Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.441730 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5ff4cc4c4c-mmttf"] Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.444057 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5ff4cc4c4c-mmttf" Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.445606 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6c6j\" (UniqueName: \"kubernetes.io/projected/74f32caf-1070-436f-a3ee-a05ff03a9040-kube-api-access-x6c6j\") pod \"horizon-85658469b7-rh2sk\" (UID: \"74f32caf-1070-436f-a3ee-a05ff03a9040\") " pod="openstack/horizon-85658469b7-rh2sk" Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.445655 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/74f32caf-1070-436f-a3ee-a05ff03a9040-config-data\") pod \"horizon-85658469b7-rh2sk\" (UID: \"74f32caf-1070-436f-a3ee-a05ff03a9040\") " pod="openstack/horizon-85658469b7-rh2sk" Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.445688 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/74f32caf-1070-436f-a3ee-a05ff03a9040-scripts\") pod \"horizon-85658469b7-rh2sk\" (UID: \"74f32caf-1070-436f-a3ee-a05ff03a9040\") " pod="openstack/horizon-85658469b7-rh2sk" Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.445727 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74f32caf-1070-436f-a3ee-a05ff03a9040-logs\") pod \"horizon-85658469b7-rh2sk\" (UID: \"74f32caf-1070-436f-a3ee-a05ff03a9040\") " pod="openstack/horizon-85658469b7-rh2sk" Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.445748 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/74f32caf-1070-436f-a3ee-a05ff03a9040-horizon-secret-key\") pod \"horizon-85658469b7-rh2sk\" (UID: \"74f32caf-1070-436f-a3ee-a05ff03a9040\") " pod="openstack/horizon-85658469b7-rh2sk" Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 
19:54:10.447449 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/74f32caf-1070-436f-a3ee-a05ff03a9040-config-data\") pod \"horizon-85658469b7-rh2sk\" (UID: \"74f32caf-1070-436f-a3ee-a05ff03a9040\") " pod="openstack/horizon-85658469b7-rh2sk" Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.448858 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/74f32caf-1070-436f-a3ee-a05ff03a9040-scripts\") pod \"horizon-85658469b7-rh2sk\" (UID: \"74f32caf-1070-436f-a3ee-a05ff03a9040\") " pod="openstack/horizon-85658469b7-rh2sk" Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.449126 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74f32caf-1070-436f-a3ee-a05ff03a9040-logs\") pod \"horizon-85658469b7-rh2sk\" (UID: \"74f32caf-1070-436f-a3ee-a05ff03a9040\") " pod="openstack/horizon-85658469b7-rh2sk" Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.461411 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/74f32caf-1070-436f-a3ee-a05ff03a9040-horizon-secret-key\") pod \"horizon-85658469b7-rh2sk\" (UID: \"74f32caf-1070-436f-a3ee-a05ff03a9040\") " pod="openstack/horizon-85658469b7-rh2sk" Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.480314 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6c6j\" (UniqueName: \"kubernetes.io/projected/74f32caf-1070-436f-a3ee-a05ff03a9040-kube-api-access-x6c6j\") pod \"horizon-85658469b7-rh2sk\" (UID: \"74f32caf-1070-436f-a3ee-a05ff03a9040\") " pod="openstack/horizon-85658469b7-rh2sk" Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.498989 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5ff4cc4c4c-mmttf"] Oct 08 19:54:10 crc kubenswrapper[4750]: 
I1008 19:54:10.550003 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9405e185-7c2c-41fb-93d6-66bb8dcd9833-scripts\") pod \"horizon-5ff4cc4c4c-mmttf\" (UID: \"9405e185-7c2c-41fb-93d6-66bb8dcd9833\") " pod="openstack/horizon-5ff4cc4c4c-mmttf" Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.550073 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9405e185-7c2c-41fb-93d6-66bb8dcd9833-horizon-secret-key\") pod \"horizon-5ff4cc4c4c-mmttf\" (UID: \"9405e185-7c2c-41fb-93d6-66bb8dcd9833\") " pod="openstack/horizon-5ff4cc4c4c-mmttf" Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.550128 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9405e185-7c2c-41fb-93d6-66bb8dcd9833-config-data\") pod \"horizon-5ff4cc4c4c-mmttf\" (UID: \"9405e185-7c2c-41fb-93d6-66bb8dcd9833\") " pod="openstack/horizon-5ff4cc4c4c-mmttf" Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.550187 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9405e185-7c2c-41fb-93d6-66bb8dcd9833-logs\") pod \"horizon-5ff4cc4c4c-mmttf\" (UID: \"9405e185-7c2c-41fb-93d6-66bb8dcd9833\") " pod="openstack/horizon-5ff4cc4c4c-mmttf" Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.550220 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhbgk\" (UniqueName: \"kubernetes.io/projected/9405e185-7c2c-41fb-93d6-66bb8dcd9833-kube-api-access-vhbgk\") pod \"horizon-5ff4cc4c4c-mmttf\" (UID: \"9405e185-7c2c-41fb-93d6-66bb8dcd9833\") " pod="openstack/horizon-5ff4cc4c4c-mmttf" Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 
19:54:10.644120 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-85658469b7-rh2sk" Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.653688 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9405e185-7c2c-41fb-93d6-66bb8dcd9833-logs\") pod \"horizon-5ff4cc4c4c-mmttf\" (UID: \"9405e185-7c2c-41fb-93d6-66bb8dcd9833\") " pod="openstack/horizon-5ff4cc4c4c-mmttf" Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.653746 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhbgk\" (UniqueName: \"kubernetes.io/projected/9405e185-7c2c-41fb-93d6-66bb8dcd9833-kube-api-access-vhbgk\") pod \"horizon-5ff4cc4c4c-mmttf\" (UID: \"9405e185-7c2c-41fb-93d6-66bb8dcd9833\") " pod="openstack/horizon-5ff4cc4c4c-mmttf" Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.653837 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9405e185-7c2c-41fb-93d6-66bb8dcd9833-scripts\") pod \"horizon-5ff4cc4c4c-mmttf\" (UID: \"9405e185-7c2c-41fb-93d6-66bb8dcd9833\") " pod="openstack/horizon-5ff4cc4c4c-mmttf" Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.653872 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9405e185-7c2c-41fb-93d6-66bb8dcd9833-horizon-secret-key\") pod \"horizon-5ff4cc4c4c-mmttf\" (UID: \"9405e185-7c2c-41fb-93d6-66bb8dcd9833\") " pod="openstack/horizon-5ff4cc4c4c-mmttf" Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.653919 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9405e185-7c2c-41fb-93d6-66bb8dcd9833-config-data\") pod \"horizon-5ff4cc4c4c-mmttf\" (UID: \"9405e185-7c2c-41fb-93d6-66bb8dcd9833\") " 
pod="openstack/horizon-5ff4cc4c4c-mmttf" Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.654393 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9405e185-7c2c-41fb-93d6-66bb8dcd9833-logs\") pod \"horizon-5ff4cc4c4c-mmttf\" (UID: \"9405e185-7c2c-41fb-93d6-66bb8dcd9833\") " pod="openstack/horizon-5ff4cc4c4c-mmttf" Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.655149 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9405e185-7c2c-41fb-93d6-66bb8dcd9833-config-data\") pod \"horizon-5ff4cc4c4c-mmttf\" (UID: \"9405e185-7c2c-41fb-93d6-66bb8dcd9833\") " pod="openstack/horizon-5ff4cc4c4c-mmttf" Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.655867 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9405e185-7c2c-41fb-93d6-66bb8dcd9833-scripts\") pod \"horizon-5ff4cc4c4c-mmttf\" (UID: \"9405e185-7c2c-41fb-93d6-66bb8dcd9833\") " pod="openstack/horizon-5ff4cc4c4c-mmttf" Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.658025 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9405e185-7c2c-41fb-93d6-66bb8dcd9833-horizon-secret-key\") pod \"horizon-5ff4cc4c4c-mmttf\" (UID: \"9405e185-7c2c-41fb-93d6-66bb8dcd9833\") " pod="openstack/horizon-5ff4cc4c4c-mmttf" Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.684178 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhbgk\" (UniqueName: \"kubernetes.io/projected/9405e185-7c2c-41fb-93d6-66bb8dcd9833-kube-api-access-vhbgk\") pod \"horizon-5ff4cc4c4c-mmttf\" (UID: \"9405e185-7c2c-41fb-93d6-66bb8dcd9833\") " pod="openstack/horizon-5ff4cc4c4c-mmttf" Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.902314 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5ff4cc4c4c-mmttf" Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.930027 4750 generic.go:334] "Generic (PLEG): container finished" podID="67d6b591-2df8-40b3-8420-a20514693983" containerID="20994929e1b89554a51abf0b62233d6f842765fe7f942a6c9ef3f20cca9600ba" exitCode=143 Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.930134 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"67d6b591-2df8-40b3-8420-a20514693983","Type":"ContainerDied","Data":"20994929e1b89554a51abf0b62233d6f842765fe7f942a6c9ef3f20cca9600ba"} Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.934641 4750 generic.go:334] "Generic (PLEG): container finished" podID="4bb76270-9267-460b-80fa-810d41aeb7fb" containerID="0d2add7b7120c41614ab5f8f0b06197e17ae4119e1741f351fc99786e2fbbdf3" exitCode=143 Oct 08 19:54:10 crc kubenswrapper[4750]: I1008 19:54:10.934755 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4bb76270-9267-460b-80fa-810d41aeb7fb","Type":"ContainerDied","Data":"0d2add7b7120c41614ab5f8f0b06197e17ae4119e1741f351fc99786e2fbbdf3"} Oct 08 19:54:11 crc kubenswrapper[4750]: I1008 19:54:11.008005 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5ff4cc4c4c-mmttf"] Oct 08 19:54:11 crc kubenswrapper[4750]: I1008 19:54:11.036679 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-f5fb9b649-5qqwr"] Oct 08 19:54:11 crc kubenswrapper[4750]: I1008 19:54:11.038699 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-f5fb9b649-5qqwr" Oct 08 19:54:11 crc kubenswrapper[4750]: I1008 19:54:11.058663 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-f5fb9b649-5qqwr"] Oct 08 19:54:11 crc kubenswrapper[4750]: I1008 19:54:11.067324 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/54ade5e3-9ffe-4b69-b060-734b5db093e8-scripts\") pod \"horizon-f5fb9b649-5qqwr\" (UID: \"54ade5e3-9ffe-4b69-b060-734b5db093e8\") " pod="openstack/horizon-f5fb9b649-5qqwr" Oct 08 19:54:11 crc kubenswrapper[4750]: I1008 19:54:11.067856 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54ade5e3-9ffe-4b69-b060-734b5db093e8-logs\") pod \"horizon-f5fb9b649-5qqwr\" (UID: \"54ade5e3-9ffe-4b69-b060-734b5db093e8\") " pod="openstack/horizon-f5fb9b649-5qqwr" Oct 08 19:54:11 crc kubenswrapper[4750]: I1008 19:54:11.068174 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/54ade5e3-9ffe-4b69-b060-734b5db093e8-config-data\") pod \"horizon-f5fb9b649-5qqwr\" (UID: \"54ade5e3-9ffe-4b69-b060-734b5db093e8\") " pod="openstack/horizon-f5fb9b649-5qqwr" Oct 08 19:54:11 crc kubenswrapper[4750]: I1008 19:54:11.068305 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g64p5\" (UniqueName: \"kubernetes.io/projected/54ade5e3-9ffe-4b69-b060-734b5db093e8-kube-api-access-g64p5\") pod \"horizon-f5fb9b649-5qqwr\" (UID: \"54ade5e3-9ffe-4b69-b060-734b5db093e8\") " pod="openstack/horizon-f5fb9b649-5qqwr" Oct 08 19:54:11 crc kubenswrapper[4750]: I1008 19:54:11.068419 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/54ade5e3-9ffe-4b69-b060-734b5db093e8-horizon-secret-key\") pod \"horizon-f5fb9b649-5qqwr\" (UID: \"54ade5e3-9ffe-4b69-b060-734b5db093e8\") " pod="openstack/horizon-f5fb9b649-5qqwr" Oct 08 19:54:11 crc kubenswrapper[4750]: I1008 19:54:11.169732 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/54ade5e3-9ffe-4b69-b060-734b5db093e8-config-data\") pod \"horizon-f5fb9b649-5qqwr\" (UID: \"54ade5e3-9ffe-4b69-b060-734b5db093e8\") " pod="openstack/horizon-f5fb9b649-5qqwr" Oct 08 19:54:11 crc kubenswrapper[4750]: I1008 19:54:11.169787 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g64p5\" (UniqueName: \"kubernetes.io/projected/54ade5e3-9ffe-4b69-b060-734b5db093e8-kube-api-access-g64p5\") pod \"horizon-f5fb9b649-5qqwr\" (UID: \"54ade5e3-9ffe-4b69-b060-734b5db093e8\") " pod="openstack/horizon-f5fb9b649-5qqwr" Oct 08 19:54:11 crc kubenswrapper[4750]: I1008 19:54:11.169818 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/54ade5e3-9ffe-4b69-b060-734b5db093e8-horizon-secret-key\") pod \"horizon-f5fb9b649-5qqwr\" (UID: \"54ade5e3-9ffe-4b69-b060-734b5db093e8\") " pod="openstack/horizon-f5fb9b649-5qqwr" Oct 08 19:54:11 crc kubenswrapper[4750]: I1008 19:54:11.169875 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/54ade5e3-9ffe-4b69-b060-734b5db093e8-scripts\") pod \"horizon-f5fb9b649-5qqwr\" (UID: \"54ade5e3-9ffe-4b69-b060-734b5db093e8\") " pod="openstack/horizon-f5fb9b649-5qqwr" Oct 08 19:54:11 crc kubenswrapper[4750]: I1008 19:54:11.169895 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54ade5e3-9ffe-4b69-b060-734b5db093e8-logs\") pod \"horizon-f5fb9b649-5qqwr\" (UID: 
\"54ade5e3-9ffe-4b69-b060-734b5db093e8\") " pod="openstack/horizon-f5fb9b649-5qqwr" Oct 08 19:54:11 crc kubenswrapper[4750]: I1008 19:54:11.170865 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54ade5e3-9ffe-4b69-b060-734b5db093e8-logs\") pod \"horizon-f5fb9b649-5qqwr\" (UID: \"54ade5e3-9ffe-4b69-b060-734b5db093e8\") " pod="openstack/horizon-f5fb9b649-5qqwr" Oct 08 19:54:11 crc kubenswrapper[4750]: I1008 19:54:11.177253 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/54ade5e3-9ffe-4b69-b060-734b5db093e8-config-data\") pod \"horizon-f5fb9b649-5qqwr\" (UID: \"54ade5e3-9ffe-4b69-b060-734b5db093e8\") " pod="openstack/horizon-f5fb9b649-5qqwr" Oct 08 19:54:11 crc kubenswrapper[4750]: I1008 19:54:11.178055 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/54ade5e3-9ffe-4b69-b060-734b5db093e8-scripts\") pod \"horizon-f5fb9b649-5qqwr\" (UID: \"54ade5e3-9ffe-4b69-b060-734b5db093e8\") " pod="openstack/horizon-f5fb9b649-5qqwr" Oct 08 19:54:11 crc kubenswrapper[4750]: I1008 19:54:11.188480 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/54ade5e3-9ffe-4b69-b060-734b5db093e8-horizon-secret-key\") pod \"horizon-f5fb9b649-5qqwr\" (UID: \"54ade5e3-9ffe-4b69-b060-734b5db093e8\") " pod="openstack/horizon-f5fb9b649-5qqwr" Oct 08 19:54:11 crc kubenswrapper[4750]: I1008 19:54:11.201048 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g64p5\" (UniqueName: \"kubernetes.io/projected/54ade5e3-9ffe-4b69-b060-734b5db093e8-kube-api-access-g64p5\") pod \"horizon-f5fb9b649-5qqwr\" (UID: \"54ade5e3-9ffe-4b69-b060-734b5db093e8\") " pod="openstack/horizon-f5fb9b649-5qqwr" Oct 08 19:54:11 crc kubenswrapper[4750]: I1008 19:54:11.207678 4750 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-85658469b7-rh2sk"] Oct 08 19:54:11 crc kubenswrapper[4750]: I1008 19:54:11.213385 4750 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 19:54:11 crc kubenswrapper[4750]: I1008 19:54:11.371134 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-f5fb9b649-5qqwr" Oct 08 19:54:11 crc kubenswrapper[4750]: I1008 19:54:11.510883 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5ff4cc4c4c-mmttf"] Oct 08 19:54:11 crc kubenswrapper[4750]: W1008 19:54:11.525222 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9405e185_7c2c_41fb_93d6_66bb8dcd9833.slice/crio-9e540239a21bc7067754e5096647ee7afddb6942828dcb3429d9d357c0d5ce0b WatchSource:0}: Error finding container 9e540239a21bc7067754e5096647ee7afddb6942828dcb3429d9d357c0d5ce0b: Status 404 returned error can't find the container with id 9e540239a21bc7067754e5096647ee7afddb6942828dcb3429d9d357c0d5ce0b Oct 08 19:54:11 crc kubenswrapper[4750]: I1008 19:54:11.882104 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-f5fb9b649-5qqwr"] Oct 08 19:54:11 crc kubenswrapper[4750]: W1008 19:54:11.889998 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54ade5e3_9ffe_4b69_b060_734b5db093e8.slice/crio-31fa5ca3756638d52ce5029fe8a1f6ea96eef0ce6712f40674d77909571123a0 WatchSource:0}: Error finding container 31fa5ca3756638d52ce5029fe8a1f6ea96eef0ce6712f40674d77909571123a0: Status 404 returned error can't find the container with id 31fa5ca3756638d52ce5029fe8a1f6ea96eef0ce6712f40674d77909571123a0 Oct 08 19:54:11 crc kubenswrapper[4750]: I1008 19:54:11.948800 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f5fb9b649-5qqwr" 
event={"ID":"54ade5e3-9ffe-4b69-b060-734b5db093e8","Type":"ContainerStarted","Data":"31fa5ca3756638d52ce5029fe8a1f6ea96eef0ce6712f40674d77909571123a0"} Oct 08 19:54:11 crc kubenswrapper[4750]: I1008 19:54:11.950504 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85658469b7-rh2sk" event={"ID":"74f32caf-1070-436f-a3ee-a05ff03a9040","Type":"ContainerStarted","Data":"28710c9b135b16929c8ff8eefb7cd0f97fc323f2b83d543df90bfca9ffb180b4"} Oct 08 19:54:11 crc kubenswrapper[4750]: I1008 19:54:11.953038 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5ff4cc4c4c-mmttf" event={"ID":"9405e185-7c2c-41fb-93d6-66bb8dcd9833","Type":"ContainerStarted","Data":"9e540239a21bc7067754e5096647ee7afddb6942828dcb3429d9d357c0d5ce0b"} Oct 08 19:54:13 crc kubenswrapper[4750]: I1008 19:54:13.985184 4750 generic.go:334] "Generic (PLEG): container finished" podID="4bb76270-9267-460b-80fa-810d41aeb7fb" containerID="c95f2218282619d05a7047a6fb4d8e2a9e9ade9e2329f6cecc789c7fee5230c4" exitCode=0 Oct 08 19:54:13 crc kubenswrapper[4750]: I1008 19:54:13.985673 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4bb76270-9267-460b-80fa-810d41aeb7fb","Type":"ContainerDied","Data":"c95f2218282619d05a7047a6fb4d8e2a9e9ade9e2329f6cecc789c7fee5230c4"} Oct 08 19:54:13 crc kubenswrapper[4750]: I1008 19:54:13.992416 4750 generic.go:334] "Generic (PLEG): container finished" podID="67d6b591-2df8-40b3-8420-a20514693983" containerID="7b3a0f3d51e5a8df0638b1b46f09c8aa6d87e23ec16a26de860a29a8f3f6c675" exitCode=0 Oct 08 19:54:13 crc kubenswrapper[4750]: I1008 19:54:13.992490 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"67d6b591-2df8-40b3-8420-a20514693983","Type":"ContainerDied","Data":"7b3a0f3d51e5a8df0638b1b46f09c8aa6d87e23ec16a26de860a29a8f3f6c675"} Oct 08 19:54:15 crc kubenswrapper[4750]: I1008 19:54:15.041945 4750 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-vkmhg"] Oct 08 19:54:15 crc kubenswrapper[4750]: I1008 19:54:15.056308 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-vkmhg"] Oct 08 19:54:16 crc kubenswrapper[4750]: I1008 19:54:16.748180 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e16b629-352a-4ef2-b318-5a342165dfef" path="/var/lib/kubelet/pods/2e16b629-352a-4ef2-b318-5a342165dfef/volumes" Oct 08 19:54:19 crc kubenswrapper[4750]: I1008 19:54:19.466597 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 19:54:19 crc kubenswrapper[4750]: I1008 19:54:19.479516 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 19:54:19 crc kubenswrapper[4750]: I1008 19:54:19.547799 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bb76270-9267-460b-80fa-810d41aeb7fb-combined-ca-bundle\") pod \"4bb76270-9267-460b-80fa-810d41aeb7fb\" (UID: \"4bb76270-9267-460b-80fa-810d41aeb7fb\") " Oct 08 19:54:19 crc kubenswrapper[4750]: I1008 19:54:19.548197 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67d6b591-2df8-40b3-8420-a20514693983-combined-ca-bundle\") pod \"67d6b591-2df8-40b3-8420-a20514693983\" (UID: \"67d6b591-2df8-40b3-8420-a20514693983\") " Oct 08 19:54:19 crc kubenswrapper[4750]: I1008 19:54:19.548266 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5mrx\" (UniqueName: \"kubernetes.io/projected/4bb76270-9267-460b-80fa-810d41aeb7fb-kube-api-access-g5mrx\") pod \"4bb76270-9267-460b-80fa-810d41aeb7fb\" (UID: \"4bb76270-9267-460b-80fa-810d41aeb7fb\") " Oct 08 19:54:19 crc kubenswrapper[4750]: I1008 
19:54:19.548298 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgl7s\" (UniqueName: \"kubernetes.io/projected/67d6b591-2df8-40b3-8420-a20514693983-kube-api-access-jgl7s\") pod \"67d6b591-2df8-40b3-8420-a20514693983\" (UID: \"67d6b591-2df8-40b3-8420-a20514693983\") " Oct 08 19:54:19 crc kubenswrapper[4750]: I1008 19:54:19.548323 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bb76270-9267-460b-80fa-810d41aeb7fb-scripts\") pod \"4bb76270-9267-460b-80fa-810d41aeb7fb\" (UID: \"4bb76270-9267-460b-80fa-810d41aeb7fb\") " Oct 08 19:54:19 crc kubenswrapper[4750]: I1008 19:54:19.548360 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67d6b591-2df8-40b3-8420-a20514693983-config-data\") pod \"67d6b591-2df8-40b3-8420-a20514693983\" (UID: \"67d6b591-2df8-40b3-8420-a20514693983\") " Oct 08 19:54:19 crc kubenswrapper[4750]: I1008 19:54:19.548408 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/67d6b591-2df8-40b3-8420-a20514693983-httpd-run\") pod \"67d6b591-2df8-40b3-8420-a20514693983\" (UID: \"67d6b591-2df8-40b3-8420-a20514693983\") " Oct 08 19:54:19 crc kubenswrapper[4750]: I1008 19:54:19.548566 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bb76270-9267-460b-80fa-810d41aeb7fb-config-data\") pod \"4bb76270-9267-460b-80fa-810d41aeb7fb\" (UID: \"4bb76270-9267-460b-80fa-810d41aeb7fb\") " Oct 08 19:54:19 crc kubenswrapper[4750]: I1008 19:54:19.548619 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67d6b591-2df8-40b3-8420-a20514693983-logs\") pod \"67d6b591-2df8-40b3-8420-a20514693983\" (UID: 
\"67d6b591-2df8-40b3-8420-a20514693983\") " Oct 08 19:54:19 crc kubenswrapper[4750]: I1008 19:54:19.548719 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67d6b591-2df8-40b3-8420-a20514693983-scripts\") pod \"67d6b591-2df8-40b3-8420-a20514693983\" (UID: \"67d6b591-2df8-40b3-8420-a20514693983\") " Oct 08 19:54:19 crc kubenswrapper[4750]: I1008 19:54:19.548804 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/67d6b591-2df8-40b3-8420-a20514693983-ceph\") pod \"67d6b591-2df8-40b3-8420-a20514693983\" (UID: \"67d6b591-2df8-40b3-8420-a20514693983\") " Oct 08 19:54:19 crc kubenswrapper[4750]: I1008 19:54:19.548825 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bb76270-9267-460b-80fa-810d41aeb7fb-logs\") pod \"4bb76270-9267-460b-80fa-810d41aeb7fb\" (UID: \"4bb76270-9267-460b-80fa-810d41aeb7fb\") " Oct 08 19:54:19 crc kubenswrapper[4750]: I1008 19:54:19.548864 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4bb76270-9267-460b-80fa-810d41aeb7fb-ceph\") pod \"4bb76270-9267-460b-80fa-810d41aeb7fb\" (UID: \"4bb76270-9267-460b-80fa-810d41aeb7fb\") " Oct 08 19:54:19 crc kubenswrapper[4750]: I1008 19:54:19.548927 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4bb76270-9267-460b-80fa-810d41aeb7fb-httpd-run\") pod \"4bb76270-9267-460b-80fa-810d41aeb7fb\" (UID: \"4bb76270-9267-460b-80fa-810d41aeb7fb\") " Oct 08 19:54:19 crc kubenswrapper[4750]: I1008 19:54:19.549806 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67d6b591-2df8-40b3-8420-a20514693983-httpd-run" (OuterVolumeSpecName: "httpd-run") pod 
"67d6b591-2df8-40b3-8420-a20514693983" (UID: "67d6b591-2df8-40b3-8420-a20514693983"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:54:19 crc kubenswrapper[4750]: I1008 19:54:19.550748 4750 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/67d6b591-2df8-40b3-8420-a20514693983-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 08 19:54:19 crc kubenswrapper[4750]: I1008 19:54:19.557142 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67d6b591-2df8-40b3-8420-a20514693983-logs" (OuterVolumeSpecName: "logs") pod "67d6b591-2df8-40b3-8420-a20514693983" (UID: "67d6b591-2df8-40b3-8420-a20514693983"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:54:19 crc kubenswrapper[4750]: I1008 19:54:19.561940 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bb76270-9267-460b-80fa-810d41aeb7fb-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4bb76270-9267-460b-80fa-810d41aeb7fb" (UID: "4bb76270-9267-460b-80fa-810d41aeb7fb"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:54:19 crc kubenswrapper[4750]: I1008 19:54:19.563158 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bb76270-9267-460b-80fa-810d41aeb7fb-logs" (OuterVolumeSpecName: "logs") pod "4bb76270-9267-460b-80fa-810d41aeb7fb" (UID: "4bb76270-9267-460b-80fa-810d41aeb7fb"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:54:19 crc kubenswrapper[4750]: I1008 19:54:19.564102 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67d6b591-2df8-40b3-8420-a20514693983-kube-api-access-jgl7s" (OuterVolumeSpecName: "kube-api-access-jgl7s") pod "67d6b591-2df8-40b3-8420-a20514693983" (UID: "67d6b591-2df8-40b3-8420-a20514693983"). InnerVolumeSpecName "kube-api-access-jgl7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:54:19 crc kubenswrapper[4750]: I1008 19:54:19.567317 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bb76270-9267-460b-80fa-810d41aeb7fb-scripts" (OuterVolumeSpecName: "scripts") pod "4bb76270-9267-460b-80fa-810d41aeb7fb" (UID: "4bb76270-9267-460b-80fa-810d41aeb7fb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:54:19 crc kubenswrapper[4750]: I1008 19:54:19.567927 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67d6b591-2df8-40b3-8420-a20514693983-scripts" (OuterVolumeSpecName: "scripts") pod "67d6b591-2df8-40b3-8420-a20514693983" (UID: "67d6b591-2df8-40b3-8420-a20514693983"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:54:19 crc kubenswrapper[4750]: I1008 19:54:19.570132 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67d6b591-2df8-40b3-8420-a20514693983-ceph" (OuterVolumeSpecName: "ceph") pod "67d6b591-2df8-40b3-8420-a20514693983" (UID: "67d6b591-2df8-40b3-8420-a20514693983"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:54:19 crc kubenswrapper[4750]: I1008 19:54:19.575783 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb76270-9267-460b-80fa-810d41aeb7fb-ceph" (OuterVolumeSpecName: "ceph") pod "4bb76270-9267-460b-80fa-810d41aeb7fb" (UID: "4bb76270-9267-460b-80fa-810d41aeb7fb"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:54:19 crc kubenswrapper[4750]: I1008 19:54:19.590522 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb76270-9267-460b-80fa-810d41aeb7fb-kube-api-access-g5mrx" (OuterVolumeSpecName: "kube-api-access-g5mrx") pod "4bb76270-9267-460b-80fa-810d41aeb7fb" (UID: "4bb76270-9267-460b-80fa-810d41aeb7fb"). InnerVolumeSpecName "kube-api-access-g5mrx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:54:19 crc kubenswrapper[4750]: I1008 19:54:19.605056 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bb76270-9267-460b-80fa-810d41aeb7fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4bb76270-9267-460b-80fa-810d41aeb7fb" (UID: "4bb76270-9267-460b-80fa-810d41aeb7fb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:54:19 crc kubenswrapper[4750]: I1008 19:54:19.634864 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67d6b591-2df8-40b3-8420-a20514693983-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "67d6b591-2df8-40b3-8420-a20514693983" (UID: "67d6b591-2df8-40b3-8420-a20514693983"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:54:19 crc kubenswrapper[4750]: I1008 19:54:19.637614 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bb76270-9267-460b-80fa-810d41aeb7fb-config-data" (OuterVolumeSpecName: "config-data") pod "4bb76270-9267-460b-80fa-810d41aeb7fb" (UID: "4bb76270-9267-460b-80fa-810d41aeb7fb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:54:19 crc kubenswrapper[4750]: I1008 19:54:19.653404 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bb76270-9267-460b-80fa-810d41aeb7fb-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 19:54:19 crc kubenswrapper[4750]: I1008 19:54:19.653443 4750 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67d6b591-2df8-40b3-8420-a20514693983-logs\") on node \"crc\" DevicePath \"\"" Oct 08 19:54:19 crc kubenswrapper[4750]: I1008 19:54:19.653455 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67d6b591-2df8-40b3-8420-a20514693983-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 19:54:19 crc kubenswrapper[4750]: I1008 19:54:19.653465 4750 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/67d6b591-2df8-40b3-8420-a20514693983-ceph\") on node \"crc\" DevicePath \"\"" Oct 08 19:54:19 crc kubenswrapper[4750]: I1008 19:54:19.653476 4750 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bb76270-9267-460b-80fa-810d41aeb7fb-logs\") on node \"crc\" DevicePath \"\"" Oct 08 19:54:19 crc kubenswrapper[4750]: I1008 19:54:19.653485 4750 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4bb76270-9267-460b-80fa-810d41aeb7fb-ceph\") on node \"crc\" DevicePath \"\"" Oct 08 19:54:19 
crc kubenswrapper[4750]: I1008 19:54:19.653500 4750 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4bb76270-9267-460b-80fa-810d41aeb7fb-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 08 19:54:19 crc kubenswrapper[4750]: I1008 19:54:19.653511 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bb76270-9267-460b-80fa-810d41aeb7fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 19:54:19 crc kubenswrapper[4750]: I1008 19:54:19.653560 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67d6b591-2df8-40b3-8420-a20514693983-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 19:54:19 crc kubenswrapper[4750]: I1008 19:54:19.653573 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5mrx\" (UniqueName: \"kubernetes.io/projected/4bb76270-9267-460b-80fa-810d41aeb7fb-kube-api-access-g5mrx\") on node \"crc\" DevicePath \"\"" Oct 08 19:54:19 crc kubenswrapper[4750]: I1008 19:54:19.653585 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgl7s\" (UniqueName: \"kubernetes.io/projected/67d6b591-2df8-40b3-8420-a20514693983-kube-api-access-jgl7s\") on node \"crc\" DevicePath \"\"" Oct 08 19:54:19 crc kubenswrapper[4750]: I1008 19:54:19.653600 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bb76270-9267-460b-80fa-810d41aeb7fb-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 19:54:19 crc kubenswrapper[4750]: I1008 19:54:19.680337 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67d6b591-2df8-40b3-8420-a20514693983-config-data" (OuterVolumeSpecName: "config-data") pod "67d6b591-2df8-40b3-8420-a20514693983" (UID: "67d6b591-2df8-40b3-8420-a20514693983"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:54:19 crc kubenswrapper[4750]: I1008 19:54:19.754392 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67d6b591-2df8-40b3-8420-a20514693983-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.056584 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5ff4cc4c4c-mmttf" event={"ID":"9405e185-7c2c-41fb-93d6-66bb8dcd9833","Type":"ContainerStarted","Data":"57ca6b1d6b3b66f7cab62d24e76d285ae92c6dd9c7f98e3b808d31f4e53a3192"} Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.058094 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f5fb9b649-5qqwr" event={"ID":"54ade5e3-9ffe-4b69-b060-734b5db093e8","Type":"ContainerStarted","Data":"87ed84e207c8ee0176b27ca029277a7c433695fb11ba21ab9909d99f324072bb"} Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.059878 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85658469b7-rh2sk" event={"ID":"74f32caf-1070-436f-a3ee-a05ff03a9040","Type":"ContainerStarted","Data":"3e228801beb9753c1caa344c3eaec38d8e1a6e81c93a6ad8d4bfb101d1cbfb74"} Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.059913 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85658469b7-rh2sk" event={"ID":"74f32caf-1070-436f-a3ee-a05ff03a9040","Type":"ContainerStarted","Data":"c6e5ff61c0426b7591a31211ca85580e1cc288deb65add3a05fbb63c4413c8f0"} Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.069509 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4bb76270-9267-460b-80fa-810d41aeb7fb","Type":"ContainerDied","Data":"83825ab29d79bd312410b7f70af3b1b89a1685b23c923142773371f95917db98"} Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.069600 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.069641 4750 scope.go:117] "RemoveContainer" containerID="c95f2218282619d05a7047a6fb4d8e2a9e9ade9e2329f6cecc789c7fee5230c4" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.075530 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"67d6b591-2df8-40b3-8420-a20514693983","Type":"ContainerDied","Data":"0718d318a6d97b888769dc756330838af4232356e416cab39896aacebfdebf05"} Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.075669 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.091070 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-f5fb9b649-5qqwr" podStartSLOduration=1.558328409 podStartE2EDuration="9.091040544s" podCreationTimestamp="2025-10-08 19:54:11 +0000 UTC" firstStartedPulling="2025-10-08 19:54:11.892951753 +0000 UTC m=+6207.805922776" lastFinishedPulling="2025-10-08 19:54:19.425663898 +0000 UTC m=+6215.338634911" observedRunningTime="2025-10-08 19:54:20.088921151 +0000 UTC m=+6216.001892164" watchObservedRunningTime="2025-10-08 19:54:20.091040544 +0000 UTC m=+6216.004011577" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.110783 4750 scope.go:117] "RemoveContainer" containerID="0d2add7b7120c41614ab5f8f0b06197e17ae4119e1741f351fc99786e2fbbdf3" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.138667 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-85658469b7-rh2sk" podStartSLOduration=1.9260965570000002 podStartE2EDuration="10.138629477s" podCreationTimestamp="2025-10-08 19:54:10 +0000 UTC" firstStartedPulling="2025-10-08 19:54:11.213136518 +0000 UTC m=+6207.126107531" lastFinishedPulling="2025-10-08 19:54:19.425669428 +0000 UTC 
m=+6215.338640451" observedRunningTime="2025-10-08 19:54:20.131636283 +0000 UTC m=+6216.044607316" watchObservedRunningTime="2025-10-08 19:54:20.138629477 +0000 UTC m=+6216.051600510" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.144515 4750 scope.go:117] "RemoveContainer" containerID="7b3a0f3d51e5a8df0638b1b46f09c8aa6d87e23ec16a26de860a29a8f3f6c675" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.194760 4750 scope.go:117] "RemoveContainer" containerID="20994929e1b89554a51abf0b62233d6f842765fe7f942a6c9ef3f20cca9600ba" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.206799 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.218776 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.231158 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.258448 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.274368 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 19:54:20 crc kubenswrapper[4750]: E1008 19:54:20.277524 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67d6b591-2df8-40b3-8420-a20514693983" containerName="glance-log" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.277586 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="67d6b591-2df8-40b3-8420-a20514693983" containerName="glance-log" Oct 08 19:54:20 crc kubenswrapper[4750]: E1008 19:54:20.277604 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bb76270-9267-460b-80fa-810d41aeb7fb" containerName="glance-httpd" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 
19:54:20.277612 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bb76270-9267-460b-80fa-810d41aeb7fb" containerName="glance-httpd" Oct 08 19:54:20 crc kubenswrapper[4750]: E1008 19:54:20.277642 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67d6b591-2df8-40b3-8420-a20514693983" containerName="glance-httpd" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.277649 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="67d6b591-2df8-40b3-8420-a20514693983" containerName="glance-httpd" Oct 08 19:54:20 crc kubenswrapper[4750]: E1008 19:54:20.277664 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bb76270-9267-460b-80fa-810d41aeb7fb" containerName="glance-log" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.277670 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bb76270-9267-460b-80fa-810d41aeb7fb" containerName="glance-log" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.277863 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="67d6b591-2df8-40b3-8420-a20514693983" containerName="glance-log" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.277887 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="67d6b591-2df8-40b3-8420-a20514693983" containerName="glance-httpd" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.277899 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bb76270-9267-460b-80fa-810d41aeb7fb" containerName="glance-log" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.277911 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bb76270-9267-460b-80fa-810d41aeb7fb" containerName="glance-httpd" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.279223 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.286135 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.287065 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-wp2zx" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.287492 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.287495 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.291007 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.293684 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.300162 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.319672 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.492247 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f94e86c5-8e70-4b6c-a8cb-6923e62968b0-ceph\") pod \"glance-default-external-api-0\" (UID: \"f94e86c5-8e70-4b6c-a8cb-6923e62968b0\") " pod="openstack/glance-default-external-api-0" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.493218 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0362b08c-e18c-45cc-a155-af3775390c3b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0362b08c-e18c-45cc-a155-af3775390c3b\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.493362 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdjjp\" (UniqueName: \"kubernetes.io/projected/0362b08c-e18c-45cc-a155-af3775390c3b-kube-api-access-zdjjp\") pod \"glance-default-internal-api-0\" (UID: \"0362b08c-e18c-45cc-a155-af3775390c3b\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.493436 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0362b08c-e18c-45cc-a155-af3775390c3b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0362b08c-e18c-45cc-a155-af3775390c3b\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.493603 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f94e86c5-8e70-4b6c-a8cb-6923e62968b0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f94e86c5-8e70-4b6c-a8cb-6923e62968b0\") " pod="openstack/glance-default-external-api-0" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.493679 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0362b08c-e18c-45cc-a155-af3775390c3b-ceph\") pod \"glance-default-internal-api-0\" (UID: \"0362b08c-e18c-45cc-a155-af3775390c3b\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.493843 4750 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0362b08c-e18c-45cc-a155-af3775390c3b-logs\") pod \"glance-default-internal-api-0\" (UID: \"0362b08c-e18c-45cc-a155-af3775390c3b\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.493957 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f94e86c5-8e70-4b6c-a8cb-6923e62968b0-logs\") pod \"glance-default-external-api-0\" (UID: \"f94e86c5-8e70-4b6c-a8cb-6923e62968b0\") " pod="openstack/glance-default-external-api-0" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.494082 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f94e86c5-8e70-4b6c-a8cb-6923e62968b0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f94e86c5-8e70-4b6c-a8cb-6923e62968b0\") " pod="openstack/glance-default-external-api-0" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.494106 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f94e86c5-8e70-4b6c-a8cb-6923e62968b0-scripts\") pod \"glance-default-external-api-0\" (UID: \"f94e86c5-8e70-4b6c-a8cb-6923e62968b0\") " pod="openstack/glance-default-external-api-0" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.494173 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0362b08c-e18c-45cc-a155-af3775390c3b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0362b08c-e18c-45cc-a155-af3775390c3b\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.494259 4750 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f94e86c5-8e70-4b6c-a8cb-6923e62968b0-config-data\") pod \"glance-default-external-api-0\" (UID: \"f94e86c5-8e70-4b6c-a8cb-6923e62968b0\") " pod="openstack/glance-default-external-api-0" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.494489 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0362b08c-e18c-45cc-a155-af3775390c3b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0362b08c-e18c-45cc-a155-af3775390c3b\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.494626 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gslqp\" (UniqueName: \"kubernetes.io/projected/f94e86c5-8e70-4b6c-a8cb-6923e62968b0-kube-api-access-gslqp\") pod \"glance-default-external-api-0\" (UID: \"f94e86c5-8e70-4b6c-a8cb-6923e62968b0\") " pod="openstack/glance-default-external-api-0" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.597711 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0362b08c-e18c-45cc-a155-af3775390c3b-logs\") pod \"glance-default-internal-api-0\" (UID: \"0362b08c-e18c-45cc-a155-af3775390c3b\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.597781 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f94e86c5-8e70-4b6c-a8cb-6923e62968b0-logs\") pod \"glance-default-external-api-0\" (UID: \"f94e86c5-8e70-4b6c-a8cb-6923e62968b0\") " pod="openstack/glance-default-external-api-0" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.597830 4750 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f94e86c5-8e70-4b6c-a8cb-6923e62968b0-scripts\") pod \"glance-default-external-api-0\" (UID: \"f94e86c5-8e70-4b6c-a8cb-6923e62968b0\") " pod="openstack/glance-default-external-api-0" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.597858 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f94e86c5-8e70-4b6c-a8cb-6923e62968b0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f94e86c5-8e70-4b6c-a8cb-6923e62968b0\") " pod="openstack/glance-default-external-api-0" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.597884 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0362b08c-e18c-45cc-a155-af3775390c3b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0362b08c-e18c-45cc-a155-af3775390c3b\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.597924 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f94e86c5-8e70-4b6c-a8cb-6923e62968b0-config-data\") pod \"glance-default-external-api-0\" (UID: \"f94e86c5-8e70-4b6c-a8cb-6923e62968b0\") " pod="openstack/glance-default-external-api-0" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.597975 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0362b08c-e18c-45cc-a155-af3775390c3b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0362b08c-e18c-45cc-a155-af3775390c3b\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.598035 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gslqp\" (UniqueName: 
\"kubernetes.io/projected/f94e86c5-8e70-4b6c-a8cb-6923e62968b0-kube-api-access-gslqp\") pod \"glance-default-external-api-0\" (UID: \"f94e86c5-8e70-4b6c-a8cb-6923e62968b0\") " pod="openstack/glance-default-external-api-0" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.598108 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f94e86c5-8e70-4b6c-a8cb-6923e62968b0-ceph\") pod \"glance-default-external-api-0\" (UID: \"f94e86c5-8e70-4b6c-a8cb-6923e62968b0\") " pod="openstack/glance-default-external-api-0" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.598211 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0362b08c-e18c-45cc-a155-af3775390c3b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0362b08c-e18c-45cc-a155-af3775390c3b\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.598253 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdjjp\" (UniqueName: \"kubernetes.io/projected/0362b08c-e18c-45cc-a155-af3775390c3b-kube-api-access-zdjjp\") pod \"glance-default-internal-api-0\" (UID: \"0362b08c-e18c-45cc-a155-af3775390c3b\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.598261 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0362b08c-e18c-45cc-a155-af3775390c3b-logs\") pod \"glance-default-internal-api-0\" (UID: \"0362b08c-e18c-45cc-a155-af3775390c3b\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.598287 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0362b08c-e18c-45cc-a155-af3775390c3b-scripts\") 
pod \"glance-default-internal-api-0\" (UID: \"0362b08c-e18c-45cc-a155-af3775390c3b\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.598349 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f94e86c5-8e70-4b6c-a8cb-6923e62968b0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f94e86c5-8e70-4b6c-a8cb-6923e62968b0\") " pod="openstack/glance-default-external-api-0" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.598387 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0362b08c-e18c-45cc-a155-af3775390c3b-ceph\") pod \"glance-default-internal-api-0\" (UID: \"0362b08c-e18c-45cc-a155-af3775390c3b\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.598466 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f94e86c5-8e70-4b6c-a8cb-6923e62968b0-logs\") pod \"glance-default-external-api-0\" (UID: \"f94e86c5-8e70-4b6c-a8cb-6923e62968b0\") " pod="openstack/glance-default-external-api-0" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.598571 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0362b08c-e18c-45cc-a155-af3775390c3b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0362b08c-e18c-45cc-a155-af3775390c3b\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.599504 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f94e86c5-8e70-4b6c-a8cb-6923e62968b0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f94e86c5-8e70-4b6c-a8cb-6923e62968b0\") " 
pod="openstack/glance-default-external-api-0" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.606799 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0362b08c-e18c-45cc-a155-af3775390c3b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0362b08c-e18c-45cc-a155-af3775390c3b\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.607340 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0362b08c-e18c-45cc-a155-af3775390c3b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0362b08c-e18c-45cc-a155-af3775390c3b\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.608964 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f94e86c5-8e70-4b6c-a8cb-6923e62968b0-scripts\") pod \"glance-default-external-api-0\" (UID: \"f94e86c5-8e70-4b6c-a8cb-6923e62968b0\") " pod="openstack/glance-default-external-api-0" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.614572 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f94e86c5-8e70-4b6c-a8cb-6923e62968b0-config-data\") pod \"glance-default-external-api-0\" (UID: \"f94e86c5-8e70-4b6c-a8cb-6923e62968b0\") " pod="openstack/glance-default-external-api-0" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.617011 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f94e86c5-8e70-4b6c-a8cb-6923e62968b0-ceph\") pod \"glance-default-external-api-0\" (UID: \"f94e86c5-8e70-4b6c-a8cb-6923e62968b0\") " pod="openstack/glance-default-external-api-0" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.622373 4750 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0362b08c-e18c-45cc-a155-af3775390c3b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0362b08c-e18c-45cc-a155-af3775390c3b\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.623498 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f94e86c5-8e70-4b6c-a8cb-6923e62968b0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f94e86c5-8e70-4b6c-a8cb-6923e62968b0\") " pod="openstack/glance-default-external-api-0" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.624700 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0362b08c-e18c-45cc-a155-af3775390c3b-ceph\") pod \"glance-default-internal-api-0\" (UID: \"0362b08c-e18c-45cc-a155-af3775390c3b\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.633212 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gslqp\" (UniqueName: \"kubernetes.io/projected/f94e86c5-8e70-4b6c-a8cb-6923e62968b0-kube-api-access-gslqp\") pod \"glance-default-external-api-0\" (UID: \"f94e86c5-8e70-4b6c-a8cb-6923e62968b0\") " pod="openstack/glance-default-external-api-0" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.644477 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdjjp\" (UniqueName: \"kubernetes.io/projected/0362b08c-e18c-45cc-a155-af3775390c3b-kube-api-access-zdjjp\") pod \"glance-default-internal-api-0\" (UID: \"0362b08c-e18c-45cc-a155-af3775390c3b\") " pod="openstack/glance-default-internal-api-0" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.644642 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/horizon-85658469b7-rh2sk" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.645667 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-85658469b7-rh2sk" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.691574 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.709467 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.753608 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb76270-9267-460b-80fa-810d41aeb7fb" path="/var/lib/kubelet/pods/4bb76270-9267-460b-80fa-810d41aeb7fb/volumes" Oct 08 19:54:20 crc kubenswrapper[4750]: I1008 19:54:20.754646 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67d6b591-2df8-40b3-8420-a20514693983" path="/var/lib/kubelet/pods/67d6b591-2df8-40b3-8420-a20514693983/volumes" Oct 08 19:54:21 crc kubenswrapper[4750]: I1008 19:54:21.096740 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5ff4cc4c4c-mmttf" event={"ID":"9405e185-7c2c-41fb-93d6-66bb8dcd9833","Type":"ContainerStarted","Data":"5f6adb792ae4783d735de26e5ba3aaa4f68435778fb205e26a92e37599d2999f"} Oct 08 19:54:21 crc kubenswrapper[4750]: I1008 19:54:21.097311 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5ff4cc4c4c-mmttf" podUID="9405e185-7c2c-41fb-93d6-66bb8dcd9833" containerName="horizon-log" containerID="cri-o://57ca6b1d6b3b66f7cab62d24e76d285ae92c6dd9c7f98e3b808d31f4e53a3192" gracePeriod=30 Oct 08 19:54:21 crc kubenswrapper[4750]: I1008 19:54:21.098126 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5ff4cc4c4c-mmttf" podUID="9405e185-7c2c-41fb-93d6-66bb8dcd9833" containerName="horizon" 
containerID="cri-o://5f6adb792ae4783d735de26e5ba3aaa4f68435778fb205e26a92e37599d2999f" gracePeriod=30 Oct 08 19:54:21 crc kubenswrapper[4750]: I1008 19:54:21.108174 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f5fb9b649-5qqwr" event={"ID":"54ade5e3-9ffe-4b69-b060-734b5db093e8","Type":"ContainerStarted","Data":"ceecb5fabfb571fbbeded2c97cc653ed3571f4783fa7da8ab8dd63a1fddb003f"} Oct 08 19:54:21 crc kubenswrapper[4750]: I1008 19:54:21.130052 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5ff4cc4c4c-mmttf" podStartSLOduration=3.165827865 podStartE2EDuration="11.13003204s" podCreationTimestamp="2025-10-08 19:54:10 +0000 UTC" firstStartedPulling="2025-10-08 19:54:11.542792406 +0000 UTC m=+6207.455763429" lastFinishedPulling="2025-10-08 19:54:19.506996551 +0000 UTC m=+6215.419967604" observedRunningTime="2025-10-08 19:54:21.123472077 +0000 UTC m=+6217.036443090" watchObservedRunningTime="2025-10-08 19:54:21.13003204 +0000 UTC m=+6217.043003053" Oct 08 19:54:21 crc kubenswrapper[4750]: I1008 19:54:21.371706 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-f5fb9b649-5qqwr" Oct 08 19:54:21 crc kubenswrapper[4750]: I1008 19:54:21.371767 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-f5fb9b649-5qqwr" Oct 08 19:54:21 crc kubenswrapper[4750]: I1008 19:54:21.372031 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 19:54:21 crc kubenswrapper[4750]: I1008 19:54:21.496259 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 19:54:21 crc kubenswrapper[4750]: W1008 19:54:21.512008 4750 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0362b08c_e18c_45cc_a155_af3775390c3b.slice/crio-3c4b30f592ea319d7f99ab934f316c4853ed347b6081573d14f5a4ad14ec7858 WatchSource:0}: Error finding container 3c4b30f592ea319d7f99ab934f316c4853ed347b6081573d14f5a4ad14ec7858: Status 404 returned error can't find the container with id 3c4b30f592ea319d7f99ab934f316c4853ed347b6081573d14f5a4ad14ec7858 Oct 08 19:54:22 crc kubenswrapper[4750]: I1008 19:54:22.153660 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f94e86c5-8e70-4b6c-a8cb-6923e62968b0","Type":"ContainerStarted","Data":"c14d698de0e721a9c6b87ac34a408cdd1da2c9743515ccca74d1983e1626e125"} Oct 08 19:54:22 crc kubenswrapper[4750]: I1008 19:54:22.156281 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0362b08c-e18c-45cc-a155-af3775390c3b","Type":"ContainerStarted","Data":"3c4b30f592ea319d7f99ab934f316c4853ed347b6081573d14f5a4ad14ec7858"} Oct 08 19:54:23 crc kubenswrapper[4750]: I1008 19:54:23.173399 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f94e86c5-8e70-4b6c-a8cb-6923e62968b0","Type":"ContainerStarted","Data":"cca6aa593651b2752abb8f8ad6b3c7a49a162ec191da2c189566ac6811226aff"} Oct 08 19:54:23 crc kubenswrapper[4750]: I1008 19:54:23.176607 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0362b08c-e18c-45cc-a155-af3775390c3b","Type":"ContainerStarted","Data":"42237dc23d03d1345a52ed281f8441a748c4a425aeedfae2f4e1dc6bfdce625a"} Oct 08 19:54:24 crc kubenswrapper[4750]: I1008 19:54:24.199478 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0362b08c-e18c-45cc-a155-af3775390c3b","Type":"ContainerStarted","Data":"6922f439485260bed5eadda1c0852e48b4394c549ff626321e6647f70444cc8e"} Oct 08 
19:54:24 crc kubenswrapper[4750]: I1008 19:54:24.203677 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f94e86c5-8e70-4b6c-a8cb-6923e62968b0","Type":"ContainerStarted","Data":"b08e5a99bc933b3af704c47fd368a04600deb1ead48506bc4fb2541e4d7c6043"} Oct 08 19:54:24 crc kubenswrapper[4750]: I1008 19:54:24.238605 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.238574611 podStartE2EDuration="4.238574611s" podCreationTimestamp="2025-10-08 19:54:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:54:24.230942491 +0000 UTC m=+6220.143913554" watchObservedRunningTime="2025-10-08 19:54:24.238574611 +0000 UTC m=+6220.151545654" Oct 08 19:54:24 crc kubenswrapper[4750]: I1008 19:54:24.280891 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.280865892 podStartE2EDuration="4.280865892s" podCreationTimestamp="2025-10-08 19:54:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:54:24.265040679 +0000 UTC m=+6220.178011722" watchObservedRunningTime="2025-10-08 19:54:24.280865892 +0000 UTC m=+6220.193836915" Oct 08 19:54:30 crc kubenswrapper[4750]: I1008 19:54:30.648864 4750 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-85658469b7-rh2sk" podUID="74f32caf-1070-436f-a3ee-a05ff03a9040" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.115:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.115:8080: connect: connection refused" Oct 08 19:54:30 crc kubenswrapper[4750]: I1008 19:54:30.694528 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/glance-default-external-api-0" Oct 08 19:54:30 crc kubenswrapper[4750]: I1008 19:54:30.694931 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 08 19:54:30 crc kubenswrapper[4750]: I1008 19:54:30.712952 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 08 19:54:30 crc kubenswrapper[4750]: I1008 19:54:30.713044 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 08 19:54:30 crc kubenswrapper[4750]: I1008 19:54:30.753094 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 08 19:54:30 crc kubenswrapper[4750]: I1008 19:54:30.758971 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 08 19:54:30 crc kubenswrapper[4750]: I1008 19:54:30.759059 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 08 19:54:30 crc kubenswrapper[4750]: I1008 19:54:30.767066 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 08 19:54:30 crc kubenswrapper[4750]: I1008 19:54:30.906947 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5ff4cc4c4c-mmttf" Oct 08 19:54:31 crc kubenswrapper[4750]: I1008 19:54:31.292011 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 08 19:54:31 crc kubenswrapper[4750]: I1008 19:54:31.292966 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 08 19:54:31 crc kubenswrapper[4750]: I1008 19:54:31.293056 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/glance-default-external-api-0" Oct 08 19:54:31 crc kubenswrapper[4750]: I1008 19:54:31.293119 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 08 19:54:31 crc kubenswrapper[4750]: I1008 19:54:31.375337 4750 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-f5fb9b649-5qqwr" podUID="54ade5e3-9ffe-4b69-b060-734b5db093e8" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.117:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.117:8080: connect: connection refused" Oct 08 19:54:33 crc kubenswrapper[4750]: I1008 19:54:33.588105 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 08 19:54:33 crc kubenswrapper[4750]: I1008 19:54:33.588666 4750 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 08 19:54:33 crc kubenswrapper[4750]: I1008 19:54:33.887325 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 08 19:54:34 crc kubenswrapper[4750]: I1008 19:54:34.153839 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 08 19:54:34 crc kubenswrapper[4750]: I1008 19:54:34.154449 4750 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 08 19:54:34 crc kubenswrapper[4750]: I1008 19:54:34.175783 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 08 19:54:40 crc kubenswrapper[4750]: I1008 19:54:40.645843 4750 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-85658469b7-rh2sk" podUID="74f32caf-1070-436f-a3ee-a05ff03a9040" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.115:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.115:8080: connect: 
connection refused" Oct 08 19:54:41 crc kubenswrapper[4750]: I1008 19:54:41.372408 4750 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-f5fb9b649-5qqwr" podUID="54ade5e3-9ffe-4b69-b060-734b5db093e8" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.117:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.117:8080: connect: connection refused" Oct 08 19:54:45 crc kubenswrapper[4750]: I1008 19:54:45.050350 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-pjwpg"] Oct 08 19:54:45 crc kubenswrapper[4750]: I1008 19:54:45.061198 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-pjwpg"] Oct 08 19:54:46 crc kubenswrapper[4750]: I1008 19:54:46.749021 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7698a080-0919-4c32-af78-fc68c8366657" path="/var/lib/kubelet/pods/7698a080-0919-4c32-af78-fc68c8366657/volumes" Oct 08 19:54:51 crc kubenswrapper[4750]: I1008 19:54:51.537817 4750 generic.go:334] "Generic (PLEG): container finished" podID="9405e185-7c2c-41fb-93d6-66bb8dcd9833" containerID="5f6adb792ae4783d735de26e5ba3aaa4f68435778fb205e26a92e37599d2999f" exitCode=137 Oct 08 19:54:51 crc kubenswrapper[4750]: I1008 19:54:51.538332 4750 generic.go:334] "Generic (PLEG): container finished" podID="9405e185-7c2c-41fb-93d6-66bb8dcd9833" containerID="57ca6b1d6b3b66f7cab62d24e76d285ae92c6dd9c7f98e3b808d31f4e53a3192" exitCode=137 Oct 08 19:54:51 crc kubenswrapper[4750]: I1008 19:54:51.537908 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5ff4cc4c4c-mmttf" event={"ID":"9405e185-7c2c-41fb-93d6-66bb8dcd9833","Type":"ContainerDied","Data":"5f6adb792ae4783d735de26e5ba3aaa4f68435778fb205e26a92e37599d2999f"} Oct 08 19:54:51 crc kubenswrapper[4750]: I1008 19:54:51.538381 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5ff4cc4c4c-mmttf" 
event={"ID":"9405e185-7c2c-41fb-93d6-66bb8dcd9833","Type":"ContainerDied","Data":"57ca6b1d6b3b66f7cab62d24e76d285ae92c6dd9c7f98e3b808d31f4e53a3192"} Oct 08 19:54:51 crc kubenswrapper[4750]: I1008 19:54:51.675622 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5ff4cc4c4c-mmttf" Oct 08 19:54:51 crc kubenswrapper[4750]: I1008 19:54:51.842101 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9405e185-7c2c-41fb-93d6-66bb8dcd9833-scripts\") pod \"9405e185-7c2c-41fb-93d6-66bb8dcd9833\" (UID: \"9405e185-7c2c-41fb-93d6-66bb8dcd9833\") " Oct 08 19:54:51 crc kubenswrapper[4750]: I1008 19:54:51.842158 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9405e185-7c2c-41fb-93d6-66bb8dcd9833-horizon-secret-key\") pod \"9405e185-7c2c-41fb-93d6-66bb8dcd9833\" (UID: \"9405e185-7c2c-41fb-93d6-66bb8dcd9833\") " Oct 08 19:54:51 crc kubenswrapper[4750]: I1008 19:54:51.843292 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhbgk\" (UniqueName: \"kubernetes.io/projected/9405e185-7c2c-41fb-93d6-66bb8dcd9833-kube-api-access-vhbgk\") pod \"9405e185-7c2c-41fb-93d6-66bb8dcd9833\" (UID: \"9405e185-7c2c-41fb-93d6-66bb8dcd9833\") " Oct 08 19:54:51 crc kubenswrapper[4750]: I1008 19:54:51.843415 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9405e185-7c2c-41fb-93d6-66bb8dcd9833-config-data\") pod \"9405e185-7c2c-41fb-93d6-66bb8dcd9833\" (UID: \"9405e185-7c2c-41fb-93d6-66bb8dcd9833\") " Oct 08 19:54:51 crc kubenswrapper[4750]: I1008 19:54:51.843643 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9405e185-7c2c-41fb-93d6-66bb8dcd9833-logs\") pod 
\"9405e185-7c2c-41fb-93d6-66bb8dcd9833\" (UID: \"9405e185-7c2c-41fb-93d6-66bb8dcd9833\") " Oct 08 19:54:51 crc kubenswrapper[4750]: I1008 19:54:51.845457 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9405e185-7c2c-41fb-93d6-66bb8dcd9833-logs" (OuterVolumeSpecName: "logs") pod "9405e185-7c2c-41fb-93d6-66bb8dcd9833" (UID: "9405e185-7c2c-41fb-93d6-66bb8dcd9833"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:54:51 crc kubenswrapper[4750]: I1008 19:54:51.849340 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9405e185-7c2c-41fb-93d6-66bb8dcd9833-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "9405e185-7c2c-41fb-93d6-66bb8dcd9833" (UID: "9405e185-7c2c-41fb-93d6-66bb8dcd9833"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:54:51 crc kubenswrapper[4750]: I1008 19:54:51.858887 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9405e185-7c2c-41fb-93d6-66bb8dcd9833-kube-api-access-vhbgk" (OuterVolumeSpecName: "kube-api-access-vhbgk") pod "9405e185-7c2c-41fb-93d6-66bb8dcd9833" (UID: "9405e185-7c2c-41fb-93d6-66bb8dcd9833"). InnerVolumeSpecName "kube-api-access-vhbgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:54:51 crc kubenswrapper[4750]: I1008 19:54:51.883697 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9405e185-7c2c-41fb-93d6-66bb8dcd9833-scripts" (OuterVolumeSpecName: "scripts") pod "9405e185-7c2c-41fb-93d6-66bb8dcd9833" (UID: "9405e185-7c2c-41fb-93d6-66bb8dcd9833"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:54:51 crc kubenswrapper[4750]: I1008 19:54:51.896485 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9405e185-7c2c-41fb-93d6-66bb8dcd9833-config-data" (OuterVolumeSpecName: "config-data") pod "9405e185-7c2c-41fb-93d6-66bb8dcd9833" (UID: "9405e185-7c2c-41fb-93d6-66bb8dcd9833"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:54:51 crc kubenswrapper[4750]: I1008 19:54:51.945737 4750 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9405e185-7c2c-41fb-93d6-66bb8dcd9833-logs\") on node \"crc\" DevicePath \"\"" Oct 08 19:54:51 crc kubenswrapper[4750]: I1008 19:54:51.945773 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9405e185-7c2c-41fb-93d6-66bb8dcd9833-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 19:54:51 crc kubenswrapper[4750]: I1008 19:54:51.945782 4750 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9405e185-7c2c-41fb-93d6-66bb8dcd9833-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 08 19:54:51 crc kubenswrapper[4750]: I1008 19:54:51.945795 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhbgk\" (UniqueName: \"kubernetes.io/projected/9405e185-7c2c-41fb-93d6-66bb8dcd9833-kube-api-access-vhbgk\") on node \"crc\" DevicePath \"\"" Oct 08 19:54:51 crc kubenswrapper[4750]: I1008 19:54:51.945804 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9405e185-7c2c-41fb-93d6-66bb8dcd9833-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 19:54:52 crc kubenswrapper[4750]: I1008 19:54:52.552445 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5ff4cc4c4c-mmttf" 
event={"ID":"9405e185-7c2c-41fb-93d6-66bb8dcd9833","Type":"ContainerDied","Data":"9e540239a21bc7067754e5096647ee7afddb6942828dcb3429d9d357c0d5ce0b"} Oct 08 19:54:52 crc kubenswrapper[4750]: I1008 19:54:52.552716 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5ff4cc4c4c-mmttf" Oct 08 19:54:52 crc kubenswrapper[4750]: I1008 19:54:52.553727 4750 scope.go:117] "RemoveContainer" containerID="5f6adb792ae4783d735de26e5ba3aaa4f68435778fb205e26a92e37599d2999f" Oct 08 19:54:52 crc kubenswrapper[4750]: I1008 19:54:52.620283 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5ff4cc4c4c-mmttf"] Oct 08 19:54:52 crc kubenswrapper[4750]: I1008 19:54:52.635648 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5ff4cc4c4c-mmttf"] Oct 08 19:54:52 crc kubenswrapper[4750]: I1008 19:54:52.755375 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9405e185-7c2c-41fb-93d6-66bb8dcd9833" path="/var/lib/kubelet/pods/9405e185-7c2c-41fb-93d6-66bb8dcd9833/volumes" Oct 08 19:54:52 crc kubenswrapper[4750]: I1008 19:54:52.765218 4750 scope.go:117] "RemoveContainer" containerID="57ca6b1d6b3b66f7cab62d24e76d285ae92c6dd9c7f98e3b808d31f4e53a3192" Oct 08 19:54:52 crc kubenswrapper[4750]: I1008 19:54:52.908359 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-85658469b7-rh2sk" Oct 08 19:54:53 crc kubenswrapper[4750]: I1008 19:54:53.633889 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-f5fb9b649-5qqwr" Oct 08 19:54:54 crc kubenswrapper[4750]: I1008 19:54:54.549364 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-85658469b7-rh2sk" Oct 08 19:54:55 crc kubenswrapper[4750]: I1008 19:54:55.064770 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-d0fe-account-create-rj8v8"] Oct 08 19:54:55 crc 
kubenswrapper[4750]: I1008 19:54:55.096069 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-d0fe-account-create-rj8v8"] Oct 08 19:54:55 crc kubenswrapper[4750]: I1008 19:54:55.728128 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-f5fb9b649-5qqwr" Oct 08 19:54:55 crc kubenswrapper[4750]: I1008 19:54:55.814115 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-85658469b7-rh2sk"] Oct 08 19:54:55 crc kubenswrapper[4750]: I1008 19:54:55.814818 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-85658469b7-rh2sk" podUID="74f32caf-1070-436f-a3ee-a05ff03a9040" containerName="horizon-log" containerID="cri-o://c6e5ff61c0426b7591a31211ca85580e1cc288deb65add3a05fbb63c4413c8f0" gracePeriod=30 Oct 08 19:54:55 crc kubenswrapper[4750]: I1008 19:54:55.815300 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-85658469b7-rh2sk" podUID="74f32caf-1070-436f-a3ee-a05ff03a9040" containerName="horizon" containerID="cri-o://3e228801beb9753c1caa344c3eaec38d8e1a6e81c93a6ad8d4bfb101d1cbfb74" gracePeriod=30 Oct 08 19:54:56 crc kubenswrapper[4750]: I1008 19:54:56.748344 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5b608fd-2863-4703-8c27-d35888393aeb" path="/var/lib/kubelet/pods/a5b608fd-2863-4703-8c27-d35888393aeb/volumes" Oct 08 19:54:59 crc kubenswrapper[4750]: I1008 19:54:59.657384 4750 generic.go:334] "Generic (PLEG): container finished" podID="74f32caf-1070-436f-a3ee-a05ff03a9040" containerID="3e228801beb9753c1caa344c3eaec38d8e1a6e81c93a6ad8d4bfb101d1cbfb74" exitCode=0 Oct 08 19:54:59 crc kubenswrapper[4750]: I1008 19:54:59.657499 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85658469b7-rh2sk" 
event={"ID":"74f32caf-1070-436f-a3ee-a05ff03a9040","Type":"ContainerDied","Data":"3e228801beb9753c1caa344c3eaec38d8e1a6e81c93a6ad8d4bfb101d1cbfb74"} Oct 08 19:55:00 crc kubenswrapper[4750]: I1008 19:55:00.645730 4750 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-85658469b7-rh2sk" podUID="74f32caf-1070-436f-a3ee-a05ff03a9040" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.115:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.115:8080: connect: connection refused" Oct 08 19:55:02 crc kubenswrapper[4750]: I1008 19:55:02.057240 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-69hsp"] Oct 08 19:55:02 crc kubenswrapper[4750]: I1008 19:55:02.070084 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-69hsp"] Oct 08 19:55:02 crc kubenswrapper[4750]: I1008 19:55:02.748853 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee" path="/var/lib/kubelet/pods/aadb2ce3-1c9b-4af1-b4ad-3e310b2778ee/volumes" Oct 08 19:55:05 crc kubenswrapper[4750]: I1008 19:55:05.656584 4750 scope.go:117] "RemoveContainer" containerID="39994094c13130c8f2f1ef3a4a7bb987fa44919de60ae130958a98428ffe21d6" Oct 08 19:55:05 crc kubenswrapper[4750]: I1008 19:55:05.689611 4750 scope.go:117] "RemoveContainer" containerID="05c29c0c6dce72fe94e80fbd924aee24534ee0f5b345fe31cae06418b982fa28" Oct 08 19:55:05 crc kubenswrapper[4750]: I1008 19:55:05.754826 4750 scope.go:117] "RemoveContainer" containerID="b1ea89a72a627d646a4a35429444710d584da64d9074c7d079cb546b47a28223" Oct 08 19:55:05 crc kubenswrapper[4750]: I1008 19:55:05.816854 4750 scope.go:117] "RemoveContainer" containerID="2e25d2a914f228a8b6476c01c56486648c14db30e51db7a10c490d676f3928e0" Oct 08 19:55:05 crc kubenswrapper[4750]: I1008 19:55:05.902114 4750 scope.go:117] "RemoveContainer" 
containerID="877da3de27e88bdb0a05e13b6550d49dd3a15973b1604f80d0fd3a6f246d0243" Oct 08 19:55:10 crc kubenswrapper[4750]: I1008 19:55:10.645865 4750 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-85658469b7-rh2sk" podUID="74f32caf-1070-436f-a3ee-a05ff03a9040" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.115:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.115:8080: connect: connection refused" Oct 08 19:55:20 crc kubenswrapper[4750]: I1008 19:55:20.644960 4750 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-85658469b7-rh2sk" podUID="74f32caf-1070-436f-a3ee-a05ff03a9040" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.115:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.115:8080: connect: connection refused" Oct 08 19:55:20 crc kubenswrapper[4750]: I1008 19:55:20.645815 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-85658469b7-rh2sk" Oct 08 19:55:26 crc kubenswrapper[4750]: I1008 19:55:26.003151 4750 generic.go:334] "Generic (PLEG): container finished" podID="74f32caf-1070-436f-a3ee-a05ff03a9040" containerID="c6e5ff61c0426b7591a31211ca85580e1cc288deb65add3a05fbb63c4413c8f0" exitCode=137 Oct 08 19:55:26 crc kubenswrapper[4750]: I1008 19:55:26.003247 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85658469b7-rh2sk" event={"ID":"74f32caf-1070-436f-a3ee-a05ff03a9040","Type":"ContainerDied","Data":"c6e5ff61c0426b7591a31211ca85580e1cc288deb65add3a05fbb63c4413c8f0"} Oct 08 19:55:26 crc kubenswrapper[4750]: I1008 19:55:26.317738 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-85658469b7-rh2sk" Oct 08 19:55:26 crc kubenswrapper[4750]: I1008 19:55:26.441524 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6c6j\" (UniqueName: \"kubernetes.io/projected/74f32caf-1070-436f-a3ee-a05ff03a9040-kube-api-access-x6c6j\") pod \"74f32caf-1070-436f-a3ee-a05ff03a9040\" (UID: \"74f32caf-1070-436f-a3ee-a05ff03a9040\") " Oct 08 19:55:26 crc kubenswrapper[4750]: I1008 19:55:26.441671 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/74f32caf-1070-436f-a3ee-a05ff03a9040-config-data\") pod \"74f32caf-1070-436f-a3ee-a05ff03a9040\" (UID: \"74f32caf-1070-436f-a3ee-a05ff03a9040\") " Oct 08 19:55:26 crc kubenswrapper[4750]: I1008 19:55:26.441776 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74f32caf-1070-436f-a3ee-a05ff03a9040-logs\") pod \"74f32caf-1070-436f-a3ee-a05ff03a9040\" (UID: \"74f32caf-1070-436f-a3ee-a05ff03a9040\") " Oct 08 19:55:26 crc kubenswrapper[4750]: I1008 19:55:26.441944 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/74f32caf-1070-436f-a3ee-a05ff03a9040-scripts\") pod \"74f32caf-1070-436f-a3ee-a05ff03a9040\" (UID: \"74f32caf-1070-436f-a3ee-a05ff03a9040\") " Oct 08 19:55:26 crc kubenswrapper[4750]: I1008 19:55:26.442025 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/74f32caf-1070-436f-a3ee-a05ff03a9040-horizon-secret-key\") pod \"74f32caf-1070-436f-a3ee-a05ff03a9040\" (UID: \"74f32caf-1070-436f-a3ee-a05ff03a9040\") " Oct 08 19:55:26 crc kubenswrapper[4750]: I1008 19:55:26.443421 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/74f32caf-1070-436f-a3ee-a05ff03a9040-logs" (OuterVolumeSpecName: "logs") pod "74f32caf-1070-436f-a3ee-a05ff03a9040" (UID: "74f32caf-1070-436f-a3ee-a05ff03a9040"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:55:26 crc kubenswrapper[4750]: I1008 19:55:26.449473 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74f32caf-1070-436f-a3ee-a05ff03a9040-kube-api-access-x6c6j" (OuterVolumeSpecName: "kube-api-access-x6c6j") pod "74f32caf-1070-436f-a3ee-a05ff03a9040" (UID: "74f32caf-1070-436f-a3ee-a05ff03a9040"). InnerVolumeSpecName "kube-api-access-x6c6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:55:26 crc kubenswrapper[4750]: I1008 19:55:26.450104 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74f32caf-1070-436f-a3ee-a05ff03a9040-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "74f32caf-1070-436f-a3ee-a05ff03a9040" (UID: "74f32caf-1070-436f-a3ee-a05ff03a9040"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:55:26 crc kubenswrapper[4750]: I1008 19:55:26.482489 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74f32caf-1070-436f-a3ee-a05ff03a9040-scripts" (OuterVolumeSpecName: "scripts") pod "74f32caf-1070-436f-a3ee-a05ff03a9040" (UID: "74f32caf-1070-436f-a3ee-a05ff03a9040"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:55:26 crc kubenswrapper[4750]: I1008 19:55:26.501692 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74f32caf-1070-436f-a3ee-a05ff03a9040-config-data" (OuterVolumeSpecName: "config-data") pod "74f32caf-1070-436f-a3ee-a05ff03a9040" (UID: "74f32caf-1070-436f-a3ee-a05ff03a9040"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:55:26 crc kubenswrapper[4750]: I1008 19:55:26.545401 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6c6j\" (UniqueName: \"kubernetes.io/projected/74f32caf-1070-436f-a3ee-a05ff03a9040-kube-api-access-x6c6j\") on node \"crc\" DevicePath \"\"" Oct 08 19:55:26 crc kubenswrapper[4750]: I1008 19:55:26.545447 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/74f32caf-1070-436f-a3ee-a05ff03a9040-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 19:55:26 crc kubenswrapper[4750]: I1008 19:55:26.545459 4750 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74f32caf-1070-436f-a3ee-a05ff03a9040-logs\") on node \"crc\" DevicePath \"\"" Oct 08 19:55:26 crc kubenswrapper[4750]: I1008 19:55:26.545469 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/74f32caf-1070-436f-a3ee-a05ff03a9040-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 19:55:26 crc kubenswrapper[4750]: I1008 19:55:26.545481 4750 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/74f32caf-1070-436f-a3ee-a05ff03a9040-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 08 19:55:27 crc kubenswrapper[4750]: I1008 19:55:27.018186 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85658469b7-rh2sk" event={"ID":"74f32caf-1070-436f-a3ee-a05ff03a9040","Type":"ContainerDied","Data":"28710c9b135b16929c8ff8eefb7cd0f97fc323f2b83d543df90bfca9ffb180b4"} Oct 08 19:55:27 crc kubenswrapper[4750]: I1008 19:55:27.018243 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-85658469b7-rh2sk" Oct 08 19:55:27 crc kubenswrapper[4750]: I1008 19:55:27.020237 4750 scope.go:117] "RemoveContainer" containerID="3e228801beb9753c1caa344c3eaec38d8e1a6e81c93a6ad8d4bfb101d1cbfb74" Oct 08 19:55:27 crc kubenswrapper[4750]: I1008 19:55:27.048258 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-85658469b7-rh2sk"] Oct 08 19:55:27 crc kubenswrapper[4750]: I1008 19:55:27.056475 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-85658469b7-rh2sk"] Oct 08 19:55:27 crc kubenswrapper[4750]: I1008 19:55:27.231649 4750 scope.go:117] "RemoveContainer" containerID="c6e5ff61c0426b7591a31211ca85580e1cc288deb65add3a05fbb63c4413c8f0" Oct 08 19:55:28 crc kubenswrapper[4750]: I1008 19:55:28.752839 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74f32caf-1070-436f-a3ee-a05ff03a9040" path="/var/lib/kubelet/pods/74f32caf-1070-436f-a3ee-a05ff03a9040/volumes" Oct 08 19:55:29 crc kubenswrapper[4750]: I1008 19:55:29.707252 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 19:55:29 crc kubenswrapper[4750]: I1008 19:55:29.707346 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 19:55:59 crc kubenswrapper[4750]: I1008 19:55:59.707517 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 19:55:59 crc kubenswrapper[4750]: I1008 19:55:59.708381 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 19:56:02 crc kubenswrapper[4750]: I1008 19:56:02.074446 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-sk9r7"] Oct 08 19:56:02 crc kubenswrapper[4750]: I1008 19:56:02.083344 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-gxx8b"] Oct 08 19:56:02 crc kubenswrapper[4750]: I1008 19:56:02.091116 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-gxx8b"] Oct 08 19:56:02 crc kubenswrapper[4750]: I1008 19:56:02.099300 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-sk9r7"] Oct 08 19:56:02 crc kubenswrapper[4750]: I1008 19:56:02.754953 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13f76397-084b-4a8b-ba05-8e8f819fcc7f" path="/var/lib/kubelet/pods/13f76397-084b-4a8b-ba05-8e8f819fcc7f/volumes" Oct 08 19:56:02 crc kubenswrapper[4750]: I1008 19:56:02.755963 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8288f7a-ec5f-40eb-bff9-25d97137f1b1" path="/var/lib/kubelet/pods/d8288f7a-ec5f-40eb-bff9-25d97137f1b1/volumes" Oct 08 19:56:03 crc kubenswrapper[4750]: I1008 19:56:03.036349 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-9flkn"] Oct 08 19:56:03 crc kubenswrapper[4750]: I1008 19:56:03.051685 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-9flkn"] Oct 08 19:56:03 crc 
kubenswrapper[4750]: I1008 19:56:03.275424 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7b7fc4d7fc-bhdm6"] Oct 08 19:56:03 crc kubenswrapper[4750]: E1008 19:56:03.276134 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74f32caf-1070-436f-a3ee-a05ff03a9040" containerName="horizon-log" Oct 08 19:56:03 crc kubenswrapper[4750]: I1008 19:56:03.276162 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="74f32caf-1070-436f-a3ee-a05ff03a9040" containerName="horizon-log" Oct 08 19:56:03 crc kubenswrapper[4750]: E1008 19:56:03.276204 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9405e185-7c2c-41fb-93d6-66bb8dcd9833" containerName="horizon" Oct 08 19:56:03 crc kubenswrapper[4750]: I1008 19:56:03.276217 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="9405e185-7c2c-41fb-93d6-66bb8dcd9833" containerName="horizon" Oct 08 19:56:03 crc kubenswrapper[4750]: E1008 19:56:03.276244 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9405e185-7c2c-41fb-93d6-66bb8dcd9833" containerName="horizon-log" Oct 08 19:56:03 crc kubenswrapper[4750]: I1008 19:56:03.276257 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="9405e185-7c2c-41fb-93d6-66bb8dcd9833" containerName="horizon-log" Oct 08 19:56:03 crc kubenswrapper[4750]: E1008 19:56:03.276331 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74f32caf-1070-436f-a3ee-a05ff03a9040" containerName="horizon" Oct 08 19:56:03 crc kubenswrapper[4750]: I1008 19:56:03.276344 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="74f32caf-1070-436f-a3ee-a05ff03a9040" containerName="horizon" Oct 08 19:56:03 crc kubenswrapper[4750]: I1008 19:56:03.277057 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="74f32caf-1070-436f-a3ee-a05ff03a9040" containerName="horizon" Oct 08 19:56:03 crc kubenswrapper[4750]: I1008 19:56:03.277075 4750 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="9405e185-7c2c-41fb-93d6-66bb8dcd9833" containerName="horizon-log" Oct 08 19:56:03 crc kubenswrapper[4750]: I1008 19:56:03.277100 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="74f32caf-1070-436f-a3ee-a05ff03a9040" containerName="horizon-log" Oct 08 19:56:03 crc kubenswrapper[4750]: I1008 19:56:03.277117 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="9405e185-7c2c-41fb-93d6-66bb8dcd9833" containerName="horizon" Oct 08 19:56:03 crc kubenswrapper[4750]: I1008 19:56:03.278324 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b7fc4d7fc-bhdm6" Oct 08 19:56:03 crc kubenswrapper[4750]: I1008 19:56:03.291209 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7b7fc4d7fc-bhdm6"] Oct 08 19:56:03 crc kubenswrapper[4750]: I1008 19:56:03.398331 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9530c511-132e-4075-910e-a8e6606fe282-scripts\") pod \"horizon-7b7fc4d7fc-bhdm6\" (UID: \"9530c511-132e-4075-910e-a8e6606fe282\") " pod="openstack/horizon-7b7fc4d7fc-bhdm6" Oct 08 19:56:03 crc kubenswrapper[4750]: I1008 19:56:03.398545 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9530c511-132e-4075-910e-a8e6606fe282-config-data\") pod \"horizon-7b7fc4d7fc-bhdm6\" (UID: \"9530c511-132e-4075-910e-a8e6606fe282\") " pod="openstack/horizon-7b7fc4d7fc-bhdm6" Oct 08 19:56:03 crc kubenswrapper[4750]: I1008 19:56:03.398844 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rlhg\" (UniqueName: \"kubernetes.io/projected/9530c511-132e-4075-910e-a8e6606fe282-kube-api-access-7rlhg\") pod \"horizon-7b7fc4d7fc-bhdm6\" (UID: \"9530c511-132e-4075-910e-a8e6606fe282\") " 
pod="openstack/horizon-7b7fc4d7fc-bhdm6" Oct 08 19:56:03 crc kubenswrapper[4750]: I1008 19:56:03.398887 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9530c511-132e-4075-910e-a8e6606fe282-horizon-secret-key\") pod \"horizon-7b7fc4d7fc-bhdm6\" (UID: \"9530c511-132e-4075-910e-a8e6606fe282\") " pod="openstack/horizon-7b7fc4d7fc-bhdm6" Oct 08 19:56:03 crc kubenswrapper[4750]: I1008 19:56:03.399061 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9530c511-132e-4075-910e-a8e6606fe282-logs\") pod \"horizon-7b7fc4d7fc-bhdm6\" (UID: \"9530c511-132e-4075-910e-a8e6606fe282\") " pod="openstack/horizon-7b7fc4d7fc-bhdm6" Oct 08 19:56:03 crc kubenswrapper[4750]: I1008 19:56:03.500810 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rlhg\" (UniqueName: \"kubernetes.io/projected/9530c511-132e-4075-910e-a8e6606fe282-kube-api-access-7rlhg\") pod \"horizon-7b7fc4d7fc-bhdm6\" (UID: \"9530c511-132e-4075-910e-a8e6606fe282\") " pod="openstack/horizon-7b7fc4d7fc-bhdm6" Oct 08 19:56:03 crc kubenswrapper[4750]: I1008 19:56:03.500872 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9530c511-132e-4075-910e-a8e6606fe282-horizon-secret-key\") pod \"horizon-7b7fc4d7fc-bhdm6\" (UID: \"9530c511-132e-4075-910e-a8e6606fe282\") " pod="openstack/horizon-7b7fc4d7fc-bhdm6" Oct 08 19:56:03 crc kubenswrapper[4750]: I1008 19:56:03.500927 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9530c511-132e-4075-910e-a8e6606fe282-logs\") pod \"horizon-7b7fc4d7fc-bhdm6\" (UID: \"9530c511-132e-4075-910e-a8e6606fe282\") " pod="openstack/horizon-7b7fc4d7fc-bhdm6" Oct 08 19:56:03 crc 
kubenswrapper[4750]: I1008 19:56:03.501035 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9530c511-132e-4075-910e-a8e6606fe282-scripts\") pod \"horizon-7b7fc4d7fc-bhdm6\" (UID: \"9530c511-132e-4075-910e-a8e6606fe282\") " pod="openstack/horizon-7b7fc4d7fc-bhdm6" Oct 08 19:56:03 crc kubenswrapper[4750]: I1008 19:56:03.501109 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9530c511-132e-4075-910e-a8e6606fe282-config-data\") pod \"horizon-7b7fc4d7fc-bhdm6\" (UID: \"9530c511-132e-4075-910e-a8e6606fe282\") " pod="openstack/horizon-7b7fc4d7fc-bhdm6" Oct 08 19:56:03 crc kubenswrapper[4750]: I1008 19:56:03.501998 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9530c511-132e-4075-910e-a8e6606fe282-logs\") pod \"horizon-7b7fc4d7fc-bhdm6\" (UID: \"9530c511-132e-4075-910e-a8e6606fe282\") " pod="openstack/horizon-7b7fc4d7fc-bhdm6" Oct 08 19:56:03 crc kubenswrapper[4750]: I1008 19:56:03.503106 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9530c511-132e-4075-910e-a8e6606fe282-config-data\") pod \"horizon-7b7fc4d7fc-bhdm6\" (UID: \"9530c511-132e-4075-910e-a8e6606fe282\") " pod="openstack/horizon-7b7fc4d7fc-bhdm6" Oct 08 19:56:03 crc kubenswrapper[4750]: I1008 19:56:03.503782 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9530c511-132e-4075-910e-a8e6606fe282-scripts\") pod \"horizon-7b7fc4d7fc-bhdm6\" (UID: \"9530c511-132e-4075-910e-a8e6606fe282\") " pod="openstack/horizon-7b7fc4d7fc-bhdm6" Oct 08 19:56:03 crc kubenswrapper[4750]: I1008 19:56:03.510446 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/9530c511-132e-4075-910e-a8e6606fe282-horizon-secret-key\") pod \"horizon-7b7fc4d7fc-bhdm6\" (UID: \"9530c511-132e-4075-910e-a8e6606fe282\") " pod="openstack/horizon-7b7fc4d7fc-bhdm6"
Oct 08 19:56:03 crc kubenswrapper[4750]: I1008 19:56:03.525246 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rlhg\" (UniqueName: \"kubernetes.io/projected/9530c511-132e-4075-910e-a8e6606fe282-kube-api-access-7rlhg\") pod \"horizon-7b7fc4d7fc-bhdm6\" (UID: \"9530c511-132e-4075-910e-a8e6606fe282\") " pod="openstack/horizon-7b7fc4d7fc-bhdm6"
Oct 08 19:56:03 crc kubenswrapper[4750]: I1008 19:56:03.608017 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b7fc4d7fc-bhdm6"
Oct 08 19:56:04 crc kubenswrapper[4750]: I1008 19:56:04.125584 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7b7fc4d7fc-bhdm6"]
Oct 08 19:56:04 crc kubenswrapper[4750]: I1008 19:56:04.494991 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b7fc4d7fc-bhdm6" event={"ID":"9530c511-132e-4075-910e-a8e6606fe282","Type":"ContainerStarted","Data":"f3c2d0a33043d3055f69e753fcaf679565a7907f3d8ac90fbba8d232c2a9217c"}
Oct 08 19:56:04 crc kubenswrapper[4750]: I1008 19:56:04.495048 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b7fc4d7fc-bhdm6" event={"ID":"9530c511-132e-4075-910e-a8e6606fe282","Type":"ContainerStarted","Data":"c899936d237ff808d69ad3b62c0752648fdc8b2385ea51352d35da1e7d02f778"}
Oct 08 19:56:04 crc kubenswrapper[4750]: I1008 19:56:04.495061 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b7fc4d7fc-bhdm6" event={"ID":"9530c511-132e-4075-910e-a8e6606fe282","Type":"ContainerStarted","Data":"c7bce95dc46456acb658916aefe24bba6c5b9540f4334eb10d8162ff2c5086c8"}
Oct 08 19:56:04 crc kubenswrapper[4750]: I1008 19:56:04.521318 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7b7fc4d7fc-bhdm6" podStartSLOduration=1.521296722 podStartE2EDuration="1.521296722s" podCreationTimestamp="2025-10-08 19:56:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:56:04.516904273 +0000 UTC m=+6320.429875286" watchObservedRunningTime="2025-10-08 19:56:04.521296722 +0000 UTC m=+6320.434267735"
Oct 08 19:56:04 crc kubenswrapper[4750]: I1008 19:56:04.750312 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a5bf2cb-af64-48d8-bad7-1f32371f7f55" path="/var/lib/kubelet/pods/7a5bf2cb-af64-48d8-bad7-1f32371f7f55/volumes"
Oct 08 19:56:04 crc kubenswrapper[4750]: I1008 19:56:04.768727 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-sxssl"]
Oct 08 19:56:04 crc kubenswrapper[4750]: I1008 19:56:04.770713 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-sxssl"
Oct 08 19:56:04 crc kubenswrapper[4750]: I1008 19:56:04.775794 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-sxssl"]
Oct 08 19:56:04 crc kubenswrapper[4750]: I1008 19:56:04.834128 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmvrp\" (UniqueName: \"kubernetes.io/projected/082cee4e-0e68-4a13-86ec-8f2118bb14e3-kube-api-access-zmvrp\") pod \"heat-db-create-sxssl\" (UID: \"082cee4e-0e68-4a13-86ec-8f2118bb14e3\") " pod="openstack/heat-db-create-sxssl"
Oct 08 19:56:04 crc kubenswrapper[4750]: I1008 19:56:04.936670 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmvrp\" (UniqueName: \"kubernetes.io/projected/082cee4e-0e68-4a13-86ec-8f2118bb14e3-kube-api-access-zmvrp\") pod \"heat-db-create-sxssl\" (UID: \"082cee4e-0e68-4a13-86ec-8f2118bb14e3\") " pod="openstack/heat-db-create-sxssl"
Oct 08 19:56:04 crc kubenswrapper[4750]: I1008 19:56:04.965399 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmvrp\" (UniqueName: \"kubernetes.io/projected/082cee4e-0e68-4a13-86ec-8f2118bb14e3-kube-api-access-zmvrp\") pod \"heat-db-create-sxssl\" (UID: \"082cee4e-0e68-4a13-86ec-8f2118bb14e3\") " pod="openstack/heat-db-create-sxssl"
Oct 08 19:56:05 crc kubenswrapper[4750]: I1008 19:56:05.106643 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-sxssl"
Oct 08 19:56:05 crc kubenswrapper[4750]: I1008 19:56:05.667087 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-sxssl"]
Oct 08 19:56:06 crc kubenswrapper[4750]: I1008 19:56:06.164078 4750 scope.go:117] "RemoveContainer" containerID="d366321564a926e3aad97d53b52ac89757b5bae7dedda73a052135c8db69116c"
Oct 08 19:56:06 crc kubenswrapper[4750]: I1008 19:56:06.204655 4750 scope.go:117] "RemoveContainer" containerID="7cc9359ec838553026bf6121708fa2ce6467821a82ed8550c0f6d28d4bd0ae14"
Oct 08 19:56:06 crc kubenswrapper[4750]: I1008 19:56:06.246214 4750 scope.go:117] "RemoveContainer" containerID="d54612685bf264478b11d6f2878fbde935d91a8b0ce838303bce27b8c6261a15"
Oct 08 19:56:06 crc kubenswrapper[4750]: I1008 19:56:06.528653 4750 generic.go:334] "Generic (PLEG): container finished" podID="082cee4e-0e68-4a13-86ec-8f2118bb14e3" containerID="05c9c0a3127e55228069fb71842668376fdc9e90cc7985cc7dea1aa293866035" exitCode=0
Oct 08 19:56:06 crc kubenswrapper[4750]: I1008 19:56:06.528715 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-sxssl" event={"ID":"082cee4e-0e68-4a13-86ec-8f2118bb14e3","Type":"ContainerDied","Data":"05c9c0a3127e55228069fb71842668376fdc9e90cc7985cc7dea1aa293866035"}
Oct 08 19:56:06 crc kubenswrapper[4750]: I1008 19:56:06.528793 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-sxssl" event={"ID":"082cee4e-0e68-4a13-86ec-8f2118bb14e3","Type":"ContainerStarted","Data":"491007fe05766bd2c952a72eb42e22f6d67f5c3d46cfdfcc839120e1ae922006"}
Oct 08 19:56:07 crc kubenswrapper[4750]: I1008 19:56:07.904688 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-sxssl"
Oct 08 19:56:08 crc kubenswrapper[4750]: I1008 19:56:08.007717 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmvrp\" (UniqueName: \"kubernetes.io/projected/082cee4e-0e68-4a13-86ec-8f2118bb14e3-kube-api-access-zmvrp\") pod \"082cee4e-0e68-4a13-86ec-8f2118bb14e3\" (UID: \"082cee4e-0e68-4a13-86ec-8f2118bb14e3\") "
Oct 08 19:56:08 crc kubenswrapper[4750]: I1008 19:56:08.016307 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/082cee4e-0e68-4a13-86ec-8f2118bb14e3-kube-api-access-zmvrp" (OuterVolumeSpecName: "kube-api-access-zmvrp") pod "082cee4e-0e68-4a13-86ec-8f2118bb14e3" (UID: "082cee4e-0e68-4a13-86ec-8f2118bb14e3"). InnerVolumeSpecName "kube-api-access-zmvrp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 19:56:08 crc kubenswrapper[4750]: I1008 19:56:08.111298 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmvrp\" (UniqueName: \"kubernetes.io/projected/082cee4e-0e68-4a13-86ec-8f2118bb14e3-kube-api-access-zmvrp\") on node \"crc\" DevicePath \"\""
Oct 08 19:56:08 crc kubenswrapper[4750]: I1008 19:56:08.553487 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-sxssl" event={"ID":"082cee4e-0e68-4a13-86ec-8f2118bb14e3","Type":"ContainerDied","Data":"491007fe05766bd2c952a72eb42e22f6d67f5c3d46cfdfcc839120e1ae922006"}
Oct 08 19:56:08 crc kubenswrapper[4750]: I1008 19:56:08.553532 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="491007fe05766bd2c952a72eb42e22f6d67f5c3d46cfdfcc839120e1ae922006"
Oct 08 19:56:08 crc kubenswrapper[4750]: I1008 19:56:08.553620 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-sxssl"
Oct 08 19:56:13 crc kubenswrapper[4750]: I1008 19:56:13.609030 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7b7fc4d7fc-bhdm6"
Oct 08 19:56:13 crc kubenswrapper[4750]: I1008 19:56:13.611022 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7b7fc4d7fc-bhdm6"
Oct 08 19:56:14 crc kubenswrapper[4750]: I1008 19:56:14.041802 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-d413-account-create-4crz5"]
Oct 08 19:56:14 crc kubenswrapper[4750]: I1008 19:56:14.058754 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-d413-account-create-4crz5"]
Oct 08 19:56:14 crc kubenswrapper[4750]: I1008 19:56:14.070434 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-9f8d-account-create-vlx88"]
Oct 08 19:56:14 crc kubenswrapper[4750]: I1008 19:56:14.080317 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-4e16-account-create-t7vrg"]
Oct 08 19:56:14 crc kubenswrapper[4750]: I1008 19:56:14.091503 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-9f8d-account-create-vlx88"]
Oct 08 19:56:14 crc kubenswrapper[4750]: I1008 19:56:14.103241 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-4e16-account-create-t7vrg"]
Oct 08 19:56:14 crc kubenswrapper[4750]: I1008 19:56:14.771457 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36644d4b-9c0f-4d40-bea3-eab9bab01579" path="/var/lib/kubelet/pods/36644d4b-9c0f-4d40-bea3-eab9bab01579/volumes"
Oct 08 19:56:14 crc kubenswrapper[4750]: I1008 19:56:14.772517 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82614568-a654-433f-8008-026d4e9b1951" path="/var/lib/kubelet/pods/82614568-a654-433f-8008-026d4e9b1951/volumes"
Oct 08 19:56:14 crc kubenswrapper[4750]: I1008 19:56:14.774321 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e7843c8-fa26-46fb-a193-ee70486938ce" path="/var/lib/kubelet/pods/8e7843c8-fa26-46fb-a193-ee70486938ce/volumes"
Oct 08 19:56:14 crc kubenswrapper[4750]: I1008 19:56:14.913924 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-4b7c-account-create-l4mdz"]
Oct 08 19:56:14 crc kubenswrapper[4750]: E1008 19:56:14.914948 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="082cee4e-0e68-4a13-86ec-8f2118bb14e3" containerName="mariadb-database-create"
Oct 08 19:56:14 crc kubenswrapper[4750]: I1008 19:56:14.914969 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="082cee4e-0e68-4a13-86ec-8f2118bb14e3" containerName="mariadb-database-create"
Oct 08 19:56:14 crc kubenswrapper[4750]: I1008 19:56:14.915226 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="082cee4e-0e68-4a13-86ec-8f2118bb14e3" containerName="mariadb-database-create"
Oct 08 19:56:14 crc kubenswrapper[4750]: I1008 19:56:14.916098 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-4b7c-account-create-l4mdz"
Oct 08 19:56:14 crc kubenswrapper[4750]: I1008 19:56:14.919848 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret"
Oct 08 19:56:14 crc kubenswrapper[4750]: I1008 19:56:14.953143 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-4b7c-account-create-l4mdz"]
Oct 08 19:56:15 crc kubenswrapper[4750]: I1008 19:56:15.084495 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfkhr\" (UniqueName: \"kubernetes.io/projected/97570253-a79c-4ede-a440-3592f37223ee-kube-api-access-xfkhr\") pod \"heat-4b7c-account-create-l4mdz\" (UID: \"97570253-a79c-4ede-a440-3592f37223ee\") " pod="openstack/heat-4b7c-account-create-l4mdz"
Oct 08 19:56:15 crc kubenswrapper[4750]: I1008 19:56:15.190062 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfkhr\" (UniqueName: \"kubernetes.io/projected/97570253-a79c-4ede-a440-3592f37223ee-kube-api-access-xfkhr\") pod \"heat-4b7c-account-create-l4mdz\" (UID: \"97570253-a79c-4ede-a440-3592f37223ee\") " pod="openstack/heat-4b7c-account-create-l4mdz"
Oct 08 19:56:15 crc kubenswrapper[4750]: I1008 19:56:15.223028 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfkhr\" (UniqueName: \"kubernetes.io/projected/97570253-a79c-4ede-a440-3592f37223ee-kube-api-access-xfkhr\") pod \"heat-4b7c-account-create-l4mdz\" (UID: \"97570253-a79c-4ede-a440-3592f37223ee\") " pod="openstack/heat-4b7c-account-create-l4mdz"
Oct 08 19:56:15 crc kubenswrapper[4750]: I1008 19:56:15.273785 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-4b7c-account-create-l4mdz"
Oct 08 19:56:15 crc kubenswrapper[4750]: I1008 19:56:15.828344 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-4b7c-account-create-l4mdz"]
Oct 08 19:56:16 crc kubenswrapper[4750]: I1008 19:56:16.641506 4750 generic.go:334] "Generic (PLEG): container finished" podID="97570253-a79c-4ede-a440-3592f37223ee" containerID="3b1b97643fc8f00f936334ec7feec6c068e4eceb47fa4e69552de4fcbed60918" exitCode=0
Oct 08 19:56:16 crc kubenswrapper[4750]: I1008 19:56:16.641676 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-4b7c-account-create-l4mdz" event={"ID":"97570253-a79c-4ede-a440-3592f37223ee","Type":"ContainerDied","Data":"3b1b97643fc8f00f936334ec7feec6c068e4eceb47fa4e69552de4fcbed60918"}
Oct 08 19:56:16 crc kubenswrapper[4750]: I1008 19:56:16.642120 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-4b7c-account-create-l4mdz" event={"ID":"97570253-a79c-4ede-a440-3592f37223ee","Type":"ContainerStarted","Data":"6eaa9e146e1d536e29644e9de05c1e62a5c92817146db1c87be598d16928baa6"}
Oct 08 19:56:18 crc kubenswrapper[4750]: I1008 19:56:18.067057 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-4b7c-account-create-l4mdz"
Oct 08 19:56:18 crc kubenswrapper[4750]: I1008 19:56:18.175843 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfkhr\" (UniqueName: \"kubernetes.io/projected/97570253-a79c-4ede-a440-3592f37223ee-kube-api-access-xfkhr\") pod \"97570253-a79c-4ede-a440-3592f37223ee\" (UID: \"97570253-a79c-4ede-a440-3592f37223ee\") "
Oct 08 19:56:18 crc kubenswrapper[4750]: I1008 19:56:18.182224 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97570253-a79c-4ede-a440-3592f37223ee-kube-api-access-xfkhr" (OuterVolumeSpecName: "kube-api-access-xfkhr") pod "97570253-a79c-4ede-a440-3592f37223ee" (UID: "97570253-a79c-4ede-a440-3592f37223ee"). InnerVolumeSpecName "kube-api-access-xfkhr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 19:56:18 crc kubenswrapper[4750]: I1008 19:56:18.280042 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfkhr\" (UniqueName: \"kubernetes.io/projected/97570253-a79c-4ede-a440-3592f37223ee-kube-api-access-xfkhr\") on node \"crc\" DevicePath \"\""
Oct 08 19:56:18 crc kubenswrapper[4750]: I1008 19:56:18.671319 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-4b7c-account-create-l4mdz" event={"ID":"97570253-a79c-4ede-a440-3592f37223ee","Type":"ContainerDied","Data":"6eaa9e146e1d536e29644e9de05c1e62a5c92817146db1c87be598d16928baa6"}
Oct 08 19:56:18 crc kubenswrapper[4750]: I1008 19:56:18.671814 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6eaa9e146e1d536e29644e9de05c1e62a5c92817146db1c87be598d16928baa6"
Oct 08 19:56:18 crc kubenswrapper[4750]: I1008 19:56:18.671410 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-4b7c-account-create-l4mdz"
Oct 08 19:56:19 crc kubenswrapper[4750]: I1008 19:56:19.981052 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-w4h6x"]
Oct 08 19:56:19 crc kubenswrapper[4750]: E1008 19:56:19.982310 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97570253-a79c-4ede-a440-3592f37223ee" containerName="mariadb-account-create"
Oct 08 19:56:19 crc kubenswrapper[4750]: I1008 19:56:19.982332 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="97570253-a79c-4ede-a440-3592f37223ee" containerName="mariadb-account-create"
Oct 08 19:56:19 crc kubenswrapper[4750]: I1008 19:56:19.982665 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="97570253-a79c-4ede-a440-3592f37223ee" containerName="mariadb-account-create"
Oct 08 19:56:19 crc kubenswrapper[4750]: I1008 19:56:19.983735 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-w4h6x"
Oct 08 19:56:19 crc kubenswrapper[4750]: I1008 19:56:19.986173 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-qtq8p"
Oct 08 19:56:19 crc kubenswrapper[4750]: I1008 19:56:19.988076 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data"
Oct 08 19:56:19 crc kubenswrapper[4750]: I1008 19:56:19.994896 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-w4h6x"]
Oct 08 19:56:20 crc kubenswrapper[4750]: I1008 19:56:20.133019 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d-combined-ca-bundle\") pod \"heat-db-sync-w4h6x\" (UID: \"ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d\") " pod="openstack/heat-db-sync-w4h6x"
Oct 08 19:56:20 crc kubenswrapper[4750]: I1008 19:56:20.133362 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d-config-data\") pod \"heat-db-sync-w4h6x\" (UID: \"ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d\") " pod="openstack/heat-db-sync-w4h6x"
Oct 08 19:56:20 crc kubenswrapper[4750]: I1008 19:56:20.133531 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h948s\" (UniqueName: \"kubernetes.io/projected/ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d-kube-api-access-h948s\") pod \"heat-db-sync-w4h6x\" (UID: \"ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d\") " pod="openstack/heat-db-sync-w4h6x"
Oct 08 19:56:20 crc kubenswrapper[4750]: I1008 19:56:20.236287 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d-combined-ca-bundle\") pod \"heat-db-sync-w4h6x\" (UID: \"ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d\") " pod="openstack/heat-db-sync-w4h6x"
Oct 08 19:56:20 crc kubenswrapper[4750]: I1008 19:56:20.236727 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d-config-data\") pod \"heat-db-sync-w4h6x\" (UID: \"ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d\") " pod="openstack/heat-db-sync-w4h6x"
Oct 08 19:56:20 crc kubenswrapper[4750]: I1008 19:56:20.236892 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h948s\" (UniqueName: \"kubernetes.io/projected/ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d-kube-api-access-h948s\") pod \"heat-db-sync-w4h6x\" (UID: \"ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d\") " pod="openstack/heat-db-sync-w4h6x"
Oct 08 19:56:20 crc kubenswrapper[4750]: I1008 19:56:20.250919 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d-combined-ca-bundle\") pod \"heat-db-sync-w4h6x\" (UID: \"ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d\") " pod="openstack/heat-db-sync-w4h6x"
Oct 08 19:56:20 crc kubenswrapper[4750]: I1008 19:56:20.251235 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d-config-data\") pod \"heat-db-sync-w4h6x\" (UID: \"ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d\") " pod="openstack/heat-db-sync-w4h6x"
Oct 08 19:56:20 crc kubenswrapper[4750]: I1008 19:56:20.257786 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h948s\" (UniqueName: \"kubernetes.io/projected/ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d-kube-api-access-h948s\") pod \"heat-db-sync-w4h6x\" (UID: \"ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d\") " pod="openstack/heat-db-sync-w4h6x"
Oct 08 19:56:20 crc kubenswrapper[4750]: I1008 19:56:20.315397 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-w4h6x"
Oct 08 19:56:20 crc kubenswrapper[4750]: I1008 19:56:20.851397 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-w4h6x"]
Oct 08 19:56:21 crc kubenswrapper[4750]: I1008 19:56:21.713511 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-w4h6x" event={"ID":"ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d","Type":"ContainerStarted","Data":"054568fc2b9766a2d58a218a7727cb44cc8b2867e240e7981936ab41f85cff5f"}
Oct 08 19:56:23 crc kubenswrapper[4750]: I1008 19:56:23.043167 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cdkkb"]
Oct 08 19:56:23 crc kubenswrapper[4750]: I1008 19:56:23.055277 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cdkkb"]
Oct 08 19:56:23 crc kubenswrapper[4750]: I1008 19:56:23.611444 4750 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7b7fc4d7fc-bhdm6" podUID="9530c511-132e-4075-910e-a8e6606fe282" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.120:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.120:8080: connect: connection refused"
Oct 08 19:56:24 crc kubenswrapper[4750]: I1008 19:56:24.761884 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6024d989-90d4-44cf-bb56-539b3926d6a6" path="/var/lib/kubelet/pods/6024d989-90d4-44cf-bb56-539b3926d6a6/volumes"
Oct 08 19:56:29 crc kubenswrapper[4750]: I1008 19:56:29.706958 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 08 19:56:29 crc kubenswrapper[4750]: I1008 19:56:29.707476 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 08 19:56:29 crc kubenswrapper[4750]: I1008 19:56:29.707540 4750 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-grddb"
Oct 08 19:56:29 crc kubenswrapper[4750]: I1008 19:56:29.708344 4750 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"adabf6821006effe51695ae643fb86b44dd24b6d6a52aca3ce60c41928a0f63e"} pod="openshift-machine-config-operator/machine-config-daemon-grddb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 08 19:56:29 crc kubenswrapper[4750]: I1008 19:56:29.708447 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" containerID="cri-o://adabf6821006effe51695ae643fb86b44dd24b6d6a52aca3ce60c41928a0f63e" gracePeriod=600
Oct 08 19:56:30 crc kubenswrapper[4750]: I1008 19:56:30.832627 4750 generic.go:334] "Generic (PLEG): container finished" podID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerID="adabf6821006effe51695ae643fb86b44dd24b6d6a52aca3ce60c41928a0f63e" exitCode=0
Oct 08 19:56:30 crc kubenswrapper[4750]: I1008 19:56:30.832716 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerDied","Data":"adabf6821006effe51695ae643fb86b44dd24b6d6a52aca3ce60c41928a0f63e"}
Oct 08 19:56:30 crc kubenswrapper[4750]: I1008 19:56:30.833056 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerStarted","Data":"166b04b6c456f116d6b504a63f511a318627a6cec196a1300ad0582a72b88cf9"}
Oct 08 19:56:30 crc kubenswrapper[4750]: I1008 19:56:30.833079 4750 scope.go:117] "RemoveContainer" containerID="fcf13ef2112fe840107d0203594fc62839dd38db4382291c73d378bd35584c76"
Oct 08 19:56:30 crc kubenswrapper[4750]: I1008 19:56:30.836313 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-w4h6x" event={"ID":"ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d","Type":"ContainerStarted","Data":"792c40fe07c7f54b1d2f97afe714e3259c9c192ae8e79c4c74ec9bc817c49abe"}
Oct 08 19:56:30 crc kubenswrapper[4750]: I1008 19:56:30.892892 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-w4h6x" podStartSLOduration=2.409856844 podStartE2EDuration="11.892859676s" podCreationTimestamp="2025-10-08 19:56:19 +0000 UTC" firstStartedPulling="2025-10-08 19:56:20.862857296 +0000 UTC m=+6336.775828309" lastFinishedPulling="2025-10-08 19:56:30.345860118 +0000 UTC m=+6346.258831141" observedRunningTime="2025-10-08 19:56:30.87402801 +0000 UTC m=+6346.786999083" watchObservedRunningTime="2025-10-08 19:56:30.892859676 +0000 UTC m=+6346.805830689"
Oct 08 19:56:33 crc kubenswrapper[4750]: I1008 19:56:33.872515 4750 generic.go:334] "Generic (PLEG): container finished" podID="ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d" containerID="792c40fe07c7f54b1d2f97afe714e3259c9c192ae8e79c4c74ec9bc817c49abe" exitCode=0
Oct 08 19:56:33 crc kubenswrapper[4750]: I1008 19:56:33.872616 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-w4h6x" event={"ID":"ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d","Type":"ContainerDied","Data":"792c40fe07c7f54b1d2f97afe714e3259c9c192ae8e79c4c74ec9bc817c49abe"}
Oct 08 19:56:35 crc kubenswrapper[4750]: I1008 19:56:35.325157 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-w4h6x"
Oct 08 19:56:35 crc kubenswrapper[4750]: I1008 19:56:35.461794 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d-combined-ca-bundle\") pod \"ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d\" (UID: \"ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d\") "
Oct 08 19:56:35 crc kubenswrapper[4750]: I1008 19:56:35.462337 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h948s\" (UniqueName: \"kubernetes.io/projected/ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d-kube-api-access-h948s\") pod \"ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d\" (UID: \"ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d\") "
Oct 08 19:56:35 crc kubenswrapper[4750]: I1008 19:56:35.462626 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d-config-data\") pod \"ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d\" (UID: \"ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d\") "
Oct 08 19:56:35 crc kubenswrapper[4750]: I1008 19:56:35.473289 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d-kube-api-access-h948s" (OuterVolumeSpecName: "kube-api-access-h948s") pod "ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d" (UID: "ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d"). InnerVolumeSpecName "kube-api-access-h948s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 19:56:35 crc kubenswrapper[4750]: I1008 19:56:35.512240 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d" (UID: "ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 19:56:35 crc kubenswrapper[4750]: I1008 19:56:35.539221 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7b7fc4d7fc-bhdm6"
Oct 08 19:56:35 crc kubenswrapper[4750]: I1008 19:56:35.565732 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 08 19:56:35 crc kubenswrapper[4750]: I1008 19:56:35.565775 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h948s\" (UniqueName: \"kubernetes.io/projected/ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d-kube-api-access-h948s\") on node \"crc\" DevicePath \"\""
Oct 08 19:56:35 crc kubenswrapper[4750]: I1008 19:56:35.580666 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d-config-data" (OuterVolumeSpecName: "config-data") pod "ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d" (UID: "ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 19:56:35 crc kubenswrapper[4750]: I1008 19:56:35.668572 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d-config-data\") on node \"crc\" DevicePath \"\""
Oct 08 19:56:35 crc kubenswrapper[4750]: I1008 19:56:35.902061 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-w4h6x" event={"ID":"ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d","Type":"ContainerDied","Data":"054568fc2b9766a2d58a218a7727cb44cc8b2867e240e7981936ab41f85cff5f"}
Oct 08 19:56:35 crc kubenswrapper[4750]: I1008 19:56:35.902608 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="054568fc2b9766a2d58a218a7727cb44cc8b2867e240e7981936ab41f85cff5f"
Oct 08 19:56:35 crc kubenswrapper[4750]: I1008 19:56:35.902138 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-w4h6x"
Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.400615 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-5959c7d876-nv2km"]
Oct 08 19:56:37 crc kubenswrapper[4750]: E1008 19:56:37.401704 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d" containerName="heat-db-sync"
Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.401724 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d" containerName="heat-db-sync"
Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.402020 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d" containerName="heat-db-sync"
Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.402943 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-5959c7d876-nv2km"
Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.409369 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data"
Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.409637 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-qtq8p"
Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.415197 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data"
Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.456412 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-5959c7d876-nv2km"]
Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.500578 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-5466485586-4rpcn"]
Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.502159 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5466485586-4rpcn"
Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.512151 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data"
Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.516533 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cee3c3b-138c-4766-b9b4-e7d2b325be0d-config-data\") pod \"heat-engine-5959c7d876-nv2km\" (UID: \"7cee3c3b-138c-4766-b9b4-e7d2b325be0d\") " pod="openstack/heat-engine-5959c7d876-nv2km"
Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.516679 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppksj\" (UniqueName: \"kubernetes.io/projected/7cee3c3b-138c-4766-b9b4-e7d2b325be0d-kube-api-access-ppksj\") pod \"heat-engine-5959c7d876-nv2km\" (UID: \"7cee3c3b-138c-4766-b9b4-e7d2b325be0d\") " pod="openstack/heat-engine-5959c7d876-nv2km"
Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.516840 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7cee3c3b-138c-4766-b9b4-e7d2b325be0d-config-data-custom\") pod \"heat-engine-5959c7d876-nv2km\" (UID: \"7cee3c3b-138c-4766-b9b4-e7d2b325be0d\") " pod="openstack/heat-engine-5959c7d876-nv2km"
Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.517169 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cee3c3b-138c-4766-b9b4-e7d2b325be0d-combined-ca-bundle\") pod \"heat-engine-5959c7d876-nv2km\" (UID: \"7cee3c3b-138c-4766-b9b4-e7d2b325be0d\") " pod="openstack/heat-engine-5959c7d876-nv2km"
Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.525607 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5466485586-4rpcn"]
Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.599404 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7b7fc4d7fc-bhdm6"
Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.612654 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-6597fb9d78-62bqr"]
Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.614477 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6597fb9d78-62bqr"
Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.616867 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data"
Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.621333 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0314e1f-ae9b-40dd-8e2c-672a94f687ee-combined-ca-bundle\") pod \"heat-api-5466485586-4rpcn\" (UID: \"f0314e1f-ae9b-40dd-8e2c-672a94f687ee\") " pod="openstack/heat-api-5466485586-4rpcn"
Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.621426 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cee3c3b-138c-4766-b9b4-e7d2b325be0d-combined-ca-bundle\") pod \"heat-engine-5959c7d876-nv2km\" (UID: \"7cee3c3b-138c-4766-b9b4-e7d2b325be0d\") " pod="openstack/heat-engine-5959c7d876-nv2km"
Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.621510 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cee3c3b-138c-4766-b9b4-e7d2b325be0d-config-data\") pod \"heat-engine-5959c7d876-nv2km\" (UID: \"7cee3c3b-138c-4766-b9b4-e7d2b325be0d\") " pod="openstack/heat-engine-5959c7d876-nv2km"
Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.621593 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppksj\" (UniqueName: \"kubernetes.io/projected/7cee3c3b-138c-4766-b9b4-e7d2b325be0d-kube-api-access-ppksj\") pod \"heat-engine-5959c7d876-nv2km\" (UID: \"7cee3c3b-138c-4766-b9b4-e7d2b325be0d\") " pod="openstack/heat-engine-5959c7d876-nv2km"
Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.621628 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0314e1f-ae9b-40dd-8e2c-672a94f687ee-config-data\") pod \"heat-api-5466485586-4rpcn\" (UID: \"f0314e1f-ae9b-40dd-8e2c-672a94f687ee\") " pod="openstack/heat-api-5466485586-4rpcn"
Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.621646 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f0314e1f-ae9b-40dd-8e2c-672a94f687ee-config-data-custom\") pod \"heat-api-5466485586-4rpcn\" (UID: \"f0314e1f-ae9b-40dd-8e2c-672a94f687ee\") " pod="openstack/heat-api-5466485586-4rpcn"
Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.621690 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7cee3c3b-138c-4766-b9b4-e7d2b325be0d-config-data-custom\") pod \"heat-engine-5959c7d876-nv2km\" (UID: \"7cee3c3b-138c-4766-b9b4-e7d2b325be0d\") " pod="openstack/heat-engine-5959c7d876-nv2km"
Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.621707 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz7nx\" (UniqueName: \"kubernetes.io/projected/f0314e1f-ae9b-40dd-8e2c-672a94f687ee-kube-api-access-cz7nx\") pod \"heat-api-5466485586-4rpcn\" (UID: \"f0314e1f-ae9b-40dd-8e2c-672a94f687ee\") " pod="openstack/heat-api-5466485586-4rpcn"
Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.622330 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6597fb9d78-62bqr"]
Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.643378 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cee3c3b-138c-4766-b9b4-e7d2b325be0d-combined-ca-bundle\") pod \"heat-engine-5959c7d876-nv2km\" (UID: \"7cee3c3b-138c-4766-b9b4-e7d2b325be0d\") " pod="openstack/heat-engine-5959c7d876-nv2km"
Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.631079 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cee3c3b-138c-4766-b9b4-e7d2b325be0d-config-data\") pod \"heat-engine-5959c7d876-nv2km\" (UID: \"7cee3c3b-138c-4766-b9b4-e7d2b325be0d\") " pod="openstack/heat-engine-5959c7d876-nv2km"
Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.664047 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7cee3c3b-138c-4766-b9b4-e7d2b325be0d-config-data-custom\") pod \"heat-engine-5959c7d876-nv2km\" (UID: \"7cee3c3b-138c-4766-b9b4-e7d2b325be0d\") " pod="openstack/heat-engine-5959c7d876-nv2km"
Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.671276 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppksj\" (UniqueName: \"kubernetes.io/projected/7cee3c3b-138c-4766-b9b4-e7d2b325be0d-kube-api-access-ppksj\") pod \"heat-engine-5959c7d876-nv2km\" (UID: \"7cee3c3b-138c-4766-b9b4-e7d2b325be0d\") " pod="openstack/heat-engine-5959c7d876-nv2km"
Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.708836 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-f5fb9b649-5qqwr"]
Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.709194 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-f5fb9b649-5qqwr"
podUID="54ade5e3-9ffe-4b69-b060-734b5db093e8" containerName="horizon-log" containerID="cri-o://87ed84e207c8ee0176b27ca029277a7c433695fb11ba21ab9909d99f324072bb" gracePeriod=30 Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.710129 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-f5fb9b649-5qqwr" podUID="54ade5e3-9ffe-4b69-b060-734b5db093e8" containerName="horizon" containerID="cri-o://ceecb5fabfb571fbbeded2c97cc653ed3571f4783fa7da8ab8dd63a1fddb003f" gracePeriod=30 Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.732909 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8df319d5-d875-44fe-9f8c-8f53b6129570-combined-ca-bundle\") pod \"heat-cfnapi-6597fb9d78-62bqr\" (UID: \"8df319d5-d875-44fe-9f8c-8f53b6129570\") " pod="openstack/heat-cfnapi-6597fb9d78-62bqr" Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.733171 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0314e1f-ae9b-40dd-8e2c-672a94f687ee-config-data\") pod \"heat-api-5466485586-4rpcn\" (UID: \"f0314e1f-ae9b-40dd-8e2c-672a94f687ee\") " pod="openstack/heat-api-5466485586-4rpcn" Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.733230 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f0314e1f-ae9b-40dd-8e2c-672a94f687ee-config-data-custom\") pod \"heat-api-5466485586-4rpcn\" (UID: \"f0314e1f-ae9b-40dd-8e2c-672a94f687ee\") " pod="openstack/heat-api-5466485586-4rpcn" Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.733270 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8df319d5-d875-44fe-9f8c-8f53b6129570-config-data\") pod 
\"heat-cfnapi-6597fb9d78-62bqr\" (UID: \"8df319d5-d875-44fe-9f8c-8f53b6129570\") " pod="openstack/heat-cfnapi-6597fb9d78-62bqr" Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.733360 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz7nx\" (UniqueName: \"kubernetes.io/projected/f0314e1f-ae9b-40dd-8e2c-672a94f687ee-kube-api-access-cz7nx\") pod \"heat-api-5466485586-4rpcn\" (UID: \"f0314e1f-ae9b-40dd-8e2c-672a94f687ee\") " pod="openstack/heat-api-5466485586-4rpcn" Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.733477 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8df319d5-d875-44fe-9f8c-8f53b6129570-config-data-custom\") pod \"heat-cfnapi-6597fb9d78-62bqr\" (UID: \"8df319d5-d875-44fe-9f8c-8f53b6129570\") " pod="openstack/heat-cfnapi-6597fb9d78-62bqr" Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.733690 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0314e1f-ae9b-40dd-8e2c-672a94f687ee-combined-ca-bundle\") pod \"heat-api-5466485586-4rpcn\" (UID: \"f0314e1f-ae9b-40dd-8e2c-672a94f687ee\") " pod="openstack/heat-api-5466485586-4rpcn" Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.733740 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pfhl\" (UniqueName: \"kubernetes.io/projected/8df319d5-d875-44fe-9f8c-8f53b6129570-kube-api-access-5pfhl\") pod \"heat-cfnapi-6597fb9d78-62bqr\" (UID: \"8df319d5-d875-44fe-9f8c-8f53b6129570\") " pod="openstack/heat-cfnapi-6597fb9d78-62bqr" Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.739522 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/f0314e1f-ae9b-40dd-8e2c-672a94f687ee-config-data-custom\") pod \"heat-api-5466485586-4rpcn\" (UID: \"f0314e1f-ae9b-40dd-8e2c-672a94f687ee\") " pod="openstack/heat-api-5466485586-4rpcn" Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.745514 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0314e1f-ae9b-40dd-8e2c-672a94f687ee-config-data\") pod \"heat-api-5466485586-4rpcn\" (UID: \"f0314e1f-ae9b-40dd-8e2c-672a94f687ee\") " pod="openstack/heat-api-5466485586-4rpcn" Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.745925 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0314e1f-ae9b-40dd-8e2c-672a94f687ee-combined-ca-bundle\") pod \"heat-api-5466485586-4rpcn\" (UID: \"f0314e1f-ae9b-40dd-8e2c-672a94f687ee\") " pod="openstack/heat-api-5466485586-4rpcn" Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.765499 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-5959c7d876-nv2km" Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.766377 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz7nx\" (UniqueName: \"kubernetes.io/projected/f0314e1f-ae9b-40dd-8e2c-672a94f687ee-kube-api-access-cz7nx\") pod \"heat-api-5466485586-4rpcn\" (UID: \"f0314e1f-ae9b-40dd-8e2c-672a94f687ee\") " pod="openstack/heat-api-5466485586-4rpcn" Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.838459 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8df319d5-d875-44fe-9f8c-8f53b6129570-config-data\") pod \"heat-cfnapi-6597fb9d78-62bqr\" (UID: \"8df319d5-d875-44fe-9f8c-8f53b6129570\") " pod="openstack/heat-cfnapi-6597fb9d78-62bqr" Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.839095 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8df319d5-d875-44fe-9f8c-8f53b6129570-config-data-custom\") pod \"heat-cfnapi-6597fb9d78-62bqr\" (UID: \"8df319d5-d875-44fe-9f8c-8f53b6129570\") " pod="openstack/heat-cfnapi-6597fb9d78-62bqr" Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.839184 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pfhl\" (UniqueName: \"kubernetes.io/projected/8df319d5-d875-44fe-9f8c-8f53b6129570-kube-api-access-5pfhl\") pod \"heat-cfnapi-6597fb9d78-62bqr\" (UID: \"8df319d5-d875-44fe-9f8c-8f53b6129570\") " pod="openstack/heat-cfnapi-6597fb9d78-62bqr" Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.839281 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8df319d5-d875-44fe-9f8c-8f53b6129570-combined-ca-bundle\") pod \"heat-cfnapi-6597fb9d78-62bqr\" (UID: \"8df319d5-d875-44fe-9f8c-8f53b6129570\") " 
pod="openstack/heat-cfnapi-6597fb9d78-62bqr" Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.846320 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8df319d5-d875-44fe-9f8c-8f53b6129570-config-data-custom\") pod \"heat-cfnapi-6597fb9d78-62bqr\" (UID: \"8df319d5-d875-44fe-9f8c-8f53b6129570\") " pod="openstack/heat-cfnapi-6597fb9d78-62bqr" Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.846666 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8df319d5-d875-44fe-9f8c-8f53b6129570-config-data\") pod \"heat-cfnapi-6597fb9d78-62bqr\" (UID: \"8df319d5-d875-44fe-9f8c-8f53b6129570\") " pod="openstack/heat-cfnapi-6597fb9d78-62bqr" Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.847632 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8df319d5-d875-44fe-9f8c-8f53b6129570-combined-ca-bundle\") pod \"heat-cfnapi-6597fb9d78-62bqr\" (UID: \"8df319d5-d875-44fe-9f8c-8f53b6129570\") " pod="openstack/heat-cfnapi-6597fb9d78-62bqr" Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.850500 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5466485586-4rpcn" Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.858201 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pfhl\" (UniqueName: \"kubernetes.io/projected/8df319d5-d875-44fe-9f8c-8f53b6129570-kube-api-access-5pfhl\") pod \"heat-cfnapi-6597fb9d78-62bqr\" (UID: \"8df319d5-d875-44fe-9f8c-8f53b6129570\") " pod="openstack/heat-cfnapi-6597fb9d78-62bqr" Oct 08 19:56:37 crc kubenswrapper[4750]: I1008 19:56:37.894740 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-6597fb9d78-62bqr" Oct 08 19:56:38 crc kubenswrapper[4750]: I1008 19:56:38.082442 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-284vj"] Oct 08 19:56:38 crc kubenswrapper[4750]: I1008 19:56:38.106977 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-284vj"] Oct 08 19:56:38 crc kubenswrapper[4750]: I1008 19:56:38.363358 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-5959c7d876-nv2km"] Oct 08 19:56:38 crc kubenswrapper[4750]: I1008 19:56:38.458532 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5466485586-4rpcn"] Oct 08 19:56:38 crc kubenswrapper[4750]: I1008 19:56:38.601369 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6597fb9d78-62bqr"] Oct 08 19:56:38 crc kubenswrapper[4750]: W1008 19:56:38.605793 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8df319d5_d875_44fe_9f8c_8f53b6129570.slice/crio-85d82661bf86da87800eb67d28fc1316eba2c1e1a214d5cc5330674cca55a81e WatchSource:0}: Error finding container 85d82661bf86da87800eb67d28fc1316eba2c1e1a214d5cc5330674cca55a81e: Status 404 returned error can't find the container with id 85d82661bf86da87800eb67d28fc1316eba2c1e1a214d5cc5330674cca55a81e Oct 08 19:56:38 crc kubenswrapper[4750]: I1008 19:56:38.748141 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4231f7e-ad8e-42f5-bbd9-b8d6df082070" path="/var/lib/kubelet/pods/d4231f7e-ad8e-42f5-bbd9-b8d6df082070/volumes" Oct 08 19:56:38 crc kubenswrapper[4750]: I1008 19:56:38.943821 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5466485586-4rpcn" 
event={"ID":"f0314e1f-ae9b-40dd-8e2c-672a94f687ee","Type":"ContainerStarted","Data":"16fc61c3e4117b3ae1bf107a79c27a28b43e154c450744e303e70585cedb8ead"} Oct 08 19:56:38 crc kubenswrapper[4750]: I1008 19:56:38.945904 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6597fb9d78-62bqr" event={"ID":"8df319d5-d875-44fe-9f8c-8f53b6129570","Type":"ContainerStarted","Data":"85d82661bf86da87800eb67d28fc1316eba2c1e1a214d5cc5330674cca55a81e"} Oct 08 19:56:38 crc kubenswrapper[4750]: I1008 19:56:38.948192 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5959c7d876-nv2km" event={"ID":"7cee3c3b-138c-4766-b9b4-e7d2b325be0d","Type":"ContainerStarted","Data":"9ab9ba14f3a0a614cd225684a020bb873c2a9f29b26a6ff5eeb61f1766fd1061"} Oct 08 19:56:38 crc kubenswrapper[4750]: I1008 19:56:38.948309 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5959c7d876-nv2km" event={"ID":"7cee3c3b-138c-4766-b9b4-e7d2b325be0d","Type":"ContainerStarted","Data":"a5dce160b72fdf251ed52bee41e3782a1a41a4313d60aab0cb2443bc41e8d787"} Oct 08 19:56:38 crc kubenswrapper[4750]: I1008 19:56:38.948378 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-5959c7d876-nv2km" Oct 08 19:56:38 crc kubenswrapper[4750]: I1008 19:56:38.971332 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-5959c7d876-nv2km" podStartSLOduration=1.971301527 podStartE2EDuration="1.971301527s" podCreationTimestamp="2025-10-08 19:56:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:56:38.969118693 +0000 UTC m=+6354.882089726" watchObservedRunningTime="2025-10-08 19:56:38.971301527 +0000 UTC m=+6354.884272540" Oct 08 19:56:39 crc kubenswrapper[4750]: I1008 19:56:39.031455 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell0-cell-mapping-x9l8h"] Oct 08 19:56:39 crc kubenswrapper[4750]: I1008 19:56:39.039730 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-x9l8h"] Oct 08 19:56:40 crc kubenswrapper[4750]: I1008 19:56:40.845792 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23216996-77d2-40e7-a29c-b43247b1fb15" path="/var/lib/kubelet/pods/23216996-77d2-40e7-a29c-b43247b1fb15/volumes" Oct 08 19:56:40 crc kubenswrapper[4750]: I1008 19:56:40.973372 4750 generic.go:334] "Generic (PLEG): container finished" podID="54ade5e3-9ffe-4b69-b060-734b5db093e8" containerID="ceecb5fabfb571fbbeded2c97cc653ed3571f4783fa7da8ab8dd63a1fddb003f" exitCode=0 Oct 08 19:56:40 crc kubenswrapper[4750]: I1008 19:56:40.973444 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f5fb9b649-5qqwr" event={"ID":"54ade5e3-9ffe-4b69-b060-734b5db093e8","Type":"ContainerDied","Data":"ceecb5fabfb571fbbeded2c97cc653ed3571f4783fa7da8ab8dd63a1fddb003f"} Oct 08 19:56:41 crc kubenswrapper[4750]: I1008 19:56:41.372538 4750 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-f5fb9b649-5qqwr" podUID="54ade5e3-9ffe-4b69-b060-734b5db093e8" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.117:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.117:8080: connect: connection refused" Oct 08 19:56:42 crc kubenswrapper[4750]: I1008 19:56:42.000982 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6597fb9d78-62bqr" event={"ID":"8df319d5-d875-44fe-9f8c-8f53b6129570","Type":"ContainerStarted","Data":"1e4a55595a6ef004459d7dd9aa04e10ad10eda562d6173315a54cfb6bdc1b94a"} Oct 08 19:56:42 crc kubenswrapper[4750]: I1008 19:56:42.001154 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-6597fb9d78-62bqr" Oct 08 19:56:42 crc kubenswrapper[4750]: I1008 19:56:42.040474 4750 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-6597fb9d78-62bqr" podStartSLOduration=2.099932051 podStartE2EDuration="5.040445388s" podCreationTimestamp="2025-10-08 19:56:37 +0000 UTC" firstStartedPulling="2025-10-08 19:56:38.612803154 +0000 UTC m=+6354.525774167" lastFinishedPulling="2025-10-08 19:56:41.553316491 +0000 UTC m=+6357.466287504" observedRunningTime="2025-10-08 19:56:42.026133624 +0000 UTC m=+6357.939104647" watchObservedRunningTime="2025-10-08 19:56:42.040445388 +0000 UTC m=+6357.953416411" Oct 08 19:56:43 crc kubenswrapper[4750]: I1008 19:56:43.018199 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5466485586-4rpcn" event={"ID":"f0314e1f-ae9b-40dd-8e2c-672a94f687ee","Type":"ContainerStarted","Data":"e826a794f05c1d849a334c1e35e3895c19901bfdbc28db615275d5153f109339"} Oct 08 19:56:43 crc kubenswrapper[4750]: I1008 19:56:43.053782 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-5466485586-4rpcn" podStartSLOduration=2.981999652 podStartE2EDuration="6.053758058s" podCreationTimestamp="2025-10-08 19:56:37 +0000 UTC" firstStartedPulling="2025-10-08 19:56:38.478129641 +0000 UTC m=+6354.391100654" lastFinishedPulling="2025-10-08 19:56:41.549888047 +0000 UTC m=+6357.462859060" observedRunningTime="2025-10-08 19:56:43.047692868 +0000 UTC m=+6358.960663901" watchObservedRunningTime="2025-10-08 19:56:43.053758058 +0000 UTC m=+6358.966729061" Oct 08 19:56:44 crc kubenswrapper[4750]: I1008 19:56:44.026854 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-5466485586-4rpcn" Oct 08 19:56:49 crc kubenswrapper[4750]: I1008 19:56:49.273529 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-6597fb9d78-62bqr" Oct 08 19:56:49 crc kubenswrapper[4750]: I1008 19:56:49.279422 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/heat-api-5466485586-4rpcn" Oct 08 19:56:51 crc kubenswrapper[4750]: I1008 19:56:51.371829 4750 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-f5fb9b649-5qqwr" podUID="54ade5e3-9ffe-4b69-b060-734b5db093e8" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.117:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.117:8080: connect: connection refused" Oct 08 19:56:53 crc kubenswrapper[4750]: I1008 19:56:53.779634 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jqx7z"] Oct 08 19:56:53 crc kubenswrapper[4750]: I1008 19:56:53.782974 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jqx7z" Oct 08 19:56:53 crc kubenswrapper[4750]: I1008 19:56:53.799300 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jqx7z"] Oct 08 19:56:53 crc kubenswrapper[4750]: I1008 19:56:53.912328 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ae2cfdc-157d-431d-8ad4-9c1afcec9085-utilities\") pod \"redhat-marketplace-jqx7z\" (UID: \"1ae2cfdc-157d-431d-8ad4-9c1afcec9085\") " pod="openshift-marketplace/redhat-marketplace-jqx7z" Oct 08 19:56:53 crc kubenswrapper[4750]: I1008 19:56:53.912578 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thxdz\" (UniqueName: \"kubernetes.io/projected/1ae2cfdc-157d-431d-8ad4-9c1afcec9085-kube-api-access-thxdz\") pod \"redhat-marketplace-jqx7z\" (UID: \"1ae2cfdc-157d-431d-8ad4-9c1afcec9085\") " pod="openshift-marketplace/redhat-marketplace-jqx7z" Oct 08 19:56:53 crc kubenswrapper[4750]: I1008 19:56:53.912697 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1ae2cfdc-157d-431d-8ad4-9c1afcec9085-catalog-content\") pod \"redhat-marketplace-jqx7z\" (UID: \"1ae2cfdc-157d-431d-8ad4-9c1afcec9085\") " pod="openshift-marketplace/redhat-marketplace-jqx7z" Oct 08 19:56:54 crc kubenswrapper[4750]: I1008 19:56:54.015022 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ae2cfdc-157d-431d-8ad4-9c1afcec9085-catalog-content\") pod \"redhat-marketplace-jqx7z\" (UID: \"1ae2cfdc-157d-431d-8ad4-9c1afcec9085\") " pod="openshift-marketplace/redhat-marketplace-jqx7z" Oct 08 19:56:54 crc kubenswrapper[4750]: I1008 19:56:54.015116 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ae2cfdc-157d-431d-8ad4-9c1afcec9085-utilities\") pod \"redhat-marketplace-jqx7z\" (UID: \"1ae2cfdc-157d-431d-8ad4-9c1afcec9085\") " pod="openshift-marketplace/redhat-marketplace-jqx7z" Oct 08 19:56:54 crc kubenswrapper[4750]: I1008 19:56:54.015220 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thxdz\" (UniqueName: \"kubernetes.io/projected/1ae2cfdc-157d-431d-8ad4-9c1afcec9085-kube-api-access-thxdz\") pod \"redhat-marketplace-jqx7z\" (UID: \"1ae2cfdc-157d-431d-8ad4-9c1afcec9085\") " pod="openshift-marketplace/redhat-marketplace-jqx7z" Oct 08 19:56:54 crc kubenswrapper[4750]: I1008 19:56:54.016148 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ae2cfdc-157d-431d-8ad4-9c1afcec9085-utilities\") pod \"redhat-marketplace-jqx7z\" (UID: \"1ae2cfdc-157d-431d-8ad4-9c1afcec9085\") " pod="openshift-marketplace/redhat-marketplace-jqx7z" Oct 08 19:56:54 crc kubenswrapper[4750]: I1008 19:56:54.016484 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1ae2cfdc-157d-431d-8ad4-9c1afcec9085-catalog-content\") pod \"redhat-marketplace-jqx7z\" (UID: \"1ae2cfdc-157d-431d-8ad4-9c1afcec9085\") " pod="openshift-marketplace/redhat-marketplace-jqx7z" Oct 08 19:56:54 crc kubenswrapper[4750]: I1008 19:56:54.049492 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thxdz\" (UniqueName: \"kubernetes.io/projected/1ae2cfdc-157d-431d-8ad4-9c1afcec9085-kube-api-access-thxdz\") pod \"redhat-marketplace-jqx7z\" (UID: \"1ae2cfdc-157d-431d-8ad4-9c1afcec9085\") " pod="openshift-marketplace/redhat-marketplace-jqx7z" Oct 08 19:56:54 crc kubenswrapper[4750]: I1008 19:56:54.125304 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jqx7z" Oct 08 19:56:54 crc kubenswrapper[4750]: I1008 19:56:54.633764 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jqx7z"] Oct 08 19:56:55 crc kubenswrapper[4750]: I1008 19:56:55.157432 4750 generic.go:334] "Generic (PLEG): container finished" podID="1ae2cfdc-157d-431d-8ad4-9c1afcec9085" containerID="561bd1485dca369a56e3bf0cdb125b939eee7527da4493dec8113f6175612b33" exitCode=0 Oct 08 19:56:55 crc kubenswrapper[4750]: I1008 19:56:55.157497 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jqx7z" event={"ID":"1ae2cfdc-157d-431d-8ad4-9c1afcec9085","Type":"ContainerDied","Data":"561bd1485dca369a56e3bf0cdb125b939eee7527da4493dec8113f6175612b33"} Oct 08 19:56:55 crc kubenswrapper[4750]: I1008 19:56:55.157536 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jqx7z" event={"ID":"1ae2cfdc-157d-431d-8ad4-9c1afcec9085","Type":"ContainerStarted","Data":"8c40d85b1c3a1c2b9f6cbc6d28a16e7f009fdcaabb9eb9c912388c69cad06e72"} Oct 08 19:56:57 crc kubenswrapper[4750]: I1008 19:56:57.045994 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell1-cell-mapping-cdhgn"] Oct 08 19:56:57 crc kubenswrapper[4750]: I1008 19:56:57.061951 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-cdhgn"] Oct 08 19:56:57 crc kubenswrapper[4750]: I1008 19:56:57.183983 4750 generic.go:334] "Generic (PLEG): container finished" podID="1ae2cfdc-157d-431d-8ad4-9c1afcec9085" containerID="d65612af69e8eee50f6d23feb1c23ecac0b8982fe6cd0fa0f74eec306e99d84f" exitCode=0 Oct 08 19:56:57 crc kubenswrapper[4750]: I1008 19:56:57.184055 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jqx7z" event={"ID":"1ae2cfdc-157d-431d-8ad4-9c1afcec9085","Type":"ContainerDied","Data":"d65612af69e8eee50f6d23feb1c23ecac0b8982fe6cd0fa0f74eec306e99d84f"} Oct 08 19:56:57 crc kubenswrapper[4750]: I1008 19:56:57.815296 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-5959c7d876-nv2km" Oct 08 19:56:58 crc kubenswrapper[4750]: I1008 19:56:58.217939 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jqx7z" event={"ID":"1ae2cfdc-157d-431d-8ad4-9c1afcec9085","Type":"ContainerStarted","Data":"8f20bd8a32ab80fc5e75eeb30923b34cd7c4c2c64f1bfd26e73f9958b8e43f91"} Oct 08 19:56:58 crc kubenswrapper[4750]: I1008 19:56:58.246046 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jqx7z" podStartSLOduration=2.740983614 podStartE2EDuration="5.246019914s" podCreationTimestamp="2025-10-08 19:56:53 +0000 UTC" firstStartedPulling="2025-10-08 19:56:55.164640081 +0000 UTC m=+6371.077611134" lastFinishedPulling="2025-10-08 19:56:57.669676421 +0000 UTC m=+6373.582647434" observedRunningTime="2025-10-08 19:56:58.236563771 +0000 UTC m=+6374.149534794" watchObservedRunningTime="2025-10-08 19:56:58.246019914 +0000 UTC m=+6374.158990927" Oct 08 19:56:58 crc kubenswrapper[4750]: I1008 19:56:58.745459 4750 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60" path="/var/lib/kubelet/pods/56a3e7af-4ca3-48a3-9e5e-fd0a083d6a60/volumes" Oct 08 19:57:01 crc kubenswrapper[4750]: I1008 19:57:01.372342 4750 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-f5fb9b649-5qqwr" podUID="54ade5e3-9ffe-4b69-b060-734b5db093e8" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.117:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.117:8080: connect: connection refused" Oct 08 19:57:01 crc kubenswrapper[4750]: I1008 19:57:01.372957 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-f5fb9b649-5qqwr" Oct 08 19:57:04 crc kubenswrapper[4750]: I1008 19:57:04.125633 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jqx7z" Oct 08 19:57:04 crc kubenswrapper[4750]: I1008 19:57:04.126074 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jqx7z" Oct 08 19:57:04 crc kubenswrapper[4750]: I1008 19:57:04.193292 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jqx7z" Oct 08 19:57:04 crc kubenswrapper[4750]: I1008 19:57:04.330498 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jqx7z" Oct 08 19:57:04 crc kubenswrapper[4750]: I1008 19:57:04.477723 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jqx7z"] Oct 08 19:57:06 crc kubenswrapper[4750]: I1008 19:57:06.305706 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jqx7z" podUID="1ae2cfdc-157d-431d-8ad4-9c1afcec9085" containerName="registry-server" 
containerID="cri-o://8f20bd8a32ab80fc5e75eeb30923b34cd7c4c2c64f1bfd26e73f9958b8e43f91" gracePeriod=2 Oct 08 19:57:06 crc kubenswrapper[4750]: I1008 19:57:06.403388 4750 scope.go:117] "RemoveContainer" containerID="3d0c56b5c34bf95f116aad313597dacc8d4dffe0962da310d7a7827c5a69cd6d" Oct 08 19:57:06 crc kubenswrapper[4750]: I1008 19:57:06.466088 4750 scope.go:117] "RemoveContainer" containerID="b1c02d86ae5972fbca6fb66bed397cead160ab86189a95ecb7adc6f6132119ac" Oct 08 19:57:06 crc kubenswrapper[4750]: I1008 19:57:06.574389 4750 scope.go:117] "RemoveContainer" containerID="b59449656c49771bb5a6ac16ab013ef0c4a8f47dad5d85c8cd08a99d59b1f166" Oct 08 19:57:06 crc kubenswrapper[4750]: I1008 19:57:06.640476 4750 scope.go:117] "RemoveContainer" containerID="072e06fd3a3dd64f96031b2b0b6bfffe55ecca2ef79371dcad9ce43cd3e4c581" Oct 08 19:57:06 crc kubenswrapper[4750]: I1008 19:57:06.713353 4750 scope.go:117] "RemoveContainer" containerID="0f734e41bc4c2c9759590ec271334e797ef292637184bc6f1b6393a5002293fd" Oct 08 19:57:06 crc kubenswrapper[4750]: I1008 19:57:06.772394 4750 scope.go:117] "RemoveContainer" containerID="b53cbbe43cd6f2ad27daa9736fa749a6da798ee14914e6fc0fa0fbd56a62bcba" Oct 08 19:57:06 crc kubenswrapper[4750]: I1008 19:57:06.804368 4750 scope.go:117] "RemoveContainer" containerID="a4e75abf5fbb6ba61cf0d3963a1254410baab6f8973ab1bd60d544fc60704364" Oct 08 19:57:06 crc kubenswrapper[4750]: I1008 19:57:06.822491 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jqx7z" Oct 08 19:57:06 crc kubenswrapper[4750]: I1008 19:57:06.842052 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ae2cfdc-157d-431d-8ad4-9c1afcec9085-catalog-content\") pod \"1ae2cfdc-157d-431d-8ad4-9c1afcec9085\" (UID: \"1ae2cfdc-157d-431d-8ad4-9c1afcec9085\") " Oct 08 19:57:06 crc kubenswrapper[4750]: I1008 19:57:06.842113 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ae2cfdc-157d-431d-8ad4-9c1afcec9085-utilities\") pod \"1ae2cfdc-157d-431d-8ad4-9c1afcec9085\" (UID: \"1ae2cfdc-157d-431d-8ad4-9c1afcec9085\") " Oct 08 19:57:06 crc kubenswrapper[4750]: I1008 19:57:06.842214 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thxdz\" (UniqueName: \"kubernetes.io/projected/1ae2cfdc-157d-431d-8ad4-9c1afcec9085-kube-api-access-thxdz\") pod \"1ae2cfdc-157d-431d-8ad4-9c1afcec9085\" (UID: \"1ae2cfdc-157d-431d-8ad4-9c1afcec9085\") " Oct 08 19:57:06 crc kubenswrapper[4750]: I1008 19:57:06.844419 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ae2cfdc-157d-431d-8ad4-9c1afcec9085-utilities" (OuterVolumeSpecName: "utilities") pod "1ae2cfdc-157d-431d-8ad4-9c1afcec9085" (UID: "1ae2cfdc-157d-431d-8ad4-9c1afcec9085"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:57:06 crc kubenswrapper[4750]: I1008 19:57:06.852335 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ae2cfdc-157d-431d-8ad4-9c1afcec9085-kube-api-access-thxdz" (OuterVolumeSpecName: "kube-api-access-thxdz") pod "1ae2cfdc-157d-431d-8ad4-9c1afcec9085" (UID: "1ae2cfdc-157d-431d-8ad4-9c1afcec9085"). InnerVolumeSpecName "kube-api-access-thxdz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:57:06 crc kubenswrapper[4750]: I1008 19:57:06.859157 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ae2cfdc-157d-431d-8ad4-9c1afcec9085-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1ae2cfdc-157d-431d-8ad4-9c1afcec9085" (UID: "1ae2cfdc-157d-431d-8ad4-9c1afcec9085"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:57:06 crc kubenswrapper[4750]: I1008 19:57:06.945312 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ae2cfdc-157d-431d-8ad4-9c1afcec9085-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 19:57:06 crc kubenswrapper[4750]: I1008 19:57:06.945473 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ae2cfdc-157d-431d-8ad4-9c1afcec9085-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 19:57:06 crc kubenswrapper[4750]: I1008 19:57:06.945528 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thxdz\" (UniqueName: \"kubernetes.io/projected/1ae2cfdc-157d-431d-8ad4-9c1afcec9085-kube-api-access-thxdz\") on node \"crc\" DevicePath \"\"" Oct 08 19:57:07 crc kubenswrapper[4750]: I1008 19:57:07.327472 4750 generic.go:334] "Generic (PLEG): container finished" podID="1ae2cfdc-157d-431d-8ad4-9c1afcec9085" containerID="8f20bd8a32ab80fc5e75eeb30923b34cd7c4c2c64f1bfd26e73f9958b8e43f91" exitCode=0 Oct 08 19:57:07 crc kubenswrapper[4750]: I1008 19:57:07.327628 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jqx7z" event={"ID":"1ae2cfdc-157d-431d-8ad4-9c1afcec9085","Type":"ContainerDied","Data":"8f20bd8a32ab80fc5e75eeb30923b34cd7c4c2c64f1bfd26e73f9958b8e43f91"} Oct 08 19:57:07 crc kubenswrapper[4750]: I1008 19:57:07.328078 4750 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-jqx7z" event={"ID":"1ae2cfdc-157d-431d-8ad4-9c1afcec9085","Type":"ContainerDied","Data":"8c40d85b1c3a1c2b9f6cbc6d28a16e7f009fdcaabb9eb9c912388c69cad06e72"} Oct 08 19:57:07 crc kubenswrapper[4750]: I1008 19:57:07.328137 4750 scope.go:117] "RemoveContainer" containerID="8f20bd8a32ab80fc5e75eeb30923b34cd7c4c2c64f1bfd26e73f9958b8e43f91" Oct 08 19:57:07 crc kubenswrapper[4750]: I1008 19:57:07.327773 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jqx7z" Oct 08 19:57:07 crc kubenswrapper[4750]: I1008 19:57:07.367597 4750 scope.go:117] "RemoveContainer" containerID="d65612af69e8eee50f6d23feb1c23ecac0b8982fe6cd0fa0f74eec306e99d84f" Oct 08 19:57:07 crc kubenswrapper[4750]: I1008 19:57:07.373301 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jqx7z"] Oct 08 19:57:07 crc kubenswrapper[4750]: I1008 19:57:07.386808 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jqx7z"] Oct 08 19:57:07 crc kubenswrapper[4750]: I1008 19:57:07.398831 4750 scope.go:117] "RemoveContainer" containerID="561bd1485dca369a56e3bf0cdb125b939eee7527da4493dec8113f6175612b33" Oct 08 19:57:07 crc kubenswrapper[4750]: I1008 19:57:07.432862 4750 scope.go:117] "RemoveContainer" containerID="8f20bd8a32ab80fc5e75eeb30923b34cd7c4c2c64f1bfd26e73f9958b8e43f91" Oct 08 19:57:07 crc kubenswrapper[4750]: E1008 19:57:07.433494 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f20bd8a32ab80fc5e75eeb30923b34cd7c4c2c64f1bfd26e73f9958b8e43f91\": container with ID starting with 8f20bd8a32ab80fc5e75eeb30923b34cd7c4c2c64f1bfd26e73f9958b8e43f91 not found: ID does not exist" containerID="8f20bd8a32ab80fc5e75eeb30923b34cd7c4c2c64f1bfd26e73f9958b8e43f91" Oct 08 19:57:07 crc kubenswrapper[4750]: I1008 19:57:07.433605 4750 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f20bd8a32ab80fc5e75eeb30923b34cd7c4c2c64f1bfd26e73f9958b8e43f91"} err="failed to get container status \"8f20bd8a32ab80fc5e75eeb30923b34cd7c4c2c64f1bfd26e73f9958b8e43f91\": rpc error: code = NotFound desc = could not find container \"8f20bd8a32ab80fc5e75eeb30923b34cd7c4c2c64f1bfd26e73f9958b8e43f91\": container with ID starting with 8f20bd8a32ab80fc5e75eeb30923b34cd7c4c2c64f1bfd26e73f9958b8e43f91 not found: ID does not exist" Oct 08 19:57:07 crc kubenswrapper[4750]: I1008 19:57:07.433666 4750 scope.go:117] "RemoveContainer" containerID="d65612af69e8eee50f6d23feb1c23ecac0b8982fe6cd0fa0f74eec306e99d84f" Oct 08 19:57:07 crc kubenswrapper[4750]: E1008 19:57:07.434086 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d65612af69e8eee50f6d23feb1c23ecac0b8982fe6cd0fa0f74eec306e99d84f\": container with ID starting with d65612af69e8eee50f6d23feb1c23ecac0b8982fe6cd0fa0f74eec306e99d84f not found: ID does not exist" containerID="d65612af69e8eee50f6d23feb1c23ecac0b8982fe6cd0fa0f74eec306e99d84f" Oct 08 19:57:07 crc kubenswrapper[4750]: I1008 19:57:07.434128 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d65612af69e8eee50f6d23feb1c23ecac0b8982fe6cd0fa0f74eec306e99d84f"} err="failed to get container status \"d65612af69e8eee50f6d23feb1c23ecac0b8982fe6cd0fa0f74eec306e99d84f\": rpc error: code = NotFound desc = could not find container \"d65612af69e8eee50f6d23feb1c23ecac0b8982fe6cd0fa0f74eec306e99d84f\": container with ID starting with d65612af69e8eee50f6d23feb1c23ecac0b8982fe6cd0fa0f74eec306e99d84f not found: ID does not exist" Oct 08 19:57:07 crc kubenswrapper[4750]: I1008 19:57:07.434173 4750 scope.go:117] "RemoveContainer" containerID="561bd1485dca369a56e3bf0cdb125b939eee7527da4493dec8113f6175612b33" Oct 08 19:57:07 crc kubenswrapper[4750]: E1008 
19:57:07.436930 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"561bd1485dca369a56e3bf0cdb125b939eee7527da4493dec8113f6175612b33\": container with ID starting with 561bd1485dca369a56e3bf0cdb125b939eee7527da4493dec8113f6175612b33 not found: ID does not exist" containerID="561bd1485dca369a56e3bf0cdb125b939eee7527da4493dec8113f6175612b33" Oct 08 19:57:07 crc kubenswrapper[4750]: I1008 19:57:07.437041 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"561bd1485dca369a56e3bf0cdb125b939eee7527da4493dec8113f6175612b33"} err="failed to get container status \"561bd1485dca369a56e3bf0cdb125b939eee7527da4493dec8113f6175612b33\": rpc error: code = NotFound desc = could not find container \"561bd1485dca369a56e3bf0cdb125b939eee7527da4493dec8113f6175612b33\": container with ID starting with 561bd1485dca369a56e3bf0cdb125b939eee7527da4493dec8113f6175612b33 not found: ID does not exist" Oct 08 19:57:08 crc kubenswrapper[4750]: I1008 19:57:08.355496 4750 generic.go:334] "Generic (PLEG): container finished" podID="54ade5e3-9ffe-4b69-b060-734b5db093e8" containerID="87ed84e207c8ee0176b27ca029277a7c433695fb11ba21ab9909d99f324072bb" exitCode=137 Oct 08 19:57:08 crc kubenswrapper[4750]: I1008 19:57:08.355970 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f5fb9b649-5qqwr" event={"ID":"54ade5e3-9ffe-4b69-b060-734b5db093e8","Type":"ContainerDied","Data":"87ed84e207c8ee0176b27ca029277a7c433695fb11ba21ab9909d99f324072bb"} Oct 08 19:57:08 crc kubenswrapper[4750]: I1008 19:57:08.356014 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f5fb9b649-5qqwr" event={"ID":"54ade5e3-9ffe-4b69-b060-734b5db093e8","Type":"ContainerDied","Data":"31fa5ca3756638d52ce5029fe8a1f6ea96eef0ce6712f40674d77909571123a0"} Oct 08 19:57:08 crc kubenswrapper[4750]: I1008 19:57:08.356029 4750 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="31fa5ca3756638d52ce5029fe8a1f6ea96eef0ce6712f40674d77909571123a0" Oct 08 19:57:08 crc kubenswrapper[4750]: I1008 19:57:08.366534 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-f5fb9b649-5qqwr" Oct 08 19:57:08 crc kubenswrapper[4750]: I1008 19:57:08.499584 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/54ade5e3-9ffe-4b69-b060-734b5db093e8-horizon-secret-key\") pod \"54ade5e3-9ffe-4b69-b060-734b5db093e8\" (UID: \"54ade5e3-9ffe-4b69-b060-734b5db093e8\") " Oct 08 19:57:08 crc kubenswrapper[4750]: I1008 19:57:08.499690 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/54ade5e3-9ffe-4b69-b060-734b5db093e8-scripts\") pod \"54ade5e3-9ffe-4b69-b060-734b5db093e8\" (UID: \"54ade5e3-9ffe-4b69-b060-734b5db093e8\") " Oct 08 19:57:08 crc kubenswrapper[4750]: I1008 19:57:08.499954 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54ade5e3-9ffe-4b69-b060-734b5db093e8-logs\") pod \"54ade5e3-9ffe-4b69-b060-734b5db093e8\" (UID: \"54ade5e3-9ffe-4b69-b060-734b5db093e8\") " Oct 08 19:57:08 crc kubenswrapper[4750]: I1008 19:57:08.500532 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54ade5e3-9ffe-4b69-b060-734b5db093e8-logs" (OuterVolumeSpecName: "logs") pod "54ade5e3-9ffe-4b69-b060-734b5db093e8" (UID: "54ade5e3-9ffe-4b69-b060-734b5db093e8"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:57:08 crc kubenswrapper[4750]: I1008 19:57:08.500776 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/54ade5e3-9ffe-4b69-b060-734b5db093e8-config-data\") pod \"54ade5e3-9ffe-4b69-b060-734b5db093e8\" (UID: \"54ade5e3-9ffe-4b69-b060-734b5db093e8\") " Oct 08 19:57:08 crc kubenswrapper[4750]: I1008 19:57:08.500907 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g64p5\" (UniqueName: \"kubernetes.io/projected/54ade5e3-9ffe-4b69-b060-734b5db093e8-kube-api-access-g64p5\") pod \"54ade5e3-9ffe-4b69-b060-734b5db093e8\" (UID: \"54ade5e3-9ffe-4b69-b060-734b5db093e8\") " Oct 08 19:57:08 crc kubenswrapper[4750]: I1008 19:57:08.502105 4750 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54ade5e3-9ffe-4b69-b060-734b5db093e8-logs\") on node \"crc\" DevicePath \"\"" Oct 08 19:57:08 crc kubenswrapper[4750]: I1008 19:57:08.507435 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54ade5e3-9ffe-4b69-b060-734b5db093e8-kube-api-access-g64p5" (OuterVolumeSpecName: "kube-api-access-g64p5") pod "54ade5e3-9ffe-4b69-b060-734b5db093e8" (UID: "54ade5e3-9ffe-4b69-b060-734b5db093e8"). InnerVolumeSpecName "kube-api-access-g64p5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:57:08 crc kubenswrapper[4750]: I1008 19:57:08.507805 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54ade5e3-9ffe-4b69-b060-734b5db093e8-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "54ade5e3-9ffe-4b69-b060-734b5db093e8" (UID: "54ade5e3-9ffe-4b69-b060-734b5db093e8"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:57:08 crc kubenswrapper[4750]: I1008 19:57:08.527232 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54ade5e3-9ffe-4b69-b060-734b5db093e8-scripts" (OuterVolumeSpecName: "scripts") pod "54ade5e3-9ffe-4b69-b060-734b5db093e8" (UID: "54ade5e3-9ffe-4b69-b060-734b5db093e8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:57:08 crc kubenswrapper[4750]: I1008 19:57:08.537908 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54ade5e3-9ffe-4b69-b060-734b5db093e8-config-data" (OuterVolumeSpecName: "config-data") pod "54ade5e3-9ffe-4b69-b060-734b5db093e8" (UID: "54ade5e3-9ffe-4b69-b060-734b5db093e8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:57:08 crc kubenswrapper[4750]: I1008 19:57:08.603999 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/54ade5e3-9ffe-4b69-b060-734b5db093e8-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 19:57:08 crc kubenswrapper[4750]: I1008 19:57:08.604042 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g64p5\" (UniqueName: \"kubernetes.io/projected/54ade5e3-9ffe-4b69-b060-734b5db093e8-kube-api-access-g64p5\") on node \"crc\" DevicePath \"\"" Oct 08 19:57:08 crc kubenswrapper[4750]: I1008 19:57:08.604053 4750 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/54ade5e3-9ffe-4b69-b060-734b5db093e8-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 08 19:57:08 crc kubenswrapper[4750]: I1008 19:57:08.604063 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/54ade5e3-9ffe-4b69-b060-734b5db093e8-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 19:57:08 crc 
kubenswrapper[4750]: I1008 19:57:08.751059 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ae2cfdc-157d-431d-8ad4-9c1afcec9085" path="/var/lib/kubelet/pods/1ae2cfdc-157d-431d-8ad4-9c1afcec9085/volumes" Oct 08 19:57:09 crc kubenswrapper[4750]: I1008 19:57:09.367280 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-f5fb9b649-5qqwr" Oct 08 19:57:09 crc kubenswrapper[4750]: I1008 19:57:09.400746 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-f5fb9b649-5qqwr"] Oct 08 19:57:09 crc kubenswrapper[4750]: I1008 19:57:09.411215 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-f5fb9b649-5qqwr"] Oct 08 19:57:10 crc kubenswrapper[4750]: I1008 19:57:10.747049 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54ade5e3-9ffe-4b69-b060-734b5db093e8" path="/var/lib/kubelet/pods/54ade5e3-9ffe-4b69-b060-734b5db093e8/volumes" Oct 08 19:57:15 crc kubenswrapper[4750]: I1008 19:57:15.295232 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drhk4d"] Oct 08 19:57:15 crc kubenswrapper[4750]: E1008 19:57:15.296957 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54ade5e3-9ffe-4b69-b060-734b5db093e8" containerName="horizon" Oct 08 19:57:15 crc kubenswrapper[4750]: I1008 19:57:15.296982 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="54ade5e3-9ffe-4b69-b060-734b5db093e8" containerName="horizon" Oct 08 19:57:15 crc kubenswrapper[4750]: E1008 19:57:15.297006 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54ade5e3-9ffe-4b69-b060-734b5db093e8" containerName="horizon-log" Oct 08 19:57:15 crc kubenswrapper[4750]: I1008 19:57:15.297018 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="54ade5e3-9ffe-4b69-b060-734b5db093e8" containerName="horizon-log" Oct 08 19:57:15 crc 
kubenswrapper[4750]: E1008 19:57:15.297041 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ae2cfdc-157d-431d-8ad4-9c1afcec9085" containerName="registry-server" Oct 08 19:57:15 crc kubenswrapper[4750]: I1008 19:57:15.297052 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ae2cfdc-157d-431d-8ad4-9c1afcec9085" containerName="registry-server" Oct 08 19:57:15 crc kubenswrapper[4750]: E1008 19:57:15.297079 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ae2cfdc-157d-431d-8ad4-9c1afcec9085" containerName="extract-content" Oct 08 19:57:15 crc kubenswrapper[4750]: I1008 19:57:15.297089 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ae2cfdc-157d-431d-8ad4-9c1afcec9085" containerName="extract-content" Oct 08 19:57:15 crc kubenswrapper[4750]: E1008 19:57:15.297114 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ae2cfdc-157d-431d-8ad4-9c1afcec9085" containerName="extract-utilities" Oct 08 19:57:15 crc kubenswrapper[4750]: I1008 19:57:15.297125 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ae2cfdc-157d-431d-8ad4-9c1afcec9085" containerName="extract-utilities" Oct 08 19:57:15 crc kubenswrapper[4750]: I1008 19:57:15.297486 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ae2cfdc-157d-431d-8ad4-9c1afcec9085" containerName="registry-server" Oct 08 19:57:15 crc kubenswrapper[4750]: I1008 19:57:15.297516 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="54ade5e3-9ffe-4b69-b060-734b5db093e8" containerName="horizon-log" Oct 08 19:57:15 crc kubenswrapper[4750]: I1008 19:57:15.297581 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="54ade5e3-9ffe-4b69-b060-734b5db093e8" containerName="horizon" Oct 08 19:57:15 crc kubenswrapper[4750]: I1008 19:57:15.300407 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drhk4d" Oct 08 19:57:15 crc kubenswrapper[4750]: I1008 19:57:15.305474 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drhk4d"] Oct 08 19:57:15 crc kubenswrapper[4750]: I1008 19:57:15.309196 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 08 19:57:15 crc kubenswrapper[4750]: I1008 19:57:15.481074 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0ac80049-fa73-452c-975e-35d0b0a121de-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drhk4d\" (UID: \"0ac80049-fa73-452c-975e-35d0b0a121de\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drhk4d" Oct 08 19:57:15 crc kubenswrapper[4750]: I1008 19:57:15.481486 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v2wg\" (UniqueName: \"kubernetes.io/projected/0ac80049-fa73-452c-975e-35d0b0a121de-kube-api-access-8v2wg\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drhk4d\" (UID: \"0ac80049-fa73-452c-975e-35d0b0a121de\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drhk4d" Oct 08 19:57:15 crc kubenswrapper[4750]: I1008 19:57:15.481796 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0ac80049-fa73-452c-975e-35d0b0a121de-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drhk4d\" (UID: \"0ac80049-fa73-452c-975e-35d0b0a121de\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drhk4d" Oct 08 19:57:15 crc kubenswrapper[4750]: 
I1008 19:57:15.584915 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0ac80049-fa73-452c-975e-35d0b0a121de-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drhk4d\" (UID: \"0ac80049-fa73-452c-975e-35d0b0a121de\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drhk4d" Oct 08 19:57:15 crc kubenswrapper[4750]: I1008 19:57:15.585208 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0ac80049-fa73-452c-975e-35d0b0a121de-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drhk4d\" (UID: \"0ac80049-fa73-452c-975e-35d0b0a121de\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drhk4d" Oct 08 19:57:15 crc kubenswrapper[4750]: I1008 19:57:15.585252 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v2wg\" (UniqueName: \"kubernetes.io/projected/0ac80049-fa73-452c-975e-35d0b0a121de-kube-api-access-8v2wg\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drhk4d\" (UID: \"0ac80049-fa73-452c-975e-35d0b0a121de\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drhk4d" Oct 08 19:57:15 crc kubenswrapper[4750]: I1008 19:57:15.586076 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0ac80049-fa73-452c-975e-35d0b0a121de-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drhk4d\" (UID: \"0ac80049-fa73-452c-975e-35d0b0a121de\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drhk4d" Oct 08 19:57:15 crc kubenswrapper[4750]: I1008 19:57:15.586290 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/0ac80049-fa73-452c-975e-35d0b0a121de-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drhk4d\" (UID: \"0ac80049-fa73-452c-975e-35d0b0a121de\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drhk4d" Oct 08 19:57:15 crc kubenswrapper[4750]: I1008 19:57:15.611062 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v2wg\" (UniqueName: \"kubernetes.io/projected/0ac80049-fa73-452c-975e-35d0b0a121de-kube-api-access-8v2wg\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drhk4d\" (UID: \"0ac80049-fa73-452c-975e-35d0b0a121de\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drhk4d" Oct 08 19:57:15 crc kubenswrapper[4750]: I1008 19:57:15.646799 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drhk4d" Oct 08 19:57:16 crc kubenswrapper[4750]: I1008 19:57:16.228428 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drhk4d"] Oct 08 19:57:16 crc kubenswrapper[4750]: I1008 19:57:16.443145 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drhk4d" event={"ID":"0ac80049-fa73-452c-975e-35d0b0a121de","Type":"ContainerStarted","Data":"58f0c1cf6ad0ea585e8dd921715025decd64a9c1d82e6c9bd48003dbd092e3e1"} Oct 08 19:57:17 crc kubenswrapper[4750]: I1008 19:57:17.458450 4750 generic.go:334] "Generic (PLEG): container finished" podID="0ac80049-fa73-452c-975e-35d0b0a121de" containerID="02d0d7fa1926a920207d485d9b5d09324ff5801b8700c13ca7cb848c863c24a5" exitCode=0 Oct 08 19:57:17 crc kubenswrapper[4750]: I1008 19:57:17.458586 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drhk4d" event={"ID":"0ac80049-fa73-452c-975e-35d0b0a121de","Type":"ContainerDied","Data":"02d0d7fa1926a920207d485d9b5d09324ff5801b8700c13ca7cb848c863c24a5"} Oct 08 19:57:19 crc kubenswrapper[4750]: I1008 19:57:19.495929 4750 generic.go:334] "Generic (PLEG): container finished" podID="0ac80049-fa73-452c-975e-35d0b0a121de" containerID="63caba8e2d355738f9fd80fffbc2f28c16cd906c7c1ebead8fdad5b888b6e0f0" exitCode=0 Oct 08 19:57:19 crc kubenswrapper[4750]: I1008 19:57:19.495998 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drhk4d" event={"ID":"0ac80049-fa73-452c-975e-35d0b0a121de","Type":"ContainerDied","Data":"63caba8e2d355738f9fd80fffbc2f28c16cd906c7c1ebead8fdad5b888b6e0f0"} Oct 08 19:57:20 crc kubenswrapper[4750]: I1008 19:57:20.509155 4750 generic.go:334] "Generic (PLEG): container finished" podID="0ac80049-fa73-452c-975e-35d0b0a121de" containerID="1f73e7a460662b78b3e7d9884f1ec48445e86ff575180ef77ffa07ff173714a7" exitCode=0 Oct 08 19:57:20 crc kubenswrapper[4750]: I1008 19:57:20.509358 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drhk4d" event={"ID":"0ac80049-fa73-452c-975e-35d0b0a121de","Type":"ContainerDied","Data":"1f73e7a460662b78b3e7d9884f1ec48445e86ff575180ef77ffa07ff173714a7"} Oct 08 19:57:22 crc kubenswrapper[4750]: I1008 19:57:22.072083 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drhk4d" Oct 08 19:57:22 crc kubenswrapper[4750]: I1008 19:57:22.201647 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0ac80049-fa73-452c-975e-35d0b0a121de-bundle\") pod \"0ac80049-fa73-452c-975e-35d0b0a121de\" (UID: \"0ac80049-fa73-452c-975e-35d0b0a121de\") " Oct 08 19:57:22 crc kubenswrapper[4750]: I1008 19:57:22.201742 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8v2wg\" (UniqueName: \"kubernetes.io/projected/0ac80049-fa73-452c-975e-35d0b0a121de-kube-api-access-8v2wg\") pod \"0ac80049-fa73-452c-975e-35d0b0a121de\" (UID: \"0ac80049-fa73-452c-975e-35d0b0a121de\") " Oct 08 19:57:22 crc kubenswrapper[4750]: I1008 19:57:22.201804 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0ac80049-fa73-452c-975e-35d0b0a121de-util\") pod \"0ac80049-fa73-452c-975e-35d0b0a121de\" (UID: \"0ac80049-fa73-452c-975e-35d0b0a121de\") " Oct 08 19:57:22 crc kubenswrapper[4750]: I1008 19:57:22.204235 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ac80049-fa73-452c-975e-35d0b0a121de-bundle" (OuterVolumeSpecName: "bundle") pod "0ac80049-fa73-452c-975e-35d0b0a121de" (UID: "0ac80049-fa73-452c-975e-35d0b0a121de"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:57:22 crc kubenswrapper[4750]: I1008 19:57:22.209476 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ac80049-fa73-452c-975e-35d0b0a121de-kube-api-access-8v2wg" (OuterVolumeSpecName: "kube-api-access-8v2wg") pod "0ac80049-fa73-452c-975e-35d0b0a121de" (UID: "0ac80049-fa73-452c-975e-35d0b0a121de"). InnerVolumeSpecName "kube-api-access-8v2wg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:57:22 crc kubenswrapper[4750]: I1008 19:57:22.218138 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ac80049-fa73-452c-975e-35d0b0a121de-util" (OuterVolumeSpecName: "util") pod "0ac80049-fa73-452c-975e-35d0b0a121de" (UID: "0ac80049-fa73-452c-975e-35d0b0a121de"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:57:22 crc kubenswrapper[4750]: I1008 19:57:22.304574 4750 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0ac80049-fa73-452c-975e-35d0b0a121de-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 19:57:22 crc kubenswrapper[4750]: I1008 19:57:22.304612 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8v2wg\" (UniqueName: \"kubernetes.io/projected/0ac80049-fa73-452c-975e-35d0b0a121de-kube-api-access-8v2wg\") on node \"crc\" DevicePath \"\"" Oct 08 19:57:22 crc kubenswrapper[4750]: I1008 19:57:22.304628 4750 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0ac80049-fa73-452c-975e-35d0b0a121de-util\") on node \"crc\" DevicePath \"\"" Oct 08 19:57:22 crc kubenswrapper[4750]: I1008 19:57:22.533289 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drhk4d" event={"ID":"0ac80049-fa73-452c-975e-35d0b0a121de","Type":"ContainerDied","Data":"58f0c1cf6ad0ea585e8dd921715025decd64a9c1d82e6c9bd48003dbd092e3e1"} Oct 08 19:57:22 crc kubenswrapper[4750]: I1008 19:57:22.533354 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58f0c1cf6ad0ea585e8dd921715025decd64a9c1d82e6c9bd48003dbd092e3e1" Oct 08 19:57:22 crc kubenswrapper[4750]: I1008 19:57:22.533318 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drhk4d" Oct 08 19:57:33 crc kubenswrapper[4750]: I1008 19:57:33.782540 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-5xxjb"] Oct 08 19:57:33 crc kubenswrapper[4750]: E1008 19:57:33.783752 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ac80049-fa73-452c-975e-35d0b0a121de" containerName="pull" Oct 08 19:57:33 crc kubenswrapper[4750]: I1008 19:57:33.783770 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ac80049-fa73-452c-975e-35d0b0a121de" containerName="pull" Oct 08 19:57:33 crc kubenswrapper[4750]: E1008 19:57:33.783786 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ac80049-fa73-452c-975e-35d0b0a121de" containerName="extract" Oct 08 19:57:33 crc kubenswrapper[4750]: I1008 19:57:33.783793 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ac80049-fa73-452c-975e-35d0b0a121de" containerName="extract" Oct 08 19:57:33 crc kubenswrapper[4750]: E1008 19:57:33.783837 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ac80049-fa73-452c-975e-35d0b0a121de" containerName="util" Oct 08 19:57:33 crc kubenswrapper[4750]: I1008 19:57:33.783845 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ac80049-fa73-452c-975e-35d0b0a121de" containerName="util" Oct 08 19:57:33 crc kubenswrapper[4750]: I1008 19:57:33.784051 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ac80049-fa73-452c-975e-35d0b0a121de" containerName="extract" Oct 08 19:57:33 crc kubenswrapper[4750]: I1008 19:57:33.784827 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-5xxjb" Oct 08 19:57:33 crc kubenswrapper[4750]: I1008 19:57:33.796266 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Oct 08 19:57:33 crc kubenswrapper[4750]: I1008 19:57:33.796662 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-6bvcq" Oct 08 19:57:33 crc kubenswrapper[4750]: I1008 19:57:33.796802 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Oct 08 19:57:33 crc kubenswrapper[4750]: I1008 19:57:33.837885 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-5xxjb"] Oct 08 19:57:33 crc kubenswrapper[4750]: I1008 19:57:33.902642 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-f4865c8dd-vnthj"] Oct 08 19:57:33 crc kubenswrapper[4750]: I1008 19:57:33.905065 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f4865c8dd-vnthj" Oct 08 19:57:33 crc kubenswrapper[4750]: I1008 19:57:33.910904 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-cxhrk" Oct 08 19:57:33 crc kubenswrapper[4750]: I1008 19:57:33.911092 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Oct 08 19:57:33 crc kubenswrapper[4750]: I1008 19:57:33.926639 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-f4865c8dd-hzw7x"] Oct 08 19:57:33 crc kubenswrapper[4750]: I1008 19:57:33.928320 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f4865c8dd-hzw7x" Oct 08 19:57:33 crc kubenswrapper[4750]: I1008 19:57:33.965497 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-f4865c8dd-vnthj"] Oct 08 19:57:33 crc kubenswrapper[4750]: I1008 19:57:33.993424 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/38b5c4ed-86e9-4be1-8df3-be9cd23ffa4f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-f4865c8dd-vnthj\" (UID: \"38b5c4ed-86e9-4be1-8df3-be9cd23ffa4f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f4865c8dd-vnthj" Oct 08 19:57:33 crc kubenswrapper[4750]: I1008 19:57:33.993495 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw2qx\" (UniqueName: \"kubernetes.io/projected/32704cbb-ea75-4dc3-96b1-718b994fe335-kube-api-access-sw2qx\") pod \"obo-prometheus-operator-7c8cf85677-5xxjb\" (UID: \"32704cbb-ea75-4dc3-96b1-718b994fe335\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-5xxjb" Oct 08 19:57:33 crc kubenswrapper[4750]: I1008 19:57:33.993523 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/38b5c4ed-86e9-4be1-8df3-be9cd23ffa4f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-f4865c8dd-vnthj\" (UID: \"38b5c4ed-86e9-4be1-8df3-be9cd23ffa4f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f4865c8dd-vnthj" Oct 08 19:57:34 crc kubenswrapper[4750]: I1008 19:57:34.014677 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-f4865c8dd-hzw7x"] Oct 08 19:57:34 crc kubenswrapper[4750]: I1008 19:57:34.105649 4750 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-6sxl4"] Oct 08 19:57:34 crc kubenswrapper[4750]: I1008 19:57:34.163651 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/38b5c4ed-86e9-4be1-8df3-be9cd23ffa4f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-f4865c8dd-vnthj\" (UID: \"38b5c4ed-86e9-4be1-8df3-be9cd23ffa4f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f4865c8dd-vnthj" Oct 08 19:57:34 crc kubenswrapper[4750]: I1008 19:57:34.164045 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dc3a1db5-d8d5-4138-966c-93257a1f27f7-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-f4865c8dd-hzw7x\" (UID: \"dc3a1db5-d8d5-4138-966c-93257a1f27f7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f4865c8dd-hzw7x" Oct 08 19:57:34 crc kubenswrapper[4750]: I1008 19:57:34.164393 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dc3a1db5-d8d5-4138-966c-93257a1f27f7-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-f4865c8dd-hzw7x\" (UID: \"dc3a1db5-d8d5-4138-966c-93257a1f27f7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f4865c8dd-hzw7x" Oct 08 19:57:34 crc kubenswrapper[4750]: I1008 19:57:34.164457 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/38b5c4ed-86e9-4be1-8df3-be9cd23ffa4f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-f4865c8dd-vnthj\" (UID: \"38b5c4ed-86e9-4be1-8df3-be9cd23ffa4f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f4865c8dd-vnthj" Oct 08 19:57:34 crc kubenswrapper[4750]: I1008 
19:57:34.164577 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sw2qx\" (UniqueName: \"kubernetes.io/projected/32704cbb-ea75-4dc3-96b1-718b994fe335-kube-api-access-sw2qx\") pod \"obo-prometheus-operator-7c8cf85677-5xxjb\" (UID: \"32704cbb-ea75-4dc3-96b1-718b994fe335\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-5xxjb" Oct 08 19:57:34 crc kubenswrapper[4750]: I1008 19:57:34.164455 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-6sxl4" Oct 08 19:57:34 crc kubenswrapper[4750]: I1008 19:57:34.165241 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-6sxl4"] Oct 08 19:57:34 crc kubenswrapper[4750]: I1008 19:57:34.172292 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-svs2x" Oct 08 19:57:34 crc kubenswrapper[4750]: I1008 19:57:34.172536 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Oct 08 19:57:34 crc kubenswrapper[4750]: I1008 19:57:34.189183 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/38b5c4ed-86e9-4be1-8df3-be9cd23ffa4f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-f4865c8dd-vnthj\" (UID: \"38b5c4ed-86e9-4be1-8df3-be9cd23ffa4f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f4865c8dd-vnthj" Oct 08 19:57:34 crc kubenswrapper[4750]: I1008 19:57:34.195193 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/38b5c4ed-86e9-4be1-8df3-be9cd23ffa4f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-f4865c8dd-vnthj\" (UID: \"38b5c4ed-86e9-4be1-8df3-be9cd23ffa4f\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-f4865c8dd-vnthj" Oct 08 19:57:34 crc kubenswrapper[4750]: I1008 19:57:34.215460 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw2qx\" (UniqueName: \"kubernetes.io/projected/32704cbb-ea75-4dc3-96b1-718b994fe335-kube-api-access-sw2qx\") pod \"obo-prometheus-operator-7c8cf85677-5xxjb\" (UID: \"32704cbb-ea75-4dc3-96b1-718b994fe335\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-5xxjb" Oct 08 19:57:34 crc kubenswrapper[4750]: I1008 19:57:34.248936 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f4865c8dd-vnthj" Oct 08 19:57:34 crc kubenswrapper[4750]: I1008 19:57:34.270854 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dc3a1db5-d8d5-4138-966c-93257a1f27f7-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-f4865c8dd-hzw7x\" (UID: \"dc3a1db5-d8d5-4138-966c-93257a1f27f7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f4865c8dd-hzw7x" Oct 08 19:57:34 crc kubenswrapper[4750]: I1008 19:57:34.270931 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xzqh\" (UniqueName: \"kubernetes.io/projected/55701760-1c94-48e7-a3b8-42571d13ac31-kube-api-access-8xzqh\") pod \"observability-operator-cc5f78dfc-6sxl4\" (UID: \"55701760-1c94-48e7-a3b8-42571d13ac31\") " pod="openshift-operators/observability-operator-cc5f78dfc-6sxl4" Oct 08 19:57:34 crc kubenswrapper[4750]: I1008 19:57:34.271016 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/55701760-1c94-48e7-a3b8-42571d13ac31-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-6sxl4\" (UID: 
\"55701760-1c94-48e7-a3b8-42571d13ac31\") " pod="openshift-operators/observability-operator-cc5f78dfc-6sxl4" Oct 08 19:57:34 crc kubenswrapper[4750]: I1008 19:57:34.271042 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dc3a1db5-d8d5-4138-966c-93257a1f27f7-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-f4865c8dd-hzw7x\" (UID: \"dc3a1db5-d8d5-4138-966c-93257a1f27f7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f4865c8dd-hzw7x" Oct 08 19:57:34 crc kubenswrapper[4750]: I1008 19:57:34.282631 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dc3a1db5-d8d5-4138-966c-93257a1f27f7-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-f4865c8dd-hzw7x\" (UID: \"dc3a1db5-d8d5-4138-966c-93257a1f27f7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f4865c8dd-hzw7x" Oct 08 19:57:34 crc kubenswrapper[4750]: I1008 19:57:34.309257 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dc3a1db5-d8d5-4138-966c-93257a1f27f7-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-f4865c8dd-hzw7x\" (UID: \"dc3a1db5-d8d5-4138-966c-93257a1f27f7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f4865c8dd-hzw7x" Oct 08 19:57:34 crc kubenswrapper[4750]: I1008 19:57:34.410505 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xzqh\" (UniqueName: \"kubernetes.io/projected/55701760-1c94-48e7-a3b8-42571d13ac31-kube-api-access-8xzqh\") pod \"observability-operator-cc5f78dfc-6sxl4\" (UID: \"55701760-1c94-48e7-a3b8-42571d13ac31\") " pod="openshift-operators/observability-operator-cc5f78dfc-6sxl4" Oct 08 19:57:34 crc kubenswrapper[4750]: I1008 19:57:34.410797 4750 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/55701760-1c94-48e7-a3b8-42571d13ac31-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-6sxl4\" (UID: \"55701760-1c94-48e7-a3b8-42571d13ac31\") " pod="openshift-operators/observability-operator-cc5f78dfc-6sxl4" Oct 08 19:57:34 crc kubenswrapper[4750]: I1008 19:57:34.428384 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-h6jpz"] Oct 08 19:57:34 crc kubenswrapper[4750]: I1008 19:57:34.429874 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-5xxjb" Oct 08 19:57:34 crc kubenswrapper[4750]: I1008 19:57:34.430168 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-h6jpz" Oct 08 19:57:34 crc kubenswrapper[4750]: I1008 19:57:34.437459 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/55701760-1c94-48e7-a3b8-42571d13ac31-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-6sxl4\" (UID: \"55701760-1c94-48e7-a3b8-42571d13ac31\") " pod="openshift-operators/observability-operator-cc5f78dfc-6sxl4" Oct 08 19:57:34 crc kubenswrapper[4750]: I1008 19:57:34.437913 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-5tts5" Oct 08 19:57:34 crc kubenswrapper[4750]: I1008 19:57:34.478650 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-h6jpz"] Oct 08 19:57:34 crc kubenswrapper[4750]: I1008 19:57:34.481999 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xzqh\" (UniqueName: \"kubernetes.io/projected/55701760-1c94-48e7-a3b8-42571d13ac31-kube-api-access-8xzqh\") pod 
\"observability-operator-cc5f78dfc-6sxl4\" (UID: \"55701760-1c94-48e7-a3b8-42571d13ac31\") " pod="openshift-operators/observability-operator-cc5f78dfc-6sxl4" Oct 08 19:57:34 crc kubenswrapper[4750]: I1008 19:57:34.518828 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/34a0461d-6888-438f-bca8-95ffbb37e14e-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-h6jpz\" (UID: \"34a0461d-6888-438f-bca8-95ffbb37e14e\") " pod="openshift-operators/perses-operator-54bc95c9fb-h6jpz" Oct 08 19:57:34 crc kubenswrapper[4750]: I1008 19:57:34.518988 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlj5p\" (UniqueName: \"kubernetes.io/projected/34a0461d-6888-438f-bca8-95ffbb37e14e-kube-api-access-nlj5p\") pod \"perses-operator-54bc95c9fb-h6jpz\" (UID: \"34a0461d-6888-438f-bca8-95ffbb37e14e\") " pod="openshift-operators/perses-operator-54bc95c9fb-h6jpz" Oct 08 19:57:34 crc kubenswrapper[4750]: I1008 19:57:34.576246 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f4865c8dd-hzw7x" Oct 08 19:57:34 crc kubenswrapper[4750]: I1008 19:57:34.623699 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/34a0461d-6888-438f-bca8-95ffbb37e14e-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-h6jpz\" (UID: \"34a0461d-6888-438f-bca8-95ffbb37e14e\") " pod="openshift-operators/perses-operator-54bc95c9fb-h6jpz" Oct 08 19:57:34 crc kubenswrapper[4750]: I1008 19:57:34.623832 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlj5p\" (UniqueName: \"kubernetes.io/projected/34a0461d-6888-438f-bca8-95ffbb37e14e-kube-api-access-nlj5p\") pod \"perses-operator-54bc95c9fb-h6jpz\" (UID: \"34a0461d-6888-438f-bca8-95ffbb37e14e\") " pod="openshift-operators/perses-operator-54bc95c9fb-h6jpz" Oct 08 19:57:34 crc kubenswrapper[4750]: I1008 19:57:34.625171 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/34a0461d-6888-438f-bca8-95ffbb37e14e-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-h6jpz\" (UID: \"34a0461d-6888-438f-bca8-95ffbb37e14e\") " pod="openshift-operators/perses-operator-54bc95c9fb-h6jpz" Oct 08 19:57:34 crc kubenswrapper[4750]: I1008 19:57:34.649984 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlj5p\" (UniqueName: \"kubernetes.io/projected/34a0461d-6888-438f-bca8-95ffbb37e14e-kube-api-access-nlj5p\") pod \"perses-operator-54bc95c9fb-h6jpz\" (UID: \"34a0461d-6888-438f-bca8-95ffbb37e14e\") " pod="openshift-operators/perses-operator-54bc95c9fb-h6jpz" Oct 08 19:57:34 crc kubenswrapper[4750]: I1008 19:57:34.728983 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-6sxl4" Oct 08 19:57:34 crc kubenswrapper[4750]: I1008 19:57:34.824668 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-h6jpz" Oct 08 19:57:35 crc kubenswrapper[4750]: I1008 19:57:35.239130 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-f4865c8dd-hzw7x"] Oct 08 19:57:35 crc kubenswrapper[4750]: W1008 19:57:35.242046 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32704cbb_ea75_4dc3_96b1_718b994fe335.slice/crio-316fbfd699dbd7cfd9533bfd86dcff55ce76f3320cf369d1dafa78e3058d1fe0 WatchSource:0}: Error finding container 316fbfd699dbd7cfd9533bfd86dcff55ce76f3320cf369d1dafa78e3058d1fe0: Status 404 returned error can't find the container with id 316fbfd699dbd7cfd9533bfd86dcff55ce76f3320cf369d1dafa78e3058d1fe0 Oct 08 19:57:35 crc kubenswrapper[4750]: I1008 19:57:35.251959 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-5xxjb"] Oct 08 19:57:35 crc kubenswrapper[4750]: I1008 19:57:35.262428 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-f4865c8dd-vnthj"] Oct 08 19:57:35 crc kubenswrapper[4750]: W1008 19:57:35.270723 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38b5c4ed_86e9_4be1_8df3_be9cd23ffa4f.slice/crio-3b11a4f5c531c0765546f79952b9c4a2391afa88762690a8ab2e60333ca8bcd7 WatchSource:0}: Error finding container 3b11a4f5c531c0765546f79952b9c4a2391afa88762690a8ab2e60333ca8bcd7: Status 404 returned error can't find the container with id 3b11a4f5c531c0765546f79952b9c4a2391afa88762690a8ab2e60333ca8bcd7 Oct 08 19:57:35 crc kubenswrapper[4750]: 
I1008 19:57:35.512589 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-6sxl4"] Oct 08 19:57:35 crc kubenswrapper[4750]: I1008 19:57:35.607019 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-h6jpz"] Oct 08 19:57:35 crc kubenswrapper[4750]: W1008 19:57:35.616122 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34a0461d_6888_438f_bca8_95ffbb37e14e.slice/crio-0b40bc302d696993f51f7d666bc585d513df68057f89e7e0f9d5de41bb1f4c7d WatchSource:0}: Error finding container 0b40bc302d696993f51f7d666bc585d513df68057f89e7e0f9d5de41bb1f4c7d: Status 404 returned error can't find the container with id 0b40bc302d696993f51f7d666bc585d513df68057f89e7e0f9d5de41bb1f4c7d Oct 08 19:57:35 crc kubenswrapper[4750]: I1008 19:57:35.737129 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f4865c8dd-vnthj" event={"ID":"38b5c4ed-86e9-4be1-8df3-be9cd23ffa4f","Type":"ContainerStarted","Data":"3b11a4f5c531c0765546f79952b9c4a2391afa88762690a8ab2e60333ca8bcd7"} Oct 08 19:57:35 crc kubenswrapper[4750]: I1008 19:57:35.738800 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-6sxl4" event={"ID":"55701760-1c94-48e7-a3b8-42571d13ac31","Type":"ContainerStarted","Data":"79366cc6c56b305d544ab2cc31168580c0a31d89288a8bb89419ce5e1e40966e"} Oct 08 19:57:35 crc kubenswrapper[4750]: I1008 19:57:35.742797 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f4865c8dd-hzw7x" event={"ID":"dc3a1db5-d8d5-4138-966c-93257a1f27f7","Type":"ContainerStarted","Data":"d1bf12b43bfa7ffbf387a33932663c4486d638be2f6382b8fb9ecdec02ff606c"} Oct 08 19:57:35 crc kubenswrapper[4750]: I1008 19:57:35.744822 4750 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-5xxjb" event={"ID":"32704cbb-ea75-4dc3-96b1-718b994fe335","Type":"ContainerStarted","Data":"316fbfd699dbd7cfd9533bfd86dcff55ce76f3320cf369d1dafa78e3058d1fe0"} Oct 08 19:57:35 crc kubenswrapper[4750]: I1008 19:57:35.746568 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-h6jpz" event={"ID":"34a0461d-6888-438f-bca8-95ffbb37e14e","Type":"ContainerStarted","Data":"0b40bc302d696993f51f7d666bc585d513df68057f89e7e0f9d5de41bb1f4c7d"} Oct 08 19:57:41 crc kubenswrapper[4750]: I1008 19:57:41.871920 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f4865c8dd-vnthj" event={"ID":"38b5c4ed-86e9-4be1-8df3-be9cd23ffa4f","Type":"ContainerStarted","Data":"8ef7565284abb0e6a4016483eaf4aac28d0b4a44c457ac89bddb2cb6a7fb755c"} Oct 08 19:57:41 crc kubenswrapper[4750]: I1008 19:57:41.905612 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f4865c8dd-vnthj" podStartSLOduration=2.90777464 podStartE2EDuration="8.905576416s" podCreationTimestamp="2025-10-08 19:57:33 +0000 UTC" firstStartedPulling="2025-10-08 19:57:35.276880255 +0000 UTC m=+6411.189851258" lastFinishedPulling="2025-10-08 19:57:41.274682021 +0000 UTC m=+6417.187653034" observedRunningTime="2025-10-08 19:57:41.897215499 +0000 UTC m=+6417.810186512" watchObservedRunningTime="2025-10-08 19:57:41.905576416 +0000 UTC m=+6417.818547449" Oct 08 19:57:44 crc kubenswrapper[4750]: I1008 19:57:44.039783 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-6dlz8"] Oct 08 19:57:44 crc kubenswrapper[4750]: I1008 19:57:44.052659 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-6dlz8"] Oct 08 19:57:44 crc kubenswrapper[4750]: I1008 19:57:44.773216 4750 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="e30d5ba5-d856-47b4-8b0d-aeb153eee24d" path="/var/lib/kubelet/pods/e30d5ba5-d856-47b4-8b0d-aeb153eee24d/volumes" Oct 08 19:57:45 crc kubenswrapper[4750]: I1008 19:57:45.935901 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-5xxjb" event={"ID":"32704cbb-ea75-4dc3-96b1-718b994fe335","Type":"ContainerStarted","Data":"09901d315609f3096155d229fe2dd372a334c8aa49b2e454c9b5df341767191d"} Oct 08 19:57:45 crc kubenswrapper[4750]: I1008 19:57:45.945281 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-h6jpz" event={"ID":"34a0461d-6888-438f-bca8-95ffbb37e14e","Type":"ContainerStarted","Data":"8bc019ef6e90689ec408c9fe55a78c6a103c263c7d2b36a0100abc08efb56809"} Oct 08 19:57:45 crc kubenswrapper[4750]: I1008 19:57:45.945407 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-54bc95c9fb-h6jpz" Oct 08 19:57:45 crc kubenswrapper[4750]: I1008 19:57:45.949640 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-6sxl4" event={"ID":"55701760-1c94-48e7-a3b8-42571d13ac31","Type":"ContainerStarted","Data":"791fc399a3ce90ba93030d1f8c15623a7efa18f66a311a5b8d61e489cd7fbc59"} Oct 08 19:57:45 crc kubenswrapper[4750]: I1008 19:57:45.951171 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-cc5f78dfc-6sxl4" Oct 08 19:57:45 crc kubenswrapper[4750]: I1008 19:57:45.961978 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f4865c8dd-hzw7x" event={"ID":"dc3a1db5-d8d5-4138-966c-93257a1f27f7","Type":"ContainerStarted","Data":"eba340259d303f7629076f1305621f23431f4ffc558fb5f3c89c6560061a62fd"} Oct 08 19:57:45 crc kubenswrapper[4750]: I1008 19:57:45.970076 4750 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-5xxjb" podStartSLOduration=6.914465895 podStartE2EDuration="12.970051511s" podCreationTimestamp="2025-10-08 19:57:33 +0000 UTC" firstStartedPulling="2025-10-08 19:57:35.249379125 +0000 UTC m=+6411.162350138" lastFinishedPulling="2025-10-08 19:57:41.304964741 +0000 UTC m=+6417.217935754" observedRunningTime="2025-10-08 19:57:45.955793478 +0000 UTC m=+6421.868764501" watchObservedRunningTime="2025-10-08 19:57:45.970051511 +0000 UTC m=+6421.883022524" Oct 08 19:57:45 crc kubenswrapper[4750]: I1008 19:57:45.999831 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-cc5f78dfc-6sxl4" podStartSLOduration=2.3292640110000002 podStartE2EDuration="11.999798647s" podCreationTimestamp="2025-10-08 19:57:34 +0000 UTC" firstStartedPulling="2025-10-08 19:57:35.53807673 +0000 UTC m=+6411.451047743" lastFinishedPulling="2025-10-08 19:57:45.208611366 +0000 UTC m=+6421.121582379" observedRunningTime="2025-10-08 19:57:45.987618966 +0000 UTC m=+6421.900590009" watchObservedRunningTime="2025-10-08 19:57:45.999798647 +0000 UTC m=+6421.912769660" Oct 08 19:57:46 crc kubenswrapper[4750]: I1008 19:57:46.017147 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-54bc95c9fb-h6jpz" podStartSLOduration=6.304281253 podStartE2EDuration="12.017126026s" podCreationTimestamp="2025-10-08 19:57:34 +0000 UTC" firstStartedPulling="2025-10-08 19:57:35.619855744 +0000 UTC m=+6411.532826757" lastFinishedPulling="2025-10-08 19:57:41.332700507 +0000 UTC m=+6417.245671530" observedRunningTime="2025-10-08 19:57:46.006623836 +0000 UTC m=+6421.919594849" watchObservedRunningTime="2025-10-08 19:57:46.017126026 +0000 UTC m=+6421.930097049" Oct 08 19:57:46 crc kubenswrapper[4750]: I1008 19:57:46.023027 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-operators/observability-operator-cc5f78dfc-6sxl4" Oct 08 19:57:46 crc kubenswrapper[4750]: I1008 19:57:46.048786 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f4865c8dd-hzw7x" podStartSLOduration=7.014805098 podStartE2EDuration="13.048761069s" podCreationTimestamp="2025-10-08 19:57:33 +0000 UTC" firstStartedPulling="2025-10-08 19:57:35.241891349 +0000 UTC m=+6411.154862362" lastFinishedPulling="2025-10-08 19:57:41.27584732 +0000 UTC m=+6417.188818333" observedRunningTime="2025-10-08 19:57:46.04070827 +0000 UTC m=+6421.953679293" watchObservedRunningTime="2025-10-08 19:57:46.048761069 +0000 UTC m=+6421.961732082" Oct 08 19:57:54 crc kubenswrapper[4750]: I1008 19:57:54.032632 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-a9d3-account-create-jhghv"] Oct 08 19:57:54 crc kubenswrapper[4750]: I1008 19:57:54.041422 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-a9d3-account-create-jhghv"] Oct 08 19:57:54 crc kubenswrapper[4750]: I1008 19:57:54.751944 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5252b4f6-3b9c-471f-b919-ac87889a95cb" path="/var/lib/kubelet/pods/5252b4f6-3b9c-471f-b919-ac87889a95cb/volumes" Oct 08 19:57:54 crc kubenswrapper[4750]: I1008 19:57:54.827482 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-54bc95c9fb-h6jpz" Oct 08 19:57:57 crc kubenswrapper[4750]: I1008 19:57:57.695095 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 08 19:57:57 crc kubenswrapper[4750]: I1008 19:57:57.695942 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="a62829b0-9993-4ccd-bfea-fc3c9fe0aaee" containerName="openstackclient" 
containerID="cri-o://f2c3e83659946de46e30c1fcf132ef19fcf9fd026c254bd1a157559c4becf3a5" gracePeriod=2 Oct 08 19:57:57 crc kubenswrapper[4750]: I1008 19:57:57.709068 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 08 19:57:57 crc kubenswrapper[4750]: I1008 19:57:57.757436 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 08 19:57:57 crc kubenswrapper[4750]: E1008 19:57:57.758228 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a62829b0-9993-4ccd-bfea-fc3c9fe0aaee" containerName="openstackclient" Oct 08 19:57:57 crc kubenswrapper[4750]: I1008 19:57:57.758300 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="a62829b0-9993-4ccd-bfea-fc3c9fe0aaee" containerName="openstackclient" Oct 08 19:57:57 crc kubenswrapper[4750]: I1008 19:57:57.758614 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="a62829b0-9993-4ccd-bfea-fc3c9fe0aaee" containerName="openstackclient" Oct 08 19:57:57 crc kubenswrapper[4750]: I1008 19:57:57.759646 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 08 19:57:57 crc kubenswrapper[4750]: I1008 19:57:57.773887 4750 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="a62829b0-9993-4ccd-bfea-fc3c9fe0aaee" podUID="fe2f8e4d-30aa-40e6-9b99-d00dc6534dc0" Oct 08 19:57:57 crc kubenswrapper[4750]: I1008 19:57:57.794696 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 08 19:57:57 crc kubenswrapper[4750]: I1008 19:57:57.927624 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fe2f8e4d-30aa-40e6-9b99-d00dc6534dc0-openstack-config\") pod \"openstackclient\" (UID: \"fe2f8e4d-30aa-40e6-9b99-d00dc6534dc0\") " pod="openstack/openstackclient" Oct 08 19:57:57 crc kubenswrapper[4750]: I1008 19:57:57.928286 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fe2f8e4d-30aa-40e6-9b99-d00dc6534dc0-openstack-config-secret\") pod \"openstackclient\" (UID: \"fe2f8e4d-30aa-40e6-9b99-d00dc6534dc0\") " pod="openstack/openstackclient" Oct 08 19:57:57 crc kubenswrapper[4750]: I1008 19:57:57.928363 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvldm\" (UniqueName: \"kubernetes.io/projected/fe2f8e4d-30aa-40e6-9b99-d00dc6534dc0-kube-api-access-bvldm\") pod \"openstackclient\" (UID: \"fe2f8e4d-30aa-40e6-9b99-d00dc6534dc0\") " pod="openstack/openstackclient" Oct 08 19:57:58 crc kubenswrapper[4750]: I1008 19:57:58.003056 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 19:57:58 crc kubenswrapper[4750]: I1008 19:57:58.007754 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 08 19:57:58 crc kubenswrapper[4750]: I1008 19:57:58.011912 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-6fd55" Oct 08 19:57:58 crc kubenswrapper[4750]: I1008 19:57:58.022770 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 19:57:58 crc kubenswrapper[4750]: I1008 19:57:58.032008 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fe2f8e4d-30aa-40e6-9b99-d00dc6534dc0-openstack-config\") pod \"openstackclient\" (UID: \"fe2f8e4d-30aa-40e6-9b99-d00dc6534dc0\") " pod="openstack/openstackclient" Oct 08 19:57:58 crc kubenswrapper[4750]: I1008 19:57:58.032100 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fe2f8e4d-30aa-40e6-9b99-d00dc6534dc0-openstack-config-secret\") pod \"openstackclient\" (UID: \"fe2f8e4d-30aa-40e6-9b99-d00dc6534dc0\") " pod="openstack/openstackclient" Oct 08 19:57:58 crc kubenswrapper[4750]: I1008 19:57:58.032142 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvldm\" (UniqueName: \"kubernetes.io/projected/fe2f8e4d-30aa-40e6-9b99-d00dc6534dc0-kube-api-access-bvldm\") pod \"openstackclient\" (UID: \"fe2f8e4d-30aa-40e6-9b99-d00dc6534dc0\") " pod="openstack/openstackclient" Oct 08 19:57:58 crc kubenswrapper[4750]: I1008 19:57:58.046627 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fe2f8e4d-30aa-40e6-9b99-d00dc6534dc0-openstack-config\") pod \"openstackclient\" (UID: \"fe2f8e4d-30aa-40e6-9b99-d00dc6534dc0\") " pod="openstack/openstackclient" Oct 08 19:57:58 crc kubenswrapper[4750]: I1008 19:57:58.090900 4750 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fe2f8e4d-30aa-40e6-9b99-d00dc6534dc0-openstack-config-secret\") pod \"openstackclient\" (UID: \"fe2f8e4d-30aa-40e6-9b99-d00dc6534dc0\") " pod="openstack/openstackclient" Oct 08 19:57:58 crc kubenswrapper[4750]: I1008 19:57:58.091647 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvldm\" (UniqueName: \"kubernetes.io/projected/fe2f8e4d-30aa-40e6-9b99-d00dc6534dc0-kube-api-access-bvldm\") pod \"openstackclient\" (UID: \"fe2f8e4d-30aa-40e6-9b99-d00dc6534dc0\") " pod="openstack/openstackclient" Oct 08 19:57:58 crc kubenswrapper[4750]: I1008 19:57:58.113749 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 08 19:57:58 crc kubenswrapper[4750]: I1008 19:57:58.150344 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sszjn\" (UniqueName: \"kubernetes.io/projected/6959edb9-fcde-4267-a843-7aa0d1cdff7b-kube-api-access-sszjn\") pod \"kube-state-metrics-0\" (UID: \"6959edb9-fcde-4267-a843-7aa0d1cdff7b\") " pod="openstack/kube-state-metrics-0" Oct 08 19:57:58 crc kubenswrapper[4750]: I1008 19:57:58.254590 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sszjn\" (UniqueName: \"kubernetes.io/projected/6959edb9-fcde-4267-a843-7aa0d1cdff7b-kube-api-access-sszjn\") pod \"kube-state-metrics-0\" (UID: \"6959edb9-fcde-4267-a843-7aa0d1cdff7b\") " pod="openstack/kube-state-metrics-0" Oct 08 19:57:58 crc kubenswrapper[4750]: I1008 19:57:58.341205 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sszjn\" (UniqueName: \"kubernetes.io/projected/6959edb9-fcde-4267-a843-7aa0d1cdff7b-kube-api-access-sszjn\") pod \"kube-state-metrics-0\" (UID: \"6959edb9-fcde-4267-a843-7aa0d1cdff7b\") " pod="openstack/kube-state-metrics-0" Oct 08 19:57:58 crc 
kubenswrapper[4750]: I1008 19:57:58.354010 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 08 19:57:59 crc kubenswrapper[4750]: I1008 19:57:59.088295 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 08 19:57:59 crc kubenswrapper[4750]: I1008 19:57:59.171875 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 19:57:59 crc kubenswrapper[4750]: I1008 19:57:59.222397 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"fe2f8e4d-30aa-40e6-9b99-d00dc6534dc0","Type":"ContainerStarted","Data":"5d2a8e06933dd4ffb0e658f340bb09987136484ee37e993e2bb40e78831a2da6"} Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.071742 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.076499 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.088614 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-wh4dk" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.088896 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.089062 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.089780 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.099181 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.153442 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0ec2d588-f810-44a3-a8a9-cb384a2be42d-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"0ec2d588-f810-44a3-a8a9-cb384a2be42d\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.153521 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0ec2d588-f810-44a3-a8a9-cb384a2be42d-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"0ec2d588-f810-44a3-a8a9-cb384a2be42d\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.153629 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: 
\"kubernetes.io/empty-dir/0ec2d588-f810-44a3-a8a9-cb384a2be42d-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"0ec2d588-f810-44a3-a8a9-cb384a2be42d\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.153685 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0ec2d588-f810-44a3-a8a9-cb384a2be42d-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"0ec2d588-f810-44a3-a8a9-cb384a2be42d\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.153804 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rbjw\" (UniqueName: \"kubernetes.io/projected/0ec2d588-f810-44a3-a8a9-cb384a2be42d-kube-api-access-7rbjw\") pod \"alertmanager-metric-storage-0\" (UID: \"0ec2d588-f810-44a3-a8a9-cb384a2be42d\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.153845 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0ec2d588-f810-44a3-a8a9-cb384a2be42d-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"0ec2d588-f810-44a3-a8a9-cb384a2be42d\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.263880 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"fe2f8e4d-30aa-40e6-9b99-d00dc6534dc0","Type":"ContainerStarted","Data":"6e60d135626179d61cd4ca306291f1278e536aecc3c92408af1f0cbd9188fd4e"} Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.266370 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/0ec2d588-f810-44a3-a8a9-cb384a2be42d-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"0ec2d588-f810-44a3-a8a9-cb384a2be42d\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.266527 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rbjw\" (UniqueName: \"kubernetes.io/projected/0ec2d588-f810-44a3-a8a9-cb384a2be42d-kube-api-access-7rbjw\") pod \"alertmanager-metric-storage-0\" (UID: \"0ec2d588-f810-44a3-a8a9-cb384a2be42d\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.271214 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0ec2d588-f810-44a3-a8a9-cb384a2be42d-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"0ec2d588-f810-44a3-a8a9-cb384a2be42d\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.272290 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6959edb9-fcde-4267-a843-7aa0d1cdff7b","Type":"ContainerStarted","Data":"df99570379cb96553c917d6820904274bd2f65b54a20e77ee8a240613d4c521e"} Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.272412 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6959edb9-fcde-4267-a843-7aa0d1cdff7b","Type":"ContainerStarted","Data":"c23aca32bfad89354bd2c1eb604004c04e4a6c498b0cfb065efb97b212cda314"} Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.274770 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0ec2d588-f810-44a3-a8a9-cb384a2be42d-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"0ec2d588-f810-44a3-a8a9-cb384a2be42d\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 
19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.278378 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0ec2d588-f810-44a3-a8a9-cb384a2be42d-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"0ec2d588-f810-44a3-a8a9-cb384a2be42d\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.278481 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/0ec2d588-f810-44a3-a8a9-cb384a2be42d-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"0ec2d588-f810-44a3-a8a9-cb384a2be42d\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.279259 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/0ec2d588-f810-44a3-a8a9-cb384a2be42d-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"0ec2d588-f810-44a3-a8a9-cb384a2be42d\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.286506 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.286487412 podStartE2EDuration="3.286487412s" podCreationTimestamp="2025-10-08 19:57:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 19:58:00.285356374 +0000 UTC m=+6436.198327397" watchObservedRunningTime="2025-10-08 19:58:00.286487412 +0000 UTC m=+6436.199458425" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.289295 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/0ec2d588-f810-44a3-a8a9-cb384a2be42d-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"0ec2d588-f810-44a3-a8a9-cb384a2be42d\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.289391 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.291782 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.292234 4750 generic.go:334] "Generic (PLEG): container finished" podID="a62829b0-9993-4ccd-bfea-fc3c9fe0aaee" containerID="f2c3e83659946de46e30c1fcf132ef19fcf9fd026c254bd1a157559c4becf3a5" exitCode=137 Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.295624 4750 scope.go:117] "RemoveContainer" containerID="f2c3e83659946de46e30c1fcf132ef19fcf9fd026c254bd1a157559c4becf3a5" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.294046 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0ec2d588-f810-44a3-a8a9-cb384a2be42d-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"0ec2d588-f810-44a3-a8a9-cb384a2be42d\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.294399 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0ec2d588-f810-44a3-a8a9-cb384a2be42d-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"0ec2d588-f810-44a3-a8a9-cb384a2be42d\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.295921 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0ec2d588-f810-44a3-a8a9-cb384a2be42d-config-volume\") pod 
\"alertmanager-metric-storage-0\" (UID: \"0ec2d588-f810-44a3-a8a9-cb384a2be42d\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.316658 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rbjw\" (UniqueName: \"kubernetes.io/projected/0ec2d588-f810-44a3-a8a9-cb384a2be42d-kube-api-access-7rbjw\") pod \"alertmanager-metric-storage-0\" (UID: \"0ec2d588-f810-44a3-a8a9-cb384a2be42d\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.340277 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.818858398 podStartE2EDuration="3.340248622s" podCreationTimestamp="2025-10-08 19:57:57 +0000 UTC" firstStartedPulling="2025-10-08 19:57:59.227349298 +0000 UTC m=+6435.140320311" lastFinishedPulling="2025-10-08 19:57:59.748739522 +0000 UTC m=+6435.661710535" observedRunningTime="2025-10-08 19:58:00.325123607 +0000 UTC m=+6436.238094620" watchObservedRunningTime="2025-10-08 19:58:00.340248622 +0000 UTC m=+6436.253219635" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.382604 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98bq8\" (UniqueName: \"kubernetes.io/projected/a62829b0-9993-4ccd-bfea-fc3c9fe0aaee-kube-api-access-98bq8\") pod \"a62829b0-9993-4ccd-bfea-fc3c9fe0aaee\" (UID: \"a62829b0-9993-4ccd-bfea-fc3c9fe0aaee\") " Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.382820 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a62829b0-9993-4ccd-bfea-fc3c9fe0aaee-openstack-config-secret\") pod \"a62829b0-9993-4ccd-bfea-fc3c9fe0aaee\" (UID: \"a62829b0-9993-4ccd-bfea-fc3c9fe0aaee\") " Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.382991 4750 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a62829b0-9993-4ccd-bfea-fc3c9fe0aaee-openstack-config\") pod \"a62829b0-9993-4ccd-bfea-fc3c9fe0aaee\" (UID: \"a62829b0-9993-4ccd-bfea-fc3c9fe0aaee\") " Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.387781 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a62829b0-9993-4ccd-bfea-fc3c9fe0aaee-kube-api-access-98bq8" (OuterVolumeSpecName: "kube-api-access-98bq8") pod "a62829b0-9993-4ccd-bfea-fc3c9fe0aaee" (UID: "a62829b0-9993-4ccd-bfea-fc3c9fe0aaee"). InnerVolumeSpecName "kube-api-access-98bq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.429936 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a62829b0-9993-4ccd-bfea-fc3c9fe0aaee-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "a62829b0-9993-4ccd-bfea-fc3c9fe0aaee" (UID: "a62829b0-9993-4ccd-bfea-fc3c9fe0aaee"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.486309 4750 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a62829b0-9993-4ccd-bfea-fc3c9fe0aaee-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.486367 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98bq8\" (UniqueName: \"kubernetes.io/projected/a62829b0-9993-4ccd-bfea-fc3c9fe0aaee-kube-api-access-98bq8\") on node \"crc\" DevicePath \"\"" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.505862 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a62829b0-9993-4ccd-bfea-fc3c9fe0aaee-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "a62829b0-9993-4ccd-bfea-fc3c9fe0aaee" (UID: "a62829b0-9993-4ccd-bfea-fc3c9fe0aaee"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.566481 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.592688 4750 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a62829b0-9993-4ccd-bfea-fc3c9fe0aaee-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.752830 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a62829b0-9993-4ccd-bfea-fc3c9fe0aaee" path="/var/lib/kubelet/pods/a62829b0-9993-4ccd-bfea-fc3c9fe0aaee/volumes" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.755277 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.760344 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.762431 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.762968 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.764183 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-q4tv9" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.764417 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.768885 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.769387 4750 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.788309 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.909257 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/937a384a-f15c-4b65-868a-70d040c830d0-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"937a384a-f15c-4b65-868a-70d040c830d0\") " pod="openstack/prometheus-metric-storage-0" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.909336 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/937a384a-f15c-4b65-868a-70d040c830d0-config\") pod \"prometheus-metric-storage-0\" (UID: \"937a384a-f15c-4b65-868a-70d040c830d0\") " pod="openstack/prometheus-metric-storage-0" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.909361 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/937a384a-f15c-4b65-868a-70d040c830d0-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"937a384a-f15c-4b65-868a-70d040c830d0\") " pod="openstack/prometheus-metric-storage-0" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.909404 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/937a384a-f15c-4b65-868a-70d040c830d0-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"937a384a-f15c-4b65-868a-70d040c830d0\") " pod="openstack/prometheus-metric-storage-0" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.909518 4750 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/937a384a-f15c-4b65-868a-70d040c830d0-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"937a384a-f15c-4b65-868a-70d040c830d0\") " pod="openstack/prometheus-metric-storage-0" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.909565 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/937a384a-f15c-4b65-868a-70d040c830d0-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"937a384a-f15c-4b65-868a-70d040c830d0\") " pod="openstack/prometheus-metric-storage-0" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.909709 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swch8\" (UniqueName: \"kubernetes.io/projected/937a384a-f15c-4b65-868a-70d040c830d0-kube-api-access-swch8\") pod \"prometheus-metric-storage-0\" (UID: \"937a384a-f15c-4b65-868a-70d040c830d0\") " pod="openstack/prometheus-metric-storage-0" Oct 08 19:58:00 crc kubenswrapper[4750]: I1008 19:58:00.910082 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-80d5a29e-5739-46a3-98d5-a2f80b993b58\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-80d5a29e-5739-46a3-98d5-a2f80b993b58\") pod \"prometheus-metric-storage-0\" (UID: \"937a384a-f15c-4b65-868a-70d040c830d0\") " pod="openstack/prometheus-metric-storage-0" Oct 08 19:58:01 crc kubenswrapper[4750]: I1008 19:58:01.012435 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-80d5a29e-5739-46a3-98d5-a2f80b993b58\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-80d5a29e-5739-46a3-98d5-a2f80b993b58\") pod \"prometheus-metric-storage-0\" (UID: \"937a384a-f15c-4b65-868a-70d040c830d0\") " 
pod="openstack/prometheus-metric-storage-0" Oct 08 19:58:01 crc kubenswrapper[4750]: I1008 19:58:01.012572 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/937a384a-f15c-4b65-868a-70d040c830d0-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"937a384a-f15c-4b65-868a-70d040c830d0\") " pod="openstack/prometheus-metric-storage-0" Oct 08 19:58:01 crc kubenswrapper[4750]: I1008 19:58:01.012623 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/937a384a-f15c-4b65-868a-70d040c830d0-config\") pod \"prometheus-metric-storage-0\" (UID: \"937a384a-f15c-4b65-868a-70d040c830d0\") " pod="openstack/prometheus-metric-storage-0" Oct 08 19:58:01 crc kubenswrapper[4750]: I1008 19:58:01.012649 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/937a384a-f15c-4b65-868a-70d040c830d0-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"937a384a-f15c-4b65-868a-70d040c830d0\") " pod="openstack/prometheus-metric-storage-0" Oct 08 19:58:01 crc kubenswrapper[4750]: I1008 19:58:01.012695 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/937a384a-f15c-4b65-868a-70d040c830d0-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"937a384a-f15c-4b65-868a-70d040c830d0\") " pod="openstack/prometheus-metric-storage-0" Oct 08 19:58:01 crc kubenswrapper[4750]: I1008 19:58:01.012778 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/937a384a-f15c-4b65-868a-70d040c830d0-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"937a384a-f15c-4b65-868a-70d040c830d0\") " 
pod="openstack/prometheus-metric-storage-0" Oct 08 19:58:01 crc kubenswrapper[4750]: I1008 19:58:01.012811 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/937a384a-f15c-4b65-868a-70d040c830d0-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"937a384a-f15c-4b65-868a-70d040c830d0\") " pod="openstack/prometheus-metric-storage-0" Oct 08 19:58:01 crc kubenswrapper[4750]: I1008 19:58:01.012847 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swch8\" (UniqueName: \"kubernetes.io/projected/937a384a-f15c-4b65-868a-70d040c830d0-kube-api-access-swch8\") pod \"prometheus-metric-storage-0\" (UID: \"937a384a-f15c-4b65-868a-70d040c830d0\") " pod="openstack/prometheus-metric-storage-0" Oct 08 19:58:01 crc kubenswrapper[4750]: I1008 19:58:01.014620 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/937a384a-f15c-4b65-868a-70d040c830d0-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"937a384a-f15c-4b65-868a-70d040c830d0\") " pod="openstack/prometheus-metric-storage-0" Oct 08 19:58:01 crc kubenswrapper[4750]: I1008 19:58:01.020969 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/937a384a-f15c-4b65-868a-70d040c830d0-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"937a384a-f15c-4b65-868a-70d040c830d0\") " pod="openstack/prometheus-metric-storage-0" Oct 08 19:58:01 crc kubenswrapper[4750]: I1008 19:58:01.021487 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/937a384a-f15c-4b65-868a-70d040c830d0-config\") pod \"prometheus-metric-storage-0\" (UID: \"937a384a-f15c-4b65-868a-70d040c830d0\") " pod="openstack/prometheus-metric-storage-0" Oct 08 19:58:01 crc 
kubenswrapper[4750]: I1008 19:58:01.021882 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/937a384a-f15c-4b65-868a-70d040c830d0-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"937a384a-f15c-4b65-868a-70d040c830d0\") " pod="openstack/prometheus-metric-storage-0" Oct 08 19:58:01 crc kubenswrapper[4750]: I1008 19:58:01.023357 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/937a384a-f15c-4b65-868a-70d040c830d0-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"937a384a-f15c-4b65-868a-70d040c830d0\") " pod="openstack/prometheus-metric-storage-0" Oct 08 19:58:01 crc kubenswrapper[4750]: I1008 19:58:01.024256 4750 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 08 19:58:01 crc kubenswrapper[4750]: I1008 19:58:01.024285 4750 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-80d5a29e-5739-46a3-98d5-a2f80b993b58\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-80d5a29e-5739-46a3-98d5-a2f80b993b58\") pod \"prometheus-metric-storage-0\" (UID: \"937a384a-f15c-4b65-868a-70d040c830d0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d5e1cc23328afec9dc48cca9b9dd13e61bc5dcc9b1c3e25113172de9f199770d/globalmount\"" pod="openstack/prometheus-metric-storage-0" Oct 08 19:58:01 crc kubenswrapper[4750]: I1008 19:58:01.033500 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/937a384a-f15c-4b65-868a-70d040c830d0-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"937a384a-f15c-4b65-868a-70d040c830d0\") " pod="openstack/prometheus-metric-storage-0" Oct 08 19:58:01 crc kubenswrapper[4750]: I1008 19:58:01.036698 4750 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swch8\" (UniqueName: \"kubernetes.io/projected/937a384a-f15c-4b65-868a-70d040c830d0-kube-api-access-swch8\") pod \"prometheus-metric-storage-0\" (UID: \"937a384a-f15c-4b65-868a-70d040c830d0\") " pod="openstack/prometheus-metric-storage-0" Oct 08 19:58:01 crc kubenswrapper[4750]: I1008 19:58:01.083488 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-80d5a29e-5739-46a3-98d5-a2f80b993b58\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-80d5a29e-5739-46a3-98d5-a2f80b993b58\") pod \"prometheus-metric-storage-0\" (UID: \"937a384a-f15c-4b65-868a-70d040c830d0\") " pod="openstack/prometheus-metric-storage-0" Oct 08 19:58:01 crc kubenswrapper[4750]: I1008 19:58:01.112798 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 08 19:58:01 crc kubenswrapper[4750]: I1008 19:58:01.153920 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Oct 08 19:58:01 crc kubenswrapper[4750]: W1008 19:58:01.193327 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ec2d588_f810_44a3_a8a9_cb384a2be42d.slice/crio-42d20a9b73f253c72596e6188c78b19ed58cbcee37aec0720f0e44aad6bf18f0 WatchSource:0}: Error finding container 42d20a9b73f253c72596e6188c78b19ed58cbcee37aec0720f0e44aad6bf18f0: Status 404 returned error can't find the container with id 42d20a9b73f253c72596e6188c78b19ed58cbcee37aec0720f0e44aad6bf18f0 Oct 08 19:58:01 crc kubenswrapper[4750]: I1008 19:58:01.322251 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"0ec2d588-f810-44a3-a8a9-cb384a2be42d","Type":"ContainerStarted","Data":"42d20a9b73f253c72596e6188c78b19ed58cbcee37aec0720f0e44aad6bf18f0"} Oct 08 19:58:01 crc kubenswrapper[4750]: I1008 
19:58:01.325479 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 08 19:58:01 crc kubenswrapper[4750]: I1008 19:58:01.721334 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 08 19:58:02 crc kubenswrapper[4750]: I1008 19:58:02.075745 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-mtvkc"] Oct 08 19:58:02 crc kubenswrapper[4750]: I1008 19:58:02.082521 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-mtvkc"] Oct 08 19:58:02 crc kubenswrapper[4750]: I1008 19:58:02.354357 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"937a384a-f15c-4b65-868a-70d040c830d0","Type":"ContainerStarted","Data":"cf5d71fcdb911d458d6130f44cb07659181411d5f733b7b355a6a2a62cbf02ad"} Oct 08 19:58:02 crc kubenswrapper[4750]: I1008 19:58:02.763870 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3aeb13a0-48b6-4a84-998d-3fb8a64d84ac" path="/var/lib/kubelet/pods/3aeb13a0-48b6-4a84-998d-3fb8a64d84ac/volumes" Oct 08 19:58:06 crc kubenswrapper[4750]: I1008 19:58:06.960697 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-524gz"] Oct 08 19:58:06 crc kubenswrapper[4750]: I1008 19:58:06.963719 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-524gz" Oct 08 19:58:06 crc kubenswrapper[4750]: I1008 19:58:06.978093 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-524gz"] Oct 08 19:58:07 crc kubenswrapper[4750]: I1008 19:58:07.002513 4750 scope.go:117] "RemoveContainer" containerID="666bcd48683c2fa3cc15aee36c899e12b20a9c3914b7cce1af304b608a27ad58" Oct 08 19:58:07 crc kubenswrapper[4750]: I1008 19:58:07.054879 4750 scope.go:117] "RemoveContainer" containerID="7f0511dc6da9cfa214336414c466dbad8936913782bd5527a4d15e1f0bd855b4" Oct 08 19:58:07 crc kubenswrapper[4750]: I1008 19:58:07.090998 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/506293fa-b6e2-45fd-9610-647dd22b2d2e-catalog-content\") pod \"community-operators-524gz\" (UID: \"506293fa-b6e2-45fd-9610-647dd22b2d2e\") " pod="openshift-marketplace/community-operators-524gz" Oct 08 19:58:07 crc kubenswrapper[4750]: I1008 19:58:07.091428 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4chmd\" (UniqueName: \"kubernetes.io/projected/506293fa-b6e2-45fd-9610-647dd22b2d2e-kube-api-access-4chmd\") pod \"community-operators-524gz\" (UID: \"506293fa-b6e2-45fd-9610-647dd22b2d2e\") " pod="openshift-marketplace/community-operators-524gz" Oct 08 19:58:07 crc kubenswrapper[4750]: I1008 19:58:07.091619 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/506293fa-b6e2-45fd-9610-647dd22b2d2e-utilities\") pod \"community-operators-524gz\" (UID: \"506293fa-b6e2-45fd-9610-647dd22b2d2e\") " pod="openshift-marketplace/community-operators-524gz" Oct 08 19:58:07 crc kubenswrapper[4750]: I1008 19:58:07.193802 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/506293fa-b6e2-45fd-9610-647dd22b2d2e-catalog-content\") pod \"community-operators-524gz\" (UID: \"506293fa-b6e2-45fd-9610-647dd22b2d2e\") " pod="openshift-marketplace/community-operators-524gz" Oct 08 19:58:07 crc kubenswrapper[4750]: I1008 19:58:07.194186 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4chmd\" (UniqueName: \"kubernetes.io/projected/506293fa-b6e2-45fd-9610-647dd22b2d2e-kube-api-access-4chmd\") pod \"community-operators-524gz\" (UID: \"506293fa-b6e2-45fd-9610-647dd22b2d2e\") " pod="openshift-marketplace/community-operators-524gz" Oct 08 19:58:07 crc kubenswrapper[4750]: I1008 19:58:07.194683 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/506293fa-b6e2-45fd-9610-647dd22b2d2e-catalog-content\") pod \"community-operators-524gz\" (UID: \"506293fa-b6e2-45fd-9610-647dd22b2d2e\") " pod="openshift-marketplace/community-operators-524gz" Oct 08 19:58:07 crc kubenswrapper[4750]: I1008 19:58:07.195152 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/506293fa-b6e2-45fd-9610-647dd22b2d2e-utilities\") pod \"community-operators-524gz\" (UID: \"506293fa-b6e2-45fd-9610-647dd22b2d2e\") " pod="openshift-marketplace/community-operators-524gz" Oct 08 19:58:07 crc kubenswrapper[4750]: I1008 19:58:07.194716 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/506293fa-b6e2-45fd-9610-647dd22b2d2e-utilities\") pod \"community-operators-524gz\" (UID: \"506293fa-b6e2-45fd-9610-647dd22b2d2e\") " pod="openshift-marketplace/community-operators-524gz" Oct 08 19:58:07 crc kubenswrapper[4750]: I1008 19:58:07.216940 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4chmd\" (UniqueName: 
\"kubernetes.io/projected/506293fa-b6e2-45fd-9610-647dd22b2d2e-kube-api-access-4chmd\") pod \"community-operators-524gz\" (UID: \"506293fa-b6e2-45fd-9610-647dd22b2d2e\") " pod="openshift-marketplace/community-operators-524gz" Oct 08 19:58:07 crc kubenswrapper[4750]: I1008 19:58:07.302500 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-524gz" Oct 08 19:58:07 crc kubenswrapper[4750]: I1008 19:58:07.766946 4750 scope.go:117] "RemoveContainer" containerID="1c2c66d67b4aa603efba88fbdf492bf052470c6c732a4ee7a4806bdcfb5c2078" Oct 08 19:58:07 crc kubenswrapper[4750]: I1008 19:58:07.833561 4750 scope.go:117] "RemoveContainer" containerID="12a691284ef8f077cacb5c421b935acefa9df0db069e0f0c357e73a4f9f185f2" Oct 08 19:58:08 crc kubenswrapper[4750]: I1008 19:58:08.095658 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-524gz"] Oct 08 19:58:08 crc kubenswrapper[4750]: I1008 19:58:08.362490 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 08 19:58:08 crc kubenswrapper[4750]: I1008 19:58:08.450126 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"937a384a-f15c-4b65-868a-70d040c830d0","Type":"ContainerStarted","Data":"bc8a7d0193b844f16492c99741b49daa8364337578a43763259931008456e7d4"} Oct 08 19:58:08 crc kubenswrapper[4750]: I1008 19:58:08.454864 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-524gz" event={"ID":"506293fa-b6e2-45fd-9610-647dd22b2d2e","Type":"ContainerStarted","Data":"639bca38116e422921455db319e1f3ec704426ff0627f5c06624ba2e8024cc71"} Oct 08 19:58:08 crc kubenswrapper[4750]: I1008 19:58:08.454935 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-524gz" 
event={"ID":"506293fa-b6e2-45fd-9610-647dd22b2d2e","Type":"ContainerStarted","Data":"0e329ddcc852932fd4ffc3fdc2272cdc11ba13fd7fb39ed17e5bdd1a37fd0c72"} Oct 08 19:58:09 crc kubenswrapper[4750]: I1008 19:58:09.479237 4750 generic.go:334] "Generic (PLEG): container finished" podID="506293fa-b6e2-45fd-9610-647dd22b2d2e" containerID="639bca38116e422921455db319e1f3ec704426ff0627f5c06624ba2e8024cc71" exitCode=0 Oct 08 19:58:09 crc kubenswrapper[4750]: I1008 19:58:09.479358 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-524gz" event={"ID":"506293fa-b6e2-45fd-9610-647dd22b2d2e","Type":"ContainerDied","Data":"639bca38116e422921455db319e1f3ec704426ff0627f5c06624ba2e8024cc71"} Oct 08 19:58:10 crc kubenswrapper[4750]: I1008 19:58:10.508356 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"0ec2d588-f810-44a3-a8a9-cb384a2be42d","Type":"ContainerStarted","Data":"24bcb29503de131b5926bbabcec868e94e90427e9a53e31dd1616d06e914f14b"} Oct 08 19:58:10 crc kubenswrapper[4750]: I1008 19:58:10.514674 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-524gz" event={"ID":"506293fa-b6e2-45fd-9610-647dd22b2d2e","Type":"ContainerStarted","Data":"04ad6f9a4e0587d0f53da037f2e8c253d11afdca9cffab1bc2eb923875e66b51"} Oct 08 19:58:12 crc kubenswrapper[4750]: I1008 19:58:12.545748 4750 generic.go:334] "Generic (PLEG): container finished" podID="506293fa-b6e2-45fd-9610-647dd22b2d2e" containerID="04ad6f9a4e0587d0f53da037f2e8c253d11afdca9cffab1bc2eb923875e66b51" exitCode=0 Oct 08 19:58:12 crc kubenswrapper[4750]: I1008 19:58:12.545995 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-524gz" event={"ID":"506293fa-b6e2-45fd-9610-647dd22b2d2e","Type":"ContainerDied","Data":"04ad6f9a4e0587d0f53da037f2e8c253d11afdca9cffab1bc2eb923875e66b51"} Oct 08 19:58:13 crc kubenswrapper[4750]: I1008 
19:58:13.563496 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-524gz" event={"ID":"506293fa-b6e2-45fd-9610-647dd22b2d2e","Type":"ContainerStarted","Data":"0789a196909a8860386e21507e916ccda3e4c972948d4c466d2d9f1edd35d4d7"} Oct 08 19:58:13 crc kubenswrapper[4750]: I1008 19:58:13.594618 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-524gz" podStartSLOduration=2.829585362 podStartE2EDuration="7.594567515s" podCreationTimestamp="2025-10-08 19:58:06 +0000 UTC" firstStartedPulling="2025-10-08 19:58:08.459180365 +0000 UTC m=+6444.372151378" lastFinishedPulling="2025-10-08 19:58:13.224162508 +0000 UTC m=+6449.137133531" observedRunningTime="2025-10-08 19:58:13.585587143 +0000 UTC m=+6449.498558176" watchObservedRunningTime="2025-10-08 19:58:13.594567515 +0000 UTC m=+6449.507538538" Oct 08 19:58:17 crc kubenswrapper[4750]: I1008 19:58:17.304444 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-524gz" Oct 08 19:58:17 crc kubenswrapper[4750]: I1008 19:58:17.305315 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-524gz" Oct 08 19:58:17 crc kubenswrapper[4750]: I1008 19:58:17.368409 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-524gz" Oct 08 19:58:17 crc kubenswrapper[4750]: I1008 19:58:17.652417 4750 generic.go:334] "Generic (PLEG): container finished" podID="937a384a-f15c-4b65-868a-70d040c830d0" containerID="bc8a7d0193b844f16492c99741b49daa8364337578a43763259931008456e7d4" exitCode=0 Oct 08 19:58:17 crc kubenswrapper[4750]: I1008 19:58:17.652580 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"937a384a-f15c-4b65-868a-70d040c830d0","Type":"ContainerDied","Data":"bc8a7d0193b844f16492c99741b49daa8364337578a43763259931008456e7d4"} Oct 08 19:58:20 crc kubenswrapper[4750]: I1008 19:58:20.699155 4750 generic.go:334] "Generic (PLEG): container finished" podID="0ec2d588-f810-44a3-a8a9-cb384a2be42d" containerID="24bcb29503de131b5926bbabcec868e94e90427e9a53e31dd1616d06e914f14b" exitCode=0 Oct 08 19:58:20 crc kubenswrapper[4750]: I1008 19:58:20.699272 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"0ec2d588-f810-44a3-a8a9-cb384a2be42d","Type":"ContainerDied","Data":"24bcb29503de131b5926bbabcec868e94e90427e9a53e31dd1616d06e914f14b"} Oct 08 19:58:23 crc kubenswrapper[4750]: I1008 19:58:23.778178 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"937a384a-f15c-4b65-868a-70d040c830d0","Type":"ContainerStarted","Data":"b5014d868bfab249c810a2d99f93616aefdb34840216462bb8595d6411835c38"} Oct 08 19:58:25 crc kubenswrapper[4750]: I1008 19:58:25.806223 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"0ec2d588-f810-44a3-a8a9-cb384a2be42d","Type":"ContainerStarted","Data":"aeeee2877ad990ea12468edfbe5b1a6aac6bdec10ca1f6c16e61b893c211a1e4"} Oct 08 19:58:27 crc kubenswrapper[4750]: I1008 19:58:27.565968 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-524gz" Oct 08 19:58:27 crc kubenswrapper[4750]: I1008 19:58:27.632298 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-524gz"] Oct 08 19:58:27 crc kubenswrapper[4750]: I1008 19:58:27.832830 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-524gz" podUID="506293fa-b6e2-45fd-9610-647dd22b2d2e" containerName="registry-server" 
containerID="cri-o://0789a196909a8860386e21507e916ccda3e4c972948d4c466d2d9f1edd35d4d7" gracePeriod=2 Oct 08 19:58:28 crc kubenswrapper[4750]: I1008 19:58:28.863086 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"937a384a-f15c-4b65-868a-70d040c830d0","Type":"ContainerStarted","Data":"a151914e0523304313b307578fc1e62d0582624c3026fe49de7752ca9a6be9a3"} Oct 08 19:58:28 crc kubenswrapper[4750]: I1008 19:58:28.866463 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-524gz" Oct 08 19:58:28 crc kubenswrapper[4750]: I1008 19:58:28.866835 4750 generic.go:334] "Generic (PLEG): container finished" podID="506293fa-b6e2-45fd-9610-647dd22b2d2e" containerID="0789a196909a8860386e21507e916ccda3e4c972948d4c466d2d9f1edd35d4d7" exitCode=0 Oct 08 19:58:28 crc kubenswrapper[4750]: I1008 19:58:28.866878 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-524gz" event={"ID":"506293fa-b6e2-45fd-9610-647dd22b2d2e","Type":"ContainerDied","Data":"0789a196909a8860386e21507e916ccda3e4c972948d4c466d2d9f1edd35d4d7"} Oct 08 19:58:28 crc kubenswrapper[4750]: I1008 19:58:28.866950 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-524gz" event={"ID":"506293fa-b6e2-45fd-9610-647dd22b2d2e","Type":"ContainerDied","Data":"0e329ddcc852932fd4ffc3fdc2272cdc11ba13fd7fb39ed17e5bdd1a37fd0c72"} Oct 08 19:58:28 crc kubenswrapper[4750]: I1008 19:58:28.866983 4750 scope.go:117] "RemoveContainer" containerID="0789a196909a8860386e21507e916ccda3e4c972948d4c466d2d9f1edd35d4d7" Oct 08 19:58:28 crc kubenswrapper[4750]: I1008 19:58:28.901676 4750 scope.go:117] "RemoveContainer" containerID="04ad6f9a4e0587d0f53da037f2e8c253d11afdca9cffab1bc2eb923875e66b51" Oct 08 19:58:28 crc kubenswrapper[4750]: I1008 19:58:28.917820 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-4chmd\" (UniqueName: \"kubernetes.io/projected/506293fa-b6e2-45fd-9610-647dd22b2d2e-kube-api-access-4chmd\") pod \"506293fa-b6e2-45fd-9610-647dd22b2d2e\" (UID: \"506293fa-b6e2-45fd-9610-647dd22b2d2e\") " Oct 08 19:58:28 crc kubenswrapper[4750]: I1008 19:58:28.918511 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/506293fa-b6e2-45fd-9610-647dd22b2d2e-utilities\") pod \"506293fa-b6e2-45fd-9610-647dd22b2d2e\" (UID: \"506293fa-b6e2-45fd-9610-647dd22b2d2e\") " Oct 08 19:58:28 crc kubenswrapper[4750]: I1008 19:58:28.918764 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/506293fa-b6e2-45fd-9610-647dd22b2d2e-catalog-content\") pod \"506293fa-b6e2-45fd-9610-647dd22b2d2e\" (UID: \"506293fa-b6e2-45fd-9610-647dd22b2d2e\") " Oct 08 19:58:28 crc kubenswrapper[4750]: I1008 19:58:28.920417 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/506293fa-b6e2-45fd-9610-647dd22b2d2e-utilities" (OuterVolumeSpecName: "utilities") pod "506293fa-b6e2-45fd-9610-647dd22b2d2e" (UID: "506293fa-b6e2-45fd-9610-647dd22b2d2e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:58:28 crc kubenswrapper[4750]: I1008 19:58:28.926922 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/506293fa-b6e2-45fd-9610-647dd22b2d2e-kube-api-access-4chmd" (OuterVolumeSpecName: "kube-api-access-4chmd") pod "506293fa-b6e2-45fd-9610-647dd22b2d2e" (UID: "506293fa-b6e2-45fd-9610-647dd22b2d2e"). InnerVolumeSpecName "kube-api-access-4chmd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:58:28 crc kubenswrapper[4750]: I1008 19:58:28.938139 4750 scope.go:117] "RemoveContainer" containerID="639bca38116e422921455db319e1f3ec704426ff0627f5c06624ba2e8024cc71" Oct 08 19:58:28 crc kubenswrapper[4750]: I1008 19:58:28.975881 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/506293fa-b6e2-45fd-9610-647dd22b2d2e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "506293fa-b6e2-45fd-9610-647dd22b2d2e" (UID: "506293fa-b6e2-45fd-9610-647dd22b2d2e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:58:29 crc kubenswrapper[4750]: I1008 19:58:29.022224 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/506293fa-b6e2-45fd-9610-647dd22b2d2e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 19:58:29 crc kubenswrapper[4750]: I1008 19:58:29.022313 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4chmd\" (UniqueName: \"kubernetes.io/projected/506293fa-b6e2-45fd-9610-647dd22b2d2e-kube-api-access-4chmd\") on node \"crc\" DevicePath \"\"" Oct 08 19:58:29 crc kubenswrapper[4750]: I1008 19:58:29.022334 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/506293fa-b6e2-45fd-9610-647dd22b2d2e-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 19:58:29 crc kubenswrapper[4750]: I1008 19:58:29.035733 4750 scope.go:117] "RemoveContainer" containerID="0789a196909a8860386e21507e916ccda3e4c972948d4c466d2d9f1edd35d4d7" Oct 08 19:58:29 crc kubenswrapper[4750]: E1008 19:58:29.036442 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0789a196909a8860386e21507e916ccda3e4c972948d4c466d2d9f1edd35d4d7\": container with ID starting with 
0789a196909a8860386e21507e916ccda3e4c972948d4c466d2d9f1edd35d4d7 not found: ID does not exist" containerID="0789a196909a8860386e21507e916ccda3e4c972948d4c466d2d9f1edd35d4d7" Oct 08 19:58:29 crc kubenswrapper[4750]: I1008 19:58:29.036481 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0789a196909a8860386e21507e916ccda3e4c972948d4c466d2d9f1edd35d4d7"} err="failed to get container status \"0789a196909a8860386e21507e916ccda3e4c972948d4c466d2d9f1edd35d4d7\": rpc error: code = NotFound desc = could not find container \"0789a196909a8860386e21507e916ccda3e4c972948d4c466d2d9f1edd35d4d7\": container with ID starting with 0789a196909a8860386e21507e916ccda3e4c972948d4c466d2d9f1edd35d4d7 not found: ID does not exist" Oct 08 19:58:29 crc kubenswrapper[4750]: I1008 19:58:29.036508 4750 scope.go:117] "RemoveContainer" containerID="04ad6f9a4e0587d0f53da037f2e8c253d11afdca9cffab1bc2eb923875e66b51" Oct 08 19:58:29 crc kubenswrapper[4750]: E1008 19:58:29.036986 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04ad6f9a4e0587d0f53da037f2e8c253d11afdca9cffab1bc2eb923875e66b51\": container with ID starting with 04ad6f9a4e0587d0f53da037f2e8c253d11afdca9cffab1bc2eb923875e66b51 not found: ID does not exist" containerID="04ad6f9a4e0587d0f53da037f2e8c253d11afdca9cffab1bc2eb923875e66b51" Oct 08 19:58:29 crc kubenswrapper[4750]: I1008 19:58:29.037079 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04ad6f9a4e0587d0f53da037f2e8c253d11afdca9cffab1bc2eb923875e66b51"} err="failed to get container status \"04ad6f9a4e0587d0f53da037f2e8c253d11afdca9cffab1bc2eb923875e66b51\": rpc error: code = NotFound desc = could not find container \"04ad6f9a4e0587d0f53da037f2e8c253d11afdca9cffab1bc2eb923875e66b51\": container with ID starting with 04ad6f9a4e0587d0f53da037f2e8c253d11afdca9cffab1bc2eb923875e66b51 not found: ID does not 
exist" Oct 08 19:58:29 crc kubenswrapper[4750]: I1008 19:58:29.037154 4750 scope.go:117] "RemoveContainer" containerID="639bca38116e422921455db319e1f3ec704426ff0627f5c06624ba2e8024cc71" Oct 08 19:58:29 crc kubenswrapper[4750]: E1008 19:58:29.037799 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"639bca38116e422921455db319e1f3ec704426ff0627f5c06624ba2e8024cc71\": container with ID starting with 639bca38116e422921455db319e1f3ec704426ff0627f5c06624ba2e8024cc71 not found: ID does not exist" containerID="639bca38116e422921455db319e1f3ec704426ff0627f5c06624ba2e8024cc71" Oct 08 19:58:29 crc kubenswrapper[4750]: I1008 19:58:29.037830 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"639bca38116e422921455db319e1f3ec704426ff0627f5c06624ba2e8024cc71"} err="failed to get container status \"639bca38116e422921455db319e1f3ec704426ff0627f5c06624ba2e8024cc71\": rpc error: code = NotFound desc = could not find container \"639bca38116e422921455db319e1f3ec704426ff0627f5c06624ba2e8024cc71\": container with ID starting with 639bca38116e422921455db319e1f3ec704426ff0627f5c06624ba2e8024cc71 not found: ID does not exist" Oct 08 19:58:29 crc kubenswrapper[4750]: I1008 19:58:29.881368 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-524gz" Oct 08 19:58:29 crc kubenswrapper[4750]: I1008 19:58:29.925627 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-524gz"] Oct 08 19:58:29 crc kubenswrapper[4750]: I1008 19:58:29.933928 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-524gz"] Oct 08 19:58:30 crc kubenswrapper[4750]: I1008 19:58:30.747904 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="506293fa-b6e2-45fd-9610-647dd22b2d2e" path="/var/lib/kubelet/pods/506293fa-b6e2-45fd-9610-647dd22b2d2e/volumes" Oct 08 19:58:30 crc kubenswrapper[4750]: I1008 19:58:30.899762 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"0ec2d588-f810-44a3-a8a9-cb384a2be42d","Type":"ContainerStarted","Data":"2f6dc64b963f8058d923e35b80df8cf428183acb85045c4191cf511592b38ff3"} Oct 08 19:58:30 crc kubenswrapper[4750]: I1008 19:58:30.934591 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=7.36402549 podStartE2EDuration="30.934565477s" podCreationTimestamp="2025-10-08 19:58:00 +0000 UTC" firstStartedPulling="2025-10-08 19:58:01.206950673 +0000 UTC m=+6437.119921686" lastFinishedPulling="2025-10-08 19:58:24.77749066 +0000 UTC m=+6460.690461673" observedRunningTime="2025-10-08 19:58:30.926239561 +0000 UTC m=+6466.839210634" watchObservedRunningTime="2025-10-08 19:58:30.934565477 +0000 UTC m=+6466.847536490" Oct 08 19:58:31 crc kubenswrapper[4750]: I1008 19:58:31.930286 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Oct 08 19:58:31 crc kubenswrapper[4750]: I1008 19:58:31.938447 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Oct 08 19:58:32 crc 
kubenswrapper[4750]: I1008 19:58:32.944682 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"937a384a-f15c-4b65-868a-70d040c830d0","Type":"ContainerStarted","Data":"569a155a8372134d0dc303517f3d06f53dbcb525d47f49554ad33878bb5fa9fb"} Oct 08 19:58:32 crc kubenswrapper[4750]: I1008 19:58:32.985006 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=4.052701924 podStartE2EDuration="33.984965843s" podCreationTimestamp="2025-10-08 19:57:59 +0000 UTC" firstStartedPulling="2025-10-08 19:58:01.737241958 +0000 UTC m=+6437.650212971" lastFinishedPulling="2025-10-08 19:58:31.669505877 +0000 UTC m=+6467.582476890" observedRunningTime="2025-10-08 19:58:32.97428616 +0000 UTC m=+6468.887257253" watchObservedRunningTime="2025-10-08 19:58:32.984965843 +0000 UTC m=+6468.897936886" Oct 08 19:58:36 crc kubenswrapper[4750]: I1008 19:58:36.113138 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 08 19:58:37 crc kubenswrapper[4750]: I1008 19:58:37.715214 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 08 19:58:37 crc kubenswrapper[4750]: E1008 19:58:37.716113 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="506293fa-b6e2-45fd-9610-647dd22b2d2e" containerName="registry-server" Oct 08 19:58:37 crc kubenswrapper[4750]: I1008 19:58:37.716131 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="506293fa-b6e2-45fd-9610-647dd22b2d2e" containerName="registry-server" Oct 08 19:58:37 crc kubenswrapper[4750]: E1008 19:58:37.716152 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="506293fa-b6e2-45fd-9610-647dd22b2d2e" containerName="extract-content" Oct 08 19:58:37 crc kubenswrapper[4750]: I1008 19:58:37.716160 4750 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="506293fa-b6e2-45fd-9610-647dd22b2d2e" containerName="extract-content" Oct 08 19:58:37 crc kubenswrapper[4750]: E1008 19:58:37.716194 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="506293fa-b6e2-45fd-9610-647dd22b2d2e" containerName="extract-utilities" Oct 08 19:58:37 crc kubenswrapper[4750]: I1008 19:58:37.716204 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="506293fa-b6e2-45fd-9610-647dd22b2d2e" containerName="extract-utilities" Oct 08 19:58:37 crc kubenswrapper[4750]: I1008 19:58:37.716491 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="506293fa-b6e2-45fd-9610-647dd22b2d2e" containerName="registry-server" Oct 08 19:58:37 crc kubenswrapper[4750]: I1008 19:58:37.719351 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 19:58:37 crc kubenswrapper[4750]: I1008 19:58:37.722066 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 08 19:58:37 crc kubenswrapper[4750]: I1008 19:58:37.722741 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 08 19:58:37 crc kubenswrapper[4750]: I1008 19:58:37.730580 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 19:58:37 crc kubenswrapper[4750]: I1008 19:58:37.783257 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69935543-c19c-4c1f-9f4c-d17ac44f6597-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"69935543-c19c-4c1f-9f4c-d17ac44f6597\") " pod="openstack/ceilometer-0" Oct 08 19:58:37 crc kubenswrapper[4750]: I1008 19:58:37.783312 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69935543-c19c-4c1f-9f4c-d17ac44f6597-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"69935543-c19c-4c1f-9f4c-d17ac44f6597\") " pod="openstack/ceilometer-0" Oct 08 19:58:37 crc kubenswrapper[4750]: I1008 19:58:37.783427 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69935543-c19c-4c1f-9f4c-d17ac44f6597-run-httpd\") pod \"ceilometer-0\" (UID: \"69935543-c19c-4c1f-9f4c-d17ac44f6597\") " pod="openstack/ceilometer-0" Oct 08 19:58:37 crc kubenswrapper[4750]: I1008 19:58:37.783545 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69935543-c19c-4c1f-9f4c-d17ac44f6597-config-data\") pod \"ceilometer-0\" (UID: \"69935543-c19c-4c1f-9f4c-d17ac44f6597\") " pod="openstack/ceilometer-0" Oct 08 19:58:37 crc kubenswrapper[4750]: I1008 19:58:37.783853 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69935543-c19c-4c1f-9f4c-d17ac44f6597-log-httpd\") pod \"ceilometer-0\" (UID: \"69935543-c19c-4c1f-9f4c-d17ac44f6597\") " pod="openstack/ceilometer-0" Oct 08 19:58:37 crc kubenswrapper[4750]: I1008 19:58:37.783915 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kkz9\" (UniqueName: \"kubernetes.io/projected/69935543-c19c-4c1f-9f4c-d17ac44f6597-kube-api-access-6kkz9\") pod \"ceilometer-0\" (UID: \"69935543-c19c-4c1f-9f4c-d17ac44f6597\") " pod="openstack/ceilometer-0" Oct 08 19:58:37 crc kubenswrapper[4750]: I1008 19:58:37.783953 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69935543-c19c-4c1f-9f4c-d17ac44f6597-scripts\") pod \"ceilometer-0\" (UID: \"69935543-c19c-4c1f-9f4c-d17ac44f6597\") " pod="openstack/ceilometer-0" Oct 08 19:58:37 crc kubenswrapper[4750]: I1008 
19:58:37.886922 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kkz9\" (UniqueName: \"kubernetes.io/projected/69935543-c19c-4c1f-9f4c-d17ac44f6597-kube-api-access-6kkz9\") pod \"ceilometer-0\" (UID: \"69935543-c19c-4c1f-9f4c-d17ac44f6597\") " pod="openstack/ceilometer-0" Oct 08 19:58:37 crc kubenswrapper[4750]: I1008 19:58:37.887003 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69935543-c19c-4c1f-9f4c-d17ac44f6597-scripts\") pod \"ceilometer-0\" (UID: \"69935543-c19c-4c1f-9f4c-d17ac44f6597\") " pod="openstack/ceilometer-0" Oct 08 19:58:37 crc kubenswrapper[4750]: I1008 19:58:37.887146 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69935543-c19c-4c1f-9f4c-d17ac44f6597-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"69935543-c19c-4c1f-9f4c-d17ac44f6597\") " pod="openstack/ceilometer-0" Oct 08 19:58:37 crc kubenswrapper[4750]: I1008 19:58:37.887183 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69935543-c19c-4c1f-9f4c-d17ac44f6597-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"69935543-c19c-4c1f-9f4c-d17ac44f6597\") " pod="openstack/ceilometer-0" Oct 08 19:58:37 crc kubenswrapper[4750]: I1008 19:58:37.887244 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69935543-c19c-4c1f-9f4c-d17ac44f6597-run-httpd\") pod \"ceilometer-0\" (UID: \"69935543-c19c-4c1f-9f4c-d17ac44f6597\") " pod="openstack/ceilometer-0" Oct 08 19:58:37 crc kubenswrapper[4750]: I1008 19:58:37.887286 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69935543-c19c-4c1f-9f4c-d17ac44f6597-config-data\") pod 
\"ceilometer-0\" (UID: \"69935543-c19c-4c1f-9f4c-d17ac44f6597\") " pod="openstack/ceilometer-0" Oct 08 19:58:37 crc kubenswrapper[4750]: I1008 19:58:37.887467 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69935543-c19c-4c1f-9f4c-d17ac44f6597-log-httpd\") pod \"ceilometer-0\" (UID: \"69935543-c19c-4c1f-9f4c-d17ac44f6597\") " pod="openstack/ceilometer-0" Oct 08 19:58:37 crc kubenswrapper[4750]: I1008 19:58:37.888134 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69935543-c19c-4c1f-9f4c-d17ac44f6597-log-httpd\") pod \"ceilometer-0\" (UID: \"69935543-c19c-4c1f-9f4c-d17ac44f6597\") " pod="openstack/ceilometer-0" Oct 08 19:58:37 crc kubenswrapper[4750]: I1008 19:58:37.894928 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69935543-c19c-4c1f-9f4c-d17ac44f6597-run-httpd\") pod \"ceilometer-0\" (UID: \"69935543-c19c-4c1f-9f4c-d17ac44f6597\") " pod="openstack/ceilometer-0" Oct 08 19:58:37 crc kubenswrapper[4750]: I1008 19:58:37.907710 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69935543-c19c-4c1f-9f4c-d17ac44f6597-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"69935543-c19c-4c1f-9f4c-d17ac44f6597\") " pod="openstack/ceilometer-0" Oct 08 19:58:37 crc kubenswrapper[4750]: I1008 19:58:37.908146 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69935543-c19c-4c1f-9f4c-d17ac44f6597-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"69935543-c19c-4c1f-9f4c-d17ac44f6597\") " pod="openstack/ceilometer-0" Oct 08 19:58:37 crc kubenswrapper[4750]: I1008 19:58:37.908176 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/69935543-c19c-4c1f-9f4c-d17ac44f6597-scripts\") pod \"ceilometer-0\" (UID: \"69935543-c19c-4c1f-9f4c-d17ac44f6597\") " pod="openstack/ceilometer-0" Oct 08 19:58:37 crc kubenswrapper[4750]: I1008 19:58:37.923487 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kkz9\" (UniqueName: \"kubernetes.io/projected/69935543-c19c-4c1f-9f4c-d17ac44f6597-kube-api-access-6kkz9\") pod \"ceilometer-0\" (UID: \"69935543-c19c-4c1f-9f4c-d17ac44f6597\") " pod="openstack/ceilometer-0" Oct 08 19:58:37 crc kubenswrapper[4750]: I1008 19:58:37.925758 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69935543-c19c-4c1f-9f4c-d17ac44f6597-config-data\") pod \"ceilometer-0\" (UID: \"69935543-c19c-4c1f-9f4c-d17ac44f6597\") " pod="openstack/ceilometer-0" Oct 08 19:58:38 crc kubenswrapper[4750]: I1008 19:58:38.045543 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 19:58:38 crc kubenswrapper[4750]: W1008 19:58:38.627409 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69935543_c19c_4c1f_9f4c_d17ac44f6597.slice/crio-c1f09a2d43711ae1007ad304a62dc0ca37fc35f71e8d0b7da25a755149529282 WatchSource:0}: Error finding container c1f09a2d43711ae1007ad304a62dc0ca37fc35f71e8d0b7da25a755149529282: Status 404 returned error can't find the container with id c1f09a2d43711ae1007ad304a62dc0ca37fc35f71e8d0b7da25a755149529282 Oct 08 19:58:38 crc kubenswrapper[4750]: I1008 19:58:38.643397 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 19:58:39 crc kubenswrapper[4750]: I1008 19:58:39.014081 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"69935543-c19c-4c1f-9f4c-d17ac44f6597","Type":"ContainerStarted","Data":"c1f09a2d43711ae1007ad304a62dc0ca37fc35f71e8d0b7da25a755149529282"} Oct 08 19:58:40 crc kubenswrapper[4750]: I1008 19:58:40.034408 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69935543-c19c-4c1f-9f4c-d17ac44f6597","Type":"ContainerStarted","Data":"cba10b738bcb7df83d39b4592d37851cd452f7068475bc74f652a499ef43440e"} Oct 08 19:58:41 crc kubenswrapper[4750]: I1008 19:58:41.048767 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69935543-c19c-4c1f-9f4c-d17ac44f6597","Type":"ContainerStarted","Data":"a6617b4457feb8f66a14c8d8c3a3ea83f8d0468b6893b02bcfb9912124d10885"} Oct 08 19:58:42 crc kubenswrapper[4750]: I1008 19:58:42.063002 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69935543-c19c-4c1f-9f4c-d17ac44f6597","Type":"ContainerStarted","Data":"de89abcbc19a64006e30fdd9b24e1707b8d326f67ba10627a412587de408f4f0"} Oct 08 19:58:43 crc kubenswrapper[4750]: I1008 19:58:43.077438 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69935543-c19c-4c1f-9f4c-d17ac44f6597","Type":"ContainerStarted","Data":"7b6fcfa010f54de173075114eeaf246229ef02d6d67830d3f8f43251f6ae55c4"} Oct 08 19:58:43 crc kubenswrapper[4750]: I1008 19:58:43.077985 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 08 19:58:43 crc kubenswrapper[4750]: I1008 19:58:43.107694 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.051172322 podStartE2EDuration="6.107664569s" podCreationTimestamp="2025-10-08 19:58:37 +0000 UTC" firstStartedPulling="2025-10-08 19:58:38.633045763 +0000 UTC m=+6474.546016776" lastFinishedPulling="2025-10-08 19:58:42.68953801 +0000 UTC m=+6478.602509023" observedRunningTime="2025-10-08 
19:58:43.104695076 +0000 UTC m=+6479.017666099" watchObservedRunningTime="2025-10-08 19:58:43.107664569 +0000 UTC m=+6479.020635592" Oct 08 19:58:46 crc kubenswrapper[4750]: I1008 19:58:46.123377 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 08 19:58:46 crc kubenswrapper[4750]: I1008 19:58:46.130714 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 08 19:58:47 crc kubenswrapper[4750]: I1008 19:58:47.143727 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 08 19:58:49 crc kubenswrapper[4750]: I1008 19:58:49.172351 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-nq2b9"] Oct 08 19:58:49 crc kubenswrapper[4750]: I1008 19:58:49.174635 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-nq2b9" Oct 08 19:58:49 crc kubenswrapper[4750]: I1008 19:58:49.197180 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-nq2b9"] Oct 08 19:58:49 crc kubenswrapper[4750]: I1008 19:58:49.242791 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nnf2\" (UniqueName: \"kubernetes.io/projected/b7ff6ccf-13de-41ac-af06-668eb1165729-kube-api-access-7nnf2\") pod \"aodh-db-create-nq2b9\" (UID: \"b7ff6ccf-13de-41ac-af06-668eb1165729\") " pod="openstack/aodh-db-create-nq2b9" Oct 08 19:58:49 crc kubenswrapper[4750]: I1008 19:58:49.345515 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nnf2\" (UniqueName: \"kubernetes.io/projected/b7ff6ccf-13de-41ac-af06-668eb1165729-kube-api-access-7nnf2\") pod \"aodh-db-create-nq2b9\" (UID: \"b7ff6ccf-13de-41ac-af06-668eb1165729\") " pod="openstack/aodh-db-create-nq2b9" Oct 08 19:58:49 crc kubenswrapper[4750]: 
I1008 19:58:49.376065 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nnf2\" (UniqueName: \"kubernetes.io/projected/b7ff6ccf-13de-41ac-af06-668eb1165729-kube-api-access-7nnf2\") pod \"aodh-db-create-nq2b9\" (UID: \"b7ff6ccf-13de-41ac-af06-668eb1165729\") " pod="openstack/aodh-db-create-nq2b9" Oct 08 19:58:49 crc kubenswrapper[4750]: I1008 19:58:49.552296 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-nq2b9" Oct 08 19:58:50 crc kubenswrapper[4750]: I1008 19:58:50.143136 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-nq2b9"] Oct 08 19:58:50 crc kubenswrapper[4750]: I1008 19:58:50.179715 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-nq2b9" event={"ID":"b7ff6ccf-13de-41ac-af06-668eb1165729","Type":"ContainerStarted","Data":"b71359769ca939f37fbf2a17fb570c15f83c928936166087a0c4ea7b5c96998d"} Oct 08 19:58:51 crc kubenswrapper[4750]: I1008 19:58:51.195615 4750 generic.go:334] "Generic (PLEG): container finished" podID="b7ff6ccf-13de-41ac-af06-668eb1165729" containerID="b81afe9219ebe3a68379d648c7b046be7468f4c8e9b6852c620e3700c2a26e49" exitCode=0 Oct 08 19:58:51 crc kubenswrapper[4750]: I1008 19:58:51.196738 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-nq2b9" event={"ID":"b7ff6ccf-13de-41ac-af06-668eb1165729","Type":"ContainerDied","Data":"b81afe9219ebe3a68379d648c7b046be7468f4c8e9b6852c620e3700c2a26e49"} Oct 08 19:58:52 crc kubenswrapper[4750]: I1008 19:58:52.678335 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-nq2b9" Oct 08 19:58:52 crc kubenswrapper[4750]: I1008 19:58:52.734225 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nnf2\" (UniqueName: \"kubernetes.io/projected/b7ff6ccf-13de-41ac-af06-668eb1165729-kube-api-access-7nnf2\") pod \"b7ff6ccf-13de-41ac-af06-668eb1165729\" (UID: \"b7ff6ccf-13de-41ac-af06-668eb1165729\") " Oct 08 19:58:52 crc kubenswrapper[4750]: I1008 19:58:52.741321 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7ff6ccf-13de-41ac-af06-668eb1165729-kube-api-access-7nnf2" (OuterVolumeSpecName: "kube-api-access-7nnf2") pod "b7ff6ccf-13de-41ac-af06-668eb1165729" (UID: "b7ff6ccf-13de-41ac-af06-668eb1165729"). InnerVolumeSpecName "kube-api-access-7nnf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:58:52 crc kubenswrapper[4750]: I1008 19:58:52.838916 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nnf2\" (UniqueName: \"kubernetes.io/projected/b7ff6ccf-13de-41ac-af06-668eb1165729-kube-api-access-7nnf2\") on node \"crc\" DevicePath \"\"" Oct 08 19:58:53 crc kubenswrapper[4750]: I1008 19:58:53.227721 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-nq2b9" event={"ID":"b7ff6ccf-13de-41ac-af06-668eb1165729","Type":"ContainerDied","Data":"b71359769ca939f37fbf2a17fb570c15f83c928936166087a0c4ea7b5c96998d"} Oct 08 19:58:53 crc kubenswrapper[4750]: I1008 19:58:53.227771 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b71359769ca939f37fbf2a17fb570c15f83c928936166087a0c4ea7b5c96998d" Oct 08 19:58:53 crc kubenswrapper[4750]: I1008 19:58:53.227865 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-nq2b9" Oct 08 19:58:59 crc kubenswrapper[4750]: I1008 19:58:59.330618 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-158b-account-create-5l24x"] Oct 08 19:58:59 crc kubenswrapper[4750]: E1008 19:58:59.331854 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7ff6ccf-13de-41ac-af06-668eb1165729" containerName="mariadb-database-create" Oct 08 19:58:59 crc kubenswrapper[4750]: I1008 19:58:59.331869 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7ff6ccf-13de-41ac-af06-668eb1165729" containerName="mariadb-database-create" Oct 08 19:58:59 crc kubenswrapper[4750]: I1008 19:58:59.332136 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7ff6ccf-13de-41ac-af06-668eb1165729" containerName="mariadb-database-create" Oct 08 19:58:59 crc kubenswrapper[4750]: I1008 19:58:59.333110 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-158b-account-create-5l24x" Oct 08 19:58:59 crc kubenswrapper[4750]: I1008 19:58:59.337854 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Oct 08 19:58:59 crc kubenswrapper[4750]: I1008 19:58:59.356576 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-158b-account-create-5l24x"] Oct 08 19:58:59 crc kubenswrapper[4750]: I1008 19:58:59.423392 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbktr\" (UniqueName: \"kubernetes.io/projected/b0ad36c3-6bb5-4024-b62c-cd806b8172d7-kube-api-access-dbktr\") pod \"aodh-158b-account-create-5l24x\" (UID: \"b0ad36c3-6bb5-4024-b62c-cd806b8172d7\") " pod="openstack/aodh-158b-account-create-5l24x" Oct 08 19:58:59 crc kubenswrapper[4750]: I1008 19:58:59.536384 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbktr\" (UniqueName: 
\"kubernetes.io/projected/b0ad36c3-6bb5-4024-b62c-cd806b8172d7-kube-api-access-dbktr\") pod \"aodh-158b-account-create-5l24x\" (UID: \"b0ad36c3-6bb5-4024-b62c-cd806b8172d7\") " pod="openstack/aodh-158b-account-create-5l24x" Oct 08 19:58:59 crc kubenswrapper[4750]: I1008 19:58:59.563406 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbktr\" (UniqueName: \"kubernetes.io/projected/b0ad36c3-6bb5-4024-b62c-cd806b8172d7-kube-api-access-dbktr\") pod \"aodh-158b-account-create-5l24x\" (UID: \"b0ad36c3-6bb5-4024-b62c-cd806b8172d7\") " pod="openstack/aodh-158b-account-create-5l24x" Oct 08 19:58:59 crc kubenswrapper[4750]: I1008 19:58:59.673737 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-158b-account-create-5l24x" Oct 08 19:58:59 crc kubenswrapper[4750]: I1008 19:58:59.707383 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 19:58:59 crc kubenswrapper[4750]: I1008 19:58:59.707480 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 19:59:00 crc kubenswrapper[4750]: I1008 19:59:00.174229 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-158b-account-create-5l24x"] Oct 08 19:59:00 crc kubenswrapper[4750]: I1008 19:59:00.343071 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-158b-account-create-5l24x" 
event={"ID":"b0ad36c3-6bb5-4024-b62c-cd806b8172d7","Type":"ContainerStarted","Data":"5f73b431078c1e739917da271d6cc7777620fefc3ae3b9b29f45c466b4c1b5ab"} Oct 08 19:59:01 crc kubenswrapper[4750]: I1008 19:59:01.366853 4750 generic.go:334] "Generic (PLEG): container finished" podID="b0ad36c3-6bb5-4024-b62c-cd806b8172d7" containerID="d6eecc244c9abd051f75fe36a9f49fb502f126ab11552c2fe7b0f1806646ddae" exitCode=0 Oct 08 19:59:01 crc kubenswrapper[4750]: I1008 19:59:01.367007 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-158b-account-create-5l24x" event={"ID":"b0ad36c3-6bb5-4024-b62c-cd806b8172d7","Type":"ContainerDied","Data":"d6eecc244c9abd051f75fe36a9f49fb502f126ab11552c2fe7b0f1806646ddae"} Oct 08 19:59:02 crc kubenswrapper[4750]: I1008 19:59:02.871799 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-158b-account-create-5l24x" Oct 08 19:59:02 crc kubenswrapper[4750]: I1008 19:59:02.930978 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbktr\" (UniqueName: \"kubernetes.io/projected/b0ad36c3-6bb5-4024-b62c-cd806b8172d7-kube-api-access-dbktr\") pod \"b0ad36c3-6bb5-4024-b62c-cd806b8172d7\" (UID: \"b0ad36c3-6bb5-4024-b62c-cd806b8172d7\") " Oct 08 19:59:02 crc kubenswrapper[4750]: I1008 19:59:02.943080 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0ad36c3-6bb5-4024-b62c-cd806b8172d7-kube-api-access-dbktr" (OuterVolumeSpecName: "kube-api-access-dbktr") pod "b0ad36c3-6bb5-4024-b62c-cd806b8172d7" (UID: "b0ad36c3-6bb5-4024-b62c-cd806b8172d7"). InnerVolumeSpecName "kube-api-access-dbktr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:59:03 crc kubenswrapper[4750]: I1008 19:59:03.035434 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbktr\" (UniqueName: \"kubernetes.io/projected/b0ad36c3-6bb5-4024-b62c-cd806b8172d7-kube-api-access-dbktr\") on node \"crc\" DevicePath \"\"" Oct 08 19:59:03 crc kubenswrapper[4750]: I1008 19:59:03.400021 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-158b-account-create-5l24x" event={"ID":"b0ad36c3-6bb5-4024-b62c-cd806b8172d7","Type":"ContainerDied","Data":"5f73b431078c1e739917da271d6cc7777620fefc3ae3b9b29f45c466b4c1b5ab"} Oct 08 19:59:03 crc kubenswrapper[4750]: I1008 19:59:03.400099 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f73b431078c1e739917da271d6cc7777620fefc3ae3b9b29f45c466b4c1b5ab" Oct 08 19:59:03 crc kubenswrapper[4750]: I1008 19:59:03.400202 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-158b-account-create-5l24x" Oct 08 19:59:04 crc kubenswrapper[4750]: I1008 19:59:04.774734 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-cmw5k"] Oct 08 19:59:04 crc kubenswrapper[4750]: E1008 19:59:04.776483 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0ad36c3-6bb5-4024-b62c-cd806b8172d7" containerName="mariadb-account-create" Oct 08 19:59:04 crc kubenswrapper[4750]: I1008 19:59:04.776511 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0ad36c3-6bb5-4024-b62c-cd806b8172d7" containerName="mariadb-account-create" Oct 08 19:59:04 crc kubenswrapper[4750]: I1008 19:59:04.777153 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0ad36c3-6bb5-4024-b62c-cd806b8172d7" containerName="mariadb-account-create" Oct 08 19:59:04 crc kubenswrapper[4750]: I1008 19:59:04.778769 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-cmw5k" Oct 08 19:59:04 crc kubenswrapper[4750]: I1008 19:59:04.785832 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-kr56s" Oct 08 19:59:04 crc kubenswrapper[4750]: I1008 19:59:04.786175 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Oct 08 19:59:04 crc kubenswrapper[4750]: I1008 19:59:04.787299 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Oct 08 19:59:04 crc kubenswrapper[4750]: I1008 19:59:04.791932 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d47a90c8-a408-4b44-a6f5-e65897cce31b-config-data\") pod \"aodh-db-sync-cmw5k\" (UID: \"d47a90c8-a408-4b44-a6f5-e65897cce31b\") " pod="openstack/aodh-db-sync-cmw5k" Oct 08 19:59:04 crc kubenswrapper[4750]: I1008 19:59:04.792050 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d47a90c8-a408-4b44-a6f5-e65897cce31b-combined-ca-bundle\") pod \"aodh-db-sync-cmw5k\" (UID: \"d47a90c8-a408-4b44-a6f5-e65897cce31b\") " pod="openstack/aodh-db-sync-cmw5k" Oct 08 19:59:04 crc kubenswrapper[4750]: I1008 19:59:04.792336 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d47a90c8-a408-4b44-a6f5-e65897cce31b-scripts\") pod \"aodh-db-sync-cmw5k\" (UID: \"d47a90c8-a408-4b44-a6f5-e65897cce31b\") " pod="openstack/aodh-db-sync-cmw5k" Oct 08 19:59:04 crc kubenswrapper[4750]: I1008 19:59:04.792581 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jthlx\" (UniqueName: \"kubernetes.io/projected/d47a90c8-a408-4b44-a6f5-e65897cce31b-kube-api-access-jthlx\") pod 
\"aodh-db-sync-cmw5k\" (UID: \"d47a90c8-a408-4b44-a6f5-e65897cce31b\") " pod="openstack/aodh-db-sync-cmw5k" Oct 08 19:59:04 crc kubenswrapper[4750]: I1008 19:59:04.814056 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-cmw5k"] Oct 08 19:59:04 crc kubenswrapper[4750]: I1008 19:59:04.893615 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d47a90c8-a408-4b44-a6f5-e65897cce31b-scripts\") pod \"aodh-db-sync-cmw5k\" (UID: \"d47a90c8-a408-4b44-a6f5-e65897cce31b\") " pod="openstack/aodh-db-sync-cmw5k" Oct 08 19:59:04 crc kubenswrapper[4750]: I1008 19:59:04.893732 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jthlx\" (UniqueName: \"kubernetes.io/projected/d47a90c8-a408-4b44-a6f5-e65897cce31b-kube-api-access-jthlx\") pod \"aodh-db-sync-cmw5k\" (UID: \"d47a90c8-a408-4b44-a6f5-e65897cce31b\") " pod="openstack/aodh-db-sync-cmw5k" Oct 08 19:59:04 crc kubenswrapper[4750]: I1008 19:59:04.893802 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d47a90c8-a408-4b44-a6f5-e65897cce31b-config-data\") pod \"aodh-db-sync-cmw5k\" (UID: \"d47a90c8-a408-4b44-a6f5-e65897cce31b\") " pod="openstack/aodh-db-sync-cmw5k" Oct 08 19:59:04 crc kubenswrapper[4750]: I1008 19:59:04.893845 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d47a90c8-a408-4b44-a6f5-e65897cce31b-combined-ca-bundle\") pod \"aodh-db-sync-cmw5k\" (UID: \"d47a90c8-a408-4b44-a6f5-e65897cce31b\") " pod="openstack/aodh-db-sync-cmw5k" Oct 08 19:59:04 crc kubenswrapper[4750]: I1008 19:59:04.901676 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d47a90c8-a408-4b44-a6f5-e65897cce31b-combined-ca-bundle\") pod 
\"aodh-db-sync-cmw5k\" (UID: \"d47a90c8-a408-4b44-a6f5-e65897cce31b\") " pod="openstack/aodh-db-sync-cmw5k" Oct 08 19:59:04 crc kubenswrapper[4750]: I1008 19:59:04.912688 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d47a90c8-a408-4b44-a6f5-e65897cce31b-config-data\") pod \"aodh-db-sync-cmw5k\" (UID: \"d47a90c8-a408-4b44-a6f5-e65897cce31b\") " pod="openstack/aodh-db-sync-cmw5k" Oct 08 19:59:04 crc kubenswrapper[4750]: I1008 19:59:04.913793 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jthlx\" (UniqueName: \"kubernetes.io/projected/d47a90c8-a408-4b44-a6f5-e65897cce31b-kube-api-access-jthlx\") pod \"aodh-db-sync-cmw5k\" (UID: \"d47a90c8-a408-4b44-a6f5-e65897cce31b\") " pod="openstack/aodh-db-sync-cmw5k" Oct 08 19:59:04 crc kubenswrapper[4750]: I1008 19:59:04.923741 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d47a90c8-a408-4b44-a6f5-e65897cce31b-scripts\") pod \"aodh-db-sync-cmw5k\" (UID: \"d47a90c8-a408-4b44-a6f5-e65897cce31b\") " pod="openstack/aodh-db-sync-cmw5k" Oct 08 19:59:05 crc kubenswrapper[4750]: I1008 19:59:05.112053 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-cmw5k" Oct 08 19:59:05 crc kubenswrapper[4750]: I1008 19:59:05.690698 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-cmw5k"] Oct 08 19:59:06 crc kubenswrapper[4750]: I1008 19:59:06.448267 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-cmw5k" event={"ID":"d47a90c8-a408-4b44-a6f5-e65897cce31b","Type":"ContainerStarted","Data":"4495d3362c729b14379260694ea6ada97d9dcc7d4188c277065b8a1c5f6ff6c6"} Oct 08 19:59:08 crc kubenswrapper[4750]: I1008 19:59:08.052635 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 08 19:59:11 crc kubenswrapper[4750]: I1008 19:59:11.519787 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-cmw5k" event={"ID":"d47a90c8-a408-4b44-a6f5-e65897cce31b","Type":"ContainerStarted","Data":"3ce0b45918529d74153ef2a496bf4808f6ac6063700592aa4a2398c3c8f08a0e"} Oct 08 19:59:11 crc kubenswrapper[4750]: I1008 19:59:11.554297 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-cmw5k" podStartSLOduration=2.652560814 podStartE2EDuration="7.5542636s" podCreationTimestamp="2025-10-08 19:59:04 +0000 UTC" firstStartedPulling="2025-10-08 19:59:05.698972353 +0000 UTC m=+6501.611943366" lastFinishedPulling="2025-10-08 19:59:10.600675139 +0000 UTC m=+6506.513646152" observedRunningTime="2025-10-08 19:59:11.540200142 +0000 UTC m=+6507.453171165" watchObservedRunningTime="2025-10-08 19:59:11.5542636 +0000 UTC m=+6507.467234673" Oct 08 19:59:13 crc kubenswrapper[4750]: I1008 19:59:13.568398 4750 generic.go:334] "Generic (PLEG): container finished" podID="d47a90c8-a408-4b44-a6f5-e65897cce31b" containerID="3ce0b45918529d74153ef2a496bf4808f6ac6063700592aa4a2398c3c8f08a0e" exitCode=0 Oct 08 19:59:13 crc kubenswrapper[4750]: I1008 19:59:13.568512 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/aodh-db-sync-cmw5k" event={"ID":"d47a90c8-a408-4b44-a6f5-e65897cce31b","Type":"ContainerDied","Data":"3ce0b45918529d74153ef2a496bf4808f6ac6063700592aa4a2398c3c8f08a0e"} Oct 08 19:59:15 crc kubenswrapper[4750]: I1008 19:59:15.129918 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-cmw5k" Oct 08 19:59:15 crc kubenswrapper[4750]: I1008 19:59:15.230590 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d47a90c8-a408-4b44-a6f5-e65897cce31b-combined-ca-bundle\") pod \"d47a90c8-a408-4b44-a6f5-e65897cce31b\" (UID: \"d47a90c8-a408-4b44-a6f5-e65897cce31b\") " Oct 08 19:59:15 crc kubenswrapper[4750]: I1008 19:59:15.231207 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jthlx\" (UniqueName: \"kubernetes.io/projected/d47a90c8-a408-4b44-a6f5-e65897cce31b-kube-api-access-jthlx\") pod \"d47a90c8-a408-4b44-a6f5-e65897cce31b\" (UID: \"d47a90c8-a408-4b44-a6f5-e65897cce31b\") " Oct 08 19:59:15 crc kubenswrapper[4750]: I1008 19:59:15.231258 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d47a90c8-a408-4b44-a6f5-e65897cce31b-config-data\") pod \"d47a90c8-a408-4b44-a6f5-e65897cce31b\" (UID: \"d47a90c8-a408-4b44-a6f5-e65897cce31b\") " Oct 08 19:59:15 crc kubenswrapper[4750]: I1008 19:59:15.231473 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d47a90c8-a408-4b44-a6f5-e65897cce31b-scripts\") pod \"d47a90c8-a408-4b44-a6f5-e65897cce31b\" (UID: \"d47a90c8-a408-4b44-a6f5-e65897cce31b\") " Oct 08 19:59:15 crc kubenswrapper[4750]: I1008 19:59:15.241941 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d47a90c8-a408-4b44-a6f5-e65897cce31b-kube-api-access-jthlx" 
(OuterVolumeSpecName: "kube-api-access-jthlx") pod "d47a90c8-a408-4b44-a6f5-e65897cce31b" (UID: "d47a90c8-a408-4b44-a6f5-e65897cce31b"). InnerVolumeSpecName "kube-api-access-jthlx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:59:15 crc kubenswrapper[4750]: I1008 19:59:15.244887 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d47a90c8-a408-4b44-a6f5-e65897cce31b-scripts" (OuterVolumeSpecName: "scripts") pod "d47a90c8-a408-4b44-a6f5-e65897cce31b" (UID: "d47a90c8-a408-4b44-a6f5-e65897cce31b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:59:15 crc kubenswrapper[4750]: I1008 19:59:15.290433 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d47a90c8-a408-4b44-a6f5-e65897cce31b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d47a90c8-a408-4b44-a6f5-e65897cce31b" (UID: "d47a90c8-a408-4b44-a6f5-e65897cce31b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:59:15 crc kubenswrapper[4750]: I1008 19:59:15.294937 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d47a90c8-a408-4b44-a6f5-e65897cce31b-config-data" (OuterVolumeSpecName: "config-data") pod "d47a90c8-a408-4b44-a6f5-e65897cce31b" (UID: "d47a90c8-a408-4b44-a6f5-e65897cce31b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:59:15 crc kubenswrapper[4750]: I1008 19:59:15.335669 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d47a90c8-a408-4b44-a6f5-e65897cce31b-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 19:59:15 crc kubenswrapper[4750]: I1008 19:59:15.335715 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d47a90c8-a408-4b44-a6f5-e65897cce31b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 19:59:15 crc kubenswrapper[4750]: I1008 19:59:15.335733 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jthlx\" (UniqueName: \"kubernetes.io/projected/d47a90c8-a408-4b44-a6f5-e65897cce31b-kube-api-access-jthlx\") on node \"crc\" DevicePath \"\"" Oct 08 19:59:15 crc kubenswrapper[4750]: I1008 19:59:15.335758 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d47a90c8-a408-4b44-a6f5-e65897cce31b-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 19:59:15 crc kubenswrapper[4750]: I1008 19:59:15.596571 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-cmw5k" event={"ID":"d47a90c8-a408-4b44-a6f5-e65897cce31b","Type":"ContainerDied","Data":"4495d3362c729b14379260694ea6ada97d9dcc7d4188c277065b8a1c5f6ff6c6"} Oct 08 19:59:15 crc kubenswrapper[4750]: I1008 19:59:15.596628 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4495d3362c729b14379260694ea6ada97d9dcc7d4188c277065b8a1c5f6ff6c6" Oct 08 19:59:15 crc kubenswrapper[4750]: I1008 19:59:15.596751 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-cmw5k" Oct 08 19:59:19 crc kubenswrapper[4750]: I1008 19:59:19.890565 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Oct 08 19:59:19 crc kubenswrapper[4750]: E1008 19:59:19.892144 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d47a90c8-a408-4b44-a6f5-e65897cce31b" containerName="aodh-db-sync" Oct 08 19:59:19 crc kubenswrapper[4750]: I1008 19:59:19.892164 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="d47a90c8-a408-4b44-a6f5-e65897cce31b" containerName="aodh-db-sync" Oct 08 19:59:19 crc kubenswrapper[4750]: I1008 19:59:19.892455 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="d47a90c8-a408-4b44-a6f5-e65897cce31b" containerName="aodh-db-sync" Oct 08 19:59:19 crc kubenswrapper[4750]: I1008 19:59:19.895407 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Oct 08 19:59:19 crc kubenswrapper[4750]: I1008 19:59:19.902182 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Oct 08 19:59:19 crc kubenswrapper[4750]: I1008 19:59:19.902489 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-kr56s" Oct 08 19:59:19 crc kubenswrapper[4750]: I1008 19:59:19.903238 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Oct 08 19:59:19 crc kubenswrapper[4750]: I1008 19:59:19.923295 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 08 19:59:19 crc kubenswrapper[4750]: I1008 19:59:19.962196 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/100e9e30-b2db-45f0-afe1-785b450f8382-combined-ca-bundle\") pod \"aodh-0\" (UID: \"100e9e30-b2db-45f0-afe1-785b450f8382\") " pod="openstack/aodh-0" Oct 08 19:59:19 crc 
kubenswrapper[4750]: I1008 19:59:19.962507 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/100e9e30-b2db-45f0-afe1-785b450f8382-scripts\") pod \"aodh-0\" (UID: \"100e9e30-b2db-45f0-afe1-785b450f8382\") " pod="openstack/aodh-0" Oct 08 19:59:19 crc kubenswrapper[4750]: I1008 19:59:19.962721 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/100e9e30-b2db-45f0-afe1-785b450f8382-config-data\") pod \"aodh-0\" (UID: \"100e9e30-b2db-45f0-afe1-785b450f8382\") " pod="openstack/aodh-0" Oct 08 19:59:19 crc kubenswrapper[4750]: I1008 19:59:19.962815 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnnws\" (UniqueName: \"kubernetes.io/projected/100e9e30-b2db-45f0-afe1-785b450f8382-kube-api-access-fnnws\") pod \"aodh-0\" (UID: \"100e9e30-b2db-45f0-afe1-785b450f8382\") " pod="openstack/aodh-0" Oct 08 19:59:20 crc kubenswrapper[4750]: I1008 19:59:20.071654 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/100e9e30-b2db-45f0-afe1-785b450f8382-scripts\") pod \"aodh-0\" (UID: \"100e9e30-b2db-45f0-afe1-785b450f8382\") " pod="openstack/aodh-0" Oct 08 19:59:20 crc kubenswrapper[4750]: I1008 19:59:20.071892 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/100e9e30-b2db-45f0-afe1-785b450f8382-config-data\") pod \"aodh-0\" (UID: \"100e9e30-b2db-45f0-afe1-785b450f8382\") " pod="openstack/aodh-0" Oct 08 19:59:20 crc kubenswrapper[4750]: I1008 19:59:20.071946 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnnws\" (UniqueName: \"kubernetes.io/projected/100e9e30-b2db-45f0-afe1-785b450f8382-kube-api-access-fnnws\") pod 
\"aodh-0\" (UID: \"100e9e30-b2db-45f0-afe1-785b450f8382\") " pod="openstack/aodh-0" Oct 08 19:59:20 crc kubenswrapper[4750]: I1008 19:59:20.072255 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/100e9e30-b2db-45f0-afe1-785b450f8382-combined-ca-bundle\") pod \"aodh-0\" (UID: \"100e9e30-b2db-45f0-afe1-785b450f8382\") " pod="openstack/aodh-0" Oct 08 19:59:20 crc kubenswrapper[4750]: I1008 19:59:20.080127 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/100e9e30-b2db-45f0-afe1-785b450f8382-combined-ca-bundle\") pod \"aodh-0\" (UID: \"100e9e30-b2db-45f0-afe1-785b450f8382\") " pod="openstack/aodh-0" Oct 08 19:59:20 crc kubenswrapper[4750]: I1008 19:59:20.082503 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/100e9e30-b2db-45f0-afe1-785b450f8382-scripts\") pod \"aodh-0\" (UID: \"100e9e30-b2db-45f0-afe1-785b450f8382\") " pod="openstack/aodh-0" Oct 08 19:59:20 crc kubenswrapper[4750]: I1008 19:59:20.086746 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/100e9e30-b2db-45f0-afe1-785b450f8382-config-data\") pod \"aodh-0\" (UID: \"100e9e30-b2db-45f0-afe1-785b450f8382\") " pod="openstack/aodh-0" Oct 08 19:59:20 crc kubenswrapper[4750]: I1008 19:59:20.099625 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnnws\" (UniqueName: \"kubernetes.io/projected/100e9e30-b2db-45f0-afe1-785b450f8382-kube-api-access-fnnws\") pod \"aodh-0\" (UID: \"100e9e30-b2db-45f0-afe1-785b450f8382\") " pod="openstack/aodh-0" Oct 08 19:59:20 crc kubenswrapper[4750]: I1008 19:59:20.225063 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Oct 08 19:59:20 crc kubenswrapper[4750]: I1008 19:59:20.873657 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 08 19:59:20 crc kubenswrapper[4750]: I1008 19:59:20.887015 4750 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 19:59:21 crc kubenswrapper[4750]: I1008 19:59:21.674890 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"100e9e30-b2db-45f0-afe1-785b450f8382","Type":"ContainerStarted","Data":"9301c158c4e8d738e0e51a94ab316e2998ef92aa7c5c8ef5ebfe8384d1fbc43a"} Oct 08 19:59:21 crc kubenswrapper[4750]: I1008 19:59:21.676745 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"100e9e30-b2db-45f0-afe1-785b450f8382","Type":"ContainerStarted","Data":"4741a1a3c6f0fdd4addf59ddeb8fc57e8235615d13cea1bf2bcdce770edf5c76"} Oct 08 19:59:22 crc kubenswrapper[4750]: I1008 19:59:22.384635 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 19:59:22 crc kubenswrapper[4750]: I1008 19:59:22.385524 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="69935543-c19c-4c1f-9f4c-d17ac44f6597" containerName="ceilometer-central-agent" containerID="cri-o://cba10b738bcb7df83d39b4592d37851cd452f7068475bc74f652a499ef43440e" gracePeriod=30 Oct 08 19:59:22 crc kubenswrapper[4750]: I1008 19:59:22.386320 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="69935543-c19c-4c1f-9f4c-d17ac44f6597" containerName="proxy-httpd" containerID="cri-o://7b6fcfa010f54de173075114eeaf246229ef02d6d67830d3f8f43251f6ae55c4" gracePeriod=30 Oct 08 19:59:22 crc kubenswrapper[4750]: I1008 19:59:22.386392 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="69935543-c19c-4c1f-9f4c-d17ac44f6597" 
containerName="sg-core" containerID="cri-o://de89abcbc19a64006e30fdd9b24e1707b8d326f67ba10627a412587de408f4f0" gracePeriod=30 Oct 08 19:59:22 crc kubenswrapper[4750]: I1008 19:59:22.386464 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="69935543-c19c-4c1f-9f4c-d17ac44f6597" containerName="ceilometer-notification-agent" containerID="cri-o://a6617b4457feb8f66a14c8d8c3a3ea83f8d0468b6893b02bcfb9912124d10885" gracePeriod=30 Oct 08 19:59:22 crc kubenswrapper[4750]: I1008 19:59:22.693099 4750 generic.go:334] "Generic (PLEG): container finished" podID="69935543-c19c-4c1f-9f4c-d17ac44f6597" containerID="7b6fcfa010f54de173075114eeaf246229ef02d6d67830d3f8f43251f6ae55c4" exitCode=0 Oct 08 19:59:22 crc kubenswrapper[4750]: I1008 19:59:22.693147 4750 generic.go:334] "Generic (PLEG): container finished" podID="69935543-c19c-4c1f-9f4c-d17ac44f6597" containerID="de89abcbc19a64006e30fdd9b24e1707b8d326f67ba10627a412587de408f4f0" exitCode=2 Oct 08 19:59:22 crc kubenswrapper[4750]: I1008 19:59:22.693177 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69935543-c19c-4c1f-9f4c-d17ac44f6597","Type":"ContainerDied","Data":"7b6fcfa010f54de173075114eeaf246229ef02d6d67830d3f8f43251f6ae55c4"} Oct 08 19:59:22 crc kubenswrapper[4750]: I1008 19:59:22.693243 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69935543-c19c-4c1f-9f4c-d17ac44f6597","Type":"ContainerDied","Data":"de89abcbc19a64006e30fdd9b24e1707b8d326f67ba10627a412587de408f4f0"} Oct 08 19:59:23 crc kubenswrapper[4750]: I1008 19:59:23.714935 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 19:59:23 crc kubenswrapper[4750]: I1008 19:59:23.729976 4750 generic.go:334] "Generic (PLEG): container finished" podID="69935543-c19c-4c1f-9f4c-d17ac44f6597" containerID="a6617b4457feb8f66a14c8d8c3a3ea83f8d0468b6893b02bcfb9912124d10885" exitCode=0 Oct 08 19:59:23 crc kubenswrapper[4750]: I1008 19:59:23.730013 4750 generic.go:334] "Generic (PLEG): container finished" podID="69935543-c19c-4c1f-9f4c-d17ac44f6597" containerID="cba10b738bcb7df83d39b4592d37851cd452f7068475bc74f652a499ef43440e" exitCode=0 Oct 08 19:59:23 crc kubenswrapper[4750]: I1008 19:59:23.730026 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69935543-c19c-4c1f-9f4c-d17ac44f6597","Type":"ContainerDied","Data":"a6617b4457feb8f66a14c8d8c3a3ea83f8d0468b6893b02bcfb9912124d10885"} Oct 08 19:59:23 crc kubenswrapper[4750]: I1008 19:59:23.730103 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69935543-c19c-4c1f-9f4c-d17ac44f6597","Type":"ContainerDied","Data":"cba10b738bcb7df83d39b4592d37851cd452f7068475bc74f652a499ef43440e"} Oct 08 19:59:23 crc kubenswrapper[4750]: I1008 19:59:23.730125 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69935543-c19c-4c1f-9f4c-d17ac44f6597","Type":"ContainerDied","Data":"c1f09a2d43711ae1007ad304a62dc0ca37fc35f71e8d0b7da25a755149529282"} Oct 08 19:59:23 crc kubenswrapper[4750]: I1008 19:59:23.730144 4750 scope.go:117] "RemoveContainer" containerID="7b6fcfa010f54de173075114eeaf246229ef02d6d67830d3f8f43251f6ae55c4" Oct 08 19:59:23 crc kubenswrapper[4750]: I1008 19:59:23.743725 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"100e9e30-b2db-45f0-afe1-785b450f8382","Type":"ContainerStarted","Data":"697da5b1d6c6f3ef3ba57dd8f741fdfb64309a09812fd77cea8f983f04a5625b"} Oct 08 19:59:23 crc kubenswrapper[4750]: I1008 19:59:23.808001 4750 
scope.go:117] "RemoveContainer" containerID="de89abcbc19a64006e30fdd9b24e1707b8d326f67ba10627a412587de408f4f0" Oct 08 19:59:23 crc kubenswrapper[4750]: I1008 19:59:23.809512 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69935543-c19c-4c1f-9f4c-d17ac44f6597-config-data\") pod \"69935543-c19c-4c1f-9f4c-d17ac44f6597\" (UID: \"69935543-c19c-4c1f-9f4c-d17ac44f6597\") " Oct 08 19:59:23 crc kubenswrapper[4750]: I1008 19:59:23.809668 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kkz9\" (UniqueName: \"kubernetes.io/projected/69935543-c19c-4c1f-9f4c-d17ac44f6597-kube-api-access-6kkz9\") pod \"69935543-c19c-4c1f-9f4c-d17ac44f6597\" (UID: \"69935543-c19c-4c1f-9f4c-d17ac44f6597\") " Oct 08 19:59:23 crc kubenswrapper[4750]: I1008 19:59:23.809729 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69935543-c19c-4c1f-9f4c-d17ac44f6597-combined-ca-bundle\") pod \"69935543-c19c-4c1f-9f4c-d17ac44f6597\" (UID: \"69935543-c19c-4c1f-9f4c-d17ac44f6597\") " Oct 08 19:59:23 crc kubenswrapper[4750]: I1008 19:59:23.809797 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69935543-c19c-4c1f-9f4c-d17ac44f6597-log-httpd\") pod \"69935543-c19c-4c1f-9f4c-d17ac44f6597\" (UID: \"69935543-c19c-4c1f-9f4c-d17ac44f6597\") " Oct 08 19:59:23 crc kubenswrapper[4750]: I1008 19:59:23.809945 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69935543-c19c-4c1f-9f4c-d17ac44f6597-scripts\") pod \"69935543-c19c-4c1f-9f4c-d17ac44f6597\" (UID: \"69935543-c19c-4c1f-9f4c-d17ac44f6597\") " Oct 08 19:59:23 crc kubenswrapper[4750]: I1008 19:59:23.810153 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69935543-c19c-4c1f-9f4c-d17ac44f6597-sg-core-conf-yaml\") pod \"69935543-c19c-4c1f-9f4c-d17ac44f6597\" (UID: \"69935543-c19c-4c1f-9f4c-d17ac44f6597\") " Oct 08 19:59:23 crc kubenswrapper[4750]: I1008 19:59:23.810197 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69935543-c19c-4c1f-9f4c-d17ac44f6597-run-httpd\") pod \"69935543-c19c-4c1f-9f4c-d17ac44f6597\" (UID: \"69935543-c19c-4c1f-9f4c-d17ac44f6597\") " Oct 08 19:59:23 crc kubenswrapper[4750]: I1008 19:59:23.820102 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69935543-c19c-4c1f-9f4c-d17ac44f6597-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "69935543-c19c-4c1f-9f4c-d17ac44f6597" (UID: "69935543-c19c-4c1f-9f4c-d17ac44f6597"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:59:23 crc kubenswrapper[4750]: I1008 19:59:23.820998 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69935543-c19c-4c1f-9f4c-d17ac44f6597-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "69935543-c19c-4c1f-9f4c-d17ac44f6597" (UID: "69935543-c19c-4c1f-9f4c-d17ac44f6597"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 19:59:23 crc kubenswrapper[4750]: I1008 19:59:23.871820 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69935543-c19c-4c1f-9f4c-d17ac44f6597-scripts" (OuterVolumeSpecName: "scripts") pod "69935543-c19c-4c1f-9f4c-d17ac44f6597" (UID: "69935543-c19c-4c1f-9f4c-d17ac44f6597"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:59:23 crc kubenswrapper[4750]: I1008 19:59:23.878122 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69935543-c19c-4c1f-9f4c-d17ac44f6597-kube-api-access-6kkz9" (OuterVolumeSpecName: "kube-api-access-6kkz9") pod "69935543-c19c-4c1f-9f4c-d17ac44f6597" (UID: "69935543-c19c-4c1f-9f4c-d17ac44f6597"). InnerVolumeSpecName "kube-api-access-6kkz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:59:23 crc kubenswrapper[4750]: I1008 19:59:23.918413 4750 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69935543-c19c-4c1f-9f4c-d17ac44f6597-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 19:59:23 crc kubenswrapper[4750]: I1008 19:59:23.918462 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kkz9\" (UniqueName: \"kubernetes.io/projected/69935543-c19c-4c1f-9f4c-d17ac44f6597-kube-api-access-6kkz9\") on node \"crc\" DevicePath \"\"" Oct 08 19:59:23 crc kubenswrapper[4750]: I1008 19:59:23.918474 4750 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69935543-c19c-4c1f-9f4c-d17ac44f6597-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 19:59:23 crc kubenswrapper[4750]: I1008 19:59:23.918488 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69935543-c19c-4c1f-9f4c-d17ac44f6597-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 19:59:23 crc kubenswrapper[4750]: I1008 19:59:23.948797 4750 scope.go:117] "RemoveContainer" containerID="a6617b4457feb8f66a14c8d8c3a3ea83f8d0468b6893b02bcfb9912124d10885" Oct 08 19:59:23 crc kubenswrapper[4750]: I1008 19:59:23.951799 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69935543-c19c-4c1f-9f4c-d17ac44f6597-sg-core-conf-yaml" (OuterVolumeSpecName: 
"sg-core-conf-yaml") pod "69935543-c19c-4c1f-9f4c-d17ac44f6597" (UID: "69935543-c19c-4c1f-9f4c-d17ac44f6597"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:59:23 crc kubenswrapper[4750]: I1008 19:59:23.992739 4750 scope.go:117] "RemoveContainer" containerID="cba10b738bcb7df83d39b4592d37851cd452f7068475bc74f652a499ef43440e" Oct 08 19:59:24 crc kubenswrapper[4750]: I1008 19:59:24.018157 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69935543-c19c-4c1f-9f4c-d17ac44f6597-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69935543-c19c-4c1f-9f4c-d17ac44f6597" (UID: "69935543-c19c-4c1f-9f4c-d17ac44f6597"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:59:24 crc kubenswrapper[4750]: I1008 19:59:24.021142 4750 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69935543-c19c-4c1f-9f4c-d17ac44f6597-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 08 19:59:24 crc kubenswrapper[4750]: I1008 19:59:24.021289 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69935543-c19c-4c1f-9f4c-d17ac44f6597-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 19:59:24 crc kubenswrapper[4750]: I1008 19:59:24.031901 4750 scope.go:117] "RemoveContainer" containerID="7b6fcfa010f54de173075114eeaf246229ef02d6d67830d3f8f43251f6ae55c4" Oct 08 19:59:24 crc kubenswrapper[4750]: E1008 19:59:24.032381 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b6fcfa010f54de173075114eeaf246229ef02d6d67830d3f8f43251f6ae55c4\": container with ID starting with 7b6fcfa010f54de173075114eeaf246229ef02d6d67830d3f8f43251f6ae55c4 not found: ID does not exist" 
containerID="7b6fcfa010f54de173075114eeaf246229ef02d6d67830d3f8f43251f6ae55c4" Oct 08 19:59:24 crc kubenswrapper[4750]: I1008 19:59:24.032446 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b6fcfa010f54de173075114eeaf246229ef02d6d67830d3f8f43251f6ae55c4"} err="failed to get container status \"7b6fcfa010f54de173075114eeaf246229ef02d6d67830d3f8f43251f6ae55c4\": rpc error: code = NotFound desc = could not find container \"7b6fcfa010f54de173075114eeaf246229ef02d6d67830d3f8f43251f6ae55c4\": container with ID starting with 7b6fcfa010f54de173075114eeaf246229ef02d6d67830d3f8f43251f6ae55c4 not found: ID does not exist" Oct 08 19:59:24 crc kubenswrapper[4750]: I1008 19:59:24.032484 4750 scope.go:117] "RemoveContainer" containerID="de89abcbc19a64006e30fdd9b24e1707b8d326f67ba10627a412587de408f4f0" Oct 08 19:59:24 crc kubenswrapper[4750]: E1008 19:59:24.032999 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de89abcbc19a64006e30fdd9b24e1707b8d326f67ba10627a412587de408f4f0\": container with ID starting with de89abcbc19a64006e30fdd9b24e1707b8d326f67ba10627a412587de408f4f0 not found: ID does not exist" containerID="de89abcbc19a64006e30fdd9b24e1707b8d326f67ba10627a412587de408f4f0" Oct 08 19:59:24 crc kubenswrapper[4750]: I1008 19:59:24.033044 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de89abcbc19a64006e30fdd9b24e1707b8d326f67ba10627a412587de408f4f0"} err="failed to get container status \"de89abcbc19a64006e30fdd9b24e1707b8d326f67ba10627a412587de408f4f0\": rpc error: code = NotFound desc = could not find container \"de89abcbc19a64006e30fdd9b24e1707b8d326f67ba10627a412587de408f4f0\": container with ID starting with de89abcbc19a64006e30fdd9b24e1707b8d326f67ba10627a412587de408f4f0 not found: ID does not exist" Oct 08 19:59:24 crc kubenswrapper[4750]: I1008 19:59:24.033076 4750 scope.go:117] 
"RemoveContainer" containerID="a6617b4457feb8f66a14c8d8c3a3ea83f8d0468b6893b02bcfb9912124d10885" Oct 08 19:59:24 crc kubenswrapper[4750]: E1008 19:59:24.033367 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6617b4457feb8f66a14c8d8c3a3ea83f8d0468b6893b02bcfb9912124d10885\": container with ID starting with a6617b4457feb8f66a14c8d8c3a3ea83f8d0468b6893b02bcfb9912124d10885 not found: ID does not exist" containerID="a6617b4457feb8f66a14c8d8c3a3ea83f8d0468b6893b02bcfb9912124d10885" Oct 08 19:59:24 crc kubenswrapper[4750]: I1008 19:59:24.033389 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6617b4457feb8f66a14c8d8c3a3ea83f8d0468b6893b02bcfb9912124d10885"} err="failed to get container status \"a6617b4457feb8f66a14c8d8c3a3ea83f8d0468b6893b02bcfb9912124d10885\": rpc error: code = NotFound desc = could not find container \"a6617b4457feb8f66a14c8d8c3a3ea83f8d0468b6893b02bcfb9912124d10885\": container with ID starting with a6617b4457feb8f66a14c8d8c3a3ea83f8d0468b6893b02bcfb9912124d10885 not found: ID does not exist" Oct 08 19:59:24 crc kubenswrapper[4750]: I1008 19:59:24.033430 4750 scope.go:117] "RemoveContainer" containerID="cba10b738bcb7df83d39b4592d37851cd452f7068475bc74f652a499ef43440e" Oct 08 19:59:24 crc kubenswrapper[4750]: E1008 19:59:24.033916 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cba10b738bcb7df83d39b4592d37851cd452f7068475bc74f652a499ef43440e\": container with ID starting with cba10b738bcb7df83d39b4592d37851cd452f7068475bc74f652a499ef43440e not found: ID does not exist" containerID="cba10b738bcb7df83d39b4592d37851cd452f7068475bc74f652a499ef43440e" Oct 08 19:59:24 crc kubenswrapper[4750]: I1008 19:59:24.033943 4750 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cba10b738bcb7df83d39b4592d37851cd452f7068475bc74f652a499ef43440e"} err="failed to get container status \"cba10b738bcb7df83d39b4592d37851cd452f7068475bc74f652a499ef43440e\": rpc error: code = NotFound desc = could not find container \"cba10b738bcb7df83d39b4592d37851cd452f7068475bc74f652a499ef43440e\": container with ID starting with cba10b738bcb7df83d39b4592d37851cd452f7068475bc74f652a499ef43440e not found: ID does not exist" Oct 08 19:59:24 crc kubenswrapper[4750]: I1008 19:59:24.033977 4750 scope.go:117] "RemoveContainer" containerID="7b6fcfa010f54de173075114eeaf246229ef02d6d67830d3f8f43251f6ae55c4" Oct 08 19:59:24 crc kubenswrapper[4750]: I1008 19:59:24.036267 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b6fcfa010f54de173075114eeaf246229ef02d6d67830d3f8f43251f6ae55c4"} err="failed to get container status \"7b6fcfa010f54de173075114eeaf246229ef02d6d67830d3f8f43251f6ae55c4\": rpc error: code = NotFound desc = could not find container \"7b6fcfa010f54de173075114eeaf246229ef02d6d67830d3f8f43251f6ae55c4\": container with ID starting with 7b6fcfa010f54de173075114eeaf246229ef02d6d67830d3f8f43251f6ae55c4 not found: ID does not exist" Oct 08 19:59:24 crc kubenswrapper[4750]: I1008 19:59:24.036285 4750 scope.go:117] "RemoveContainer" containerID="de89abcbc19a64006e30fdd9b24e1707b8d326f67ba10627a412587de408f4f0" Oct 08 19:59:24 crc kubenswrapper[4750]: I1008 19:59:24.036565 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de89abcbc19a64006e30fdd9b24e1707b8d326f67ba10627a412587de408f4f0"} err="failed to get container status \"de89abcbc19a64006e30fdd9b24e1707b8d326f67ba10627a412587de408f4f0\": rpc error: code = NotFound desc = could not find container \"de89abcbc19a64006e30fdd9b24e1707b8d326f67ba10627a412587de408f4f0\": container with ID starting with de89abcbc19a64006e30fdd9b24e1707b8d326f67ba10627a412587de408f4f0 not found: ID does not 
exist" Oct 08 19:59:24 crc kubenswrapper[4750]: I1008 19:59:24.036591 4750 scope.go:117] "RemoveContainer" containerID="a6617b4457feb8f66a14c8d8c3a3ea83f8d0468b6893b02bcfb9912124d10885" Oct 08 19:59:24 crc kubenswrapper[4750]: I1008 19:59:24.038910 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6617b4457feb8f66a14c8d8c3a3ea83f8d0468b6893b02bcfb9912124d10885"} err="failed to get container status \"a6617b4457feb8f66a14c8d8c3a3ea83f8d0468b6893b02bcfb9912124d10885\": rpc error: code = NotFound desc = could not find container \"a6617b4457feb8f66a14c8d8c3a3ea83f8d0468b6893b02bcfb9912124d10885\": container with ID starting with a6617b4457feb8f66a14c8d8c3a3ea83f8d0468b6893b02bcfb9912124d10885 not found: ID does not exist" Oct 08 19:59:24 crc kubenswrapper[4750]: I1008 19:59:24.038989 4750 scope.go:117] "RemoveContainer" containerID="cba10b738bcb7df83d39b4592d37851cd452f7068475bc74f652a499ef43440e" Oct 08 19:59:24 crc kubenswrapper[4750]: I1008 19:59:24.039418 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cba10b738bcb7df83d39b4592d37851cd452f7068475bc74f652a499ef43440e"} err="failed to get container status \"cba10b738bcb7df83d39b4592d37851cd452f7068475bc74f652a499ef43440e\": rpc error: code = NotFound desc = could not find container \"cba10b738bcb7df83d39b4592d37851cd452f7068475bc74f652a499ef43440e\": container with ID starting with cba10b738bcb7df83d39b4592d37851cd452f7068475bc74f652a499ef43440e not found: ID does not exist" Oct 08 19:59:24 crc kubenswrapper[4750]: I1008 19:59:24.074664 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69935543-c19c-4c1f-9f4c-d17ac44f6597-config-data" (OuterVolumeSpecName: "config-data") pod "69935543-c19c-4c1f-9f4c-d17ac44f6597" (UID: "69935543-c19c-4c1f-9f4c-d17ac44f6597"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 19:59:24 crc kubenswrapper[4750]: I1008 19:59:24.125718 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69935543-c19c-4c1f-9f4c-d17ac44f6597-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 19:59:24 crc kubenswrapper[4750]: I1008 19:59:24.773858 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 19:59:24 crc kubenswrapper[4750]: I1008 19:59:24.935176 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 19:59:24 crc kubenswrapper[4750]: I1008 19:59:24.944686 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 08 19:59:24 crc kubenswrapper[4750]: I1008 19:59:24.985049 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 08 19:59:24 crc kubenswrapper[4750]: E1008 19:59:24.985629 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69935543-c19c-4c1f-9f4c-d17ac44f6597" containerName="ceilometer-notification-agent" Oct 08 19:59:24 crc kubenswrapper[4750]: I1008 19:59:24.985651 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="69935543-c19c-4c1f-9f4c-d17ac44f6597" containerName="ceilometer-notification-agent" Oct 08 19:59:24 crc kubenswrapper[4750]: E1008 19:59:24.985670 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69935543-c19c-4c1f-9f4c-d17ac44f6597" containerName="ceilometer-central-agent" Oct 08 19:59:24 crc kubenswrapper[4750]: I1008 19:59:24.985679 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="69935543-c19c-4c1f-9f4c-d17ac44f6597" containerName="ceilometer-central-agent" Oct 08 19:59:24 crc kubenswrapper[4750]: E1008 19:59:24.985699 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69935543-c19c-4c1f-9f4c-d17ac44f6597" containerName="proxy-httpd" Oct 08 19:59:24 crc 
kubenswrapper[4750]: I1008 19:59:24.985706 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="69935543-c19c-4c1f-9f4c-d17ac44f6597" containerName="proxy-httpd" Oct 08 19:59:24 crc kubenswrapper[4750]: E1008 19:59:24.985758 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69935543-c19c-4c1f-9f4c-d17ac44f6597" containerName="sg-core" Oct 08 19:59:24 crc kubenswrapper[4750]: I1008 19:59:24.985764 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="69935543-c19c-4c1f-9f4c-d17ac44f6597" containerName="sg-core" Oct 08 19:59:24 crc kubenswrapper[4750]: I1008 19:59:24.985957 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="69935543-c19c-4c1f-9f4c-d17ac44f6597" containerName="proxy-httpd" Oct 08 19:59:24 crc kubenswrapper[4750]: I1008 19:59:24.985974 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="69935543-c19c-4c1f-9f4c-d17ac44f6597" containerName="sg-core" Oct 08 19:59:24 crc kubenswrapper[4750]: I1008 19:59:24.985993 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="69935543-c19c-4c1f-9f4c-d17ac44f6597" containerName="ceilometer-central-agent" Oct 08 19:59:24 crc kubenswrapper[4750]: I1008 19:59:24.986004 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="69935543-c19c-4c1f-9f4c-d17ac44f6597" containerName="ceilometer-notification-agent" Oct 08 19:59:24 crc kubenswrapper[4750]: I1008 19:59:24.988109 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 19:59:24 crc kubenswrapper[4750]: I1008 19:59:24.994482 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 08 19:59:24 crc kubenswrapper[4750]: I1008 19:59:24.994799 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 08 19:59:25 crc kubenswrapper[4750]: I1008 19:59:25.004304 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 19:59:25 crc kubenswrapper[4750]: I1008 19:59:25.151815 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnvxq\" (UniqueName: \"kubernetes.io/projected/984335f4-0d17-4391-947f-ad63864a588d-kube-api-access-hnvxq\") pod \"ceilometer-0\" (UID: \"984335f4-0d17-4391-947f-ad63864a588d\") " pod="openstack/ceilometer-0" Oct 08 19:59:25 crc kubenswrapper[4750]: I1008 19:59:25.151884 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/984335f4-0d17-4391-947f-ad63864a588d-run-httpd\") pod \"ceilometer-0\" (UID: \"984335f4-0d17-4391-947f-ad63864a588d\") " pod="openstack/ceilometer-0" Oct 08 19:59:25 crc kubenswrapper[4750]: I1008 19:59:25.151923 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/984335f4-0d17-4391-947f-ad63864a588d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"984335f4-0d17-4391-947f-ad63864a588d\") " pod="openstack/ceilometer-0" Oct 08 19:59:25 crc kubenswrapper[4750]: I1008 19:59:25.152398 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/984335f4-0d17-4391-947f-ad63864a588d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"984335f4-0d17-4391-947f-ad63864a588d\") " pod="openstack/ceilometer-0" Oct 08 19:59:25 crc kubenswrapper[4750]: I1008 19:59:25.152461 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/984335f4-0d17-4391-947f-ad63864a588d-log-httpd\") pod \"ceilometer-0\" (UID: \"984335f4-0d17-4391-947f-ad63864a588d\") " pod="openstack/ceilometer-0" Oct 08 19:59:25 crc kubenswrapper[4750]: I1008 19:59:25.152658 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/984335f4-0d17-4391-947f-ad63864a588d-config-data\") pod \"ceilometer-0\" (UID: \"984335f4-0d17-4391-947f-ad63864a588d\") " pod="openstack/ceilometer-0" Oct 08 19:59:25 crc kubenswrapper[4750]: I1008 19:59:25.152720 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/984335f4-0d17-4391-947f-ad63864a588d-scripts\") pod \"ceilometer-0\" (UID: \"984335f4-0d17-4391-947f-ad63864a588d\") " pod="openstack/ceilometer-0" Oct 08 19:59:25 crc kubenswrapper[4750]: I1008 19:59:25.254909 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/984335f4-0d17-4391-947f-ad63864a588d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"984335f4-0d17-4391-947f-ad63864a588d\") " pod="openstack/ceilometer-0" Oct 08 19:59:25 crc kubenswrapper[4750]: I1008 19:59:25.254975 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/984335f4-0d17-4391-947f-ad63864a588d-log-httpd\") pod \"ceilometer-0\" (UID: \"984335f4-0d17-4391-947f-ad63864a588d\") " pod="openstack/ceilometer-0" Oct 08 19:59:25 crc kubenswrapper[4750]: I1008 19:59:25.255037 4750 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/984335f4-0d17-4391-947f-ad63864a588d-config-data\") pod \"ceilometer-0\" (UID: \"984335f4-0d17-4391-947f-ad63864a588d\") " pod="openstack/ceilometer-0" Oct 08 19:59:25 crc kubenswrapper[4750]: I1008 19:59:25.255058 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/984335f4-0d17-4391-947f-ad63864a588d-scripts\") pod \"ceilometer-0\" (UID: \"984335f4-0d17-4391-947f-ad63864a588d\") " pod="openstack/ceilometer-0" Oct 08 19:59:25 crc kubenswrapper[4750]: I1008 19:59:25.255836 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/984335f4-0d17-4391-947f-ad63864a588d-log-httpd\") pod \"ceilometer-0\" (UID: \"984335f4-0d17-4391-947f-ad63864a588d\") " pod="openstack/ceilometer-0" Oct 08 19:59:25 crc kubenswrapper[4750]: I1008 19:59:25.257111 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnvxq\" (UniqueName: \"kubernetes.io/projected/984335f4-0d17-4391-947f-ad63864a588d-kube-api-access-hnvxq\") pod \"ceilometer-0\" (UID: \"984335f4-0d17-4391-947f-ad63864a588d\") " pod="openstack/ceilometer-0" Oct 08 19:59:25 crc kubenswrapper[4750]: I1008 19:59:25.257184 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/984335f4-0d17-4391-947f-ad63864a588d-run-httpd\") pod \"ceilometer-0\" (UID: \"984335f4-0d17-4391-947f-ad63864a588d\") " pod="openstack/ceilometer-0" Oct 08 19:59:25 crc kubenswrapper[4750]: I1008 19:59:25.257213 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/984335f4-0d17-4391-947f-ad63864a588d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"984335f4-0d17-4391-947f-ad63864a588d\") " pod="openstack/ceilometer-0" Oct 08 19:59:25 crc 
kubenswrapper[4750]: I1008 19:59:25.257600 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/984335f4-0d17-4391-947f-ad63864a588d-run-httpd\") pod \"ceilometer-0\" (UID: \"984335f4-0d17-4391-947f-ad63864a588d\") " pod="openstack/ceilometer-0" Oct 08 19:59:25 crc kubenswrapper[4750]: I1008 19:59:25.263188 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/984335f4-0d17-4391-947f-ad63864a588d-config-data\") pod \"ceilometer-0\" (UID: \"984335f4-0d17-4391-947f-ad63864a588d\") " pod="openstack/ceilometer-0" Oct 08 19:59:25 crc kubenswrapper[4750]: I1008 19:59:25.264956 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/984335f4-0d17-4391-947f-ad63864a588d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"984335f4-0d17-4391-947f-ad63864a588d\") " pod="openstack/ceilometer-0" Oct 08 19:59:25 crc kubenswrapper[4750]: I1008 19:59:25.265096 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/984335f4-0d17-4391-947f-ad63864a588d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"984335f4-0d17-4391-947f-ad63864a588d\") " pod="openstack/ceilometer-0" Oct 08 19:59:25 crc kubenswrapper[4750]: I1008 19:59:25.287111 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/984335f4-0d17-4391-947f-ad63864a588d-scripts\") pod \"ceilometer-0\" (UID: \"984335f4-0d17-4391-947f-ad63864a588d\") " pod="openstack/ceilometer-0" Oct 08 19:59:25 crc kubenswrapper[4750]: I1008 19:59:25.287367 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnvxq\" (UniqueName: \"kubernetes.io/projected/984335f4-0d17-4391-947f-ad63864a588d-kube-api-access-hnvxq\") pod \"ceilometer-0\" (UID: 
\"984335f4-0d17-4391-947f-ad63864a588d\") " pod="openstack/ceilometer-0" Oct 08 19:59:25 crc kubenswrapper[4750]: I1008 19:59:25.319260 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 19:59:25 crc kubenswrapper[4750]: I1008 19:59:25.787707 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"100e9e30-b2db-45f0-afe1-785b450f8382","Type":"ContainerStarted","Data":"f1fad2a39cedbdcd76d07c68535eb74ffb18cb4875b2a882353ce4525cb2aa53"} Oct 08 19:59:25 crc kubenswrapper[4750]: I1008 19:59:25.895847 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 19:59:25 crc kubenswrapper[4750]: W1008 19:59:25.901717 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod984335f4_0d17_4391_947f_ad63864a588d.slice/crio-fa73753f9713902d3ef1965f151cd86733fac45d2a5e8236ed4200f55d333667 WatchSource:0}: Error finding container fa73753f9713902d3ef1965f151cd86733fac45d2a5e8236ed4200f55d333667: Status 404 returned error can't find the container with id fa73753f9713902d3ef1965f151cd86733fac45d2a5e8236ed4200f55d333667 Oct 08 19:59:26 crc kubenswrapper[4750]: I1008 19:59:26.751240 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69935543-c19c-4c1f-9f4c-d17ac44f6597" path="/var/lib/kubelet/pods/69935543-c19c-4c1f-9f4c-d17ac44f6597/volumes" Oct 08 19:59:26 crc kubenswrapper[4750]: I1008 19:59:26.799592 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"984335f4-0d17-4391-947f-ad63864a588d","Type":"ContainerStarted","Data":"fa73753f9713902d3ef1965f151cd86733fac45d2a5e8236ed4200f55d333667"} Oct 08 19:59:27 crc kubenswrapper[4750]: I1008 19:59:27.817273 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"984335f4-0d17-4391-947f-ad63864a588d","Type":"ContainerStarted","Data":"f28dc7724dd16d1fb2c2e4dddfee77cb63e35ebae6fec7c75db3e0bd717960c3"} Oct 08 19:59:27 crc kubenswrapper[4750]: I1008 19:59:27.823513 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"100e9e30-b2db-45f0-afe1-785b450f8382","Type":"ContainerStarted","Data":"f7a1659d03dfc873bc1879af75be1ab56bb3aa41934e06a637f8b89efdbf797e"} Oct 08 19:59:27 crc kubenswrapper[4750]: I1008 19:59:27.866760 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.754974288 podStartE2EDuration="8.866731313s" podCreationTimestamp="2025-10-08 19:59:19 +0000 UTC" firstStartedPulling="2025-10-08 19:59:20.886725928 +0000 UTC m=+6516.799696941" lastFinishedPulling="2025-10-08 19:59:26.998482953 +0000 UTC m=+6522.911453966" observedRunningTime="2025-10-08 19:59:27.855196377 +0000 UTC m=+6523.768167410" watchObservedRunningTime="2025-10-08 19:59:27.866731313 +0000 UTC m=+6523.779702326" Oct 08 19:59:28 crc kubenswrapper[4750]: I1008 19:59:28.841767 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"984335f4-0d17-4391-947f-ad63864a588d","Type":"ContainerStarted","Data":"51153e661b22af3033fb9e65f42723bf37fc243aa67da47cfb452d7f299d7be3"} Oct 08 19:59:28 crc kubenswrapper[4750]: I1008 19:59:28.842226 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"984335f4-0d17-4391-947f-ad63864a588d","Type":"ContainerStarted","Data":"713a3218dbba5e623009747565c03b9cefa51c9e9befe1dbf96bd60fc03acd49"} Oct 08 19:59:29 crc kubenswrapper[4750]: I1008 19:59:29.706819 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Oct 08 19:59:29 crc kubenswrapper[4750]: I1008 19:59:29.706880 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 19:59:30 crc kubenswrapper[4750]: I1008 19:59:30.865262 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"984335f4-0d17-4391-947f-ad63864a588d","Type":"ContainerStarted","Data":"3bae262e151a97ec5366da9835e2a1c6a5bdbdb465bc3619ffefd34bf78e9426"} Oct 08 19:59:30 crc kubenswrapper[4750]: I1008 19:59:30.866221 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 08 19:59:34 crc kubenswrapper[4750]: I1008 19:59:34.267807 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=5.908426973 podStartE2EDuration="10.267759137s" podCreationTimestamp="2025-10-08 19:59:24 +0000 UTC" firstStartedPulling="2025-10-08 19:59:25.907640955 +0000 UTC m=+6521.820611968" lastFinishedPulling="2025-10-08 19:59:30.266973079 +0000 UTC m=+6526.179944132" observedRunningTime="2025-10-08 19:59:30.903694918 +0000 UTC m=+6526.816665951" watchObservedRunningTime="2025-10-08 19:59:34.267759137 +0000 UTC m=+6530.180730170" Oct 08 19:59:34 crc kubenswrapper[4750]: I1008 19:59:34.274148 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-7chht"] Oct 08 19:59:34 crc kubenswrapper[4750]: I1008 19:59:34.275774 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-7chht" Oct 08 19:59:34 crc kubenswrapper[4750]: I1008 19:59:34.291021 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-7chht"] Oct 08 19:59:34 crc kubenswrapper[4750]: I1008 19:59:34.421752 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4q6mb\" (UniqueName: \"kubernetes.io/projected/5173b754-515f-46f1-82bb-9376da88bb9b-kube-api-access-4q6mb\") pod \"manila-db-create-7chht\" (UID: \"5173b754-515f-46f1-82bb-9376da88bb9b\") " pod="openstack/manila-db-create-7chht" Oct 08 19:59:34 crc kubenswrapper[4750]: I1008 19:59:34.523885 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4q6mb\" (UniqueName: \"kubernetes.io/projected/5173b754-515f-46f1-82bb-9376da88bb9b-kube-api-access-4q6mb\") pod \"manila-db-create-7chht\" (UID: \"5173b754-515f-46f1-82bb-9376da88bb9b\") " pod="openstack/manila-db-create-7chht" Oct 08 19:59:34 crc kubenswrapper[4750]: I1008 19:59:34.550810 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4q6mb\" (UniqueName: \"kubernetes.io/projected/5173b754-515f-46f1-82bb-9376da88bb9b-kube-api-access-4q6mb\") pod \"manila-db-create-7chht\" (UID: \"5173b754-515f-46f1-82bb-9376da88bb9b\") " pod="openstack/manila-db-create-7chht" Oct 08 19:59:34 crc kubenswrapper[4750]: I1008 19:59:34.626749 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-7chht" Oct 08 19:59:35 crc kubenswrapper[4750]: I1008 19:59:35.259301 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-7chht"] Oct 08 19:59:35 crc kubenswrapper[4750]: I1008 19:59:35.928643 4750 generic.go:334] "Generic (PLEG): container finished" podID="5173b754-515f-46f1-82bb-9376da88bb9b" containerID="0c1dc69f437d5597a09943d41a48cb08b27e352bcae2b7a5be9b32c77e61092b" exitCode=0 Oct 08 19:59:35 crc kubenswrapper[4750]: I1008 19:59:35.928776 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-7chht" event={"ID":"5173b754-515f-46f1-82bb-9376da88bb9b","Type":"ContainerDied","Data":"0c1dc69f437d5597a09943d41a48cb08b27e352bcae2b7a5be9b32c77e61092b"} Oct 08 19:59:35 crc kubenswrapper[4750]: I1008 19:59:35.929149 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-7chht" event={"ID":"5173b754-515f-46f1-82bb-9376da88bb9b","Type":"ContainerStarted","Data":"9871a19b84bcfc44251e010e6444958467e269bf3cf7f58f5c36a235c0487e4f"} Oct 08 19:59:37 crc kubenswrapper[4750]: I1008 19:59:37.441454 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-7chht" Oct 08 19:59:37 crc kubenswrapper[4750]: I1008 19:59:37.607448 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4q6mb\" (UniqueName: \"kubernetes.io/projected/5173b754-515f-46f1-82bb-9376da88bb9b-kube-api-access-4q6mb\") pod \"5173b754-515f-46f1-82bb-9376da88bb9b\" (UID: \"5173b754-515f-46f1-82bb-9376da88bb9b\") " Oct 08 19:59:37 crc kubenswrapper[4750]: I1008 19:59:37.615865 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5173b754-515f-46f1-82bb-9376da88bb9b-kube-api-access-4q6mb" (OuterVolumeSpecName: "kube-api-access-4q6mb") pod "5173b754-515f-46f1-82bb-9376da88bb9b" (UID: "5173b754-515f-46f1-82bb-9376da88bb9b"). InnerVolumeSpecName "kube-api-access-4q6mb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:59:37 crc kubenswrapper[4750]: I1008 19:59:37.710456 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4q6mb\" (UniqueName: \"kubernetes.io/projected/5173b754-515f-46f1-82bb-9376da88bb9b-kube-api-access-4q6mb\") on node \"crc\" DevicePath \"\"" Oct 08 19:59:37 crc kubenswrapper[4750]: I1008 19:59:37.954682 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-7chht" event={"ID":"5173b754-515f-46f1-82bb-9376da88bb9b","Type":"ContainerDied","Data":"9871a19b84bcfc44251e010e6444958467e269bf3cf7f58f5c36a235c0487e4f"} Oct 08 19:59:37 crc kubenswrapper[4750]: I1008 19:59:37.954730 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-7chht" Oct 08 19:59:37 crc kubenswrapper[4750]: I1008 19:59:37.954751 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9871a19b84bcfc44251e010e6444958467e269bf3cf7f58f5c36a235c0487e4f" Oct 08 19:59:44 crc kubenswrapper[4750]: I1008 19:59:44.376629 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-8e2a-account-create-lrccx"] Oct 08 19:59:44 crc kubenswrapper[4750]: E1008 19:59:44.377975 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5173b754-515f-46f1-82bb-9376da88bb9b" containerName="mariadb-database-create" Oct 08 19:59:44 crc kubenswrapper[4750]: I1008 19:59:44.377993 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="5173b754-515f-46f1-82bb-9376da88bb9b" containerName="mariadb-database-create" Oct 08 19:59:44 crc kubenswrapper[4750]: I1008 19:59:44.378351 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="5173b754-515f-46f1-82bb-9376da88bb9b" containerName="mariadb-database-create" Oct 08 19:59:44 crc kubenswrapper[4750]: I1008 19:59:44.379283 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-8e2a-account-create-lrccx" Oct 08 19:59:44 crc kubenswrapper[4750]: I1008 19:59:44.382662 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Oct 08 19:59:44 crc kubenswrapper[4750]: I1008 19:59:44.389113 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-8e2a-account-create-lrccx"] Oct 08 19:59:44 crc kubenswrapper[4750]: I1008 19:59:44.486652 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wrxn\" (UniqueName: \"kubernetes.io/projected/bdc6fd03-081e-4966-9c71-c6e937ad4fca-kube-api-access-4wrxn\") pod \"manila-8e2a-account-create-lrccx\" (UID: \"bdc6fd03-081e-4966-9c71-c6e937ad4fca\") " pod="openstack/manila-8e2a-account-create-lrccx" Oct 08 19:59:44 crc kubenswrapper[4750]: I1008 19:59:44.590075 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wrxn\" (UniqueName: \"kubernetes.io/projected/bdc6fd03-081e-4966-9c71-c6e937ad4fca-kube-api-access-4wrxn\") pod \"manila-8e2a-account-create-lrccx\" (UID: \"bdc6fd03-081e-4966-9c71-c6e937ad4fca\") " pod="openstack/manila-8e2a-account-create-lrccx" Oct 08 19:59:44 crc kubenswrapper[4750]: I1008 19:59:44.618543 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wrxn\" (UniqueName: \"kubernetes.io/projected/bdc6fd03-081e-4966-9c71-c6e937ad4fca-kube-api-access-4wrxn\") pod \"manila-8e2a-account-create-lrccx\" (UID: \"bdc6fd03-081e-4966-9c71-c6e937ad4fca\") " pod="openstack/manila-8e2a-account-create-lrccx" Oct 08 19:59:44 crc kubenswrapper[4750]: I1008 19:59:44.717056 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-8e2a-account-create-lrccx" Oct 08 19:59:45 crc kubenswrapper[4750]: I1008 19:59:45.253367 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-8e2a-account-create-lrccx"] Oct 08 19:59:45 crc kubenswrapper[4750]: W1008 19:59:45.263405 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbdc6fd03_081e_4966_9c71_c6e937ad4fca.slice/crio-4730392dc07dc725c7de5434b074ec8f115f2dc91de43cb8f63f0beedde4847b WatchSource:0}: Error finding container 4730392dc07dc725c7de5434b074ec8f115f2dc91de43cb8f63f0beedde4847b: Status 404 returned error can't find the container with id 4730392dc07dc725c7de5434b074ec8f115f2dc91de43cb8f63f0beedde4847b Oct 08 19:59:45 crc kubenswrapper[4750]: I1008 19:59:45.271216 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Oct 08 19:59:46 crc kubenswrapper[4750]: I1008 19:59:46.084905 4750 generic.go:334] "Generic (PLEG): container finished" podID="bdc6fd03-081e-4966-9c71-c6e937ad4fca" containerID="173478bda2acf0ec781d4ccb33aacf7e141a6cad10975743093c0bf1188e6298" exitCode=0 Oct 08 19:59:46 crc kubenswrapper[4750]: I1008 19:59:46.084994 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-8e2a-account-create-lrccx" event={"ID":"bdc6fd03-081e-4966-9c71-c6e937ad4fca","Type":"ContainerDied","Data":"173478bda2acf0ec781d4ccb33aacf7e141a6cad10975743093c0bf1188e6298"} Oct 08 19:59:46 crc kubenswrapper[4750]: I1008 19:59:46.085081 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-8e2a-account-create-lrccx" event={"ID":"bdc6fd03-081e-4966-9c71-c6e937ad4fca","Type":"ContainerStarted","Data":"4730392dc07dc725c7de5434b074ec8f115f2dc91de43cb8f63f0beedde4847b"} Oct 08 19:59:47 crc kubenswrapper[4750]: I1008 19:59:47.631287 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-8e2a-account-create-lrccx" Oct 08 19:59:47 crc kubenswrapper[4750]: I1008 19:59:47.768181 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wrxn\" (UniqueName: \"kubernetes.io/projected/bdc6fd03-081e-4966-9c71-c6e937ad4fca-kube-api-access-4wrxn\") pod \"bdc6fd03-081e-4966-9c71-c6e937ad4fca\" (UID: \"bdc6fd03-081e-4966-9c71-c6e937ad4fca\") " Oct 08 19:59:47 crc kubenswrapper[4750]: I1008 19:59:47.779444 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdc6fd03-081e-4966-9c71-c6e937ad4fca-kube-api-access-4wrxn" (OuterVolumeSpecName: "kube-api-access-4wrxn") pod "bdc6fd03-081e-4966-9c71-c6e937ad4fca" (UID: "bdc6fd03-081e-4966-9c71-c6e937ad4fca"). InnerVolumeSpecName "kube-api-access-4wrxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 19:59:47 crc kubenswrapper[4750]: I1008 19:59:47.871699 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wrxn\" (UniqueName: \"kubernetes.io/projected/bdc6fd03-081e-4966-9c71-c6e937ad4fca-kube-api-access-4wrxn\") on node \"crc\" DevicePath \"\"" Oct 08 19:59:48 crc kubenswrapper[4750]: I1008 19:59:48.117171 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-8e2a-account-create-lrccx" event={"ID":"bdc6fd03-081e-4966-9c71-c6e937ad4fca","Type":"ContainerDied","Data":"4730392dc07dc725c7de5434b074ec8f115f2dc91de43cb8f63f0beedde4847b"} Oct 08 19:59:48 crc kubenswrapper[4750]: I1008 19:59:48.117239 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-8e2a-account-create-lrccx" Oct 08 19:59:48 crc kubenswrapper[4750]: I1008 19:59:48.117254 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4730392dc07dc725c7de5434b074ec8f115f2dc91de43cb8f63f0beedde4847b" Oct 08 19:59:49 crc kubenswrapper[4750]: I1008 19:59:49.666406 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-rslm8"] Oct 08 19:59:49 crc kubenswrapper[4750]: E1008 19:59:49.670815 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdc6fd03-081e-4966-9c71-c6e937ad4fca" containerName="mariadb-account-create" Oct 08 19:59:49 crc kubenswrapper[4750]: I1008 19:59:49.670854 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdc6fd03-081e-4966-9c71-c6e937ad4fca" containerName="mariadb-account-create" Oct 08 19:59:49 crc kubenswrapper[4750]: I1008 19:59:49.671365 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdc6fd03-081e-4966-9c71-c6e937ad4fca" containerName="mariadb-account-create" Oct 08 19:59:49 crc kubenswrapper[4750]: I1008 19:59:49.672538 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-rslm8" Oct 08 19:59:49 crc kubenswrapper[4750]: I1008 19:59:49.677388 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-zv9fz" Oct 08 19:59:49 crc kubenswrapper[4750]: I1008 19:59:49.677627 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Oct 08 19:59:49 crc kubenswrapper[4750]: I1008 19:59:49.691974 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-rslm8"] Oct 08 19:59:49 crc kubenswrapper[4750]: I1008 19:59:49.820394 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/ba387095-0fec-48ef-8bc2-05e8a5368d0b-job-config-data\") pod \"manila-db-sync-rslm8\" (UID: \"ba387095-0fec-48ef-8bc2-05e8a5368d0b\") " pod="openstack/manila-db-sync-rslm8" Oct 08 19:59:49 crc kubenswrapper[4750]: I1008 19:59:49.820614 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba387095-0fec-48ef-8bc2-05e8a5368d0b-config-data\") pod \"manila-db-sync-rslm8\" (UID: \"ba387095-0fec-48ef-8bc2-05e8a5368d0b\") " pod="openstack/manila-db-sync-rslm8" Oct 08 19:59:49 crc kubenswrapper[4750]: I1008 19:59:49.820655 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tkk2\" (UniqueName: \"kubernetes.io/projected/ba387095-0fec-48ef-8bc2-05e8a5368d0b-kube-api-access-2tkk2\") pod \"manila-db-sync-rslm8\" (UID: \"ba387095-0fec-48ef-8bc2-05e8a5368d0b\") " pod="openstack/manila-db-sync-rslm8" Oct 08 19:59:49 crc kubenswrapper[4750]: I1008 19:59:49.820724 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ba387095-0fec-48ef-8bc2-05e8a5368d0b-combined-ca-bundle\") pod \"manila-db-sync-rslm8\" (UID: \"ba387095-0fec-48ef-8bc2-05e8a5368d0b\") " pod="openstack/manila-db-sync-rslm8" Oct 08 19:59:49 crc kubenswrapper[4750]: I1008 19:59:49.923142 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba387095-0fec-48ef-8bc2-05e8a5368d0b-config-data\") pod \"manila-db-sync-rslm8\" (UID: \"ba387095-0fec-48ef-8bc2-05e8a5368d0b\") " pod="openstack/manila-db-sync-rslm8" Oct 08 19:59:49 crc kubenswrapper[4750]: I1008 19:59:49.923201 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tkk2\" (UniqueName: \"kubernetes.io/projected/ba387095-0fec-48ef-8bc2-05e8a5368d0b-kube-api-access-2tkk2\") pod \"manila-db-sync-rslm8\" (UID: \"ba387095-0fec-48ef-8bc2-05e8a5368d0b\") " pod="openstack/manila-db-sync-rslm8" Oct 08 19:59:49 crc kubenswrapper[4750]: I1008 19:59:49.923240 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba387095-0fec-48ef-8bc2-05e8a5368d0b-combined-ca-bundle\") pod \"manila-db-sync-rslm8\" (UID: \"ba387095-0fec-48ef-8bc2-05e8a5368d0b\") " pod="openstack/manila-db-sync-rslm8" Oct 08 19:59:49 crc kubenswrapper[4750]: I1008 19:59:49.923389 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/ba387095-0fec-48ef-8bc2-05e8a5368d0b-job-config-data\") pod \"manila-db-sync-rslm8\" (UID: \"ba387095-0fec-48ef-8bc2-05e8a5368d0b\") " pod="openstack/manila-db-sync-rslm8" Oct 08 19:59:49 crc kubenswrapper[4750]: I1008 19:59:49.939163 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/ba387095-0fec-48ef-8bc2-05e8a5368d0b-job-config-data\") pod \"manila-db-sync-rslm8\" (UID: 
\"ba387095-0fec-48ef-8bc2-05e8a5368d0b\") " pod="openstack/manila-db-sync-rslm8" Oct 08 19:59:49 crc kubenswrapper[4750]: I1008 19:59:49.939163 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba387095-0fec-48ef-8bc2-05e8a5368d0b-combined-ca-bundle\") pod \"manila-db-sync-rslm8\" (UID: \"ba387095-0fec-48ef-8bc2-05e8a5368d0b\") " pod="openstack/manila-db-sync-rslm8" Oct 08 19:59:49 crc kubenswrapper[4750]: I1008 19:59:49.939401 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba387095-0fec-48ef-8bc2-05e8a5368d0b-config-data\") pod \"manila-db-sync-rslm8\" (UID: \"ba387095-0fec-48ef-8bc2-05e8a5368d0b\") " pod="openstack/manila-db-sync-rslm8" Oct 08 19:59:49 crc kubenswrapper[4750]: I1008 19:59:49.942692 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tkk2\" (UniqueName: \"kubernetes.io/projected/ba387095-0fec-48ef-8bc2-05e8a5368d0b-kube-api-access-2tkk2\") pod \"manila-db-sync-rslm8\" (UID: \"ba387095-0fec-48ef-8bc2-05e8a5368d0b\") " pod="openstack/manila-db-sync-rslm8" Oct 08 19:59:50 crc kubenswrapper[4750]: I1008 19:59:50.018005 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-rslm8" Oct 08 19:59:51 crc kubenswrapper[4750]: I1008 19:59:51.029250 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-rslm8"] Oct 08 19:59:51 crc kubenswrapper[4750]: W1008 19:59:51.035470 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba387095_0fec_48ef_8bc2_05e8a5368d0b.slice/crio-8024e76a74872390ed2d2f222744985a9a0c347168c09290bbd5f08e301a4722 WatchSource:0}: Error finding container 8024e76a74872390ed2d2f222744985a9a0c347168c09290bbd5f08e301a4722: Status 404 returned error can't find the container with id 8024e76a74872390ed2d2f222744985a9a0c347168c09290bbd5f08e301a4722 Oct 08 19:59:51 crc kubenswrapper[4750]: I1008 19:59:51.167325 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-rslm8" event={"ID":"ba387095-0fec-48ef-8bc2-05e8a5368d0b","Type":"ContainerStarted","Data":"8024e76a74872390ed2d2f222744985a9a0c347168c09290bbd5f08e301a4722"} Oct 08 19:59:55 crc kubenswrapper[4750]: I1008 19:59:55.330327 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 08 19:59:56 crc kubenswrapper[4750]: I1008 19:59:56.234943 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-rslm8" event={"ID":"ba387095-0fec-48ef-8bc2-05e8a5368d0b","Type":"ContainerStarted","Data":"c28ba5ec12915cff3189165cc711584c0f28c1c7de3a0b38af7f95c2442926d0"} Oct 08 19:59:56 crc kubenswrapper[4750]: I1008 19:59:56.284138 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-rslm8" podStartSLOduration=2.9169961239999997 podStartE2EDuration="7.28410464s" podCreationTimestamp="2025-10-08 19:59:49 +0000 UTC" firstStartedPulling="2025-10-08 19:59:51.041625338 +0000 UTC m=+6546.954596341" lastFinishedPulling="2025-10-08 19:59:55.408733844 +0000 UTC m=+6551.321704857" 
observedRunningTime="2025-10-08 19:59:56.263643463 +0000 UTC m=+6552.176614536" watchObservedRunningTime="2025-10-08 19:59:56.28410464 +0000 UTC m=+6552.197075723" Oct 08 19:59:58 crc kubenswrapper[4750]: I1008 19:59:58.262686 4750 generic.go:334] "Generic (PLEG): container finished" podID="ba387095-0fec-48ef-8bc2-05e8a5368d0b" containerID="c28ba5ec12915cff3189165cc711584c0f28c1c7de3a0b38af7f95c2442926d0" exitCode=0 Oct 08 19:59:58 crc kubenswrapper[4750]: I1008 19:59:58.262780 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-rslm8" event={"ID":"ba387095-0fec-48ef-8bc2-05e8a5368d0b","Type":"ContainerDied","Data":"c28ba5ec12915cff3189165cc711584c0f28c1c7de3a0b38af7f95c2442926d0"} Oct 08 19:59:59 crc kubenswrapper[4750]: I1008 19:59:59.707148 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 19:59:59 crc kubenswrapper[4750]: I1008 19:59:59.708011 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 19:59:59 crc kubenswrapper[4750]: I1008 19:59:59.708085 4750 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-grddb" Oct 08 19:59:59 crc kubenswrapper[4750]: I1008 19:59:59.709367 4750 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"166b04b6c456f116d6b504a63f511a318627a6cec196a1300ad0582a72b88cf9"} 
pod="openshift-machine-config-operator/machine-config-daemon-grddb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 19:59:59 crc kubenswrapper[4750]: I1008 19:59:59.709437 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" containerID="cri-o://166b04b6c456f116d6b504a63f511a318627a6cec196a1300ad0582a72b88cf9" gracePeriod=600 Oct 08 19:59:59 crc kubenswrapper[4750]: E1008 19:59:59.848229 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 19:59:59 crc kubenswrapper[4750]: I1008 19:59:59.889493 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-rslm8" Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.014369 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba387095-0fec-48ef-8bc2-05e8a5368d0b-config-data\") pod \"ba387095-0fec-48ef-8bc2-05e8a5368d0b\" (UID: \"ba387095-0fec-48ef-8bc2-05e8a5368d0b\") " Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.014570 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tkk2\" (UniqueName: \"kubernetes.io/projected/ba387095-0fec-48ef-8bc2-05e8a5368d0b-kube-api-access-2tkk2\") pod \"ba387095-0fec-48ef-8bc2-05e8a5368d0b\" (UID: \"ba387095-0fec-48ef-8bc2-05e8a5368d0b\") " Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.014651 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/ba387095-0fec-48ef-8bc2-05e8a5368d0b-job-config-data\") pod \"ba387095-0fec-48ef-8bc2-05e8a5368d0b\" (UID: \"ba387095-0fec-48ef-8bc2-05e8a5368d0b\") " Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.014808 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba387095-0fec-48ef-8bc2-05e8a5368d0b-combined-ca-bundle\") pod \"ba387095-0fec-48ef-8bc2-05e8a5368d0b\" (UID: \"ba387095-0fec-48ef-8bc2-05e8a5368d0b\") " Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.021482 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba387095-0fec-48ef-8bc2-05e8a5368d0b-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "ba387095-0fec-48ef-8bc2-05e8a5368d0b" (UID: "ba387095-0fec-48ef-8bc2-05e8a5368d0b"). InnerVolumeSpecName "job-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.021718 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba387095-0fec-48ef-8bc2-05e8a5368d0b-kube-api-access-2tkk2" (OuterVolumeSpecName: "kube-api-access-2tkk2") pod "ba387095-0fec-48ef-8bc2-05e8a5368d0b" (UID: "ba387095-0fec-48ef-8bc2-05e8a5368d0b"). InnerVolumeSpecName "kube-api-access-2tkk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.025395 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba387095-0fec-48ef-8bc2-05e8a5368d0b-config-data" (OuterVolumeSpecName: "config-data") pod "ba387095-0fec-48ef-8bc2-05e8a5368d0b" (UID: "ba387095-0fec-48ef-8bc2-05e8a5368d0b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.050465 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba387095-0fec-48ef-8bc2-05e8a5368d0b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba387095-0fec-48ef-8bc2-05e8a5368d0b" (UID: "ba387095-0fec-48ef-8bc2-05e8a5368d0b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.117827 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tkk2\" (UniqueName: \"kubernetes.io/projected/ba387095-0fec-48ef-8bc2-05e8a5368d0b-kube-api-access-2tkk2\") on node \"crc\" DevicePath \"\"" Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.118128 4750 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/ba387095-0fec-48ef-8bc2-05e8a5368d0b-job-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.118145 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba387095-0fec-48ef-8bc2-05e8a5368d0b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.118158 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba387095-0fec-48ef-8bc2-05e8a5368d0b-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.165859 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332560-zxpqj"] Oct 08 20:00:00 crc kubenswrapper[4750]: E1008 20:00:00.166531 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba387095-0fec-48ef-8bc2-05e8a5368d0b" containerName="manila-db-sync" Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.166634 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba387095-0fec-48ef-8bc2-05e8a5368d0b" containerName="manila-db-sync" Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.166950 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba387095-0fec-48ef-8bc2-05e8a5368d0b" containerName="manila-db-sync" Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.168085 4750 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332560-zxpqj" Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.171995 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.172043 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.180528 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332560-zxpqj"] Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.288046 4750 generic.go:334] "Generic (PLEG): container finished" podID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerID="166b04b6c456f116d6b504a63f511a318627a6cec196a1300ad0582a72b88cf9" exitCode=0 Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.288139 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerDied","Data":"166b04b6c456f116d6b504a63f511a318627a6cec196a1300ad0582a72b88cf9"} Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.288502 4750 scope.go:117] "RemoveContainer" containerID="adabf6821006effe51695ae643fb86b44dd24b6d6a52aca3ce60c41928a0f63e" Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.289685 4750 scope.go:117] "RemoveContainer" containerID="166b04b6c456f116d6b504a63f511a318627a6cec196a1300ad0582a72b88cf9" Oct 08 20:00:00 crc kubenswrapper[4750]: E1008 20:00:00.290186 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.291578 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-rslm8" event={"ID":"ba387095-0fec-48ef-8bc2-05e8a5368d0b","Type":"ContainerDied","Data":"8024e76a74872390ed2d2f222744985a9a0c347168c09290bbd5f08e301a4722"} Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.291632 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8024e76a74872390ed2d2f222744985a9a0c347168c09290bbd5f08e301a4722" Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.291721 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-rslm8" Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.322669 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f89b3559-369e-49da-962f-a515794c7078-config-volume\") pod \"collect-profiles-29332560-zxpqj\" (UID: \"f89b3559-369e-49da-962f-a515794c7078\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332560-zxpqj" Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.322946 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f89b3559-369e-49da-962f-a515794c7078-secret-volume\") pod \"collect-profiles-29332560-zxpqj\" (UID: \"f89b3559-369e-49da-962f-a515794c7078\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332560-zxpqj" Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.322988 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t4kh\" 
(UniqueName: \"kubernetes.io/projected/f89b3559-369e-49da-962f-a515794c7078-kube-api-access-2t4kh\") pod \"collect-profiles-29332560-zxpqj\" (UID: \"f89b3559-369e-49da-962f-a515794c7078\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332560-zxpqj" Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.425971 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f89b3559-369e-49da-962f-a515794c7078-secret-volume\") pod \"collect-profiles-29332560-zxpqj\" (UID: \"f89b3559-369e-49da-962f-a515794c7078\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332560-zxpqj" Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.426034 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2t4kh\" (UniqueName: \"kubernetes.io/projected/f89b3559-369e-49da-962f-a515794c7078-kube-api-access-2t4kh\") pod \"collect-profiles-29332560-zxpqj\" (UID: \"f89b3559-369e-49da-962f-a515794c7078\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332560-zxpqj" Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.426067 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f89b3559-369e-49da-962f-a515794c7078-config-volume\") pod \"collect-profiles-29332560-zxpqj\" (UID: \"f89b3559-369e-49da-962f-a515794c7078\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332560-zxpqj" Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.427245 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f89b3559-369e-49da-962f-a515794c7078-config-volume\") pod \"collect-profiles-29332560-zxpqj\" (UID: \"f89b3559-369e-49da-962f-a515794c7078\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332560-zxpqj" Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 
20:00:00.435123 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f89b3559-369e-49da-962f-a515794c7078-secret-volume\") pod \"collect-profiles-29332560-zxpqj\" (UID: \"f89b3559-369e-49da-962f-a515794c7078\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332560-zxpqj" Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.450319 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t4kh\" (UniqueName: \"kubernetes.io/projected/f89b3559-369e-49da-962f-a515794c7078-kube-api-access-2t4kh\") pod \"collect-profiles-29332560-zxpqj\" (UID: \"f89b3559-369e-49da-962f-a515794c7078\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332560-zxpqj" Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.493341 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332560-zxpqj" Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.687206 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.689702 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.692780 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.693729 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.693914 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.694231 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-zv9fz" Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.849160 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.883964 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d539a620-0411-4bc6-8fbd-9aa900497424-scripts\") pod \"manila-scheduler-0\" (UID: \"d539a620-0411-4bc6-8fbd-9aa900497424\") " pod="openstack/manila-scheduler-0" Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.884442 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d539a620-0411-4bc6-8fbd-9aa900497424-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"d539a620-0411-4bc6-8fbd-9aa900497424\") " pod="openstack/manila-scheduler-0" Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.884787 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ptzt\" (UniqueName: \"kubernetes.io/projected/d539a620-0411-4bc6-8fbd-9aa900497424-kube-api-access-7ptzt\") pod \"manila-scheduler-0\" (UID: 
\"d539a620-0411-4bc6-8fbd-9aa900497424\") " pod="openstack/manila-scheduler-0" Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.884963 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d539a620-0411-4bc6-8fbd-9aa900497424-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"d539a620-0411-4bc6-8fbd-9aa900497424\") " pod="openstack/manila-scheduler-0" Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.885324 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d539a620-0411-4bc6-8fbd-9aa900497424-config-data\") pod \"manila-scheduler-0\" (UID: \"d539a620-0411-4bc6-8fbd-9aa900497424\") " pod="openstack/manila-scheduler-0" Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.885427 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d539a620-0411-4bc6-8fbd-9aa900497424-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"d539a620-0411-4bc6-8fbd-9aa900497424\") " pod="openstack/manila-scheduler-0" Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.941244 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.950195 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.963397 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.993456 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ptzt\" (UniqueName: \"kubernetes.io/projected/d539a620-0411-4bc6-8fbd-9aa900497424-kube-api-access-7ptzt\") pod \"manila-scheduler-0\" (UID: \"d539a620-0411-4bc6-8fbd-9aa900497424\") " pod="openstack/manila-scheduler-0" Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.993600 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d539a620-0411-4bc6-8fbd-9aa900497424-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"d539a620-0411-4bc6-8fbd-9aa900497424\") " pod="openstack/manila-scheduler-0" Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.993706 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d539a620-0411-4bc6-8fbd-9aa900497424-config-data\") pod \"manila-scheduler-0\" (UID: \"d539a620-0411-4bc6-8fbd-9aa900497424\") " pod="openstack/manila-scheduler-0" Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.993748 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d539a620-0411-4bc6-8fbd-9aa900497424-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"d539a620-0411-4bc6-8fbd-9aa900497424\") " pod="openstack/manila-scheduler-0" Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.993838 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d539a620-0411-4bc6-8fbd-9aa900497424-scripts\") pod \"manila-scheduler-0\" (UID: 
\"d539a620-0411-4bc6-8fbd-9aa900497424\") " pod="openstack/manila-scheduler-0" Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.993869 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d539a620-0411-4bc6-8fbd-9aa900497424-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"d539a620-0411-4bc6-8fbd-9aa900497424\") " pod="openstack/manila-scheduler-0" Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.995736 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 08 20:00:00 crc kubenswrapper[4750]: I1008 20:00:00.995892 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d539a620-0411-4bc6-8fbd-9aa900497424-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"d539a620-0411-4bc6-8fbd-9aa900497424\") " pod="openstack/manila-scheduler-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.005731 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d539a620-0411-4bc6-8fbd-9aa900497424-scripts\") pod \"manila-scheduler-0\" (UID: \"d539a620-0411-4bc6-8fbd-9aa900497424\") " pod="openstack/manila-scheduler-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.011232 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-ff4557c77-7lrm6"] Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.018865 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-ff4557c77-7lrm6" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.033181 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d539a620-0411-4bc6-8fbd-9aa900497424-config-data\") pod \"manila-scheduler-0\" (UID: \"d539a620-0411-4bc6-8fbd-9aa900497424\") " pod="openstack/manila-scheduler-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.036646 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d539a620-0411-4bc6-8fbd-9aa900497424-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"d539a620-0411-4bc6-8fbd-9aa900497424\") " pod="openstack/manila-scheduler-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.037159 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d539a620-0411-4bc6-8fbd-9aa900497424-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"d539a620-0411-4bc6-8fbd-9aa900497424\") " pod="openstack/manila-scheduler-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.039511 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-ff4557c77-7lrm6"] Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.067114 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ptzt\" (UniqueName: \"kubernetes.io/projected/d539a620-0411-4bc6-8fbd-9aa900497424-kube-api-access-7ptzt\") pod \"manila-scheduler-0\" (UID: \"d539a620-0411-4bc6-8fbd-9aa900497424\") " pod="openstack/manila-scheduler-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.104344 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b31f8130-a585-4549-ae4c-1b68c0f8fbe9-dns-svc\") pod \"dnsmasq-dns-ff4557c77-7lrm6\" (UID: 
\"b31f8130-a585-4549-ae4c-1b68c0f8fbe9\") " pod="openstack/dnsmasq-dns-ff4557c77-7lrm6" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.104889 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b31f8130-a585-4549-ae4c-1b68c0f8fbe9-ovsdbserver-sb\") pod \"dnsmasq-dns-ff4557c77-7lrm6\" (UID: \"b31f8130-a585-4549-ae4c-1b68c0f8fbe9\") " pod="openstack/dnsmasq-dns-ff4557c77-7lrm6" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.104927 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f751df5d-800a-44c8-989f-75cc7face178-scripts\") pod \"manila-share-share1-0\" (UID: \"f751df5d-800a-44c8-989f-75cc7face178\") " pod="openstack/manila-share-share1-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.104951 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f751df5d-800a-44c8-989f-75cc7face178-config-data\") pod \"manila-share-share1-0\" (UID: \"f751df5d-800a-44c8-989f-75cc7face178\") " pod="openstack/manila-share-share1-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.116738 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b31f8130-a585-4549-ae4c-1b68c0f8fbe9-config\") pod \"dnsmasq-dns-ff4557c77-7lrm6\" (UID: \"b31f8130-a585-4549-ae4c-1b68c0f8fbe9\") " pod="openstack/dnsmasq-dns-ff4557c77-7lrm6" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.116802 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/f751df5d-800a-44c8-989f-75cc7face178-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"f751df5d-800a-44c8-989f-75cc7face178\") 
" pod="openstack/manila-share-share1-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.117093 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f751df5d-800a-44c8-989f-75cc7face178-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"f751df5d-800a-44c8-989f-75cc7face178\") " pod="openstack/manila-share-share1-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.117126 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f751df5d-800a-44c8-989f-75cc7face178-ceph\") pod \"manila-share-share1-0\" (UID: \"f751df5d-800a-44c8-989f-75cc7face178\") " pod="openstack/manila-share-share1-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.117167 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f751df5d-800a-44c8-989f-75cc7face178-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"f751df5d-800a-44c8-989f-75cc7face178\") " pod="openstack/manila-share-share1-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.117193 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f751df5d-800a-44c8-989f-75cc7face178-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"f751df5d-800a-44c8-989f-75cc7face178\") " pod="openstack/manila-share-share1-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.117222 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77mgz\" (UniqueName: \"kubernetes.io/projected/b31f8130-a585-4549-ae4c-1b68c0f8fbe9-kube-api-access-77mgz\") pod \"dnsmasq-dns-ff4557c77-7lrm6\" (UID: \"b31f8130-a585-4549-ae4c-1b68c0f8fbe9\") " 
pod="openstack/dnsmasq-dns-ff4557c77-7lrm6" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.117271 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpm4q\" (UniqueName: \"kubernetes.io/projected/f751df5d-800a-44c8-989f-75cc7face178-kube-api-access-mpm4q\") pod \"manila-share-share1-0\" (UID: \"f751df5d-800a-44c8-989f-75cc7face178\") " pod="openstack/manila-share-share1-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.117301 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b31f8130-a585-4549-ae4c-1b68c0f8fbe9-ovsdbserver-nb\") pod \"dnsmasq-dns-ff4557c77-7lrm6\" (UID: \"b31f8130-a585-4549-ae4c-1b68c0f8fbe9\") " pod="openstack/dnsmasq-dns-ff4557c77-7lrm6" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.150760 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.153191 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.156884 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.169371 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.221420 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f751df5d-800a-44c8-989f-75cc7face178-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"f751df5d-800a-44c8-989f-75cc7face178\") " pod="openstack/manila-share-share1-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.221496 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f751df5d-800a-44c8-989f-75cc7face178-ceph\") pod \"manila-share-share1-0\" (UID: \"f751df5d-800a-44c8-989f-75cc7face178\") " pod="openstack/manila-share-share1-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.221532 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f751df5d-800a-44c8-989f-75cc7face178-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"f751df5d-800a-44c8-989f-75cc7face178\") " pod="openstack/manila-share-share1-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.221603 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f751df5d-800a-44c8-989f-75cc7face178-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"f751df5d-800a-44c8-989f-75cc7face178\") " pod="openstack/manila-share-share1-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.221628 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-77mgz\" (UniqueName: \"kubernetes.io/projected/b31f8130-a585-4549-ae4c-1b68c0f8fbe9-kube-api-access-77mgz\") pod \"dnsmasq-dns-ff4557c77-7lrm6\" (UID: \"b31f8130-a585-4549-ae4c-1b68c0f8fbe9\") " pod="openstack/dnsmasq-dns-ff4557c77-7lrm6" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.221651 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpm4q\" (UniqueName: \"kubernetes.io/projected/f751df5d-800a-44c8-989f-75cc7face178-kube-api-access-mpm4q\") pod \"manila-share-share1-0\" (UID: \"f751df5d-800a-44c8-989f-75cc7face178\") " pod="openstack/manila-share-share1-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.221670 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b31f8130-a585-4549-ae4c-1b68c0f8fbe9-ovsdbserver-nb\") pod \"dnsmasq-dns-ff4557c77-7lrm6\" (UID: \"b31f8130-a585-4549-ae4c-1b68c0f8fbe9\") " pod="openstack/dnsmasq-dns-ff4557c77-7lrm6" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.221720 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b31f8130-a585-4549-ae4c-1b68c0f8fbe9-dns-svc\") pod \"dnsmasq-dns-ff4557c77-7lrm6\" (UID: \"b31f8130-a585-4549-ae4c-1b68c0f8fbe9\") " pod="openstack/dnsmasq-dns-ff4557c77-7lrm6" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.221784 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b31f8130-a585-4549-ae4c-1b68c0f8fbe9-ovsdbserver-sb\") pod \"dnsmasq-dns-ff4557c77-7lrm6\" (UID: \"b31f8130-a585-4549-ae4c-1b68c0f8fbe9\") " pod="openstack/dnsmasq-dns-ff4557c77-7lrm6" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.221812 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f751df5d-800a-44c8-989f-75cc7face178-scripts\") pod \"manila-share-share1-0\" (UID: \"f751df5d-800a-44c8-989f-75cc7face178\") " pod="openstack/manila-share-share1-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.221844 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f751df5d-800a-44c8-989f-75cc7face178-config-data\") pod \"manila-share-share1-0\" (UID: \"f751df5d-800a-44c8-989f-75cc7face178\") " pod="openstack/manila-share-share1-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.221871 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b31f8130-a585-4549-ae4c-1b68c0f8fbe9-config\") pod \"dnsmasq-dns-ff4557c77-7lrm6\" (UID: \"b31f8130-a585-4549-ae4c-1b68c0f8fbe9\") " pod="openstack/dnsmasq-dns-ff4557c77-7lrm6" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.221903 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/f751df5d-800a-44c8-989f-75cc7face178-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"f751df5d-800a-44c8-989f-75cc7face178\") " pod="openstack/manila-share-share1-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.222222 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/f751df5d-800a-44c8-989f-75cc7face178-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"f751df5d-800a-44c8-989f-75cc7face178\") " pod="openstack/manila-share-share1-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.222405 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f751df5d-800a-44c8-989f-75cc7face178-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"f751df5d-800a-44c8-989f-75cc7face178\") " 
pod="openstack/manila-share-share1-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.223286 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b31f8130-a585-4549-ae4c-1b68c0f8fbe9-ovsdbserver-sb\") pod \"dnsmasq-dns-ff4557c77-7lrm6\" (UID: \"b31f8130-a585-4549-ae4c-1b68c0f8fbe9\") " pod="openstack/dnsmasq-dns-ff4557c77-7lrm6" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.223442 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b31f8130-a585-4549-ae4c-1b68c0f8fbe9-dns-svc\") pod \"dnsmasq-dns-ff4557c77-7lrm6\" (UID: \"b31f8130-a585-4549-ae4c-1b68c0f8fbe9\") " pod="openstack/dnsmasq-dns-ff4557c77-7lrm6" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.224334 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b31f8130-a585-4549-ae4c-1b68c0f8fbe9-config\") pod \"dnsmasq-dns-ff4557c77-7lrm6\" (UID: \"b31f8130-a585-4549-ae4c-1b68c0f8fbe9\") " pod="openstack/dnsmasq-dns-ff4557c77-7lrm6" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.224339 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b31f8130-a585-4549-ae4c-1b68c0f8fbe9-ovsdbserver-nb\") pod \"dnsmasq-dns-ff4557c77-7lrm6\" (UID: \"b31f8130-a585-4549-ae4c-1b68c0f8fbe9\") " pod="openstack/dnsmasq-dns-ff4557c77-7lrm6" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.231282 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f751df5d-800a-44c8-989f-75cc7face178-config-data\") pod \"manila-share-share1-0\" (UID: \"f751df5d-800a-44c8-989f-75cc7face178\") " pod="openstack/manila-share-share1-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.232318 4750 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f751df5d-800a-44c8-989f-75cc7face178-scripts\") pod \"manila-share-share1-0\" (UID: \"f751df5d-800a-44c8-989f-75cc7face178\") " pod="openstack/manila-share-share1-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.239099 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f751df5d-800a-44c8-989f-75cc7face178-ceph\") pod \"manila-share-share1-0\" (UID: \"f751df5d-800a-44c8-989f-75cc7face178\") " pod="openstack/manila-share-share1-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.239682 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f751df5d-800a-44c8-989f-75cc7face178-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"f751df5d-800a-44c8-989f-75cc7face178\") " pod="openstack/manila-share-share1-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.241890 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f751df5d-800a-44c8-989f-75cc7face178-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"f751df5d-800a-44c8-989f-75cc7face178\") " pod="openstack/manila-share-share1-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.264622 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpm4q\" (UniqueName: \"kubernetes.io/projected/f751df5d-800a-44c8-989f-75cc7face178-kube-api-access-mpm4q\") pod \"manila-share-share1-0\" (UID: \"f751df5d-800a-44c8-989f-75cc7face178\") " pod="openstack/manila-share-share1-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.264655 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332560-zxpqj"] Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.296473 4750 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-77mgz\" (UniqueName: \"kubernetes.io/projected/b31f8130-a585-4549-ae4c-1b68c0f8fbe9-kube-api-access-77mgz\") pod \"dnsmasq-dns-ff4557c77-7lrm6\" (UID: \"b31f8130-a585-4549-ae4c-1b68c0f8fbe9\") " pod="openstack/dnsmasq-dns-ff4557c77-7lrm6" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.304762 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.324725 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5fab8c0-47cc-4200-b0fe-215273f5f062-logs\") pod \"manila-api-0\" (UID: \"f5fab8c0-47cc-4200-b0fe-215273f5f062\") " pod="openstack/manila-api-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.324912 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckvhh\" (UniqueName: \"kubernetes.io/projected/f5fab8c0-47cc-4200-b0fe-215273f5f062-kube-api-access-ckvhh\") pod \"manila-api-0\" (UID: \"f5fab8c0-47cc-4200-b0fe-215273f5f062\") " pod="openstack/manila-api-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.325203 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5fab8c0-47cc-4200-b0fe-215273f5f062-scripts\") pod \"manila-api-0\" (UID: \"f5fab8c0-47cc-4200-b0fe-215273f5f062\") " pod="openstack/manila-api-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.325635 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f5fab8c0-47cc-4200-b0fe-215273f5f062-etc-machine-id\") pod \"manila-api-0\" (UID: \"f5fab8c0-47cc-4200-b0fe-215273f5f062\") " pod="openstack/manila-api-0" Oct 08 20:00:01 crc kubenswrapper[4750]: 
I1008 20:00:01.325688 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f5fab8c0-47cc-4200-b0fe-215273f5f062-config-data-custom\") pod \"manila-api-0\" (UID: \"f5fab8c0-47cc-4200-b0fe-215273f5f062\") " pod="openstack/manila-api-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.325733 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5fab8c0-47cc-4200-b0fe-215273f5f062-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"f5fab8c0-47cc-4200-b0fe-215273f5f062\") " pod="openstack/manila-api-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.325984 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5fab8c0-47cc-4200-b0fe-215273f5f062-config-data\") pod \"manila-api-0\" (UID: \"f5fab8c0-47cc-4200-b0fe-215273f5f062\") " pod="openstack/manila-api-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.330778 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332560-zxpqj" event={"ID":"f89b3559-369e-49da-962f-a515794c7078","Type":"ContainerStarted","Data":"de4b3c7a2a892f7986912913f9fe781c0dacfbb9268faf86d9ece3eea20fd34e"} Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.357281 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.428168 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f5fab8c0-47cc-4200-b0fe-215273f5f062-etc-machine-id\") pod \"manila-api-0\" (UID: \"f5fab8c0-47cc-4200-b0fe-215273f5f062\") " pod="openstack/manila-api-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.428227 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f5fab8c0-47cc-4200-b0fe-215273f5f062-config-data-custom\") pod \"manila-api-0\" (UID: \"f5fab8c0-47cc-4200-b0fe-215273f5f062\") " pod="openstack/manila-api-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.428252 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5fab8c0-47cc-4200-b0fe-215273f5f062-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"f5fab8c0-47cc-4200-b0fe-215273f5f062\") " pod="openstack/manila-api-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.428323 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5fab8c0-47cc-4200-b0fe-215273f5f062-config-data\") pod \"manila-api-0\" (UID: \"f5fab8c0-47cc-4200-b0fe-215273f5f062\") " pod="openstack/manila-api-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.428326 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f5fab8c0-47cc-4200-b0fe-215273f5f062-etc-machine-id\") pod \"manila-api-0\" (UID: \"f5fab8c0-47cc-4200-b0fe-215273f5f062\") " pod="openstack/manila-api-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.428382 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/f5fab8c0-47cc-4200-b0fe-215273f5f062-logs\") pod \"manila-api-0\" (UID: \"f5fab8c0-47cc-4200-b0fe-215273f5f062\") " pod="openstack/manila-api-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.428434 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckvhh\" (UniqueName: \"kubernetes.io/projected/f5fab8c0-47cc-4200-b0fe-215273f5f062-kube-api-access-ckvhh\") pod \"manila-api-0\" (UID: \"f5fab8c0-47cc-4200-b0fe-215273f5f062\") " pod="openstack/manila-api-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.428499 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5fab8c0-47cc-4200-b0fe-215273f5f062-scripts\") pod \"manila-api-0\" (UID: \"f5fab8c0-47cc-4200-b0fe-215273f5f062\") " pod="openstack/manila-api-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.435109 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f5fab8c0-47cc-4200-b0fe-215273f5f062-config-data-custom\") pod \"manila-api-0\" (UID: \"f5fab8c0-47cc-4200-b0fe-215273f5f062\") " pod="openstack/manila-api-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.435412 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5fab8c0-47cc-4200-b0fe-215273f5f062-logs\") pod \"manila-api-0\" (UID: \"f5fab8c0-47cc-4200-b0fe-215273f5f062\") " pod="openstack/manila-api-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.436244 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5fab8c0-47cc-4200-b0fe-215273f5f062-scripts\") pod \"manila-api-0\" (UID: \"f5fab8c0-47cc-4200-b0fe-215273f5f062\") " pod="openstack/manila-api-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.442204 4750 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5fab8c0-47cc-4200-b0fe-215273f5f062-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"f5fab8c0-47cc-4200-b0fe-215273f5f062\") " pod="openstack/manila-api-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.442941 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5fab8c0-47cc-4200-b0fe-215273f5f062-config-data\") pod \"manila-api-0\" (UID: \"f5fab8c0-47cc-4200-b0fe-215273f5f062\") " pod="openstack/manila-api-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.454110 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckvhh\" (UniqueName: \"kubernetes.io/projected/f5fab8c0-47cc-4200-b0fe-215273f5f062-kube-api-access-ckvhh\") pod \"manila-api-0\" (UID: \"f5fab8c0-47cc-4200-b0fe-215273f5f062\") " pod="openstack/manila-api-0" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.471862 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ff4557c77-7lrm6" Oct 08 20:00:01 crc kubenswrapper[4750]: I1008 20:00:01.481250 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Oct 08 20:00:02 crc kubenswrapper[4750]: I1008 20:00:02.112229 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 08 20:00:02 crc kubenswrapper[4750]: I1008 20:00:02.345340 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"d539a620-0411-4bc6-8fbd-9aa900497424","Type":"ContainerStarted","Data":"77739729d63a7dbb14f7b27db0e85f6c7e948e11c77bc407cd7ee158e489dc0b"} Oct 08 20:00:02 crc kubenswrapper[4750]: I1008 20:00:02.349078 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332560-zxpqj" event={"ID":"f89b3559-369e-49da-962f-a515794c7078","Type":"ContainerStarted","Data":"9e087f41312612c808e335bfc5616bf85ffa006ddebea3bd9fa74a7af13e46e6"} Oct 08 20:00:02 crc kubenswrapper[4750]: I1008 20:00:02.445631 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29332560-zxpqj" podStartSLOduration=2.4456062259999998 podStartE2EDuration="2.445606226s" podCreationTimestamp="2025-10-08 20:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:00:02.375666315 +0000 UTC m=+6558.288637338" watchObservedRunningTime="2025-10-08 20:00:02.445606226 +0000 UTC m=+6558.358577239" Oct 08 20:00:02 crc kubenswrapper[4750]: I1008 20:00:02.446640 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 08 20:00:02 crc kubenswrapper[4750]: W1008 20:00:02.495160 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb31f8130_a585_4549_ae4c_1b68c0f8fbe9.slice/crio-2319f8187f3ef18d28e5786e12a0bc3ce0396d39f651cfc2864a40fe6b873890 WatchSource:0}: Error finding container 
2319f8187f3ef18d28e5786e12a0bc3ce0396d39f651cfc2864a40fe6b873890: Status 404 returned error can't find the container with id 2319f8187f3ef18d28e5786e12a0bc3ce0396d39f651cfc2864a40fe6b873890 Oct 08 20:00:02 crc kubenswrapper[4750]: I1008 20:00:02.498883 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-ff4557c77-7lrm6"] Oct 08 20:00:02 crc kubenswrapper[4750]: I1008 20:00:02.615829 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 08 20:00:03 crc kubenswrapper[4750]: I1008 20:00:03.416738 4750 generic.go:334] "Generic (PLEG): container finished" podID="f89b3559-369e-49da-962f-a515794c7078" containerID="9e087f41312612c808e335bfc5616bf85ffa006ddebea3bd9fa74a7af13e46e6" exitCode=0 Oct 08 20:00:03 crc kubenswrapper[4750]: I1008 20:00:03.416867 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332560-zxpqj" event={"ID":"f89b3559-369e-49da-962f-a515794c7078","Type":"ContainerDied","Data":"9e087f41312612c808e335bfc5616bf85ffa006ddebea3bd9fa74a7af13e46e6"} Oct 08 20:00:03 crc kubenswrapper[4750]: I1008 20:00:03.425077 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"f5fab8c0-47cc-4200-b0fe-215273f5f062","Type":"ContainerStarted","Data":"304b9e2e6d1c358e3c05760ff558df48a777d6997ad21b43d4787b0340d45dc4"} Oct 08 20:00:03 crc kubenswrapper[4750]: I1008 20:00:03.425142 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"f5fab8c0-47cc-4200-b0fe-215273f5f062","Type":"ContainerStarted","Data":"d835d9f02fcb5025ed4eacf646fd88914cb0cb459057439e691975e50eb47fee"} Oct 08 20:00:03 crc kubenswrapper[4750]: I1008 20:00:03.430377 4750 generic.go:334] "Generic (PLEG): container finished" podID="b31f8130-a585-4549-ae4c-1b68c0f8fbe9" containerID="b335ee548f0b2c9adcb81a93a699300515849340f2f937df2a39054ba9d5a3dc" exitCode=0 Oct 08 20:00:03 crc kubenswrapper[4750]: I1008 
20:00:03.430502 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ff4557c77-7lrm6" event={"ID":"b31f8130-a585-4549-ae4c-1b68c0f8fbe9","Type":"ContainerDied","Data":"b335ee548f0b2c9adcb81a93a699300515849340f2f937df2a39054ba9d5a3dc"} Oct 08 20:00:03 crc kubenswrapper[4750]: I1008 20:00:03.430542 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ff4557c77-7lrm6" event={"ID":"b31f8130-a585-4549-ae4c-1b68c0f8fbe9","Type":"ContainerStarted","Data":"2319f8187f3ef18d28e5786e12a0bc3ce0396d39f651cfc2864a40fe6b873890"} Oct 08 20:00:03 crc kubenswrapper[4750]: I1008 20:00:03.434372 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"f751df5d-800a-44c8-989f-75cc7face178","Type":"ContainerStarted","Data":"7eccb6827da9ec970ac8dd29e5e607fed1dacba9e3ff1e42c54a9ed66ffaa21a"} Oct 08 20:00:04 crc kubenswrapper[4750]: I1008 20:00:04.464766 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ff4557c77-7lrm6" event={"ID":"b31f8130-a585-4549-ae4c-1b68c0f8fbe9","Type":"ContainerStarted","Data":"020d63c14e163767e92dc6ba1c114b26fe1b073142a95e942d085b996f00e396"} Oct 08 20:00:04 crc kubenswrapper[4750]: I1008 20:00:04.465468 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-ff4557c77-7lrm6" Oct 08 20:00:04 crc kubenswrapper[4750]: I1008 20:00:04.486266 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"f5fab8c0-47cc-4200-b0fe-215273f5f062","Type":"ContainerStarted","Data":"0139b03fc855038c36573315afb08b9f067724109d3fb2c37d1561ac4f14d7ab"} Oct 08 20:00:04 crc kubenswrapper[4750]: I1008 20:00:04.486438 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Oct 08 20:00:04 crc kubenswrapper[4750]: I1008 20:00:04.498789 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-ff4557c77-7lrm6" podStartSLOduration=4.498759682 podStartE2EDuration="4.498759682s" podCreationTimestamp="2025-10-08 20:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:00:04.48863361 +0000 UTC m=+6560.401604633" watchObservedRunningTime="2025-10-08 20:00:04.498759682 +0000 UTC m=+6560.411730695" Oct 08 20:00:04 crc kubenswrapper[4750]: I1008 20:00:04.513513 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=3.513487576 podStartE2EDuration="3.513487576s" podCreationTimestamp="2025-10-08 20:00:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:00:04.508928493 +0000 UTC m=+6560.421899506" watchObservedRunningTime="2025-10-08 20:00:04.513487576 +0000 UTC m=+6560.426458589" Oct 08 20:00:04 crc kubenswrapper[4750]: I1008 20:00:04.527778 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"d539a620-0411-4bc6-8fbd-9aa900497424","Type":"ContainerStarted","Data":"9b4c4fe1d9326fa79e5e4023319a78ae9c72019f51355fd62225c94d359bf7c2"} Oct 08 20:00:05 crc kubenswrapper[4750]: I1008 20:00:05.157045 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332560-zxpqj" Oct 08 20:00:05 crc kubenswrapper[4750]: I1008 20:00:05.219308 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f89b3559-369e-49da-962f-a515794c7078-secret-volume\") pod \"f89b3559-369e-49da-962f-a515794c7078\" (UID: \"f89b3559-369e-49da-962f-a515794c7078\") " Oct 08 20:00:05 crc kubenswrapper[4750]: I1008 20:00:05.219805 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2t4kh\" (UniqueName: \"kubernetes.io/projected/f89b3559-369e-49da-962f-a515794c7078-kube-api-access-2t4kh\") pod \"f89b3559-369e-49da-962f-a515794c7078\" (UID: \"f89b3559-369e-49da-962f-a515794c7078\") " Oct 08 20:00:05 crc kubenswrapper[4750]: I1008 20:00:05.219900 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f89b3559-369e-49da-962f-a515794c7078-config-volume\") pod \"f89b3559-369e-49da-962f-a515794c7078\" (UID: \"f89b3559-369e-49da-962f-a515794c7078\") " Oct 08 20:00:05 crc kubenswrapper[4750]: I1008 20:00:05.225240 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f89b3559-369e-49da-962f-a515794c7078-config-volume" (OuterVolumeSpecName: "config-volume") pod "f89b3559-369e-49da-962f-a515794c7078" (UID: "f89b3559-369e-49da-962f-a515794c7078"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:00:05 crc kubenswrapper[4750]: I1008 20:00:05.253194 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f89b3559-369e-49da-962f-a515794c7078-kube-api-access-2t4kh" (OuterVolumeSpecName: "kube-api-access-2t4kh") pod "f89b3559-369e-49da-962f-a515794c7078" (UID: "f89b3559-369e-49da-962f-a515794c7078"). 
InnerVolumeSpecName "kube-api-access-2t4kh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:00:05 crc kubenswrapper[4750]: I1008 20:00:05.257566 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f89b3559-369e-49da-962f-a515794c7078-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f89b3559-369e-49da-962f-a515794c7078" (UID: "f89b3559-369e-49da-962f-a515794c7078"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:00:05 crc kubenswrapper[4750]: I1008 20:00:05.325469 4750 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f89b3559-369e-49da-962f-a515794c7078-config-volume\") on node \"crc\" DevicePath \"\"" Oct 08 20:00:05 crc kubenswrapper[4750]: I1008 20:00:05.325516 4750 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f89b3559-369e-49da-962f-a515794c7078-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 08 20:00:05 crc kubenswrapper[4750]: I1008 20:00:05.325536 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2t4kh\" (UniqueName: \"kubernetes.io/projected/f89b3559-369e-49da-962f-a515794c7078-kube-api-access-2t4kh\") on node \"crc\" DevicePath \"\"" Oct 08 20:00:05 crc kubenswrapper[4750]: I1008 20:00:05.457164 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332515-t26kr"] Oct 08 20:00:05 crc kubenswrapper[4750]: I1008 20:00:05.472159 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332515-t26kr"] Oct 08 20:00:05 crc kubenswrapper[4750]: I1008 20:00:05.558164 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" 
event={"ID":"d539a620-0411-4bc6-8fbd-9aa900497424","Type":"ContainerStarted","Data":"81795d4e2d329b33d2b41975a8befad0fc8eee31269d3d00223e8704ff468667"} Oct 08 20:00:05 crc kubenswrapper[4750]: I1008 20:00:05.570216 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332560-zxpqj" Oct 08 20:00:05 crc kubenswrapper[4750]: I1008 20:00:05.570863 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332560-zxpqj" event={"ID":"f89b3559-369e-49da-962f-a515794c7078","Type":"ContainerDied","Data":"de4b3c7a2a892f7986912913f9fe781c0dacfbb9268faf86d9ece3eea20fd34e"} Oct 08 20:00:05 crc kubenswrapper[4750]: I1008 20:00:05.570934 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de4b3c7a2a892f7986912913f9fe781c0dacfbb9268faf86d9ece3eea20fd34e" Oct 08 20:00:05 crc kubenswrapper[4750]: I1008 20:00:05.616245 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=4.629343903 podStartE2EDuration="5.616215298s" podCreationTimestamp="2025-10-08 20:00:00 +0000 UTC" firstStartedPulling="2025-10-08 20:00:02.146907303 +0000 UTC m=+6558.059878316" lastFinishedPulling="2025-10-08 20:00:03.133778698 +0000 UTC m=+6559.046749711" observedRunningTime="2025-10-08 20:00:05.591868895 +0000 UTC m=+6561.504839908" watchObservedRunningTime="2025-10-08 20:00:05.616215298 +0000 UTC m=+6561.529186311" Oct 08 20:00:06 crc kubenswrapper[4750]: I1008 20:00:06.756974 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a453be50-65d1-4bc8-a677-42456dc355d5" path="/var/lib/kubelet/pods/a453be50-65d1-4bc8-a677-42456dc355d5/volumes" Oct 08 20:00:09 crc kubenswrapper[4750]: I1008 20:00:09.096949 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-54bn2"] Oct 08 20:00:09 crc 
kubenswrapper[4750]: E1008 20:00:09.098463 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f89b3559-369e-49da-962f-a515794c7078" containerName="collect-profiles" Oct 08 20:00:09 crc kubenswrapper[4750]: I1008 20:00:09.098480 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="f89b3559-369e-49da-962f-a515794c7078" containerName="collect-profiles" Oct 08 20:00:09 crc kubenswrapper[4750]: I1008 20:00:09.099520 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="f89b3559-369e-49da-962f-a515794c7078" containerName="collect-profiles" Oct 08 20:00:09 crc kubenswrapper[4750]: I1008 20:00:09.101580 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-54bn2" Oct 08 20:00:09 crc kubenswrapper[4750]: I1008 20:00:09.109969 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-54bn2"] Oct 08 20:00:09 crc kubenswrapper[4750]: I1008 20:00:09.163697 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8d9aac5-c222-4925-9b93-08aecfd610fa-catalog-content\") pod \"redhat-operators-54bn2\" (UID: \"f8d9aac5-c222-4925-9b93-08aecfd610fa\") " pod="openshift-marketplace/redhat-operators-54bn2" Oct 08 20:00:09 crc kubenswrapper[4750]: I1008 20:00:09.163784 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78w6k\" (UniqueName: \"kubernetes.io/projected/f8d9aac5-c222-4925-9b93-08aecfd610fa-kube-api-access-78w6k\") pod \"redhat-operators-54bn2\" (UID: \"f8d9aac5-c222-4925-9b93-08aecfd610fa\") " pod="openshift-marketplace/redhat-operators-54bn2" Oct 08 20:00:09 crc kubenswrapper[4750]: I1008 20:00:09.164404 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/f8d9aac5-c222-4925-9b93-08aecfd610fa-utilities\") pod \"redhat-operators-54bn2\" (UID: \"f8d9aac5-c222-4925-9b93-08aecfd610fa\") " pod="openshift-marketplace/redhat-operators-54bn2" Oct 08 20:00:09 crc kubenswrapper[4750]: I1008 20:00:09.266824 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8d9aac5-c222-4925-9b93-08aecfd610fa-catalog-content\") pod \"redhat-operators-54bn2\" (UID: \"f8d9aac5-c222-4925-9b93-08aecfd610fa\") " pod="openshift-marketplace/redhat-operators-54bn2" Oct 08 20:00:09 crc kubenswrapper[4750]: I1008 20:00:09.266894 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78w6k\" (UniqueName: \"kubernetes.io/projected/f8d9aac5-c222-4925-9b93-08aecfd610fa-kube-api-access-78w6k\") pod \"redhat-operators-54bn2\" (UID: \"f8d9aac5-c222-4925-9b93-08aecfd610fa\") " pod="openshift-marketplace/redhat-operators-54bn2" Oct 08 20:00:09 crc kubenswrapper[4750]: I1008 20:00:09.267032 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8d9aac5-c222-4925-9b93-08aecfd610fa-utilities\") pod \"redhat-operators-54bn2\" (UID: \"f8d9aac5-c222-4925-9b93-08aecfd610fa\") " pod="openshift-marketplace/redhat-operators-54bn2" Oct 08 20:00:09 crc kubenswrapper[4750]: I1008 20:00:09.267566 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8d9aac5-c222-4925-9b93-08aecfd610fa-utilities\") pod \"redhat-operators-54bn2\" (UID: \"f8d9aac5-c222-4925-9b93-08aecfd610fa\") " pod="openshift-marketplace/redhat-operators-54bn2" Oct 08 20:00:09 crc kubenswrapper[4750]: I1008 20:00:09.267794 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/f8d9aac5-c222-4925-9b93-08aecfd610fa-catalog-content\") pod \"redhat-operators-54bn2\" (UID: \"f8d9aac5-c222-4925-9b93-08aecfd610fa\") " pod="openshift-marketplace/redhat-operators-54bn2" Oct 08 20:00:09 crc kubenswrapper[4750]: I1008 20:00:09.296507 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78w6k\" (UniqueName: \"kubernetes.io/projected/f8d9aac5-c222-4925-9b93-08aecfd610fa-kube-api-access-78w6k\") pod \"redhat-operators-54bn2\" (UID: \"f8d9aac5-c222-4925-9b93-08aecfd610fa\") " pod="openshift-marketplace/redhat-operators-54bn2" Oct 08 20:00:09 crc kubenswrapper[4750]: I1008 20:00:09.424609 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-54bn2" Oct 08 20:00:10 crc kubenswrapper[4750]: I1008 20:00:10.541623 4750 scope.go:117] "RemoveContainer" containerID="cf0812dd8debcf4c3eab99a904a0e6fbf5573f1be9f45fd5d5ee2781a9548443" Oct 08 20:00:11 crc kubenswrapper[4750]: I1008 20:00:11.358288 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Oct 08 20:00:11 crc kubenswrapper[4750]: I1008 20:00:11.475327 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-ff4557c77-7lrm6" Oct 08 20:00:11 crc kubenswrapper[4750]: I1008 20:00:11.560403 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86f86b88dc-8vbxj"] Oct 08 20:00:11 crc kubenswrapper[4750]: I1008 20:00:11.566873 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86f86b88dc-8vbxj" podUID="eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6" containerName="dnsmasq-dns" containerID="cri-o://c6a3294ff68af21a75f4f00b56dcc5ba00d7d9a217bcd8728813c58e619f41ce" gracePeriod=10 Oct 08 20:00:12 crc kubenswrapper[4750]: I1008 20:00:12.681644 4750 generic.go:334] "Generic (PLEG): container finished" 
podID="eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6" containerID="c6a3294ff68af21a75f4f00b56dcc5ba00d7d9a217bcd8728813c58e619f41ce" exitCode=0 Oct 08 20:00:12 crc kubenswrapper[4750]: I1008 20:00:12.682180 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86f86b88dc-8vbxj" event={"ID":"eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6","Type":"ContainerDied","Data":"c6a3294ff68af21a75f4f00b56dcc5ba00d7d9a217bcd8728813c58e619f41ce"} Oct 08 20:00:12 crc kubenswrapper[4750]: I1008 20:00:12.926662 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86f86b88dc-8vbxj" Oct 08 20:00:13 crc kubenswrapper[4750]: I1008 20:00:13.023777 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6-config\") pod \"eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6\" (UID: \"eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6\") " Oct 08 20:00:13 crc kubenswrapper[4750]: I1008 20:00:13.023826 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6-ovsdbserver-nb\") pod \"eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6\" (UID: \"eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6\") " Oct 08 20:00:13 crc kubenswrapper[4750]: I1008 20:00:13.024177 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6-dns-svc\") pod \"eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6\" (UID: \"eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6\") " Oct 08 20:00:13 crc kubenswrapper[4750]: I1008 20:00:13.024224 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6-ovsdbserver-sb\") pod \"eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6\" (UID: 
\"eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6\") " Oct 08 20:00:13 crc kubenswrapper[4750]: I1008 20:00:13.024314 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ldlk\" (UniqueName: \"kubernetes.io/projected/eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6-kube-api-access-7ldlk\") pod \"eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6\" (UID: \"eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6\") " Oct 08 20:00:13 crc kubenswrapper[4750]: I1008 20:00:13.076063 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6-kube-api-access-7ldlk" (OuterVolumeSpecName: "kube-api-access-7ldlk") pod "eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6" (UID: "eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6"). InnerVolumeSpecName "kube-api-access-7ldlk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:00:13 crc kubenswrapper[4750]: I1008 20:00:13.128821 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ldlk\" (UniqueName: \"kubernetes.io/projected/eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6-kube-api-access-7ldlk\") on node \"crc\" DevicePath \"\"" Oct 08 20:00:13 crc kubenswrapper[4750]: I1008 20:00:13.142748 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6" (UID: "eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:00:13 crc kubenswrapper[4750]: I1008 20:00:13.142939 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6" (UID: "eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:00:13 crc kubenswrapper[4750]: I1008 20:00:13.169260 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6" (UID: "eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:00:13 crc kubenswrapper[4750]: I1008 20:00:13.209769 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6-config" (OuterVolumeSpecName: "config") pod "eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6" (UID: "eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 20:00:13 crc kubenswrapper[4750]: I1008 20:00:13.231646 4750 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 20:00:13 crc kubenswrapper[4750]: I1008 20:00:13.231682 4750 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 20:00:13 crc kubenswrapper[4750]: I1008 20:00:13.231692 4750 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 20:00:13 crc kubenswrapper[4750]: I1008 20:00:13.231701 4750 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6-config\") on node \"crc\" DevicePath \"\"" Oct 08 20:00:13 crc kubenswrapper[4750]: 
I1008 20:00:13.281234 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-54bn2"] Oct 08 20:00:13 crc kubenswrapper[4750]: W1008 20:00:13.306717 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8d9aac5_c222_4925_9b93_08aecfd610fa.slice/crio-395f16c4b23f1a2508333b9178ebc29c764a16d987692d08c800016c14b344b0 WatchSource:0}: Error finding container 395f16c4b23f1a2508333b9178ebc29c764a16d987692d08c800016c14b344b0: Status 404 returned error can't find the container with id 395f16c4b23f1a2508333b9178ebc29c764a16d987692d08c800016c14b344b0 Oct 08 20:00:13 crc kubenswrapper[4750]: I1008 20:00:13.698399 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"f751df5d-800a-44c8-989f-75cc7face178","Type":"ContainerStarted","Data":"b968911a57acaa7bb95126cc8a7b408423d235aac1725e8277532d17423ddbb0"} Oct 08 20:00:13 crc kubenswrapper[4750]: I1008 20:00:13.710357 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86f86b88dc-8vbxj" event={"ID":"eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6","Type":"ContainerDied","Data":"d41ea90c5355c4fb5fa742310a88b11868b56351d33f85eee7f04702ae5bd9dc"} Oct 08 20:00:13 crc kubenswrapper[4750]: I1008 20:00:13.710434 4750 scope.go:117] "RemoveContainer" containerID="c6a3294ff68af21a75f4f00b56dcc5ba00d7d9a217bcd8728813c58e619f41ce" Oct 08 20:00:13 crc kubenswrapper[4750]: I1008 20:00:13.710651 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86f86b88dc-8vbxj" Oct 08 20:00:13 crc kubenswrapper[4750]: I1008 20:00:13.738210 4750 generic.go:334] "Generic (PLEG): container finished" podID="f8d9aac5-c222-4925-9b93-08aecfd610fa" containerID="a22c6bdb8e3760dd3e4aa9ab4ae630eaa64405908b7b5b89501fd2aace23d544" exitCode=0 Oct 08 20:00:13 crc kubenswrapper[4750]: I1008 20:00:13.738263 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-54bn2" event={"ID":"f8d9aac5-c222-4925-9b93-08aecfd610fa","Type":"ContainerDied","Data":"a22c6bdb8e3760dd3e4aa9ab4ae630eaa64405908b7b5b89501fd2aace23d544"} Oct 08 20:00:13 crc kubenswrapper[4750]: I1008 20:00:13.738290 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-54bn2" event={"ID":"f8d9aac5-c222-4925-9b93-08aecfd610fa","Type":"ContainerStarted","Data":"395f16c4b23f1a2508333b9178ebc29c764a16d987692d08c800016c14b344b0"} Oct 08 20:00:13 crc kubenswrapper[4750]: I1008 20:00:13.767646 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86f86b88dc-8vbxj"] Oct 08 20:00:13 crc kubenswrapper[4750]: I1008 20:00:13.775109 4750 scope.go:117] "RemoveContainer" containerID="ccc362b1bd4253f2596be75db638b5fe771ce9a1236308de3cdfa210f77c4669" Oct 08 20:00:13 crc kubenswrapper[4750]: I1008 20:00:13.776898 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86f86b88dc-8vbxj"] Oct 08 20:00:14 crc kubenswrapper[4750]: I1008 20:00:14.758936 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6" path="/var/lib/kubelet/pods/eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6/volumes" Oct 08 20:00:14 crc kubenswrapper[4750]: I1008 20:00:14.761225 4750 scope.go:117] "RemoveContainer" containerID="166b04b6c456f116d6b504a63f511a318627a6cec196a1300ad0582a72b88cf9" Oct 08 20:00:14 crc kubenswrapper[4750]: E1008 20:00:14.761526 4750 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 20:00:14 crc kubenswrapper[4750]: I1008 20:00:14.771738 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"f751df5d-800a-44c8-989f-75cc7face178","Type":"ContainerStarted","Data":"e30d796aef699b18b331af411c5743c59ecc8867bfb5d0a8c5b79fba2e28138f"} Oct 08 20:00:14 crc kubenswrapper[4750]: I1008 20:00:14.803381 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=5.00780178 podStartE2EDuration="14.803357189s" podCreationTimestamp="2025-10-08 20:00:00 +0000 UTC" firstStartedPulling="2025-10-08 20:00:02.455222224 +0000 UTC m=+6558.368193237" lastFinishedPulling="2025-10-08 20:00:12.250777633 +0000 UTC m=+6568.163748646" observedRunningTime="2025-10-08 20:00:14.801848182 +0000 UTC m=+6570.714819215" watchObservedRunningTime="2025-10-08 20:00:14.803357189 +0000 UTC m=+6570.716328212" Oct 08 20:00:15 crc kubenswrapper[4750]: I1008 20:00:15.786929 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-54bn2" event={"ID":"f8d9aac5-c222-4925-9b93-08aecfd610fa","Type":"ContainerStarted","Data":"0dcfce2a2c26d0b9fa3e1047249ebb00032b51c74358bc862fe1002ecc499ebf"} Oct 08 20:00:16 crc kubenswrapper[4750]: I1008 20:00:16.154683 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 20:00:16 crc kubenswrapper[4750]: I1008 20:00:16.155107 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="984335f4-0d17-4391-947f-ad63864a588d" containerName="ceilometer-central-agent" containerID="cri-o://f28dc7724dd16d1fb2c2e4dddfee77cb63e35ebae6fec7c75db3e0bd717960c3" gracePeriod=30 Oct 08 20:00:16 crc kubenswrapper[4750]: I1008 20:00:16.155218 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="984335f4-0d17-4391-947f-ad63864a588d" containerName="sg-core" containerID="cri-o://51153e661b22af3033fb9e65f42723bf37fc243aa67da47cfb452d7f299d7be3" gracePeriod=30 Oct 08 20:00:16 crc kubenswrapper[4750]: I1008 20:00:16.155287 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="984335f4-0d17-4391-947f-ad63864a588d" containerName="proxy-httpd" containerID="cri-o://3bae262e151a97ec5366da9835e2a1c6a5bdbdb465bc3619ffefd34bf78e9426" gracePeriod=30 Oct 08 20:00:16 crc kubenswrapper[4750]: I1008 20:00:16.155490 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="984335f4-0d17-4391-947f-ad63864a588d" containerName="ceilometer-notification-agent" containerID="cri-o://713a3218dbba5e623009747565c03b9cefa51c9e9befe1dbf96bd60fc03acd49" gracePeriod=30 Oct 08 20:00:16 crc kubenswrapper[4750]: I1008 20:00:16.800771 4750 generic.go:334] "Generic (PLEG): container finished" podID="984335f4-0d17-4391-947f-ad63864a588d" containerID="51153e661b22af3033fb9e65f42723bf37fc243aa67da47cfb452d7f299d7be3" exitCode=2 Oct 08 20:00:16 crc kubenswrapper[4750]: I1008 20:00:16.800845 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"984335f4-0d17-4391-947f-ad63864a588d","Type":"ContainerDied","Data":"51153e661b22af3033fb9e65f42723bf37fc243aa67da47cfb452d7f299d7be3"} Oct 08 20:00:17 crc kubenswrapper[4750]: I1008 20:00:17.818390 4750 generic.go:334] "Generic (PLEG): container finished" podID="984335f4-0d17-4391-947f-ad63864a588d" 
containerID="3bae262e151a97ec5366da9835e2a1c6a5bdbdb465bc3619ffefd34bf78e9426" exitCode=0 Oct 08 20:00:17 crc kubenswrapper[4750]: I1008 20:00:17.818825 4750 generic.go:334] "Generic (PLEG): container finished" podID="984335f4-0d17-4391-947f-ad63864a588d" containerID="f28dc7724dd16d1fb2c2e4dddfee77cb63e35ebae6fec7c75db3e0bd717960c3" exitCode=0 Oct 08 20:00:17 crc kubenswrapper[4750]: I1008 20:00:17.818470 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"984335f4-0d17-4391-947f-ad63864a588d","Type":"ContainerDied","Data":"3bae262e151a97ec5366da9835e2a1c6a5bdbdb465bc3619ffefd34bf78e9426"} Oct 08 20:00:17 crc kubenswrapper[4750]: I1008 20:00:17.818876 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"984335f4-0d17-4391-947f-ad63864a588d","Type":"ContainerDied","Data":"f28dc7724dd16d1fb2c2e4dddfee77cb63e35ebae6fec7c75db3e0bd717960c3"} Oct 08 20:00:20 crc kubenswrapper[4750]: I1008 20:00:20.858466 4750 generic.go:334] "Generic (PLEG): container finished" podID="f8d9aac5-c222-4925-9b93-08aecfd610fa" containerID="0dcfce2a2c26d0b9fa3e1047249ebb00032b51c74358bc862fe1002ecc499ebf" exitCode=0 Oct 08 20:00:20 crc kubenswrapper[4750]: I1008 20:00:20.858588 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-54bn2" event={"ID":"f8d9aac5-c222-4925-9b93-08aecfd610fa","Type":"ContainerDied","Data":"0dcfce2a2c26d0b9fa3e1047249ebb00032b51c74358bc862fe1002ecc499ebf"} Oct 08 20:00:21 crc kubenswrapper[4750]: I1008 20:00:21.288743 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 20:00:21 crc kubenswrapper[4750]: I1008 20:00:21.305706 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Oct 08 20:00:21 crc kubenswrapper[4750]: I1008 20:00:21.444863 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/984335f4-0d17-4391-947f-ad63864a588d-combined-ca-bundle\") pod \"984335f4-0d17-4391-947f-ad63864a588d\" (UID: \"984335f4-0d17-4391-947f-ad63864a588d\") " Oct 08 20:00:21 crc kubenswrapper[4750]: I1008 20:00:21.445033 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/984335f4-0d17-4391-947f-ad63864a588d-config-data\") pod \"984335f4-0d17-4391-947f-ad63864a588d\" (UID: \"984335f4-0d17-4391-947f-ad63864a588d\") " Oct 08 20:00:21 crc kubenswrapper[4750]: I1008 20:00:21.445116 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnvxq\" (UniqueName: \"kubernetes.io/projected/984335f4-0d17-4391-947f-ad63864a588d-kube-api-access-hnvxq\") pod \"984335f4-0d17-4391-947f-ad63864a588d\" (UID: \"984335f4-0d17-4391-947f-ad63864a588d\") " Oct 08 20:00:21 crc kubenswrapper[4750]: I1008 20:00:21.445188 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/984335f4-0d17-4391-947f-ad63864a588d-log-httpd\") pod \"984335f4-0d17-4391-947f-ad63864a588d\" (UID: \"984335f4-0d17-4391-947f-ad63864a588d\") " Oct 08 20:00:21 crc kubenswrapper[4750]: I1008 20:00:21.445248 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/984335f4-0d17-4391-947f-ad63864a588d-run-httpd\") pod \"984335f4-0d17-4391-947f-ad63864a588d\" (UID: \"984335f4-0d17-4391-947f-ad63864a588d\") " Oct 08 
20:00:21 crc kubenswrapper[4750]: I1008 20:00:21.445284 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/984335f4-0d17-4391-947f-ad63864a588d-sg-core-conf-yaml\") pod \"984335f4-0d17-4391-947f-ad63864a588d\" (UID: \"984335f4-0d17-4391-947f-ad63864a588d\") " Oct 08 20:00:21 crc kubenswrapper[4750]: I1008 20:00:21.445315 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/984335f4-0d17-4391-947f-ad63864a588d-scripts\") pod \"984335f4-0d17-4391-947f-ad63864a588d\" (UID: \"984335f4-0d17-4391-947f-ad63864a588d\") " Oct 08 20:00:21 crc kubenswrapper[4750]: I1008 20:00:21.445981 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/984335f4-0d17-4391-947f-ad63864a588d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "984335f4-0d17-4391-947f-ad63864a588d" (UID: "984335f4-0d17-4391-947f-ad63864a588d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:00:21 crc kubenswrapper[4750]: I1008 20:00:21.446929 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/984335f4-0d17-4391-947f-ad63864a588d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "984335f4-0d17-4391-947f-ad63864a588d" (UID: "984335f4-0d17-4391-947f-ad63864a588d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:00:21 crc kubenswrapper[4750]: I1008 20:00:21.453876 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/984335f4-0d17-4391-947f-ad63864a588d-kube-api-access-hnvxq" (OuterVolumeSpecName: "kube-api-access-hnvxq") pod "984335f4-0d17-4391-947f-ad63864a588d" (UID: "984335f4-0d17-4391-947f-ad63864a588d"). InnerVolumeSpecName "kube-api-access-hnvxq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:00:21 crc kubenswrapper[4750]: I1008 20:00:21.455717 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/984335f4-0d17-4391-947f-ad63864a588d-scripts" (OuterVolumeSpecName: "scripts") pod "984335f4-0d17-4391-947f-ad63864a588d" (UID: "984335f4-0d17-4391-947f-ad63864a588d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:00:21 crc kubenswrapper[4750]: I1008 20:00:21.489680 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/984335f4-0d17-4391-947f-ad63864a588d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "984335f4-0d17-4391-947f-ad63864a588d" (UID: "984335f4-0d17-4391-947f-ad63864a588d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:00:21 crc kubenswrapper[4750]: I1008 20:00:21.543425 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/984335f4-0d17-4391-947f-ad63864a588d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "984335f4-0d17-4391-947f-ad63864a588d" (UID: "984335f4-0d17-4391-947f-ad63864a588d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:00:21 crc kubenswrapper[4750]: I1008 20:00:21.549066 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnvxq\" (UniqueName: \"kubernetes.io/projected/984335f4-0d17-4391-947f-ad63864a588d-kube-api-access-hnvxq\") on node \"crc\" DevicePath \"\"" Oct 08 20:00:21 crc kubenswrapper[4750]: I1008 20:00:21.549104 4750 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/984335f4-0d17-4391-947f-ad63864a588d-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 20:00:21 crc kubenswrapper[4750]: I1008 20:00:21.549147 4750 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/984335f4-0d17-4391-947f-ad63864a588d-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 20:00:21 crc kubenswrapper[4750]: I1008 20:00:21.549159 4750 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/984335f4-0d17-4391-947f-ad63864a588d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 08 20:00:21 crc kubenswrapper[4750]: I1008 20:00:21.549170 4750 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/984335f4-0d17-4391-947f-ad63864a588d-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 20:00:21 crc kubenswrapper[4750]: I1008 20:00:21.549183 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/984335f4-0d17-4391-947f-ad63864a588d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 20:00:21 crc kubenswrapper[4750]: I1008 20:00:21.586483 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/984335f4-0d17-4391-947f-ad63864a588d-config-data" (OuterVolumeSpecName: "config-data") pod "984335f4-0d17-4391-947f-ad63864a588d" (UID: "984335f4-0d17-4391-947f-ad63864a588d"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:00:21 crc kubenswrapper[4750]: I1008 20:00:21.651917 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/984335f4-0d17-4391-947f-ad63864a588d-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 20:00:21 crc kubenswrapper[4750]: I1008 20:00:21.888852 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-54bn2" event={"ID":"f8d9aac5-c222-4925-9b93-08aecfd610fa","Type":"ContainerStarted","Data":"382e0c0fc8dca80461503c0241db2f77ebcdde67c8ba1d2cf0dd50c03358e4d1"} Oct 08 20:00:21 crc kubenswrapper[4750]: I1008 20:00:21.898756 4750 generic.go:334] "Generic (PLEG): container finished" podID="984335f4-0d17-4391-947f-ad63864a588d" containerID="713a3218dbba5e623009747565c03b9cefa51c9e9befe1dbf96bd60fc03acd49" exitCode=0 Oct 08 20:00:21 crc kubenswrapper[4750]: I1008 20:00:21.898825 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"984335f4-0d17-4391-947f-ad63864a588d","Type":"ContainerDied","Data":"713a3218dbba5e623009747565c03b9cefa51c9e9befe1dbf96bd60fc03acd49"} Oct 08 20:00:21 crc kubenswrapper[4750]: I1008 20:00:21.898876 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"984335f4-0d17-4391-947f-ad63864a588d","Type":"ContainerDied","Data":"fa73753f9713902d3ef1965f151cd86733fac45d2a5e8236ed4200f55d333667"} Oct 08 20:00:21 crc kubenswrapper[4750]: I1008 20:00:21.898906 4750 scope.go:117] "RemoveContainer" containerID="3bae262e151a97ec5366da9835e2a1c6a5bdbdb465bc3619ffefd34bf78e9426" Oct 08 20:00:21 crc kubenswrapper[4750]: I1008 20:00:21.899167 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 20:00:21 crc kubenswrapper[4750]: I1008 20:00:21.930206 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-54bn2" podStartSLOduration=5.306549483 podStartE2EDuration="12.930176697s" podCreationTimestamp="2025-10-08 20:00:09 +0000 UTC" firstStartedPulling="2025-10-08 20:00:13.740847922 +0000 UTC m=+6569.653818935" lastFinishedPulling="2025-10-08 20:00:21.364475136 +0000 UTC m=+6577.277446149" observedRunningTime="2025-10-08 20:00:21.917891064 +0000 UTC m=+6577.830862077" watchObservedRunningTime="2025-10-08 20:00:21.930176697 +0000 UTC m=+6577.843147710" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.009773 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.021570 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.050642 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 08 20:00:22 crc kubenswrapper[4750]: E1008 20:00:22.051228 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="984335f4-0d17-4391-947f-ad63864a588d" containerName="ceilometer-notification-agent" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.051252 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="984335f4-0d17-4391-947f-ad63864a588d" containerName="ceilometer-notification-agent" Oct 08 20:00:22 crc kubenswrapper[4750]: E1008 20:00:22.051293 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="984335f4-0d17-4391-947f-ad63864a588d" containerName="ceilometer-central-agent" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.051300 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="984335f4-0d17-4391-947f-ad63864a588d" containerName="ceilometer-central-agent" Oct 08 20:00:22 crc 
kubenswrapper[4750]: E1008 20:00:22.051309 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6" containerName="init" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.051315 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6" containerName="init" Oct 08 20:00:22 crc kubenswrapper[4750]: E1008 20:00:22.051333 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6" containerName="dnsmasq-dns" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.051340 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6" containerName="dnsmasq-dns" Oct 08 20:00:22 crc kubenswrapper[4750]: E1008 20:00:22.051354 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="984335f4-0d17-4391-947f-ad63864a588d" containerName="sg-core" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.051365 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="984335f4-0d17-4391-947f-ad63864a588d" containerName="sg-core" Oct 08 20:00:22 crc kubenswrapper[4750]: E1008 20:00:22.051379 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="984335f4-0d17-4391-947f-ad63864a588d" containerName="proxy-httpd" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.051387 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="984335f4-0d17-4391-947f-ad63864a588d" containerName="proxy-httpd" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.051648 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="984335f4-0d17-4391-947f-ad63864a588d" containerName="sg-core" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.051670 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="984335f4-0d17-4391-947f-ad63864a588d" containerName="proxy-httpd" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.051688 4750 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="984335f4-0d17-4391-947f-ad63864a588d" containerName="ceilometer-notification-agent" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.051710 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb27f4d8-4d56-4b3d-9240-ab4f5f0ff5f6" containerName="dnsmasq-dns" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.051723 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="984335f4-0d17-4391-947f-ad63864a588d" containerName="ceilometer-central-agent" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.057723 4750 scope.go:117] "RemoveContainer" containerID="51153e661b22af3033fb9e65f42723bf37fc243aa67da47cfb452d7f299d7be3" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.059085 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.064350 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.064778 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.072303 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.122169 4750 scope.go:117] "RemoveContainer" containerID="713a3218dbba5e623009747565c03b9cefa51c9e9befe1dbf96bd60fc03acd49" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.150506 4750 scope.go:117] "RemoveContainer" containerID="f28dc7724dd16d1fb2c2e4dddfee77cb63e35ebae6fec7c75db3e0bd717960c3" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.172260 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee5a4198-42a1-4e36-bbe6-22b923bd2a98-run-httpd\") pod 
\"ceilometer-0\" (UID: \"ee5a4198-42a1-4e36-bbe6-22b923bd2a98\") " pod="openstack/ceilometer-0" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.172328 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee5a4198-42a1-4e36-bbe6-22b923bd2a98-scripts\") pod \"ceilometer-0\" (UID: \"ee5a4198-42a1-4e36-bbe6-22b923bd2a98\") " pod="openstack/ceilometer-0" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.172399 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ee5a4198-42a1-4e36-bbe6-22b923bd2a98-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ee5a4198-42a1-4e36-bbe6-22b923bd2a98\") " pod="openstack/ceilometer-0" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.172431 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee5a4198-42a1-4e36-bbe6-22b923bd2a98-log-httpd\") pod \"ceilometer-0\" (UID: \"ee5a4198-42a1-4e36-bbe6-22b923bd2a98\") " pod="openstack/ceilometer-0" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.172482 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee5a4198-42a1-4e36-bbe6-22b923bd2a98-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ee5a4198-42a1-4e36-bbe6-22b923bd2a98\") " pod="openstack/ceilometer-0" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.172590 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxlb5\" (UniqueName: \"kubernetes.io/projected/ee5a4198-42a1-4e36-bbe6-22b923bd2a98-kube-api-access-nxlb5\") pod \"ceilometer-0\" (UID: \"ee5a4198-42a1-4e36-bbe6-22b923bd2a98\") " pod="openstack/ceilometer-0" Oct 08 20:00:22 crc 
kubenswrapper[4750]: I1008 20:00:22.172625 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee5a4198-42a1-4e36-bbe6-22b923bd2a98-config-data\") pod \"ceilometer-0\" (UID: \"ee5a4198-42a1-4e36-bbe6-22b923bd2a98\") " pod="openstack/ceilometer-0" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.175909 4750 scope.go:117] "RemoveContainer" containerID="3bae262e151a97ec5366da9835e2a1c6a5bdbdb465bc3619ffefd34bf78e9426" Oct 08 20:00:22 crc kubenswrapper[4750]: E1008 20:00:22.176654 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bae262e151a97ec5366da9835e2a1c6a5bdbdb465bc3619ffefd34bf78e9426\": container with ID starting with 3bae262e151a97ec5366da9835e2a1c6a5bdbdb465bc3619ffefd34bf78e9426 not found: ID does not exist" containerID="3bae262e151a97ec5366da9835e2a1c6a5bdbdb465bc3619ffefd34bf78e9426" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.176695 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bae262e151a97ec5366da9835e2a1c6a5bdbdb465bc3619ffefd34bf78e9426"} err="failed to get container status \"3bae262e151a97ec5366da9835e2a1c6a5bdbdb465bc3619ffefd34bf78e9426\": rpc error: code = NotFound desc = could not find container \"3bae262e151a97ec5366da9835e2a1c6a5bdbdb465bc3619ffefd34bf78e9426\": container with ID starting with 3bae262e151a97ec5366da9835e2a1c6a5bdbdb465bc3619ffefd34bf78e9426 not found: ID does not exist" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.176726 4750 scope.go:117] "RemoveContainer" containerID="51153e661b22af3033fb9e65f42723bf37fc243aa67da47cfb452d7f299d7be3" Oct 08 20:00:22 crc kubenswrapper[4750]: E1008 20:00:22.177924 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"51153e661b22af3033fb9e65f42723bf37fc243aa67da47cfb452d7f299d7be3\": container with ID starting with 51153e661b22af3033fb9e65f42723bf37fc243aa67da47cfb452d7f299d7be3 not found: ID does not exist" containerID="51153e661b22af3033fb9e65f42723bf37fc243aa67da47cfb452d7f299d7be3" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.177979 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51153e661b22af3033fb9e65f42723bf37fc243aa67da47cfb452d7f299d7be3"} err="failed to get container status \"51153e661b22af3033fb9e65f42723bf37fc243aa67da47cfb452d7f299d7be3\": rpc error: code = NotFound desc = could not find container \"51153e661b22af3033fb9e65f42723bf37fc243aa67da47cfb452d7f299d7be3\": container with ID starting with 51153e661b22af3033fb9e65f42723bf37fc243aa67da47cfb452d7f299d7be3 not found: ID does not exist" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.178019 4750 scope.go:117] "RemoveContainer" containerID="713a3218dbba5e623009747565c03b9cefa51c9e9befe1dbf96bd60fc03acd49" Oct 08 20:00:22 crc kubenswrapper[4750]: E1008 20:00:22.178510 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"713a3218dbba5e623009747565c03b9cefa51c9e9befe1dbf96bd60fc03acd49\": container with ID starting with 713a3218dbba5e623009747565c03b9cefa51c9e9befe1dbf96bd60fc03acd49 not found: ID does not exist" containerID="713a3218dbba5e623009747565c03b9cefa51c9e9befe1dbf96bd60fc03acd49" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.178560 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"713a3218dbba5e623009747565c03b9cefa51c9e9befe1dbf96bd60fc03acd49"} err="failed to get container status \"713a3218dbba5e623009747565c03b9cefa51c9e9befe1dbf96bd60fc03acd49\": rpc error: code = NotFound desc = could not find container \"713a3218dbba5e623009747565c03b9cefa51c9e9befe1dbf96bd60fc03acd49\": container with ID 
starting with 713a3218dbba5e623009747565c03b9cefa51c9e9befe1dbf96bd60fc03acd49 not found: ID does not exist" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.178584 4750 scope.go:117] "RemoveContainer" containerID="f28dc7724dd16d1fb2c2e4dddfee77cb63e35ebae6fec7c75db3e0bd717960c3" Oct 08 20:00:22 crc kubenswrapper[4750]: E1008 20:00:22.179018 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f28dc7724dd16d1fb2c2e4dddfee77cb63e35ebae6fec7c75db3e0bd717960c3\": container with ID starting with f28dc7724dd16d1fb2c2e4dddfee77cb63e35ebae6fec7c75db3e0bd717960c3 not found: ID does not exist" containerID="f28dc7724dd16d1fb2c2e4dddfee77cb63e35ebae6fec7c75db3e0bd717960c3" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.179076 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f28dc7724dd16d1fb2c2e4dddfee77cb63e35ebae6fec7c75db3e0bd717960c3"} err="failed to get container status \"f28dc7724dd16d1fb2c2e4dddfee77cb63e35ebae6fec7c75db3e0bd717960c3\": rpc error: code = NotFound desc = could not find container \"f28dc7724dd16d1fb2c2e4dddfee77cb63e35ebae6fec7c75db3e0bd717960c3\": container with ID starting with f28dc7724dd16d1fb2c2e4dddfee77cb63e35ebae6fec7c75db3e0bd717960c3 not found: ID does not exist" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.274855 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ee5a4198-42a1-4e36-bbe6-22b923bd2a98-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ee5a4198-42a1-4e36-bbe6-22b923bd2a98\") " pod="openstack/ceilometer-0" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.274923 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee5a4198-42a1-4e36-bbe6-22b923bd2a98-log-httpd\") pod \"ceilometer-0\" (UID: 
\"ee5a4198-42a1-4e36-bbe6-22b923bd2a98\") " pod="openstack/ceilometer-0" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.274994 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee5a4198-42a1-4e36-bbe6-22b923bd2a98-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ee5a4198-42a1-4e36-bbe6-22b923bd2a98\") " pod="openstack/ceilometer-0" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.275083 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxlb5\" (UniqueName: \"kubernetes.io/projected/ee5a4198-42a1-4e36-bbe6-22b923bd2a98-kube-api-access-nxlb5\") pod \"ceilometer-0\" (UID: \"ee5a4198-42a1-4e36-bbe6-22b923bd2a98\") " pod="openstack/ceilometer-0" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.275115 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee5a4198-42a1-4e36-bbe6-22b923bd2a98-config-data\") pod \"ceilometer-0\" (UID: \"ee5a4198-42a1-4e36-bbe6-22b923bd2a98\") " pod="openstack/ceilometer-0" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.275239 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee5a4198-42a1-4e36-bbe6-22b923bd2a98-run-httpd\") pod \"ceilometer-0\" (UID: \"ee5a4198-42a1-4e36-bbe6-22b923bd2a98\") " pod="openstack/ceilometer-0" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.275272 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee5a4198-42a1-4e36-bbe6-22b923bd2a98-scripts\") pod \"ceilometer-0\" (UID: \"ee5a4198-42a1-4e36-bbe6-22b923bd2a98\") " pod="openstack/ceilometer-0" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.281431 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/ee5a4198-42a1-4e36-bbe6-22b923bd2a98-run-httpd\") pod \"ceilometer-0\" (UID: \"ee5a4198-42a1-4e36-bbe6-22b923bd2a98\") " pod="openstack/ceilometer-0" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.281824 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee5a4198-42a1-4e36-bbe6-22b923bd2a98-log-httpd\") pod \"ceilometer-0\" (UID: \"ee5a4198-42a1-4e36-bbe6-22b923bd2a98\") " pod="openstack/ceilometer-0" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.294787 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee5a4198-42a1-4e36-bbe6-22b923bd2a98-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ee5a4198-42a1-4e36-bbe6-22b923bd2a98\") " pod="openstack/ceilometer-0" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.295204 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ee5a4198-42a1-4e36-bbe6-22b923bd2a98-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ee5a4198-42a1-4e36-bbe6-22b923bd2a98\") " pod="openstack/ceilometer-0" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.307793 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee5a4198-42a1-4e36-bbe6-22b923bd2a98-config-data\") pod \"ceilometer-0\" (UID: \"ee5a4198-42a1-4e36-bbe6-22b923bd2a98\") " pod="openstack/ceilometer-0" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.309226 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxlb5\" (UniqueName: \"kubernetes.io/projected/ee5a4198-42a1-4e36-bbe6-22b923bd2a98-kube-api-access-nxlb5\") pod \"ceilometer-0\" (UID: \"ee5a4198-42a1-4e36-bbe6-22b923bd2a98\") " pod="openstack/ceilometer-0" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.312701 
4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee5a4198-42a1-4e36-bbe6-22b923bd2a98-scripts\") pod \"ceilometer-0\" (UID: \"ee5a4198-42a1-4e36-bbe6-22b923bd2a98\") " pod="openstack/ceilometer-0" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.409221 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.753969 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="984335f4-0d17-4391-947f-ad63864a588d" path="/var/lib/kubelet/pods/984335f4-0d17-4391-947f-ad63864a588d/volumes" Oct 08 20:00:22 crc kubenswrapper[4750]: I1008 20:00:22.967984 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Oct 08 20:00:23 crc kubenswrapper[4750]: I1008 20:00:23.035159 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 20:00:23 crc kubenswrapper[4750]: I1008 20:00:23.435947 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Oct 08 20:00:23 crc kubenswrapper[4750]: I1008 20:00:23.957368 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee5a4198-42a1-4e36-bbe6-22b923bd2a98","Type":"ContainerStarted","Data":"f789430bd945a2029bab40d0c95b6b1d84c2ff363b495272f04201415c7d5ac1"} Oct 08 20:00:23 crc kubenswrapper[4750]: I1008 20:00:23.957809 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee5a4198-42a1-4e36-bbe6-22b923bd2a98","Type":"ContainerStarted","Data":"ae2d579356df4e79a37ebbc3b2b96a9d3d693326be8f512ba497455cd8e13a04"} Oct 08 20:00:24 crc kubenswrapper[4750]: I1008 20:00:24.974028 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ee5a4198-42a1-4e36-bbe6-22b923bd2a98","Type":"ContainerStarted","Data":"c07e31550ed966448bb3e6fdd3a45f1d6b9fa286d035d564525834578d034681"} Oct 08 20:00:25 crc kubenswrapper[4750]: I1008 20:00:25.990292 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee5a4198-42a1-4e36-bbe6-22b923bd2a98","Type":"ContainerStarted","Data":"9e12914ccb11b79671ab7fb8d1694848447fd670e847d773f6973aa46270606c"} Oct 08 20:00:26 crc kubenswrapper[4750]: I1008 20:00:26.735118 4750 scope.go:117] "RemoveContainer" containerID="166b04b6c456f116d6b504a63f511a318627a6cec196a1300ad0582a72b88cf9" Oct 08 20:00:26 crc kubenswrapper[4750]: E1008 20:00:26.735870 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 20:00:29 crc kubenswrapper[4750]: I1008 20:00:29.043980 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee5a4198-42a1-4e36-bbe6-22b923bd2a98","Type":"ContainerStarted","Data":"1027afc16551697b5392b2619cb5bd4a2a6086d0cdd2c6ae8c4c396d4d6a3653"} Oct 08 20:00:29 crc kubenswrapper[4750]: I1008 20:00:29.045156 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 08 20:00:29 crc kubenswrapper[4750]: I1008 20:00:29.077830 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.097874378 podStartE2EDuration="8.07781151s" podCreationTimestamp="2025-10-08 20:00:21 +0000 UTC" firstStartedPulling="2025-10-08 20:00:23.057502959 +0000 UTC m=+6578.970473972" lastFinishedPulling="2025-10-08 20:00:28.037440091 
+0000 UTC m=+6583.950411104" observedRunningTime="2025-10-08 20:00:29.072191821 +0000 UTC m=+6584.985162834" watchObservedRunningTime="2025-10-08 20:00:29.07781151 +0000 UTC m=+6584.990782523" Oct 08 20:00:29 crc kubenswrapper[4750]: I1008 20:00:29.425876 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-54bn2" Oct 08 20:00:29 crc kubenswrapper[4750]: I1008 20:00:29.428624 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-54bn2" Oct 08 20:00:30 crc kubenswrapper[4750]: I1008 20:00:30.495927 4750 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-54bn2" podUID="f8d9aac5-c222-4925-9b93-08aecfd610fa" containerName="registry-server" probeResult="failure" output=< Oct 08 20:00:30 crc kubenswrapper[4750]: timeout: failed to connect service ":50051" within 1s Oct 08 20:00:30 crc kubenswrapper[4750]: > Oct 08 20:00:33 crc kubenswrapper[4750]: I1008 20:00:33.158339 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Oct 08 20:00:38 crc kubenswrapper[4750]: I1008 20:00:38.734815 4750 scope.go:117] "RemoveContainer" containerID="166b04b6c456f116d6b504a63f511a318627a6cec196a1300ad0582a72b88cf9" Oct 08 20:00:38 crc kubenswrapper[4750]: E1008 20:00:38.738751 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 20:00:40 crc kubenswrapper[4750]: I1008 20:00:40.488219 4750 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-54bn2" 
podUID="f8d9aac5-c222-4925-9b93-08aecfd610fa" containerName="registry-server" probeResult="failure" output=< Oct 08 20:00:40 crc kubenswrapper[4750]: timeout: failed to connect service ":50051" within 1s Oct 08 20:00:40 crc kubenswrapper[4750]: > Oct 08 20:00:49 crc kubenswrapper[4750]: I1008 20:00:49.516660 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-54bn2" Oct 08 20:00:49 crc kubenswrapper[4750]: I1008 20:00:49.580343 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-54bn2" Oct 08 20:00:49 crc kubenswrapper[4750]: I1008 20:00:49.734633 4750 scope.go:117] "RemoveContainer" containerID="166b04b6c456f116d6b504a63f511a318627a6cec196a1300ad0582a72b88cf9" Oct 08 20:00:49 crc kubenswrapper[4750]: E1008 20:00:49.734970 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 20:00:50 crc kubenswrapper[4750]: I1008 20:00:50.710820 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-54bn2"] Oct 08 20:00:51 crc kubenswrapper[4750]: I1008 20:00:51.066504 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-create-lhk2f"] Oct 08 20:00:51 crc kubenswrapper[4750]: I1008 20:00:51.078052 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-create-lhk2f"] Oct 08 20:00:51 crc kubenswrapper[4750]: I1008 20:00:51.369463 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-54bn2" 
podUID="f8d9aac5-c222-4925-9b93-08aecfd610fa" containerName="registry-server" containerID="cri-o://382e0c0fc8dca80461503c0241db2f77ebcdde67c8ba1d2cf0dd50c03358e4d1" gracePeriod=2 Oct 08 20:00:52 crc kubenswrapper[4750]: I1008 20:00:52.003213 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-54bn2" Oct 08 20:00:52 crc kubenswrapper[4750]: I1008 20:00:52.109817 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8d9aac5-c222-4925-9b93-08aecfd610fa-utilities\") pod \"f8d9aac5-c222-4925-9b93-08aecfd610fa\" (UID: \"f8d9aac5-c222-4925-9b93-08aecfd610fa\") " Oct 08 20:00:52 crc kubenswrapper[4750]: I1008 20:00:52.110004 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78w6k\" (UniqueName: \"kubernetes.io/projected/f8d9aac5-c222-4925-9b93-08aecfd610fa-kube-api-access-78w6k\") pod \"f8d9aac5-c222-4925-9b93-08aecfd610fa\" (UID: \"f8d9aac5-c222-4925-9b93-08aecfd610fa\") " Oct 08 20:00:52 crc kubenswrapper[4750]: I1008 20:00:52.110131 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8d9aac5-c222-4925-9b93-08aecfd610fa-catalog-content\") pod \"f8d9aac5-c222-4925-9b93-08aecfd610fa\" (UID: \"f8d9aac5-c222-4925-9b93-08aecfd610fa\") " Oct 08 20:00:52 crc kubenswrapper[4750]: I1008 20:00:52.110671 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8d9aac5-c222-4925-9b93-08aecfd610fa-utilities" (OuterVolumeSpecName: "utilities") pod "f8d9aac5-c222-4925-9b93-08aecfd610fa" (UID: "f8d9aac5-c222-4925-9b93-08aecfd610fa"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:00:52 crc kubenswrapper[4750]: I1008 20:00:52.111458 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8d9aac5-c222-4925-9b93-08aecfd610fa-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 20:00:52 crc kubenswrapper[4750]: I1008 20:00:52.121857 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8d9aac5-c222-4925-9b93-08aecfd610fa-kube-api-access-78w6k" (OuterVolumeSpecName: "kube-api-access-78w6k") pod "f8d9aac5-c222-4925-9b93-08aecfd610fa" (UID: "f8d9aac5-c222-4925-9b93-08aecfd610fa"). InnerVolumeSpecName "kube-api-access-78w6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:00:52 crc kubenswrapper[4750]: I1008 20:00:52.203578 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8d9aac5-c222-4925-9b93-08aecfd610fa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f8d9aac5-c222-4925-9b93-08aecfd610fa" (UID: "f8d9aac5-c222-4925-9b93-08aecfd610fa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:00:52 crc kubenswrapper[4750]: I1008 20:00:52.214608 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78w6k\" (UniqueName: \"kubernetes.io/projected/f8d9aac5-c222-4925-9b93-08aecfd610fa-kube-api-access-78w6k\") on node \"crc\" DevicePath \"\"" Oct 08 20:00:52 crc kubenswrapper[4750]: I1008 20:00:52.214655 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8d9aac5-c222-4925-9b93-08aecfd610fa-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 20:00:52 crc kubenswrapper[4750]: I1008 20:00:52.424303 4750 generic.go:334] "Generic (PLEG): container finished" podID="f8d9aac5-c222-4925-9b93-08aecfd610fa" containerID="382e0c0fc8dca80461503c0241db2f77ebcdde67c8ba1d2cf0dd50c03358e4d1" exitCode=0 Oct 08 20:00:52 crc kubenswrapper[4750]: I1008 20:00:52.424382 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-54bn2" event={"ID":"f8d9aac5-c222-4925-9b93-08aecfd610fa","Type":"ContainerDied","Data":"382e0c0fc8dca80461503c0241db2f77ebcdde67c8ba1d2cf0dd50c03358e4d1"} Oct 08 20:00:52 crc kubenswrapper[4750]: I1008 20:00:52.424483 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-54bn2" event={"ID":"f8d9aac5-c222-4925-9b93-08aecfd610fa","Type":"ContainerDied","Data":"395f16c4b23f1a2508333b9178ebc29c764a16d987692d08c800016c14b344b0"} Oct 08 20:00:52 crc kubenswrapper[4750]: I1008 20:00:52.424521 4750 scope.go:117] "RemoveContainer" containerID="382e0c0fc8dca80461503c0241db2f77ebcdde67c8ba1d2cf0dd50c03358e4d1" Oct 08 20:00:52 crc kubenswrapper[4750]: I1008 20:00:52.424535 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-54bn2" Oct 08 20:00:52 crc kubenswrapper[4750]: I1008 20:00:52.432233 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 08 20:00:52 crc kubenswrapper[4750]: I1008 20:00:52.480149 4750 scope.go:117] "RemoveContainer" containerID="0dcfce2a2c26d0b9fa3e1047249ebb00032b51c74358bc862fe1002ecc499ebf" Oct 08 20:00:52 crc kubenswrapper[4750]: I1008 20:00:52.514582 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-54bn2"] Oct 08 20:00:52 crc kubenswrapper[4750]: I1008 20:00:52.527631 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-54bn2"] Oct 08 20:00:52 crc kubenswrapper[4750]: I1008 20:00:52.544868 4750 scope.go:117] "RemoveContainer" containerID="a22c6bdb8e3760dd3e4aa9ab4ae630eaa64405908b7b5b89501fd2aace23d544" Oct 08 20:00:52 crc kubenswrapper[4750]: I1008 20:00:52.570188 4750 scope.go:117] "RemoveContainer" containerID="382e0c0fc8dca80461503c0241db2f77ebcdde67c8ba1d2cf0dd50c03358e4d1" Oct 08 20:00:52 crc kubenswrapper[4750]: E1008 20:00:52.570739 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"382e0c0fc8dca80461503c0241db2f77ebcdde67c8ba1d2cf0dd50c03358e4d1\": container with ID starting with 382e0c0fc8dca80461503c0241db2f77ebcdde67c8ba1d2cf0dd50c03358e4d1 not found: ID does not exist" containerID="382e0c0fc8dca80461503c0241db2f77ebcdde67c8ba1d2cf0dd50c03358e4d1" Oct 08 20:00:52 crc kubenswrapper[4750]: I1008 20:00:52.570804 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"382e0c0fc8dca80461503c0241db2f77ebcdde67c8ba1d2cf0dd50c03358e4d1"} err="failed to get container status \"382e0c0fc8dca80461503c0241db2f77ebcdde67c8ba1d2cf0dd50c03358e4d1\": rpc error: code = NotFound desc = could not find container 
\"382e0c0fc8dca80461503c0241db2f77ebcdde67c8ba1d2cf0dd50c03358e4d1\": container with ID starting with 382e0c0fc8dca80461503c0241db2f77ebcdde67c8ba1d2cf0dd50c03358e4d1 not found: ID does not exist" Oct 08 20:00:52 crc kubenswrapper[4750]: I1008 20:00:52.570846 4750 scope.go:117] "RemoveContainer" containerID="0dcfce2a2c26d0b9fa3e1047249ebb00032b51c74358bc862fe1002ecc499ebf" Oct 08 20:00:52 crc kubenswrapper[4750]: E1008 20:00:52.571222 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dcfce2a2c26d0b9fa3e1047249ebb00032b51c74358bc862fe1002ecc499ebf\": container with ID starting with 0dcfce2a2c26d0b9fa3e1047249ebb00032b51c74358bc862fe1002ecc499ebf not found: ID does not exist" containerID="0dcfce2a2c26d0b9fa3e1047249ebb00032b51c74358bc862fe1002ecc499ebf" Oct 08 20:00:52 crc kubenswrapper[4750]: I1008 20:00:52.571252 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dcfce2a2c26d0b9fa3e1047249ebb00032b51c74358bc862fe1002ecc499ebf"} err="failed to get container status \"0dcfce2a2c26d0b9fa3e1047249ebb00032b51c74358bc862fe1002ecc499ebf\": rpc error: code = NotFound desc = could not find container \"0dcfce2a2c26d0b9fa3e1047249ebb00032b51c74358bc862fe1002ecc499ebf\": container with ID starting with 0dcfce2a2c26d0b9fa3e1047249ebb00032b51c74358bc862fe1002ecc499ebf not found: ID does not exist" Oct 08 20:00:52 crc kubenswrapper[4750]: I1008 20:00:52.571271 4750 scope.go:117] "RemoveContainer" containerID="a22c6bdb8e3760dd3e4aa9ab4ae630eaa64405908b7b5b89501fd2aace23d544" Oct 08 20:00:52 crc kubenswrapper[4750]: E1008 20:00:52.571867 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a22c6bdb8e3760dd3e4aa9ab4ae630eaa64405908b7b5b89501fd2aace23d544\": container with ID starting with a22c6bdb8e3760dd3e4aa9ab4ae630eaa64405908b7b5b89501fd2aace23d544 not found: ID does not exist" 
containerID="a22c6bdb8e3760dd3e4aa9ab4ae630eaa64405908b7b5b89501fd2aace23d544" Oct 08 20:00:52 crc kubenswrapper[4750]: I1008 20:00:52.571900 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a22c6bdb8e3760dd3e4aa9ab4ae630eaa64405908b7b5b89501fd2aace23d544"} err="failed to get container status \"a22c6bdb8e3760dd3e4aa9ab4ae630eaa64405908b7b5b89501fd2aace23d544\": rpc error: code = NotFound desc = could not find container \"a22c6bdb8e3760dd3e4aa9ab4ae630eaa64405908b7b5b89501fd2aace23d544\": container with ID starting with a22c6bdb8e3760dd3e4aa9ab4ae630eaa64405908b7b5b89501fd2aace23d544 not found: ID does not exist" Oct 08 20:00:52 crc kubenswrapper[4750]: I1008 20:00:52.749184 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90b6efbc-634f-4414-ac4f-dbd04a75c31e" path="/var/lib/kubelet/pods/90b6efbc-634f-4414-ac4f-dbd04a75c31e/volumes" Oct 08 20:00:52 crc kubenswrapper[4750]: I1008 20:00:52.749895 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8d9aac5-c222-4925-9b93-08aecfd610fa" path="/var/lib/kubelet/pods/f8d9aac5-c222-4925-9b93-08aecfd610fa/volumes" Oct 08 20:01:00 crc kubenswrapper[4750]: I1008 20:01:00.187440 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29332561-9dcgq"] Oct 08 20:01:00 crc kubenswrapper[4750]: E1008 20:01:00.189254 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8d9aac5-c222-4925-9b93-08aecfd610fa" containerName="extract-content" Oct 08 20:01:00 crc kubenswrapper[4750]: I1008 20:01:00.189280 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8d9aac5-c222-4925-9b93-08aecfd610fa" containerName="extract-content" Oct 08 20:01:00 crc kubenswrapper[4750]: E1008 20:01:00.189352 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8d9aac5-c222-4925-9b93-08aecfd610fa" containerName="extract-utilities" Oct 08 20:01:00 crc kubenswrapper[4750]: I1008 20:01:00.189397 
4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8d9aac5-c222-4925-9b93-08aecfd610fa" containerName="extract-utilities" Oct 08 20:01:00 crc kubenswrapper[4750]: E1008 20:01:00.189426 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8d9aac5-c222-4925-9b93-08aecfd610fa" containerName="registry-server" Oct 08 20:01:00 crc kubenswrapper[4750]: I1008 20:01:00.189437 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8d9aac5-c222-4925-9b93-08aecfd610fa" containerName="registry-server" Oct 08 20:01:00 crc kubenswrapper[4750]: I1008 20:01:00.189854 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8d9aac5-c222-4925-9b93-08aecfd610fa" containerName="registry-server" Oct 08 20:01:00 crc kubenswrapper[4750]: I1008 20:01:00.191357 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29332561-9dcgq" Oct 08 20:01:00 crc kubenswrapper[4750]: I1008 20:01:00.204031 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29332561-9dcgq"] Oct 08 20:01:00 crc kubenswrapper[4750]: I1008 20:01:00.240757 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03421683-cf9d-4dcd-ba62-cbe1da2dca16-config-data\") pod \"keystone-cron-29332561-9dcgq\" (UID: \"03421683-cf9d-4dcd-ba62-cbe1da2dca16\") " pod="openstack/keystone-cron-29332561-9dcgq" Oct 08 20:01:00 crc kubenswrapper[4750]: I1008 20:01:00.240823 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03421683-cf9d-4dcd-ba62-cbe1da2dca16-combined-ca-bundle\") pod \"keystone-cron-29332561-9dcgq\" (UID: \"03421683-cf9d-4dcd-ba62-cbe1da2dca16\") " pod="openstack/keystone-cron-29332561-9dcgq" Oct 08 20:01:00 crc kubenswrapper[4750]: I1008 20:01:00.241017 4750 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxr52\" (UniqueName: \"kubernetes.io/projected/03421683-cf9d-4dcd-ba62-cbe1da2dca16-kube-api-access-jxr52\") pod \"keystone-cron-29332561-9dcgq\" (UID: \"03421683-cf9d-4dcd-ba62-cbe1da2dca16\") " pod="openstack/keystone-cron-29332561-9dcgq" Oct 08 20:01:00 crc kubenswrapper[4750]: I1008 20:01:00.241089 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/03421683-cf9d-4dcd-ba62-cbe1da2dca16-fernet-keys\") pod \"keystone-cron-29332561-9dcgq\" (UID: \"03421683-cf9d-4dcd-ba62-cbe1da2dca16\") " pod="openstack/keystone-cron-29332561-9dcgq" Oct 08 20:01:00 crc kubenswrapper[4750]: I1008 20:01:00.344028 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/03421683-cf9d-4dcd-ba62-cbe1da2dca16-fernet-keys\") pod \"keystone-cron-29332561-9dcgq\" (UID: \"03421683-cf9d-4dcd-ba62-cbe1da2dca16\") " pod="openstack/keystone-cron-29332561-9dcgq" Oct 08 20:01:00 crc kubenswrapper[4750]: I1008 20:01:00.344150 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03421683-cf9d-4dcd-ba62-cbe1da2dca16-config-data\") pod \"keystone-cron-29332561-9dcgq\" (UID: \"03421683-cf9d-4dcd-ba62-cbe1da2dca16\") " pod="openstack/keystone-cron-29332561-9dcgq" Oct 08 20:01:00 crc kubenswrapper[4750]: I1008 20:01:00.344209 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03421683-cf9d-4dcd-ba62-cbe1da2dca16-combined-ca-bundle\") pod \"keystone-cron-29332561-9dcgq\" (UID: \"03421683-cf9d-4dcd-ba62-cbe1da2dca16\") " pod="openstack/keystone-cron-29332561-9dcgq" Oct 08 20:01:00 crc kubenswrapper[4750]: I1008 20:01:00.344409 4750 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jxr52\" (UniqueName: \"kubernetes.io/projected/03421683-cf9d-4dcd-ba62-cbe1da2dca16-kube-api-access-jxr52\") pod \"keystone-cron-29332561-9dcgq\" (UID: \"03421683-cf9d-4dcd-ba62-cbe1da2dca16\") " pod="openstack/keystone-cron-29332561-9dcgq" Oct 08 20:01:00 crc kubenswrapper[4750]: I1008 20:01:00.353597 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03421683-cf9d-4dcd-ba62-cbe1da2dca16-config-data\") pod \"keystone-cron-29332561-9dcgq\" (UID: \"03421683-cf9d-4dcd-ba62-cbe1da2dca16\") " pod="openstack/keystone-cron-29332561-9dcgq" Oct 08 20:01:00 crc kubenswrapper[4750]: I1008 20:01:00.353615 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/03421683-cf9d-4dcd-ba62-cbe1da2dca16-fernet-keys\") pod \"keystone-cron-29332561-9dcgq\" (UID: \"03421683-cf9d-4dcd-ba62-cbe1da2dca16\") " pod="openstack/keystone-cron-29332561-9dcgq" Oct 08 20:01:00 crc kubenswrapper[4750]: I1008 20:01:00.353973 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03421683-cf9d-4dcd-ba62-cbe1da2dca16-combined-ca-bundle\") pod \"keystone-cron-29332561-9dcgq\" (UID: \"03421683-cf9d-4dcd-ba62-cbe1da2dca16\") " pod="openstack/keystone-cron-29332561-9dcgq" Oct 08 20:01:00 crc kubenswrapper[4750]: I1008 20:01:00.367269 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxr52\" (UniqueName: \"kubernetes.io/projected/03421683-cf9d-4dcd-ba62-cbe1da2dca16-kube-api-access-jxr52\") pod \"keystone-cron-29332561-9dcgq\" (UID: \"03421683-cf9d-4dcd-ba62-cbe1da2dca16\") " pod="openstack/keystone-cron-29332561-9dcgq" Oct 08 20:01:00 crc kubenswrapper[4750]: I1008 20:01:00.529717 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29332561-9dcgq" Oct 08 20:01:01 crc kubenswrapper[4750]: I1008 20:01:01.062025 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29332561-9dcgq"] Oct 08 20:01:01 crc kubenswrapper[4750]: I1008 20:01:01.540289 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29332561-9dcgq" event={"ID":"03421683-cf9d-4dcd-ba62-cbe1da2dca16","Type":"ContainerStarted","Data":"9c67c09935358aa4cd5d853160c6c26fab16aaa17643a02532bafac72f31f1ee"} Oct 08 20:01:01 crc kubenswrapper[4750]: I1008 20:01:01.540846 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29332561-9dcgq" event={"ID":"03421683-cf9d-4dcd-ba62-cbe1da2dca16","Type":"ContainerStarted","Data":"c48980e2ab08c55fd2f730c7959d116ad4a5242fc95b37808212772ec3f229ac"} Oct 08 20:01:01 crc kubenswrapper[4750]: I1008 20:01:01.568304 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29332561-9dcgq" podStartSLOduration=1.568277795 podStartE2EDuration="1.568277795s" podCreationTimestamp="2025-10-08 20:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 20:01:01.561775235 +0000 UTC m=+6617.474746278" watchObservedRunningTime="2025-10-08 20:01:01.568277795 +0000 UTC m=+6617.481248828" Oct 08 20:01:02 crc kubenswrapper[4750]: I1008 20:01:02.060925 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-9f34-account-create-l8vfk"] Oct 08 20:01:02 crc kubenswrapper[4750]: I1008 20:01:02.072107 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-9f34-account-create-l8vfk"] Oct 08 20:01:02 crc kubenswrapper[4750]: I1008 20:01:02.740252 4750 scope.go:117] "RemoveContainer" containerID="166b04b6c456f116d6b504a63f511a318627a6cec196a1300ad0582a72b88cf9" Oct 08 20:01:02 crc kubenswrapper[4750]: E1008 
20:01:02.741057 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 20:01:02 crc kubenswrapper[4750]: I1008 20:01:02.757464 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00a4a453-d3de-47f6-84e7-176daf84a18a" path="/var/lib/kubelet/pods/00a4a453-d3de-47f6-84e7-176daf84a18a/volumes" Oct 08 20:01:04 crc kubenswrapper[4750]: I1008 20:01:04.615129 4750 generic.go:334] "Generic (PLEG): container finished" podID="03421683-cf9d-4dcd-ba62-cbe1da2dca16" containerID="9c67c09935358aa4cd5d853160c6c26fab16aaa17643a02532bafac72f31f1ee" exitCode=0 Oct 08 20:01:04 crc kubenswrapper[4750]: I1008 20:01:04.615204 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29332561-9dcgq" event={"ID":"03421683-cf9d-4dcd-ba62-cbe1da2dca16","Type":"ContainerDied","Data":"9c67c09935358aa4cd5d853160c6c26fab16aaa17643a02532bafac72f31f1ee"} Oct 08 20:01:06 crc kubenswrapper[4750]: I1008 20:01:06.080229 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29332561-9dcgq" Oct 08 20:01:06 crc kubenswrapper[4750]: I1008 20:01:06.101244 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/03421683-cf9d-4dcd-ba62-cbe1da2dca16-fernet-keys\") pod \"03421683-cf9d-4dcd-ba62-cbe1da2dca16\" (UID: \"03421683-cf9d-4dcd-ba62-cbe1da2dca16\") " Oct 08 20:01:06 crc kubenswrapper[4750]: I1008 20:01:06.101332 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxr52\" (UniqueName: \"kubernetes.io/projected/03421683-cf9d-4dcd-ba62-cbe1da2dca16-kube-api-access-jxr52\") pod \"03421683-cf9d-4dcd-ba62-cbe1da2dca16\" (UID: \"03421683-cf9d-4dcd-ba62-cbe1da2dca16\") " Oct 08 20:01:06 crc kubenswrapper[4750]: I1008 20:01:06.101411 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03421683-cf9d-4dcd-ba62-cbe1da2dca16-config-data\") pod \"03421683-cf9d-4dcd-ba62-cbe1da2dca16\" (UID: \"03421683-cf9d-4dcd-ba62-cbe1da2dca16\") " Oct 08 20:01:06 crc kubenswrapper[4750]: I1008 20:01:06.101579 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03421683-cf9d-4dcd-ba62-cbe1da2dca16-combined-ca-bundle\") pod \"03421683-cf9d-4dcd-ba62-cbe1da2dca16\" (UID: \"03421683-cf9d-4dcd-ba62-cbe1da2dca16\") " Oct 08 20:01:06 crc kubenswrapper[4750]: I1008 20:01:06.111085 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03421683-cf9d-4dcd-ba62-cbe1da2dca16-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "03421683-cf9d-4dcd-ba62-cbe1da2dca16" (UID: "03421683-cf9d-4dcd-ba62-cbe1da2dca16"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:01:06 crc kubenswrapper[4750]: I1008 20:01:06.112927 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03421683-cf9d-4dcd-ba62-cbe1da2dca16-kube-api-access-jxr52" (OuterVolumeSpecName: "kube-api-access-jxr52") pod "03421683-cf9d-4dcd-ba62-cbe1da2dca16" (UID: "03421683-cf9d-4dcd-ba62-cbe1da2dca16"). InnerVolumeSpecName "kube-api-access-jxr52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:01:06 crc kubenswrapper[4750]: I1008 20:01:06.154039 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03421683-cf9d-4dcd-ba62-cbe1da2dca16-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "03421683-cf9d-4dcd-ba62-cbe1da2dca16" (UID: "03421683-cf9d-4dcd-ba62-cbe1da2dca16"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:01:06 crc kubenswrapper[4750]: I1008 20:01:06.164114 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03421683-cf9d-4dcd-ba62-cbe1da2dca16-config-data" (OuterVolumeSpecName: "config-data") pod "03421683-cf9d-4dcd-ba62-cbe1da2dca16" (UID: "03421683-cf9d-4dcd-ba62-cbe1da2dca16"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 20:01:06 crc kubenswrapper[4750]: I1008 20:01:06.204153 4750 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03421683-cf9d-4dcd-ba62-cbe1da2dca16-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 20:01:06 crc kubenswrapper[4750]: I1008 20:01:06.204183 4750 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03421683-cf9d-4dcd-ba62-cbe1da2dca16-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 20:01:06 crc kubenswrapper[4750]: I1008 20:01:06.204196 4750 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/03421683-cf9d-4dcd-ba62-cbe1da2dca16-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 08 20:01:06 crc kubenswrapper[4750]: I1008 20:01:06.204208 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxr52\" (UniqueName: \"kubernetes.io/projected/03421683-cf9d-4dcd-ba62-cbe1da2dca16-kube-api-access-jxr52\") on node \"crc\" DevicePath \"\"" Oct 08 20:01:06 crc kubenswrapper[4750]: I1008 20:01:06.641417 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29332561-9dcgq" event={"ID":"03421683-cf9d-4dcd-ba62-cbe1da2dca16","Type":"ContainerDied","Data":"c48980e2ab08c55fd2f730c7959d116ad4a5242fc95b37808212772ec3f229ac"} Oct 08 20:01:06 crc kubenswrapper[4750]: I1008 20:01:06.641884 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c48980e2ab08c55fd2f730c7959d116ad4a5242fc95b37808212772ec3f229ac" Oct 08 20:01:06 crc kubenswrapper[4750]: I1008 20:01:06.641510 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29332561-9dcgq" Oct 08 20:01:09 crc kubenswrapper[4750]: I1008 20:01:09.034858 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-persistence-db-create-jnvxp"] Oct 08 20:01:09 crc kubenswrapper[4750]: I1008 20:01:09.046768 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-persistence-db-create-jnvxp"] Oct 08 20:01:10 crc kubenswrapper[4750]: I1008 20:01:10.749909 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d531061-ecfd-4e0b-b703-4354517d7cec" path="/var/lib/kubelet/pods/1d531061-ecfd-4e0b-b703-4354517d7cec/volumes" Oct 08 20:01:12 crc kubenswrapper[4750]: I1008 20:01:12.277161 4750 scope.go:117] "RemoveContainer" containerID="c6ea7bf1e99e5d9a28419c90530b17bcce11497286fda1e5026545664bc288fd" Oct 08 20:01:12 crc kubenswrapper[4750]: I1008 20:01:12.337931 4750 scope.go:117] "RemoveContainer" containerID="ceecb5fabfb571fbbeded2c97cc653ed3571f4783fa7da8ab8dd63a1fddb003f" Oct 08 20:01:12 crc kubenswrapper[4750]: I1008 20:01:12.622625 4750 scope.go:117] "RemoveContainer" containerID="d93d5c7d187b7a4aa045a71dbb2e45491d77cb51aa51eec6d30c58e4e64ff9b2" Oct 08 20:01:12 crc kubenswrapper[4750]: I1008 20:01:12.675731 4750 scope.go:117] "RemoveContainer" containerID="e49432ff7a9987b4e272e261789289b911e409afb19f4a12c4e8c3016d46be67" Oct 08 20:01:12 crc kubenswrapper[4750]: I1008 20:01:12.756350 4750 scope.go:117] "RemoveContainer" containerID="87ed84e207c8ee0176b27ca029277a7c433695fb11ba21ab9909d99f324072bb" Oct 08 20:01:14 crc kubenswrapper[4750]: I1008 20:01:14.753308 4750 scope.go:117] "RemoveContainer" containerID="166b04b6c456f116d6b504a63f511a318627a6cec196a1300ad0582a72b88cf9" Oct 08 20:01:14 crc kubenswrapper[4750]: E1008 20:01:14.754582 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 20:01:20 crc kubenswrapper[4750]: I1008 20:01:20.051936 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-cf34-account-create-nbmf6"] Oct 08 20:01:20 crc kubenswrapper[4750]: I1008 20:01:20.068782 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-cf34-account-create-nbmf6"] Oct 08 20:01:20 crc kubenswrapper[4750]: I1008 20:01:20.749531 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c54a1b6-2888-48e7-86d3-7e25103e7a6e" path="/var/lib/kubelet/pods/7c54a1b6-2888-48e7-86d3-7e25103e7a6e/volumes" Oct 08 20:01:25 crc kubenswrapper[4750]: I1008 20:01:25.735625 4750 scope.go:117] "RemoveContainer" containerID="166b04b6c456f116d6b504a63f511a318627a6cec196a1300ad0582a72b88cf9" Oct 08 20:01:25 crc kubenswrapper[4750]: E1008 20:01:25.736940 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 20:01:40 crc kubenswrapper[4750]: I1008 20:01:40.735751 4750 scope.go:117] "RemoveContainer" containerID="166b04b6c456f116d6b504a63f511a318627a6cec196a1300ad0582a72b88cf9" Oct 08 20:01:40 crc kubenswrapper[4750]: E1008 20:01:40.736884 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 20:01:55 crc kubenswrapper[4750]: I1008 20:01:55.737117 4750 scope.go:117] "RemoveContainer" containerID="166b04b6c456f116d6b504a63f511a318627a6cec196a1300ad0582a72b88cf9" Oct 08 20:01:55 crc kubenswrapper[4750]: E1008 20:01:55.738235 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 20:02:05 crc kubenswrapper[4750]: I1008 20:02:05.049577 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-sync-gj8s8"] Oct 08 20:02:05 crc kubenswrapper[4750]: I1008 20:02:05.071282 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-sync-gj8s8"] Oct 08 20:02:06 crc kubenswrapper[4750]: I1008 20:02:06.750479 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="707bf9bd-f37a-4af7-b6b0-e661956d4945" path="/var/lib/kubelet/pods/707bf9bd-f37a-4af7-b6b0-e661956d4945/volumes" Oct 08 20:02:07 crc kubenswrapper[4750]: I1008 20:02:07.735612 4750 scope.go:117] "RemoveContainer" containerID="166b04b6c456f116d6b504a63f511a318627a6cec196a1300ad0582a72b88cf9" Oct 08 20:02:07 crc kubenswrapper[4750]: E1008 20:02:07.736437 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 20:02:13 crc kubenswrapper[4750]: I1008 20:02:13.012500 4750 scope.go:117] "RemoveContainer" containerID="f784d98fd6552e2a9d4b038c68601cfacf68577ae7977139543ace5cc72c8065" Oct 08 20:02:13 crc kubenswrapper[4750]: I1008 20:02:13.070716 4750 scope.go:117] "RemoveContainer" containerID="6b3c4c24f9072637cf10b61ba6d51475516c22664627b10151f4ffc18dab5dce" Oct 08 20:02:13 crc kubenswrapper[4750]: I1008 20:02:13.127448 4750 scope.go:117] "RemoveContainer" containerID="3cb41901116cdcdba3834086a11522d7fbbcffc8d56127e42dbf38ee4bf50e09" Oct 08 20:02:20 crc kubenswrapper[4750]: I1008 20:02:20.734323 4750 scope.go:117] "RemoveContainer" containerID="166b04b6c456f116d6b504a63f511a318627a6cec196a1300ad0582a72b88cf9" Oct 08 20:02:20 crc kubenswrapper[4750]: E1008 20:02:20.735689 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 20:02:31 crc kubenswrapper[4750]: I1008 20:02:31.734263 4750 scope.go:117] "RemoveContainer" containerID="166b04b6c456f116d6b504a63f511a318627a6cec196a1300ad0582a72b88cf9" Oct 08 20:02:31 crc kubenswrapper[4750]: E1008 20:02:31.735208 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 20:02:42 crc kubenswrapper[4750]: I1008 20:02:42.740180 4750 scope.go:117] "RemoveContainer" containerID="166b04b6c456f116d6b504a63f511a318627a6cec196a1300ad0582a72b88cf9" Oct 08 20:02:42 crc kubenswrapper[4750]: E1008 20:02:42.741263 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 20:02:54 crc kubenswrapper[4750]: I1008 20:02:54.743393 4750 scope.go:117] "RemoveContainer" containerID="166b04b6c456f116d6b504a63f511a318627a6cec196a1300ad0582a72b88cf9" Oct 08 20:02:54 crc kubenswrapper[4750]: E1008 20:02:54.744513 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 20:03:09 crc kubenswrapper[4750]: I1008 20:03:09.734686 4750 scope.go:117] "RemoveContainer" containerID="166b04b6c456f116d6b504a63f511a318627a6cec196a1300ad0582a72b88cf9" Oct 08 20:03:09 crc kubenswrapper[4750]: E1008 20:03:09.735779 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 20:03:23 crc kubenswrapper[4750]: I1008 20:03:23.734931 4750 scope.go:117] "RemoveContainer" containerID="166b04b6c456f116d6b504a63f511a318627a6cec196a1300ad0582a72b88cf9" Oct 08 20:03:23 crc kubenswrapper[4750]: E1008 20:03:23.736528 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 20:03:34 crc kubenswrapper[4750]: I1008 20:03:34.832807 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-k48jf"] Oct 08 20:03:34 crc kubenswrapper[4750]: E1008 20:03:34.835535 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03421683-cf9d-4dcd-ba62-cbe1da2dca16" containerName="keystone-cron" Oct 08 20:03:34 crc kubenswrapper[4750]: I1008 20:03:34.835664 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="03421683-cf9d-4dcd-ba62-cbe1da2dca16" containerName="keystone-cron" Oct 08 20:03:34 crc kubenswrapper[4750]: I1008 20:03:34.836016 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="03421683-cf9d-4dcd-ba62-cbe1da2dca16" containerName="keystone-cron" Oct 08 20:03:34 crc kubenswrapper[4750]: I1008 20:03:34.837911 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k48jf" Oct 08 20:03:34 crc kubenswrapper[4750]: I1008 20:03:34.881762 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k48jf"] Oct 08 20:03:34 crc kubenswrapper[4750]: I1008 20:03:34.935495 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg8mp\" (UniqueName: \"kubernetes.io/projected/c305aa42-ff27-4ec5-8ac1-ad44b1d385d0-kube-api-access-bg8mp\") pod \"certified-operators-k48jf\" (UID: \"c305aa42-ff27-4ec5-8ac1-ad44b1d385d0\") " pod="openshift-marketplace/certified-operators-k48jf" Oct 08 20:03:34 crc kubenswrapper[4750]: I1008 20:03:34.936057 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c305aa42-ff27-4ec5-8ac1-ad44b1d385d0-catalog-content\") pod \"certified-operators-k48jf\" (UID: \"c305aa42-ff27-4ec5-8ac1-ad44b1d385d0\") " pod="openshift-marketplace/certified-operators-k48jf" Oct 08 20:03:34 crc kubenswrapper[4750]: I1008 20:03:34.936238 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c305aa42-ff27-4ec5-8ac1-ad44b1d385d0-utilities\") pod \"certified-operators-k48jf\" (UID: \"c305aa42-ff27-4ec5-8ac1-ad44b1d385d0\") " pod="openshift-marketplace/certified-operators-k48jf" Oct 08 20:03:35 crc kubenswrapper[4750]: I1008 20:03:35.039260 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c305aa42-ff27-4ec5-8ac1-ad44b1d385d0-catalog-content\") pod \"certified-operators-k48jf\" (UID: \"c305aa42-ff27-4ec5-8ac1-ad44b1d385d0\") " pod="openshift-marketplace/certified-operators-k48jf" Oct 08 20:03:35 crc kubenswrapper[4750]: I1008 20:03:35.039430 4750 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c305aa42-ff27-4ec5-8ac1-ad44b1d385d0-utilities\") pod \"certified-operators-k48jf\" (UID: \"c305aa42-ff27-4ec5-8ac1-ad44b1d385d0\") " pod="openshift-marketplace/certified-operators-k48jf" Oct 08 20:03:35 crc kubenswrapper[4750]: I1008 20:03:35.039482 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bg8mp\" (UniqueName: \"kubernetes.io/projected/c305aa42-ff27-4ec5-8ac1-ad44b1d385d0-kube-api-access-bg8mp\") pod \"certified-operators-k48jf\" (UID: \"c305aa42-ff27-4ec5-8ac1-ad44b1d385d0\") " pod="openshift-marketplace/certified-operators-k48jf" Oct 08 20:03:35 crc kubenswrapper[4750]: I1008 20:03:35.040358 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c305aa42-ff27-4ec5-8ac1-ad44b1d385d0-catalog-content\") pod \"certified-operators-k48jf\" (UID: \"c305aa42-ff27-4ec5-8ac1-ad44b1d385d0\") " pod="openshift-marketplace/certified-operators-k48jf" Oct 08 20:03:35 crc kubenswrapper[4750]: I1008 20:03:35.040487 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c305aa42-ff27-4ec5-8ac1-ad44b1d385d0-utilities\") pod \"certified-operators-k48jf\" (UID: \"c305aa42-ff27-4ec5-8ac1-ad44b1d385d0\") " pod="openshift-marketplace/certified-operators-k48jf" Oct 08 20:03:35 crc kubenswrapper[4750]: I1008 20:03:35.071619 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg8mp\" (UniqueName: \"kubernetes.io/projected/c305aa42-ff27-4ec5-8ac1-ad44b1d385d0-kube-api-access-bg8mp\") pod \"certified-operators-k48jf\" (UID: \"c305aa42-ff27-4ec5-8ac1-ad44b1d385d0\") " pod="openshift-marketplace/certified-operators-k48jf" Oct 08 20:03:35 crc kubenswrapper[4750]: I1008 20:03:35.174921 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k48jf" Oct 08 20:03:35 crc kubenswrapper[4750]: I1008 20:03:35.701099 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k48jf"] Oct 08 20:03:35 crc kubenswrapper[4750]: W1008 20:03:35.707396 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc305aa42_ff27_4ec5_8ac1_ad44b1d385d0.slice/crio-395a9018de4b9ede576dd9d973862c25114251cf634932a9820d22cbcbd77b27 WatchSource:0}: Error finding container 395a9018de4b9ede576dd9d973862c25114251cf634932a9820d22cbcbd77b27: Status 404 returned error can't find the container with id 395a9018de4b9ede576dd9d973862c25114251cf634932a9820d22cbcbd77b27 Oct 08 20:03:36 crc kubenswrapper[4750]: I1008 20:03:36.704511 4750 generic.go:334] "Generic (PLEG): container finished" podID="c305aa42-ff27-4ec5-8ac1-ad44b1d385d0" containerID="8c5355f5e170cfb6a3ff3bc1d792510f35490cb41dd497f65f0f52c3d6dadc27" exitCode=0 Oct 08 20:03:36 crc kubenswrapper[4750]: I1008 20:03:36.704590 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k48jf" event={"ID":"c305aa42-ff27-4ec5-8ac1-ad44b1d385d0","Type":"ContainerDied","Data":"8c5355f5e170cfb6a3ff3bc1d792510f35490cb41dd497f65f0f52c3d6dadc27"} Oct 08 20:03:36 crc kubenswrapper[4750]: I1008 20:03:36.704916 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k48jf" event={"ID":"c305aa42-ff27-4ec5-8ac1-ad44b1d385d0","Type":"ContainerStarted","Data":"395a9018de4b9ede576dd9d973862c25114251cf634932a9820d22cbcbd77b27"} Oct 08 20:03:37 crc kubenswrapper[4750]: I1008 20:03:37.735242 4750 scope.go:117] "RemoveContainer" containerID="166b04b6c456f116d6b504a63f511a318627a6cec196a1300ad0582a72b88cf9" Oct 08 20:03:37 crc kubenswrapper[4750]: E1008 20:03:37.735934 4750 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 20:03:38 crc kubenswrapper[4750]: I1008 20:03:38.732255 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k48jf" event={"ID":"c305aa42-ff27-4ec5-8ac1-ad44b1d385d0","Type":"ContainerStarted","Data":"a2f3cc82922d98445e35622d740a7255c4df49436cb59d4b7773d99d84e41835"} Oct 08 20:03:39 crc kubenswrapper[4750]: I1008 20:03:39.752027 4750 generic.go:334] "Generic (PLEG): container finished" podID="c305aa42-ff27-4ec5-8ac1-ad44b1d385d0" containerID="a2f3cc82922d98445e35622d740a7255c4df49436cb59d4b7773d99d84e41835" exitCode=0 Oct 08 20:03:39 crc kubenswrapper[4750]: I1008 20:03:39.752150 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k48jf" event={"ID":"c305aa42-ff27-4ec5-8ac1-ad44b1d385d0","Type":"ContainerDied","Data":"a2f3cc82922d98445e35622d740a7255c4df49436cb59d4b7773d99d84e41835"} Oct 08 20:03:39 crc kubenswrapper[4750]: E1008 20:03:39.792262 4750 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc305aa42_ff27_4ec5_8ac1_ad44b1d385d0.slice/crio-conmon-a2f3cc82922d98445e35622d740a7255c4df49436cb59d4b7773d99d84e41835.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc305aa42_ff27_4ec5_8ac1_ad44b1d385d0.slice/crio-a2f3cc82922d98445e35622d740a7255c4df49436cb59d4b7773d99d84e41835.scope\": RecentStats: unable to find data in memory cache]" Oct 08 20:03:40 crc kubenswrapper[4750]: I1008 20:03:40.768455 
4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k48jf" event={"ID":"c305aa42-ff27-4ec5-8ac1-ad44b1d385d0","Type":"ContainerStarted","Data":"8a63e88b3bf5d88bb1712940ec7743fea64fd7b8ae0ef63b40d42d8c7852a2cb"} Oct 08 20:03:40 crc kubenswrapper[4750]: I1008 20:03:40.801331 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-k48jf" podStartSLOduration=3.233605167 podStartE2EDuration="6.801302617s" podCreationTimestamp="2025-10-08 20:03:34 +0000 UTC" firstStartedPulling="2025-10-08 20:03:36.707524026 +0000 UTC m=+6772.620495079" lastFinishedPulling="2025-10-08 20:03:40.275221486 +0000 UTC m=+6776.188192529" observedRunningTime="2025-10-08 20:03:40.793075413 +0000 UTC m=+6776.706046436" watchObservedRunningTime="2025-10-08 20:03:40.801302617 +0000 UTC m=+6776.714273630" Oct 08 20:03:45 crc kubenswrapper[4750]: I1008 20:03:45.175748 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-k48jf" Oct 08 20:03:45 crc kubenswrapper[4750]: I1008 20:03:45.176670 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-k48jf" Oct 08 20:03:45 crc kubenswrapper[4750]: I1008 20:03:45.273034 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-k48jf" Oct 08 20:03:45 crc kubenswrapper[4750]: I1008 20:03:45.916689 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-k48jf" Oct 08 20:03:45 crc kubenswrapper[4750]: I1008 20:03:45.988787 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k48jf"] Oct 08 20:03:47 crc kubenswrapper[4750]: I1008 20:03:47.872088 4750 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-k48jf" podUID="c305aa42-ff27-4ec5-8ac1-ad44b1d385d0" containerName="registry-server" containerID="cri-o://8a63e88b3bf5d88bb1712940ec7743fea64fd7b8ae0ef63b40d42d8c7852a2cb" gracePeriod=2 Oct 08 20:03:48 crc kubenswrapper[4750]: I1008 20:03:48.516565 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k48jf" Oct 08 20:03:48 crc kubenswrapper[4750]: I1008 20:03:48.627116 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bg8mp\" (UniqueName: \"kubernetes.io/projected/c305aa42-ff27-4ec5-8ac1-ad44b1d385d0-kube-api-access-bg8mp\") pod \"c305aa42-ff27-4ec5-8ac1-ad44b1d385d0\" (UID: \"c305aa42-ff27-4ec5-8ac1-ad44b1d385d0\") " Oct 08 20:03:48 crc kubenswrapper[4750]: I1008 20:03:48.627269 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c305aa42-ff27-4ec5-8ac1-ad44b1d385d0-catalog-content\") pod \"c305aa42-ff27-4ec5-8ac1-ad44b1d385d0\" (UID: \"c305aa42-ff27-4ec5-8ac1-ad44b1d385d0\") " Oct 08 20:03:48 crc kubenswrapper[4750]: I1008 20:03:48.627460 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c305aa42-ff27-4ec5-8ac1-ad44b1d385d0-utilities\") pod \"c305aa42-ff27-4ec5-8ac1-ad44b1d385d0\" (UID: \"c305aa42-ff27-4ec5-8ac1-ad44b1d385d0\") " Oct 08 20:03:48 crc kubenswrapper[4750]: I1008 20:03:48.628514 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c305aa42-ff27-4ec5-8ac1-ad44b1d385d0-utilities" (OuterVolumeSpecName: "utilities") pod "c305aa42-ff27-4ec5-8ac1-ad44b1d385d0" (UID: "c305aa42-ff27-4ec5-8ac1-ad44b1d385d0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:03:48 crc kubenswrapper[4750]: I1008 20:03:48.638244 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c305aa42-ff27-4ec5-8ac1-ad44b1d385d0-kube-api-access-bg8mp" (OuterVolumeSpecName: "kube-api-access-bg8mp") pod "c305aa42-ff27-4ec5-8ac1-ad44b1d385d0" (UID: "c305aa42-ff27-4ec5-8ac1-ad44b1d385d0"). InnerVolumeSpecName "kube-api-access-bg8mp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:03:48 crc kubenswrapper[4750]: I1008 20:03:48.692514 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c305aa42-ff27-4ec5-8ac1-ad44b1d385d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c305aa42-ff27-4ec5-8ac1-ad44b1d385d0" (UID: "c305aa42-ff27-4ec5-8ac1-ad44b1d385d0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:03:48 crc kubenswrapper[4750]: I1008 20:03:48.730720 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c305aa42-ff27-4ec5-8ac1-ad44b1d385d0-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 20:03:48 crc kubenswrapper[4750]: I1008 20:03:48.730783 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bg8mp\" (UniqueName: \"kubernetes.io/projected/c305aa42-ff27-4ec5-8ac1-ad44b1d385d0-kube-api-access-bg8mp\") on node \"crc\" DevicePath \"\"" Oct 08 20:03:48 crc kubenswrapper[4750]: I1008 20:03:48.730802 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c305aa42-ff27-4ec5-8ac1-ad44b1d385d0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 20:03:48 crc kubenswrapper[4750]: I1008 20:03:48.884806 4750 generic.go:334] "Generic (PLEG): container finished" podID="c305aa42-ff27-4ec5-8ac1-ad44b1d385d0" 
containerID="8a63e88b3bf5d88bb1712940ec7743fea64fd7b8ae0ef63b40d42d8c7852a2cb" exitCode=0 Oct 08 20:03:48 crc kubenswrapper[4750]: I1008 20:03:48.884858 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k48jf" event={"ID":"c305aa42-ff27-4ec5-8ac1-ad44b1d385d0","Type":"ContainerDied","Data":"8a63e88b3bf5d88bb1712940ec7743fea64fd7b8ae0ef63b40d42d8c7852a2cb"} Oct 08 20:03:48 crc kubenswrapper[4750]: I1008 20:03:48.884906 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k48jf" event={"ID":"c305aa42-ff27-4ec5-8ac1-ad44b1d385d0","Type":"ContainerDied","Data":"395a9018de4b9ede576dd9d973862c25114251cf634932a9820d22cbcbd77b27"} Oct 08 20:03:48 crc kubenswrapper[4750]: I1008 20:03:48.884899 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k48jf" Oct 08 20:03:48 crc kubenswrapper[4750]: I1008 20:03:48.884928 4750 scope.go:117] "RemoveContainer" containerID="8a63e88b3bf5d88bb1712940ec7743fea64fd7b8ae0ef63b40d42d8c7852a2cb" Oct 08 20:03:48 crc kubenswrapper[4750]: I1008 20:03:48.922823 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k48jf"] Oct 08 20:03:48 crc kubenswrapper[4750]: I1008 20:03:48.923496 4750 scope.go:117] "RemoveContainer" containerID="a2f3cc82922d98445e35622d740a7255c4df49436cb59d4b7773d99d84e41835" Oct 08 20:03:48 crc kubenswrapper[4750]: I1008 20:03:48.928856 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-k48jf"] Oct 08 20:03:48 crc kubenswrapper[4750]: I1008 20:03:48.951586 4750 scope.go:117] "RemoveContainer" containerID="8c5355f5e170cfb6a3ff3bc1d792510f35490cb41dd497f65f0f52c3d6dadc27" Oct 08 20:03:49 crc kubenswrapper[4750]: I1008 20:03:49.016345 4750 scope.go:117] "RemoveContainer" containerID="8a63e88b3bf5d88bb1712940ec7743fea64fd7b8ae0ef63b40d42d8c7852a2cb" Oct 08 
20:03:49 crc kubenswrapper[4750]: E1008 20:03:49.017154 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a63e88b3bf5d88bb1712940ec7743fea64fd7b8ae0ef63b40d42d8c7852a2cb\": container with ID starting with 8a63e88b3bf5d88bb1712940ec7743fea64fd7b8ae0ef63b40d42d8c7852a2cb not found: ID does not exist" containerID="8a63e88b3bf5d88bb1712940ec7743fea64fd7b8ae0ef63b40d42d8c7852a2cb" Oct 08 20:03:49 crc kubenswrapper[4750]: I1008 20:03:49.017190 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a63e88b3bf5d88bb1712940ec7743fea64fd7b8ae0ef63b40d42d8c7852a2cb"} err="failed to get container status \"8a63e88b3bf5d88bb1712940ec7743fea64fd7b8ae0ef63b40d42d8c7852a2cb\": rpc error: code = NotFound desc = could not find container \"8a63e88b3bf5d88bb1712940ec7743fea64fd7b8ae0ef63b40d42d8c7852a2cb\": container with ID starting with 8a63e88b3bf5d88bb1712940ec7743fea64fd7b8ae0ef63b40d42d8c7852a2cb not found: ID does not exist" Oct 08 20:03:49 crc kubenswrapper[4750]: I1008 20:03:49.017222 4750 scope.go:117] "RemoveContainer" containerID="a2f3cc82922d98445e35622d740a7255c4df49436cb59d4b7773d99d84e41835" Oct 08 20:03:49 crc kubenswrapper[4750]: E1008 20:03:49.017643 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2f3cc82922d98445e35622d740a7255c4df49436cb59d4b7773d99d84e41835\": container with ID starting with a2f3cc82922d98445e35622d740a7255c4df49436cb59d4b7773d99d84e41835 not found: ID does not exist" containerID="a2f3cc82922d98445e35622d740a7255c4df49436cb59d4b7773d99d84e41835" Oct 08 20:03:49 crc kubenswrapper[4750]: I1008 20:03:49.017662 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2f3cc82922d98445e35622d740a7255c4df49436cb59d4b7773d99d84e41835"} err="failed to get container status 
\"a2f3cc82922d98445e35622d740a7255c4df49436cb59d4b7773d99d84e41835\": rpc error: code = NotFound desc = could not find container \"a2f3cc82922d98445e35622d740a7255c4df49436cb59d4b7773d99d84e41835\": container with ID starting with a2f3cc82922d98445e35622d740a7255c4df49436cb59d4b7773d99d84e41835 not found: ID does not exist" Oct 08 20:03:49 crc kubenswrapper[4750]: I1008 20:03:49.017675 4750 scope.go:117] "RemoveContainer" containerID="8c5355f5e170cfb6a3ff3bc1d792510f35490cb41dd497f65f0f52c3d6dadc27" Oct 08 20:03:49 crc kubenswrapper[4750]: E1008 20:03:49.018307 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c5355f5e170cfb6a3ff3bc1d792510f35490cb41dd497f65f0f52c3d6dadc27\": container with ID starting with 8c5355f5e170cfb6a3ff3bc1d792510f35490cb41dd497f65f0f52c3d6dadc27 not found: ID does not exist" containerID="8c5355f5e170cfb6a3ff3bc1d792510f35490cb41dd497f65f0f52c3d6dadc27" Oct 08 20:03:49 crc kubenswrapper[4750]: I1008 20:03:49.018362 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c5355f5e170cfb6a3ff3bc1d792510f35490cb41dd497f65f0f52c3d6dadc27"} err="failed to get container status \"8c5355f5e170cfb6a3ff3bc1d792510f35490cb41dd497f65f0f52c3d6dadc27\": rpc error: code = NotFound desc = could not find container \"8c5355f5e170cfb6a3ff3bc1d792510f35490cb41dd497f65f0f52c3d6dadc27\": container with ID starting with 8c5355f5e170cfb6a3ff3bc1d792510f35490cb41dd497f65f0f52c3d6dadc27 not found: ID does not exist" Oct 08 20:03:50 crc kubenswrapper[4750]: I1008 20:03:50.764435 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c305aa42-ff27-4ec5-8ac1-ad44b1d385d0" path="/var/lib/kubelet/pods/c305aa42-ff27-4ec5-8ac1-ad44b1d385d0/volumes" Oct 08 20:03:52 crc kubenswrapper[4750]: I1008 20:03:52.736028 4750 scope.go:117] "RemoveContainer" containerID="166b04b6c456f116d6b504a63f511a318627a6cec196a1300ad0582a72b88cf9" Oct 08 
20:03:52 crc kubenswrapper[4750]: E1008 20:03:52.737032 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 20:03:54 crc kubenswrapper[4750]: I1008 20:03:54.586063 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7b8x6/must-gather-2jh45"] Oct 08 20:03:54 crc kubenswrapper[4750]: E1008 20:03:54.587133 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c305aa42-ff27-4ec5-8ac1-ad44b1d385d0" containerName="extract-utilities" Oct 08 20:03:54 crc kubenswrapper[4750]: I1008 20:03:54.587148 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="c305aa42-ff27-4ec5-8ac1-ad44b1d385d0" containerName="extract-utilities" Oct 08 20:03:54 crc kubenswrapper[4750]: E1008 20:03:54.587176 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c305aa42-ff27-4ec5-8ac1-ad44b1d385d0" containerName="registry-server" Oct 08 20:03:54 crc kubenswrapper[4750]: I1008 20:03:54.587183 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="c305aa42-ff27-4ec5-8ac1-ad44b1d385d0" containerName="registry-server" Oct 08 20:03:54 crc kubenswrapper[4750]: E1008 20:03:54.587201 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c305aa42-ff27-4ec5-8ac1-ad44b1d385d0" containerName="extract-content" Oct 08 20:03:54 crc kubenswrapper[4750]: I1008 20:03:54.587209 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="c305aa42-ff27-4ec5-8ac1-ad44b1d385d0" containerName="extract-content" Oct 08 20:03:54 crc kubenswrapper[4750]: I1008 20:03:54.587488 4750 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c305aa42-ff27-4ec5-8ac1-ad44b1d385d0" containerName="registry-server" Oct 08 20:03:54 crc kubenswrapper[4750]: I1008 20:03:54.588972 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7b8x6/must-gather-2jh45" Oct 08 20:03:54 crc kubenswrapper[4750]: I1008 20:03:54.592237 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-7b8x6"/"openshift-service-ca.crt" Oct 08 20:03:54 crc kubenswrapper[4750]: I1008 20:03:54.592299 4750 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-7b8x6"/"default-dockercfg-n4lkc" Oct 08 20:03:54 crc kubenswrapper[4750]: I1008 20:03:54.597305 4750 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-7b8x6"/"kube-root-ca.crt" Oct 08 20:03:54 crc kubenswrapper[4750]: I1008 20:03:54.599494 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7b8x6/must-gather-2jh45"] Oct 08 20:03:54 crc kubenswrapper[4750]: I1008 20:03:54.713657 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/63e5a76a-91eb-4254-a425-e63278761e8f-must-gather-output\") pod \"must-gather-2jh45\" (UID: \"63e5a76a-91eb-4254-a425-e63278761e8f\") " pod="openshift-must-gather-7b8x6/must-gather-2jh45" Oct 08 20:03:54 crc kubenswrapper[4750]: I1008 20:03:54.713845 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9qbp\" (UniqueName: \"kubernetes.io/projected/63e5a76a-91eb-4254-a425-e63278761e8f-kube-api-access-d9qbp\") pod \"must-gather-2jh45\" (UID: \"63e5a76a-91eb-4254-a425-e63278761e8f\") " pod="openshift-must-gather-7b8x6/must-gather-2jh45" Oct 08 20:03:54 crc kubenswrapper[4750]: I1008 20:03:54.816207 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" 
(UniqueName: \"kubernetes.io/empty-dir/63e5a76a-91eb-4254-a425-e63278761e8f-must-gather-output\") pod \"must-gather-2jh45\" (UID: \"63e5a76a-91eb-4254-a425-e63278761e8f\") " pod="openshift-must-gather-7b8x6/must-gather-2jh45" Oct 08 20:03:54 crc kubenswrapper[4750]: I1008 20:03:54.816344 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9qbp\" (UniqueName: \"kubernetes.io/projected/63e5a76a-91eb-4254-a425-e63278761e8f-kube-api-access-d9qbp\") pod \"must-gather-2jh45\" (UID: \"63e5a76a-91eb-4254-a425-e63278761e8f\") " pod="openshift-must-gather-7b8x6/must-gather-2jh45" Oct 08 20:03:54 crc kubenswrapper[4750]: I1008 20:03:54.816751 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/63e5a76a-91eb-4254-a425-e63278761e8f-must-gather-output\") pod \"must-gather-2jh45\" (UID: \"63e5a76a-91eb-4254-a425-e63278761e8f\") " pod="openshift-must-gather-7b8x6/must-gather-2jh45" Oct 08 20:03:54 crc kubenswrapper[4750]: I1008 20:03:54.836274 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9qbp\" (UniqueName: \"kubernetes.io/projected/63e5a76a-91eb-4254-a425-e63278761e8f-kube-api-access-d9qbp\") pod \"must-gather-2jh45\" (UID: \"63e5a76a-91eb-4254-a425-e63278761e8f\") " pod="openshift-must-gather-7b8x6/must-gather-2jh45" Oct 08 20:03:54 crc kubenswrapper[4750]: I1008 20:03:54.916048 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7b8x6/must-gather-2jh45" Oct 08 20:03:55 crc kubenswrapper[4750]: I1008 20:03:55.474045 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7b8x6/must-gather-2jh45"] Oct 08 20:03:55 crc kubenswrapper[4750]: I1008 20:03:55.978322 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7b8x6/must-gather-2jh45" event={"ID":"63e5a76a-91eb-4254-a425-e63278761e8f","Type":"ContainerStarted","Data":"6af10d08496dfc9fa8691d695b50cc4423a879d2606088caa5b70ecba6c22b9e"} Oct 08 20:04:04 crc kubenswrapper[4750]: I1008 20:04:04.079732 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7b8x6/must-gather-2jh45" event={"ID":"63e5a76a-91eb-4254-a425-e63278761e8f","Type":"ContainerStarted","Data":"0b3969d71faf0c111aadeb208dcb5596af4fa914b806234105ee4892b1f79bf2"} Oct 08 20:04:04 crc kubenswrapper[4750]: I1008 20:04:04.080680 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7b8x6/must-gather-2jh45" event={"ID":"63e5a76a-91eb-4254-a425-e63278761e8f","Type":"ContainerStarted","Data":"6856bfc51be89118b3c27c6cdde39f205e687f54c29ac12dc06635a6498f67bf"} Oct 08 20:04:04 crc kubenswrapper[4750]: I1008 20:04:04.109393 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7b8x6/must-gather-2jh45" podStartSLOduration=2.429251878 podStartE2EDuration="10.109372861s" podCreationTimestamp="2025-10-08 20:03:54 +0000 UTC" firstStartedPulling="2025-10-08 20:03:55.481523752 +0000 UTC m=+6791.394494765" lastFinishedPulling="2025-10-08 20:04:03.161644735 +0000 UTC m=+6799.074615748" observedRunningTime="2025-10-08 20:04:04.107226778 +0000 UTC m=+6800.020197811" watchObservedRunningTime="2025-10-08 20:04:04.109372861 +0000 UTC m=+6800.022343884" Oct 08 20:04:06 crc kubenswrapper[4750]: I1008 20:04:06.735660 4750 scope.go:117] "RemoveContainer" 
containerID="166b04b6c456f116d6b504a63f511a318627a6cec196a1300ad0582a72b88cf9" Oct 08 20:04:06 crc kubenswrapper[4750]: E1008 20:04:06.736718 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 20:04:08 crc kubenswrapper[4750]: I1008 20:04:08.388484 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7b8x6/crc-debug-8z98f"] Oct 08 20:04:08 crc kubenswrapper[4750]: I1008 20:04:08.391905 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7b8x6/crc-debug-8z98f" Oct 08 20:04:08 crc kubenswrapper[4750]: I1008 20:04:08.491224 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c9d7\" (UniqueName: \"kubernetes.io/projected/f6bf367e-25de-45d2-93b9-193b5e86dd86-kube-api-access-4c9d7\") pod \"crc-debug-8z98f\" (UID: \"f6bf367e-25de-45d2-93b9-193b5e86dd86\") " pod="openshift-must-gather-7b8x6/crc-debug-8z98f" Oct 08 20:04:08 crc kubenswrapper[4750]: I1008 20:04:08.491757 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f6bf367e-25de-45d2-93b9-193b5e86dd86-host\") pod \"crc-debug-8z98f\" (UID: \"f6bf367e-25de-45d2-93b9-193b5e86dd86\") " pod="openshift-must-gather-7b8x6/crc-debug-8z98f" Oct 08 20:04:08 crc kubenswrapper[4750]: I1008 20:04:08.599483 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c9d7\" (UniqueName: \"kubernetes.io/projected/f6bf367e-25de-45d2-93b9-193b5e86dd86-kube-api-access-4c9d7\") pod 
\"crc-debug-8z98f\" (UID: \"f6bf367e-25de-45d2-93b9-193b5e86dd86\") " pod="openshift-must-gather-7b8x6/crc-debug-8z98f" Oct 08 20:04:08 crc kubenswrapper[4750]: I1008 20:04:08.599623 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f6bf367e-25de-45d2-93b9-193b5e86dd86-host\") pod \"crc-debug-8z98f\" (UID: \"f6bf367e-25de-45d2-93b9-193b5e86dd86\") " pod="openshift-must-gather-7b8x6/crc-debug-8z98f" Oct 08 20:04:08 crc kubenswrapper[4750]: I1008 20:04:08.600071 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f6bf367e-25de-45d2-93b9-193b5e86dd86-host\") pod \"crc-debug-8z98f\" (UID: \"f6bf367e-25de-45d2-93b9-193b5e86dd86\") " pod="openshift-must-gather-7b8x6/crc-debug-8z98f" Oct 08 20:04:08 crc kubenswrapper[4750]: I1008 20:04:08.623070 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c9d7\" (UniqueName: \"kubernetes.io/projected/f6bf367e-25de-45d2-93b9-193b5e86dd86-kube-api-access-4c9d7\") pod \"crc-debug-8z98f\" (UID: \"f6bf367e-25de-45d2-93b9-193b5e86dd86\") " pod="openshift-must-gather-7b8x6/crc-debug-8z98f" Oct 08 20:04:08 crc kubenswrapper[4750]: I1008 20:04:08.723610 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7b8x6/crc-debug-8z98f" Oct 08 20:04:09 crc kubenswrapper[4750]: I1008 20:04:09.134634 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7b8x6/crc-debug-8z98f" event={"ID":"f6bf367e-25de-45d2-93b9-193b5e86dd86","Type":"ContainerStarted","Data":"14bda30ee57a64923345ec039bd10e76e7b2cb3914ad4902d3e99ffeb0a1dff1"} Oct 08 20:04:17 crc kubenswrapper[4750]: I1008 20:04:17.734783 4750 scope.go:117] "RemoveContainer" containerID="166b04b6c456f116d6b504a63f511a318627a6cec196a1300ad0582a72b88cf9" Oct 08 20:04:17 crc kubenswrapper[4750]: E1008 20:04:17.736002 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 20:04:25 crc kubenswrapper[4750]: I1008 20:04:25.352862 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7b8x6/crc-debug-8z98f" event={"ID":"f6bf367e-25de-45d2-93b9-193b5e86dd86","Type":"ContainerStarted","Data":"7938dff28d88f3e13eb96b129604ef9fac52728305d3e5b6bd9d3e3e15006fdd"} Oct 08 20:04:25 crc kubenswrapper[4750]: I1008 20:04:25.392449 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7b8x6/crc-debug-8z98f" podStartSLOduration=1.642491802 podStartE2EDuration="17.392341412s" podCreationTimestamp="2025-10-08 20:04:08 +0000 UTC" firstStartedPulling="2025-10-08 20:04:08.773029225 +0000 UTC m=+6804.686000238" lastFinishedPulling="2025-10-08 20:04:24.522878835 +0000 UTC m=+6820.435849848" observedRunningTime="2025-10-08 20:04:25.375684479 +0000 UTC m=+6821.288655502" watchObservedRunningTime="2025-10-08 20:04:25.392341412 +0000 UTC 
m=+6821.305312425" Oct 08 20:04:28 crc kubenswrapper[4750]: I1008 20:04:28.735094 4750 scope.go:117] "RemoveContainer" containerID="166b04b6c456f116d6b504a63f511a318627a6cec196a1300ad0582a72b88cf9" Oct 08 20:04:28 crc kubenswrapper[4750]: E1008 20:04:28.736325 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 20:04:42 crc kubenswrapper[4750]: I1008 20:04:42.742014 4750 scope.go:117] "RemoveContainer" containerID="166b04b6c456f116d6b504a63f511a318627a6cec196a1300ad0582a72b88cf9" Oct 08 20:04:42 crc kubenswrapper[4750]: E1008 20:04:42.743278 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 20:04:54 crc kubenswrapper[4750]: I1008 20:04:54.744260 4750 scope.go:117] "RemoveContainer" containerID="166b04b6c456f116d6b504a63f511a318627a6cec196a1300ad0582a72b88cf9" Oct 08 20:04:54 crc kubenswrapper[4750]: E1008 20:04:54.746353 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" 
podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 20:05:06 crc kubenswrapper[4750]: I1008 20:05:06.734338 4750 scope.go:117] "RemoveContainer" containerID="166b04b6c456f116d6b504a63f511a318627a6cec196a1300ad0582a72b88cf9" Oct 08 20:05:07 crc kubenswrapper[4750]: I1008 20:05:07.904994 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerStarted","Data":"cbbbf87275f7e0f3294bf94bffab9549261eeb2e7d0d719aab46e1e79b8aa5a9"} Oct 08 20:05:32 crc kubenswrapper[4750]: I1008 20:05:32.427247 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_0ec2d588-f810-44a3-a8a9-cb384a2be42d/init-config-reloader/0.log" Oct 08 20:05:32 crc kubenswrapper[4750]: I1008 20:05:32.685913 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_0ec2d588-f810-44a3-a8a9-cb384a2be42d/init-config-reloader/0.log" Oct 08 20:05:32 crc kubenswrapper[4750]: I1008 20:05:32.693234 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_0ec2d588-f810-44a3-a8a9-cb384a2be42d/alertmanager/0.log" Oct 08 20:05:32 crc kubenswrapper[4750]: I1008 20:05:32.722647 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_0ec2d588-f810-44a3-a8a9-cb384a2be42d/config-reloader/0.log" Oct 08 20:05:32 crc kubenswrapper[4750]: I1008 20:05:32.952063 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_100e9e30-b2db-45f0-afe1-785b450f8382/aodh-evaluator/0.log" Oct 08 20:05:32 crc kubenswrapper[4750]: I1008 20:05:32.980409 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_100e9e30-b2db-45f0-afe1-785b450f8382/aodh-api/0.log" Oct 08 20:05:33 crc kubenswrapper[4750]: I1008 20:05:33.168344 4750 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_aodh-0_100e9e30-b2db-45f0-afe1-785b450f8382/aodh-listener/0.log" Oct 08 20:05:33 crc kubenswrapper[4750]: I1008 20:05:33.228392 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_100e9e30-b2db-45f0-afe1-785b450f8382/aodh-notifier/0.log" Oct 08 20:05:33 crc kubenswrapper[4750]: I1008 20:05:33.423501 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-158b-account-create-5l24x_b0ad36c3-6bb5-4024-b62c-cd806b8172d7/mariadb-account-create/0.log" Oct 08 20:05:33 crc kubenswrapper[4750]: I1008 20:05:33.563505 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-db-create-nq2b9_b7ff6ccf-13de-41ac-af06-668eb1165729/mariadb-database-create/0.log" Oct 08 20:05:33 crc kubenswrapper[4750]: I1008 20:05:33.795825 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-db-sync-cmw5k_d47a90c8-a408-4b44-a6f5-e65897cce31b/aodh-db-sync/0.log" Oct 08 20:05:34 crc kubenswrapper[4750]: I1008 20:05:34.015773 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-56766f6486-vnhn9_f5045303-b61d-4680-bacd-47dad8064038/barbican-api/0.log" Oct 08 20:05:34 crc kubenswrapper[4750]: I1008 20:05:34.076805 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-56766f6486-vnhn9_f5045303-b61d-4680-bacd-47dad8064038/barbican-api-log/0.log" Oct 08 20:05:34 crc kubenswrapper[4750]: I1008 20:05:34.236985 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-56986ffc7d-j59g6_48760c39-72f8-4889-9075-b2576d3f0209/barbican-keystone-listener/0.log" Oct 08 20:05:34 crc kubenswrapper[4750]: I1008 20:05:34.337905 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-56986ffc7d-j59g6_48760c39-72f8-4889-9075-b2576d3f0209/barbican-keystone-listener-log/0.log" Oct 08 20:05:34 crc kubenswrapper[4750]: I1008 20:05:34.510801 4750 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6f5fbc99ff-n2gwg_69865ad9-63c1-4f81-9873-4aa359bea376/barbican-worker/0.log" Oct 08 20:05:34 crc kubenswrapper[4750]: I1008 20:05:34.617981 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6f5fbc99ff-n2gwg_69865ad9-63c1-4f81-9873-4aa359bea376/barbican-worker-log/0.log" Oct 08 20:05:34 crc kubenswrapper[4750]: I1008 20:05:34.881531 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ee5a4198-42a1-4e36-bbe6-22b923bd2a98/ceilometer-central-agent/0.log" Oct 08 20:05:34 crc kubenswrapper[4750]: I1008 20:05:34.890718 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ee5a4198-42a1-4e36-bbe6-22b923bd2a98/ceilometer-notification-agent/0.log" Oct 08 20:05:35 crc kubenswrapper[4750]: I1008 20:05:35.026863 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ee5a4198-42a1-4e36-bbe6-22b923bd2a98/proxy-httpd/0.log" Oct 08 20:05:35 crc kubenswrapper[4750]: I1008 20:05:35.104587 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ee5a4198-42a1-4e36-bbe6-22b923bd2a98/sg-core/0.log" Oct 08 20:05:35 crc kubenswrapper[4750]: I1008 20:05:35.329443 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_764d6df3-c738-4f73-a3a6-3b502c9052a5/cinder-api/0.log" Oct 08 20:05:35 crc kubenswrapper[4750]: I1008 20:05:35.374875 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_764d6df3-c738-4f73-a3a6-3b502c9052a5/cinder-api-log/0.log" Oct 08 20:05:35 crc kubenswrapper[4750]: I1008 20:05:35.610232 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_e7b130b0-4e14-4a52-b944-1788e309b0ce/probe/0.log" Oct 08 20:05:35 crc kubenswrapper[4750]: I1008 20:05:35.733662 4750 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-backup-0_e7b130b0-4e14-4a52-b944-1788e309b0ce/cinder-backup/0.log" Oct 08 20:05:35 crc kubenswrapper[4750]: I1008 20:05:35.826695 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_e4f6bc2c-1192-4320-b0fe-97b8853a36b4/cinder-scheduler/0.log" Oct 08 20:05:35 crc kubenswrapper[4750]: I1008 20:05:35.917473 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_e4f6bc2c-1192-4320-b0fe-97b8853a36b4/probe/0.log" Oct 08 20:05:36 crc kubenswrapper[4750]: I1008 20:05:36.105148 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_3f4a0250-42a7-43df-99f6-71bfe6696278/probe/0.log" Oct 08 20:05:36 crc kubenswrapper[4750]: I1008 20:05:36.126663 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_3f4a0250-42a7-43df-99f6-71bfe6696278/cinder-volume/0.log" Oct 08 20:05:36 crc kubenswrapper[4750]: I1008 20:05:36.318565 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-ff4557c77-7lrm6_b31f8130-a585-4549-ae4c-1b68c0f8fbe9/init/0.log" Oct 08 20:05:36 crc kubenswrapper[4750]: I1008 20:05:36.484584 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-ff4557c77-7lrm6_b31f8130-a585-4549-ae4c-1b68c0f8fbe9/init/0.log" Oct 08 20:05:36 crc kubenswrapper[4750]: I1008 20:05:36.531522 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-ff4557c77-7lrm6_b31f8130-a585-4549-ae4c-1b68c0f8fbe9/dnsmasq-dns/0.log" Oct 08 20:05:36 crc kubenswrapper[4750]: I1008 20:05:36.619474 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f94e86c5-8e70-4b6c-a8cb-6923e62968b0/glance-httpd/0.log" Oct 08 20:05:36 crc kubenswrapper[4750]: I1008 20:05:36.717138 4750 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_f94e86c5-8e70-4b6c-a8cb-6923e62968b0/glance-log/0.log" Oct 08 20:05:36 crc kubenswrapper[4750]: I1008 20:05:36.842332 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_0362b08c-e18c-45cc-a155-af3775390c3b/glance-httpd/0.log" Oct 08 20:05:36 crc kubenswrapper[4750]: I1008 20:05:36.858528 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_0362b08c-e18c-45cc-a155-af3775390c3b/glance-log/0.log" Oct 08 20:05:37 crc kubenswrapper[4750]: I1008 20:05:37.062640 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-4b7c-account-create-l4mdz_97570253-a79c-4ede-a440-3592f37223ee/mariadb-account-create/0.log" Oct 08 20:05:37 crc kubenswrapper[4750]: I1008 20:05:37.114608 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-5466485586-4rpcn_f0314e1f-ae9b-40dd-8e2c-672a94f687ee/heat-api/0.log" Oct 08 20:05:37 crc kubenswrapper[4750]: I1008 20:05:37.329062 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-6597fb9d78-62bqr_8df319d5-d875-44fe-9f8c-8f53b6129570/heat-cfnapi/0.log" Oct 08 20:05:37 crc kubenswrapper[4750]: I1008 20:05:37.463493 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-db-create-sxssl_082cee4e-0e68-4a13-86ec-8f2118bb14e3/mariadb-database-create/0.log" Oct 08 20:05:37 crc kubenswrapper[4750]: I1008 20:05:37.611059 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-db-sync-w4h6x_ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d/heat-db-sync/0.log" Oct 08 20:05:37 crc kubenswrapper[4750]: I1008 20:05:37.784329 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-5959c7d876-nv2km_7cee3c3b-138c-4766-b9b4-e7d2b325be0d/heat-engine/0.log" Oct 08 20:05:37 crc kubenswrapper[4750]: I1008 20:05:37.876036 4750 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_horizon-7b7fc4d7fc-bhdm6_9530c511-132e-4075-910e-a8e6606fe282/horizon/0.log" Oct 08 20:05:38 crc kubenswrapper[4750]: I1008 20:05:38.014876 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7b7fc4d7fc-bhdm6_9530c511-132e-4075-910e-a8e6606fe282/horizon-log/0.log" Oct 08 20:05:38 crc kubenswrapper[4750]: I1008 20:05:38.264464 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7747bfd995-l7pz4_aa5ad06d-f897-4d65-b17c-c2affae142d6/keystone-api/0.log" Oct 08 20:05:38 crc kubenswrapper[4750]: I1008 20:05:38.286422 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29332561-9dcgq_03421683-cf9d-4dcd-ba62-cbe1da2dca16/keystone-cron/0.log" Oct 08 20:05:38 crc kubenswrapper[4750]: I1008 20:05:38.494779 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_6959edb9-fcde-4267-a843-7aa0d1cdff7b/kube-state-metrics/0.log" Oct 08 20:05:38 crc kubenswrapper[4750]: I1008 20:05:38.526497 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-8e2a-account-create-lrccx_bdc6fd03-081e-4966-9c71-c6e937ad4fca/mariadb-account-create/0.log" Oct 08 20:05:38 crc kubenswrapper[4750]: I1008 20:05:38.810067 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_f5fab8c0-47cc-4200-b0fe-215273f5f062/manila-api-log/0.log" Oct 08 20:05:38 crc kubenswrapper[4750]: I1008 20:05:38.900841 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_f5fab8c0-47cc-4200-b0fe-215273f5f062/manila-api/0.log" Oct 08 20:05:39 crc kubenswrapper[4750]: I1008 20:05:39.029798 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-db-create-7chht_5173b754-515f-46f1-82bb-9376da88bb9b/mariadb-database-create/0.log" Oct 08 20:05:39 crc kubenswrapper[4750]: I1008 20:05:39.162943 4750 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_manila-db-sync-rslm8_ba387095-0fec-48ef-8bc2-05e8a5368d0b/manila-db-sync/0.log" Oct 08 20:05:39 crc kubenswrapper[4750]: I1008 20:05:39.362661 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_d539a620-0411-4bc6-8fbd-9aa900497424/probe/0.log" Oct 08 20:05:39 crc kubenswrapper[4750]: I1008 20:05:39.391245 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_d539a620-0411-4bc6-8fbd-9aa900497424/manila-scheduler/0.log" Oct 08 20:05:39 crc kubenswrapper[4750]: I1008 20:05:39.584391 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_f751df5d-800a-44c8-989f-75cc7face178/manila-share/0.log" Oct 08 20:05:39 crc kubenswrapper[4750]: I1008 20:05:39.665395 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_f751df5d-800a-44c8-989f-75cc7face178/probe/0.log" Oct 08 20:05:39 crc kubenswrapper[4750]: I1008 20:05:39.711279 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-copy-data_20764f96-f8a9-499f-9341-941096cf77ce/adoption/0.log" Oct 08 20:05:40 crc kubenswrapper[4750]: I1008 20:05:40.343721 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-548785f96f-fcdrq_3df4a146-e321-4c4b-86f8-dc0a22aeb45b/neutron-api/0.log" Oct 08 20:05:40 crc kubenswrapper[4750]: I1008 20:05:40.527183 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-548785f96f-fcdrq_3df4a146-e321-4c4b-86f8-dc0a22aeb45b/neutron-httpd/0.log" Oct 08 20:05:40 crc kubenswrapper[4750]: I1008 20:05:40.610824 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_7454840f-78e3-41f6-a91b-2d34a96d5090/memcached/0.log" Oct 08 20:05:40 crc kubenswrapper[4750]: I1008 20:05:40.721355 4750 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-api-0_04268d95-5a3e-416f-b0c4-2b730bbba40f/nova-api-api/0.log" Oct 08 20:05:40 crc kubenswrapper[4750]: I1008 20:05:40.869121 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_04268d95-5a3e-416f-b0c4-2b730bbba40f/nova-api-log/0.log" Oct 08 20:05:40 crc kubenswrapper[4750]: I1008 20:05:40.920817 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_9490ebbd-98b4-45dc-9ce1-afd9fc3c179c/nova-cell0-conductor-conductor/0.log" Oct 08 20:05:41 crc kubenswrapper[4750]: I1008 20:05:41.169579 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_41ac8be4-8bbb-4e70-ba9a-4e5a995da828/nova-cell1-conductor-conductor/0.log" Oct 08 20:05:41 crc kubenswrapper[4750]: I1008 20:05:41.229774 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_181c616c-4e29-44e5-bd5b-23754e802000/nova-cell1-novncproxy-novncproxy/0.log" Oct 08 20:05:41 crc kubenswrapper[4750]: I1008 20:05:41.469481 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_c14ecf98-f4b1-4a8c-9057-db39c2cbdd5b/nova-metadata-metadata/0.log" Oct 08 20:05:41 crc kubenswrapper[4750]: I1008 20:05:41.540651 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_c14ecf98-f4b1-4a8c-9057-db39c2cbdd5b/nova-metadata-log/0.log" Oct 08 20:05:41 crc kubenswrapper[4750]: I1008 20:05:41.676755 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_5f7030ea-8b14-4b45-a722-a49d1eb31294/nova-scheduler-scheduler/0.log" Oct 08 20:05:41 crc kubenswrapper[4750]: I1008 20:05:41.805548 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-54fd9b66fd-67k7j_341cf41b-5181-49d4-a574-f10076a59aa2/init/0.log" Oct 08 20:05:42 crc kubenswrapper[4750]: I1008 20:05:42.077333 4750 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_octavia-api-54fd9b66fd-67k7j_341cf41b-5181-49d4-a574-f10076a59aa2/init/0.log" Oct 08 20:05:42 crc kubenswrapper[4750]: I1008 20:05:42.197406 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-54fd9b66fd-67k7j_341cf41b-5181-49d4-a574-f10076a59aa2/octavia-api/0.log" Oct 08 20:05:42 crc kubenswrapper[4750]: I1008 20:05:42.226944 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-54fd9b66fd-67k7j_341cf41b-5181-49d4-a574-f10076a59aa2/octavia-api-provider-agent/0.log" Oct 08 20:05:42 crc kubenswrapper[4750]: I1008 20:05:42.414614 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-4qbrc_b5222986-32a6-4c3e-97d2-7037c64a08dc/init/0.log" Oct 08 20:05:42 crc kubenswrapper[4750]: I1008 20:05:42.707494 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-4qbrc_b5222986-32a6-4c3e-97d2-7037c64a08dc/octavia-healthmanager/0.log" Oct 08 20:05:42 crc kubenswrapper[4750]: I1008 20:05:42.756799 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-ksff9_56ea204e-1ef4-4393-890d-e772748890b3/init/0.log" Oct 08 20:05:42 crc kubenswrapper[4750]: I1008 20:05:42.805131 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-4qbrc_b5222986-32a6-4c3e-97d2-7037c64a08dc/init/0.log" Oct 08 20:05:43 crc kubenswrapper[4750]: I1008 20:05:43.016883 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-ksff9_56ea204e-1ef4-4393-890d-e772748890b3/init/0.log" Oct 08 20:05:43 crc kubenswrapper[4750]: I1008 20:05:43.049884 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-678599687f-7ktf9_fd290ec9-232e-4879-a525-01c6bfb72bc8/init/0.log" Oct 08 20:05:43 crc kubenswrapper[4750]: I1008 20:05:43.123147 4750 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_octavia-housekeeping-ksff9_56ea204e-1ef4-4393-890d-e772748890b3/octavia-housekeeping/0.log" Oct 08 20:05:43 crc kubenswrapper[4750]: I1008 20:05:43.286495 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-678599687f-7ktf9_fd290ec9-232e-4879-a525-01c6bfb72bc8/octavia-amphora-httpd/0.log" Oct 08 20:05:43 crc kubenswrapper[4750]: I1008 20:05:43.355950 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-678599687f-7ktf9_fd290ec9-232e-4879-a525-01c6bfb72bc8/init/0.log" Oct 08 20:05:43 crc kubenswrapper[4750]: I1008 20:05:43.474055 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-tctqf_2f9fbe67-14ed-464f-b087-612a789d7d90/init/0.log" Oct 08 20:05:43 crc kubenswrapper[4750]: I1008 20:05:43.677310 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-tctqf_2f9fbe67-14ed-464f-b087-612a789d7d90/init/0.log" Oct 08 20:05:43 crc kubenswrapper[4750]: I1008 20:05:43.770056 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-tctqf_2f9fbe67-14ed-464f-b087-612a789d7d90/octavia-rsyslog/0.log" Oct 08 20:05:43 crc kubenswrapper[4750]: I1008 20:05:43.812729 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-5flzb_8287d9ba-be09-4fc6-8d0d-c2ba26de1279/init/0.log" Oct 08 20:05:43 crc kubenswrapper[4750]: I1008 20:05:43.909984 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-5flzb_8287d9ba-be09-4fc6-8d0d-c2ba26de1279/init/0.log" Oct 08 20:05:44 crc kubenswrapper[4750]: I1008 20:05:44.098105 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_16db7ca3-b357-4651-897e-1329b6b0b4d9/mysql-bootstrap/0.log" Oct 08 20:05:44 crc kubenswrapper[4750]: I1008 20:05:44.129144 4750 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_octavia-worker-5flzb_8287d9ba-be09-4fc6-8d0d-c2ba26de1279/octavia-worker/0.log" Oct 08 20:05:44 crc kubenswrapper[4750]: I1008 20:05:44.246897 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_16db7ca3-b357-4651-897e-1329b6b0b4d9/mysql-bootstrap/0.log" Oct 08 20:05:44 crc kubenswrapper[4750]: I1008 20:05:44.332185 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_16db7ca3-b357-4651-897e-1329b6b0b4d9/galera/0.log" Oct 08 20:05:44 crc kubenswrapper[4750]: I1008 20:05:44.374142 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_d258d002-5471-49c2-b43b-557992058385/mysql-bootstrap/0.log" Oct 08 20:05:44 crc kubenswrapper[4750]: I1008 20:05:44.633421 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_d258d002-5471-49c2-b43b-557992058385/mysql-bootstrap/0.log" Oct 08 20:05:44 crc kubenswrapper[4750]: I1008 20:05:44.673065 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_fe2f8e4d-30aa-40e6-9b99-d00dc6534dc0/openstackclient/0.log" Oct 08 20:05:44 crc kubenswrapper[4750]: I1008 20:05:44.692063 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_d258d002-5471-49c2-b43b-557992058385/galera/0.log" Oct 08 20:05:44 crc kubenswrapper[4750]: I1008 20:05:44.939513 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-ds86v_5cd8ba2c-c86d-48e9-a94d-a6a4555de8c4/openstack-network-exporter/0.log" Oct 08 20:05:45 crc kubenswrapper[4750]: I1008 20:05:45.028122 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-kwprk_9bd109d1-56e5-49b4-884f-2eb99d8a72f9/ovsdb-server-init/0.log" Oct 08 20:05:45 crc kubenswrapper[4750]: I1008 20:05:45.199304 4750 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-kwprk_9bd109d1-56e5-49b4-884f-2eb99d8a72f9/ovsdb-server-init/0.log" Oct 08 20:05:45 crc kubenswrapper[4750]: I1008 20:05:45.232160 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-kwprk_9bd109d1-56e5-49b4-884f-2eb99d8a72f9/ovs-vswitchd/0.log" Oct 08 20:05:45 crc kubenswrapper[4750]: I1008 20:05:45.269793 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-kwprk_9bd109d1-56e5-49b4-884f-2eb99d8a72f9/ovsdb-server/0.log" Oct 08 20:05:45 crc kubenswrapper[4750]: I1008 20:05:45.394277 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-sxhxx_dba47bae-b4f2-411f-9305-9c5e52fc5213/ovn-controller/0.log" Oct 08 20:05:45 crc kubenswrapper[4750]: I1008 20:05:45.470784 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-copy-data_eb40b769-eb69-4127-92aa-8520cf6c0883/adoption/0.log" Oct 08 20:05:45 crc kubenswrapper[4750]: I1008 20:05:45.616370 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_139a64cf-5b9a-40b4-b6b7-ab8132b9a856/openstack-network-exporter/0.log" Oct 08 20:05:45 crc kubenswrapper[4750]: I1008 20:05:45.719119 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_139a64cf-5b9a-40b4-b6b7-ab8132b9a856/ovn-northd/0.log" Oct 08 20:05:45 crc kubenswrapper[4750]: I1008 20:05:45.824065 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_95eb24ea-3c42-4690-9623-99af22f79703/openstack-network-exporter/0.log" Oct 08 20:05:45 crc kubenswrapper[4750]: I1008 20:05:45.887841 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_95eb24ea-3c42-4690-9623-99af22f79703/ovsdbserver-nb/0.log" Oct 08 20:05:46 crc kubenswrapper[4750]: I1008 20:05:46.264515 4750 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-nb-1_4d468db2-b592-4bf9-8cb4-d4fbad07292b/ovsdbserver-nb/0.log" Oct 08 20:05:46 crc kubenswrapper[4750]: I1008 20:05:46.283736 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_4d468db2-b592-4bf9-8cb4-d4fbad07292b/openstack-network-exporter/0.log" Oct 08 20:05:46 crc kubenswrapper[4750]: I1008 20:05:46.488371 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_714778b9-d1d0-4767-a601-5bc178cd3199/openstack-network-exporter/0.log" Oct 08 20:05:46 crc kubenswrapper[4750]: I1008 20:05:46.577773 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_714778b9-d1d0-4767-a601-5bc178cd3199/ovsdbserver-nb/0.log" Oct 08 20:05:46 crc kubenswrapper[4750]: I1008 20:05:46.646304 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_595b9fc0-5a3e-4761-beaa-91924ecf4f54/openstack-network-exporter/0.log" Oct 08 20:05:46 crc kubenswrapper[4750]: I1008 20:05:46.742953 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_595b9fc0-5a3e-4761-beaa-91924ecf4f54/ovsdbserver-sb/0.log" Oct 08 20:05:46 crc kubenswrapper[4750]: I1008 20:05:46.926035 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_d1aa84d3-2d58-482f-8a8c-fe18543714de/openstack-network-exporter/0.log" Oct 08 20:05:46 crc kubenswrapper[4750]: I1008 20:05:46.984104 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_d1aa84d3-2d58-482f-8a8c-fe18543714de/ovsdbserver-sb/0.log" Oct 08 20:05:47 crc kubenswrapper[4750]: I1008 20:05:47.489904 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_28356cd5-1bba-4d6c-8d73-c5650f3daa80/ovsdbserver-sb/0.log" Oct 08 20:05:47 crc kubenswrapper[4750]: I1008 20:05:47.605416 4750 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-2_28356cd5-1bba-4d6c-8d73-c5650f3daa80/openstack-network-exporter/0.log" Oct 08 20:05:47 crc kubenswrapper[4750]: I1008 20:05:47.746701 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5ffd6988b6-bt9qf_e9bb90a8-f0b9-48aa-94af-133c5ca6a3da/placement-api/0.log" Oct 08 20:05:47 crc kubenswrapper[4750]: I1008 20:05:47.794911 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5ffd6988b6-bt9qf_e9bb90a8-f0b9-48aa-94af-133c5ca6a3da/placement-log/0.log" Oct 08 20:05:47 crc kubenswrapper[4750]: I1008 20:05:47.947984 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_937a384a-f15c-4b65-868a-70d040c830d0/init-config-reloader/0.log" Oct 08 20:05:48 crc kubenswrapper[4750]: I1008 20:05:48.289014 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_937a384a-f15c-4b65-868a-70d040c830d0/init-config-reloader/0.log" Oct 08 20:05:48 crc kubenswrapper[4750]: I1008 20:05:48.298885 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_937a384a-f15c-4b65-868a-70d040c830d0/config-reloader/0.log" Oct 08 20:05:48 crc kubenswrapper[4750]: I1008 20:05:48.321040 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_937a384a-f15c-4b65-868a-70d040c830d0/prometheus/0.log" Oct 08 20:05:48 crc kubenswrapper[4750]: I1008 20:05:48.332848 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_937a384a-f15c-4b65-868a-70d040c830d0/thanos-sidecar/0.log" Oct 08 20:05:48 crc kubenswrapper[4750]: I1008 20:05:48.540679 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3580a244-3fbc-4341-b898-69cd465a21a3/setup-container/0.log" Oct 08 20:05:48 crc kubenswrapper[4750]: I1008 20:05:48.801684 4750 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3580a244-3fbc-4341-b898-69cd465a21a3/setup-container/0.log" Oct 08 20:05:48 crc kubenswrapper[4750]: I1008 20:05:48.821362 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3580a244-3fbc-4341-b898-69cd465a21a3/rabbitmq/0.log" Oct 08 20:05:48 crc kubenswrapper[4750]: I1008 20:05:48.998315 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6d34b3ab-fa13-494f-a40c-552c9e3e305b/setup-container/0.log" Oct 08 20:05:49 crc kubenswrapper[4750]: I1008 20:05:49.194833 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6d34b3ab-fa13-494f-a40c-552c9e3e305b/setup-container/0.log" Oct 08 20:05:49 crc kubenswrapper[4750]: I1008 20:05:49.253177 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6d34b3ab-fa13-494f-a40c-552c9e3e305b/rabbitmq/0.log" Oct 08 20:06:08 crc kubenswrapper[4750]: I1008 20:06:08.043450 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-sxssl"] Oct 08 20:06:08 crc kubenswrapper[4750]: I1008 20:06:08.060008 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-sxssl"] Oct 08 20:06:08 crc kubenswrapper[4750]: I1008 20:06:08.756689 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="082cee4e-0e68-4a13-86ec-8f2118bb14e3" path="/var/lib/kubelet/pods/082cee4e-0e68-4a13-86ec-8f2118bb14e3/volumes" Oct 08 20:06:13 crc kubenswrapper[4750]: I1008 20:06:13.324308 4750 scope.go:117] "RemoveContainer" containerID="05c9c0a3127e55228069fb71842668376fdc9e90cc7985cc7dea1aa293866035" Oct 08 20:06:19 crc kubenswrapper[4750]: I1008 20:06:19.056001 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-4b7c-account-create-l4mdz"] Oct 08 20:06:19 crc kubenswrapper[4750]: I1008 20:06:19.070253 4750 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/heat-4b7c-account-create-l4mdz"] Oct 08 20:06:20 crc kubenswrapper[4750]: I1008 20:06:20.755876 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97570253-a79c-4ede-a440-3592f37223ee" path="/var/lib/kubelet/pods/97570253-a79c-4ede-a440-3592f37223ee/volumes" Oct 08 20:06:35 crc kubenswrapper[4750]: I1008 20:06:35.092581 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-w4h6x"] Oct 08 20:06:35 crc kubenswrapper[4750]: I1008 20:06:35.143887 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-w4h6x"] Oct 08 20:06:36 crc kubenswrapper[4750]: I1008 20:06:36.749829 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d" path="/var/lib/kubelet/pods/ea89d0ea-5dcb-45e4-81a5-b7bc2b2c095d/volumes" Oct 08 20:06:43 crc kubenswrapper[4750]: I1008 20:06:43.219778 4750 generic.go:334] "Generic (PLEG): container finished" podID="f6bf367e-25de-45d2-93b9-193b5e86dd86" containerID="7938dff28d88f3e13eb96b129604ef9fac52728305d3e5b6bd9d3e3e15006fdd" exitCode=0 Oct 08 20:06:43 crc kubenswrapper[4750]: I1008 20:06:43.219883 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7b8x6/crc-debug-8z98f" event={"ID":"f6bf367e-25de-45d2-93b9-193b5e86dd86","Type":"ContainerDied","Data":"7938dff28d88f3e13eb96b129604ef9fac52728305d3e5b6bd9d3e3e15006fdd"} Oct 08 20:06:44 crc kubenswrapper[4750]: I1008 20:06:44.383169 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7b8x6/crc-debug-8z98f" Oct 08 20:06:44 crc kubenswrapper[4750]: I1008 20:06:44.446178 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7b8x6/crc-debug-8z98f"] Oct 08 20:06:44 crc kubenswrapper[4750]: I1008 20:06:44.467995 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7b8x6/crc-debug-8z98f"] Oct 08 20:06:44 crc kubenswrapper[4750]: I1008 20:06:44.543928 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4c9d7\" (UniqueName: \"kubernetes.io/projected/f6bf367e-25de-45d2-93b9-193b5e86dd86-kube-api-access-4c9d7\") pod \"f6bf367e-25de-45d2-93b9-193b5e86dd86\" (UID: \"f6bf367e-25de-45d2-93b9-193b5e86dd86\") " Oct 08 20:06:44 crc kubenswrapper[4750]: I1008 20:06:44.545048 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f6bf367e-25de-45d2-93b9-193b5e86dd86-host\") pod \"f6bf367e-25de-45d2-93b9-193b5e86dd86\" (UID: \"f6bf367e-25de-45d2-93b9-193b5e86dd86\") " Oct 08 20:06:44 crc kubenswrapper[4750]: I1008 20:06:44.545189 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f6bf367e-25de-45d2-93b9-193b5e86dd86-host" (OuterVolumeSpecName: "host") pod "f6bf367e-25de-45d2-93b9-193b5e86dd86" (UID: "f6bf367e-25de-45d2-93b9-193b5e86dd86"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 20:06:44 crc kubenswrapper[4750]: I1008 20:06:44.546650 4750 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f6bf367e-25de-45d2-93b9-193b5e86dd86-host\") on node \"crc\" DevicePath \"\"" Oct 08 20:06:44 crc kubenswrapper[4750]: I1008 20:06:44.565330 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6bf367e-25de-45d2-93b9-193b5e86dd86-kube-api-access-4c9d7" (OuterVolumeSpecName: "kube-api-access-4c9d7") pod "f6bf367e-25de-45d2-93b9-193b5e86dd86" (UID: "f6bf367e-25de-45d2-93b9-193b5e86dd86"). InnerVolumeSpecName "kube-api-access-4c9d7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:06:44 crc kubenswrapper[4750]: I1008 20:06:44.649606 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4c9d7\" (UniqueName: \"kubernetes.io/projected/f6bf367e-25de-45d2-93b9-193b5e86dd86-kube-api-access-4c9d7\") on node \"crc\" DevicePath \"\"" Oct 08 20:06:44 crc kubenswrapper[4750]: I1008 20:06:44.752029 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6bf367e-25de-45d2-93b9-193b5e86dd86" path="/var/lib/kubelet/pods/f6bf367e-25de-45d2-93b9-193b5e86dd86/volumes" Oct 08 20:06:45 crc kubenswrapper[4750]: I1008 20:06:45.248372 4750 scope.go:117] "RemoveContainer" containerID="7938dff28d88f3e13eb96b129604ef9fac52728305d3e5b6bd9d3e3e15006fdd" Oct 08 20:06:45 crc kubenswrapper[4750]: I1008 20:06:45.248497 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7b8x6/crc-debug-8z98f" Oct 08 20:06:45 crc kubenswrapper[4750]: I1008 20:06:45.738956 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7b8x6/crc-debug-9hrh2"] Oct 08 20:06:45 crc kubenswrapper[4750]: E1008 20:06:45.739983 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6bf367e-25de-45d2-93b9-193b5e86dd86" containerName="container-00" Oct 08 20:06:45 crc kubenswrapper[4750]: I1008 20:06:45.740001 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6bf367e-25de-45d2-93b9-193b5e86dd86" containerName="container-00" Oct 08 20:06:45 crc kubenswrapper[4750]: I1008 20:06:45.740248 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6bf367e-25de-45d2-93b9-193b5e86dd86" containerName="container-00" Oct 08 20:06:45 crc kubenswrapper[4750]: I1008 20:06:45.741325 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7b8x6/crc-debug-9hrh2" Oct 08 20:06:45 crc kubenswrapper[4750]: I1008 20:06:45.784113 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt5bg\" (UniqueName: \"kubernetes.io/projected/40c384a2-809a-4077-af7f-d7bc54e40e3a-kube-api-access-pt5bg\") pod \"crc-debug-9hrh2\" (UID: \"40c384a2-809a-4077-af7f-d7bc54e40e3a\") " pod="openshift-must-gather-7b8x6/crc-debug-9hrh2" Oct 08 20:06:45 crc kubenswrapper[4750]: I1008 20:06:45.784239 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/40c384a2-809a-4077-af7f-d7bc54e40e3a-host\") pod \"crc-debug-9hrh2\" (UID: \"40c384a2-809a-4077-af7f-d7bc54e40e3a\") " pod="openshift-must-gather-7b8x6/crc-debug-9hrh2" Oct 08 20:06:45 crc kubenswrapper[4750]: I1008 20:06:45.887491 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/40c384a2-809a-4077-af7f-d7bc54e40e3a-host\") pod \"crc-debug-9hrh2\" (UID: \"40c384a2-809a-4077-af7f-d7bc54e40e3a\") " pod="openshift-must-gather-7b8x6/crc-debug-9hrh2" Oct 08 20:06:45 crc kubenswrapper[4750]: I1008 20:06:45.887683 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/40c384a2-809a-4077-af7f-d7bc54e40e3a-host\") pod \"crc-debug-9hrh2\" (UID: \"40c384a2-809a-4077-af7f-d7bc54e40e3a\") " pod="openshift-must-gather-7b8x6/crc-debug-9hrh2" Oct 08 20:06:45 crc kubenswrapper[4750]: I1008 20:06:45.887994 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt5bg\" (UniqueName: \"kubernetes.io/projected/40c384a2-809a-4077-af7f-d7bc54e40e3a-kube-api-access-pt5bg\") pod \"crc-debug-9hrh2\" (UID: \"40c384a2-809a-4077-af7f-d7bc54e40e3a\") " pod="openshift-must-gather-7b8x6/crc-debug-9hrh2" Oct 08 20:06:45 crc kubenswrapper[4750]: I1008 20:06:45.909480 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt5bg\" (UniqueName: \"kubernetes.io/projected/40c384a2-809a-4077-af7f-d7bc54e40e3a-kube-api-access-pt5bg\") pod \"crc-debug-9hrh2\" (UID: \"40c384a2-809a-4077-af7f-d7bc54e40e3a\") " pod="openshift-must-gather-7b8x6/crc-debug-9hrh2" Oct 08 20:06:46 crc kubenswrapper[4750]: I1008 20:06:46.077208 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7b8x6/crc-debug-9hrh2" Oct 08 20:06:46 crc kubenswrapper[4750]: I1008 20:06:46.276152 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7b8x6/crc-debug-9hrh2" event={"ID":"40c384a2-809a-4077-af7f-d7bc54e40e3a","Type":"ContainerStarted","Data":"abcd0187b855063dfba9daa11385c031994988337a09730f1e0057bdf3bb1463"} Oct 08 20:06:47 crc kubenswrapper[4750]: I1008 20:06:47.296978 4750 generic.go:334] "Generic (PLEG): container finished" podID="40c384a2-809a-4077-af7f-d7bc54e40e3a" containerID="0eb912e399e5a0d0b0edac75450f3d6d9d30648ff18a0b0b60268cc2b69f8d50" exitCode=0 Oct 08 20:06:47 crc kubenswrapper[4750]: I1008 20:06:47.297077 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7b8x6/crc-debug-9hrh2" event={"ID":"40c384a2-809a-4077-af7f-d7bc54e40e3a","Type":"ContainerDied","Data":"0eb912e399e5a0d0b0edac75450f3d6d9d30648ff18a0b0b60268cc2b69f8d50"} Oct 08 20:06:48 crc kubenswrapper[4750]: I1008 20:06:48.423796 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7b8x6/crc-debug-9hrh2" Oct 08 20:06:48 crc kubenswrapper[4750]: I1008 20:06:48.557138 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/40c384a2-809a-4077-af7f-d7bc54e40e3a-host\") pod \"40c384a2-809a-4077-af7f-d7bc54e40e3a\" (UID: \"40c384a2-809a-4077-af7f-d7bc54e40e3a\") " Oct 08 20:06:48 crc kubenswrapper[4750]: I1008 20:06:48.557331 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/40c384a2-809a-4077-af7f-d7bc54e40e3a-host" (OuterVolumeSpecName: "host") pod "40c384a2-809a-4077-af7f-d7bc54e40e3a" (UID: "40c384a2-809a-4077-af7f-d7bc54e40e3a"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 20:06:48 crc kubenswrapper[4750]: I1008 20:06:48.557646 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pt5bg\" (UniqueName: \"kubernetes.io/projected/40c384a2-809a-4077-af7f-d7bc54e40e3a-kube-api-access-pt5bg\") pod \"40c384a2-809a-4077-af7f-d7bc54e40e3a\" (UID: \"40c384a2-809a-4077-af7f-d7bc54e40e3a\") " Oct 08 20:06:48 crc kubenswrapper[4750]: I1008 20:06:48.558278 4750 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/40c384a2-809a-4077-af7f-d7bc54e40e3a-host\") on node \"crc\" DevicePath \"\"" Oct 08 20:06:48 crc kubenswrapper[4750]: I1008 20:06:48.564381 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40c384a2-809a-4077-af7f-d7bc54e40e3a-kube-api-access-pt5bg" (OuterVolumeSpecName: "kube-api-access-pt5bg") pod "40c384a2-809a-4077-af7f-d7bc54e40e3a" (UID: "40c384a2-809a-4077-af7f-d7bc54e40e3a"). InnerVolumeSpecName "kube-api-access-pt5bg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:06:48 crc kubenswrapper[4750]: I1008 20:06:48.660191 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pt5bg\" (UniqueName: \"kubernetes.io/projected/40c384a2-809a-4077-af7f-d7bc54e40e3a-kube-api-access-pt5bg\") on node \"crc\" DevicePath \"\"" Oct 08 20:06:49 crc kubenswrapper[4750]: I1008 20:06:49.321855 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7b8x6/crc-debug-9hrh2" event={"ID":"40c384a2-809a-4077-af7f-d7bc54e40e3a","Type":"ContainerDied","Data":"abcd0187b855063dfba9daa11385c031994988337a09730f1e0057bdf3bb1463"} Oct 08 20:06:49 crc kubenswrapper[4750]: I1008 20:06:49.322466 4750 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abcd0187b855063dfba9daa11385c031994988337a09730f1e0057bdf3bb1463" Oct 08 20:06:49 crc kubenswrapper[4750]: I1008 20:06:49.321983 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7b8x6/crc-debug-9hrh2" Oct 08 20:06:54 crc kubenswrapper[4750]: I1008 20:06:54.905908 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rqdgp"] Oct 08 20:06:54 crc kubenswrapper[4750]: E1008 20:06:54.907053 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40c384a2-809a-4077-af7f-d7bc54e40e3a" containerName="container-00" Oct 08 20:06:54 crc kubenswrapper[4750]: I1008 20:06:54.907073 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="40c384a2-809a-4077-af7f-d7bc54e40e3a" containerName="container-00" Oct 08 20:06:54 crc kubenswrapper[4750]: I1008 20:06:54.907344 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="40c384a2-809a-4077-af7f-d7bc54e40e3a" containerName="container-00" Oct 08 20:06:54 crc kubenswrapper[4750]: I1008 20:06:54.909789 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rqdgp" Oct 08 20:06:54 crc kubenswrapper[4750]: I1008 20:06:54.920867 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rqdgp"] Oct 08 20:06:55 crc kubenswrapper[4750]: I1008 20:06:55.023477 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fabbacca-78c0-474f-8716-89e8211f0f0b-utilities\") pod \"redhat-marketplace-rqdgp\" (UID: \"fabbacca-78c0-474f-8716-89e8211f0f0b\") " pod="openshift-marketplace/redhat-marketplace-rqdgp" Oct 08 20:06:55 crc kubenswrapper[4750]: I1008 20:06:55.023524 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd97t\" (UniqueName: \"kubernetes.io/projected/fabbacca-78c0-474f-8716-89e8211f0f0b-kube-api-access-hd97t\") pod \"redhat-marketplace-rqdgp\" (UID: \"fabbacca-78c0-474f-8716-89e8211f0f0b\") " pod="openshift-marketplace/redhat-marketplace-rqdgp" Oct 08 20:06:55 crc kubenswrapper[4750]: I1008 20:06:55.024033 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fabbacca-78c0-474f-8716-89e8211f0f0b-catalog-content\") pod \"redhat-marketplace-rqdgp\" (UID: \"fabbacca-78c0-474f-8716-89e8211f0f0b\") " pod="openshift-marketplace/redhat-marketplace-rqdgp" Oct 08 20:06:55 crc kubenswrapper[4750]: I1008 20:06:55.126454 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fabbacca-78c0-474f-8716-89e8211f0f0b-utilities\") pod \"redhat-marketplace-rqdgp\" (UID: \"fabbacca-78c0-474f-8716-89e8211f0f0b\") " pod="openshift-marketplace/redhat-marketplace-rqdgp" Oct 08 20:06:55 crc kubenswrapper[4750]: I1008 20:06:55.126502 4750 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-hd97t\" (UniqueName: \"kubernetes.io/projected/fabbacca-78c0-474f-8716-89e8211f0f0b-kube-api-access-hd97t\") pod \"redhat-marketplace-rqdgp\" (UID: \"fabbacca-78c0-474f-8716-89e8211f0f0b\") " pod="openshift-marketplace/redhat-marketplace-rqdgp" Oct 08 20:06:55 crc kubenswrapper[4750]: I1008 20:06:55.126594 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fabbacca-78c0-474f-8716-89e8211f0f0b-catalog-content\") pod \"redhat-marketplace-rqdgp\" (UID: \"fabbacca-78c0-474f-8716-89e8211f0f0b\") " pod="openshift-marketplace/redhat-marketplace-rqdgp" Oct 08 20:06:55 crc kubenswrapper[4750]: I1008 20:06:55.127098 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fabbacca-78c0-474f-8716-89e8211f0f0b-catalog-content\") pod \"redhat-marketplace-rqdgp\" (UID: \"fabbacca-78c0-474f-8716-89e8211f0f0b\") " pod="openshift-marketplace/redhat-marketplace-rqdgp" Oct 08 20:06:55 crc kubenswrapper[4750]: I1008 20:06:55.127491 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fabbacca-78c0-474f-8716-89e8211f0f0b-utilities\") pod \"redhat-marketplace-rqdgp\" (UID: \"fabbacca-78c0-474f-8716-89e8211f0f0b\") " pod="openshift-marketplace/redhat-marketplace-rqdgp" Oct 08 20:06:55 crc kubenswrapper[4750]: I1008 20:06:55.165343 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd97t\" (UniqueName: \"kubernetes.io/projected/fabbacca-78c0-474f-8716-89e8211f0f0b-kube-api-access-hd97t\") pod \"redhat-marketplace-rqdgp\" (UID: \"fabbacca-78c0-474f-8716-89e8211f0f0b\") " pod="openshift-marketplace/redhat-marketplace-rqdgp" Oct 08 20:06:55 crc kubenswrapper[4750]: I1008 20:06:55.249177 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rqdgp" Oct 08 20:06:55 crc kubenswrapper[4750]: I1008 20:06:55.961799 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rqdgp"] Oct 08 20:06:56 crc kubenswrapper[4750]: I1008 20:06:56.408335 4750 generic.go:334] "Generic (PLEG): container finished" podID="fabbacca-78c0-474f-8716-89e8211f0f0b" containerID="217299addbab444046b11c1a167f29ca8c0fc647a12af13282ff526d9ca11b37" exitCode=0 Oct 08 20:06:56 crc kubenswrapper[4750]: I1008 20:06:56.408699 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rqdgp" event={"ID":"fabbacca-78c0-474f-8716-89e8211f0f0b","Type":"ContainerDied","Data":"217299addbab444046b11c1a167f29ca8c0fc647a12af13282ff526d9ca11b37"} Oct 08 20:06:56 crc kubenswrapper[4750]: I1008 20:06:56.408922 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rqdgp" event={"ID":"fabbacca-78c0-474f-8716-89e8211f0f0b","Type":"ContainerStarted","Data":"7f3899404658a2c1583ecac17a61a5d6a77ff4feb2baf3cee7b26eed75b32670"} Oct 08 20:06:56 crc kubenswrapper[4750]: I1008 20:06:56.411978 4750 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 20:06:58 crc kubenswrapper[4750]: I1008 20:06:58.129673 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7b8x6/crc-debug-9hrh2"] Oct 08 20:06:58 crc kubenswrapper[4750]: I1008 20:06:58.137756 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7b8x6/crc-debug-9hrh2"] Oct 08 20:06:58 crc kubenswrapper[4750]: I1008 20:06:58.435872 4750 generic.go:334] "Generic (PLEG): container finished" podID="fabbacca-78c0-474f-8716-89e8211f0f0b" containerID="e8e557200f4d0b756afae57e4cb8cae2628df0f3381c0038509cf6552cbf7f8e" exitCode=0 Oct 08 20:06:58 crc kubenswrapper[4750]: I1008 20:06:58.435931 4750 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rqdgp" event={"ID":"fabbacca-78c0-474f-8716-89e8211f0f0b","Type":"ContainerDied","Data":"e8e557200f4d0b756afae57e4cb8cae2628df0f3381c0038509cf6552cbf7f8e"} Oct 08 20:06:58 crc kubenswrapper[4750]: I1008 20:06:58.757410 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40c384a2-809a-4077-af7f-d7bc54e40e3a" path="/var/lib/kubelet/pods/40c384a2-809a-4077-af7f-d7bc54e40e3a/volumes" Oct 08 20:06:59 crc kubenswrapper[4750]: I1008 20:06:59.326483 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7b8x6/crc-debug-8smh8"] Oct 08 20:06:59 crc kubenswrapper[4750]: I1008 20:06:59.328880 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7b8x6/crc-debug-8smh8" Oct 08 20:06:59 crc kubenswrapper[4750]: I1008 20:06:59.451340 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rqdgp" event={"ID":"fabbacca-78c0-474f-8716-89e8211f0f0b","Type":"ContainerStarted","Data":"7f53413e467dcaf048f6e362129d73ce3b2a166b9177c41f80c84d7d08fad047"} Oct 08 20:06:59 crc kubenswrapper[4750]: I1008 20:06:59.469192 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/12092786-d7a6-4c1f-9784-61fa90f0971e-host\") pod \"crc-debug-8smh8\" (UID: \"12092786-d7a6-4c1f-9784-61fa90f0971e\") " pod="openshift-must-gather-7b8x6/crc-debug-8smh8" Oct 08 20:06:59 crc kubenswrapper[4750]: I1008 20:06:59.469417 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjtzk\" (UniqueName: \"kubernetes.io/projected/12092786-d7a6-4c1f-9784-61fa90f0971e-kube-api-access-zjtzk\") pod \"crc-debug-8smh8\" (UID: \"12092786-d7a6-4c1f-9784-61fa90f0971e\") " pod="openshift-must-gather-7b8x6/crc-debug-8smh8" Oct 08 20:06:59 crc kubenswrapper[4750]: 
I1008 20:06:59.485517 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rqdgp" podStartSLOduration=2.970754893 podStartE2EDuration="5.485492303s" podCreationTimestamp="2025-10-08 20:06:54 +0000 UTC" firstStartedPulling="2025-10-08 20:06:56.41174923 +0000 UTC m=+6972.324720243" lastFinishedPulling="2025-10-08 20:06:58.92648665 +0000 UTC m=+6974.839457653" observedRunningTime="2025-10-08 20:06:59.477964055 +0000 UTC m=+6975.390935088" watchObservedRunningTime="2025-10-08 20:06:59.485492303 +0000 UTC m=+6975.398463316" Oct 08 20:06:59 crc kubenswrapper[4750]: I1008 20:06:59.571984 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjtzk\" (UniqueName: \"kubernetes.io/projected/12092786-d7a6-4c1f-9784-61fa90f0971e-kube-api-access-zjtzk\") pod \"crc-debug-8smh8\" (UID: \"12092786-d7a6-4c1f-9784-61fa90f0971e\") " pod="openshift-must-gather-7b8x6/crc-debug-8smh8" Oct 08 20:06:59 crc kubenswrapper[4750]: I1008 20:06:59.572166 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/12092786-d7a6-4c1f-9784-61fa90f0971e-host\") pod \"crc-debug-8smh8\" (UID: \"12092786-d7a6-4c1f-9784-61fa90f0971e\") " pod="openshift-must-gather-7b8x6/crc-debug-8smh8" Oct 08 20:06:59 crc kubenswrapper[4750]: I1008 20:06:59.572350 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/12092786-d7a6-4c1f-9784-61fa90f0971e-host\") pod \"crc-debug-8smh8\" (UID: \"12092786-d7a6-4c1f-9784-61fa90f0971e\") " pod="openshift-must-gather-7b8x6/crc-debug-8smh8" Oct 08 20:06:59 crc kubenswrapper[4750]: I1008 20:06:59.602430 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjtzk\" (UniqueName: \"kubernetes.io/projected/12092786-d7a6-4c1f-9784-61fa90f0971e-kube-api-access-zjtzk\") pod \"crc-debug-8smh8\" (UID: 
\"12092786-d7a6-4c1f-9784-61fa90f0971e\") " pod="openshift-must-gather-7b8x6/crc-debug-8smh8" Oct 08 20:06:59 crc kubenswrapper[4750]: I1008 20:06:59.692269 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7b8x6/crc-debug-8smh8" Oct 08 20:06:59 crc kubenswrapper[4750]: W1008 20:06:59.725267 4750 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12092786_d7a6_4c1f_9784_61fa90f0971e.slice/crio-dbb3a99e28f6b2c4ab048ee0aa8cd3f365f22d773d3d60e3f3651dcb2f900855 WatchSource:0}: Error finding container dbb3a99e28f6b2c4ab048ee0aa8cd3f365f22d773d3d60e3f3651dcb2f900855: Status 404 returned error can't find the container with id dbb3a99e28f6b2c4ab048ee0aa8cd3f365f22d773d3d60e3f3651dcb2f900855 Oct 08 20:07:00 crc kubenswrapper[4750]: I1008 20:07:00.462563 4750 generic.go:334] "Generic (PLEG): container finished" podID="12092786-d7a6-4c1f-9784-61fa90f0971e" containerID="2a79929131708f99771d2a11ab681c805be9b2dd1eddb16ba0e93a4c823cb696" exitCode=0 Oct 08 20:07:00 crc kubenswrapper[4750]: I1008 20:07:00.462609 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7b8x6/crc-debug-8smh8" event={"ID":"12092786-d7a6-4c1f-9784-61fa90f0971e","Type":"ContainerDied","Data":"2a79929131708f99771d2a11ab681c805be9b2dd1eddb16ba0e93a4c823cb696"} Oct 08 20:07:00 crc kubenswrapper[4750]: I1008 20:07:00.463087 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7b8x6/crc-debug-8smh8" event={"ID":"12092786-d7a6-4c1f-9784-61fa90f0971e","Type":"ContainerStarted","Data":"dbb3a99e28f6b2c4ab048ee0aa8cd3f365f22d773d3d60e3f3651dcb2f900855"} Oct 08 20:07:00 crc kubenswrapper[4750]: I1008 20:07:00.514599 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7b8x6/crc-debug-8smh8"] Oct 08 20:07:00 crc kubenswrapper[4750]: I1008 20:07:00.524781 4750 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-must-gather-7b8x6/crc-debug-8smh8"] Oct 08 20:07:01 crc kubenswrapper[4750]: I1008 20:07:01.608067 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7b8x6/crc-debug-8smh8" Oct 08 20:07:01 crc kubenswrapper[4750]: I1008 20:07:01.731268 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjtzk\" (UniqueName: \"kubernetes.io/projected/12092786-d7a6-4c1f-9784-61fa90f0971e-kube-api-access-zjtzk\") pod \"12092786-d7a6-4c1f-9784-61fa90f0971e\" (UID: \"12092786-d7a6-4c1f-9784-61fa90f0971e\") " Oct 08 20:07:01 crc kubenswrapper[4750]: I1008 20:07:01.731362 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/12092786-d7a6-4c1f-9784-61fa90f0971e-host\") pod \"12092786-d7a6-4c1f-9784-61fa90f0971e\" (UID: \"12092786-d7a6-4c1f-9784-61fa90f0971e\") " Oct 08 20:07:01 crc kubenswrapper[4750]: I1008 20:07:01.731542 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12092786-d7a6-4c1f-9784-61fa90f0971e-host" (OuterVolumeSpecName: "host") pod "12092786-d7a6-4c1f-9784-61fa90f0971e" (UID: "12092786-d7a6-4c1f-9784-61fa90f0971e"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 20:07:01 crc kubenswrapper[4750]: I1008 20:07:01.732249 4750 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/12092786-d7a6-4c1f-9784-61fa90f0971e-host\") on node \"crc\" DevicePath \"\"" Oct 08 20:07:01 crc kubenswrapper[4750]: I1008 20:07:01.742889 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12092786-d7a6-4c1f-9784-61fa90f0971e-kube-api-access-zjtzk" (OuterVolumeSpecName: "kube-api-access-zjtzk") pod "12092786-d7a6-4c1f-9784-61fa90f0971e" (UID: "12092786-d7a6-4c1f-9784-61fa90f0971e"). 
InnerVolumeSpecName "kube-api-access-zjtzk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:07:01 crc kubenswrapper[4750]: I1008 20:07:01.835979 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjtzk\" (UniqueName: \"kubernetes.io/projected/12092786-d7a6-4c1f-9784-61fa90f0971e-kube-api-access-zjtzk\") on node \"crc\" DevicePath \"\"" Oct 08 20:07:02 crc kubenswrapper[4750]: I1008 20:07:02.486320 4750 scope.go:117] "RemoveContainer" containerID="2a79929131708f99771d2a11ab681c805be9b2dd1eddb16ba0e93a4c823cb696" Oct 08 20:07:02 crc kubenswrapper[4750]: I1008 20:07:02.486482 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7b8x6/crc-debug-8smh8" Oct 08 20:07:02 crc kubenswrapper[4750]: I1008 20:07:02.531907 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1bfea55cf6f540853d0c0d919aed51b5fd6e2d105cc44a68cb64399f8bnvgdl_f62fe45d-4c16-42a3-8445-075a733b6a13/util/0.log" Oct 08 20:07:02 crc kubenswrapper[4750]: I1008 20:07:02.749811 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12092786-d7a6-4c1f-9784-61fa90f0971e" path="/var/lib/kubelet/pods/12092786-d7a6-4c1f-9784-61fa90f0971e/volumes" Oct 08 20:07:02 crc kubenswrapper[4750]: I1008 20:07:02.812736 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1bfea55cf6f540853d0c0d919aed51b5fd6e2d105cc44a68cb64399f8bnvgdl_f62fe45d-4c16-42a3-8445-075a733b6a13/util/0.log" Oct 08 20:07:02 crc kubenswrapper[4750]: I1008 20:07:02.838905 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1bfea55cf6f540853d0c0d919aed51b5fd6e2d105cc44a68cb64399f8bnvgdl_f62fe45d-4c16-42a3-8445-075a733b6a13/pull/0.log" Oct 08 20:07:02 crc kubenswrapper[4750]: I1008 20:07:02.839088 4750 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_1bfea55cf6f540853d0c0d919aed51b5fd6e2d105cc44a68cb64399f8bnvgdl_f62fe45d-4c16-42a3-8445-075a733b6a13/pull/0.log" Oct 08 20:07:03 crc kubenswrapper[4750]: I1008 20:07:03.074352 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1bfea55cf6f540853d0c0d919aed51b5fd6e2d105cc44a68cb64399f8bnvgdl_f62fe45d-4c16-42a3-8445-075a733b6a13/util/0.log" Oct 08 20:07:03 crc kubenswrapper[4750]: I1008 20:07:03.137723 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1bfea55cf6f540853d0c0d919aed51b5fd6e2d105cc44a68cb64399f8bnvgdl_f62fe45d-4c16-42a3-8445-075a733b6a13/extract/0.log" Oct 08 20:07:03 crc kubenswrapper[4750]: I1008 20:07:03.159998 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1bfea55cf6f540853d0c0d919aed51b5fd6e2d105cc44a68cb64399f8bnvgdl_f62fe45d-4c16-42a3-8445-075a733b6a13/pull/0.log" Oct 08 20:07:03 crc kubenswrapper[4750]: I1008 20:07:03.331652 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-658bdf4b74-vwnz5_b22d34d7-5452-4982-96ff-100a0fbdf514/kube-rbac-proxy/0.log" Oct 08 20:07:03 crc kubenswrapper[4750]: I1008 20:07:03.377850 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7b7fb68549-k8v2z_07dc6c54-47cf-48e9-aed2-3016f69594de/kube-rbac-proxy/0.log" Oct 08 20:07:03 crc kubenswrapper[4750]: I1008 20:07:03.418953 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-658bdf4b74-vwnz5_b22d34d7-5452-4982-96ff-100a0fbdf514/manager/0.log" Oct 08 20:07:03 crc kubenswrapper[4750]: I1008 20:07:03.631386 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-85d5d9dd78-2flss_b41e76ff-471c-4874-88ab-66b6fae3a84a/manager/0.log" Oct 08 20:07:03 crc 
kubenswrapper[4750]: I1008 20:07:03.639472 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-85d5d9dd78-2flss_b41e76ff-471c-4874-88ab-66b6fae3a84a/kube-rbac-proxy/0.log" Oct 08 20:07:03 crc kubenswrapper[4750]: I1008 20:07:03.639643 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7b7fb68549-k8v2z_07dc6c54-47cf-48e9-aed2-3016f69594de/manager/0.log" Oct 08 20:07:03 crc kubenswrapper[4750]: I1008 20:07:03.899985 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84b9b84486-pcb44_3e9cf3d8-0e7a-40ce-a32e-25e8700d8307/kube-rbac-proxy/0.log" Oct 08 20:07:04 crc kubenswrapper[4750]: I1008 20:07:04.033271 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84b9b84486-pcb44_3e9cf3d8-0e7a-40ce-a32e-25e8700d8307/manager/0.log" Oct 08 20:07:04 crc kubenswrapper[4750]: I1008 20:07:04.106586 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-858f76bbdd-km9s8_3e418980-7ef9-45e8-8d7d-5bd866af0d26/kube-rbac-proxy/0.log" Oct 08 20:07:04 crc kubenswrapper[4750]: I1008 20:07:04.170449 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-858f76bbdd-km9s8_3e418980-7ef9-45e8-8d7d-5bd866af0d26/manager/0.log" Oct 08 20:07:04 crc kubenswrapper[4750]: I1008 20:07:04.250032 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-7ffbcb7588-nm5np_6b0569ca-c2be-4209-bce5-79feea80203c/kube-rbac-proxy/0.log" Oct 08 20:07:04 crc kubenswrapper[4750]: I1008 20:07:04.354893 4750 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-7ffbcb7588-nm5np_6b0569ca-c2be-4209-bce5-79feea80203c/manager/0.log" Oct 08 20:07:04 crc kubenswrapper[4750]: I1008 20:07:04.372270 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-656bcbd775-drhrw_716ce9a4-ee82-45d0-bb63-fab1eed3c1db/kube-rbac-proxy/0.log" Oct 08 20:07:04 crc kubenswrapper[4750]: I1008 20:07:04.567211 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-9c5c78d49-s454n_47b84278-c660-43eb-8c7d-686ceed80afd/kube-rbac-proxy/0.log" Oct 08 20:07:04 crc kubenswrapper[4750]: I1008 20:07:04.758894 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-9c5c78d49-s454n_47b84278-c660-43eb-8c7d-686ceed80afd/manager/0.log" Oct 08 20:07:04 crc kubenswrapper[4750]: I1008 20:07:04.841019 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-656bcbd775-drhrw_716ce9a4-ee82-45d0-bb63-fab1eed3c1db/manager/0.log" Oct 08 20:07:04 crc kubenswrapper[4750]: I1008 20:07:04.959060 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-55b6b7c7b8-f857v_6bc7e757-dd1c-4334-b83e-c4bc73e96658/kube-rbac-proxy/0.log" Oct 08 20:07:05 crc kubenswrapper[4750]: I1008 20:07:05.104856 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5f67fbc655-prrpn_1f01efe4-3233-44d1-8def-b5080cc50aac/kube-rbac-proxy/0.log" Oct 08 20:07:05 crc kubenswrapper[4750]: I1008 20:07:05.113718 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-55b6b7c7b8-f857v_6bc7e757-dd1c-4334-b83e-c4bc73e96658/manager/0.log" Oct 08 20:07:05 crc kubenswrapper[4750]: I1008 20:07:05.245669 
4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5f67fbc655-prrpn_1f01efe4-3233-44d1-8def-b5080cc50aac/manager/0.log" Oct 08 20:07:05 crc kubenswrapper[4750]: I1008 20:07:05.249726 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rqdgp" Oct 08 20:07:05 crc kubenswrapper[4750]: I1008 20:07:05.249803 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rqdgp" Oct 08 20:07:05 crc kubenswrapper[4750]: I1008 20:07:05.314828 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rqdgp" Oct 08 20:07:05 crc kubenswrapper[4750]: I1008 20:07:05.344849 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-f9fb45f8f-k6wbx_e03ff9b9-1896-4759-8bf2-66cb9e1a5a55/kube-rbac-proxy/0.log" Oct 08 20:07:05 crc kubenswrapper[4750]: I1008 20:07:05.373152 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-f9fb45f8f-k6wbx_e03ff9b9-1896-4759-8bf2-66cb9e1a5a55/manager/0.log" Oct 08 20:07:05 crc kubenswrapper[4750]: I1008 20:07:05.573543 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-79d585cb66-9wgqj_280ba6f4-ec1f-4afe-8a1a-db5f030495ae/kube-rbac-proxy/0.log" Oct 08 20:07:05 crc kubenswrapper[4750]: I1008 20:07:05.587506 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rqdgp" Oct 08 20:07:05 crc kubenswrapper[4750]: I1008 20:07:05.650374 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-79d585cb66-9wgqj_280ba6f4-ec1f-4afe-8a1a-db5f030495ae/manager/0.log" Oct 08 20:07:05 crc 
kubenswrapper[4750]: I1008 20:07:05.651934 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rqdgp"] Oct 08 20:07:05 crc kubenswrapper[4750]: I1008 20:07:05.694578 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5df598886f-w4sqr_c53530b4-353e-4c78-87ad-baacd725dc79/kube-rbac-proxy/0.log" Oct 08 20:07:05 crc kubenswrapper[4750]: I1008 20:07:05.921500 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69fdcfc5f5-xhnm8_909207ba-552c-4172-94c1-2e660e86e568/kube-rbac-proxy/0.log" Oct 08 20:07:05 crc kubenswrapper[4750]: I1008 20:07:05.947801 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69fdcfc5f5-xhnm8_909207ba-552c-4172-94c1-2e660e86e568/manager/0.log" Oct 08 20:07:05 crc kubenswrapper[4750]: I1008 20:07:05.996512 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5df598886f-w4sqr_c53530b4-353e-4c78-87ad-baacd725dc79/manager/0.log" Oct 08 20:07:06 crc kubenswrapper[4750]: I1008 20:07:06.163509 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-677c5f5bff5bxpg_7bda6fa0-aa4a-4e4b-9165-c50b976bdce7/kube-rbac-proxy/0.log" Oct 08 20:07:06 crc kubenswrapper[4750]: I1008 20:07:06.196407 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-677c5f5bff5bxpg_7bda6fa0-aa4a-4e4b-9165-c50b976bdce7/manager/0.log" Oct 08 20:07:06 crc kubenswrapper[4750]: I1008 20:07:06.596980 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-598c4c5b5-s82d2_6115ecee-36ba-4281-afa3-21685170393b/kube-rbac-proxy/0.log" Oct 08 20:07:06 crc 
kubenswrapper[4750]: I1008 20:07:06.861799 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-bd6bc67fb-f8sjn_f5fcb06d-f818-4f9f-abe4-0f272bdf2681/kube-rbac-proxy/0.log" Oct 08 20:07:07 crc kubenswrapper[4750]: I1008 20:07:07.054763 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-bd6bc67fb-f8sjn_f5fcb06d-f818-4f9f-abe4-0f272bdf2681/operator/0.log" Oct 08 20:07:07 crc kubenswrapper[4750]: I1008 20:07:07.190968 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-79db49b9fb-bs9x4_9db814e8-3488-4ae4-ad24-b6f93f70c232/kube-rbac-proxy/0.log" Oct 08 20:07:07 crc kubenswrapper[4750]: I1008 20:07:07.204601 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-wvq8j_0a3ef1a8-10f5-4292-84de-04bc8c2cff44/registry-server/0.log" Oct 08 20:07:07 crc kubenswrapper[4750]: I1008 20:07:07.418388 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-79db49b9fb-bs9x4_9db814e8-3488-4ae4-ad24-b6f93f70c232/manager/0.log" Oct 08 20:07:07 crc kubenswrapper[4750]: I1008 20:07:07.464984 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-68b6c87b68-kn2bm_9323399b-7f22-41aa-9db7-99772d8d6719/kube-rbac-proxy/0.log" Oct 08 20:07:07 crc kubenswrapper[4750]: I1008 20:07:07.572527 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rqdgp" podUID="fabbacca-78c0-474f-8716-89e8211f0f0b" containerName="registry-server" containerID="cri-o://7f53413e467dcaf048f6e362129d73ce3b2a166b9177c41f80c84d7d08fad047" gracePeriod=2 Oct 08 20:07:07 crc kubenswrapper[4750]: I1008 20:07:07.631451 4750 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_placement-operator-controller-manager-68b6c87b68-kn2bm_9323399b-7f22-41aa-9db7-99772d8d6719/manager/0.log" Oct 08 20:07:07 crc kubenswrapper[4750]: I1008 20:07:07.768342 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-tcnvk_bc8e10c4-5871-4909-af47-4b2c1c78d3be/operator/0.log" Oct 08 20:07:07 crc kubenswrapper[4750]: I1008 20:07:07.991769 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-db6d7f97b-qnpbb_18f1f857-f60c-4c47-b5c9-d60207e60ef5/kube-rbac-proxy/0.log" Oct 08 20:07:08 crc kubenswrapper[4750]: I1008 20:07:08.067142 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76796d4c6b-vrwkf_2239d744-45d5-40ac-b30a-929c4edd8489/kube-rbac-proxy/0.log" Oct 08 20:07:08 crc kubenswrapper[4750]: I1008 20:07:08.161278 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rqdgp" Oct 08 20:07:08 crc kubenswrapper[4750]: I1008 20:07:08.260683 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-db6d7f97b-qnpbb_18f1f857-f60c-4c47-b5c9-d60207e60ef5/manager/0.log" Oct 08 20:07:08 crc kubenswrapper[4750]: I1008 20:07:08.323888 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hd97t\" (UniqueName: \"kubernetes.io/projected/fabbacca-78c0-474f-8716-89e8211f0f0b-kube-api-access-hd97t\") pod \"fabbacca-78c0-474f-8716-89e8211f0f0b\" (UID: \"fabbacca-78c0-474f-8716-89e8211f0f0b\") " Oct 08 20:07:08 crc kubenswrapper[4750]: I1008 20:07:08.323962 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fabbacca-78c0-474f-8716-89e8211f0f0b-utilities\") pod \"fabbacca-78c0-474f-8716-89e8211f0f0b\" (UID: \"fabbacca-78c0-474f-8716-89e8211f0f0b\") " Oct 08 20:07:08 crc kubenswrapper[4750]: I1008 20:07:08.324011 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fabbacca-78c0-474f-8716-89e8211f0f0b-catalog-content\") pod \"fabbacca-78c0-474f-8716-89e8211f0f0b\" (UID: \"fabbacca-78c0-474f-8716-89e8211f0f0b\") " Oct 08 20:07:08 crc kubenswrapper[4750]: I1008 20:07:08.325797 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fabbacca-78c0-474f-8716-89e8211f0f0b-utilities" (OuterVolumeSpecName: "utilities") pod "fabbacca-78c0-474f-8716-89e8211f0f0b" (UID: "fabbacca-78c0-474f-8716-89e8211f0f0b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:07:08 crc kubenswrapper[4750]: I1008 20:07:08.343760 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fabbacca-78c0-474f-8716-89e8211f0f0b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fabbacca-78c0-474f-8716-89e8211f0f0b" (UID: "fabbacca-78c0-474f-8716-89e8211f0f0b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:07:08 crc kubenswrapper[4750]: I1008 20:07:08.347821 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fabbacca-78c0-474f-8716-89e8211f0f0b-kube-api-access-hd97t" (OuterVolumeSpecName: "kube-api-access-hd97t") pod "fabbacca-78c0-474f-8716-89e8211f0f0b" (UID: "fabbacca-78c0-474f-8716-89e8211f0f0b"). InnerVolumeSpecName "kube-api-access-hd97t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:07:08 crc kubenswrapper[4750]: I1008 20:07:08.426531 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hd97t\" (UniqueName: \"kubernetes.io/projected/fabbacca-78c0-474f-8716-89e8211f0f0b-kube-api-access-hd97t\") on node \"crc\" DevicePath \"\"" Oct 08 20:07:08 crc kubenswrapper[4750]: I1008 20:07:08.426583 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fabbacca-78c0-474f-8716-89e8211f0f0b-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 20:07:08 crc kubenswrapper[4750]: I1008 20:07:08.426594 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fabbacca-78c0-474f-8716-89e8211f0f0b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 20:07:08 crc kubenswrapper[4750]: I1008 20:07:08.523210 4750 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_test-operator-controller-manager-56c698c775-jsq4j_9b3555bd-2455-4844-a0a7-16ddfca85ce5/kube-rbac-proxy/0.log" Oct 08 20:07:08 crc kubenswrapper[4750]: I1008 20:07:08.558629 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-56c698c775-jsq4j_9b3555bd-2455-4844-a0a7-16ddfca85ce5/manager/0.log" Oct 08 20:07:08 crc kubenswrapper[4750]: I1008 20:07:08.564881 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76796d4c6b-vrwkf_2239d744-45d5-40ac-b30a-929c4edd8489/manager/0.log" Oct 08 20:07:08 crc kubenswrapper[4750]: I1008 20:07:08.584764 4750 generic.go:334] "Generic (PLEG): container finished" podID="fabbacca-78c0-474f-8716-89e8211f0f0b" containerID="7f53413e467dcaf048f6e362129d73ce3b2a166b9177c41f80c84d7d08fad047" exitCode=0 Oct 08 20:07:08 crc kubenswrapper[4750]: I1008 20:07:08.584817 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rqdgp" event={"ID":"fabbacca-78c0-474f-8716-89e8211f0f0b","Type":"ContainerDied","Data":"7f53413e467dcaf048f6e362129d73ce3b2a166b9177c41f80c84d7d08fad047"} Oct 08 20:07:08 crc kubenswrapper[4750]: I1008 20:07:08.584852 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rqdgp" event={"ID":"fabbacca-78c0-474f-8716-89e8211f0f0b","Type":"ContainerDied","Data":"7f3899404658a2c1583ecac17a61a5d6a77ff4feb2baf3cee7b26eed75b32670"} Oct 08 20:07:08 crc kubenswrapper[4750]: I1008 20:07:08.584872 4750 scope.go:117] "RemoveContainer" containerID="7f53413e467dcaf048f6e362129d73ce3b2a166b9177c41f80c84d7d08fad047" Oct 08 20:07:08 crc kubenswrapper[4750]: I1008 20:07:08.585034 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rqdgp" Oct 08 20:07:08 crc kubenswrapper[4750]: I1008 20:07:08.585495 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-598c4c5b5-s82d2_6115ecee-36ba-4281-afa3-21685170393b/manager/0.log" Oct 08 20:07:08 crc kubenswrapper[4750]: I1008 20:07:08.621663 4750 scope.go:117] "RemoveContainer" containerID="e8e557200f4d0b756afae57e4cb8cae2628df0f3381c0038509cf6552cbf7f8e" Oct 08 20:07:08 crc kubenswrapper[4750]: I1008 20:07:08.626744 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rqdgp"] Oct 08 20:07:08 crc kubenswrapper[4750]: I1008 20:07:08.654418 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rqdgp"] Oct 08 20:07:08 crc kubenswrapper[4750]: I1008 20:07:08.659707 4750 scope.go:117] "RemoveContainer" containerID="217299addbab444046b11c1a167f29ca8c0fc647a12af13282ff526d9ca11b37" Oct 08 20:07:08 crc kubenswrapper[4750]: I1008 20:07:08.709853 4750 scope.go:117] "RemoveContainer" containerID="7f53413e467dcaf048f6e362129d73ce3b2a166b9177c41f80c84d7d08fad047" Oct 08 20:07:08 crc kubenswrapper[4750]: E1008 20:07:08.710363 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f53413e467dcaf048f6e362129d73ce3b2a166b9177c41f80c84d7d08fad047\": container with ID starting with 7f53413e467dcaf048f6e362129d73ce3b2a166b9177c41f80c84d7d08fad047 not found: ID does not exist" containerID="7f53413e467dcaf048f6e362129d73ce3b2a166b9177c41f80c84d7d08fad047" Oct 08 20:07:08 crc kubenswrapper[4750]: I1008 20:07:08.710422 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f53413e467dcaf048f6e362129d73ce3b2a166b9177c41f80c84d7d08fad047"} err="failed to get container status 
\"7f53413e467dcaf048f6e362129d73ce3b2a166b9177c41f80c84d7d08fad047\": rpc error: code = NotFound desc = could not find container \"7f53413e467dcaf048f6e362129d73ce3b2a166b9177c41f80c84d7d08fad047\": container with ID starting with 7f53413e467dcaf048f6e362129d73ce3b2a166b9177c41f80c84d7d08fad047 not found: ID does not exist" Oct 08 20:07:08 crc kubenswrapper[4750]: I1008 20:07:08.710457 4750 scope.go:117] "RemoveContainer" containerID="e8e557200f4d0b756afae57e4cb8cae2628df0f3381c0038509cf6552cbf7f8e" Oct 08 20:07:08 crc kubenswrapper[4750]: E1008 20:07:08.710817 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8e557200f4d0b756afae57e4cb8cae2628df0f3381c0038509cf6552cbf7f8e\": container with ID starting with e8e557200f4d0b756afae57e4cb8cae2628df0f3381c0038509cf6552cbf7f8e not found: ID does not exist" containerID="e8e557200f4d0b756afae57e4cb8cae2628df0f3381c0038509cf6552cbf7f8e" Oct 08 20:07:08 crc kubenswrapper[4750]: I1008 20:07:08.710844 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8e557200f4d0b756afae57e4cb8cae2628df0f3381c0038509cf6552cbf7f8e"} err="failed to get container status \"e8e557200f4d0b756afae57e4cb8cae2628df0f3381c0038509cf6552cbf7f8e\": rpc error: code = NotFound desc = could not find container \"e8e557200f4d0b756afae57e4cb8cae2628df0f3381c0038509cf6552cbf7f8e\": container with ID starting with e8e557200f4d0b756afae57e4cb8cae2628df0f3381c0038509cf6552cbf7f8e not found: ID does not exist" Oct 08 20:07:08 crc kubenswrapper[4750]: I1008 20:07:08.710861 4750 scope.go:117] "RemoveContainer" containerID="217299addbab444046b11c1a167f29ca8c0fc647a12af13282ff526d9ca11b37" Oct 08 20:07:08 crc kubenswrapper[4750]: E1008 20:07:08.711171 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"217299addbab444046b11c1a167f29ca8c0fc647a12af13282ff526d9ca11b37\": container with ID starting with 217299addbab444046b11c1a167f29ca8c0fc647a12af13282ff526d9ca11b37 not found: ID does not exist" containerID="217299addbab444046b11c1a167f29ca8c0fc647a12af13282ff526d9ca11b37" Oct 08 20:07:08 crc kubenswrapper[4750]: I1008 20:07:08.711192 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"217299addbab444046b11c1a167f29ca8c0fc647a12af13282ff526d9ca11b37"} err="failed to get container status \"217299addbab444046b11c1a167f29ca8c0fc647a12af13282ff526d9ca11b37\": rpc error: code = NotFound desc = could not find container \"217299addbab444046b11c1a167f29ca8c0fc647a12af13282ff526d9ca11b37\": container with ID starting with 217299addbab444046b11c1a167f29ca8c0fc647a12af13282ff526d9ca11b37 not found: ID does not exist" Oct 08 20:07:08 crc kubenswrapper[4750]: I1008 20:07:08.750321 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fabbacca-78c0-474f-8716-89e8211f0f0b" path="/var/lib/kubelet/pods/fabbacca-78c0-474f-8716-89e8211f0f0b/volumes" Oct 08 20:07:08 crc kubenswrapper[4750]: I1008 20:07:08.808221 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-7794bc6bd-7876q_96e0828f-b805-4aa0-b0fc-e64c3aeb6120/manager/0.log" Oct 08 20:07:08 crc kubenswrapper[4750]: I1008 20:07:08.836139 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-7794bc6bd-7876q_96e0828f-b805-4aa0-b0fc-e64c3aeb6120/kube-rbac-proxy/0.log" Oct 08 20:07:13 crc kubenswrapper[4750]: I1008 20:07:13.434576 4750 scope.go:117] "RemoveContainer" containerID="792c40fe07c7f54b1d2f97afe714e3259c9c192ae8e79c4c74ec9bc817c49abe" Oct 08 20:07:13 crc kubenswrapper[4750]: I1008 20:07:13.466267 4750 scope.go:117] "RemoveContainer" containerID="3b1b97643fc8f00f936334ec7feec6c068e4eceb47fa4e69552de4fcbed60918" Oct 08 
20:07:28 crc kubenswrapper[4750]: I1008 20:07:28.263381 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-fsv6k_55eee756-293d-4ca8-91d0-6f95f22e72dc/control-plane-machine-set-operator/0.log" Oct 08 20:07:28 crc kubenswrapper[4750]: I1008 20:07:28.500340 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-4r4tz_a20bcaf3-28dc-41f5-8c3e-25b3dfad8202/kube-rbac-proxy/0.log" Oct 08 20:07:28 crc kubenswrapper[4750]: I1008 20:07:28.519630 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-4r4tz_a20bcaf3-28dc-41f5-8c3e-25b3dfad8202/machine-api-operator/0.log" Oct 08 20:07:29 crc kubenswrapper[4750]: I1008 20:07:29.707501 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 20:07:29 crc kubenswrapper[4750]: I1008 20:07:29.707964 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 20:07:42 crc kubenswrapper[4750]: I1008 20:07:42.917481 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-7d4cc89fcb-g5pnf_94b58e9d-c56f-4476-9734-9a12e840e26b/cert-manager-controller/0.log" Oct 08 20:07:43 crc kubenswrapper[4750]: I1008 20:07:43.130396 4750 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-cainjector-7d9f95dbf-wg2g2_8177296f-6138-431e-808b-7f4b643a2a90/cert-manager-cainjector/0.log" Oct 08 20:07:43 crc kubenswrapper[4750]: I1008 20:07:43.198964 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-d969966f-zjr7p_a29a394f-e073-44c0-a666-9370883743bf/cert-manager-webhook/0.log" Oct 08 20:07:58 crc kubenswrapper[4750]: I1008 20:07:58.635751 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-v69zk_2cb56acc-cb75-4140-82db-412d149f4ba0/nmstate-console-plugin/0.log" Oct 08 20:07:58 crc kubenswrapper[4750]: I1008 20:07:58.837527 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-zgswq_83a1d5d8-1631-4df6-8b0e-a48e52fd4972/nmstate-handler/0.log" Oct 08 20:07:58 crc kubenswrapper[4750]: I1008 20:07:58.942604 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-8j7zl_eb4084db-bdd2-4c22-9eaa-0919b73e0977/kube-rbac-proxy/0.log" Oct 08 20:07:58 crc kubenswrapper[4750]: I1008 20:07:58.991595 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-8j7zl_eb4084db-bdd2-4c22-9eaa-0919b73e0977/nmstate-metrics/0.log" Oct 08 20:07:59 crc kubenswrapper[4750]: I1008 20:07:59.116420 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-bjcrl_2915c99e-fa94-482d-ab1d-c506bc61b0ed/nmstate-operator/0.log" Oct 08 20:07:59 crc kubenswrapper[4750]: I1008 20:07:59.233921 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-pzxbn_157527e2-af03-406e-8757-abd19563422d/nmstate-webhook/0.log" Oct 08 20:07:59 crc kubenswrapper[4750]: I1008 20:07:59.707374 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 20:07:59 crc kubenswrapper[4750]: I1008 20:07:59.707436 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 20:08:15 crc kubenswrapper[4750]: I1008 20:08:15.177042 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-8dgbs_9094b73f-9ae7-436d-85cd-4d3feade13ae/kube-rbac-proxy/0.log" Oct 08 20:08:15 crc kubenswrapper[4750]: I1008 20:08:15.404766 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4z9bv_c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1/cp-frr-files/0.log" Oct 08 20:08:15 crc kubenswrapper[4750]: I1008 20:08:15.711170 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-8dgbs_9094b73f-9ae7-436d-85cd-4d3feade13ae/controller/0.log" Oct 08 20:08:15 crc kubenswrapper[4750]: I1008 20:08:15.744838 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4z9bv_c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1/cp-frr-files/0.log" Oct 08 20:08:15 crc kubenswrapper[4750]: I1008 20:08:15.744846 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4z9bv_c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1/cp-reloader/0.log" Oct 08 20:08:15 crc kubenswrapper[4750]: I1008 20:08:15.788396 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4z9bv_c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1/cp-metrics/0.log" Oct 08 20:08:15 crc kubenswrapper[4750]: I1008 20:08:15.887360 4750 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-4z9bv_c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1/cp-reloader/0.log" Oct 08 20:08:16 crc kubenswrapper[4750]: I1008 20:08:16.135447 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4z9bv_c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1/cp-frr-files/0.log" Oct 08 20:08:16 crc kubenswrapper[4750]: I1008 20:08:16.138678 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4z9bv_c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1/cp-metrics/0.log" Oct 08 20:08:16 crc kubenswrapper[4750]: I1008 20:08:16.143306 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4z9bv_c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1/cp-metrics/0.log" Oct 08 20:08:16 crc kubenswrapper[4750]: I1008 20:08:16.186326 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4z9bv_c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1/cp-reloader/0.log" Oct 08 20:08:16 crc kubenswrapper[4750]: I1008 20:08:16.390887 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4z9bv_c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1/cp-reloader/0.log" Oct 08 20:08:16 crc kubenswrapper[4750]: I1008 20:08:16.391156 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4z9bv_c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1/cp-frr-files/0.log" Oct 08 20:08:16 crc kubenswrapper[4750]: I1008 20:08:16.399443 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4z9bv_c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1/cp-metrics/0.log" Oct 08 20:08:16 crc kubenswrapper[4750]: I1008 20:08:16.422892 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4z9bv_c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1/controller/0.log" Oct 08 20:08:16 crc kubenswrapper[4750]: I1008 20:08:16.594503 4750 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-4z9bv_c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1/kube-rbac-proxy/0.log" Oct 08 20:08:16 crc kubenswrapper[4750]: I1008 20:08:16.610351 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rp56q"] Oct 08 20:08:16 crc kubenswrapper[4750]: E1008 20:08:16.611024 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12092786-d7a6-4c1f-9784-61fa90f0971e" containerName="container-00" Oct 08 20:08:16 crc kubenswrapper[4750]: I1008 20:08:16.611046 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="12092786-d7a6-4c1f-9784-61fa90f0971e" containerName="container-00" Oct 08 20:08:16 crc kubenswrapper[4750]: E1008 20:08:16.611077 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fabbacca-78c0-474f-8716-89e8211f0f0b" containerName="extract-content" Oct 08 20:08:16 crc kubenswrapper[4750]: I1008 20:08:16.611084 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="fabbacca-78c0-474f-8716-89e8211f0f0b" containerName="extract-content" Oct 08 20:08:16 crc kubenswrapper[4750]: E1008 20:08:16.611110 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fabbacca-78c0-474f-8716-89e8211f0f0b" containerName="extract-utilities" Oct 08 20:08:16 crc kubenswrapper[4750]: I1008 20:08:16.611117 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="fabbacca-78c0-474f-8716-89e8211f0f0b" containerName="extract-utilities" Oct 08 20:08:16 crc kubenswrapper[4750]: E1008 20:08:16.611130 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fabbacca-78c0-474f-8716-89e8211f0f0b" containerName="registry-server" Oct 08 20:08:16 crc kubenswrapper[4750]: I1008 20:08:16.611136 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="fabbacca-78c0-474f-8716-89e8211f0f0b" containerName="registry-server" Oct 08 20:08:16 crc kubenswrapper[4750]: I1008 20:08:16.611398 4750 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="fabbacca-78c0-474f-8716-89e8211f0f0b" containerName="registry-server" Oct 08 20:08:16 crc kubenswrapper[4750]: I1008 20:08:16.611427 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="12092786-d7a6-4c1f-9784-61fa90f0971e" containerName="container-00" Oct 08 20:08:16 crc kubenswrapper[4750]: I1008 20:08:16.613585 4750 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rp56q" Oct 08 20:08:16 crc kubenswrapper[4750]: I1008 20:08:16.622069 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rp56q"] Oct 08 20:08:16 crc kubenswrapper[4750]: I1008 20:08:16.646613 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8dc40677-14fe-4f80-b36d-2e5e9af343de-catalog-content\") pod \"community-operators-rp56q\" (UID: \"8dc40677-14fe-4f80-b36d-2e5e9af343de\") " pod="openshift-marketplace/community-operators-rp56q" Oct 08 20:08:16 crc kubenswrapper[4750]: I1008 20:08:16.646735 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l5tn\" (UniqueName: \"kubernetes.io/projected/8dc40677-14fe-4f80-b36d-2e5e9af343de-kube-api-access-2l5tn\") pod \"community-operators-rp56q\" (UID: \"8dc40677-14fe-4f80-b36d-2e5e9af343de\") " pod="openshift-marketplace/community-operators-rp56q" Oct 08 20:08:16 crc kubenswrapper[4750]: I1008 20:08:16.646759 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8dc40677-14fe-4f80-b36d-2e5e9af343de-utilities\") pod \"community-operators-rp56q\" (UID: \"8dc40677-14fe-4f80-b36d-2e5e9af343de\") " pod="openshift-marketplace/community-operators-rp56q" Oct 08 20:08:16 crc kubenswrapper[4750]: I1008 20:08:16.671020 4750 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-4z9bv_c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1/frr-metrics/0.log" Oct 08 20:08:16 crc kubenswrapper[4750]: I1008 20:08:16.688032 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4z9bv_c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1/kube-rbac-proxy-frr/0.log" Oct 08 20:08:16 crc kubenswrapper[4750]: I1008 20:08:16.749234 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8dc40677-14fe-4f80-b36d-2e5e9af343de-catalog-content\") pod \"community-operators-rp56q\" (UID: \"8dc40677-14fe-4f80-b36d-2e5e9af343de\") " pod="openshift-marketplace/community-operators-rp56q" Oct 08 20:08:16 crc kubenswrapper[4750]: I1008 20:08:16.749360 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2l5tn\" (UniqueName: \"kubernetes.io/projected/8dc40677-14fe-4f80-b36d-2e5e9af343de-kube-api-access-2l5tn\") pod \"community-operators-rp56q\" (UID: \"8dc40677-14fe-4f80-b36d-2e5e9af343de\") " pod="openshift-marketplace/community-operators-rp56q" Oct 08 20:08:16 crc kubenswrapper[4750]: I1008 20:08:16.749394 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8dc40677-14fe-4f80-b36d-2e5e9af343de-utilities\") pod \"community-operators-rp56q\" (UID: \"8dc40677-14fe-4f80-b36d-2e5e9af343de\") " pod="openshift-marketplace/community-operators-rp56q" Oct 08 20:08:16 crc kubenswrapper[4750]: I1008 20:08:16.749897 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8dc40677-14fe-4f80-b36d-2e5e9af343de-utilities\") pod \"community-operators-rp56q\" (UID: \"8dc40677-14fe-4f80-b36d-2e5e9af343de\") " pod="openshift-marketplace/community-operators-rp56q" Oct 08 20:08:16 crc kubenswrapper[4750]: I1008 20:08:16.750198 4750 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8dc40677-14fe-4f80-b36d-2e5e9af343de-catalog-content\") pod \"community-operators-rp56q\" (UID: \"8dc40677-14fe-4f80-b36d-2e5e9af343de\") " pod="openshift-marketplace/community-operators-rp56q" Oct 08 20:08:16 crc kubenswrapper[4750]: I1008 20:08:16.801341 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l5tn\" (UniqueName: \"kubernetes.io/projected/8dc40677-14fe-4f80-b36d-2e5e9af343de-kube-api-access-2l5tn\") pod \"community-operators-rp56q\" (UID: \"8dc40677-14fe-4f80-b36d-2e5e9af343de\") " pod="openshift-marketplace/community-operators-rp56q" Oct 08 20:08:16 crc kubenswrapper[4750]: I1008 20:08:16.938312 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4z9bv_c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1/reloader/0.log" Oct 08 20:08:16 crc kubenswrapper[4750]: I1008 20:08:16.957831 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rp56q" Oct 08 20:08:16 crc kubenswrapper[4750]: I1008 20:08:16.984000 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-xwpph_3f040b00-d031-47f3-be72-8bdece6ddf78/frr-k8s-webhook-server/0.log" Oct 08 20:08:17 crc kubenswrapper[4750]: I1008 20:08:17.291738 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7767688d85-nqdwk_9b578cc7-f656-4f3c-ae80-a54d325a597e/manager/0.log" Oct 08 20:08:17 crc kubenswrapper[4750]: I1008 20:08:17.521196 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-66f8f8cbb5-65c29_02480923-67a5-453b-a317-2230fa281c71/webhook-server/0.log" Oct 08 20:08:17 crc kubenswrapper[4750]: I1008 20:08:17.546996 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rp56q"] Oct 08 20:08:17 crc kubenswrapper[4750]: I1008 20:08:17.633259 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-sqjmd_85a087dd-81ec-4b22-bed2-e64d25106913/kube-rbac-proxy/0.log" Oct 08 20:08:18 crc kubenswrapper[4750]: I1008 20:08:18.445934 4750 generic.go:334] "Generic (PLEG): container finished" podID="8dc40677-14fe-4f80-b36d-2e5e9af343de" containerID="01ca43a09aa76ad3ac06af4e308150bd7614bd4c283e07f3f2778bce7ef5aa46" exitCode=0 Oct 08 20:08:18 crc kubenswrapper[4750]: I1008 20:08:18.446000 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rp56q" event={"ID":"8dc40677-14fe-4f80-b36d-2e5e9af343de","Type":"ContainerDied","Data":"01ca43a09aa76ad3ac06af4e308150bd7614bd4c283e07f3f2778bce7ef5aa46"} Oct 08 20:08:18 crc kubenswrapper[4750]: I1008 20:08:18.446033 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rp56q" 
event={"ID":"8dc40677-14fe-4f80-b36d-2e5e9af343de","Type":"ContainerStarted","Data":"4e4f9f6fa133b8a33e8366e8db19917c129859b5a6cd5e00e9351e1acf026bf7"} Oct 08 20:08:18 crc kubenswrapper[4750]: I1008 20:08:18.688154 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-sqjmd_85a087dd-81ec-4b22-bed2-e64d25106913/speaker/0.log" Oct 08 20:08:19 crc kubenswrapper[4750]: I1008 20:08:19.656925 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4z9bv_c10359b5-cc1f-4ac5-a360-4cf1a6e7bec1/frr/0.log" Oct 08 20:08:20 crc kubenswrapper[4750]: I1008 20:08:20.492942 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rp56q" event={"ID":"8dc40677-14fe-4f80-b36d-2e5e9af343de","Type":"ContainerStarted","Data":"087f59f098cbd5d14e35d4201825ad69958f1d03420ff7557de4dc086a0c3067"} Oct 08 20:08:21 crc kubenswrapper[4750]: I1008 20:08:21.516241 4750 generic.go:334] "Generic (PLEG): container finished" podID="8dc40677-14fe-4f80-b36d-2e5e9af343de" containerID="087f59f098cbd5d14e35d4201825ad69958f1d03420ff7557de4dc086a0c3067" exitCode=0 Oct 08 20:08:21 crc kubenswrapper[4750]: I1008 20:08:21.516351 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rp56q" event={"ID":"8dc40677-14fe-4f80-b36d-2e5e9af343de","Type":"ContainerDied","Data":"087f59f098cbd5d14e35d4201825ad69958f1d03420ff7557de4dc086a0c3067"} Oct 08 20:08:22 crc kubenswrapper[4750]: I1008 20:08:22.531243 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rp56q" event={"ID":"8dc40677-14fe-4f80-b36d-2e5e9af343de","Type":"ContainerStarted","Data":"9629a9b62a00d5736c8bfaa33b5e70aacb5b5a2c49cb443b92e8f13c808672e2"} Oct 08 20:08:22 crc kubenswrapper[4750]: I1008 20:08:22.557036 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rp56q" 
podStartSLOduration=3.024340429 podStartE2EDuration="6.557018996s" podCreationTimestamp="2025-10-08 20:08:16 +0000 UTC" firstStartedPulling="2025-10-08 20:08:18.452375285 +0000 UTC m=+7054.365346288" lastFinishedPulling="2025-10-08 20:08:21.985053842 +0000 UTC m=+7057.898024855" observedRunningTime="2025-10-08 20:08:22.555041127 +0000 UTC m=+7058.468012170" watchObservedRunningTime="2025-10-08 20:08:22.557018996 +0000 UTC m=+7058.469990009" Oct 08 20:08:26 crc kubenswrapper[4750]: I1008 20:08:26.959200 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rp56q" Oct 08 20:08:26 crc kubenswrapper[4750]: I1008 20:08:26.960835 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rp56q" Oct 08 20:08:27 crc kubenswrapper[4750]: I1008 20:08:27.018912 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rp56q" Oct 08 20:08:27 crc kubenswrapper[4750]: I1008 20:08:27.665240 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rp56q" Oct 08 20:08:27 crc kubenswrapper[4750]: I1008 20:08:27.730240 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rp56q"] Oct 08 20:08:29 crc kubenswrapper[4750]: I1008 20:08:29.629006 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rp56q" podUID="8dc40677-14fe-4f80-b36d-2e5e9af343de" containerName="registry-server" containerID="cri-o://9629a9b62a00d5736c8bfaa33b5e70aacb5b5a2c49cb443b92e8f13c808672e2" gracePeriod=2 Oct 08 20:08:29 crc kubenswrapper[4750]: I1008 20:08:29.706907 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 20:08:29 crc kubenswrapper[4750]: I1008 20:08:29.707440 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 20:08:29 crc kubenswrapper[4750]: I1008 20:08:29.707514 4750 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-grddb" Oct 08 20:08:29 crc kubenswrapper[4750]: I1008 20:08:29.709081 4750 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cbbbf87275f7e0f3294bf94bffab9549261eeb2e7d0d719aab46e1e79b8aa5a9"} pod="openshift-machine-config-operator/machine-config-daemon-grddb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 20:08:29 crc kubenswrapper[4750]: I1008 20:08:29.709204 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" containerID="cri-o://cbbbf87275f7e0f3294bf94bffab9549261eeb2e7d0d719aab46e1e79b8aa5a9" gracePeriod=600 Oct 08 20:08:30 crc kubenswrapper[4750]: I1008 20:08:30.210323 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rp56q" Oct 08 20:08:30 crc kubenswrapper[4750]: I1008 20:08:30.254574 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8dc40677-14fe-4f80-b36d-2e5e9af343de-utilities\") pod \"8dc40677-14fe-4f80-b36d-2e5e9af343de\" (UID: \"8dc40677-14fe-4f80-b36d-2e5e9af343de\") " Oct 08 20:08:30 crc kubenswrapper[4750]: I1008 20:08:30.254761 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2l5tn\" (UniqueName: \"kubernetes.io/projected/8dc40677-14fe-4f80-b36d-2e5e9af343de-kube-api-access-2l5tn\") pod \"8dc40677-14fe-4f80-b36d-2e5e9af343de\" (UID: \"8dc40677-14fe-4f80-b36d-2e5e9af343de\") " Oct 08 20:08:30 crc kubenswrapper[4750]: I1008 20:08:30.254826 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8dc40677-14fe-4f80-b36d-2e5e9af343de-catalog-content\") pod \"8dc40677-14fe-4f80-b36d-2e5e9af343de\" (UID: \"8dc40677-14fe-4f80-b36d-2e5e9af343de\") " Oct 08 20:08:30 crc kubenswrapper[4750]: I1008 20:08:30.257157 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8dc40677-14fe-4f80-b36d-2e5e9af343de-utilities" (OuterVolumeSpecName: "utilities") pod "8dc40677-14fe-4f80-b36d-2e5e9af343de" (UID: "8dc40677-14fe-4f80-b36d-2e5e9af343de"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:08:30 crc kubenswrapper[4750]: I1008 20:08:30.265974 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dc40677-14fe-4f80-b36d-2e5e9af343de-kube-api-access-2l5tn" (OuterVolumeSpecName: "kube-api-access-2l5tn") pod "8dc40677-14fe-4f80-b36d-2e5e9af343de" (UID: "8dc40677-14fe-4f80-b36d-2e5e9af343de"). InnerVolumeSpecName "kube-api-access-2l5tn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:08:30 crc kubenswrapper[4750]: I1008 20:08:30.338879 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8dc40677-14fe-4f80-b36d-2e5e9af343de-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8dc40677-14fe-4f80-b36d-2e5e9af343de" (UID: "8dc40677-14fe-4f80-b36d-2e5e9af343de"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:08:30 crc kubenswrapper[4750]: I1008 20:08:30.360193 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8dc40677-14fe-4f80-b36d-2e5e9af343de-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 20:08:30 crc kubenswrapper[4750]: I1008 20:08:30.360242 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2l5tn\" (UniqueName: \"kubernetes.io/projected/8dc40677-14fe-4f80-b36d-2e5e9af343de-kube-api-access-2l5tn\") on node \"crc\" DevicePath \"\"" Oct 08 20:08:30 crc kubenswrapper[4750]: I1008 20:08:30.360259 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8dc40677-14fe-4f80-b36d-2e5e9af343de-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 20:08:30 crc kubenswrapper[4750]: I1008 20:08:30.642635 4750 generic.go:334] "Generic (PLEG): container finished" podID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerID="cbbbf87275f7e0f3294bf94bffab9549261eeb2e7d0d719aab46e1e79b8aa5a9" exitCode=0 Oct 08 20:08:30 crc kubenswrapper[4750]: I1008 20:08:30.642716 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerDied","Data":"cbbbf87275f7e0f3294bf94bffab9549261eeb2e7d0d719aab46e1e79b8aa5a9"} Oct 08 20:08:30 crc kubenswrapper[4750]: I1008 20:08:30.642808 4750 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerStarted","Data":"97b77d20e48ddad1309a20d7312b352f1c79d1e7dda8a71e4178d6c004ba3141"} Oct 08 20:08:30 crc kubenswrapper[4750]: I1008 20:08:30.642895 4750 scope.go:117] "RemoveContainer" containerID="166b04b6c456f116d6b504a63f511a318627a6cec196a1300ad0582a72b88cf9" Oct 08 20:08:30 crc kubenswrapper[4750]: I1008 20:08:30.645775 4750 generic.go:334] "Generic (PLEG): container finished" podID="8dc40677-14fe-4f80-b36d-2e5e9af343de" containerID="9629a9b62a00d5736c8bfaa33b5e70aacb5b5a2c49cb443b92e8f13c808672e2" exitCode=0 Oct 08 20:08:30 crc kubenswrapper[4750]: I1008 20:08:30.645829 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rp56q" event={"ID":"8dc40677-14fe-4f80-b36d-2e5e9af343de","Type":"ContainerDied","Data":"9629a9b62a00d5736c8bfaa33b5e70aacb5b5a2c49cb443b92e8f13c808672e2"} Oct 08 20:08:30 crc kubenswrapper[4750]: I1008 20:08:30.645864 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rp56q" event={"ID":"8dc40677-14fe-4f80-b36d-2e5e9af343de","Type":"ContainerDied","Data":"4e4f9f6fa133b8a33e8366e8db19917c129859b5a6cd5e00e9351e1acf026bf7"} Oct 08 20:08:30 crc kubenswrapper[4750]: I1008 20:08:30.645945 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rp56q" Oct 08 20:08:30 crc kubenswrapper[4750]: I1008 20:08:30.686753 4750 scope.go:117] "RemoveContainer" containerID="9629a9b62a00d5736c8bfaa33b5e70aacb5b5a2c49cb443b92e8f13c808672e2" Oct 08 20:08:30 crc kubenswrapper[4750]: I1008 20:08:30.701453 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rp56q"] Oct 08 20:08:30 crc kubenswrapper[4750]: I1008 20:08:30.714532 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rp56q"] Oct 08 20:08:30 crc kubenswrapper[4750]: I1008 20:08:30.717691 4750 scope.go:117] "RemoveContainer" containerID="087f59f098cbd5d14e35d4201825ad69958f1d03420ff7557de4dc086a0c3067" Oct 08 20:08:30 crc kubenswrapper[4750]: I1008 20:08:30.745820 4750 scope.go:117] "RemoveContainer" containerID="01ca43a09aa76ad3ac06af4e308150bd7614bd4c283e07f3f2778bce7ef5aa46" Oct 08 20:08:30 crc kubenswrapper[4750]: I1008 20:08:30.769123 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dc40677-14fe-4f80-b36d-2e5e9af343de" path="/var/lib/kubelet/pods/8dc40677-14fe-4f80-b36d-2e5e9af343de/volumes" Oct 08 20:08:30 crc kubenswrapper[4750]: I1008 20:08:30.811404 4750 scope.go:117] "RemoveContainer" containerID="9629a9b62a00d5736c8bfaa33b5e70aacb5b5a2c49cb443b92e8f13c808672e2" Oct 08 20:08:30 crc kubenswrapper[4750]: E1008 20:08:30.812118 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9629a9b62a00d5736c8bfaa33b5e70aacb5b5a2c49cb443b92e8f13c808672e2\": container with ID starting with 9629a9b62a00d5736c8bfaa33b5e70aacb5b5a2c49cb443b92e8f13c808672e2 not found: ID does not exist" containerID="9629a9b62a00d5736c8bfaa33b5e70aacb5b5a2c49cb443b92e8f13c808672e2" Oct 08 20:08:30 crc kubenswrapper[4750]: I1008 20:08:30.812237 4750 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9629a9b62a00d5736c8bfaa33b5e70aacb5b5a2c49cb443b92e8f13c808672e2"} err="failed to get container status \"9629a9b62a00d5736c8bfaa33b5e70aacb5b5a2c49cb443b92e8f13c808672e2\": rpc error: code = NotFound desc = could not find container \"9629a9b62a00d5736c8bfaa33b5e70aacb5b5a2c49cb443b92e8f13c808672e2\": container with ID starting with 9629a9b62a00d5736c8bfaa33b5e70aacb5b5a2c49cb443b92e8f13c808672e2 not found: ID does not exist" Oct 08 20:08:30 crc kubenswrapper[4750]: I1008 20:08:30.812330 4750 scope.go:117] "RemoveContainer" containerID="087f59f098cbd5d14e35d4201825ad69958f1d03420ff7557de4dc086a0c3067" Oct 08 20:08:30 crc kubenswrapper[4750]: E1008 20:08:30.812651 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"087f59f098cbd5d14e35d4201825ad69958f1d03420ff7557de4dc086a0c3067\": container with ID starting with 087f59f098cbd5d14e35d4201825ad69958f1d03420ff7557de4dc086a0c3067 not found: ID does not exist" containerID="087f59f098cbd5d14e35d4201825ad69958f1d03420ff7557de4dc086a0c3067" Oct 08 20:08:30 crc kubenswrapper[4750]: I1008 20:08:30.812748 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"087f59f098cbd5d14e35d4201825ad69958f1d03420ff7557de4dc086a0c3067"} err="failed to get container status \"087f59f098cbd5d14e35d4201825ad69958f1d03420ff7557de4dc086a0c3067\": rpc error: code = NotFound desc = could not find container \"087f59f098cbd5d14e35d4201825ad69958f1d03420ff7557de4dc086a0c3067\": container with ID starting with 087f59f098cbd5d14e35d4201825ad69958f1d03420ff7557de4dc086a0c3067 not found: ID does not exist" Oct 08 20:08:30 crc kubenswrapper[4750]: I1008 20:08:30.812816 4750 scope.go:117] "RemoveContainer" containerID="01ca43a09aa76ad3ac06af4e308150bd7614bd4c283e07f3f2778bce7ef5aa46" Oct 08 20:08:30 crc kubenswrapper[4750]: E1008 20:08:30.813229 4750 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"01ca43a09aa76ad3ac06af4e308150bd7614bd4c283e07f3f2778bce7ef5aa46\": container with ID starting with 01ca43a09aa76ad3ac06af4e308150bd7614bd4c283e07f3f2778bce7ef5aa46 not found: ID does not exist" containerID="01ca43a09aa76ad3ac06af4e308150bd7614bd4c283e07f3f2778bce7ef5aa46" Oct 08 20:08:30 crc kubenswrapper[4750]: I1008 20:08:30.813290 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01ca43a09aa76ad3ac06af4e308150bd7614bd4c283e07f3f2778bce7ef5aa46"} err="failed to get container status \"01ca43a09aa76ad3ac06af4e308150bd7614bd4c283e07f3f2778bce7ef5aa46\": rpc error: code = NotFound desc = could not find container \"01ca43a09aa76ad3ac06af4e308150bd7614bd4c283e07f3f2778bce7ef5aa46\": container with ID starting with 01ca43a09aa76ad3ac06af4e308150bd7614bd4c283e07f3f2778bce7ef5aa46 not found: ID does not exist" Oct 08 20:08:32 crc kubenswrapper[4750]: I1008 20:08:32.789179 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69qhhgw_3911a0af-c9a8-477b-b9d1-77fac1ed4441/util/0.log" Oct 08 20:08:33 crc kubenswrapper[4750]: I1008 20:08:33.021417 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69qhhgw_3911a0af-c9a8-477b-b9d1-77fac1ed4441/util/0.log" Oct 08 20:08:33 crc kubenswrapper[4750]: I1008 20:08:33.026955 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69qhhgw_3911a0af-c9a8-477b-b9d1-77fac1ed4441/pull/0.log" Oct 08 20:08:33 crc kubenswrapper[4750]: I1008 20:08:33.112221 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69qhhgw_3911a0af-c9a8-477b-b9d1-77fac1ed4441/pull/0.log" Oct 08 
20:08:33 crc kubenswrapper[4750]: I1008 20:08:33.224283 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69qhhgw_3911a0af-c9a8-477b-b9d1-77fac1ed4441/util/0.log" Oct 08 20:08:33 crc kubenswrapper[4750]: I1008 20:08:33.274579 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69qhhgw_3911a0af-c9a8-477b-b9d1-77fac1ed4441/pull/0.log" Oct 08 20:08:33 crc kubenswrapper[4750]: I1008 20:08:33.295721 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69qhhgw_3911a0af-c9a8-477b-b9d1-77fac1ed4441/extract/0.log" Oct 08 20:08:33 crc kubenswrapper[4750]: I1008 20:08:33.447453 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zsm94_99f391ca-4dc5-402b-ab7c-916433ee0b9e/util/0.log" Oct 08 20:08:33 crc kubenswrapper[4750]: I1008 20:08:33.622694 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zsm94_99f391ca-4dc5-402b-ab7c-916433ee0b9e/util/0.log" Oct 08 20:08:33 crc kubenswrapper[4750]: I1008 20:08:33.659413 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zsm94_99f391ca-4dc5-402b-ab7c-916433ee0b9e/pull/0.log" Oct 08 20:08:33 crc kubenswrapper[4750]: I1008 20:08:33.659482 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zsm94_99f391ca-4dc5-402b-ab7c-916433ee0b9e/pull/0.log" Oct 08 20:08:33 crc kubenswrapper[4750]: I1008 20:08:33.836843 4750 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zsm94_99f391ca-4dc5-402b-ab7c-916433ee0b9e/util/0.log" Oct 08 20:08:33 crc kubenswrapper[4750]: I1008 20:08:33.840510 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zsm94_99f391ca-4dc5-402b-ab7c-916433ee0b9e/pull/0.log" Oct 08 20:08:33 crc kubenswrapper[4750]: I1008 20:08:33.885534 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zsm94_99f391ca-4dc5-402b-ab7c-916433ee0b9e/extract/0.log" Oct 08 20:08:34 crc kubenswrapper[4750]: I1008 20:08:34.040383 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drhk4d_0ac80049-fa73-452c-975e-35d0b0a121de/util/0.log" Oct 08 20:08:34 crc kubenswrapper[4750]: I1008 20:08:34.248734 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drhk4d_0ac80049-fa73-452c-975e-35d0b0a121de/pull/0.log" Oct 08 20:08:34 crc kubenswrapper[4750]: I1008 20:08:34.264101 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drhk4d_0ac80049-fa73-452c-975e-35d0b0a121de/pull/0.log" Oct 08 20:08:34 crc kubenswrapper[4750]: I1008 20:08:34.285578 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drhk4d_0ac80049-fa73-452c-975e-35d0b0a121de/util/0.log" Oct 08 20:08:34 crc kubenswrapper[4750]: I1008 20:08:34.455797 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drhk4d_0ac80049-fa73-452c-975e-35d0b0a121de/util/0.log" Oct 08 
20:08:34 crc kubenswrapper[4750]: I1008 20:08:34.477263 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drhk4d_0ac80049-fa73-452c-975e-35d0b0a121de/extract/0.log" Oct 08 20:08:34 crc kubenswrapper[4750]: I1008 20:08:34.515006 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drhk4d_0ac80049-fa73-452c-975e-35d0b0a121de/pull/0.log" Oct 08 20:08:34 crc kubenswrapper[4750]: I1008 20:08:34.651074 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lnrjm_56032d2a-4cde-430a-a40e-ab1eed32b651/extract-utilities/0.log" Oct 08 20:08:34 crc kubenswrapper[4750]: I1008 20:08:34.964650 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lnrjm_56032d2a-4cde-430a-a40e-ab1eed32b651/extract-content/0.log" Oct 08 20:08:34 crc kubenswrapper[4750]: I1008 20:08:34.967891 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lnrjm_56032d2a-4cde-430a-a40e-ab1eed32b651/extract-content/0.log" Oct 08 20:08:34 crc kubenswrapper[4750]: I1008 20:08:34.974848 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lnrjm_56032d2a-4cde-430a-a40e-ab1eed32b651/extract-utilities/0.log" Oct 08 20:08:35 crc kubenswrapper[4750]: I1008 20:08:35.196889 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lnrjm_56032d2a-4cde-430a-a40e-ab1eed32b651/extract-utilities/0.log" Oct 08 20:08:35 crc kubenswrapper[4750]: I1008 20:08:35.282112 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lnrjm_56032d2a-4cde-430a-a40e-ab1eed32b651/extract-content/0.log" Oct 08 20:08:35 crc kubenswrapper[4750]: I1008 20:08:35.517909 
4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vn7zk_0d789a75-07c0-4e1d-889b-8d2221b1ff95/extract-utilities/0.log" Oct 08 20:08:35 crc kubenswrapper[4750]: I1008 20:08:35.783452 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vn7zk_0d789a75-07c0-4e1d-889b-8d2221b1ff95/extract-utilities/0.log" Oct 08 20:08:35 crc kubenswrapper[4750]: I1008 20:08:35.788987 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vn7zk_0d789a75-07c0-4e1d-889b-8d2221b1ff95/extract-content/0.log" Oct 08 20:08:35 crc kubenswrapper[4750]: I1008 20:08:35.812900 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vn7zk_0d789a75-07c0-4e1d-889b-8d2221b1ff95/extract-content/0.log" Oct 08 20:08:36 crc kubenswrapper[4750]: I1008 20:08:36.023744 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vn7zk_0d789a75-07c0-4e1d-889b-8d2221b1ff95/extract-utilities/0.log" Oct 08 20:08:36 crc kubenswrapper[4750]: I1008 20:08:36.110829 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vn7zk_0d789a75-07c0-4e1d-889b-8d2221b1ff95/extract-content/0.log" Oct 08 20:08:36 crc kubenswrapper[4750]: I1008 20:08:36.364152 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835chghm9_0073f23d-f9bf-4c55-b2d1-8106cde0bf97/util/0.log" Oct 08 20:08:36 crc kubenswrapper[4750]: I1008 20:08:36.438492 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lnrjm_56032d2a-4cde-430a-a40e-ab1eed32b651/registry-server/0.log" Oct 08 20:08:36 crc kubenswrapper[4750]: I1008 20:08:36.661035 4750 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835chghm9_0073f23d-f9bf-4c55-b2d1-8106cde0bf97/pull/0.log" Oct 08 20:08:36 crc kubenswrapper[4750]: I1008 20:08:36.664740 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835chghm9_0073f23d-f9bf-4c55-b2d1-8106cde0bf97/util/0.log" Oct 08 20:08:36 crc kubenswrapper[4750]: I1008 20:08:36.685398 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835chghm9_0073f23d-f9bf-4c55-b2d1-8106cde0bf97/pull/0.log" Oct 08 20:08:36 crc kubenswrapper[4750]: I1008 20:08:36.924399 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835chghm9_0073f23d-f9bf-4c55-b2d1-8106cde0bf97/pull/0.log" Oct 08 20:08:36 crc kubenswrapper[4750]: I1008 20:08:36.995132 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835chghm9_0073f23d-f9bf-4c55-b2d1-8106cde0bf97/extract/0.log" Oct 08 20:08:37 crc kubenswrapper[4750]: I1008 20:08:37.002904 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835chghm9_0073f23d-f9bf-4c55-b2d1-8106cde0bf97/util/0.log" Oct 08 20:08:37 crc kubenswrapper[4750]: I1008 20:08:37.224168 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9trpl_e3a62523-66bd-4167-890d-2cddc40f8695/extract-utilities/0.log" Oct 08 20:08:37 crc kubenswrapper[4750]: I1008 20:08:37.225070 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-xspt4_e5cb56e3-bb34-427d-b4c1-7dec95f40023/marketplace-operator/0.log" Oct 08 20:08:37 crc kubenswrapper[4750]: 
I1008 20:08:37.398343 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vn7zk_0d789a75-07c0-4e1d-889b-8d2221b1ff95/registry-server/0.log" Oct 08 20:08:37 crc kubenswrapper[4750]: I1008 20:08:37.485635 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9trpl_e3a62523-66bd-4167-890d-2cddc40f8695/extract-utilities/0.log" Oct 08 20:08:37 crc kubenswrapper[4750]: I1008 20:08:37.511470 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9trpl_e3a62523-66bd-4167-890d-2cddc40f8695/extract-content/0.log" Oct 08 20:08:37 crc kubenswrapper[4750]: I1008 20:08:37.540316 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9trpl_e3a62523-66bd-4167-890d-2cddc40f8695/extract-content/0.log" Oct 08 20:08:37 crc kubenswrapper[4750]: I1008 20:08:37.707304 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9trpl_e3a62523-66bd-4167-890d-2cddc40f8695/extract-utilities/0.log" Oct 08 20:08:37 crc kubenswrapper[4750]: I1008 20:08:37.707304 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9trpl_e3a62523-66bd-4167-890d-2cddc40f8695/extract-content/0.log" Oct 08 20:08:37 crc kubenswrapper[4750]: I1008 20:08:37.765501 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v4wpt_19c82dda-1fe3-489d-a6bb-28bd646fc3ad/extract-utilities/0.log" Oct 08 20:08:37 crc kubenswrapper[4750]: I1008 20:08:37.964241 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v4wpt_19c82dda-1fe3-489d-a6bb-28bd646fc3ad/extract-utilities/0.log" Oct 08 20:08:38 crc kubenswrapper[4750]: I1008 20:08:38.016412 4750 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-v4wpt_19c82dda-1fe3-489d-a6bb-28bd646fc3ad/extract-content/0.log" Oct 08 20:08:38 crc kubenswrapper[4750]: I1008 20:08:38.023645 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v4wpt_19c82dda-1fe3-489d-a6bb-28bd646fc3ad/extract-content/0.log" Oct 08 20:08:38 crc kubenswrapper[4750]: I1008 20:08:38.042146 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9trpl_e3a62523-66bd-4167-890d-2cddc40f8695/registry-server/0.log" Oct 08 20:08:38 crc kubenswrapper[4750]: I1008 20:08:38.250902 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v4wpt_19c82dda-1fe3-489d-a6bb-28bd646fc3ad/extract-content/0.log" Oct 08 20:08:38 crc kubenswrapper[4750]: I1008 20:08:38.282060 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v4wpt_19c82dda-1fe3-489d-a6bb-28bd646fc3ad/extract-utilities/0.log" Oct 08 20:08:39 crc kubenswrapper[4750]: I1008 20:08:39.113765 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v4wpt_19c82dda-1fe3-489d-a6bb-28bd646fc3ad/registry-server/0.log" Oct 08 20:08:52 crc kubenswrapper[4750]: I1008 20:08:52.984150 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-5xxjb_32704cbb-ea75-4dc3-96b1-718b994fe335/prometheus-operator/0.log" Oct 08 20:08:53 crc kubenswrapper[4750]: I1008 20:08:53.048036 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-nq2b9"] Oct 08 20:08:53 crc kubenswrapper[4750]: I1008 20:08:53.059523 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-nq2b9"] Oct 08 20:08:53 crc kubenswrapper[4750]: I1008 20:08:53.198400 4750 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-f4865c8dd-vnthj_38b5c4ed-86e9-4be1-8df3-be9cd23ffa4f/prometheus-operator-admission-webhook/0.log" Oct 08 20:08:53 crc kubenswrapper[4750]: I1008 20:08:53.234010 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-f4865c8dd-hzw7x_dc3a1db5-d8d5-4138-966c-93257a1f27f7/prometheus-operator-admission-webhook/0.log" Oct 08 20:08:53 crc kubenswrapper[4750]: I1008 20:08:53.414950 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-h6jpz_34a0461d-6888-438f-bca8-95ffbb37e14e/perses-operator/0.log" Oct 08 20:08:53 crc kubenswrapper[4750]: I1008 20:08:53.443801 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-6sxl4_55701760-1c94-48e7-a3b8-42571d13ac31/operator/0.log" Oct 08 20:08:54 crc kubenswrapper[4750]: I1008 20:08:54.750793 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7ff6ccf-13de-41ac-af06-668eb1165729" path="/var/lib/kubelet/pods/b7ff6ccf-13de-41ac-af06-668eb1165729/volumes" Oct 08 20:09:03 crc kubenswrapper[4750]: I1008 20:09:03.075262 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-158b-account-create-5l24x"] Oct 08 20:09:03 crc kubenswrapper[4750]: I1008 20:09:03.094412 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-158b-account-create-5l24x"] Oct 08 20:09:04 crc kubenswrapper[4750]: I1008 20:09:04.761595 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0ad36c3-6bb5-4024-b62c-cd806b8172d7" path="/var/lib/kubelet/pods/b0ad36c3-6bb5-4024-b62c-cd806b8172d7/volumes" Oct 08 20:09:07 crc kubenswrapper[4750]: E1008 20:09:07.741803 4750 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.75:39000->38.102.83.75:34167: read tcp 
38.102.83.75:39000->38.102.83.75:34167: read: connection reset by peer Oct 08 20:09:13 crc kubenswrapper[4750]: I1008 20:09:13.680660 4750 scope.go:117] "RemoveContainer" containerID="b81afe9219ebe3a68379d648c7b046be7468f4c8e9b6852c620e3700c2a26e49" Oct 08 20:09:13 crc kubenswrapper[4750]: I1008 20:09:13.710188 4750 scope.go:117] "RemoveContainer" containerID="d6eecc244c9abd051f75fe36a9f49fb502f126ab11552c2fe7b0f1806646ddae" Oct 08 20:09:15 crc kubenswrapper[4750]: I1008 20:09:15.032364 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-cmw5k"] Oct 08 20:09:15 crc kubenswrapper[4750]: I1008 20:09:15.047947 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-cmw5k"] Oct 08 20:09:16 crc kubenswrapper[4750]: I1008 20:09:16.754756 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d47a90c8-a408-4b44-a6f5-e65897cce31b" path="/var/lib/kubelet/pods/d47a90c8-a408-4b44-a6f5-e65897cce31b/volumes" Oct 08 20:09:38 crc kubenswrapper[4750]: I1008 20:09:38.053816 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-7chht"] Oct 08 20:09:38 crc kubenswrapper[4750]: I1008 20:09:38.073518 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-7chht"] Oct 08 20:09:38 crc kubenswrapper[4750]: I1008 20:09:38.760022 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5173b754-515f-46f1-82bb-9376da88bb9b" path="/var/lib/kubelet/pods/5173b754-515f-46f1-82bb-9376da88bb9b/volumes" Oct 08 20:09:48 crc kubenswrapper[4750]: I1008 20:09:48.040664 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-8e2a-account-create-lrccx"] Oct 08 20:09:48 crc kubenswrapper[4750]: I1008 20:09:48.053933 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-8e2a-account-create-lrccx"] Oct 08 20:09:48 crc kubenswrapper[4750]: I1008 20:09:48.751534 4750 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="bdc6fd03-081e-4966-9c71-c6e937ad4fca" path="/var/lib/kubelet/pods/bdc6fd03-081e-4966-9c71-c6e937ad4fca/volumes" Oct 08 20:10:00 crc kubenswrapper[4750]: I1008 20:10:00.030352 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-rslm8"] Oct 08 20:10:00 crc kubenswrapper[4750]: I1008 20:10:00.038118 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-rslm8"] Oct 08 20:10:00 crc kubenswrapper[4750]: I1008 20:10:00.750354 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba387095-0fec-48ef-8bc2-05e8a5368d0b" path="/var/lib/kubelet/pods/ba387095-0fec-48ef-8bc2-05e8a5368d0b/volumes" Oct 08 20:10:13 crc kubenswrapper[4750]: I1008 20:10:13.879610 4750 scope.go:117] "RemoveContainer" containerID="3ce0b45918529d74153ef2a496bf4808f6ac6063700592aa4a2398c3c8f08a0e" Oct 08 20:10:13 crc kubenswrapper[4750]: I1008 20:10:13.935663 4750 scope.go:117] "RemoveContainer" containerID="0c1dc69f437d5597a09943d41a48cb08b27e352bcae2b7a5be9b32c77e61092b" Oct 08 20:10:13 crc kubenswrapper[4750]: I1008 20:10:13.962909 4750 scope.go:117] "RemoveContainer" containerID="173478bda2acf0ec781d4ccb33aacf7e141a6cad10975743093c0bf1188e6298" Oct 08 20:10:14 crc kubenswrapper[4750]: I1008 20:10:14.033504 4750 scope.go:117] "RemoveContainer" containerID="c28ba5ec12915cff3189165cc711584c0f28c1c7de3a0b38af7f95c2442926d0" Oct 08 20:10:44 crc kubenswrapper[4750]: I1008 20:10:44.127779 4750 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2sqjd"] Oct 08 20:10:44 crc kubenswrapper[4750]: E1008 20:10:44.131171 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dc40677-14fe-4f80-b36d-2e5e9af343de" containerName="extract-content" Oct 08 20:10:44 crc kubenswrapper[4750]: I1008 20:10:44.131313 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dc40677-14fe-4f80-b36d-2e5e9af343de" containerName="extract-content" Oct 08 
20:10:44 crc kubenswrapper[4750]: E1008 20:10:44.131485 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dc40677-14fe-4f80-b36d-2e5e9af343de" containerName="extract-utilities" Oct 08 20:10:44 crc kubenswrapper[4750]: I1008 20:10:44.131608 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dc40677-14fe-4f80-b36d-2e5e9af343de" containerName="extract-utilities" Oct 08 20:10:44 crc kubenswrapper[4750]: E1008 20:10:44.131737 4750 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dc40677-14fe-4f80-b36d-2e5e9af343de" containerName="registry-server" Oct 08 20:10:44 crc kubenswrapper[4750]: I1008 20:10:44.131850 4750 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dc40677-14fe-4f80-b36d-2e5e9af343de" containerName="registry-server" Oct 08 20:10:44 crc kubenswrapper[4750]: I1008 20:10:44.132345 4750 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dc40677-14fe-4f80-b36d-2e5e9af343de" containerName="registry-server" Oct 08 20:10:44 crc kubenswrapper[4750]: I1008 20:10:44.135370 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2sqjd" Oct 08 20:10:44 crc kubenswrapper[4750]: I1008 20:10:44.184067 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2sqjd"] Oct 08 20:10:44 crc kubenswrapper[4750]: I1008 20:10:44.242887 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/790e921c-bf6e-4efe-bddf-d63dfc62c676-utilities\") pod \"redhat-operators-2sqjd\" (UID: \"790e921c-bf6e-4efe-bddf-d63dfc62c676\") " pod="openshift-marketplace/redhat-operators-2sqjd" Oct 08 20:10:44 crc kubenswrapper[4750]: I1008 20:10:44.243399 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/790e921c-bf6e-4efe-bddf-d63dfc62c676-catalog-content\") pod \"redhat-operators-2sqjd\" (UID: \"790e921c-bf6e-4efe-bddf-d63dfc62c676\") " pod="openshift-marketplace/redhat-operators-2sqjd" Oct 08 20:10:44 crc kubenswrapper[4750]: I1008 20:10:44.243835 4750 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cf5m2\" (UniqueName: \"kubernetes.io/projected/790e921c-bf6e-4efe-bddf-d63dfc62c676-kube-api-access-cf5m2\") pod \"redhat-operators-2sqjd\" (UID: \"790e921c-bf6e-4efe-bddf-d63dfc62c676\") " pod="openshift-marketplace/redhat-operators-2sqjd" Oct 08 20:10:44 crc kubenswrapper[4750]: I1008 20:10:44.348232 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/790e921c-bf6e-4efe-bddf-d63dfc62c676-catalog-content\") pod \"redhat-operators-2sqjd\" (UID: \"790e921c-bf6e-4efe-bddf-d63dfc62c676\") " pod="openshift-marketplace/redhat-operators-2sqjd" Oct 08 20:10:44 crc kubenswrapper[4750]: I1008 20:10:44.348734 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/790e921c-bf6e-4efe-bddf-d63dfc62c676-catalog-content\") pod \"redhat-operators-2sqjd\" (UID: \"790e921c-bf6e-4efe-bddf-d63dfc62c676\") " pod="openshift-marketplace/redhat-operators-2sqjd" Oct 08 20:10:44 crc kubenswrapper[4750]: I1008 20:10:44.348727 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cf5m2\" (UniqueName: \"kubernetes.io/projected/790e921c-bf6e-4efe-bddf-d63dfc62c676-kube-api-access-cf5m2\") pod \"redhat-operators-2sqjd\" (UID: \"790e921c-bf6e-4efe-bddf-d63dfc62c676\") " pod="openshift-marketplace/redhat-operators-2sqjd" Oct 08 20:10:44 crc kubenswrapper[4750]: I1008 20:10:44.349606 4750 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/790e921c-bf6e-4efe-bddf-d63dfc62c676-utilities\") pod \"redhat-operators-2sqjd\" (UID: \"790e921c-bf6e-4efe-bddf-d63dfc62c676\") " pod="openshift-marketplace/redhat-operators-2sqjd" Oct 08 20:10:44 crc kubenswrapper[4750]: I1008 20:10:44.349907 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/790e921c-bf6e-4efe-bddf-d63dfc62c676-utilities\") pod \"redhat-operators-2sqjd\" (UID: \"790e921c-bf6e-4efe-bddf-d63dfc62c676\") " pod="openshift-marketplace/redhat-operators-2sqjd" Oct 08 20:10:44 crc kubenswrapper[4750]: I1008 20:10:44.382665 4750 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cf5m2\" (UniqueName: \"kubernetes.io/projected/790e921c-bf6e-4efe-bddf-d63dfc62c676-kube-api-access-cf5m2\") pod \"redhat-operators-2sqjd\" (UID: \"790e921c-bf6e-4efe-bddf-d63dfc62c676\") " pod="openshift-marketplace/redhat-operators-2sqjd" Oct 08 20:10:44 crc kubenswrapper[4750]: I1008 20:10:44.464754 4750 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2sqjd" Oct 08 20:10:45 crc kubenswrapper[4750]: I1008 20:10:45.059191 4750 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2sqjd"] Oct 08 20:10:45 crc kubenswrapper[4750]: I1008 20:10:45.492854 4750 generic.go:334] "Generic (PLEG): container finished" podID="790e921c-bf6e-4efe-bddf-d63dfc62c676" containerID="532c0f35f700aec1adc21b709830218904a3543bce909036fa13f3e3fae88038" exitCode=0 Oct 08 20:10:45 crc kubenswrapper[4750]: I1008 20:10:45.492973 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2sqjd" event={"ID":"790e921c-bf6e-4efe-bddf-d63dfc62c676","Type":"ContainerDied","Data":"532c0f35f700aec1adc21b709830218904a3543bce909036fa13f3e3fae88038"} Oct 08 20:10:45 crc kubenswrapper[4750]: I1008 20:10:45.498834 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2sqjd" event={"ID":"790e921c-bf6e-4efe-bddf-d63dfc62c676","Type":"ContainerStarted","Data":"9200dacb28e57528f501e826b360fd4aaa920e5265aeec5b37500144da4dc288"} Oct 08 20:10:47 crc kubenswrapper[4750]: I1008 20:10:47.528407 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2sqjd" event={"ID":"790e921c-bf6e-4efe-bddf-d63dfc62c676","Type":"ContainerStarted","Data":"15135ae85b8c71485bdef8bfae17606438089f3409cf92b98754ba0efc8337ed"} Oct 08 20:10:49 crc kubenswrapper[4750]: I1008 20:10:49.557193 4750 generic.go:334] "Generic (PLEG): container finished" podID="790e921c-bf6e-4efe-bddf-d63dfc62c676" containerID="15135ae85b8c71485bdef8bfae17606438089f3409cf92b98754ba0efc8337ed" exitCode=0 Oct 08 20:10:49 crc kubenswrapper[4750]: I1008 20:10:49.557390 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2sqjd" 
event={"ID":"790e921c-bf6e-4efe-bddf-d63dfc62c676","Type":"ContainerDied","Data":"15135ae85b8c71485bdef8bfae17606438089f3409cf92b98754ba0efc8337ed"} Oct 08 20:10:50 crc kubenswrapper[4750]: I1008 20:10:50.577221 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2sqjd" event={"ID":"790e921c-bf6e-4efe-bddf-d63dfc62c676","Type":"ContainerStarted","Data":"871553ec5a1dffae07d1f30203c4302dfc65b483bd24801231a2a5f759ca6f2f"} Oct 08 20:10:50 crc kubenswrapper[4750]: I1008 20:10:50.609208 4750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2sqjd" podStartSLOduration=2.051968224 podStartE2EDuration="6.60917978s" podCreationTimestamp="2025-10-08 20:10:44 +0000 UTC" firstStartedPulling="2025-10-08 20:10:45.495856532 +0000 UTC m=+7201.408827555" lastFinishedPulling="2025-10-08 20:10:50.053068098 +0000 UTC m=+7205.966039111" observedRunningTime="2025-10-08 20:10:50.607718093 +0000 UTC m=+7206.520689136" watchObservedRunningTime="2025-10-08 20:10:50.60917978 +0000 UTC m=+7206.522150793" Oct 08 20:10:54 crc kubenswrapper[4750]: I1008 20:10:54.465064 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2sqjd" Oct 08 20:10:54 crc kubenswrapper[4750]: I1008 20:10:54.465875 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2sqjd" Oct 08 20:10:55 crc kubenswrapper[4750]: I1008 20:10:55.549274 4750 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2sqjd" podUID="790e921c-bf6e-4efe-bddf-d63dfc62c676" containerName="registry-server" probeResult="failure" output=< Oct 08 20:10:55 crc kubenswrapper[4750]: timeout: failed to connect service ":50051" within 1s Oct 08 20:10:55 crc kubenswrapper[4750]: > Oct 08 20:10:59 crc kubenswrapper[4750]: I1008 20:10:59.706940 4750 patch_prober.go:28] interesting 
pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 20:10:59 crc kubenswrapper[4750]: I1008 20:10:59.707374 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 20:11:00 crc kubenswrapper[4750]: I1008 20:11:00.690776 4750 generic.go:334] "Generic (PLEG): container finished" podID="63e5a76a-91eb-4254-a425-e63278761e8f" containerID="6856bfc51be89118b3c27c6cdde39f205e687f54c29ac12dc06635a6498f67bf" exitCode=0 Oct 08 20:11:00 crc kubenswrapper[4750]: I1008 20:11:00.690898 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7b8x6/must-gather-2jh45" event={"ID":"63e5a76a-91eb-4254-a425-e63278761e8f","Type":"ContainerDied","Data":"6856bfc51be89118b3c27c6cdde39f205e687f54c29ac12dc06635a6498f67bf"} Oct 08 20:11:00 crc kubenswrapper[4750]: I1008 20:11:00.693762 4750 scope.go:117] "RemoveContainer" containerID="6856bfc51be89118b3c27c6cdde39f205e687f54c29ac12dc06635a6498f67bf" Oct 08 20:11:01 crc kubenswrapper[4750]: I1008 20:11:01.701811 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7b8x6_must-gather-2jh45_63e5a76a-91eb-4254-a425-e63278761e8f/gather/0.log" Oct 08 20:11:04 crc kubenswrapper[4750]: I1008 20:11:04.531192 4750 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2sqjd" Oct 08 20:11:04 crc kubenswrapper[4750]: I1008 20:11:04.602263 4750 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-2sqjd" Oct 08 20:11:04 crc kubenswrapper[4750]: I1008 20:11:04.779305 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2sqjd"] Oct 08 20:11:05 crc kubenswrapper[4750]: I1008 20:11:05.743628 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2sqjd" podUID="790e921c-bf6e-4efe-bddf-d63dfc62c676" containerName="registry-server" containerID="cri-o://871553ec5a1dffae07d1f30203c4302dfc65b483bd24801231a2a5f759ca6f2f" gracePeriod=2 Oct 08 20:11:06 crc kubenswrapper[4750]: I1008 20:11:06.262129 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2sqjd" Oct 08 20:11:06 crc kubenswrapper[4750]: I1008 20:11:06.300147 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cf5m2\" (UniqueName: \"kubernetes.io/projected/790e921c-bf6e-4efe-bddf-d63dfc62c676-kube-api-access-cf5m2\") pod \"790e921c-bf6e-4efe-bddf-d63dfc62c676\" (UID: \"790e921c-bf6e-4efe-bddf-d63dfc62c676\") " Oct 08 20:11:06 crc kubenswrapper[4750]: I1008 20:11:06.300420 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/790e921c-bf6e-4efe-bddf-d63dfc62c676-utilities\") pod \"790e921c-bf6e-4efe-bddf-d63dfc62c676\" (UID: \"790e921c-bf6e-4efe-bddf-d63dfc62c676\") " Oct 08 20:11:06 crc kubenswrapper[4750]: I1008 20:11:06.300466 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/790e921c-bf6e-4efe-bddf-d63dfc62c676-catalog-content\") pod \"790e921c-bf6e-4efe-bddf-d63dfc62c676\" (UID: \"790e921c-bf6e-4efe-bddf-d63dfc62c676\") " Oct 08 20:11:06 crc kubenswrapper[4750]: I1008 20:11:06.301976 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/790e921c-bf6e-4efe-bddf-d63dfc62c676-utilities" (OuterVolumeSpecName: "utilities") pod "790e921c-bf6e-4efe-bddf-d63dfc62c676" (UID: "790e921c-bf6e-4efe-bddf-d63dfc62c676"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:11:06 crc kubenswrapper[4750]: I1008 20:11:06.309095 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/790e921c-bf6e-4efe-bddf-d63dfc62c676-kube-api-access-cf5m2" (OuterVolumeSpecName: "kube-api-access-cf5m2") pod "790e921c-bf6e-4efe-bddf-d63dfc62c676" (UID: "790e921c-bf6e-4efe-bddf-d63dfc62c676"). InnerVolumeSpecName "kube-api-access-cf5m2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:11:06 crc kubenswrapper[4750]: I1008 20:11:06.384372 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/790e921c-bf6e-4efe-bddf-d63dfc62c676-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "790e921c-bf6e-4efe-bddf-d63dfc62c676" (UID: "790e921c-bf6e-4efe-bddf-d63dfc62c676"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:11:06 crc kubenswrapper[4750]: I1008 20:11:06.404200 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cf5m2\" (UniqueName: \"kubernetes.io/projected/790e921c-bf6e-4efe-bddf-d63dfc62c676-kube-api-access-cf5m2\") on node \"crc\" DevicePath \"\"" Oct 08 20:11:06 crc kubenswrapper[4750]: I1008 20:11:06.404231 4750 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/790e921c-bf6e-4efe-bddf-d63dfc62c676-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 20:11:06 crc kubenswrapper[4750]: I1008 20:11:06.404243 4750 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/790e921c-bf6e-4efe-bddf-d63dfc62c676-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 20:11:06 crc kubenswrapper[4750]: I1008 20:11:06.770902 4750 generic.go:334] "Generic (PLEG): container finished" podID="790e921c-bf6e-4efe-bddf-d63dfc62c676" containerID="871553ec5a1dffae07d1f30203c4302dfc65b483bd24801231a2a5f759ca6f2f" exitCode=0 Oct 08 20:11:06 crc kubenswrapper[4750]: I1008 20:11:06.770994 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2sqjd" Oct 08 20:11:06 crc kubenswrapper[4750]: I1008 20:11:06.770975 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2sqjd" event={"ID":"790e921c-bf6e-4efe-bddf-d63dfc62c676","Type":"ContainerDied","Data":"871553ec5a1dffae07d1f30203c4302dfc65b483bd24801231a2a5f759ca6f2f"} Oct 08 20:11:06 crc kubenswrapper[4750]: I1008 20:11:06.771507 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2sqjd" event={"ID":"790e921c-bf6e-4efe-bddf-d63dfc62c676","Type":"ContainerDied","Data":"9200dacb28e57528f501e826b360fd4aaa920e5265aeec5b37500144da4dc288"} Oct 08 20:11:06 crc kubenswrapper[4750]: I1008 20:11:06.771564 4750 scope.go:117] "RemoveContainer" containerID="871553ec5a1dffae07d1f30203c4302dfc65b483bd24801231a2a5f759ca6f2f" Oct 08 20:11:06 crc kubenswrapper[4750]: I1008 20:11:06.814061 4750 scope.go:117] "RemoveContainer" containerID="15135ae85b8c71485bdef8bfae17606438089f3409cf92b98754ba0efc8337ed" Oct 08 20:11:06 crc kubenswrapper[4750]: I1008 20:11:06.823004 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2sqjd"] Oct 08 20:11:06 crc kubenswrapper[4750]: I1008 20:11:06.835540 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2sqjd"] Oct 08 20:11:06 crc kubenswrapper[4750]: I1008 20:11:06.846400 4750 scope.go:117] "RemoveContainer" containerID="532c0f35f700aec1adc21b709830218904a3543bce909036fa13f3e3fae88038" Oct 08 20:11:06 crc kubenswrapper[4750]: I1008 20:11:06.913140 4750 scope.go:117] "RemoveContainer" containerID="871553ec5a1dffae07d1f30203c4302dfc65b483bd24801231a2a5f759ca6f2f" Oct 08 20:11:06 crc kubenswrapper[4750]: E1008 20:11:06.913925 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"871553ec5a1dffae07d1f30203c4302dfc65b483bd24801231a2a5f759ca6f2f\": container with ID starting with 871553ec5a1dffae07d1f30203c4302dfc65b483bd24801231a2a5f759ca6f2f not found: ID does not exist" containerID="871553ec5a1dffae07d1f30203c4302dfc65b483bd24801231a2a5f759ca6f2f" Oct 08 20:11:06 crc kubenswrapper[4750]: I1008 20:11:06.914000 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"871553ec5a1dffae07d1f30203c4302dfc65b483bd24801231a2a5f759ca6f2f"} err="failed to get container status \"871553ec5a1dffae07d1f30203c4302dfc65b483bd24801231a2a5f759ca6f2f\": rpc error: code = NotFound desc = could not find container \"871553ec5a1dffae07d1f30203c4302dfc65b483bd24801231a2a5f759ca6f2f\": container with ID starting with 871553ec5a1dffae07d1f30203c4302dfc65b483bd24801231a2a5f759ca6f2f not found: ID does not exist" Oct 08 20:11:06 crc kubenswrapper[4750]: I1008 20:11:06.914045 4750 scope.go:117] "RemoveContainer" containerID="15135ae85b8c71485bdef8bfae17606438089f3409cf92b98754ba0efc8337ed" Oct 08 20:11:06 crc kubenswrapper[4750]: E1008 20:11:06.914645 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15135ae85b8c71485bdef8bfae17606438089f3409cf92b98754ba0efc8337ed\": container with ID starting with 15135ae85b8c71485bdef8bfae17606438089f3409cf92b98754ba0efc8337ed not found: ID does not exist" containerID="15135ae85b8c71485bdef8bfae17606438089f3409cf92b98754ba0efc8337ed" Oct 08 20:11:06 crc kubenswrapper[4750]: I1008 20:11:06.914686 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15135ae85b8c71485bdef8bfae17606438089f3409cf92b98754ba0efc8337ed"} err="failed to get container status \"15135ae85b8c71485bdef8bfae17606438089f3409cf92b98754ba0efc8337ed\": rpc error: code = NotFound desc = could not find container \"15135ae85b8c71485bdef8bfae17606438089f3409cf92b98754ba0efc8337ed\": container with ID 
starting with 15135ae85b8c71485bdef8bfae17606438089f3409cf92b98754ba0efc8337ed not found: ID does not exist" Oct 08 20:11:06 crc kubenswrapper[4750]: I1008 20:11:06.914715 4750 scope.go:117] "RemoveContainer" containerID="532c0f35f700aec1adc21b709830218904a3543bce909036fa13f3e3fae88038" Oct 08 20:11:06 crc kubenswrapper[4750]: E1008 20:11:06.915375 4750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"532c0f35f700aec1adc21b709830218904a3543bce909036fa13f3e3fae88038\": container with ID starting with 532c0f35f700aec1adc21b709830218904a3543bce909036fa13f3e3fae88038 not found: ID does not exist" containerID="532c0f35f700aec1adc21b709830218904a3543bce909036fa13f3e3fae88038" Oct 08 20:11:06 crc kubenswrapper[4750]: I1008 20:11:06.915407 4750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"532c0f35f700aec1adc21b709830218904a3543bce909036fa13f3e3fae88038"} err="failed to get container status \"532c0f35f700aec1adc21b709830218904a3543bce909036fa13f3e3fae88038\": rpc error: code = NotFound desc = could not find container \"532c0f35f700aec1adc21b709830218904a3543bce909036fa13f3e3fae88038\": container with ID starting with 532c0f35f700aec1adc21b709830218904a3543bce909036fa13f3e3fae88038 not found: ID does not exist" Oct 08 20:11:08 crc kubenswrapper[4750]: I1008 20:11:08.752485 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="790e921c-bf6e-4efe-bddf-d63dfc62c676" path="/var/lib/kubelet/pods/790e921c-bf6e-4efe-bddf-d63dfc62c676/volumes" Oct 08 20:11:10 crc kubenswrapper[4750]: I1008 20:11:10.628199 4750 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7b8x6/must-gather-2jh45"] Oct 08 20:11:10 crc kubenswrapper[4750]: I1008 20:11:10.629038 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-7b8x6/must-gather-2jh45" 
podUID="63e5a76a-91eb-4254-a425-e63278761e8f" containerName="copy" containerID="cri-o://0b3969d71faf0c111aadeb208dcb5596af4fa914b806234105ee4892b1f79bf2" gracePeriod=2 Oct 08 20:11:10 crc kubenswrapper[4750]: I1008 20:11:10.638708 4750 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7b8x6/must-gather-2jh45"] Oct 08 20:11:10 crc kubenswrapper[4750]: I1008 20:11:10.824166 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7b8x6_must-gather-2jh45_63e5a76a-91eb-4254-a425-e63278761e8f/copy/0.log" Oct 08 20:11:10 crc kubenswrapper[4750]: I1008 20:11:10.824693 4750 generic.go:334] "Generic (PLEG): container finished" podID="63e5a76a-91eb-4254-a425-e63278761e8f" containerID="0b3969d71faf0c111aadeb208dcb5596af4fa914b806234105ee4892b1f79bf2" exitCode=143 Oct 08 20:11:11 crc kubenswrapper[4750]: I1008 20:11:11.185415 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7b8x6_must-gather-2jh45_63e5a76a-91eb-4254-a425-e63278761e8f/copy/0.log" Oct 08 20:11:11 crc kubenswrapper[4750]: I1008 20:11:11.186269 4750 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7b8x6/must-gather-2jh45" Oct 08 20:11:11 crc kubenswrapper[4750]: I1008 20:11:11.239874 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/63e5a76a-91eb-4254-a425-e63278761e8f-must-gather-output\") pod \"63e5a76a-91eb-4254-a425-e63278761e8f\" (UID: \"63e5a76a-91eb-4254-a425-e63278761e8f\") " Oct 08 20:11:11 crc kubenswrapper[4750]: I1008 20:11:11.239938 4750 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9qbp\" (UniqueName: \"kubernetes.io/projected/63e5a76a-91eb-4254-a425-e63278761e8f-kube-api-access-d9qbp\") pod \"63e5a76a-91eb-4254-a425-e63278761e8f\" (UID: \"63e5a76a-91eb-4254-a425-e63278761e8f\") " Oct 08 20:11:11 crc kubenswrapper[4750]: I1008 20:11:11.251821 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63e5a76a-91eb-4254-a425-e63278761e8f-kube-api-access-d9qbp" (OuterVolumeSpecName: "kube-api-access-d9qbp") pod "63e5a76a-91eb-4254-a425-e63278761e8f" (UID: "63e5a76a-91eb-4254-a425-e63278761e8f"). InnerVolumeSpecName "kube-api-access-d9qbp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 20:11:11 crc kubenswrapper[4750]: I1008 20:11:11.342621 4750 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9qbp\" (UniqueName: \"kubernetes.io/projected/63e5a76a-91eb-4254-a425-e63278761e8f-kube-api-access-d9qbp\") on node \"crc\" DevicePath \"\"" Oct 08 20:11:11 crc kubenswrapper[4750]: I1008 20:11:11.422070 4750 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63e5a76a-91eb-4254-a425-e63278761e8f-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "63e5a76a-91eb-4254-a425-e63278761e8f" (UID: "63e5a76a-91eb-4254-a425-e63278761e8f"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 20:11:11 crc kubenswrapper[4750]: I1008 20:11:11.444931 4750 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/63e5a76a-91eb-4254-a425-e63278761e8f-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 08 20:11:11 crc kubenswrapper[4750]: I1008 20:11:11.835227 4750 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7b8x6_must-gather-2jh45_63e5a76a-91eb-4254-a425-e63278761e8f/copy/0.log" Oct 08 20:11:11 crc kubenswrapper[4750]: I1008 20:11:11.836046 4750 scope.go:117] "RemoveContainer" containerID="0b3969d71faf0c111aadeb208dcb5596af4fa914b806234105ee4892b1f79bf2" Oct 08 20:11:11 crc kubenswrapper[4750]: I1008 20:11:11.836055 4750 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7b8x6/must-gather-2jh45" Oct 08 20:11:11 crc kubenswrapper[4750]: I1008 20:11:11.870530 4750 scope.go:117] "RemoveContainer" containerID="6856bfc51be89118b3c27c6cdde39f205e687f54c29ac12dc06635a6498f67bf" Oct 08 20:11:12 crc kubenswrapper[4750]: I1008 20:11:12.748455 4750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63e5a76a-91eb-4254-a425-e63278761e8f" path="/var/lib/kubelet/pods/63e5a76a-91eb-4254-a425-e63278761e8f/volumes" Oct 08 20:11:29 crc kubenswrapper[4750]: I1008 20:11:29.706904 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 20:11:29 crc kubenswrapper[4750]: I1008 20:11:29.707707 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 20:11:59 crc kubenswrapper[4750]: I1008 20:11:59.707209 4750 patch_prober.go:28] interesting pod/machine-config-daemon-grddb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 20:11:59 crc kubenswrapper[4750]: I1008 20:11:59.707971 4750 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 20:11:59 crc kubenswrapper[4750]: I1008 20:11:59.708050 4750 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-grddb" Oct 08 20:11:59 crc kubenswrapper[4750]: I1008 20:11:59.709613 4750 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"97b77d20e48ddad1309a20d7312b352f1c79d1e7dda8a71e4178d6c004ba3141"} pod="openshift-machine-config-operator/machine-config-daemon-grddb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 20:11:59 crc kubenswrapper[4750]: I1008 20:11:59.709725 4750 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerName="machine-config-daemon" containerID="cri-o://97b77d20e48ddad1309a20d7312b352f1c79d1e7dda8a71e4178d6c004ba3141" gracePeriod=600 Oct 08 20:11:59 crc kubenswrapper[4750]: E1008 20:11:59.840775 4750 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 20:12:00 crc kubenswrapper[4750]: I1008 20:12:00.423336 4750 generic.go:334] "Generic (PLEG): container finished" podID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" containerID="97b77d20e48ddad1309a20d7312b352f1c79d1e7dda8a71e4178d6c004ba3141" exitCode=0 Oct 08 20:12:00 crc kubenswrapper[4750]: I1008 20:12:00.423420 4750 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-grddb" event={"ID":"f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4","Type":"ContainerDied","Data":"97b77d20e48ddad1309a20d7312b352f1c79d1e7dda8a71e4178d6c004ba3141"} Oct 08 20:12:00 crc kubenswrapper[4750]: I1008 20:12:00.423735 4750 scope.go:117] "RemoveContainer" containerID="cbbbf87275f7e0f3294bf94bffab9549261eeb2e7d0d719aab46e1e79b8aa5a9" Oct 08 20:12:00 crc kubenswrapper[4750]: I1008 20:12:00.424379 4750 scope.go:117] "RemoveContainer" containerID="97b77d20e48ddad1309a20d7312b352f1c79d1e7dda8a71e4178d6c004ba3141" Oct 08 20:12:00 crc kubenswrapper[4750]: E1008 20:12:00.424818 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 20:12:14 crc kubenswrapper[4750]: I1008 20:12:14.744293 4750 scope.go:117] "RemoveContainer" 
containerID="97b77d20e48ddad1309a20d7312b352f1c79d1e7dda8a71e4178d6c004ba3141" Oct 08 20:12:14 crc kubenswrapper[4750]: E1008 20:12:14.745439 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 20:12:26 crc kubenswrapper[4750]: I1008 20:12:26.734655 4750 scope.go:117] "RemoveContainer" containerID="97b77d20e48ddad1309a20d7312b352f1c79d1e7dda8a71e4178d6c004ba3141" Oct 08 20:12:26 crc kubenswrapper[4750]: E1008 20:12:26.735536 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4" Oct 08 20:12:39 crc kubenswrapper[4750]: I1008 20:12:39.734960 4750 scope.go:117] "RemoveContainer" containerID="97b77d20e48ddad1309a20d7312b352f1c79d1e7dda8a71e4178d6c004ba3141" Oct 08 20:12:39 crc kubenswrapper[4750]: E1008 20:12:39.736969 4750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-grddb_openshift-machine-config-operator(f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4)\"" pod="openshift-machine-config-operator/machine-config-daemon-grddb" podUID="f0c2e8b2-f31b-4d3e-83a9-b7e490cf83b4"